
Overview
The MKR ENV Shield carries a set of latest-generation sensors that measure:
- Atmospheric pressure
- Temperature and humidity
- Light intensity (in lux, max 650)
To help you build projects and store the data collected locally, this shield has a slot for a microSD card (not provided).
A ready-to-use library with examples and methods for reading values from the different sensors provides an easy and smooth integration path.
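As an illustration of that integration path, here is a minimal sketch that reads every sensor and appends the values to a file on the microSD card. It assumes the Arduino_MKRENV and SD libraries are installed; the chip-select pin and the file name used below are placeholders to check against the library examples for your setup.

```cpp
#include <Arduino_MKRENV.h>
#include <SPI.h>
#include <SD.h>

const int SD_CS_PIN = 4;   // assumed chip-select pin for the shield's microSD slot -- verify for your board

void setup() {
  Serial.begin(9600);

  if (!ENV.begin()) {                      // initialize the shield's environmental sensors
    Serial.println("Failed to initialize MKR ENV Shield!");
    while (1);
  }
  if (!SD.begin(SD_CS_PIN)) {              // initialize the microSD card
    Serial.println("Failed to initialize SD card!");
    while (1);
  }
}

void loop() {
  // Read each sensor exposed by the library
  float temperature = ENV.readTemperature();   // °C
  float humidity    = ENV.readHumidity();      // % rH
  float pressure    = ENV.readPressure();      // kPa
  float illuminance = ENV.readIlluminance();   // lux

  // Append one comma-separated line to a log file on the microSD card
  File logFile = SD.open("datalog.csv", FILE_WRITE);   // placeholder file name
  if (logFile) {
    logFile.print(temperature); logFile.print(',');
    logFile.print(humidity);    logFile.print(',');
    logFile.print(pressure);    logFile.print(',');
    logFile.println(illuminance);
    logFile.close();
  }

  delay(1000);   // log once per second
}
```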
Tech specs
ICs | LPS22HB
Input Voltage | 3.3 V
Operating Voltage | 3.3 V
Ranges | Pressure: 260 to 1260 hPa
Communication | I2C / Analog
Length | 61 mm
Width | 25 mm
Weight | 32 g
Conformities
Resources for Safety and Products
Manufacturer Information
The manufacturer information includes the address and related details of the product manufacturer.
Arduino S.r.l.
Via Andrea Appiani, 25
Monza, MB, IT, 20900
https://www.arduino.cc/
Responsible Person in the EU
An EU-based economic operator who ensures the product's compliance with the required regulations.
Arduino S.r.l.
Via Andrea Appiani, 25
Monza, MB, IT, 20900
Phone: +39 0113157477
Email: support@arduino.cc
Documentation
OSH: Schematics
The Arduino MKR ENV Shield is open-source hardware! You can build your own board using the following files:
- EAGLE files (.zip)
- Schematics (.pdf)
- Fritzing file (.fzpz)
Learn more
Get Inspired
A quick tutorial on how to interface the voice recognition module, with a few examples.

For people not familiar with American Sign Language (ASL), recognizing what certain hand motions and positions mean is a nearly impossible task. To make this process easier, Hackster.io user ayooluwa98 came up with the idea of integrating various motion, resistive, and touch sensors into a single glove that can convert these signals into understandable text and speech.

The system is based around a single Arduino Nano board, which is responsible for taking in sensor data and outputting the phrase that best matches the inputs. The orientation of the hand is determined by reading values from the X, Y, and Z axes of a single accelerometer and applying a small correction based on prior calibration. Meanwhile, resistive flex sensors spanning the length of each finger produce a different voltage level according to the extent of the bend.

At each iteration of the program's main loop, a series of Boolean statements is evaluated to pick the phrase that best matches the current finger bends and hand orientation, and this data is then output via the UART pins to an attached Bluetooth® HC-05 module. The final component is a connected phone running a custom app that takes the incoming words from Bluetooth® and saves them for text-to-speech output when a button is pressed.

To see more about this project, you can read ayooluwa98's write-up here on Hackster.io.
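The threshold-matching idea described above can be pictured in a few lines of Arduino code. The sketch below is purely illustrative and is not ayooluwa98's code: the pin assignments, calibration offsets, thresholds, and phrases are made-up placeholders.

```cpp
// Hypothetical sketch of the glove's matching loop -- not the project's actual code.
// Pins, calibration offsets, thresholds, and phrases are placeholders for illustration.

const int FLEX_PINS[5]  = {A0, A1, A2, A3, A4};  // one resistive flex sensor per finger
const int ACC_PINS[3]   = {A5, A6, A7};          // analog accelerometer X, Y, Z axes
const int ACC_OFFSET[3] = {0, 0, 0};             // per-axis offsets found during calibration

int flex[5];   // finger bend readings (0-1023)
int acc[3];    // calibrated orientation readings

void setup() {
  Serial.begin(9600);  // hardware UART feeds the attached HC-05 Bluetooth module
}

void loop() {
  // More bend changes a flex sensor's resistance, so its ADC value shifts with it
  for (int i = 0; i < 5; i++) {
    flex[i] = analogRead(FLEX_PINS[i]);
  }

  // Hand orientation: raw axis reading minus the offset from prior calibration
  for (int i = 0; i < 3; i++) {
    acc[i] = analogRead(ACC_PINS[i]) - ACC_OFFSET[i];
  }

  // A series of Boolean statements picks the phrase that best matches the inputs;
  // the matched phrase is sent over UART to the phone via the HC-05
  if (flex[0] > 700 && flex[1] > 700 && acc[2] > 600) {
    Serial.println("Hello");
  } else if (flex[0] < 300 && flex[1] > 700 && acc[1] < 400) {
    Serial.println("Thank you");
  }
  // ...further phrase rules would follow the same pattern

  delay(200);
}
```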