Overview
Make your applications interactive with this new sensor kit! With 37 sensors and modules included in one kit, it gives you a bigger bang for your buck.
This version of the sensor kit adds more sensor modules, such as a color sensor, a digital wattmeter, a heart rate monitor sensor, a conductivity sensor switch, and a digital shake sensor. It also includes an RGB LED module, a speaker, a vibration motor module, a DC motor module, and an RGB backlight display to help you build your interactive projects.
Use a variety of sensors to realize your brilliant ideas! All of the sensors use the Gravity standard interface and are plug and play, and a large library of sample code is available for reference. Plug everything into our IO expansion board, upload the code, and you are ready to go.
With everything included in this kit, you will be able to sense color, heart rate, light, temperature, gas, humidity, flame, and direction, or let your project speak, shine, and display text.
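As a quick taste of that workflow, here is a minimal sketch that reads the Analog Ambient Light Sensor and prints its value over serial. The analog port (A0) is just an assumption; use whichever port on the IO expansion board you actually plug the sensor into.

```cpp
// Minimal example: read the Gravity Analog Ambient Light Sensor.
// Assumes the sensor is plugged into analog port A0 of the IO expansion board
// (adjust the pin to match your wiring).

const int LIGHT_SENSOR_PIN = A0;

void setup() {
  Serial.begin(9600);          // open the serial monitor at 9600 baud
}

void loop() {
  int rawValue = analogRead(LIGHT_SENSOR_PIN);  // 0-1023 on a 10-bit ADC
  Serial.print("Ambient light (raw): ");
  Serial.println(rawValue);
  delay(500);                  // read twice per second
}
```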
This Kit Includes:
- Gravity: TCS34725 RGB Color Sensor
- Gravity: Heart Rate Monitor Sensor
- Gravity: I2C Digital Wattmeter
- Gravity: Conductivity Sensor Switch
- Gravity: Digital Shake Sensor
- Gravity: Analog Grayscale Sensor
- Gravity: Analog LM35 Temperature Sensor
- Gravity: Analog Ambient Light Sensor
- Gravity: Digital Vibration Sensor
- Gravity: Digital Tilt Sensor
- Gravity: Digital Capacitive Touch Sensor
- Gravity: Digital Magnetic Sensor
- Gravity: Analog Sound Sensor For Arduino
- Gravity: Analog Carbon Monoxide Sensor (MQ7)
- Gravity: Analog Voltage Divider V2
- Gravity: Digital Piezo Disk Vibration Sensor
- Gravity: Analog Rotation Potentiometer Sensor V2
- Gravity: Joystick Module V2
- Gravity: Analog Flame Sensor
- Gravity: Triple Axis Accelerometer MMA7361
- Gravity: Digital Infrared Motion Sensor
- Gravity: URM09 Analog Ultrasonic Sensor
- Gravity: Analog Soil Moisture Sensor
- Gravity: Steam Sensor
- Gravity: Digital Push Button (White)
- Gravity: Digital Push Button (Red)
- Gravity: Digital Push Button (Yellow)
- Gravity: Digital White LED Light Module
- Gravity: Digital Red LED Light Module
- Gravity: Digital Green LED Light Module
- Gravity: Digital Blue LED Light Module
- Gravity: Digital RGB LED Module
- Gravity: Digital 5A Relay Module
- Gravity: Digital Speaker Module
- Gravity: Vibration Motor Module
- Gravity: 130 DC Motor Module
- Gravity: I2C 16x2 Arduino LCD with RGB Backlight Display
Discover more about the sensors included in the set!
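To get a feel for combining an input module with an output module from the list above, here is a short sketch that lights the Digital White LED Light Module while the Digital Push Button is pressed. The pin numbers are assumptions; change them to match the digital ports you use on the IO expansion board.

```cpp
// Minimal example: light the LED module while the push button is pressed.
// Assumes the button is on digital pin 2 and the LED module on digital pin 3
// (change these to match the ports you actually use).

const int BUTTON_PIN = 2;
const int LED_PIN = 3;

void setup() {
  pinMode(BUTTON_PIN, INPUT);
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  // The Gravity button module typically outputs HIGH when pressed;
  // check your module's documentation if the logic appears inverted.
  int pressed = digitalRead(BUTTON_PIN);
  digitalWrite(LED_PIN, pressed ? HIGH : LOW);
}
```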
Get Inspired
Picture an intelligent device that tracks a performer's moves and responds within an interactive space with projection mapping, backlighting, music, and smart sculptures. This project uses a machine learning algorithm running on a microcontroller to track and detect movements and recognize the associated gestures. Smart sculptures, lighting, music, and video projection are triggered by each assigned gesture, creating a powerful audiovisual experience that highlights the incredible potential of TinyML for the performing arts: when the right move is made, the corresponding media plays, and all of these elements interact to create a new experience. The result is an interactive installation whose sculptures use a combination of motors, sensors, and other electronics to create an immersive, interactive experience for the viewer, and they may include projections, sound, and other sensory elements to make that experience complete.
With an array of onboard sensors, Bluetooth® Low Energy connectivity, and the ability to perform edge AI tasks thanks to its nRF52840 SoC, the Arduino Nano 33 BLE Sense is a great choice for a wide variety of embedded applications. Further demonstrating this point, a group of students from the Introduction to Embedded Deep Learning course at Carnegie Mellon University have published the culmination of their studies through 10 excellent projects that each use the Tiny Machine Learning Kit and Edge Impulse ML platform.
Wrist-based human activity recognition
Traditional human activity tracking has relied on smartwatches and phones to recognize certain exercises based on IMU data. However, few have achieved both continuous and low-power operation, which is why Omkar Savkur, Nicholas Toldalagi, and Kevin Xie explored training an embedded model on combined accelerometer and microphone data to distinguish between handwashing, brushing one's teeth, and idling. Their project continuously runs inference on incoming data and then displays the detected action on a screen and via two LEDs.
Categorizing trash with sound
In some circumstances, such as smart cities or home recycling, knowing what types of materials are being thrown away can provide a valuable data point for waste management systems. Students Jacky Wang and Gordonson Yan created their project, called SBTrashCat, to recognize trash types by the sounds they make when thrown into a bin. Currently, the model can recognize three different kinds of trash, along with background noise and human voices to eliminate false positives.
Distributed edge machine learning
The abundance of Internet of Things (IoT) devices has meant an explosion in computational power and in the amount of data needing to be processed before it can become useful. Because a single low-cost edge device does not possess enough power on its own for some tasks, Jong-Ik Park, Chad Taylor, and Anudeep Bolimera have designed a system where