Getting Started

Adafruit Playground is a wonderful and safe place to share your interests with Adafruit's vibrant community of makers and doers. Have a cool project you are working on? Have a bit of code that you think others will find useful? Want to show off your electronics workbench? You have come to the right place.
The goal of Adafruit Playground is to make it as simple as possible to share your work. On Adafruit Playground, users can create Notes. A Note is a single-page space where you can document your topic using Adafruit's easy-to-use editor. Notes are like Guides on the Adafruit Learning System, but Guides are high-fidelity content curated and maintained by Adafruit. Notes are whatever you want them to be. Have fun and be kind.
Click here to learn more about Adafruit Playground and how to get started.
-
Media hub: Media control w/opt Bluetooth Overview
Physical controls for a more enjoyable media playback experience.
Features
- Responsive volume knob & mute button.
- Transport controls (play/pause, stop, FF/REW, skip tracks).
- Pair up with your favourite Bluetooth® speakers.
- Quick, physical connection (no need to go through a menu system to pair the keypad & speakers).
- Customizable controls/scheme.
- Customizable setup: Optionally connect USB hub/dock for added features, for example:
- mirror phone to TV w/HDMI out: Watch videos on a bigger screen
- add keyboard: Better typing experience for texts/emails.
- Solderless project.
Main hardware
Program your device
To get your RP2040 macropad to act as a media controller, it needs to:
- detect keypresses and changes in the knob position
- send out corresponding keyboard messages to the attached USB host device (e.g., a phone).
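In CircuitPython terms, that boils down to reading the rotary encoder and key events and sending USB HID consumer-control codes. Here's a minimal sketch of the idea (not the MediaController project code linked below; the key-to-action mapping is an assumption) using the adafruit_macropad helper library:

```python
# Minimal sketch: turn a MacroPad RP2040 into a USB media controller.
# The key-number-to-action mapping below is an assumption; remap to taste.
from adafruit_macropad import MacroPad
from adafruit_hid.consumer_control_code import ConsumerControlCode

macropad = MacroPad()
last_position = macropad.encoder

while True:
    # Volume knob: compare the encoder to where it was last time around
    position = macropad.encoder
    if position > last_position:
        macropad.consumer_control.send(ConsumerControlCode.VOLUME_INCREMENT)
    elif position < last_position:
        macropad.consumer_control.send(ConsumerControlCode.VOLUME_DECREMENT)
    last_position = position

    # Pressing the knob mutes/unmutes
    macropad.encoder_switch_debounced.update()
    if macropad.encoder_switch_debounced.pressed:
        macropad.consumer_control.send(ConsumerControlCode.MUTE)

    # Keypad: transport controls
    key_event = macropad.keys.events.get()
    if key_event and key_event.pressed:
        if key_event.key_number == 0:
            macropad.consumer_control.send(ConsumerControlCode.PLAY_PAUSE)
        elif key_event.key_number == 1:
            macropad.consumer_control.send(ConsumerControlCode.SCAN_NEXT_TRACK)
        elif key_event.key_number == 2:
            macropad.consumer_control.send(ConsumerControlCode.SCAN_PREVIOUS_TRACK)
```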
My own implementation for this "media hub" is written for Adafruit's CircuitPython, and can be found here:
- 💾 MediaController project README (see MediaHub_AFMacropad). 🚀 Installation instructions.
-
supervisor.runtime.display in CircuitPython 9.2.5+ CircuitPython 9.2.5 adds a new property to the supervisor Runtime object, `display`. If your board has a built-in display that is automatically configured by the CircuitPython core (e.g., boards like the Feather ESP32-S3 Reverse TFT), then this display is available as `supervisor.runtime.display` in addition to `board.DISPLAY`.
So what's different about the new `supervisor.runtime.display`?
- This property is available on all boards that support `displayio`, not just boards with built-in displays.
- Unlike `board.DISPLAY`, this property is settable, and it remembers its value after your code file finishes running. This means you can set this property once in boot.py and then use the display each time your code.py runs, or re-use a display set by a previous run of code.py.
- Due to technical limitations in CircuitPython, when a display is released, `board.DISPLAY` becomes a "None-like object": one that prints as `None` but fails the check `board.DISPLAY is None`. `supervisor.runtime.display is None` works correctly to check whether a default display is configured.
Setting `supervisor.runtime.display`
There are two approaches:
- Do it unconditionally in boot.py and depend on this in code.py
- Do it conditionally in code.py, if `supervisor.runtime.display is None`
... in boot.py
Here's a code snippet that shows configuring a 240x240 ST7789 display connected to an EyeSpi BFF on a QT Py board like the QT Py RP2040:
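A minimal boot.py along these lines (the pin assignments are assumptions; match them to your EYESPI BFF wiring) looks like:

```python
# boot.py -- minimal sketch: set up an ST7789 over the EYESPI BFF and keep it
# in supervisor.runtime.display so every run of code.py can reuse it.
# The command/chip_select pins below are assumptions; match your BFF wiring.
import board
import displayio
import fourwire
import supervisor
from adafruit_st7789 import ST7789  # requires the adafruit_st7789 library

displayio.release_displays()

spi = board.SPI()
bus = fourwire.FourWire(spi, command=board.TX, chip_select=board.RX)

# 1.54" 240x240 ST7789 panels usually need rowstart=80
supervisor.runtime.display = ST7789(bus, width=240, height=240, rowstart=80)
```

With that in place, code.py can use the display directly; alternatively, the same setup can live in code.py, guarded by an `if supervisor.runtime.display is None:` check (the second approach above).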
-
Zephyr Quest: ST7789 Display with Feather RP2350 As part of a series on Zephyr with Adafruit hardware, this guide shows how to configure Zephyr to use an ST7789 TFT display with a Feather RP2350. By connecting the display with a breadboard, we can use a logic analyzer to verify that the Zephyr display driver pin configuration agrees with the CircuitPython display driver. This guide is meant for people interested in adding support for Adafruit displays to Zephyr.
Parts
-
A Node Based CAD Add-in for Fusion 360 Gradient is a node-based editor designed for use with Autodesk Fusion 360.
I started working on Gradient after playing with the node-based geometry in Blender. The nodes in Blender are fantastic (frankly they work much better than mine do right now), but I wanted to create algorithmically defined geometry right in Fusion 360 using solids and surfaces, which are better suited to making 3D models for technical and engineering parts. I wanted to be able to make algorithmic structures that are user defined and can be finely controlled and tuned to the design needs.
Models like this would be very time-consuming to model by hand. However, they can be quickly generated and re-generated with different parameters or random seeds.
If you are interested in trying Gradient for yourself, you can download it from the GitHub repository below. Please be aware that Gradient is currently in an early stage of development, meaning only a small fraction of its functionality has been implemented and there are likely a few bugs to sort through.
-
Monitor Indoor Air Quality with Blues, IFTTT, Adafruit IO and a Hue LED Strip If you're working an office job, you're...in an office. Whether that's in your home or not, I'm going to hazard a guess that it's also indoors - and being indoors for an extended period of time without well-ventilated spaces can legitimately lead to low air quality (and in theory health issues).
Now, I'm not here to spread fear and make you all think you're dying a slow death by breathing in your co-workers' exhalations. However, I am here to show off an easy way to build a cloud-connected indoor air quality system with:
- A variety of Adafruit air quality sensors.
- A Blues Notecard to cloud-connect the project with LTE connectivity.
- Adafruit IO to visualize air quality data and integrate with other services.
- A Philips Hue LED strip to provide real-time visuals for low air quality alerts.
-
No-Code Easy Ambient Smart Lights Overview
With all the newest features of Adafruit IO, WipperSnapper firmware, and the new Blockly Actions, you can create really complex projects without writing a single line of code. I've been working on a couple versions of smart lights to notify me of different things using NeoPixels. This project will focus on a super easy smart ambient lighting system that you can stick to the back of your computer monitor.
Components
Just a few Adafruit components are all you need to create a really simple but pretty powerful little notification system. We're going to use a QT Py ESP32-S2 WiFi Dev Board to talk to Adafruit IO, a NeoPixel BFF for the QT Py, and finally a short NeoPixel strip with a JST connector pre-attached (so you can connect right up to the BFF board).
The only soldering you will need to do is to attach the boards together. You can solder the boards together with the included pins, or pick up some female headers so you can detach the boards and use them on any future projects.
-
🎵️ Media hub 2.0: Media control w/opt Bluetooth Overview
Physical controls for a more enjoyable media playback experience.
Features
- Responsive volume knob & mute button.
- Transport controls (play/pause, stop, FF/REW, skip tracks).
- Pair up with your favourite Bluetooth® speakers.
- Quick, physical connection (no need to go through a menu system to pair the keypad & speakers).
- Customizable controls/scheme.
This project aims to improve on the original "Media hub" presented here.
- More compact design
- Bigger volume knob - not interfering with macropad keys.
🛒️ List of main materials/hardware
(See section "More tools/materials/hardware" near the end of this note for extras)
-
Rust on the Adafruit Feather RP2350 HSTX Let's start with the Debug Header, the little 3-pin interface that lets you peek into the Pons of the Raspberry Pi brain. You can find this header on the Raspberry Pi 5, the Raspberry Pi Pico (H and WH), and now, happily, the Adafruit RP23XX boards thus far. By connecting a Pico Probe to this interface, you can get a lot of debug information back on your big main computer. ARM refers to this type of interfacing/debugging as Semihosting.
With the 3-pin JST cable plugged into the D header on the Pico Probe and the other end of that cable plugged onto the Debug Header on your Feather (shown above), and with the Pico Probe connected to your computer, you can connect the Feather to any USB power source, or back to your computer as well. Once you've done that, you're ready to issue the `cargo run` command, so long as you have the required software installed below.
Back to the code we cloned all of the way at the top of this article. With all of the software above installed, your computer connected to the debug probe, the probe connected to the Feather, and the Feather connected to power or your computer, you now have a system that is ready to go. We really can run that `cargo run` command and have Rust code on our Feather. But what are all of these files and what do they do?
`.cargo/config.toml` and `cargo run`
The `.cargo/config.toml` file configures `cargo run` to use `probe-rs` to flash to the board's ROM. You will need a Raspberry Pi Debug Probe in order to use this, but it makes development MUCH easier, faster, and more fun! You connect the Debug Probe's `D` (for debug, `D` for `defmt` 😉) side to the board's Debug Port. Once done, connect the Adafruit board and Debug Probe to your computer. You can flash at will with `cargo run` and see any debug messages in your computer's terminal thanks to `defmt`.
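For a sense of what that file contains, a minimal configuration of this kind typically looks something like the sketch below; the chip name and target triple are assumptions, so check the actual `.cargo/config.toml` in the cloned repo.

```toml
# Sketch of a typical probe-rs runner setup (not copied from the repo).
# Build for the RP2350's Cortex-M33 cores and flash/run via probe-rs.
[build]
target = "thumbv8m.main-none-eabihf"

[target.thumbv8m.main-none-eabihf]
runner = "probe-rs run --chip RP235x"
```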
`memory.x`
If you've never seen a `memory.x` file before and have no clue what it is, I don't blame you for being curious. It's an odd file, filled with things that aren't Rust or C, or anything else that fits the norm. This file actually tells the linker where to put sections of the binary. It makes sure everything is in order so that when the microcontroller jumps to flash memory, the expected data is there ready for it. It also tells the linker how much RAM the target board or chip has.
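As a rough illustration of the format (the region sizes below are placeholders, not the project's actual file), the heart of a `memory.x` is a MEMORY block declaring where flash and RAM live:

```
/* Placeholder sketch of memory.x syntax -- the real file for an RP2350 board
   declares the board's actual flash size and some extra sections, so use the
   one that ships with the project. */
MEMORY
{
    FLASH : ORIGIN = 0x10000000, LENGTH = 8192K
    RAM   : ORIGIN = 0x20000000, LENGTH = 512K
}
```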
`build.rs`
This build script copies the `memory.x` file from the crate root into a directory where the linker can always find it at build time. For many projects this is optional, as the linker always searches the project root directory -- wherever `Cargo.toml` is. However, if you are using a workspace or have a more complicated build setup, this build script becomes required. Additionally, by requesting that Cargo re-run the build script whenever `memory.x` is changed, updating `memory.x` ensures a rebuild of the application with the new memory settings.
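The usual shape of such a script, shown here as a generic sketch of the common cortex-m pattern rather than the repo's exact file:

```rust
// Sketch of the common cortex-m-style build script: copy memory.x into
// OUT_DIR, point the linker at it, and rebuild whenever memory.x changes.
use std::env;
use std::fs::File;
use std::io::Write;
use std::path::PathBuf;

fn main() {
    let out = PathBuf::from(env::var_os("OUT_DIR").unwrap());
    File::create(out.join("memory.x"))
        .unwrap()
        .write_all(include_bytes!("memory.x"))
        .unwrap();
    println!("cargo:rustc-link-search={}", out.display());
    println!("cargo:rerun-if-changed=memory.x");
}
```
-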
Zephyr Quest: IoT Toggle Switch for Feather TFT This guide shows how to make an IoT toggle switch with an Adafruit Feather TFT ESP32-S3, Zephyr, and Adafruit IO. Key features include: GPIO input for Boot button, LVGL graphics, MQTT over WiFi with TLSv1.2, and USB serial shell commands for saving WiFi and MQTT configuration settings to NVM flash. This guide is intended for people who want to learn how to write applications in C using Zephyr APIs.
Demo video: IoT toggle switch: Zephyr + Feather TFT + Adafruit IO
Previously in this series of guides about using Zephyr on Adafruit hardware, I focused on setting up developer tools and writing Devicetree board definitions. This time, I'm moving up the stack to show how to build an application tying together several Zephyr APIs along with a custom board definition.
Building an IoT app with WiFi, TLS, and graphics is unavoidably a bit complicated. It took me about three weeks to write the code, which totals a bit over 2100 lines. Listing all of that here would be awkward. If you want the details, you can browse the code in my zphqst-03 GitHub repo. The code has lots of comments, including citations for the references I used while learning to use the Zephyr APIs.
This guide will focus on:
- How to build, run, and configure the IoT toggle switch app
- High level tour of the source code with GitHub links: which files do what?
- Understanding C language features that you'll need to use Zephyr APIs effectively: structs, function pointers, etc.
- Zephyr troubleshooting tips: diagnose and fix memory allocation issues, enable various types of debug logging, etc.
- MQTT testing with `openssl` and the `mosquitto` MQTT broker with its companion command-line tools, `mosquitto_pub` and `mosquitto_sub`
-
Adafruit Memento Time-lapse w/ online upload & email notification Adafruit Memento Time-lapse Camera with Online Upload and Email Notification
This guide shows how to turn your Adafruit Memento (ESP32-S3) board into a time-lapse camera that:
- Captures images on a schedule or with a button press
- Uploads them to Adafruit IO over Wi-Fi
- Triggers email notifications using a feed
- All using CircuitPython and the PyCamera library!
What You Need
- Adafruit Memento Board: https://www.adafruit.com/product/5843
- microSD card (optional, for GIF recording)
- USB-C cable
- Wi-Fi network
- Adafruit IO account: https://io.adafruit.com
- CircuitPython installed on the Memento
Setup
1. Installing CircuitPython
To install CircuitPython on the Adafruit Memento, I followed this official guide by Anne Barela and John Park:
🔗 [Memento Camera Quick Start Guide – Install CircuitPython](https://learn.adafruit.com/memento-camera-quick-start-guide/install-circuitpython)
That page walks you through how to:
- Put the board into bootloader mode (double-tap reset)
- Drag the `.uf2` file onto the board
- Verify that the **CIRCUITPY** drive appears
Make sure you use **CircuitPython 9.0.0 or later** to avoid filesystem corruption issues.
2. Install Libraries
- adafruit_io
- adafruit_requests.mpy
- adafruit_ntp.mpy
- adafruit_logging.mpy
- adafruit_pycamera
- adafruit_ov5640
- adafruit_connection_manager.mpy
- gifio.mpy
- bitmaptools.mpy
- ulab (folder)
3. Create a settings.toml file on CIRCUITPY with this:
CIRCUITPY_WIFI_SSID = "YourNetworkName"
CIRCUITPY_WIFI_PASSWORD = "YourNetworkPassword"
ADAFRUIT_AIO_USERNAME = "your_username"
ADAFRUIT_AIO_KEY = "your_aio_key"
4. Main Code
Paste the following into code.py on your CIRCUITPY drive:
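The outline below is a trimmed sketch of that code rather than the complete listing; the feed keys match the feeds set up in step 5, but the timelapse interval, button handling, and capture details are assumptions to adapt.

```python
# Trimmed sketch of the approach (not the full original listing).
# Feed keys match step 5; the interval and capture details are assumptions.
import binascii
import os
import time

import adafruit_connection_manager
import adafruit_pycamera
import adafruit_requests
import wifi
from adafruit_io.adafruit_io import IO_HTTP

# Connect to Wi-Fi using the credentials from settings.toml
wifi.radio.connect(os.getenv("CIRCUITPY_WIFI_SSID"), os.getenv("CIRCUITPY_WIFI_PASSWORD"))

# HTTP session and Adafruit IO client
pool = adafruit_connection_manager.get_radio_socketpool(wifi.radio)
ssl_context = adafruit_connection_manager.get_radio_ssl_context(wifi.radio)
requests = adafruit_requests.Session(pool, ssl_context)
io = IO_HTTP(os.getenv("ADAFRUIT_AIO_USERNAME"), os.getenv("ADAFRUIT_AIO_KEY"), requests)

pycam = adafruit_pycamera.PyCamera()

TIMELAPSE_SECONDS = 60  # assumed interval
last_shot = time.monotonic()

def upload_photo():
    """Capture a JPEG, base64-encode it, and push it to Adafruit IO."""
    jpeg = pycam.capture_into_jpeg()
    if jpeg is not None:
        encoded = binascii.b2a_base64(jpeg).strip().decode("ascii")
        io.send_data("camera", encoded)    # photo upload feed
        io.send_data("camera-trigger", 1)  # kicks off the email action

while True:
    pycam.keys_debounce()
    if pycam.shutter.short_count:  # shutter button pressed
        upload_photo()
    if time.monotonic() - last_shot >= TIMELAPSE_SECONDS:
        upload_photo()
        last_shot = time.monotonic()
```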
What I did
- I used the fancy camera CircuitPython code by Anne Barela and John Park, and the Doorbell Camera code by Brent Rubell to help create the code above.
- My main focus was to get photos uploaded to Adafruit IO with the timelapse and camera shutter options.
How It Works
- The camera connects to Wi-Fi and Adafruit IO
- Takes snapshots using the shutter button or on a time-lapse interval
- Encodes image data as base64 and uploads to Adafruit IO feed
- Sends a "trigger" signal to another feed to notify you via email5. Set up Feeds on Adafruit IO
- Go to io.adafruit.com
- Click on the tab 'IO'
- Go to Feeds and create two. I set up a 'camera' feed and a 'camera-trigger' feed.
- The "camera" feed will take photo uploads, and the "camera-trigger" feed gets a 1 or 0; this is used to trigger an automated email action.