Getting Started
Adafruit Playground is a wonderful and safe place to share your interests with Adafruit's vibrant community of makers and doers. Have a cool project you are working on? Have a bit of code that you think others will find useful? Want to show off your electronics workbench? You have come to the right place.
The goal of Adafruit Playground is to make it as simple as possible to share your work. On the Adafruit Playground, users can create Notes. A Note is a single-page space where you can document your topic using Adafruit's easy-to-use editor. Notes are like Guides on the Adafruit Learning System, but Guides are high-fidelity content curated and maintained by Adafruit. Notes are whatever you want them to be. Have fun and be kind.
Click here to learn more about Adafruit Playground and how to get started.
-
Using ESP-Claw with a Local LLM
ESP-Claw is a new tiny AI device for Espressif microcontroller-based boards. Supported boards must have at least 8MB Flash and 8MB PSRAM to run ESP-Claw.
ESP-Claw devices require a connection to the real world. This includes:
- WiFi (local) for communications
- A Large Language Model (LLM) for reasoning (OpenAI, Anthropic, DeepSeek or local)
- (Optional) A connection to Telegram for communication (there is now a crude web interface if Telegram is not desired)
- (Optional) A connection key to search engines Brave Search or Tavily (both services have fees)
I have tested ESP-Claw with a Claude API key. While some folks have it working, I found that at the lowest paid API access tier, ESP-Claw could exceed the tokens-per-minute rate limit. And I don't want to put $100 on Anthropic's books to get that squared away.
HTTP 429: This request would exceed your organization's rate limit of 30,000 input tokens per minute (org: fb68d6db-0824-4998-b920-fe36549c9cae, model: claude-sonnet-4-6). For details, refer to: https://docs.claude.com/en/api/rate-limits. You can see the response headers for current usage. Please reduce the prompt length or the maximum tokens requested, or try again later.
Using a Local Model Instead
I had set up my local computer to run an LLM per the guide Setting Up an LLM Model on Your Own Computer. It can generate up to 40 tokens per second, which is decent for hardware that isn't exotic.
I wanted to use this local LLM over my home network to connect to my ESP-Claws (yes, I have four 'Claws).
The latest builds of the flasher software allow a local connection to an LLM using OpenAI, Qwen, DeepSeek, or Anthropic calling conventions. It took me a bit of time, but I found the right combination of settings to get things working. This is what I'll share below.
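As a rough illustration of what "OpenAI calling conventions" means for a local server, here is a minimal sketch of the request body an OpenAI-compatible endpoint (such as llama.cpp or Ollama in compatibility mode) expects. The endpoint address and model name below are placeholders, not values from ESP-Claw's actual configuration.

```python
import json

# Hypothetical LAN address of a local OpenAI-compatible server.
LOCAL_ENDPOINT = "http://192.168.1.50:11434/v1/chat/completions"

def build_chat_request(model, user_text, max_tokens=256):
    """Build the JSON body for an OpenAI-style chat completion call."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_text}],
        "max_tokens": max_tokens,
    }

body = json.dumps(build_chat_request("qwen2.5:7b", "Hello from ESP-Claw"))
# To actually send it (requires a running local server):
#   import urllib.request
#   req = urllib.request.Request(LOCAL_ENDPOINT, data=body.encode(),
#                                headers={"Content-Type": "application/json"})
#   print(urllib.request.urlopen(req).read())
```

The same body shape works against any server that honors the OpenAI chat-completions convention, which is what makes swapping a cloud LLM for a local one mostly a matter of changing the URL.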
-
Setting Up an LLM Model on Your Own Computer
Many of us have been there: working with Claude, ChatGPT, Copilot, etc., and BOOM, the token allowance runs out and you have to wait a few hours for it to reset. All you want to do is finish your project.
Well, you are on your personal computer which can compute things. Perhaps your machine can even play some fairly recent games? If so, you have free compute right at your fingertips.
Some of the AI models used by the big companies are available for free. The trouble is knowing which model to get, where to get it, what to run it with, and how to tune it for your machine. If this sounds complicated, it's actually less involved than installing a current game and likely takes up less space. But you'll want to follow this guide for some tips on how to do it.
There are many models and many ways to run them. I'll show you some of what I have done as an example for you to do similar.
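One part of "tuning the model to your machine" is simply checking whether it fits in memory. A back-of-the-envelope estimate (the 1.2x overhead factor is a rough guess covering embeddings and runtime buffers, not an exact figure):

```python
def model_size_gb(params_billion, bits_per_weight, overhead=1.2):
    """Rough download/RAM footprint of a quantized model.

    overhead is a rough multiplier for embeddings, KV cache headroom, etc.
    """
    bytes_total = params_billion * 1e9 * bits_per_weight / 8 * overhead
    return bytes_total / 1e9

# An 8B-parameter model quantized to ~4 bits per weight:
print(f"{model_size_gb(8, 4):.1f} GB")  # prints 4.8 GB
```

So an 8B model at 4-bit quantization wants roughly 5 GB of RAM or VRAM, which is why mid-size quantized models run comfortably on gaming-class hardware.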
-
Homebridge Plugin for Adafruit IO Feeds
Recently I wanted to try setting up Homebridge to interface with our Apple Home setup. What is Homebridge?
Homebridge is a lightweight Node.js server you can run on your home network that emulates the iOS HomeKit API.
It basically lets you connect devices that don't support Apple HomeKit to Apple Home. There are plugins you can install within Homebridge that expose more features or let you customize how a device appears in the Apple Home app.
I decided that I wanted to try creating a plugin for Adafruit IO to send data from Adafruit IO feeds to Apple Home.
This was inspired by the work that Trevor did a little while ago with the itsaSNAP app. That iOS app lets you interface with Adafruit IO feeds within iOS. We wrote a lot of fun guides experimenting with the use of Apple Shortcuts combined with Adafruit IO.
One piece that was missing was the ability to have data go from IO to Apple Home. We could control devices already in Apple Home (usually Matter devices) with Shortcuts that integrated with itsaSNAP, but we couldn't send, for example, temperature sensor data from a feed to Apple Home and have it show up in a Room.
-
How to Repair Maschine Mikro Mk3 Pads
Non-responsive Pads
Repairing Native Instruments Maschine Mikro MK3 pads involved opening the enclosure to clean the contact traces on the PCB, sensor pad and rubber elastomers with isopropyl alcohol and a Q-tip to remove debris (mostly cat hair).
This playground note will walk through the device teardown and cleaning the contacts and rubber elastomer pads.
Remove Bottom Panel screws
Start by flipping the device over so the bottom panel faces up. Then use a Phillips screwdriver to remove the 14 screws from the bottom panel.
-
Planck V7 Mount for Apple IIe Fruit Jam Enclosure
I recently came across the absolutely amazing Apple IIe enclosure for the Fruit Jam designed by the Ruiz Brothers. Since I did not have a KPR BM43 on hand and didn't want to spend $43, I did a deep dive into Autodesk Fusion and created a mount for the Planck V7 keyboard PCB to fit the Apple IIe enclosure.
Thanks to everyone in the Adafruit Discord server for the helpful pointers and banter while I worked on this!
You will also need the updated keyboard case, because the USB-C port is in a different place on the Planck v7.
-
Bringing My Godzilla Collectible to Life
So this all started right after Thanksgiving 2025, when it was time to put up the Christmas tree. Normally we put the old Lionel train in a boring circle around the tree; that is about all the room I have for it. So I decided to get some N-scale trains so I could be more ambitious with a train layout. So what kind of scene do we build? A sleepy Christmas village? Nah, Godzilla smashing things up, of course. So I had the idea of taking an 8" Bandai Godzilla Minus 1 Ichibansho Figure and building a diorama around him. But then I had the idea of making his dorsal fins light up with a NeoPixel strip. This meant learning how to use a microcontroller and making the lights do the atomic breath pattern. And I was hooked. So the trains are now put away, and I have been working on a system for putting action figures on a stepper-motor-controlled turntable. That system has a dedicated ESP32-S3 with a TMC2209 stepper driver and a basic NEMA 17 stepper motor that is accurately positioned using an AS5600 magnetic encoder. Using WiFi, I control the turntable with a built-in web app or via Python software running on my Mac or Raspberry Pi 5. The software can choreograph mock battles or silly dance routines. It also lets me use a PS4 DualShock controller to manually control the turntable.
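Two small pieces of the turntable math can be sketched in plain Python. This is not the author's actual code, just an illustration of how an AS5600-style 12-bit angle reading converts to degrees, and how a turntable might pick the shortest rotation to a target:

```python
def as5600_raw_to_degrees(raw):
    """Convert the AS5600's 12-bit raw angle (0-4095) to degrees."""
    return (raw & 0x0FFF) * 360.0 / 4096.0

def shortest_move(current_deg, target_deg):
    """Signed shortest rotation from current to target, in (-180, 180]."""
    delta = (target_deg - current_deg) % 360.0
    if delta > 180.0:
        delta -= 360.0
    return delta

# A turntable at 350 degrees asked to go to 10 degrees should rotate
# +20 degrees forward rather than 340 degrees backward.
print(shortest_move(350.0, 10.0))  # prints 20.0
```

On real hardware the raw value would come from the encoder over I2C, and the signed delta would be converted to stepper microsteps for the TMC2209.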
-
Getting my Fruit Jam Clock to Speak
The Idea
The Fruit Jam, with its built‑in audio, Wi‑Fi, HSTX display, SD card, generous memory, and other little goodies, is a fantastic platform for mixing creativity with a bit of technical magic. Add CircuitPython to the mix and you’ve got a wonderfully flexible development environment.
With one of my Fruit Jams, I built a display that shows an analog clock face, rotates through an album of photos of my grandkids as the background, and pulls weather data from Adafruit IO that I can scroll through using an IR remote.
With the built‑in TLV320 DAC, I also wanted it to sound like a clock — a really big clock. I wrote functions to synthesize polyphonic tones that mimic the classic Westminster Quarters every fifteen minutes, and at the top of each hour it plays a WAV file of a single church‑bell toll, repeated once for each hour. It turned into a great little project that let me explore different elements of CircuitPython and many parts of the Fruit Jam board.
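The chime-selection logic described above is simple enough to sketch. This is an illustration, not the author's code: the Westminster Quarters play one, two, three, or four change phrases at the quarter hours, and the hourly bell tolls once per hour on a 12-hour convention:

```python
# Number of 4-note Westminster change phrases per quarter hour.
QUARTER_PHRASES = {15: 1, 30: 2, 45: 3, 0: 4}  # 0 = top of the hour

def hour_tolls(hour_24):
    """How many times the church bell tolls at the top of the hour."""
    tolls = hour_24 % 12
    return 12 if tolls == 0 else tolls

print(hour_tolls(15))  # prints 3 (3 PM)
print(hour_tolls(0))   # prints 12 (midnight tolls twelve)
```

At the top of the hour the full four-phrase quarter sequence plays first, then the WAV of a single toll is repeated `hour_tolls(hour)` times.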
It’s been running for months now and has become a charming addition to my office. And the best part is that I can keep extending it whenever a new idea strikes.
I Discovered Copilot Audio Expression
The other day, while playing with Copilot, I noticed a link to Labs at the top of the screen. Clicking through, I found Copilot Audio Expression, described as “an experimental tool designed for effortless audio creation using Copilot’s latest voice‑generation models.” You simply type a word or phrase, and it “speaks” it for you — then you can download the result as an MP3.
A few quick prompts and some experimenting with the different settings sparked an idea: Why not make my Fruit Jam clock talk?
Everything old is new again...
Stringing together audio samples to form phrases is nothing new — people have been doing it for decades, long before digital audio. One of my earliest Learn Guide projects, the Titano Weather Station, used prerecorded samples as alarms (“time for bed”). But once you go beyond a handful of words, the challenge grows quickly. A talking clock or calendar needs nearly a hundred different clips: dates, day names, months, ordinals, and more. Gathering, recording, editing, and normalizing all of that becomes a real chore.
That’s where this new AI tool shines. I can generate clean, consistent audio samples for every word or phrase I need in just a few minutes.
So let’s build a Fruit Jam clock that can announce the time and date.
Step by Step
This article focuses specifically on adding the talking capability to my existing Fruit Jam clock project. The display, weather integration, IR remote navigation, and Westminster chimes are all already in place — and I can cover those in a future article if there’s interest. For now, we’ll walk through the audio‑generation workflow that makes the clock speak.
The overall process breaks down into four main steps:
1. Generate all the required words and phrases using Copilot Audio Expression
2. Convert the MP3 files to WAV format using FFmpeg
3. Organize the files on the SD card and create logical CircuitPython lists
4. Write functions that assemble and play the spoken phrases
Generating the samples
For my project, I needed the Fruit Jam to be able to speak:
- The time (hours, minutes, AM, PM, noon, midnight)
- The date (day name, month name, ordinal date, year)
- The quarter hours (“quarter past”, “half past”, “quarter til”)
- And as a bonus: holidays, special days, and fun extras
Copilot Audio Expression made this surprisingly easy.
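The phrase-assembly step (step 4) boils down to mapping a time to an ordered list of audio clips. Here is a minimal sketch; the clip basenames (numbers, "oh", "oclock", "am"/"pm", "noon", "midnight") are a hypothetical naming scheme, not the project's actual file names:

```python
def time_clips(hour_24, minute):
    """Ordered list of WAV basenames to speak a time, e.g. 19:05 -> 7 oh 5 pm."""
    if (hour_24, minute) == (0, 0):
        return ["midnight"]
    if (hour_24, minute) == (12, 0):
        return ["noon"]
    clips = [str(hour_24 % 12 or 12)]        # 12-hour clock hour clip
    if minute == 0:
        clips.append("oclock")
    elif minute < 10:
        clips += ["oh", str(minute)]         # "seven oh five"
    else:
        clips.append(str(minute))            # one clip per minute value
    return clips + ["am" if hour_24 < 12 else "pm"]

print(time_clips(19, 5))  # prints ['7', 'oh', '5', 'pm']
```

On the Fruit Jam, each basename would map to a WAV on the SD card and the list would be played back-to-back through the TLV320 DAC.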
Before we dive in, note that Copilot Audio Expression is part of Copilot AI Labs and is marked as experimental. It may change or disappear at any time. The site I used is: Copilot Audio Expression Website.
-
Christmas Tree & Star Lights with Node-Red
Introduction
We have a 1960's era aluminum Christmas tree. For the time that we've had it we've relied on newer and slightly less dangerous color wheels, then stage lighting. When we moved we didn't bring any of the lighting with us.
I decided to string the poles of the tree with LEDs.
The prior year I made a star topper which used a Bluefruit for lighting. This year I added 24 LEDs for the star that plug into the highest string on the poles.
Design
The current design is written in CircuitPython. For communication it relies on MQTT and Node-Red.
Everything worked great, however, after I worked on a different LED project that used WLED, I think I'll convert this project to WLED. I'm not going to delete this code base, I think it has some interesting implementation features and is worth preserving.
Infrastructure-wise, with either implementation you will need:
- MQTT - you can use Adafruit IO or run your own broker. In my implementation I set up a broker on a Raspberry Pi 4 Model B running Debian Bookworm and Mosquitto.
- Node-Red - my Node-Red journey is chronicled here.
Parts
Code
Tree & Star
The code can be found on Github
As I was coding this project, I found myself thinking I was going to be reusing a lot of it. So I repurposed an old project and replaced what was there with new helper classes.
I did run into an issue where I could not get the LED Animation library to play nicely with MQTT. I tried a few things, including asyncio, and ended up posting on the forums. mikeysklar came to the rescue and you'll see his contribution in the code.
The tree has 196 LEDs and the star has 24.
data = {
    'tree_animations': ['multi_chase'],
    'star_animations': ['rainbow_sparkle'],
    'num_pixels': 220,
    'tree_pixel_subset': [0, 195],
    'star_pixel_subset': [196, 24],
    ...
}
Libraries Needed
import json
import board
import time
import os
import neopixel
import adafruit_logging
import wifi
import supervisor
import adafruit_connection_manager
import adafruit_minimqtt.adafruit_minimqtt
from adafruit_led_animation.group import AnimationGroup
from adafruit_minimqtt.adafruit_minimqtt import MMQTTException
from adafruit_led_animation.sequence import AnimationSequence
from adafruit_led_animation.helper import PixelSubset
from circuitpy_helpers.led_animations import animationBuilder
from circuitpy_helpers.led_animations import controlLights
from circuitpy_helpers.led_animations import updateAnimationData
from circuitpy_helpers.file_helpers import updateFiles
from circuitpy_helpers.calendar_time_helpers import timeHelper
from circuitpy_helpers.network_helpers import wanChecker
Featured Highlights
The tree subscribes to the following MQTT feeds:
- tree.lights - listens for events from Node-Red to update animation, start time, stop time
- home.time - listens for events from home hub in order to know when to stop or start the lights
- home.sunset - listens for events from home hub in order to know current sunset, uses this for start time
In order to take events from tree.lights and actually change the running animation and colors, I needed to be able to do a couple of things:
- Have a list of supported animations - this is done via the animations.json file
- Be able to override the default values for animations
- Be able to on-the-fly update the data file to implement the new animation/colors choice
When the code first starts it will check if there are overrides for the chosen animation and do an in-line replacement of those values. It will then check to see what the color selection is. Color choice is specified in the animations.json file and can be:
- random - a random color will be selected
- data - in the data file, each animation has an entry for color. A string representation of the color can be provided, or the name of a color palette.
Once up and running, about every 30 seconds the animations will pause long enough for the code to query MQTT for any updates. If a new message has arrived, the code updates the data file with the appropriate information and performs a supervisor.reload(). You can also change the start and stop times, or disable them so that the lights run continually.
When using start and stop times, once the criteria for either is met, the code will light sleep until the corresponding alarm fires.
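The start/stop decision can be expressed as a small pure function. This is a sketch of the idea rather than the project's actual code; it handles a window that crosses midnight (e.g., start at sunset, stop at 1:00 AM):

```python
def lights_should_run(now_min, start_min, stop_min, enabled=True):
    """Whether the lights should be on at `now_min` (minutes since midnight).

    If the schedule is disabled, the lights run continually.
    Handles windows that cross midnight, e.g. start 17:00, stop 01:00.
    """
    if not enabled:
        return True
    if start_min <= stop_min:
        return start_min <= now_min < stop_min
    return now_min >= start_min or now_min < stop_min

# Start at 17:00 (sunset), stop at 01:00:
print(lights_should_run(18 * 60, 17 * 60, 60))  # prints True
print(lights_should_run(2 * 60, 17 * 60, 60))   # prints False
```

On device, the `start_min` value would come from the home.sunset MQTT feed, and a False result would trigger the light-sleep-until-alarm path.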
Case
I used OpenSCAD and the YAPP Box (Yet Another Parametric Project Box Generator) library from mrWheel for the Feather case. This library is well documented and has a lot of features.
I found a star on Thingiverse years ago and downloaded the SCAD file. I remixed it for my own needs. Unfortunately, I can no longer find the original Thing. If you recognize the code as yours, please let me know so I can give you proper credit.
-
Moon Miner Arcade Game for the Adafruit Fruit Jam
Moon Miner is a retro-styled arcade game reminiscent of the Lunar Lander arcade game from the 1980s. Your mission is to retrieve minerals from various moons around the solar system. It is a physics-based game, where you must navigate your rocket ship based on gravity and thrust to make a safe landing for retrieving minerals. Achieve a successful landing by reducing speed to near zero and keeping the lander upright on designated flat areas. It is available to play on the Adafruit Fruit Jam.
Features
- The game employs physics: you maneuver based on gravity and thrust, and each moon has different gravitational properties.
- A heads-up display shows mission flight data in real time.
- Complete the mission of collecting gems, and your best completion time is saved. Play again to beat your best time.
- There are several ways you can die. Make sure you land on a horizontal surface at a safe velocity (and watch out for volcanoes).
- Use fuel locations when running low on fuel; some locations can be revisited for additional fuel.
- Four missions are included.
- Written entirely in CircuitPython for the Adafruit Fruit Jam.
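The gravity-and-thrust mechanic the features list describes amounts to a small integration step each frame. This is a generic sketch of such a physics loop, not the game's actual code; the gravity value and landing thresholds are illustrative:

```python
import math

MOON_GRAVITY = 1.62  # m/s^2 for Earth's Moon; each moon in the game would differ

def step(state, thrust_accel, angle_rad, gravity, dt):
    """One Euler integration step. state = (x, y, vx, vy); angle 0 = straight up."""
    x, y, vx, vy = state
    ax = thrust_accel * math.sin(angle_rad)
    ay = thrust_accel * math.cos(angle_rad) - gravity
    vx += ax * dt
    vy += ay * dt
    return (x + vx * dt, y + vy * dt, vx, vy)

def safe_landing(vx, vy, tilt_rad, max_speed=1.0, max_tilt=0.1):
    """A landing counts as safe if it is slow enough and nearly upright."""
    return math.hypot(vx, vy) <= max_speed and abs(tilt_rad) <= max_tilt
```

With zero thrust the lander simply accelerates downward at the moon's gravity each step, which is why every descent becomes a fuel-versus-velocity trade-off.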
The full learn guide, including download and installation instructions, can be found at the author's web site: https://danthegeek.com/2025/12/28/moon-miner-learn-guide/
-
Rust on Adafruit RP2040 Feather ThinkInk (JD79661 E-Paper)
I wanted to share my experience experimenting with Rust on the Adafruit RP2040 Feather ThinkInk board, specifically using a bare E-Paper display.
Hardware:
- Adafruit RP2040 Feather ThinkInk (https://www.adafruit.com/product/5727)
- Bare EPD display (JD79661 chipset): https://www.adafruit.com/product/6373
Background
I went through all the official tutorials for the ThinkInk Feather: https://learn.adafruit.com/adafruit-rp2040-feather-thinkink
The provided examples are excellent, but they currently focus on CircuitPython and Arduino. Adafruit already provides solid drivers for the JD79661 chipset, and the display is also well supported in the Arduino ecosystem via the GxEPD2 library: https://github.com/ZinggJM/GxEPD2
This made me curious: can I do the same thing in Rust?
Rust on RP2040
The RP2040 has very good Rust support thanks to the rp-rs ecosystem. A great starting point is the RP2040 project template: https://github.com/rp-rs/rp2040-project-template
There is also an existing HAL board definition for the Adafruit Feather RP2040 here: https://github.com/rp-rs/rp-hal-boards/tree/main/boards/adafruit-feather-rp2040
While it’s not an exact match for the ThinkInk Feather, it’s close enough to serve as a solid base, with only minor adjustments needed.
Driver work
At the moment, there is no native Rust driver for the JD79661 E-Paper controller. To move forward, I implemented a custom Rust driver, based directly on the source code from Adafruit’s CircuitPython and Arduino implementations.
Once the low-level driver was in place, everything integrated nicely with the Embedded Graphics framework. Rendering text and images worked as expected.
As a proof of concept, I was able to:
- Convert a JPG image to BMP
- Render and display it correctly on the E-Paper display
Results & Code
All of my experiments and findings are documented here: https://github.com/melastmohican/adafruit-feather-thinkink-discovery
The repository includes:
- Project setup for Rust on the ThinkInk Feather
- A custom JD79661 driver
- Examples using Embedded Graphics
- Notes on differences between the standard Feather RP2040 and the ThinkInk variant
Closing thoughts
Overall, Rust works very well on the RP2040, and bringing up an E-Paper display was quite feasible even without an existing driver. Hopefully this helps others who want to explore Rust + E-Paper + Adafruit hardware.
-
NeoKey TOTP Token
This is a two-factor authentication token that generates TOTP login codes for up to four accounts. You select an account by pressing a key on the 4-key NeoKey keypad. The design is intended for desktop use in a safe location (wall power, no worries of physical tampering) where you still want to prevent secrets from leaking over the network due to misconfigured cloud-sync backup features or the like.
Design Goals and Features:
- Make the codes really easy to read and type, even in low light, by using a dimmable backlit TFT display with a relatively large font.
- Support 4 TOTP account slots (one for each key of a 4-key NeoKey keypad).
- The NeoPixel under the key for the currently selected account slot lights up. Pressing a different key switches the selected account. Pressing the selected key a second time puts the token in standby mode (backlight and NeoPixels off).
- Store secrets in an I2C EEPROM rather than in the CLUE board's flash. This makes it so the secrets aren't trivially accessible to a connected computer as USB mass storage files. This way, they won't get accidentally sucked into backups, and malware would have to work harder to access them.
- Set DS3231 RTC time from the USB serial console by opening the REPL, importing the util module, then calling util.set_time().
- Add and manage TOTP accounts in the EEPROM's database of account slots using similar REPL functions (import util, then util.menu()).
- Use the token fully airgapped after initial setup by powering it from a phone charger and reading codes off the TFT display.
Overview
-
LoRa Wireless Greenhouse Monitor
This project uses 900 MHz RFM95W LoRa FeatherWing modules to transmit temperature measurements from greenhouses and receive them at a base station about half a kilometer away. The base station hardware outputs sensor reports over USB serial, an optional 2x16 character LCD, and an optional ESP-NOW gateway.
The LoRa radio settings are tuned for extended battery life at a range of up to 500m in suburban or rural conditions (non line of sight with limited obstructions). With a fully charged 400 mAh LiPo battery and a 9 minute reporting interval, typical sensor runtime should be about 4 weeks (~22µA deep sleep current, ~2667ms of time per wake cycle, ~0.222 coulombs of charge per wake cycle).
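The quoted figures support a quick sanity check on the runtime claim. A back-of-the-envelope estimate (the 80% usable-capacity derating is my assumption, not a measured value):

```python
def runtime_days(capacity_mah, interval_min, wake_coulombs, sleep_ua,
                 usable_fraction=0.8):
    """Estimated sensor runtime in days from per-cycle charge figures."""
    capacity_c = capacity_mah / 1000 * 3600 * usable_fraction  # mAh -> coulombs
    wakes_per_day = 24 * 60 / interval_min
    per_day_c = wakes_per_day * wake_coulombs + sleep_ua * 1e-6 * 86400
    return capacity_c / per_day_c

# 400 mAh LiPo, 9-minute reports, 0.222 C per wake, 22 uA sleep:
days = runtime_days(400, 9, 0.222, 22)
print(round(days, 1))  # prints 30.8
```

About 31 days, which lines up with the quoted ~4 weeks; the wake-cycle charge dominates, with deep-sleep current contributing only about 2 coulombs per day.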
To optimize the transmitter for running on a small LiPo cell, I used a Nordic nRF-PPK2 power analyzer to tune the firmware. Some of the power saving tricks I used include reducing the cpu frequency, putting the RFM95W radio in sleep mode, putting the MAX17048 fuel gauge in hibernate mode, and using ESP32-S3 deep sleep. To compare the impact of different changes, I took extensive measurements using Nordic's Power Profiler app with the PPK2 connected to the Feather board's battery jack and GPIO outputs.
Transparency note: Adafruit provided some of the parts I used for this guide (Thanks Adafruit!).
Related Projects
For logging, charting, and IRC status notifications, check out my related projects:
Overview
-
LoRa Touchscreen Pager and Tracker for Private Off-Grid Messaging
About this Pager
You can build this touchscreen off-grid pager that uses LoRa and optionally GPS for location and tracking. This device lets you create (or join) a group of up to 90 devices in a private mesh network where you can send/receive messages, share location, and more.
You never know when you're going to need to send messages between nearby family or friends where cell phones don't work. You can also use these in combination with other devices (T-Deck, T-Beam, etc.) and figure out where different devices are, even without phones, internet, or data. That can be really handy if you're hiking or camping in a far-out location.
How it's Built
- I designed a very simple PCB that accepts a few Adafruit components and, when combined with specialized firmware, becomes an off-grid communication device ready to be dropped into a 3D-printed enclosure
- You can make the PCB (gerber files are included in this project) or order it from PCBWay
- The Adafruit parts are listed further down, but include Adafruit's RFM95W LoRa, 2.8" TFT Touchscreen with EYESPI & SD, Realtime Clock, FRAM, and a few other things
- After assembling these things, you'll need to flash it with firmware, which takes only a minute or so
Assembling the Pager
Here's a video demo of assembling the pager. Full detailed instructions are also available at my website:
-
Building a Sci-Fi Movie Prop
Overview
A local production company is working on filming the first of a three-part sci-fi movie and needed a piece of scientific equipment for a laboratory scene. The executive producer/director found an obsolete flow cytometer analyzer in a government surplus sale, winning the bid for US$12. The device had the potential to look like a working DNA synthesizer with the addition of lighting and a bit of animation.
In its day, the analyzer was a high-quality device, robustly built to provide exceptional mechanical stability for its sensitive optical components. It was therefore quite heavy in spite of its size, requiring at least two people to lift and position it, which increased the challenge of modifying it for use in the film. It was certainly not a typical theatrical prop made from foam and balsa wood.
I was tasked with installing color lighting to enhance the device’s operational appearance for its brief appearance on-screen. To achieve this, I devised a plan to incorporate several NeoPixel LED strips, which would be controlled by a CircuitPython-based microcontroller, such as the Adafruit M4 Express Feather. The multi-colored NeoPixel LEDs could be strategically positioned both within and outside the device, thereby providing ambient illumination and symbolizing various functions, including sample loading and the incubation process.
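An ambient "incubation" effect like the one described is often a smooth brightness pulse. Here is a generic sketch of such a curve (the period and brightness limits are illustrative, not the prop's actual values); on the Feather, the result would scale a NeoPixel strip's color each frame:

```python
import math

def breathe_brightness(t, period=4.0, low=0.05, high=1.0):
    """Smooth cosine 'breathing' brightness (low..high) at time t seconds."""
    phase = (1 - math.cos(2 * math.pi * t / period)) / 2  # 0..1 and back
    return low + (high - low) * phase
```

A cosine ramp avoids the abrupt steps a linear up/down sweep produces, which reads better on camera for a slow status-light effect.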
Given that the initial device employed industrial-grade servos (specifically, three IMS MDI-17 Drive Plus Motion Control motors) for sample positioning and operating the sample fluid “sipper” needle, there was a preliminary aspiration to incorporate robotic physical movements beyond the lighting sequence. However, this objective was deferred due to the imminent project deadline, so a short puppetry cable would likely be attached to the sample positioning cam to animate movement of the test tube rack.
-
Pyboom - A game for the Fruit Jam
Py-Boom is a fast-paced, 8-bit-style arcade game written in CircuitPython for the Adafruit Fruit Jam and other compatible display boards.
This game is a modern take on a classic "catcher" formula, featuring both a single-player mode against an AI and a competitive two-player versus mode.
Game Modes
At the title screen, you can select your game mode:
- 1-Player Mode: You control the Bucket (P1) at the bottom of the screen. An AI-controlled Bomber moves along the top, dropping bombs at an increasing rate. Your goal is to catch as many bombs as possible to survive the level.
- 2-Player Mode: Player 1 controls the Bucket, and Player 2 controls the Bomber. P1's goal is to survive, while P2's goal is to drop bombs strategically to make P1 miss.
How to Play
P1 (Bucket) - The Catcher
- Goal: Catch every bomb that is dropped. If you miss a bomb, you lose one of your buckets (lives). If you lose all three, the game is over.
- Winning: If you (P1) successfully catch all bombs in a level (e.g., 10 bombs in Level 1), you win the round and advance to the next, more difficult level.
P2 (Bomber) - The Attacker
- Goal: Make P1 miss! You have a limited number of bombs per level. Use your movement and timing to drop bombs where P1 isn't.
- Winning: If you (P2) successfully make P1 lose all three of their buckets, you win the game!
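The core catch/miss test the rules describe is a simple horizontal-overlap check when a bomb reaches bucket height. A sketch of the idea (the bucket width is an illustrative value, not taken from the game's code):

```python
def bomb_caught(bomb_x, bucket_x, bucket_width=16):
    """True if the falling bomb's x position overlaps the bucket's span."""
    half = bucket_width / 2
    return bucket_x - half <= bomb_x <= bucket_x + half

def apply_drop(bomb_x, bucket_x, score, lives):
    """Resolve one bomb landing: either score a catch or lose a bucket."""
    if bomb_caught(bomb_x, bucket_x):
        return score + 1, lives
    return score, lives - 1
```

In 2-player mode the Bomber's whole strategy is to place `bomb_x` outside that interval faster than P1 can move `bucket_x` under it.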
Controls

Action        | Player 1 (Bucket) | Player 2 (Bomber)
Move Left     | A key             | Left Arrow key
Move Right    | D key             | Right Arrow key
Drop Bomb     | N/A               | Down Arrow key
Start / Ready | Spacebar          | Enter key

Other Controls
- Select Mode: On the title screen, press the 1 or 2 key.
- Restart Game: On the "Game Over" screen, press the R key to return to the title screen.
Required Files
To run this game, you will need the following files on your CircuitPython device:
- code.py: The main game code.
- The Fruit Jam OS library package.
- pyboom.bmp: The title screen logo.
- bomb_icon.bmp: The bomb sprite icon (used in development).
Download at: Pyboom GitHub
Background
This project started in Microsoft MakeCode for my PyGamer and was called Prison Break. With the introduction of the Fruit Jam, I wanted to port it to CircuitPython. The graphics in MakeCode (MC) are saved in a TypeScript file, so I had to copy the sprite data over to my CircuitPython project. I used the AI tools that are part of Visual Studio Code (VS) to develop functions that map the sprite maps into bitmaps and tile grids. I continued to use the AI tools to help convert the Python code from MC. I mostly used Gemini, as I have three months of premium from purchasing my phone, though there were times when Gemini would get stuck fixing issues it was creating, so I would switch to the free tokens in VS and use Claude or ChatGPT. I ran out of free tokens in VS and moved back to Gemini for versions 2 and 3 of the game. I am in the process of uploading the prompts I still have access to (I lost my VS conversations :( ) and hope to have them done in the next week. I also hope to get controllers set up and maybe make paddle controllers in the future.
I found this a fun project for learning CircuitPython and coding with AI. I'm still learning the concepts of using classes, and I learned a lot while looking at the errors the AI was coming up with.
-