Everything I’ve built is open source and lives on my GitHub. Here’s a rundown of the major pieces of the software stack that makes the creatures work.
Creature Server
The heart of the whole system. A soft real-time C++ application that runs on Linux and manages everything — the event loop that drives creature animations, voice generation via ElevenLabs, lip sync data creation, and speech-to-text. All of the creatures on the network talk to a single server instance.
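As a rough sketch of the soft real-time part, the core of a loop like this ticks at a fixed frame rate and sleeps until the next deadline rather than for a fixed duration, so jitter in one frame's work doesn't accumulate. The 20 ms period here is my assumption for illustration, not the server's actual rate:

```cpp
#include <cassert>
#include <chrono>
#include <thread>

// Hypothetical frame period -- the real server's rate isn't stated here.
constexpr auto kFramePeriod = std::chrono::milliseconds(20); // 50 fps

// Runs the loop for a fixed number of frames (the real loop runs forever)
// and returns how many frames were rendered.
int runEventLoop(int frames) {
    using clock = std::chrono::steady_clock;
    auto nextTick = clock::now();
    int rendered = 0;
    for (int i = 0; i < frames; ++i) {
        // ... advance animations, mix in lip sync data, emit DMX frames ...
        ++rendered;
        // sleep_until (not sleep_for) keeps the period stable even when a
        // frame's work takes a variable amount of time -- the "soft
        // real-time" part.
        nextTick += kFramePeriod;
        std::this_thread::sleep_until(nextTick);
    }
    return rendered;
}
```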
Creature Console
The front end for controlling creatures, written in Swift using SwiftUI. It runs on macOS and iOS, and it’s where I create and edit animations, manage playlists, and trigger ad-hoc dialog on the fly. There’s also a handy menu bar mini console for quick access during Zoom calls.
Creature CLI
A command-line Swiss Army knife for debugging and managing the Creature Server. Built in Swift alongside the Console, it can play animations, monitor WebSocket traffic, generate lip sync data, check server metrics, and more. Faster than clicking through a GUI when I just need to test something quickly.
Creature Controller
The software that runs on the Linux host connected to each creature’s controller board. It handles the serial communication between the host and the RP2040 microcontroller inside the creature, translating DMX frames into servo positions.
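The DMX-to-servo translation is, at its core, a linear mapping from an 8-bit DMX slot value onto a pulse width. A minimal sketch, assuming the typical 1000–2000 µs hobby-servo range (the real controller presumably clamps to per-servo limits configured elsewhere):

```cpp
#include <cassert>
#include <cstdint>

// Typical hobby-servo pulse range in microseconds; assumed here for
// illustration -- real servos get per-joint limits to avoid binding.
constexpr uint32_t kMinPulseUs = 1000;
constexpr uint32_t kMaxPulseUs = 2000;

// Map one 8-bit DMX slot (0-255) onto a servo pulse width in microseconds.
uint32_t dmxToPulseUs(uint8_t dmxValue) {
    return kMinPulseUs +
           (static_cast<uint32_t>(dmxValue) * (kMaxPulseUs - kMinPulseUs)) / 255;
}
```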
Creature Listener
A C++ application that runs on a Raspberry Pi 5 near each creature, giving them the ability to listen and respond conversationally. It handles wake word detection, speech-to-text, local LLM inference, and coordinates with the Creature Server for text-to-speech and animated playback. It can even query Home Assistant to answer questions about the house.
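The conversational pipeline described above can be pictured as a small state machine. The states and events here are my guesses at the stages involved, not the Listener's actual internals:

```cpp
#include <cassert>

// A sketch of the listener's conversational loop as a state machine.
enum class State { WaitingForWakeWord, Recording, Inferring, Speaking };
enum class Event { WakeWordHeard, UtteranceEnded, ReplyReady, PlaybackDone };

State next(State s, Event e) {
    switch (s) {
        case State::WaitingForWakeWord:
            return e == Event::WakeWordHeard ? State::Recording : s;
        case State::Recording:
            // Speech-to-text runs on the captured audio, then the local LLM.
            return e == Event::UtteranceEnded ? State::Inferring : s;
        case State::Inferring:
            // The reply goes to the Creature Server for TTS and animated playback.
            return e == Event::ReplyReady ? State::Speaking : s;
        case State::Speaking:
            return e == Event::PlaybackDone ? State::WaitingForWakeWord : s;
    }
    return s; // unreachable; keeps some compilers happy
}
```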
Creature Agent
A Swift service that listens to MQTT events from Home Assistant and uses an LLM to generate contextual, in-character spoken responses. When something happens in the house — a door opens, motion is detected — the agent makes the creature react to it on the fly.
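The interesting step is turning a raw state change into something an LLM can react to in character. A hedged sketch of that prompt-building step (the entity names and prompt wording are invented for illustration, not taken from the actual agent):

```cpp
#include <cassert>
#include <string>

// Hypothetical prompt builder: the real agent's system prompt and the
// Home Assistant entity naming are assumptions here.
std::string buildPrompt(const std::string& entity, const std::string& newState) {
    return "You are a small animatronic creature. React out loud, in character, "
           "in one or two sentences, to this event: " +
           entity + " changed to " + newState + ".";
}
```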
Creature MQTT
The bridge between the creature network and Home Assistant. It connects to the Creature Server’s WebSocket and republishes all the real-time sensor data, animation state, and system health as clean MQTT topics. This is what makes creature state visible to the rest of the home automation system.
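The heart of a bridge like this is a mapping from WebSocket message types onto a predictable MQTT topic hierarchy. A minimal sketch, assuming a `creatures/<id>/<kind>` scheme (the real bridge's topic layout isn't shown here):

```cpp
#include <cassert>
#include <string>

// Hypothetical topic scheme -- a stable hierarchy lets Home Assistant
// subscribe with wildcards like "creatures/+/health".
std::string mqttTopic(const std::string& creatureId, const std::string& kind) {
    return "creatures/" + creatureId + "/" + kind;
}
```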
Network Communication
A deep dive into how all of these pieces talk to each other — from the RESTful API and WebSocket between the console and server, to the E1.31 (sACN) protocol used to send DMX frames to creatures over the network.
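One concrete detail from the E1.31 spec worth knowing up front: each DMX universe is sent to a well-known multicast group derived from the universe number — `239.255.<universe high byte>.<universe low byte>` on UDP port 5568 — so receivers can join only the universes they care about:

```cpp
#include <cassert>
#include <cstdint>
#include <cstdio>
#include <string>

// Per ANSI E1.31, universe N maps to multicast group 239.255.hi.lo,
// where hi/lo are the high and low bytes of N, on UDP port 5568.
constexpr uint16_t kSacnPort = 5568;

std::string sacnMulticastAddress(uint16_t universe) {
    char buf[16];
    std::snprintf(buf, sizeof(buf), "239.255.%u.%u",
                  static_cast<unsigned>((universe >> 8) & 0xFF),
                  static_cast<unsigned>(universe & 0xFF));
    return buf;
}
```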