v0.3.0 — playground edition

A playground for elizaOS agent experiments.

A macOS menu-bar app that wraps an elizaOS AgentRuntime. The way I choose to run an eliza stack on my own machine — a tray-app sandbox where I cash in on random intrusive thoughts about agents. "What if the inbox prompted the agent through the real reply pipeline?" "What if I bundled llama.cpp for free local embeddings?" "What if the tray icon told me whether the agent was alive?" Expect it to break often. That's the whole point of having a sandbox separate from eliza and Milady.

install — macos arm64
$ curl -fsSL https://raw.githubusercontent.com/Dexploarer/detour/main/scripts/install.sh | bash

Apple Silicon, macOS 14+. The installer downloads the latest stable release from GitHub Releases, strips the macOS quarantine flag, and drops Detour.app into /Applications. For the rolling canary, append -s -- canary.

Where Detour fits.

There's a hierarchy here. Pick the one that matches what you're trying to do, not the one with the prettiest landing page.

For elizaOS v3 builders

Building on elizaOS v3? Come hang out.

If you're learning the framework, shipping plugins, or experimenting with agents on top of elizaOS v3, the warmest place to do it is the Cozy Devs Discord. Calm, technical, no shilling — just builders sharing what works and what's broken.

Join Cozy Devs on Discord: discord.gg/BfTrruWcZ
Follow @detour_squirrel: x.com/detour_squirrel
Optional · tip jar

Fueling a playground built by an ADHD dev.

Detour exists because I can't stop poking at agents in public. If watching that has been useful — or if you just want to help keep an OSS eliza-stack experiment alive on someone's tray icon — there's a Solana token that funds the time I get to spend on it. Buying or holding it is entirely optional and changes nothing about Detour itself.

This is not a competitor to anything in the elizaOS or Milady ecosystems. Those are the canon. This is just a small pool that lets one developer keep shipping breakage in public on top of the framework — which, when it works, feeds back into the OSS that everyone else builds on. The goal is the same goal eliza has always had: the best possible open-source agent app, framework, and community on web3 and the rest of the internet. This token's job is simply to help keep one more person paying attention to that.

$cozy on solana DijmsEDeTXsWCkCLkhYJNTutKaHf541xZshVrCUbcozy

Not financial advice. No promises, no roadmap, no airdrop, no utility claims. It's literally a tip jar attached to a developer who likes building OSS in public. Always verify the contract address before sending anything anywhere.

What's in the box.

The slice of capabilities I'm actively running today. Subject to rearrangement at the speed of curiosity.

Chat with Codex/ChatGPT

Tray popup talks to your existing ChatGPT subscription via the system Codex CLI. No API key, no re-auth. Auto-detects from ~/.codex/auth.json.
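The auto-detect is simple in spirit: look for the Codex CLI's saved credentials instead of asking for a key. A minimal sketch of that check — the `~/.codex/auth.json` path comes from the description above, but the helper names and the idea of reading specific JSON fields are assumptions of this sketch, not Detour's actual code:

```typescript
import { existsSync } from "node:fs";
import { homedir } from "node:os";
import { join } from "node:path";

// Where the Codex CLI keeps its auth state (path per the description above).
export function codexAuthPath(home: string = homedir()): string {
  return join(home, ".codex", "auth.json");
}

// If this file exists, the user is already signed in via the Codex CLI,
// so no separate API key or re-auth is needed.
export function hasCodexAuth(home?: string): boolean {
  return existsSync(codexAuthPath(home));
}
```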

Local embeddings, bundled

llama.cpp server ships inside the .app. bge-small-en-v1.5, 384-dim, ~30 ms per embedding. Auto-downloads the model on first launch. No network after that. No API key. Free.
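Since the plugin is an OpenAI-compat client pointed at the local llama-server, you can picture the round trip like this. The port number is an assumption (Detour wires the endpoint internally); the `/v1/embeddings` request shape is the standard OpenAI-compatible one that llama.cpp's server speaks:

```typescript
// Hypothetical local port — Detour spawns and addresses llama-server itself.
const LLAMA_SERVER = "http://127.0.0.1:8080";

// Ask the bundled llama-server for an embedding (384-dim for bge-small-en-v1.5).
export async function embed(text: string): Promise<number[]> {
  const res = await fetch(`${LLAMA_SERVER}/v1/embeddings`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ input: text, model: "bge-small-en-v1.5" }),
  });
  const json = await res.json();
  return json.data[0].embedding;
}

// Cosine similarity — the usual way to rank stored memories against a query.
export function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}
```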

Unified channel feed

Every inbound + outbound message across Discord, Telegram, iMessage, and the in-app chat lands in one timeline. Recorded by a ChannelGateway that hooks elizaOS's MESSAGE_RECEIVED / MESSAGE_SENT events.
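The gateway idea reduces to one append-only timeline fed by both directions. A sketch under assumptions — the event names MESSAGE_RECEIVED / MESSAGE_SENT are elizaOS's, but the record shape and this class's API are invented for illustration:

```typescript
type Direction = "in" | "out"; // in = MESSAGE_RECEIVED, out = MESSAGE_SENT

interface TimelineEntry {
  at: number;        // unix ms
  channel: string;   // e.g. "discord" | "telegram" | "imessage" | "app"
  direction: Direction;
  text: string;
}

export class ChannelGateway {
  private timeline: TimelineEntry[] = [];

  // Called from the elizaOS event hooks, one call per message either way.
  record(channel: string, direction: Direction, text: string): void {
    this.timeline.push({ at: Date.now(), channel, direction, text });
  }

  // The unified feed: every message, every channel, oldest first.
  feed(): readonly TimelineEntry[] {
    return this.timeline;
  }
}
```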

Inbox that drives the agent

Programmatic notifications go through the real eliza pipeline — messageService.handleMessage → planner → action → REPLY. Captures the agent's reply text and back-annotates the inbox item.
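The back-annotation step can be sketched as: push the notification through the reply pipeline, await whatever REPLY comes out, attach it to the item. `messageService.handleMessage` is named above; its signature here, and the `InboxItem` shape, are assumptions of this sketch:

```typescript
interface InboxItem {
  id: string;
  prompt: string;
  agentReply?: string; // filled in once the pipeline produces a REPLY
}

// Stand-in for the real pipeline entry point (signature assumed).
type MessageService = {
  handleMessage(text: string): Promise<{ replyText: string }>;
};

// Drive the agent with an inbox item, then back-annotate it with the reply.
export async function deliver(
  item: InboxItem,
  svc: MessageService,
): Promise<InboxItem> {
  const { replyText } = await svc.handleMessage(item.prompt);
  return { ...item, agentReply: replyText };
}
```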

Pensieve memory browser

Obsidian-style browser for the agent's memories, relationships, and the cross-cutting graph. Vector + substring search using the local embeddings.

Activity inspector

Trajectories with the full LLM call sequence (prompt + response + duration), live runtime introspection, raw PGlite query console, log filtering. Watch your agent think.

Live activity.

What I'm shipping, in real time, across Dexploarer's repos. Pulled directly from the GitHub API on page load. No analytics, no proxy — your browser → api.github.com.
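The fetch itself is nothing exotic — the browser hits GitHub's public-events route directly. The endpoint below is the real public GitHub REST route; which fields the page actually renders is an assumption here:

```typescript
// Public events for a user — no auth, no proxy, straight from the browser.
export function eventsUrl(user: string): string {
  return `https://api.github.com/users/${encodeURIComponent(user)}/events/public`;
}

export async function recentActivity(user: string): Promise<string[]> {
  const res = await fetch(eventsUrl(user), {
    headers: { Accept: "application/vnd.github+json" },
  });
  if (!res.ok) throw new Error(`GitHub API ${res.status}`);
  const events: Array<{ type: string; repo: { name: string }; created_at: string }> =
    await res.json();
  return events.map((e) => `${e.created_at} ${e.type} → ${e.repo.name}`);
}
```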


The stack.

Single .app bundle. No Electron, no Chromium. Bun runtime in-process, Electrobun for the native shell, llama.cpp for embeddings, PGlite for memory.

Detour.app
├── Electrobun launcher                  native macOS shell, single .app artifact
├── Bun runtime                          in-process API + WebSocket on :2138
│   ├── @detour/core                     composition root, vault, channels, gateway, inbox, llama
│   ├── @detour/plugin-codex-chatgpt     chat via chatgpt.com/backend-api/codex
│   ├── @detour/plugin-embedding-openai  OpenAI-compat client → local llama-server
│   ├── @detour/plugin-pensieve-tools    PENSIEVE_* actions for the agent
│   ├── @detour/plugin-vault-tools       VAULT_* / LOGIN_* actions
│   └── llama-server                     llama.cpp embedding endpoint, lazy-spawned
└── React UI                             Vite-built; views:// in prod, localhost:5180 in dev
    ├── Chat popup
    ├── Pensieve                         Inbox · Channel feed · Memories · Relationships · Graph
    ├── Activity                         Trajectories · Logs · Tasks · Plugins · Runtime
    ├── Channels                         Discord · Telegram · iMessage
    └── Settings                         Configuration · Vault · Local AI
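For a feel of how the UI half talks to the Bun half: the port :2138 is stated above, but the message framing below is purely an assumption of this sketch, not Detour's actual wire protocol:

```typescript
// Build a JSON message frame (shape assumed for illustration).
export function frame(type: string, payload: Record<string, unknown> = {}): string {
  return JSON.stringify({ type, ...payload });
}

// Browser-side usage — the React UI runs where WebSocket is a global.
export function connectRuntime(onMessage: (data: string) => void): WebSocket {
  const ws = new WebSocket("ws://127.0.0.1:2138");
  ws.onopen = () => ws.send(frame("ping"));
  ws.onmessage = (ev) => onMessage(String(ev.data));
  return ws;
}
```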