A macOS menu-bar app that wraps an elizaOS AgentRuntime. The way I choose to run an eliza stack on my own machine — a tray-app sandbox where I cash in on random intrusive thoughts about agents. "What if the inbox prompted the agent through the real reply pipeline?" "What if I bundled llama.cpp for free local embeddings?" "What if the tray icon told me whether the agent was alive?" Expect it to break often. That's the whole point of having a sandbox separate from eliza and Milady.
curl -fsSL https://raw.githubusercontent.com/Dexploarer/detour/main/scripts/install.sh | bash
Apple Silicon, macOS 14+. The installer downloads the latest stable release from GitHub Releases, strips the macOS quarantine flag, and drops Detour.app into /Applications. For the rolling canary, append -s -- canary.
There's a hierarchy here. Pick the one that matches what you're trying to do, not the one with the prettiest landing page.
If you're learning the framework, shipping plugins, or experimenting with agents on top of elizaOS v3, the warmest place to do it is the Cozy Devs Discord. Calm, technical, no shilling — just builders sharing what works and what's broken.
Detour exists because I can't stop poking at agents in public. If watching that has been useful — or if you just want to help keep an OSS eliza-stack experiment alive on someone's tray icon — there's a Solana token that funds the time I get to spend on it. Buying or holding it is entirely optional and changes nothing about Detour itself.
This is not a competitor to anything in the elizaOS or Milady ecosystems. Those are the canon. This is just a small pool that lets one developer keep shipping breakage in public on top of the framework — which, when it works, feeds back into the OSS that everyone else builds on. The goal is the same goal eliza has always had: the best possible open-source agent app, framework, and community on web3 and the rest of the internet. This token's job is simply to help keep one more person paying attention to that.
DijmsEDeTXsWCkCLkhYJNTutKaHf541xZshVrCUbcozy
Not financial advice. No promises, no roadmap, no airdrop, no utility claims. It's literally a tip jar attached to a developer who likes building OSS in public. Always verify the contract address before sending anything anywhere.
The slice of capabilities I'm actively running today. Subject to rearrangement at the speed of curiosity.
Tray popup talks to your existing ChatGPT subscription via the system Codex CLI. No API key, no re-auth. Auto-detects from ~/.codex/auth.json.
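The auto-detection can be sketched as a small filesystem probe. This is illustrative only: the path ~/.codex/auth.json comes from the description above, but the file's internal shape is an assumption, not a documented Codex CLI contract.

```typescript
import { existsSync, readFileSync } from "node:fs";
import { homedir } from "node:os";
import { join } from "node:path";

// Result of probing for an existing Codex CLI login.
export interface CodexAuthProbe {
  available: boolean;
  reason?: string;
}

export function detectCodexAuth(
  authPath: string = join(homedir(), ".codex", "auth.json"),
): CodexAuthProbe {
  if (!existsSync(authPath)) {
    return { available: false, reason: "auth.json not found" };
  }
  try {
    const parsed = JSON.parse(readFileSync(authPath, "utf8"));
    // Assumption: any non-empty JSON object counts as "logged in";
    // the CLI itself owns the token details and refresh flow.
    const ok = parsed && typeof parsed === "object" && Object.keys(parsed).length > 0;
    return ok ? { available: true } : { available: false, reason: "empty auth file" };
  } catch {
    return { available: false, reason: "unreadable auth.json" };
  }
}
```

Probing rather than parsing token fields keeps the app decoupled from whatever the CLI stores there.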
llama.cpp server ships inside the .app. bge-small-en-v1.5, 384-dim, ~30 ms per embedding. Auto-downloads the model on first launch. No network after that. No API key. Free.
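A client for that bundled server might look like the sketch below. The port is a placeholder (the real in-app port isn't stated), and the OpenAI-style /v1/embeddings request shape is an assumption about how the bundled llama.cpp server is configured; the cosine helper is how 384-dim vectors like these are typically compared.

```typescript
// Placeholder endpoint for the bundled llama.cpp server; the real port is not documented here.
const EMBED_URL = "http://127.0.0.1:8033/v1/embeddings";

// Request one embedding for a piece of text (never called at import time).
export async function embed(text: string): Promise<number[]> {
  const res = await fetch(EMBED_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ input: text }),
  });
  const json = await res.json();
  return json.data[0].embedding; // 384 floats for bge-small-en-v1.5
}

// Cosine similarity: the usual ranking metric over embedding vectors.
export function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}
```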
Every inbound + outbound message across Discord, Telegram, iMessage, and the in-app chat lands in one timeline. Recorded by a ChannelGateway that hooks elizaOS's MESSAGE_RECEIVED / MESSAGE_SENT events.
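In miniature, that recording pattern looks like the sketch below. The event names MESSAGE_RECEIVED / MESSAGE_SENT come from the description above; the subscription API is stood in for with Node's EventEmitter, since elizaOS's actual runtime bus may expose a different interface.

```typescript
import { EventEmitter } from "node:events";

export interface TimelineItem {
  direction: "in" | "out";
  channel: string; // e.g. "discord" | "telegram" | "imessage" | "app"
  text: string;
  at: number;
}

// Minimal stand-in for the gateway: subscribe once, append everything to one timeline.
export class ChannelGateway {
  readonly timeline: TimelineItem[] = [];

  constructor(bus: EventEmitter) {
    bus.on("MESSAGE_RECEIVED", (m: { channel: string; text: string }) =>
      this.record("in", m),
    );
    bus.on("MESSAGE_SENT", (m: { channel: string; text: string }) =>
      this.record("out", m),
    );
  }

  private record(direction: "in" | "out", m: { channel: string; text: string }) {
    this.timeline.push({ direction, channel: m.channel, text: m.text, at: Date.now() });
  }
}
```

Because both events land in the same array, "one timeline across every channel" falls out for free.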
Programmatic notifications go through the real eliza pipeline — messageService.handleMessage → planner → action → REPLY. Captures the agent's reply text and back-annotates the inbox item.
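The capture-and-annotate step can be sketched like this. Every name here is illustrative: the real messageService.handleMessage signature and inbox schema are not shown in this document, so the pipeline is injected as a plain async function.

```typescript
interface InboxItem {
  id: string;
  text: string;
  agentReply?: string;
}

// Stand-in for the real pipeline entry point (messageService.handleMessage).
type HandleMessage = (msg: { text: string }) => Promise<{ reply: string }>;

export async function notifyThroughPipeline(
  item: InboxItem,
  handleMessage: HandleMessage,
): Promise<InboxItem> {
  // Route the notification through the same path a user message takes
  // (handleMessage -> planner -> action -> REPLY)...
  const { reply } = await handleMessage({ text: item.text });
  // ...then back-annotate the inbox item with what the agent actually said.
  return { ...item, agentReply: reply };
}
```

The point of the shape: notifications are not a side channel; they produce the same reply artifacts as any other message, which is what makes back-annotation possible.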
Obsidian-style browser for the agent's memories, relationships, and the cross-cutting graph. Vector + substring search using the local embeddings.
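A plausible shape for that hybrid search: exact substring hits first (cheap, precise), then the remainder ranked by vector similarity. The Memory shape and ranking policy are assumptions for illustration, not Detour's actual query path.

```typescript
export interface Memory {
  id: string;
  text: string;
  vector: number[]; // local embedding for this memory
}

function dot(a: number[], b: number[]): number {
  return a.reduce((s, v, i) => s + v * b[i], 0);
}

// Substring matches rank first; everything else is ordered by vector similarity.
export function searchMemories(
  memories: Memory[],
  query: string,
  queryVector: number[],
  limit = 5,
): Memory[] {
  const q = query.toLowerCase();
  const literal = memories.filter((m) => m.text.toLowerCase().includes(q));
  const semantic = memories
    .filter((m) => !literal.includes(m))
    .sort((a, b) => dot(b.vector, queryVector) - dot(a.vector, queryVector));
  return [...literal, ...semantic].slice(0, limit);
}
```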
Trajectories with the full LLM call sequence (prompt + response + duration), live runtime introspection, raw PGlite query console, log filtering. Watch your agent think.
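Recording prompt + response + duration per call is a classic wrapper; a minimal sketch under the assumption that the model call is a plain prompt-to-string function (the real trajectory schema is richer than this):

```typescript
export interface TrajectoryStep {
  prompt: string;
  response: string;
  durationMs: number;
}

// Wrap any prompt -> response function so every call lands in a trajectory log.
export function traced(
  call: (prompt: string) => Promise<string>,
  log: TrajectoryStep[],
): (prompt: string) => Promise<string> {
  return async (prompt: string): Promise<string> => {
    const t0 = performance.now();
    const response = await call(prompt);
    log.push({ prompt, response, durationMs: performance.now() - t0 });
    return response;
  };
}
```

Because the wrapper is transparent, the agent's behavior is unchanged; the log is a pure side effect you can inspect afterward.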
What I'm shipping, in real time, across Dexploarer's repos. Pulled directly from the GitHub API on page load. No analytics, no proxy — your browser → api.github.com.
Single .app bundle. No Electron, no Chromium. Bun runtime in-process, Electrobun for the native shell, llama.cpp for embeddings, PGlite for memory.