Big Bro
"I'll handle it. Step off the line — you're gonna get hit."
Two AI coding agents wrapped in one terminal. Big Bro runs the hot station — read, write, execute, fire tickets. Lil Bro handles prep — reviews the pass, checks the recipe, calls out when something's off. Grandpa keeps the recipe book. No API keys, no cloud bill — it all runs on your hardware via Ollama.
"I'll handle it. Step off the line — you're gonna get hit."
"Heard — picking up line two. That sauce looks broken, chef."
Both bros share one local reference library — stitched together from your project, Grandpa's own notes, and whatever you file in. Keyword search catches exact hits, semantic search catches the rest. If one method misses, the other picks it up.
- Syntax, APIs, repo patterns, code Grandpa trusts.
- How to think through a problem. When to bail, when to push.
Four tool errors in a row trigger a sibling warning. At five, the bro stops and asks for help — no silent burnt pans.
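The escalation logic above can be pictured as a simple streak counter. This is an illustrative sketch, not LIL-BRO's actual code — only the thresholds (four errors warn the sibling, five stop the bro) come from the text:

```python
# Hedged sketch of the error-streak escalation: any tool success resets
# the streak; four consecutive failures warn the sibling, five halt.
WARN_AT, STOP_AT = 4, 5

class ErrorStreak:
    def __init__(self):
        self.count = 0

    def record(self, tool_ok: bool) -> str:
        """Return the action after a tool call: 'continue', 'warn', or 'stop'."""
        if tool_ok:
            self.count = 0          # a success resets the streak
            return "continue"
        self.count += 1
        if self.count >= STOP_AT:
            return "stop"           # halt and ask for help -- no silent burnt pans
        if self.count >= WARN_AT:
            return "warn"           # ping the sibling bro
        return "continue"
```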
A passive ticket rail — each bro tracks what the other is cooking without actually talking about it.
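One way to picture the ticket rail — an illustrative sketch, not the project's implementation: each bro posts its current ticket to a shared board, and the other reads the board instead of exchanging messages. All names here are hypothetical:

```python
# Sketch of a passive ticket rail: a shared JSON board where each bro
# writes only its own station and passively reads the other's.
import json
from pathlib import Path

class TicketRail:
    def __init__(self, path: Path):
        self.path = path
        if not path.exists():
            path.write_text("{}")

    def post(self, bro: str, ticket: str):
        """A bro updates its own ticket; it never writes the other's."""
        board = json.loads(self.path.read_text())
        board[bro] = ticket
        self.path.write_text(json.dumps(board))

    def peek(self, other: str):
        """Passive read: see what the other bro is cooking, no chatter."""
        return json.loads(self.path.read_text()).get(other)
```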
Flip the toggle and Lil Bro is cleared for write access — both bros coding the same project at the same time.
Working-status indicators, competitive banter, honest error reporting. They'll tell you when they're stuck.
Grandpa runs keyword + semantic in parallel. Miss on one, catch on the other — context shows up either way.
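The dual-retrieval idea can be sketched in a few lines. This is a toy illustration, not Grandpa's actual retrieval code: keyword lookup catches literal token matches, a similarity score stands in for the semantic side (a real setup would call an embedding model), and the union means a miss on one side can still be caught by the other:

```python
# Minimal sketch of parallel keyword + semantic retrieval over shared notes.
from collections import Counter
import math

NOTES = [
    "use pathlib.Path for file paths",
    "retry the request with exponential backoff",
    "prefer dataclasses over bare dicts",
]

def keyword_hits(query, notes):
    """Exact-token overlap: catches literal matches."""
    q = set(query.lower().split())
    return {n for n in notes if q & set(n.lower().split())}

def embed(text):
    """Toy bag-of-words vector; a real setup would use an embedding model."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def semantic_hits(query, notes, threshold=0.2):
    qv = embed(query)
    return {n for n in notes if cosine(qv, embed(n)) >= threshold}

def hybrid_search(query, notes):
    # Union of both methods: if one misses, the other picks it up.
    return keyword_hits(query, notes) | semantic_hits(query, notes)
```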
Read, edit, search, execute, calculate. Lil Bro gets the same set read-only until you drop the bunk.
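The read-only gate implied above could look something like this — a hedged sketch under assumed names, not the project's tool layer: both bros see the same tool set, but mutating tools raise for Lil Bro until the toggle clears him:

```python
# Sketch of gating one agent's tool set to read-only until a toggle flips.
READ_ONLY_TOOLS = {"read", "search", "calculate"}

class ToolGate:
    def __init__(self, can_write: bool = False):
        self.can_write = can_write

    def call(self, tool: str, fn, *args, **kwargs):
        """Run a tool, refusing write-class tools unless cleared."""
        if tool not in READ_ONLY_TOOLS and not self.can_write:
            raise PermissionError(f"{tool!r} is write-gated; flip the toggle first")
        return fn(*args, **kwargs)

lil_bro = ToolGate()                  # prep station: read-only
big_bro = ToolGate(can_write=True)    # hot station: full access
```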
Install Ollama from ollama.ai, then pull a coder model and set up LIL-BRO:

```shell
ollama pull qwen2.5-coder:7b
git clone https://github.com/StoveGodCooks/LIL-BRO
cd LIL-BRO
pip install -e .
python -m lilbro
```

Edit ~/.lilbro/config.yaml to pick models & colors.

Big Bro stays on the hot line — your dev machine. Lil Bro gets sent to the expo window: a lightweight remote client on a phone or second device. Same two-bro setup, split across stations, reporting back over the wire.