Zerostack – A Unix-inspired coding agent written in pure Rust (crates.io)
khimaros 7 minutes ago [-]
I built something with a similar philosophy here: https://github.com/khimaros/airun -- it is intended to be piped and redirected. It discovers skills, AGENTS and prompt templates from Claude Code, Pi.dev, OpenCode and others. No TUI, but it does have a basic tool-calling loop:

$ airun -q -p 'output a shell command for linux to display the current time. output only the command with no other code fencing or prose' | airun -q -s 'review the provided shell command, determine if it is safe, run it only if it is safe, and then summarize the output from the command' --permissions-allow='bash:date *'

360MustangScope 17 minutes ago [-]
Funny this comes out today. I was just about to start writing one in Rust. It's amazing having opencode slowly leak memory, balloon to 6 GB on a large project, and then get slower and slower.

Will check this out! Seems cool!

throwa356262 49 minutes ago [-]
"RAM footprint: ~8MB on an empty session, ~12MB when working"

I like this. Claude Code uses multiple gigabytes, which is really annoying on low-end laptops.

tecoholic 43 minutes ago [-]
Yes. Just this fact is going to make a lot of people try it out.
marknutter 40 minutes ago [-]
Isn't that because of the context window size?
gidellav 26 minutes ago [-]
Hi, I'm the developer of zerostack! No, the memory footprint is not because of the context window size: in my benchmarks, with a 128k context loaded, it jumped from 8MB (without any chat/context loaded) to 11MB.

The reasons for zerostack's small memory footprint are:

- Rust, and not JS/Python, so no interpreters/VMs on top

- Load-as-needed, so we only allocate things like LLM connectors when needed

- `smallvec` used for most of the array usage of the tool (up to N items are stored on the stack)

- `compactstring` used for most of the string usage of the tool (up to N chars are stored on the stack)

- `opt-level=z` to force LLVM to optimize for binary size rather than performance (even though we still beat opencode in both TTFT and tool-use time)

- heavy usage of [LTO](https://en.wikipedia.org/wiki/Interprocedural_optimization#W...)
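For readers unfamiliar with these build settings, here is a minimal sketch of what such a size-focused release profile can look like in a `Cargo.toml`. This is a hypothetical illustration, not zerostack's actual configuration; only `opt-level = "z"` and LTO are mentioned above, and the other keys are common companions to them:

```toml
# Hypothetical release profile illustrating the settings described above;
# zerostack's actual Cargo.toml may differ.
[profile.release]
opt-level = "z"    # optimize for binary size rather than speed
lto = "fat"        # whole-program link-time optimization across all crates
codegen-units = 1  # single codegen unit gives LTO more room to inline/merge
strip = true       # drop debug symbols from the final binary
```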

SwellJoe 22 minutes ago [-]
The context window is not on your system. It's on the server with the model. There may be some local prompt caching of some sort, but you're not locally hosting the context unless you're also locally hosting the model.
SatvikBeri 34 minutes ago [-]
The context window has nothing to do with RAM usage, and even if it did, a million tokens of context is maybe 5 MB.
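As a rough back-of-the-envelope check on that figure (assuming roughly 4 bytes of UTF-8 text per token, a common average for English prose; the exact ratio varies by tokenizer):

```rust
// Rough estimate of the raw text size of a 1M-token context.
// The 4-bytes-per-token figure is an assumption, not a measurement.
fn context_text_bytes(tokens: u64, avg_bytes_per_token: u64) -> u64 {
    tokens * avg_bytes_per_token
}

fn main() {
    let bytes = context_text_bytes(1_000_000, 4);
    let mib = bytes as f64 / (1024.0 * 1024.0);
    println!("~{:.1} MiB of raw text", mib); // prints "~3.8 MiB of raw text"
}
```

So even a full million-token context is only a few megabytes of text, consistent with the "maybe 5 MB" estimate.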
hiAndrewQuinn 31 minutes ago [-]
The codebase was small enough that I handed it over to DeepSeek v4 Flash in Pi to skim through for any risky business, and I didn't find anything concerning. Nice work.
gidellav 23 minutes ago [-]
Thanks! Funny enough, a good chunk of the coding was done by DeepSeek v4 Flash, while I hand-wrote a couple of pieces of the TUI logic, as DeepSeek kept failing on certain cursor-moving logic. I also fully managed the memory optimization process (as you can read in another comment I left, it was both a set of compiler optimizations and the use of certain Rust crates to leverage more efficient data structures).
hiAndrewQuinn 14 minutes ago [-]
Taking notes and comparing this against my own (non coding agent) Rust TUI project, thank you! I'm new to Rust so this is a helpful baseline.
gidellav 2 minutes ago [-]
No problem, happy to help!
kadoban 18 minutes ago [-]
> I handed it over to DeepSeek v4 Flash in Pi to skim through for any risky business

Doesn't prompt injection make that a rather flimsy investigation?

sergiotapia 36 minutes ago [-]
Given agent harnesses affect so much of the performance of models, it would be great to see some kind of benchmark on how this tool performs compared to claude/codex/opencode/pi etc.
gidellav 16 minutes ago [-]
Hi! While I didn't try any agent benchmark, I had already thought of this possible issue, and I tried to approach it on two different levels:

1. The tools given to the agent are almost the same as the ones defined in Opencode, except for Skills and Subagents (features not implemented in zerostack)

2. Zerostack is prompt-based: it ships with a set of .md files, stored in ~/.config/zerostack/prompt, that can be selected from the TUI to activate different 'agents'. As you can see from the README, it is designed to contain the most important features of superpower + Claude's front-end design + git worktree support and Ralph Wiggum loops (both as integrated features)

hparadiz 44 minutes ago [-]
This is what I've been waiting for:

a low-level language. Please, no more scripting-language TUIs!

nine_k 10 minutes ago [-]
Rust, a language with affine types, generics, lifetimes, deep static analysis, hygienic macros, etc., is not low-level. It's nearly as high-level as Haskell (without HKTs, though).

It just does not rely on GC and lets you manage resources efficiently. This efficiency is partly due to its being so high-level.

gidellav 5 minutes ago [-]
While I agree that it lets you manage resources efficiently, I don't agree that the efficiency derives from it being high-level; from a purely technical standpoint, I could shave 2-3MB off the memory footprint by writing the code in pure C, as there are some unused parts of Rust's std that cannot be removed without recompiling std.

This is obviously only a technical point, as writing an AI TUI in pure C would be rather... ehhh
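For context on the "recompiling std" remark above: rebuilding the standard library from source is possible on nightly Rust via cargo's unstable `build-std` feature, which lets the application's build settings apply to std itself so unused parts can be optimized out. A sketch of the invocation (nightly-only, and the flag is unstable so its shape may change):

```shell
# Nightly-only: rebuild libstd from source so it is compiled with the
# application's own optimization settings (e.g. opt-level=z, LTO).
cargo +nightly build --release \
    -Z build-std=std,panic_abort \
    --target x86_64-unknown-linux-gnu
```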

schaefer 30 minutes ago [-]
There has been no reason to wait... Codex is written in rust.

-- So is deepseek-tui.

hparadiz 27 minutes ago [-]
Forgot to add an open source qualifier. I use codex lol
andxor 25 minutes ago [-]
Codex is also open source.
hparadiz 9 minutes ago [-]
I don't really want something owned by a company for my local stuff. I'd prefer it be small and minimalistic. Maybe in the future I'll change my mind and it will be more like a browser, but for now I want to keep it small and local.
iknowstuff 34 minutes ago [-]
Isn’t codex in rust?