Hacker News | willm's comments

De nada.


Pretty sure I could run Toad in Toad, but I’m scared to try.


Hope you like it. It is still Claude Code doing the work. Toad talks to the agent, and it is the agent that works with the LLM. So the results should be identical to the native CLI.


I have written a coding agent which I plan to open up soon. By far the biggest time sink has been in the TUI - I've just implemented ACP and I really hope that I can use toad as a front end.


It should be as easy as running: `toad acp "command"`


Does it work with local models? ollama? LM Studio?


you still need an agent, but yes.


I'm very taken by your response!


Guess you don’t like sci-fi movie quotes. You can change that to a simple pulse animation in the settings.

It literally is using Claude under the hood. Should be no different than Claude’s own CLI.


I guess I have never heard them! Thanks for the tip



Ah right, so they are your quotes. Not that I dislike them, but it was quite a shock to spin up a CLI tool and have the first thing it says be "I didn't murder him" lol


Hi. Will McGugan here. I built Toad. Ask me anything.


Toad looks really nice, I will definitely try it out. I have some ACP questions if you don't mind.

First, from my reading of the ACP doc, one thing that seems pretty janky is if the ACP client wants to expose a tool to the agent, e.g. if Toad wanted to add the ability for the agent to display pretty diffs. In the doc they recommend stdio to the ACP server, then stdio to an MCP server, and then some out of band network request back to the ACP client. Have you thought about this, or found a better solution working on Toad?

Similarly, it would be useful to be able to expose a tool which runs a subagent using ACP using a different agent, e.g. if I'm using Claude for coding but I'd like to invoke codex for code review. Have you thought about doing anything like this? Is it feasible over the protocol?


I don’t follow your first question. Toad already displays pretty diffs. MCP works in the same way as the native CLI.

One of the advantages of Toad is that it is vendor agnostic. In the future Toad will be able to run sub agents, and allocate any agent to any job. Still to figure out the UX for that.


In my first question, I'm referring to exposing functionality from the ACP client to the agent. Imagine an IDE ACP client which wants to expose language refactoring to the agent, for example - I can't think of a better example for something more like Toad. As far as I know the protocol doesn't expose a way to inject tools into the agent from the ACP client.


The ACP protocol supports MCP. That would be how the client provides additional functionality for the agent. There's no UI in Toad for that yet, but there will be in a future update.
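
Concretely, the client can hand the agent a list of MCP servers when it opens a session. Here is a rough sketch of what that request might look like, in Python since that's what Toad is written in; the field names follow my reading of the ACP spec and the server entry is hypothetical, so treat this as illustrative rather than the definitive wire format:

    # Illustrative only: an ACP "session/new" request as the client might send
    # it to the agent over stdio (JSON-RPC 2.0). The mcpServers entry is how
    # the client tells the agent which MCP servers to connect to. Field names
    # reflect my reading of the spec and may not match the exact wire format.
    import json

    new_session = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "session/new",
        "params": {
            "cwd": "/home/user/project",
            "mcpServers": [
                {
                    "name": "client-tools",       # hypothetical MCP server name
                    "command": "my-mcp-server",   # hypothetical executable
                    "args": ["--stdio"],
                    "env": [],
                }
            ],
        },
    }

    print(json.dumps(new_session, indent=2))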


Very interesting project! I have 2 questions:

1. How has it been working with ACP? Is it anywhere near feature parity with Claude code’s native interface?

2. I see your repo is written in Python, which is interesting to me for a responsive TUI. Is it snappy and performant, and if so what have you done to make it feel native? And why did you choose Python?


ACP is well designed. It will always be a few features behind the native CLIs as the protocol catches up. But there is very little that you can't do with ACP. A lot can be done with slash commands that are passed through to the agent verbatim.

Python is more than capable of running a TUI. It is just text manipulation after all. Toad uses Textual, which is currently the best TUI library around. I may be biased saying that as I built it...
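
For a sense of what Textual code looks like, here's a minimal, generic sketch of an app with a scrolling log and a text input. It is not code from Toad, just an illustration of the kind of structure involved:

    # A minimal Textual app: header, a scrolling log, and a text input.
    # Generic illustration only; this is not code from Toad.
    from textual.app import App, ComposeResult
    from textual.widgets import Footer, Header, Input, RichLog


    class MiniChat(App):
        """Echo whatever is typed into the input back into the log."""

        def compose(self) -> ComposeResult:
            yield Header()
            yield RichLog(wrap=True)
            yield Input(placeholder="Type a prompt and press enter")
            yield Footer()

        def on_input_submitted(self, event: Input.Submitted) -> None:
            self.query_one(RichLog).write(f"> {event.value}")
            self.query_one(Input).value = ""


    if __name__ == "__main__":
        MiniChat().run()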


Hi Will,

I was about to try opencode after using claude code for quite a while.

I think I understand the fundamental difference in how they work (ACP against existing agentic loops with toad vs a single agentic loop for all models with opencode), but I'm curious why we might want toad over something like opencode, which lets me use any model under the sun.

I suppose toad gets to use the highly specialized agentic loops for each CLI. And has a nicer UI (? opencode is pretty slick from my brief usage…).

Curious to hear why you chose to build it this way and what advantages you see.


That’s pretty much it. You can bring your own agent. Including OpenCode by the way. I doubt they will mind as they still get paid for the tokens.

You get a nice UI that is only going to get better as time goes on.

It's a far better model to separate the agent from the UI. The current situation is like building a browser for a single website.


Just installed it...

How are new agents added? Do you have to write a dedicated plugin for each one? Or there's some kind of discovery mechanism?

(I was looking for Copilot, but I guess that will depend on https://github.com/github/copilot-cli/issues/222 ?)


It's stored statically in the codebase. In the future, I suspect there will be enough compatible agents that there might be a web service to search them.

I think they are working on the Copilot ACP layer. Doubt it will take long.


It's really just a simple TOML file. https://github.com/batrachianai/toad/tree/main/src%2Ftoad%2F... gets you the currently supported ACP clients

And Copilot isn't supported for now because, well, there is no ACP support


I'm using ollama with a local LLM for completion (tabby-ml) and Open WebUI for chat. What will be the go-to local ACP server working with ollama?

Ideally working with toad to experiment with it.


You may be confusing Agent Communication Protocol with Agent Client Protocol. Yeah, 2 ACP protocols. I had no hand in the naming.

If an agent can be configured to use Ollama, then you could use it from Toad. It might be possible right now.


fast-agent has ACP support and works well with ollama. Once installed you can just use `toad acp "fast-agent-acp --model generic.<ollama-model>"`.


Sorry, not a question, just wanted to say congrats on putting this together. I am so the target market for a nice terminal interface. I can’t wait to try this out!


Thanks. Hope you like it.


Hey Will, just wanted to say this looks pretty damn spectacular. No notes. :)


Thanks!


Cool idea but why python?! Rust please and I’m all ears.


The author is also the creator of the Textual Python library for creating TUIs. The performance benefits of Rust don't seem very useful in a tool where you spend a few seconds typing in a prompt and then 90% of your time is spent waiting. As long as the UI is responsive when typing there wouldn't be much of a difference.


Didn't know that. Good reason then, of course. But I do notice these sorts of differences. Codex feels way better than Claude code to me for example.

I tried Toad and to me it feels ridiculously slow and laggy. Switching between input and output (ALT+up/down) for example just lags, I can notice the transition. The whole UI lags. It's no wonder, it's python. Simply the wrong language for this, sorry.


Yeah it feels slow and laggy to me too and I'm not on an old laptop. Running on a M3 Macbook Pro here. I definitely notice the difference between using something like Ghostty (Rust based - super fast) and Toad (Python).


It doesn't really make sense to compare the performance of Ghostty, a terminal emulator, with Toad, a TUI. Also Ghostty is written in Zig, not Rust.


It's obviously way slower though. Also, the point stands: it's written in a low-level, performance-oriented language. The author of Toad could have written it in Rust, Zig, C++, etc., but chose Python instead. He valued ease of development over performance, and the result is that we get a laggy terminal.


I know for a fact that Textual can generate an entire frame in less than a 60th of a second. Any lag you see has nothing to do with the choice of language. A TUI just doesn't require enough number crunching to warrant a low-level language.

I'd be interested in knowing what platform and terminal you observed the lag on when testing Toad.


It is quite literally instantaneous on my 5 year old laptop. Whatever you are seeing isn't due to the choice of Python.


Maybe it's something with my setup then. I notice some delay; it's by no means huge, but it is noticeable. For me these things add up, another example is pane resizing in tmux. I like things snappy, but it's kind of an OCD thing I guess.


The creator of Toad made a TUI framework in Python (Textual). What is so special about Rust, aside from it being blazingly fast and compiled, that you want from it?


Safety, performance, avoiding python dependency hell.


Sure, to each their own. No one forced you to use it, you have a thing called free will, and you can gladly use it.

Python is more than capable of doing that. It’s not an issue of raw execution speed.

https://willmcgugan.github.io/streaming-markdown/


I'm working on a fix for the terminal UI.

https://www.youtube.com/watch?v=OGGVdPZTc8E&t=2s


Neat!


I'm working on a universal UI for agentic coding in the terminal.

https://willmcgugan.github.io/toad-report-2/


I asked the author a while back. They said it purely relates to colour. Not other styles. Alas, they removed the issues and there is no record of that.

