The story of pwno-mcp before it became the current open-source project. Written by Ruikai.
Where it came from
This version is the result of around six months of iterating, redesigning, thinking, and researching on the problem of "making a Pwn MCP for LLMs":
- I first tried writing a GDB plugin and executing commands via the GDB Python API (autogdb.io), integrating the MCP on a single backend, with authorization done by rewriting a fairly low-level part of early MCP's SSE implementation (this was around March; see this post). It didn't work out well. First, capturing program stdio was a problem for the GDB APIs (we did try delimiters, but there's another story about timing that I'll mention later), and stopping a multi-threaded binary was problematic enough to make the actual execution part pretty much unusable. That said, this version was quite scalable: a single command backend was enough. autogdb was only solving the problem of connecting your debugging (research) machine to an agent client; it sounds easy, but it was mixed with jumping between frontend, auth, and compatibility problems.
- After realizing the scalability problem of autogdb.io, I started on the idea of bringing the entire research environment to the cloud, with scalable pre-configured environments. I spent tons of time learning and making mistakes in k8s, specifically GKE, pretty much learning everything from thin air. We got a working MVP after around two weeks of diving into this (back then I still had my AP exams).
Anyway, the backend still had the major problem of "how to start an environment for everyone, and how to let everyone access their own environment?" We stuck with the original centralized MCP backend approach, but this time we assigned a k8s stream channel to each user and used these IO channels to natively interact with GDB (with delimiters). This was still meant to solve the problem of capturing program IO, which is a tricky one. I then thought users should also be able to see their GDB session in the cloud, so I came up with the approach of duplicating the stdio channel back into the frontend via k8s streams and WebSockets. After around two months of development we got pwno.io up and running, but there were still tons of problems that ate incredible amounts of time I haven't mentioned, from GKE integration to network issues.
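The delimiter trick mentioned above can be sketched roughly like this. This is a hypothetical illustration, not the pwno.io implementation: each command's output on the shared channel is wrapped in unique markers, and the slice between them is extracted afterwards. If a marker never arrives (the timing/async failure mode described above), capture simply fails.

```python
import re
import uuid

# Hypothetical sketch of delimiter-based stdio capture over a shared
# IO channel. Names are illustrative only.
def make_delimiters():
    """Generate a unique begin/end marker pair for one command."""
    token = uuid.uuid4().hex
    return f"<<BEGIN:{token}>>", f"<<END:{token}>>"

def extract_between(stream: str, begin: str, end: str):
    """Return the output captured between the markers, or None if a
    marker never arrived -- the unreliable case that plagued us."""
    m = re.search(re.escape(begin) + r"(.*?)" + re.escape(end), stream, re.S)
    return m.group(1) if m else None
```

The fragility is visible even in the sketch: correctness depends entirely on both markers making it through the channel intact and in order, which is exactly what asynchronous, interleaved IO does not guarantee.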
pwno.io was working, I can't say well, but at a working level. There were still asynchronization problems and GKE-native problems, but we managed to solve the most pain-in-the-ass ones, scalability and interactive IO, which we had spent around three months on by that point. This is when I started working on Pwnuous, our collaboration with GGML, which needed something like the previous version of pwno-mcp but with more stable support. Since the previous version plugged into GDB via a direct IO stream, the asynchronization problem I mentioned was another huge pain-in-the-ass: some IO slipped away, and it just wasn't stable enough for use. This is when I started thinking about rewriting everything, throwing away some parts purely for LLM usability and full agentic compatibility. I was working on my Black Hat talk back then, so I thought a little about statefulness and learned about this wonderful thing that seems to have been born for us: GDB/MI (Debugging with GDB). I spent a few days rewriting the entire thing by reading the docs. I definitely spent less time conceptualizing the backend architecture for this version of pwno-mcp than for pwno.io (around two days, mainly on GKE gateway things). It's definitely not a very elaborate or sophisticated framework by any means, but it did come from a shit-ton of trial-and-error experience while thinking about how to make something that can scale (multi-agent, researchers using it). So I'd say it's by far my best conceptualization and work toward letting LLMs use it with stability and scalability. And I do think this is the best, or rather now-or-never, time to open-source it, or this project, or Pwno itself, will die from lack of a feedback loop, even though pwno-mcp is only a small part of what we're doing.
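What makes GDB/MI so much nicer than scraping a raw IO stream is that every output line is a machine-readable record with a type prefix, so there's no guessing where one response ends and the next begins. A minimal sketch of that idea (a hypothetical helper, not the pwno-mcp implementation):

```python
import subprocess

# GDB/MI prefixes each output line with a record-type character:
#   "^"  result records (^done, ^error, ^running, ...)
#   "*"  exec async records (*stopped, *running)
#   "="  notify async records (=breakpoint-created, ...)
#   "~"  console stream output, "&" log stream, "@" target stream
#   "(gdb)" marks the ready prompt.
def classify_mi_line(line: str):
    """Classify one GDB/MI output line into (kind, payload)."""
    line = line.strip()
    if line == "(gdb)":
        return ("prompt", None)
    kind = {"^": "result", "*": "exec-async", "=": "notify-async",
            "~": "console", "&": "log", "@": "target"}.get(line[:1])
    if kind is None:
        return ("unknown", line)
    return (kind, line[1:])

# Driving gdb in MI mode is then just a subprocess with line-oriented IO.
# Requires gdb on PATH; shown for illustration only.
def spawn_gdb_mi(binary: str) -> subprocess.Popen:
    return subprocess.Popen(
        ["gdb", "--interpreter=mi2", "--quiet", binary],
        stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
    )
```

Compared with delimiter hacks over a shared stream, the prefix discipline means the parser always knows whether it is looking at a command result, an async stop event, or console chatter, which is exactly the statefulness the rewrite needed.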
Use and licensing
- non-profit: yes, feel free to use it
- commercial: contact oss@pwno.io