tokentap and tokentop
These are direct competitors with nearly identical functionality: real-time terminal dashboards for monitoring LLM token usage and costs. Of the two, tokentap is the more mature and widely adopted option.
About tokentap
jmuncor/tokentap
Intercept LLM API traffic and visualize token usage in a real-time terminal dashboard. Track costs, debug prompts, and monitor context window usage across your AI development sessions.
Acts as a local HTTP proxy that intercepts API calls by redirecting provider base URLs (e.g., `ANTHROPIC_BASE_URL=localhost:8080`), enabling zero-configuration interception without certificate installation. Supports Claude Code, OpenAI Codex, and Gemini CLI, automatically archiving each request as markdown and JSON for prompt debugging and analysis.
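The base-URL redirection described above can be sketched as a shell session. This is a hypothetical setup, not from tokentap's documentation: the port 8080 and the `http://` scheme are assumptions, and the exact proxy address depends on how tokentap is started locally.

```shell
# Assumed setup: tokentap is running as a local proxy on localhost:8080
# (port and scheme are assumptions, not documented values).
# Point Claude Code's API traffic at the proxy by overriding the base URL:
export ANTHROPIC_BASE_URL="http://localhost:8080"

# Confirm the override is in place before launching the agent:
echo "$ANTHROPIC_BASE_URL"

# Then start Claude Code in this same shell; its API calls now pass
# through the proxy, which can log and visualize each request.
```

Because interception works purely through environment variables, no TLS certificate needs to be installed and the agent itself requires no configuration changes.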
About tokentop
tokentopapp/tokentop
htop for your AI costs — real-time terminal monitoring of LLM token usage and spending across providers and coding agents