[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"similar-junhoyeo--tokscale":3,"tool-junhoyeo--tokscale":64},[4,17,27,35,43,56],{"id":5,"name":6,"github_repo":7,"description_zh":8,"stars":9,"difficulty_score":10,"last_commit_at":11,"category_tags":12,"status":16},3808,"stable-diffusion-webui","AUTOMATIC1111\u002Fstable-diffusion-webui","stable-diffusion-webui 是一个基于 Gradio 构建的网页版操作界面，旨在让用户能够轻松地在本地运行和使用强大的 Stable Diffusion 图像生成模型。它解决了原始模型依赖命令行、操作门槛高且功能分散的痛点，将复杂的 AI 绘图流程整合进一个直观易用的图形化平台。\n\n无论是希望快速上手的普通创作者、需要精细控制画面细节的设计师，还是想要深入探索模型潜力的开发者与研究人员，都能从中获益。其核心亮点在于极高的功能丰富度：不仅支持文生图、图生图、局部重绘（Inpainting）和外绘（Outpainting）等基础模式，还独创了注意力机制调整、提示词矩阵、负向提示词以及“高清修复”等高级功能。此外，它内置了 GFPGAN 和 CodeFormer 等人脸修复工具，支持多种神经网络放大算法，并允许用户通过插件系统无限扩展能力。即使是显存有限的设备，stable-diffusion-webui 也提供了相应的优化选项，让高质量的 AI 艺术创作变得触手可及。",162132,3,"2026-04-05T11:01:52",[13,14,15],"开发框架","图像","Agent","ready",{"id":18,"name":19,"github_repo":20,"description_zh":21,"stars":22,"difficulty_score":23,"last_commit_at":24,"category_tags":25,"status":16},1381,"everything-claude-code","affaan-m\u002Feverything-claude-code","everything-claude-code 是一套专为 AI 编程助手（如 Claude Code、Codex、Cursor 等）打造的高性能优化系统。它不仅仅是一组配置文件，而是一个经过长期实战打磨的完整框架，旨在解决 AI 代理在实际开发中面临的效率低下、记忆丢失、安全隐患及缺乏持续学习能力等核心痛点。\n\n通过引入技能模块化、直觉增强、记忆持久化机制以及内置的安全扫描功能，everything-claude-code 能显著提升 AI 在复杂任务中的表现，帮助开发者构建更稳定、更智能的生产级 AI 代理。其独特的“研究优先”开发理念和针对 Token 消耗的优化策略，使得模型响应更快、成本更低，同时有效防御潜在的攻击向量。\n\n这套工具特别适合软件开发者、AI 研究人员以及希望深度定制 AI 工作流的技术团队使用。无论您是在构建大型代码库，还是需要 AI 协助进行安全审计与自动化测试，everything-claude-code 都能提供强大的底层支持。作为一个曾荣获 Anthropic 黑客大奖的开源项目，它融合了多语言支持与丰富的实战钩子（hooks），让 AI 真正成长为懂上",138956,2,"2026-04-05T11:33:21",[13,15,26],"语言模型",{"id":28,"name":29,"github_repo":30,"description_zh":31,"stars":32,"difficulty_score":23,"last_commit_at":33,"category_tags":34,"status":16},2271,"ComfyUI","Comfy-Org\u002FComfyUI","ComfyUI 是一款功能强大且高度模块化的视觉 AI 引擎，专为设计和执行复杂的 Stable Diffusion 图像生成流程而打造。它摒弃了传统的代码编写模式，采用直观的节点式流程图界面，让用户通过连接不同的功能模块即可构建个性化的生成管线。\n\n这一设计巧妙解决了高级 AI 
绘图工作流配置复杂、灵活性不足的痛点。用户无需具备编程背景，也能自由组合模型、调整参数并实时预览效果，轻松实现从基础文生图到多步骤高清修复等各类复杂任务。ComfyUI 拥有极佳的兼容性，不仅支持 Windows、macOS 和 Linux 全平台，还广泛适配 NVIDIA、AMD、Intel 及苹果 Silicon 等多种硬件架构，并率先支持 SDXL、Flux、SD3 等前沿模型。\n\n无论是希望深入探索算法潜力的研究人员和开发者，还是追求极致创作自由度的设计师与资深 AI 绘画爱好者，ComfyUI 都能提供强大的支持。其独特的模块化架构允许社区不断扩展新功能，使其成为当前最灵活、生态最丰富的开源扩散模型工具之一，帮助用户将创意高效转化为现实。",107662,"2026-04-03T11:11:01",[13,14,15],{"id":36,"name":37,"github_repo":38,"description_zh":39,"stars":40,"difficulty_score":23,"last_commit_at":41,"category_tags":42,"status":16},3704,"NextChat","ChatGPTNextWeb\u002FNextChat","NextChat 是一款轻量且极速的 AI 助手，旨在为用户提供流畅、跨平台的大模型交互体验。它完美解决了用户在多设备间切换时难以保持对话连续性，以及面对众多 AI 模型不知如何统一管理的痛点。无论是日常办公、学习辅助还是创意激发，NextChat 都能让用户随时随地通过网页、iOS、Android、Windows、MacOS 或 Linux 端无缝接入智能服务。\n\n这款工具非常适合普通用户、学生、职场人士以及需要私有化部署的企业团队使用。对于开发者而言，它也提供了便捷的自托管方案，支持一键部署到 Vercel 或 Zeabur 等平台。\n\nNextChat 的核心亮点在于其广泛的模型兼容性，原生支持 Claude、DeepSeek、GPT-4 及 Gemini Pro 等主流大模型，让用户在一个界面即可自由切换不同 AI 能力。此外，它还率先支持 MCP（Model Context Protocol）协议，增强了上下文处理能力。针对企业用户，NextChat 提供专业版解决方案，具备品牌定制、细粒度权限控制、内部知识库整合及安全审计等功能，满足公司对数据隐私和个性化管理的高标准要求。",87618,"2026-04-05T07:20:52",[13,26],{"id":44,"name":45,"github_repo":46,"description_zh":47,"stars":48,"difficulty_score":23,"last_commit_at":49,"category_tags":50,"status":16},2268,"ML-For-Beginners","microsoft\u002FML-For-Beginners","ML-For-Beginners 是由微软推出的一套系统化机器学习入门课程，旨在帮助零基础用户轻松掌握经典机器学习知识。这套课程将学习路径规划为 12 周，包含 26 节精炼课程和 52 道配套测验，内容涵盖从基础概念到实际应用的完整流程，有效解决了初学者面对庞大知识体系时无从下手、缺乏结构化指导的痛点。\n\n无论是希望转型的开发者、需要补充算法背景的研究人员，还是对人工智能充满好奇的普通爱好者，都能从中受益。课程不仅提供了清晰的理论讲解，还强调动手实践，让用户在循序渐进中建立扎实的技能基础。其独特的亮点在于强大的多语言支持，通过自动化机制提供了包括简体中文在内的 50 多种语言版本，极大地降低了全球不同背景用户的学习门槛。此外，项目采用开源协作模式，社区活跃且内容持续更新，确保学习者能获取前沿且准确的技术资讯。如果你正寻找一条清晰、友好且专业的机器学习入门之路，ML-For-Beginners 将是理想的起点。",84991,"2026-04-05T10:45:23",[14,51,52,53,15,54,26,13,55],"数据工具","视频","插件","其他","音频",{"id":57,"name":58,"github_repo":59,"description_zh":60,"stars":61,"difficulty_score":10,"last_commit_at":62,"category_tags":63,"status":16},3128,"ragflow","infiniflow\u002Fragflow","RAGFlow 
### ragflow (infiniflow/ragflow) · ★ 77,062 · last commit 2026-04-04

RAGFlow is a leading open-source retrieval-augmented generation (RAG) engine that builds a more accurate, reliable context layer for large language models. It combines state-of-the-art RAG techniques with agent capabilities: it extracts knowledge efficiently from documents of all kinds, and it lets models reason and execute tasks over that knowledge.

Hallucination and stale knowledge are common pain points in LLM applications. By deeply parsing complex document structure (tables, charts, mixed layouts), RAGFlow markedly improves retrieval accuracy, reducing fabricated answers and keeping responses both grounded and current. Its built-in agent machinery goes further: the system not only answers questions but can autonomously plan steps to solve complex problems.

The tool suits developers, enterprise engineering teams, and AI researchers, from builders of private knowledge-base Q&A systems to innovators bringing large models into vertical domains. RAGFlow offers a visual workflow editor and flexible APIs, lowering the bar for non-algorithm users while supporting deep customization by professional developers. Licensed under Apache 2.0, it is becoming a key bridge between general-purpose large models and domain-specific knowledge.
## tokscale (junhoyeo/tokscale)

> 🛰️ A CLI tool for tracking token usage from OpenCode, Claude Code, 🦞OpenClaw (Clawdbot/Moltbot), Pi, Codex, Gemini, Cursor, AmpCode, Factory Droid, Kimi, and more! • 🏅Global Leaderboard + 2D/3D Contributions Graph

tokscale is a high-performance command-line tool for monitoring and analyzing token consumption and cost across AI coding assistants. Developers who code with OpenCode, Claude Code, Cursor, Kimi, and other AI agents often have no clear view of per-model usage and spend, so costs drift and there is no obvious place to start optimizing. tokscale solves this by automatically reading each tool's local logs and presenting a unified usage view.

It is built for developers who rely on AI-assisted coding, technical leads, and researchers tracking the cost of large-model applications. Users get detailed daily summaries and per-model statistics in a smooth native terminal UI (TUI) built in Rust, with cross-platform support. Distinctive extras include a 2D/3D contribution graph, a global leaderboard, and the option to submit usage data to the cloud for a personalized year-in-review report. Whether you are controlling a project budget or simply exploring your own AI collaboration habits, tokscale delivers clear data insight in a lightweight, professional package. The project README follows.

<!-- <CENTERED SECTION FOR GITHUB DISPLAY> -->

<div align="center">

[![Tokscale](./.github/assets/hero-v2.png)](https://tokscale.ai)

</div>

> A high-performance CLI tool and visualization dashboard for tracking token usage and costs across multiple AI coding agents.

> [!TIP]
>
> v2 is here — native Rust TUI, cross-platform support, and more. <br />
> I drop new open-source work every week. Don't miss the next one.
>
> | [<img alt="GitHub Follow" src="https://img.shields.io/github/followers/junhoyeo?style=flat-square&logo=github&labelColor=black&color=24292f" width="156px" />](https://github.com/junhoyeo) | Follow [@junhoyeo](https://github.com/junhoyeo) on GitHub for more projects. Hacking on AI, infra, and everything in between. |
> | :-----| :----- |
> | [<img alt="Discord link" src="https://img.shields.io/discord/1480206352755458110?color=5865F2&label=discord&labelColor=black&logo=discord&logoColor=white&style=flat-square" width="156px" />](https://discord.gg/h6DUGWdBbm) | Come hang out in our [Discord](https://discord.gg/h6DUGWdBbm) — and surround yourself with the world's top-tier vibers. |
<div align="center">

[![GitHub Release](https://img.shields.io/github/v/release/junhoyeo/tokscale?color=0073FF&labelColor=black&logo=github&style=flat-square)](https://github.com/junhoyeo/tokscale/releases)
[![npm Downloads](https://img.shields.io/npm/dt/tokscale?color=0073FF&labelColor=black&style=flat-square)](https://www.npmjs.com/package/tokscale)
[![GitHub Contributors](https://img.shields.io/github/contributors/junhoyeo/tokscale?color=0073FF&labelColor=black&style=flat-square)](https://github.com/junhoyeo/tokscale/graphs/contributors)
[![GitHub Forks](https://img.shields.io/github/forks/junhoyeo/tokscale?color=0073FF&labelColor=black&style=flat-square)](https://github.com/junhoyeo/tokscale/network/members)
[![GitHub Stars](https://img.shields.io/github/stars/junhoyeo/tokscale?color=0073FF&labelColor=black&style=flat-square)](https://github.com/junhoyeo/tokscale/stargazers)
[![GitHub Issues](https://img.shields.io/github/issues/junhoyeo/tokscale?color=0073FF&labelColor=black&style=flat-square)](https://github.com/junhoyeo/tokscale/issues)
[![License](https://img.shields.io/badge/license-MIT-white?labelColor=black&style=flat-square)](https://github.com/junhoyeo/tokscale/blob/master/LICENSE)

[🇺🇸 English](README.md) | [🇰🇷 한국어](README.ko.md) | [🇯🇵 日本語](README.ja.md) | [🇨🇳 简体中文](README.zh-cn.md)

</div>

<!-- </CENTERED SECTION FOR GITHUB DISPLAY> -->

| Overview | Models |
|:---:|:---:|
| ![TUI Overview](.github/assets/tui-overview.png) | ![TUI Models](.github/assets/tui-models.png) |
| Daily Summary | Stats |
|:---:|:---:|
| ![TUI Daily Summary](.github/assets/tui-daily.png) | ![TUI Stats](.github/assets/tui-stats.png) |

| Frontend (3D Contributions Graph) | Wrapped 2025 |
|:---:|:---:|
| <a href="https://tokscale.ai"><img alt="Frontend (3D Contributions Graph)" src=".github/assets/frontend-contributions-graph.png" width="700px" /></a> | <a href="#wrapped-2025"><img alt="Wrapped 2025" src=".github/assets/wrapped-2025-agents.png" width="700px" /></a> |

> **Run [`bunx tokscale@latest submit`](#social) to submit your usage data to the leaderboard and create your public profile!**

## Overview

**Tokscale** helps you monitor and analyze your token consumption from:

| Logo | Client | Data Location | Supported |
|------|----------|---------------|-----------|
| <img width="48px" src=".github/assets/client-opencode.png" alt="OpenCode" /> | [OpenCode](https://github.com/sst/opencode) | `~/.local/share/opencode/opencode.db` (1.2+) and/or `~/.local/share/opencode/storage/message/` (legacy/unmigrated) | ✅ Yes |
| <img width="48px" src=".github/assets/client-claude.jpg" alt="Claude" /> | [Claude Code](https://docs.anthropic.com/en/docs/claude-code) | `~/.claude/projects/` | ✅ Yes |
| <img width="48px" src=".github/assets/client-openclaw.jpg" alt="OpenClaw" /> | [OpenClaw](https://openclaw.ai/) | `~/.openclaw/agents/` (+ legacy: `.clawdbot`, `.moltbot`, `.moldbot`) | ✅ Yes |
| <img width="48px" src=".github/assets/client-openai.jpg" alt="Codex" /> | [Codex CLI](https://github.com/openai/codex) | `~/.codex/sessions/` | ✅ Yes |
src=\".github\u002Fassets\u002Fclient-gemini.png\" alt=\"Gemini\" \u002F> | [Gemini CLI](https:\u002F\u002Fgithub.com\u002Fgoogle-gemini\u002Fgemini-cli) | `~\u002F.gemini\u002Ftmp\u002F*\u002Fchats\u002F*.json` | ✅ Yes |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-cursor.jpg\" alt=\"Cursor\" \u002F> | [Cursor IDE](https:\u002F\u002Fcursor.com\u002F) | API sync via `~\u002F.config\u002Ftokscale\u002Fcursor-cache\u002F` | ✅ Yes |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-amp.png\" alt=\"Amp\" \u002F> | [Amp (AmpCode)](https:\u002F\u002Fampcode.com\u002F) | `~\u002F.local\u002Fshare\u002Famp\u002Fthreads\u002F` | ✅ Yes |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-droid.png\" alt=\"Droid\" \u002F> | [Droid (Factory Droid)](https:\u002F\u002Ffactory.ai\u002F) | `~\u002F.factory\u002Fsessions\u002F` | ✅ Yes |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-pi.png\" alt=\"Pi\" \u002F> | [Pi](https:\u002F\u002Fgithub.com\u002Fbadlogic\u002Fpi-mono) | `~\u002F.pi\u002Fagent\u002Fsessions\u002F` | ✅ Yes |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-kimi.png\" alt=\"Kimi\" \u002F> | [Kimi CLI](https:\u002F\u002Fgithub.com\u002FMoonshotAI\u002Fkimi-cli) | `~\u002F.kimi\u002Fsessions\u002F` | ✅ Yes |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-qwen.png\" alt=\"Qwen\" \u002F> | [Qwen CLI](https:\u002F\u002Fgithub.com\u002FQwenLM\u002Fqwen-cli) | `~\u002F.qwen\u002Fprojects\u002F` | ✅ Yes |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-roocode.png\" alt=\"Roo Code\" \u002F> | [Roo Code](https:\u002F\u002Fgithub.com\u002FRooCodeInc\u002FRoo-Code) | `~\u002F.config\u002FCode\u002FUser\u002FglobalStorage\u002Frooveterinaryinc.roo-cline\u002Ftasks\u002F` (+ server: `~\u002F.vscode-server\u002Fdata\u002FUser\u002FglobalStorage\u002Frooveterinaryinc.roo-cline\u002Ftasks\u002F`) | ✅ Yes |\n| \u003Cimg width=\"48px\" 
src=\".github\u002Fassets\u002Fclient-kilocode.png\" alt=\"Kilo\" \u002F> | [Kilo](https:\u002F\u002Fgithub.com\u002FKilo-Org\u002Fkilocode) | `~\u002F.config\u002FCode\u002FUser\u002FglobalStorage\u002Fkilocode.kilo-code\u002Ftasks\u002F` (+ server: `~\u002F.vscode-server\u002Fdata\u002FUser\u002FglobalStorage\u002Fkilocode.kilo-code\u002Ftasks\u002F`) | ✅ Yes |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-mux.png\" alt=\"Mux\" \u002F> | [Mux](https:\u002F\u002Fgithub.com\u002Fcoder\u002Fmux) | `~\u002F.mux\u002Fsessions\u002F` | ✅ Yes |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-kilocode.png\" alt=\"Kilo CLI\" \u002F> | [Kilo CLI](https:\u002F\u002Fgithub.com\u002Fnicepkg\u002Fkilo) | `~\u002F.local\u002Fshare\u002Fkilo\u002Fkilo.db` | ✅ Yes |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-crush.png\" alt=\"Crush\" \u002F> | [Crush](https:\u002F\u002Fcrush.ai\u002F) | `$XDG_DATA_HOME\u002Fcrush\u002Fprojects.json` (project registry; fallback: `~\u002F.local\u002Fshare\u002Fcrush\u002Fprojects.json`) | ✅ Yes |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-synthetic.png\" alt=\"Synthetic\" \u002F> | [Synthetic](https:\u002F\u002Fsynthetic.new\u002F) | Re-attributed from other sources via `hf:` model prefix or `synthetic` provider (+ [Octofriend](https:\u002F\u002Fgithub.com\u002Fsynthetic-lab\u002Foctofriend): `~\u002F.local\u002Fshare\u002Foctofriend\u002Fsqlite.db`) | ✅ Yes |\n\nGet real-time pricing calculations using [🚅 LiteLLM's pricing data](https:\u002F\u002Fgithub.com\u002FBerriAI\u002Flitellm), with support for tiered pricing models and cache token discounts.\n\n### Why \"Tokscale\"?\n\n[![Tokscale](.\u002F.github\u002Fassets\u002Fhero.png)](https:\u002F\u002Ftokscale.ai)\n\nThis project is inspired by the **[Kardashev scale](https:\u002F\u002Fen.wikipedia.org\u002Fwiki\u002FKardashev_scale)**, a method proposed by astrophysicist Nikolai Kardashev to measure a civilization's 
This project is inspired by the **[Kardashev scale](https://en.wikipedia.org/wiki/Kardashev_scale)**, a method proposed by astrophysicist Nikolai Kardashev to measure a civilization's level of technological advancement based on its energy consumption. A Type I civilization harnesses all energy available on its planet, Type II captures the entire output of its star, and Type III commands the energy of an entire galaxy.

In the age of AI-assisted development, **tokens are the new energy**. They power our reasoning, fuel our productivity, and drive our creative output. Just as the Kardashev scale tracks energy consumption at cosmic scales, Tokscale measures your token consumption as you scale the ranks of AI-augmented development. Whether you're a casual user or burning through millions of tokens daily, Tokscale helps you visualize your journey up the scale—from planetary developer to galactic code architect.
## Contents

- [Overview](#overview)
  - [Why "Tokscale"?](#why-tokscale)
- [Features](#features)
- [Installation](#installation)
  - [Quick Start](#quick-start)
  - [Prerequisites](#prerequisites)
  - [Development Setup](#development-setup)
  - [Building the Native Module](#building-the-native-module)
- [Usage](#usage)
  - [Basic Commands](#basic-commands)
  - [TUI Features](#tui-features)
  - [Filtering by Platform](#filtering-by-platform)
  - [Date Filtering](#date-filtering)
  - [Pricing Lookup](#pricing-lookup)
  - [Social](#social)
  - [Cursor IDE Commands](#cursor-ide-commands)
  - [Example Output](#example-output---light-version)
  - [Configuration](#configuration)
  - [Environment Variables](#environment-variables)
- [Frontend Visualization](#frontend-visualization)
  - [Features](#features-1)
  - [Running the Frontend](#running-the-frontend)
- [Social Platform](#social-platform)
  - [Features](#features-2)
  - [Getting Started](#getting-started)
  - [Data Validation](#data-validation)
- [Wrapped 2025](#wrapped-2025)
  - [Command](#command)
  - [What's Included](#whats-included)
- [Development](#development)
  - [Prerequisites](#prerequisites-1)
  - [How to Run](#how-to-run)
- [Supported Platforms](#supported-platforms)
  - [Native Module Targets](#native-module-targets)
  - [Windows Support](#windows-support)
- [Session Data Retention](#session-data-retention)
- [Data Sources](#data-sources)
- [Pricing](#pricing)
- [Contributing](#contributing)
  - [Development Guidelines](#development-guidelines)
- [Acknowledgments](#acknowledgments)
- [License](#license)

## Features

- **Interactive TUI Mode** - Beautiful terminal UI powered by Ratatui (default mode)
  - 4 interactive views: Overview, Models, Daily, Stats
  - Keyboard & mouse navigation
  - GitHub-style contribution graph with 9 color themes
  - Real-time filtering and sorting
  - Zero flicker rendering
- **Multi-platform support** - Track usage across OpenCode, Claude Code, Codex CLI, Cursor IDE, Gemini CLI, Amp, Droid, OpenClaw, Pi, Kimi CLI, Qwen CLI, Roo Code, Kilo, Mux, Kilo CLI, Crush, and Synthetic
- **Real-time pricing** - Fetches current pricing from LiteLLM with 1-hour disk cache; automatic OpenRouter fallback and Cursor model pricing for newly released models
- **Detailed breakdowns** - Input, output, cache read/write, and reasoning token tracking
- **Native Rust core** - All parsing and aggregation done in Rust for 10x faster processing
- **Web visualization** - Interactive contribution graph with 2D and 3D views
- **Flexible filtering** - Filter by platform, date range, or year
- **Export to JSON** - Generate data for external visualization tools
- **Social Platform** - Share your usage, compete on leaderboards, and view public profiles

## Installation

### Quick Start

```bash
# Run directly with npx
npx tokscale@latest

# Or use bunx
bunx tokscale@latest

# Light mode (table rendering only)
npx tokscale@latest --light
```
That's it! This gives you the full interactive TUI experience with zero setup.

> **Package Structure**: `tokscale` is an alias package (like [`swc`](https://www.npmjs.com/package/swc)) that installs `@tokscale/cli`. Both install the same CLI with the native Rust core (`@tokscale/core`) included.

### Prerequisites

- [Node.js](https://nodejs.org/) or [Bun](https://bun.sh/)
- (Optional) Rust toolchain for building native module from source

### Development Setup

For local development or building from source:

```bash
# Clone the repository
git clone https://github.com/junhoyeo/tokscale.git
cd tokscale

# Install Bun (if not already installed)
curl -fsSL https://bun.sh/install | bash

# Install dependencies
bun install

# Run the CLI in development mode
bun run cli
```

> **Note**: `bun run cli` is for local development. When installed via `bunx tokscale`, the command runs directly. The Usage section below shows the installed binary commands.

### Building the Native Module

The native Rust module is **required** for CLI operation. It provides ~10x faster processing through parallel file scanning and SIMD JSON parsing:

```bash
# Build the native core (run from repository root)
bun run build:core
```

> **Note**: Native binaries are pre-built and included when you install via `bunx tokscale@latest`. Building from source is only needed for local development.
## Usage

### Basic Commands

```bash
# Launch interactive TUI (default)
tokscale

# Launch TUI with specific tab
tokscale models    # Models tab
tokscale monthly   # Daily view (shows daily breakdown)

# Use legacy CLI table output
tokscale --light
tokscale models --light

# Launch TUI explicitly
tokscale tui

# Export contribution graph data as JSON
tokscale graph --output data.json

# Output data as JSON (for scripting/automation)
tokscale --json                    # Default models view as JSON
tokscale models --json             # Models breakdown as JSON
tokscale monthly --json            # Monthly breakdown as JSON
tokscale models --json > report.json   # Save to file
```

### TUI Features

The interactive TUI mode provides:

- **4 Views**: Overview (chart + top models), Models, Daily, Stats (contribution graph)
- **Keyboard Navigation**:
  - `1-4` or `←/→/Tab`: Switch views
  - `↑/↓`: Navigate lists
  - `c/d/t`: Sort by cost/date/tokens
  - `s`: Open source picker dialog
  - `g`: Open group-by picker dialog (model, client+model, client+provider+model)
  - `p`: Cycle through 9 color themes
  - `r`: Refresh data
  - `e`: Export to JSON
  - `q`: Quit
- **Mouse Support**: Click tabs, buttons, and filters
- **Themes**: Green, Halloween, Teal, Blue, Pink, Purple, Orange, Monochrome, YlGnBu
- **Settings Persistence**: Preferences saved to `~/.config/tokscale/settings.json` (see [Configuration](#configuration))

### Group-By Strategies

Press `g` in the TUI or use `--group-by` in `--light`/`--json` mode to control how model rows are aggregated:

| Strategy | Flag | TUI Default | Effect |
|----------|------|-------------|--------|
| **Model** | `--group-by model` | ✅ | One row per model — merges all clients and providers |
| **Client + Model** | `--group-by client,model` | | One row per client-model pair |
| **Client + Provider + Model** | `--group-by client,provider,model` | | Most granular — no merging |
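Conceptually, each strategy just changes which fields form the aggregation key. A minimal Python sketch of the idea (the record layout and numbers here are illustrative only, not tokscale's actual schema; the real aggregation runs in the Rust core):

```python
from collections import defaultdict

def aggregate(records, group_by):
    """Sum cost per aggregation key, mirroring the --group-by strategies."""
    totals = defaultdict(float)
    for rec in records:
        key = tuple(rec[field] for field in group_by)
        totals[key] += rec["cost"]
    return dict(totals)

# Hypothetical usage records (fields chosen for illustration)
usage = [
    {"client": "OpenCode", "provider": "github-copilot", "model": "claude-opus-4-5", "cost": 1200.0},
    {"client": "OpenCode", "provider": "anthropic", "model": "claude-opus-4-5", "cost": 168.0},
    {"client": "Claude", "provider": "anthropic", "model": "claude-opus-4-5", "cost": 970.0},
]

# --group-by model: one row, all clients and providers merged
assert aggregate(usage, ["model"]) == {("claude-opus-4-5",): 2338.0}
# --group-by client,model: the two OpenCode rows merge, Claude stays separate
assert aggregate(usage, ["client", "model"])[("OpenCode", "claude-opus-4-5")] == 1368.0
```

The coarser the key, the more rows merge, which is exactly the difference between the three tables below.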
**`--group-by model`** (most consolidated)

| Clients | Providers | Model | Cost |
|---------|-----------|-------|------|
| OpenCode, Claude, Amp | github-copilot, anthropic | claude-opus-4-5 | $2,424 |
| OpenCode, Claude | anthropic, github-copilot | claude-sonnet-4-5 | $1,332 |

**`--group-by client,model`** (CLI default)

| Client | Provider | Model | Cost |
|--------|----------|-------|------|
| OpenCode | github-copilot, anthropic | claude-opus-4-5 | $1,368 |
| Claude | anthropic | claude-opus-4-5 | $970 |

**`--group-by client,provider,model`** (most granular)

| Client | Provider | Model | Cost |
|--------|----------|-------|------|
| OpenCode | github-copilot | claude-opus-4-5 | $1,200 |
| OpenCode | anthropic | claude-opus-4-5 | $168 |
| Claude | anthropic | claude-opus-4-5 | $970 |

### Filtering by Platform

```bash
# Show only OpenCode usage
tokscale --opencode

# Show only Claude Code usage
tokscale --claude

# Show only Codex CLI usage
tokscale --codex

# Show only Gemini CLI usage
tokscale --gemini

# Show only Cursor IDE usage (requires `tokscale cursor login` first)
tokscale --cursor

# Show only Amp usage
tokscale --amp

# Show only Droid usage
tokscale --droid

# Show only OpenClaw usage
tokscale --openclaw

# Show only Pi usage
tokscale --pi

# Show only Kimi CLI usage
tokscale --kimi

# Show only Qwen CLI usage
tokscale --qwen

# Show only Roo Code usage
tokscale --roocode

# Show only Kilo usage
tokscale --kilocode

# Show only Mux usage
tokscale --mux

# Show only Kilo CLI usage
tokscale --kilo

# Show only Crush usage
tokscale --crush

# Show only Synthetic (synthetic.new) usage
tokscale --synthetic

# Combine filters
tokscale --opencode --claude
```
### Date Filtering

Date filters work across all commands that generate reports (`tokscale`, `tokscale models`, `tokscale monthly`, `tokscale graph`):

```bash
# Quick date shortcuts
tokscale --today              # Today only
tokscale --week               # Last 7 days
tokscale --month              # Current calendar month

# Custom date range (inclusive, local timezone)
tokscale --since 2024-01-01 --until 2024-12-31

# Filter by year
tokscale --year 2024

# Combine with other options
tokscale models --week --claude --json
tokscale monthly --month --benchmark
```

> **Note**: Date filters use your local timezone. Both `--since` and `--until` are inclusive.
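The inclusive `--since`/`--until` semantics amount to a simple closed-interval check on local dates. A sketch of that predicate (illustrative Python; tokscale's actual filtering is implemented in its Rust core):

```python
from datetime import date

def in_range(day, since=None, until=None):
    """Closed-interval date check: both endpoints are kept, like --since/--until."""
    if since is not None and day < since:
        return False
    if until is not None and day > until:
        return False
    return True

days = [date(2024, 1, 1), date(2024, 6, 15), date(2025, 1, 1)]
kept = [d for d in days if in_range(d, since=date(2024, 1, 1), until=date(2024, 12, 31))]
assert kept == [date(2024, 1, 1), date(2024, 6, 15)]  # the --since endpoint itself is included
```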
### Pricing Lookup

Look up real-time pricing for any model:

```bash
# Look up model pricing
tokscale pricing "claude-3-5-sonnet-20241022"
tokscale pricing "gpt-4o"
tokscale pricing "grok-code"

# Force specific provider source
tokscale pricing "grok-code" --provider openrouter
tokscale pricing "claude-3-5-sonnet" --provider litellm
```

**Lookup Strategy:**

The pricing lookup uses a multi-step resolution strategy:

1. **Exact Match** - Direct lookup in LiteLLM/OpenRouter databases
2. **Alias Resolution** - Resolves friendly names (e.g., `big-pickle` → `glm-4.7`)
3. **Tier Suffix Stripping** - Removes quality tiers (`gpt-5.2-xhigh` → `gpt-5.2`)
4. **Version Normalization** - Handles version formats (`claude-3-5-sonnet` ↔ `claude-3.5-sonnet`)
5. **Provider Prefix Matching** - Tries common prefixes (`anthropic/`, `openai/`, etc.)
6. **Cursor Model Pricing** - Hardcoded pricing for models not yet in LiteLLM/OpenRouter (e.g., `gpt-5.3-codex`)
7. **Fuzzy Matching** - Word-boundary matching for partial model names

**Provider Preference:**

When multiple matches exist, original model creators are preferred over resellers:

| Preferred (Original) | Deprioritized (Reseller) |
|---------------------|-------------------------|
| `xai/` (Grok) | `azure_ai/` |
| `anthropic/` (Claude) | `bedrock/` |
| `openai/` (GPT) | `vertex_ai/` |
| `google/` (Gemini) | `together_ai/` |
| `meta-llama/` | `fireworks_ai/` |

Example: `grok-code` matches `xai/grok-code-fast-1` ($0.20/$1.50) instead of `azure_ai/grok-code-fast-1` ($3.50/$17.50).
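That preference can be pictured as a simple ranking over provider prefixes (an illustrative sketch only; tokscale's real resolver may weigh additional signals beyond the prefix):

```python
# Prefix lists taken from the preference table above
PREFERRED = ("xai/", "anthropic/", "openai/", "google/", "meta-llama/")
DEPRIORITIZED = ("azure_ai/", "bedrock/", "vertex_ai/", "together_ai/", "fireworks_ai/")

def rank(candidate):
    """Lower rank wins: original creators first, resellers last."""
    if candidate.startswith(PREFERRED):
        return 0
    if candidate.startswith(DEPRIORITIZED):
        return 2
    return 1  # unknown providers sit in between

def pick(candidates):
    """Choose the best-ranked candidate among multiple pricing matches."""
    return min(candidates, key=rank)

assert pick(["azure_ai/grok-code-fast-1", "xai/grok-code-fast-1"]) == "xai/grok-code-fast-1"
```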
### Social

```bash
# Login to Tokscale (opens browser for GitHub auth)
tokscale login

# Check who you're logged in as
tokscale whoami

# Submit your usage data to the leaderboard
tokscale submit

# Submit with filters
tokscale submit --opencode --claude --since 2024-01-01

# Preview what would be submitted (dry run)
tokscale submit --dry-run

# Logout
tokscale logout
```

<img alt="CLI Submit" src="./.github/assets/cli-submit.png" />

### Cursor IDE Commands

Cursor IDE requires separate authentication via session token (different from the social platform login):

```bash
# Login to Cursor (requires session token from browser)
# --name is optional; it just helps you identify accounts later
tokscale cursor login --name work

# Check Cursor authentication status and session validity
tokscale cursor status

# List saved Cursor accounts
tokscale cursor accounts

# Switch active account (controls which account syncs to cursor-cache/usage.csv)
tokscale cursor switch work

# Logout from a specific account (keeps history; excludes it from aggregation)
tokscale cursor logout --name work

# Logout and delete cached usage for that account
tokscale cursor logout --name work --purge-cache

# Logout from all Cursor accounts (keeps history; excludes from aggregation)
tokscale cursor logout --all

# Logout from all accounts and delete cached usage
tokscale cursor logout --all --purge-cache
```

By default, tokscale **aggregates usage across all saved Cursor accounts** (all `cursor-cache/usage*.csv`).

When you log out, tokscale keeps your cached usage history by moving it to `cursor-cache/archive/` (so it won't be aggregated). Use `--purge-cache` if you want to delete the cached usage instead.

**Credentials storage**: Cursor accounts are stored in `~/.config/tokscale/cursor-credentials.json`. Usage data is cached at `~/.config/tokscale/cursor-cache/` (active account uses `usage.csv`, additional accounts use `usage.<account>.csv`).

**To get your Cursor session token:**
1. Open https://www.cursor.com/settings in your browser
2. Open Developer Tools (F12)
3. **Option A - Network tab**: Make any action on the page, find a request to `cursor.com/api/*`, look in the Request Headers for the `Cookie` header, and copy only the value after `WorkosCursorSessionToken=`
4. **Option B - Application tab**: Go to Application → Cookies → `https://www.cursor.com`, find the `WorkosCursorSessionToken` cookie, and copy its value (not the cookie name)

> ⚠️ **Security Warning**: Treat your session token like a password. Never share it publicly or commit it to version control. The token grants full access to your Cursor account.
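If you script step 3 (Option A), isolating the token value is plain cookie-header parsing. A sketch (the header string below is a fake placeholder; never log or commit a real token):

```python
def extract_session_token(cookie_header):
    """Pull the WorkosCursorSessionToken value out of a raw Cookie header."""
    for part in cookie_header.split(";"):
        name, _, value = part.strip().partition("=")
        if name == "WorkosCursorSessionToken":
            return value  # everything after the first '=', so encoded values survive
    return None

header = "theme=dark; WorkosCursorSessionToken=abc123tokenvalue; locale=en"
assert extract_session_token(header) == "abc123tokenvalue"
```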
### Example Output (`--light` version)

<img alt="CLI Light" src="./.github/assets/cli-light.png" />

### Configuration

Tokscale stores settings in `~/.config/tokscale/settings.json`:

```json
{
  "colorPalette": "blue",
  "includeUnusedModels": false
}
```

| Setting | Type | Default | Description |
|---------|------|---------|-------------|
| `colorPalette` | string | `"blue"` | TUI color theme (green, halloween, teal, blue, pink, purple, orange, monochrome, ylgnbu) |
| `includeUnusedModels` | boolean | `false` | Show models with zero tokens in reports |
| `autoRefreshEnabled` | boolean | `false` | Enable auto-refresh in TUI |
| `autoRefreshMs` | number | `60000` | Auto-refresh interval (30000-3600000ms) |
| `nativeTimeoutMs` | number | `300000` | Maximum time for native subprocess processing (5000-3600000ms) |

### Environment Variables

Environment variables override config file values. For CI/CD or one-off use:

| Variable | Default | Description |
|----------|---------|-------------|
| `TOKSCALE_NATIVE_TIMEOUT_MS` | `300000` (5 min) | Overrides `nativeTimeoutMs` config |

```bash
# Example: Increase timeout for very large datasets
TOKSCALE_NATIVE_TIMEOUT_MS=600000 tokscale graph --output data.json
```

> **Note**: For persistent changes, prefer setting `nativeTimeoutMs` in `~/.config/tokscale/settings.json`. Environment variables are best for one-off overrides or CI/CD.
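The resulting precedence is: environment variable over settings file over built-in default. A sketch of that resolution order (illustrative Python, not tokscale's actual config loader):

```python
import json

DEFAULT_NATIVE_TIMEOUT_MS = 300000  # built-in default from the table above

def effective_timeout(settings_json, env):
    """Resolve nativeTimeoutMs: env var > settings file > default."""
    value = json.loads(settings_json).get("nativeTimeoutMs", DEFAULT_NATIVE_TIMEOUT_MS)
    if "TOKSCALE_NATIVE_TIMEOUT_MS" in env:
        value = int(env["TOKSCALE_NATIVE_TIMEOUT_MS"])
    return value

assert effective_timeout("{}", {}) == 300000
assert effective_timeout('{"nativeTimeoutMs": 120000}', {}) == 120000
assert effective_timeout('{"nativeTimeoutMs": 120000}',
                         {"TOKSCALE_NATIVE_TIMEOUT_MS": "600000"}) == 600000
```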
### Headless Mode

Tokscale can aggregate token usage from **Codex CLI headless outputs** for automation, CI/CD pipelines, and batch processing.

**What is headless mode?**

When you run Codex CLI with JSON output flags (e.g., `codex exec --json`), it outputs usage data to stdout instead of storing it in its regular session directories. Headless mode allows you to capture and track this usage.

**Storage location:** `~/.config/tokscale/headless/`

On macOS, Tokscale also scans `~/Library/Application Support/tokscale/headless/` when `TOKSCALE_HEADLESS_DIR` is not set.

Tokscale automatically scans this directory structure:
```
~/.config/tokscale/headless/
└── codex/       # Codex CLI JSONL outputs
```

**Environment variable:** Set `TOKSCALE_HEADLESS_DIR` to customize the headless log directory:
```bash
export TOKSCALE_HEADLESS_DIR="$HOME/my-custom-logs"
```

**Recommended (automatic capture):**

| Tool | Command Example |
|------|-----------------|
| **Codex CLI** | `tokscale headless codex exec -m gpt-5 "implement feature"` |

**Manual redirect (optional):**

| Tool | Command Example |
|------|-----------------|
| **Codex CLI** | `codex exec --json "implement feature" > ~/.config/tokscale/headless/codex/ci-run.jsonl` |

**Diagnostics:**

```bash
# Show scan locations and headless counts
tokscale sources
tokscale sources --json
```

**CI/CD integration example:**

```bash
# In your GitHub Actions workflow
- name: Run AI automation
  run: |
    mkdir -p ~/.config/tokscale/headless/codex
    codex exec --json "review code changes" \
      > ~/.config/tokscale/headless/codex/pr-${{ github.event.pull_request.number }}.jsonl

# Later, track usage
- name: Report token usage
  run: tokscale --json
```

> **Note**: Headless capture is supported for Codex CLI only. If you run Codex directly, redirect stdout to the headless directory as shown above.
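Aggregating a captured JSONL file is just summing token counts line by line. A sketch of the shape of that work (the `usage`/`input_tokens`/`output_tokens` field names are hypothetical, chosen for illustration; the actual Codex CLI event schema may differ):

```python
import json

def total_tokens(jsonl_text):
    """Sum token counts from JSONL usage events (field names are hypothetical)."""
    total = 0
    for line in jsonl_text.splitlines():
        if not line.strip():
            continue  # skip blank lines between events
        event = json.loads(line)
        usage = event.get("usage", {})
        total += usage.get("input_tokens", 0) + usage.get("output_tokens", 0)
    return total

sample = (
    '{"usage": {"input_tokens": 1200, "output_tokens": 340}}\n'
    '{"usage": {"input_tokens": 800, "output_tokens": 90}}\n'
)
assert total_tokens(sample) == 2430
```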
If you run Codex directly, redirect stdout to the headless directory as shown above.\n\n## Frontend Visualization\n\nThe frontend provides a GitHub-style contribution graph visualization:\n\n### Features\n\n- **2D View**: Classic GitHub contribution calendar\n- **3D View**: Isometric 3D contribution graph with height based on token usage\n- **Multiple color palettes**: GitHub, GitLab, Halloween, Winter, and more\n- **3-way theme toggle**: Light \u002F Dark \u002F System (follows OS preference)\n- **GitHub Primer design**: Uses GitHub's official color system\n- **Interactive tooltips**: Hover for detailed daily breakdowns\n- **Day breakdown panel**: Click to see per-source and per-model details\n- **Year filtering**: Navigate between years\n- **Source filtering**: Filter by platform (OpenCode, Claude, Codex, Cursor, Gemini, Amp, Droid, OpenClaw, Pi, Kimi, Qwen, Roo Code, Kilo, Mux, Kilo CLI, Crush, Synthetic)\n- **Stats panel**: Total cost, tokens, active days, streaks\n- **FOUC prevention**: Theme applied before React hydrates (no flash)\n\n### Running the Frontend\n\n```bash\ncd packages\u002Ffrontend\nbun install\nbun run dev\n```\n\nOpen [http:\u002F\u002Flocalhost:3000](http:\u002F\u002Flocalhost:3000) to access the social platform.\n\n## Social Platform\n\nTokscale includes a social platform where you can share your usage data and compete with other developers.\n\n### Features\n\n- **Leaderboard** - See who's using the most tokens across all platforms\n- **User Profiles** - Public profiles with contribution graphs and statistics\n- **Period Filtering** - View stats for all time, this month, or this week\n- **GitHub Integration** - Login with your GitHub account\n- **Local Viewer** - View your data privately without submitting\n\n### GitHub Profile Embed Widget\n\nYou can embed your public Tokscale stats directly in your GitHub profile README:\n\n```md\n[![Tokscale 
Stats](https:\u002F\u002Ftokscale.ai\u002Fapi\u002Fembed\u002F\u003Cusername>\u002Fsvg)](https:\u002F\u002Ftokscale.ai\u002Fu\u002F\u003Cusername>)\n```\n\n- Replace `\u003Cusername>` with your GitHub username\n- Optional query params:\n  - `theme=light` for a light theme\n  - `sort=tokens` (default) or `sort=cost` to control ranking basis\n  - `compact=1` to use compact layout + compact number notation (e.g., `1.2M`, `$3.4K`)\n- Example:\n  - `https:\u002F\u002Ftokscale.ai\u002Fapi\u002Fembed\u002F\u003Cusername>\u002Fsvg?theme=light&sort=cost&compact=1`\n\n### GitHub Profile Badge\n\nYou can also use a shields.io-style badge for a more compact display:\n\n```md\n![Tokscale Tokens](https:\u002F\u002Ftokscale.ai\u002Fapi\u002Fbadge\u002F\u003Cusername>\u002Fsvg)\n```\n\n- Replace `\u003Cusername>` with your GitHub username\n- Optional query params:\n  - `metric=tokens` (default), `metric=cost`, or `metric=rank`\n  - `style=flat` (default) or `style=flat-square`\n  - `sort=tokens` (default) or `sort=cost` to control ranking basis\n  - `compact=1` to use compact number notation (e.g., `1.2M`, `$3.4K`)\n  - `label=\u003Ctext>` to override the left-side label\n  - `color=\u003Chex>` to override the right-side color (e.g., `color=ff5733`)\n- Examples:\n  - `https:\u002F\u002Ftokscale.ai\u002Fapi\u002Fbadge\u002F\u003Cusername>\u002Fsvg?metric=cost&compact=1`\n  - `https:\u002F\u002Ftokscale.ai\u002Fapi\u002Fbadge\u002F\u003Cusername>\u002Fsvg?metric=rank&sort=cost&style=flat-square`\n\n### Getting Started\n\n1. **Login** - Run `tokscale login` to authenticate via GitHub\n2. **Submit** - Run `tokscale submit` to upload your usage data\n3. 
**View** - Visit the web platform to see your profile and the leaderboard\n\n### Data Validation\n\nSubmitted data goes through Level 1 validation:\n- Mathematical consistency (totals match, no negatives)\n- No future dates\n- Required fields present\n- Duplicate detection\n\n## Wrapped 2025\n\n![Wrapped 2025](.github\u002Fassets\u002Fhero-wrapped-2025.png)\n\nGenerate a beautiful year-in-review image summarizing your AI coding assistant usage—inspired by Spotify Wrapped.\n\n| `bunx tokscale@latest wrapped` | `bunx tokscale@latest wrapped --clients` | `bunx tokscale@latest wrapped --agents --disable-pinned` |\n|:---:|:---:|:---:|\n| ![Wrapped 2025 (Agents + Pin Sisyphus)](.github\u002Fassets\u002Fwrapped-2025-agents.png) | ![Wrapped 2025 (Clients)](.github\u002Fassets\u002Fwrapped-2025-clients.png) | ![Wrapped 2025 (Agents + Disable Pinned)](.github\u002Fassets\u002Fwrapped-2025-agents-disable-pinned.png) |\n\n### Command\n\n```bash\n# Generate wrapped image for current year\ntokscale wrapped\n\n# Generate for a specific year\ntokscale wrapped --year 2025\n```\n\n### What's Included\n\nThe generated image includes:\n\n- **Total Tokens** - Your total token consumption for the year\n- **Top Models** - Your 3 most-used AI models ranked by cost\n- **Top Clients** - Your 3 most-used platforms (OpenCode, Claude Code, Cursor, etc.)\n- **Messages** - Total number of AI interactions\n- **Active Days** - Days with at least one AI interaction\n- **Cost** - Estimated total cost based on LiteLLM pricing\n- **Streak** - Your longest consecutive streak of active days\n- **Contribution Graph** - A visual heatmap of your yearly activity\n\nThe generated PNG is optimized for sharing on social media. 
Share your coding journey with the community!\n\n## Development\n\n> **Quick setup**: If you just want to get started quickly, see [Development Setup](#development-setup) in the Installation section above.\n\n### Prerequisites\n\n```bash\n# Bun (required)\nbun --version\n\n# Rust (for native module)\nrustc --version\ncargo --version\n```\n\n### How to Run\n\nAfter following the [Development Setup](#development-setup), you can:\n\n```bash\n# Build native module (optional but recommended)\nbun run build:core\n\n# Run in development mode (launches TUI)\ncd packages\u002Fcli && bun src\u002Fcli.ts\n\n# Or use legacy CLI mode\ncd packages\u002Fcli && bun src\u002Fcli.ts --light\n```\n\n\u003Cdetails>\n\u003Csummary>Advanced Development\u003C\u002Fsummary>\n\n### Project Scripts\n\n| Script | Description |\n|--------|-------------|\n| `bun run cli` | Run CLI in development mode (TUI with Bun) |\n| `bun run build:core` | Build native Rust module (release) |\n| `bun run build:cli` | Build CLI TypeScript to dist\u002F |\n| `bun run build` | Build both core and CLI |\n| `bun run dev:frontend` | Run frontend development server |\n\n**Package-specific scripts** (from within package directories):\n- `packages\u002Fcli`: `bun run dev`, `bun run tui`\n- `packages\u002Fcore`: `bun run build:debug`, `bun run test`, `bun run bench`\n\n**Note**: This project uses **Bun** as the package manager for development.\n\n### Testing\n\n```bash\n# Test native module (Rust)\ncd packages\u002Fcore\nbun run test:rust      # Cargo tests\nbun run test           # Node.js integration tests\nbun run test:all       # Both\n```\n\n### Native Module Development\n\n```bash\ncd packages\u002Fcore\n\n# Build in debug mode (faster compilation)\nbun run build:debug\n\n# Build in release mode (optimized)\nbun run build\n\n# Run Rust benchmarks\nbun run bench\n```\n\n### Graph Command Options\n\n```bash\n# Export graph data to file\ntokscale graph --output usage-data.json\n\n# Date filtering (all shortcuts 
work)\ntokscale graph --today\ntokscale graph --week\ntokscale graph --since 2024-01-01 --until 2024-12-31\ntokscale graph --year 2024\n\n# Filter by platform\ntokscale graph --opencode --claude\n\n# Show processing time benchmark\ntokscale graph --output data.json --benchmark\n```\n\n### Benchmark Flag\n\nShow processing time for performance analysis:\n\n```bash\ntokscale --benchmark           # Show processing time with default view\ntokscale models --benchmark    # Benchmark models report\ntokscale monthly --benchmark   # Benchmark monthly report\ntokscale graph --benchmark     # Benchmark graph generation\n```\n\n### Generating Data for Frontend\n\n```bash\n# Export data for visualization\ntokscale graph --output packages\u002Ffrontend\u002Fpublic\u002Fmy-data.json\n```\n\n### Performance\n\nThe native Rust module provides significant performance improvements:\n\n| Operation | TypeScript | Rust Native | Speedup |\n|-----------|------------|-------------|---------|\n| File Discovery | ~500ms | ~50ms | **10x** |\n| JSON Parsing | ~800ms | ~100ms | **8x** |\n| Aggregation | ~200ms | ~25ms | **8x** |\n| **Total** | **~1.5s** | **~175ms** | **~8.5x** |\n\n*Benchmarks for ~1000 session files, 100k messages*\n\n#### Memory Optimization\n\nThe native module also provides ~45% memory reduction through:\n\n- Streaming JSON parsing (no full file buffering)\n- Zero-copy string handling\n- Efficient parallel aggregation with map-reduce\n\n#### Running Benchmarks\n\n```bash\n# Generate synthetic data\ncd packages\u002Fbenchmarks && bun run generate\n\n# Run Rust benchmarks\ncd packages\u002Fcore && bun run bench\n```\n\n\u003C\u002Fdetails>\n\n## Supported Platforms\n\n### Native Module Targets\n\n| Platform | Architecture | Status |\n|----------|--------------|--------|\n| macOS | x86_64 | ✅ Supported |\n| macOS | aarch64 (Apple Silicon) | ✅ Supported |\n| Linux | x86_64 (glibc) | ✅ Supported |\n| Linux | aarch64 (glibc) | ✅ Supported |\n| Linux | x86_64 (musl) | ✅ 
Supported |\n| Linux | aarch64 (musl) | ✅ Supported |\n| Windows | x86_64 | ✅ Supported |\n| Windows | aarch64 | ✅ Supported |\n\n### Windows Support\n\nTokscale fully supports Windows. The TUI and CLI work the same as on macOS\u002FLinux.\n\n**Installation on Windows:**\n```powershell\n# Install Bun (PowerShell)\npowershell -c \"irm bun.sh\u002Finstall.ps1 | iex\"\n\n# Run tokscale\nbunx tokscale@latest\n```\n\n#### Data Locations on Windows\n\nAI coding tools store their session data in cross-platform locations. Most tools use the same relative paths on all platforms:\n\n| Tool | Unix Path | Windows Path | Source |\n|------|-----------|--------------|--------|\n| OpenCode | `~\u002F.local\u002Fshare\u002Fopencode\u002F` | `%USERPROFILE%\\.local\\share\\opencode\\` | Uses [`xdg-basedir`](https:\u002F\u002Fgithub.com\u002Fsindresorhus\u002Fxdg-basedir) for cross-platform consistency ([source](https:\u002F\u002Fgithub.com\u002Fsst\u002Fopencode\u002Fblob\u002Fmain\u002Fpackages\u002Fopencode\u002Fsrc\u002Fglobal\u002Findex.ts)) |\n| Claude Code | `~\u002F.claude\u002F` | `%USERPROFILE%\\.claude\\` | Same path on all platforms |\n| OpenClaw | `~\u002F.openclaw\u002F` (+ legacy: `.clawdbot`, `.moltbot`, `.moldbot`) | `%USERPROFILE%\\.openclaw\\` (+ legacy paths) | Same path on all platforms |\n| Codex CLI | `~\u002F.codex\u002F` | `%USERPROFILE%\\.codex\\` | Configurable via `CODEX_HOME` env var ([source](https:\u002F\u002Fgithub.com\u002Fopenai\u002Fcodex)) |\n| Gemini CLI | `~\u002F.gemini\u002F` | `%USERPROFILE%\\.gemini\\` | Same path on all platforms |\n| Amp | `~\u002F.local\u002Fshare\u002Famp\u002F` | `%USERPROFILE%\\.local\\share\\amp\\` | Uses `xdg-basedir` like OpenCode |\n| Cursor | API sync | API sync | Data fetched via API, cached in `%USERPROFILE%\\.config\\tokscale\\cursor-cache\\` |\n| Droid | `~\u002F.factory\u002F` | `%USERPROFILE%\\.factory\\` | Same path on all platforms |\n| Pi | `~\u002F.pi\u002F` | `%USERPROFILE%\\.pi\\` | Same path on all 
platforms |\n| Kimi CLI | `~\u002F.kimi\u002F` | `%USERPROFILE%\\.kimi\\` | Same path on all platforms |\n| Qwen CLI | `~\u002F.qwen\u002F` | `%USERPROFILE%\\.qwen\\` | Same path on all platforms |\n| Roo Code | `~\u002F.config\u002FCode\u002FUser\u002FglobalStorage\u002Frooveterinaryinc.roo-cline\u002Ftasks\u002F` | `%USERPROFILE%\\.config\\Code\\User\\globalStorage\\rooveterinaryinc.roo-cline\\tasks\\` | VS Code globalStorage task logs |\n| Kilo | `~\u002F.config\u002FCode\u002FUser\u002FglobalStorage\u002Fkilocode.kilo-code\u002Ftasks\u002F` | `%USERPROFILE%\\.config\\Code\\User\\globalStorage\\kilocode.kilo-code\\tasks\\` | VS Code globalStorage task logs |\n| Mux | `~\u002F.mux\u002Fsessions\u002F` | `%USERPROFILE%\\.mux\\sessions\\` | Same path on all platforms |\n| Kilo CLI | `~\u002F.local\u002Fshare\u002Fkilo\u002F` | `%USERPROFILE%\\.local\\share\\kilo\\` | Uses `xdg-basedir` like OpenCode |\n| Crush | `$XDG_DATA_HOME\u002Fcrush\u002F` (fallback: `~\u002F.local\u002Fshare\u002Fcrush\u002F`) | `%USERPROFILE%\\.local\\share\\crush\\` (or `%XDG_DATA_HOME%\\crush\\` if set) | Uses XDG data directory with fallback |\n| Synthetic | Re-attributed from other sources | Re-attributed from other sources | Detects `hf:` model prefix + `synthetic` provider |\n\n> **Note**: On Windows, `~` expands to `%USERPROFILE%` (e.g., `C:\\Users\\YourName`). These tools intentionally use Unix-style paths (like `.local\u002Fshare`) even on Windows for cross-platform consistency, rather than Windows-native paths like `%APPDATA%`.\n\n#### Windows-Specific Configuration\n\nTokscale stores its configuration in:\n- **Config**: `%USERPROFILE%\\.config\\tokscale\\settings.json`\n- **Cache**: `%USERPROFILE%\\.cache\\tokscale\\`\n- **Cursor credentials**: `%USERPROFILE%\\.config\\tokscale\\cursor-credentials.json`\n\n## Session Data Retention\n\nBy default, some AI coding assistants automatically delete old session files. 
To preserve your usage history for accurate tracking, disable or extend the cleanup period.\n\n| Platform | Default | Config File | Setting to Disable | Source |\n|----------|---------|-------------|-------------------|--------|\n| Claude Code | **⚠️ 30 days** | `~\u002F.claude\u002Fsettings.json` | `\"cleanupPeriodDays\": 9999999999` | [Docs](https:\u002F\u002Fdocs.anthropic.com\u002Fen\u002Fdocs\u002Fclaude-code\u002Fsettings) |\n| Gemini CLI | Disabled | `~\u002F.gemini\u002Fsettings.json` | `\"general.sessionRetention.enabled\": false` | [Docs](https:\u002F\u002Fgithub.com\u002Fgoogle-gemini\u002Fgemini-cli\u002Fblob\u002Fmain\u002Fdocs\u002Fcli\u002Fsession-management.md) |\n| Codex CLI | Disabled | N\u002FA | No cleanup feature | [#6015](https:\u002F\u002Fgithub.com\u002Fopenai\u002Fcodex\u002Fissues\u002F6015) |\n| OpenCode | Disabled | N\u002FA | No cleanup feature | [#4980](https:\u002F\u002Fgithub.com\u002Fsst\u002Fopencode\u002Fissues\u002F4980) |\n\n### Claude Code\n\n**Default**: 30 days cleanup period\n\nAdd to `~\u002F.claude\u002Fsettings.json`:\n```json\n{\n  \"cleanupPeriodDays\": 9999999999\n}\n```\n\n> Setting an extremely large value (e.g., `9999999999` days ≈ 27 million years) effectively disables cleanup.\n\n### Gemini CLI\n\n**Default**: Cleanup disabled (sessions persist forever)\n\nIf you've enabled cleanup and want to disable it, remove or set `enabled: false` in `~\u002F.gemini\u002Fsettings.json`:\n```json\n{\n  \"general\": {\n    \"sessionRetention\": {\n      \"enabled\": false\n    }\n  }\n}\n```\n\nOr set an extremely long retention period:\n```json\n{\n  \"general\": {\n    \"sessionRetention\": {\n      \"enabled\": true,\n      \"maxAge\": \"9999999d\"\n    }\n  }\n}\n```\n\n### Codex CLI\n\n**Default**: No automatic cleanup (sessions persist forever)\n\nCodex CLI does not have built-in session cleanup. 
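Because nothing is pruned automatically, accumulated sessions can grow over time; a quick way to check their disk footprint on macOS or Linux (a simple sketch, assuming the directory exists):

```bash
# Show total size of stored Codex sessions; print a fallback if none exist yet
du -sh ~/.codex/sessions 2>/dev/null || echo no sessions yet
```
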
Sessions in `~\u002F.codex\u002Fsessions\u002F` persist indefinitely.\n\n> **Note**: There's an open feature request for this: [#6015](https:\u002F\u002Fgithub.com\u002Fopenai\u002Fcodex\u002Fissues\u002F6015)\n\n### OpenCode\n\n**Default**: No automatic cleanup (sessions persist forever)\n\nOpenCode does not have built-in session cleanup. Sessions in `~\u002F.local\u002Fshare\u002Fopencode\u002Fstorage\u002F` persist indefinitely.\n\n> **Note**: See [#4980](https:\u002F\u002Fgithub.com\u002Fsst\u002Fopencode\u002Fissues\u002F4980)\n\n---\n\n## Data Sources\n\n### OpenCode\n\nLocation: `~\u002F.local\u002Fshare\u002Fopencode\u002Fopencode.db` (v1.2+) or `storage\u002Fmessage\u002F{sessionId}\u002F*.json` (legacy)\n\nOpenCode 1.2+ stores sessions in SQLite. Tokscale reads from SQLite first and falls back to legacy JSON files for older versions.\n\nEach message contains:\n```json\n{\n  \"id\": \"msg_xxx\",\n  \"role\": \"assistant\",\n  \"modelID\": \"claude-sonnet-4-20250514\",\n  \"providerID\": \"anthropic\",\n  \"tokens\": {\n    \"input\": 1234,\n    \"output\": 567,\n    \"reasoning\": 0,\n    \"cache\": { \"read\": 890, \"write\": 123 }\n  },\n  \"time\": { \"created\": 1699999999999 }\n}\n```\n\n### Claude Code\n\nLocation: `~\u002F.claude\u002Fprojects\u002F{projectPath}\u002F*.jsonl`\n\nJSONL format with assistant messages containing usage data:\n```json\n{\"type\": \"assistant\", \"message\": {\"model\": \"claude-sonnet-4-20250514\", \"usage\": {\"input_tokens\": 1234, \"output_tokens\": 567, \"cache_read_input_tokens\": 890}}, \"timestamp\": \"2024-01-01T00:00:00Z\"}\n```\n\n### Codex CLI\n\nLocation: `~\u002F.codex\u002Fsessions\u002F*.jsonl`\n\nEvent-based format with `token_count` events:\n```json\n{\"type\": \"event_msg\", \"payload\": {\"type\": \"token_count\", \"info\": {\"last_token_usage\": {\"input_tokens\": 1234, \"output_tokens\": 567}}}}\n```\n\n### Gemini CLI\n\nLocation: 
`~\u002F.gemini\u002Ftmp\u002F{projectHash}\u002Fchats\u002F*.json`\n\nSession files containing message arrays:\n```json\n{\n  \"sessionId\": \"xxx\",\n  \"messages\": [\n    {\"type\": \"gemini\", \"model\": \"gemini-2.5-pro\", \"tokens\": {\"input\": 1234, \"output\": 567, \"cached\": 890, \"thoughts\": 123}}\n  ]\n}\n```\n\n### Cursor IDE\n\nLocation: `~\u002F.config\u002Ftokscale\u002Fcursor-cache\u002F` (synced via Cursor API)\n\nCursor data is fetched from the Cursor API using your session token and cached locally. Run `tokscale cursor login` to authenticate. See [Cursor IDE Commands](#cursor-ide-commands) for setup instructions.\n\n### OpenClaw\n\nLocation: `~\u002F.openclaw\u002Fagents\u002F*\u002Fsessions\u002Fsessions.json` (also scans legacy paths: `~\u002F.clawdbot\u002F`, `~\u002F.moltbot\u002F`, `~\u002F.moldbot\u002F`)\n\nIndex file pointing to JSONL session files:\n```json\n{\n  \"agent:main:main\": {\n    \"sessionId\": \"uuid\",\n    \"sessionFile\": \"\u002Fpath\u002Fto\u002Fsession.jsonl\"\n  }\n}\n```\n\nSession JSONL format with model_change events and assistant messages:\n```json\n{\"type\":\"model_change\",\"provider\":\"openai-codex\",\"modelId\":\"gpt-5.2\"}\n{\"type\":\"message\",\"message\":{\"role\":\"assistant\",\"usage\":{\"input\":1660,\"output\":55,\"cacheRead\":108928,\"cost\":{\"total\":0.02}},\"timestamp\":1769753935279}}\n```\n\n### Pi\n\nLocation: `~\u002F.pi\u002Fagent\u002Fsessions\u002F\u003Cencoded-cwd>\u002F*.jsonl`\n\nJSONL format with session header and message entries:\n```json\n{\"type\":\"session\",\"id\":\"pi_ses_001\",\"timestamp\":\"2026-01-01T00:00:00.000Z\",\"cwd\":\"\u002Ftmp\"}\n{\"type\":\"message\",\"id\":\"msg_001\",\"timestamp\":\"2026-01-01T00:00:01.000Z\",\"message\":{\"role\":\"assistant\",\"model\":\"claude-3-5-sonnet\",\"provider\":\"anthropic\",\"usage\":{\"input\":100,\"output\":50,\"cacheRead\":10,\"cacheWrite\":5,\"totalTokens\":165}}}\n```\n\n### Kimi CLI\n\nLocation: 
`~\u002F.kimi\u002Fsessions\u002F{GROUP_ID}\u002F{SESSION_UUID}\u002Fwire.jsonl`\n\nwire.jsonl format with StatusUpdate messages:\n```json\n{\"type\": \"metadata\", \"protocol_version\": \"1.3\"}\n{\"timestamp\": 1770983426.420942, \"message\": {\"type\": \"StatusUpdate\", \"payload\": {\"token_usage\": {\"input_other\": 1562, \"output\": 2463, \"input_cache_read\": 0, \"input_cache_creation\": 0}, \"message_id\": \"chatcmpl-xxx\"}}}\n```\n\n### Qwen CLI\n\nLocation: `~\u002F.qwen\u002Fprojects\u002F{PROJECT_PATH}\u002Fchats\u002F{CHAT_ID}.jsonl`\n\nFormat: JSONL — one JSON object per line, each with `type`, `model`, `timestamp`, `sessionId`, and `usageMetadata` fields.\n\nToken fields (from `usageMetadata`):\n- `promptTokenCount` → input tokens\n- `candidatesTokenCount` → output tokens\n- `thoughtsTokenCount` → reasoning\u002Fthinking tokens\n- `cachedContentTokenCount` → cached input tokens\n\n### Roo Code\n\nLocation:\n- Local: `~\u002F.config\u002FCode\u002FUser\u002FglobalStorage\u002Frooveterinaryinc.roo-cline\u002Ftasks\u002F{TASK_ID}\u002Fui_messages.json`\n- Server (best-effort): `~\u002F.vscode-server\u002Fdata\u002FUser\u002FglobalStorage\u002Frooveterinaryinc.roo-cline\u002Ftasks\u002F{TASK_ID}\u002Fui_messages.json`\n\nEach task directory may also include `api_conversation_history.json` with `\u003Cenvironment_details>` blocks used for model\u002Fagent metadata.\n\n`ui_messages.json` is an array of UI events. 
Tokscale counts only:\n- `type == \"say\"`\n- `say == \"api_req_started\"`\n\nThe `text` field is JSON containing token\u002Fcost metadata:\n```json\n{\n  \"type\": \"say\",\n  \"say\": \"api_req_started\",\n  \"ts\": \"2026-02-18T12:00:00Z\",\n  \"text\": \"{\\\"cost\\\":0.12,\\\"tokensIn\\\":100,\\\"tokensOut\\\":50,\\\"cacheReads\\\":20,\\\"cacheWrites\\\":5,\\\"apiProtocol\\\":\\\"anthropic\\\"}\"\n}\n```\n\n### Kilo\n\nLocation:\n- Local: `~\u002F.config\u002FCode\u002FUser\u002FglobalStorage\u002Fkilocode.kilo-code\u002Ftasks\u002F{TASK_ID}\u002Fui_messages.json`\n- Server (best-effort): `~\u002F.vscode-server\u002Fdata\u002FUser\u002FglobalStorage\u002Fkilocode.kilo-code\u002Ftasks\u002F{TASK_ID}\u002Fui_messages.json`\n\nKilo uses the same task log shape as Roo Code. Tokscale applies the same rules:\n- count only `say\u002Fapi_req_started` events from `ui_messages.json`\n- parse `tokensIn`, `tokensOut`, `cacheReads`, `cacheWrites`, `cost`, and `apiProtocol` from `text` JSON\n- enrich model\u002Fagent metadata from sibling `api_conversation_history.json` when available\n\n### Mux\n\nLocation:\n- `~\u002F.mux\u002Fsessions\u002F{WORKSPACE_ID}\u002Fsession-usage.json`\n\nMux stores cumulative per-session token usage in `session-usage.json` files. Each file contains a `byModel` map with per-model token breakdowns:\n- `input`, `cached` (cache reads), `cacheCreate` (cache writes), `output`, `reasoning`\n- Model names use `provider:model` format (e.g., `anthropic:claude-opus-4-6`) — tokscale strips the provider prefix for model identification\n- Sub-agent usage is automatically rolled up into parent sessions by Mux, so there is no double-counting\n\n### Kilo CLI\n\nLocation: `~\u002F.local\u002Fshare\u002Fkilo\u002Fkilo.db`\n\nKilo CLI stores session data in a SQLite database similar to OpenCode. 
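If you want to inspect the raw data yourself, the standard `sqlite3` shell can list whatever tables your Kilo CLI version created (the schema is not documented here, so treat this as exploration rather than a stable interface):

```bash
# List tables in the Kilo CLI database (read-only, so nothing is modified)
sqlite3 -readonly ~/.local/share/kilo/kilo.db '.tables'
```
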
Each message row contains per-message token breakdowns (input, output, cache read\u002Fwrite, reasoning) with model and provider attribution.\n\n### Crush\n\nLocation: Project-level SQLite databases discovered via `$XDG_DATA_HOME\u002Fcrush\u002Fprojects.json` (fallback: `~\u002F.local\u002Fshare\u002Fcrush\u002Fprojects.json`)\n\nCrush stores usage in per-project SQLite databases (`crush.db`). Tokscale imports session-level cost totals from root sessions only, because Crush does not expose reliable per-message or per-model token accounting. Records appear as `model=session-total` with zero token breakdown.\n\n### Synthetic (synthetic.new)\n\nSynthetic usage is detected via post-processing of existing agent session files. Messages are re-attributed to `synthetic` when they use `hf:` model IDs or synthetic providers (`synthetic`, `glhf`, `octofriend`).\n\nTokscale also checks Octofriend SQLite at `~\u002F.local\u002Fshare\u002Foctofriend\u002Fsqlite.db` and parses token-bearing records when available.\n\n## Pricing\n\nTokscale fetches real-time pricing from [LiteLLM's pricing database](https:\u002F\u002Fgithub.com\u002FBerriAI\u002Flitellm\u002Fblob\u002Fmain\u002Fmodel_prices_and_context_window.json).\n\n**Dynamic Fallback**: For models not yet available in LiteLLM (e.g., recently released models), Tokscale automatically fetches pricing from [OpenRouter's endpoints API](https:\u002F\u002Fopenrouter.ai\u002Fdocs\u002Fapi\u002Fapi-reference\u002Fendpoints\u002Flist-endpoints). This ensures you get accurate pricing from the model's author provider (e.g., Z.AI for glm-4.7) without waiting for LiteLLM updates.\n\n**Cursor Model Pricing**: For very recently released models not yet in either LiteLLM or OpenRouter (e.g., `gpt-5.3-codex`), Tokscale includes hardcoded pricing sourced from [Cursor's model docs](https:\u002F\u002Fcursor.com\u002Fen-US\u002Fdocs\u002Fmodels). 
These overrides are checked after all upstream sources but before fuzzy matching, so they automatically yield once real upstream pricing becomes available.\n\n**Caching**: Pricing data is cached to disk with 1-hour TTL for fast startup:\n- LiteLLM cache: `~\u002F.cache\u002Ftokscale\u002Fpricing-litellm.json`\n- OpenRouter cache: `~\u002F.cache\u002Ftokscale\u002Fpricing-openrouter.json` (caches author pricing for models from supported providers)\n\nPricing includes:\n- Input tokens\n- Output tokens\n- Cache read tokens (discounted)\n- Cache write tokens\n- Reasoning tokens (for models like o1)\n- Tiered pricing (above 200k tokens)\n\n## Contributing\n\nContributions are welcome! Please follow these steps:\n\n1. Fork the repository\n2. Create a feature branch (`git checkout -b feature\u002Famazing-feature`)\n3. Make your changes\n4. Run tests (`cd packages\u002Fcore && bun run test:all`)\n5. Commit your changes (`git commit -m 'Add amazing feature'`)\n6. Push to the branch (`git push origin feature\u002Famazing-feature`)\n7. 
Open a Pull Request\n\n### Development Guidelines\n\n- Follow existing code style\n- Add tests for new functionality\n- Update documentation as needed\n- Keep commits focused and atomic\n\n## Acknowledgments\n\n- [ccusage](https:\u002F\u002Fgithub.com\u002Fryoppippi\u002Fccusage), [viberank](https:\u002F\u002Fgithub.com\u002Fsculptdotfun\u002Fviberank), and [Isometric Contributions](https:\u002F\u002Fgithub.com\u002Fjasonlong\u002Fisometric-contributions) for inspiration\n- [Ratatui](https:\u002F\u002Fgithub.com\u002Fratatui\u002Fratatui) for terminal UI framework\n- [Solid.js](https:\u002F\u002Fwww.solidjs.com\u002F) for reactive rendering\n- [LiteLLM](https:\u002F\u002Fgithub.com\u002FBerriAI\u002Flitellm) for pricing data\n- [napi-rs](https:\u002F\u002Fnapi.rs\u002F) for Rust\u002FNode.js bindings\n- [github-contributions-canvas](https:\u002F\u002Fgithub.com\u002Fsallar\u002Fgithub-contributions-canvas) for 2D graph reference\n\n## License\n\n\u003Cp align=\"center\">\n  \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fjunhoyeo\">\n    \u003Cimg src=\".github\u002Fassets\u002Flabtocat-on-spaceship.png\" width=\"540\">\n  \u003C\u002Fa>\n\u003C\u002Fp>\n\n\u003Cp align=\"center\">\n  \u003Cstrong>MIT © \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fjunhoyeo\">Junho Yeo\u003C\u002Fa>\u003C\u002Fstrong>\n\u003C\u002Fp>\n\nIf you find this project intriguing, **please consider starring it ⭐** or [follow me on GitHub](https:\u002F\u002Fgithub.com\u002Fjunhoyeo) and join the ride (1.1k+ already aboard). 
I code around the clock and ship mind-blowing things on a regular basis—your support won't go to waste.\n","\u003C!-- \u003CCENTERED SECTION FOR GITHUB DISPLAY> -->\n\n\u003Cdiv align=\"center\">\n\n[![Tokscale](.\u002F.github\u002Fassets\u002Fhero-v2.png)](https:\u002F\u002Ftokscale.ai)\n\n\u003C\u002Fdiv>\n\n> 一款高性能的命令行工具及可视化仪表盘，用于跟踪多个 AI 编码代理的 token 使用量和成本。\n\n> [!TIP]\n>\n> v2 已发布——原生 Rust TUI、跨平台支持等更多功能。每周我都会推出新的开源项目，别错过下一个哦。\n>\n> | [\u003Cimg alt=\"GitHub Follow\" src=\"https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Ffollowers\u002Fjunhoyeo?style=flat-square&logo=github&labelColor=black&color=24292f\" width=\"156px\" \u002F>](https:\u002F\u002Fgithub.com\u002Fjunhoyeo) | 关注 GitHub 上的 [@junhoyeo](https:\u002F\u002Fgithub.com\u002Fjunhoyeo)，了解更多项目。专注于 AI、基础设施以及相关领域的开发。 |\n> | :-----| :----- |\n> [\u003Cimg alt=\"Discord link\" src=\"https:\u002F\u002Fimg.shields.io\u002Fdiscord\u002F1480206352755458110?color=5865F2&label=discord&labelColor=black&logo=discord&logoColor=white&style=flat-square\" width=\"156px\" \u002F>](https:\u002F\u002Fdiscord.gg\u002Fh6DUGWdBbm) | 欢迎加入我们的 [Discord](https:\u002F\u002Fdiscord.gg\u002Fh6DUGWdBbm) 社区，与全球顶尖开发者一起交流互动。 |\n\n\u003Cdiv align=\"center\">\n\n[![GitHub Release](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fv\u002Frelease\u002Fjunhoyeo\u002Ftokscale?color=0073FF&labelColor=black&logo=github&style=flat-square)](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Freleases)\n[![npm Downloads](https:\u002F\u002Fimg.shields.io\u002Fnpm\u002Fdt\u002Ftokscale?color=0073FF&labelColor=black&style=flat-square)](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002Ftokscale)\n[![GitHub Contributors](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fcontributors\u002Fjunhoyeo\u002Ftokscale?color=0073FF&labelColor=black&style=flat-square)](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fgraphs\u002Fcontributors)\n[![GitHub 
Forks](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fforks\u002Fjunhoyeo\u002Ftokscale?color=0073FF&labelColor=black&style=flat-square)](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fnetwork\u002Fmembers)\n[![GitHub Stars](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fstars\u002Fjunhoyeo\u002Ftokscale?color=0073FF&labelColor=black&style=flat-square)](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fstargazers)\n[![GitHub Issues](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fissues\u002Fjunhoyeo\u002Ftokscale?color=0073FF&labelColor=black&style=flat-square)](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fissues)\n[![License](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flicense-MIT-white?labelColor=black&style=flat-square)](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fblob\u002Fmaster\u002FLICENSE)\n\n[🇺🇸 English](README.md) | [🇰🇷 한국어](README.ko.md) | [🇯🇵 日本語](README.ja.md) | [🇨🇳 简体中文](README.zh-cn.md)\n\n\u003C\u002Fdiv>\n\n\u003C!-- \u003C\u002FCENTERED SECTION FOR GITHUB DISPLAY> -->\n\n| 概览 | 模型 |\n|:---:|:---:|\n| ![TUI 概览](.github\u002Fassets\u002Ftui-overview.png) | ![TUI 模型](.github\u002Fassets\u002Ftui-models.png) | \n\n| 每日摘要 | 统计数据 |\n|:---:|:---:|\n| ![TUI 每日摘要](.github\u002Fassets\u002Ftui-daily.png) | ![TUI 统计数据](.github\u002Fassets\u002Ftui-stats.png) | \n\n| 前端（3D 贡献图） | Wrapped 2025 |\n|:---:|:---:|\n| \u003Ca href=\"https:\u002F\u002Ftokscale.ai\">\u003Cimg alt=\"前端（3D 贡献图）\" src=\".github\u002Fassets\u002Ffrontend-contributions-graph.png\" width=\"700px\" \u002F>\u003C\u002Fa> | \u003Ca href=\"#wrapped-2025\">\u003Cimg alt=\"Wrapped 2025\" src=\".github\u002Fassets\u002Fwrapped-2025-agents.png\" width=\"700px\" \u002F>\u003C\u002Fa> |\n\n> **运行 [`bunx tokscale@latest submit`](#social)，将你的使用数据提交到排行榜并创建公开个人资料！**\n\n## 概述\n\n**Tokscale** 可帮助你监控和分析以下来源的 token 消耗情况：\n\n| 标志 | 客户端 | 数据位置 | 支持情况 |\n|------|----------|---------------|-----------|\n| \u003Cimg width=\"48px\" 
src=\".github\u002Fassets\u002Fclient-opencode.png\" alt=\"OpenCode\" \u002F> | [OpenCode](https:\u002F\u002Fgithub.com\u002Fsst\u002Fopencode) | `~\u002F.local\u002Fshare\u002Fopencode\u002Fopencode.db` (1.2+) 或\u002F且 `~\u002F.local\u002Fshare\u002Fopencode\u002Fstorage\u002Fmessage\u002F` (旧版\u002F未迁移) | ✅ 是 |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-claude.jpg\" alt=\"Claude\" \u002F> | [Claude Code](https:\u002F\u002Fdocs.anthropic.com\u002Fen\u002Fdocs\u002Fclaude-code) | `~\u002F.claude\u002Fprojects\u002F` | ✅ 是 |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-openclaw.jpg\" alt=\"OpenClaw\" \u002F> | [OpenClaw](https:\u002F\u002Fopenclaw.ai\u002F) | `~\u002F.openclaw\u002Fagents\u002F` (+ 旧版：`.clawdbot`、`.moltbot`、`.moldbot`) | ✅ 是 |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-openai.jpg\" alt=\"Codex\" \u002F> | [Codex CLI](https:\u002F\u002Fgithub.com\u002Fopenai\u002Fcodex) | `~\u002F.codex\u002Fsessions\u002F` | ✅ 是 |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-gemini.png\" alt=\"Gemini\" \u002F> | [Gemini CLI](https:\u002F\u002Fgithub.com\u002Fgoogle-gemini\u002Fgemini-cli) | `~\u002F.gemini\u002Ftmp\u002F*\u002Fchats\u002F*.json` | ✅ 是 |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-cursor.jpg\" alt=\"Cursor\" \u002F> | [Cursor IDE](https:\u002F\u002Fcursor.com\u002F) | 通过 `~\u002F.config\u002Ftokscale\u002Fcursor-cache\u002F` 进行 API 同步 | ✅ 是 |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-amp.png\" alt=\"Amp\" \u002F> | [Amp (AmpCode)](https:\u002F\u002Fampcode.com\u002F) | `~\u002F.local\u002Fshare\u002Famp\u002Fthreads\u002F` | ✅ 是 |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-droid.png\" alt=\"Droid\" \u002F> | [Droid (Factory Droid)](https:\u002F\u002Ffactory.ai\u002F) | `~\u002F.factory\u002Fsessions\u002F` | ✅ 是 |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-pi.png\" 
alt=\"Pi\" \u002F> | [Pi](https:\u002F\u002Fgithub.com\u002Fbadlogic\u002Fpi-mono) | `~\u002F.pi\u002Fagent\u002Fsessions\u002F` | ✅ 是 |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-kimi.png\" alt=\"Kimi\" \u002F> | [Kimi CLI](https:\u002F\u002Fgithub.com\u002FMoonshotAI\u002Fkimi-cli) | `~\u002F.kimi\u002Fsessions\u002F` | ✅ 是 |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-qwen.png\" alt=\"Qwen\" \u002F> | [Qwen CLI](https:\u002F\u002Fgithub.com\u002FQwenLM\u002Fqwen-cli) | `~\u002F.qwen\u002Fprojects\u002F` | ✅ 是 |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-roocode.png\" alt=\"Roo Code\" \u002F> | [Roo Code](https:\u002F\u002Fgithub.com\u002FRooCodeInc\u002FRoo-Code) | `~\u002F.config\u002FCode\u002FUser\u002FglobalStorage\u002Frooveterinaryinc.roo-cline\u002Ftasks\u002F` (+ 服务器：`~\u002F.vscode-server\u002Fdata\u002FUser\u002FglobalStorage\u002Frooveterinaryinc.roo-cline\u002Ftasks\u002F`) | ✅ 是 |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-kilocode.png\" alt=\"Kilo\" \u002F> | [Kilo](https:\u002F\u002Fgithub.com\u002FKilo-Org\u002Fkilocode) | `~\u002F.config\u002FCode\u002FUser\u002FglobalStorage\u002Fkilocode.kilo-code\u002Ftasks\u002F` (+ 服务器：`~\u002F.vscode-server\u002Fdata\u002FUser\u002FglobalStorage\u002Fkilocode.kilo-code\u002Ftasks\u002F`) | ✅ 是 |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-mux.png\" alt=\"Mux\" \u002F> | [Mux](https:\u002F\u002Fgithub.com\u002Fcoder\u002Fmux) | `~\u002F.mux\u002Fsessions\u002F` | ✅ 是 |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-kilocode.png\" alt=\"Kilo CLI\" \u002F> | [Kilo CLI](https:\u002F\u002Fgithub.com\u002Fnicepkg\u002Fkilo) | `~\u002F.local\u002Fshare\u002Fkilo\u002Fkilo.db` | ✅ 是 |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-crush.png\" alt=\"Crush\" \u002F> | [Crush](https:\u002F\u002Fcrush.ai\u002F) | `$XDG_DATA_HOME\u002Fcrush\u002Fprojects.json` 
(项目注册表；备用：`~\u002F.local\u002Fshare\u002Fcrush\u002Fprojects.json`) | ✅ 是 |\n| \u003Cimg width=\"48px\" src=\".github\u002Fassets\u002Fclient-synthetic.png\" alt=\"Synthetic\" \u002F> | [Synthetic](https:\u002F\u002Fsynthetic.new\u002F) | 通过 `hf:` 模型前缀或 `synthetic` 提供商从其他来源重新归因而来 (+ [Octofriend](https:\u002F\u002Fgithub.com\u002Fsynthetic-lab\u002Foctofriend)：`~\u002F.local\u002Fshare\u002Foctofriend\u002Fsqlite.db`) | ✅ 是 |\n\n使用 [🚅 LiteLLM 的定价数据](https:\u002F\u002Fgithub.com\u002FBerriAI\u002Flitellm)，获取实时定价计算，支持分层定价模型和缓存 token 折扣。\n\n### 为什么叫“Tokscale”？\n\n[![Tokscale](.\u002F.github\u002Fassets\u002Fhero.png)](https:\u002F\u002Ftokscale.ai)\n\n本项目灵感源自**卡达谢夫量表**（Kardashev scale），这是天体物理学家尼古拉·卡达谢夫提出的一种衡量文明技术发展水平的方法，其依据是该文明的能量消耗。I型文明能够利用其所在行星上的全部能量，II型文明则能捕获其恒星的全部能量输出，而III型文明则掌控整个星系的能量。\n\n在人工智能辅助开发的时代，**token就是新的能源**。它们驱动我们的推理能力、提升生产力，并推动创意产出。正如卡达谢夫量表以宇宙尺度追踪能量消耗一样，Tokscale会根据你在 AI 增强开发领域的进步程度来衡量你的 token 消耗情况。无论你是普通用户，还是每天消耗数百万 tokens 的开发者，Tokscale 都能帮助你可视化自己的成长轨迹——从行星级开发者到银河系级别的代码架构师。\n\n## 目录\n\n- [概述](#overview)\n  - [为什么叫“Tokscale”？](#why-tokscale)\n- [功能](#features)\n- [安装](#installation)\n  - [快速入门](#quick-start)\n  - [前提条件](#prerequisites)\n  - [开发环境搭建](#development-setup)\n  - [构建原生模块](#building-the-native-module)\n- [使用方法](#usage)\n  - [基本命令](#basic-commands)\n  - [TUI 功能](#tui-features)\n  - [按平台筛选](#filtering-by-platform)\n  - [按日期筛选](#date-filtering)\n  - [价格查询](#pricing-lookup)\n  - [社交功能](#social)\n  - [Cursor IDE 命令](#cursor-ide-commands)\n  - [示例输出](#example-output---light-version)\n  - [配置](#configuration)\n  - [环境变量](#environment-variables)\n- [前端可视化](#frontend-visualization)\n  - [功能](#features-1)\n  - [运行前端](#running-the-frontend)\n- [社交平台](#social-platform)\n  - [功能](#features-2)\n  - [开始使用](#getting-started)\n  - [数据验证](#data-validation)\n- [Wrapped 2025](#wrapped-2025)\n  - [命令](#command)\n  - [包含内容](#whats-included)\n- [开发](#development)\n  - [前提条件](#prerequisites-1)\n  - [如何运行](#how-to-run)\n- [支持的平台](#supported-platforms)\n  - 
[原生模块目标](#native-module-targets)\n  - [Windows 支持](#windows-support)\n- [会话数据保留](#session-data-retention)\n- [数据来源](#data-sources)\n- [定价](#pricing)\n- [贡献](#contributing)\n  - [开发指南](#development-guidelines)\n- [致谢](#acknowledgments)\n- [许可证](#license)\n\n## 功能\n\n- **交互式 TUI 模式**：由 Ratatui 提供支持的精美终端 UI（默认模式）\n  - 4 个交互式视图：概览、模型、每日、统计\n  - 键盘与鼠标导航\n  - GitHub 风格的贡献图，提供 9 种配色主题\n  - 实时过滤与排序\n  - 无闪烁渲染\n- **多平台支持**：跟踪 OpenCode、Claude Code、Codex CLI、Cursor IDE、Gemini CLI、Amp、Droid、OpenClaw、Pi、Kimi CLI、Qwen CLI、Roo Code、Kilo、Mux、Kilo CLI、Crush 和 Synthetic 等平台的使用情况。\n- **实时定价**：从 LiteLLM 获取当前价格，缓存有效期为 1 小时；自动回退至 OpenRouter，并为新发布的模型提供 Cursor 模型定价。\n- **详细拆分**：输入、输出、缓存读写以及推理 token 的追踪。\n- **原生 Rust 核心**：所有解析和聚合操作均由 Rust 完成，处理速度提升 10 倍。\n- **Web 可视化**：交互式贡献图，支持 2D 和 3D 视图。\n- **灵活筛选**：可按平台、日期范围或年份进行筛选。\n- **导出为 JSON**：生成数据以便用于外部可视化工具。\n- **社交平台**：分享你的使用情况、参与排行榜竞争并查看公开资料。\n\n## 安装\n\n### 快速入门\n\n```bash\n# 直接使用 npx 运行\nnpx tokscale@latest\n\n# 或者使用 bunx\nbunx tokscale@latest\n\n# 轻量模式（仅表格显示）\nnpx tokscale@latest --light\n```\n\n就这么简单！无需任何设置即可获得完整的交互式 TUI 体验。\n\n> **包结构**：“tokscale” 是一个别名包（类似于 [`swc`](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002Fswc)），它会安装 `@tokscale\u002Fcli`。两者都会安装相同的 CLI，并包含原生 Rust 核心 (`@tokscale\u002Fcore`)。\n\n\n### 前提条件\n\n- [Node.js](https:\u002F\u002Fnodejs.org\u002F) 或 [Bun](https:\u002F\u002Fbun.sh\u002F)\n- （可选）Rust 工具链，用于从源码构建原生模块。\n\n### 开发环境搭建\n\n若需本地开发或从源码构建：\n\n```bash\n# 克隆仓库\ngit clone https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale.git\ncd tokscale\n\n# 安装 Bun（如果尚未安装）\ncurl -fsSL https:\u002F\u002Fbun.sh\u002Finstall | bash\n\n# 安装依赖\nbun install\n\n# 以开发模式运行 CLI\nbun run cli\n```\n\n> **注意**：“bun run cli” 用于本地开发。通过 `bunx tokscale` 安装后，直接运行即可。下方的使用部分展示了已安装的二进制命令。\n\n### 构建原生模块\n\n原生 Rust 模块是 CLI 正常运行的**必要条件**。它通过并行文件扫描和 SIMD JSON 解析，将处理速度提升约 10 倍：\n\n```bash\n# 构建原生核心（在仓库根目录下运行）\nbun run build:core\n```\n\n> **注意**：通过 `bunx tokscale@latest` 安装时，预编译的原生二进制文件已包含在内。只有在本地开发时才需要从源码构建。\n\n## 使用方法\n\n### 基本命令\n\n```bash\n# 启动交互式 
TUI（默认）\ntokscale\n\n# 启动特定标签页的 TUI\ntokscale models    # 模型标签页\ntokscale monthly   # 每日视图（显示每日明细）\n\n# 使用旧版 CLI 表格输出\ntokscale --light\ntokscale models --light\n\n# 显式启动 TUI\ntokscale tui\n\n# 导出贡献图数据为 JSON\ntokscale graph --output data.json\n\n# 输出数据为 JSON（用于脚本或自动化）\ntokscale --json                    # 默认模型视图的 JSON 格式\ntokscale models --json             # 模型明细的 JSON 格式\ntokscale monthly --json            # 月度明细的 JSON 格式\ntokscale models --json > report.json   # 保存到文件\n```\n\n### TUI 功能\n\n交互式 TUI 模式提供以下功能：\n\n- **4 个视图**：概览（图表 + 排行前几的模型）、模型、每日、统计（贡献图）。\n- **键盘导航**：\n  - `1-4` 或 `←\u002F→\u002FTab`：切换视图。\n  - `↑\u002F↓`：浏览列表。\n  - `c\u002Fd\u002Ft`：按成本\u002F日期\u002Ftoken 数量排序。\n  - `s`：打开源选择对话框。\n  - `g`：打开分组选择对话框（按模型、客户端+模型、客户端+提供商+模型）。\n  - `p`：循环切换 9 种配色主题。\n  - `r`：刷新数据。\n  - `e`：导出为 JSON。\n  - `q`：退出。\n- **鼠标支持**：点击标签页、按钮和筛选器。\n- **主题**：绿色、万圣节、青蓝色、蓝色、粉色、紫色、橙色、单色、YlGnBu。\n- **设置持久化**：偏好设置会保存到 `~\u002F.config\u002Ftokscale\u002Fsettings.json` 文件中（参见 [配置](#configuration)）。\n\n### 分组策略\n\n在 TUI 中按 `g` 键，或在 `--light`\u002F`--json` 模式下使用 `--group-by` 选项，可以控制模型行的聚合方式：\n\n| 策略 | 标志 | TUI 默认值 | 效果 |\n|----------|------|-------------|--------|\n| **模型** | `--group-by model` | ✅ | 每个模型一行 — 合并所有客户端和提供商 |\n| **客户端 + 模型** | `--group-by client,model` | | 每个客户端-模型组合一行 |\n| **客户端 + 提供商 + 模型** | `--group-by client,provider,model` | | 最细粒度 — 不进行合并 |\n\n**`--group-by model`**（最整合）\n\n| 客户端 | 提供商 | 模型 | 费用 |\n|---------|-----------|-------|------|\n| OpenCode、Claude、Amp | github-copilot、anthropic | claude-opus-4-5 | $2,424 |\n| OpenCode、Claude | anthropic、github-copilot | claude-sonnet-4-5 | $1,332 |\n\n**`--group-by client,model`**（CLI 默认）\n\n| 客户端 | 提供商 | 模型 | 费用 |\n|--------|----------|-------|------|\n| OpenCode | github-copilot、anthropic | claude-opus-4-5 | $1,368 |\n| Claude | anthropic | claude-opus-4-5 | $970 |\n\n**`--group-by client,provider,model`**（最细粒度）\n\n| 客户端 | 提供商 | 模型 | 费用 |\n|--------|----------|-------|------|\n| OpenCode | github-copilot | claude-opus-4-5 | 
$1,200 |\n| OpenCode | anthropic | claude-opus-4-5 | $168 |\n| Claude | anthropic | claude-opus-4-5 | $970 |\n\n### 按平台筛选\n\n```bash\n# 仅显示 OpenCode 的使用情况\ntokscale --opencode\n\n# 仅显示 Claude Code 的使用情况\ntokscale --claude\n\n# 仅显示 Codex CLI 的使用情况\ntokscale --codex\n\n# 仅显示 Gemini CLI 的使用情况\ntokscale --gemini\n\n# 仅显示 Cursor IDE 的使用情况（需先执行 `tokscale cursor login`）\ntokscale --cursor\n\n# 仅显示 Amp 的使用情况\ntokscale --amp\n\n# 仅显示 Droid 的使用情况\ntokscale --droid\n\n# 仅显示 OpenClaw 的使用情况\ntokscale --openclaw\n\n# 仅显示 Pi 的使用情况\ntokscale --pi\n\n# 仅显示 Kimi CLI 的使用情况\ntokscale --kimi\n\n# 仅显示 Qwen CLI 的使用情况\ntokscale --qwen\n\n# 仅显示 Roo Code 的使用情况\ntokscale --roocode\n\n# 仅显示 Kilo 的使用情况\ntokscale --kilocode\n\n# 仅显示 Mux 的使用情况\ntokscale --mux\n\n# 仅显示 Kilo CLI 的使用情况\ntokscale --kilo\n\n# 仅显示 Crush 的使用情况\ntokscale --crush\n\n# 仅显示 Synthetic (synthetic.new) 的使用情况\ntokscale --synthetic\n\n# 组合筛选条件\ntokscale --opencode --claude\n```\n\n### 日期筛选\n\n日期筛选适用于所有生成报告的命令（`tokscale`、`tokscale models`、`tokscale monthly`、`tokscale graph`）：\n\n```bash\n# 快速日期快捷键\ntokscale --today              # 仅今天\ntokscale --week               # 近 7 天\ntokscale --month              # 当前日历月\n\n# 自定义日期范围（包含起始与结束日期，本地时区）\ntokscale --since 2024-01-01 --until 2024-12-31\n\n# 按年份筛选\ntokscale --year 2024\n\n# 结合其他选项\ntokscale models --week --claude --json\ntokscale monthly --month --benchmark\n```\n\n> **注意**：日期筛选使用您的本地时区。`--since` 和 `--until` 均为包含性范围。\n\n### 定价查询\n\n查询任意模型的实时定价：\n\n```bash\n# 查询模型定价\ntokscale pricing \"claude-3-5-sonnet-20241022\"\ntokscale pricing \"gpt-4o\"\ntokscale pricing \"grok-code\"\n\n# 强制指定提供商来源\ntokscale pricing \"grok-code\" --provider openrouter\ntokscale pricing \"claude-3-5-sonnet\" --provider litellm\n```\n\n**查询策略：**\n\n定价查询采用多步骤解析策略：\n\n1. **精确匹配** - 直接在 LiteLLM\u002FOpenRouter 数据库中查找\n2. **别名解析** - 解析友好名称（如 `big-pickle` → `glm-4.7`）\n3. **去除等级后缀** - 移除质量等级（`gpt-5.2-xhigh` → `gpt-5.2`）\n4. **版本标准化** - 处理版本格式（`claude-3-5-sonnet` ↔ `claude-3.5-sonnet`）\n5. 
**匹配提供商前缀** - 尝试常见前缀（`anthropic\u002F`、`openai\u002F` 等）\n6. **Cursor 模型定价** - 对尚未收录于 LiteLLM\u002FOpenRouter 的模型硬编码定价（如 `gpt-5.3-codex`）\n7. **模糊匹配** - 基于词边界对部分模型名称进行匹配\n\n**提供商优先级：**\n\n当存在多个匹配项时，原始模型开发者优先于转售商：\n\n| 优先（原始） | 降级（转售商） |\n|---------------------|-------------------------|\n| `xai\u002F`（Grok） | `azure_ai\u002F` |\n| `anthropic\u002F`（Claude） | `bedrock\u002F` |\n| `openai\u002F`（GPT） | `vertex_ai\u002F` |\n| `google\u002F`（Gemini） | `together_ai\u002F` |\n| `meta-llama\u002F` | `fireworks_ai\u002F` |\n\n例如：`grok-code` 匹配 `xai\u002Fgrok-code-fast-1`（$0.20\u002F$1.50），而非 `azure_ai\u002Fgrok-code-fast-1`（$3.50\u002F$17.50）。\n\n### 社交功能\n\n```bash\n# 登录 Tokscale（打开浏览器进行 GitHub 认证）\ntokscale login\n\n# 查看当前登录用户\ntokscale whoami\n\n# 提交使用数据至排行榜\ntokscale submit\n\n# 带筛选条件提交\ntokscale submit --opencode --claude --since 2024-01-01\n\n# 预览将要提交的内容（试运行）\ntokscale submit --dry-run\n\n# 注销\ntokscale logout\n```\n\n\u003Cimg alt=\"CLI Submit\" src=\".\u002F.github\u002Fassets\u002Fcli-submit.png\" \u002F>\n\n### Cursor IDE 命令\n\nCursor IDE 需要通过会话令牌单独认证（与社交平台登录不同）：\n\n```bash\n# 登录 Cursor（需要从浏览器获取会话令牌）\n# --name 是可选的；它只是帮助您稍后识别账户\ntokscale cursor login --name work\n\n# 检查 Cursor 认证状态及会话有效期\ntokscale cursor status\n\n# 列出已保存的 Cursor 账户\ntokscale cursor accounts\n\n# 切换活动账户（控制哪个账户同步到 cursor-cache\u002Fusage.csv）\ntokscale cursor switch work\n\n# 从特定账户注销（保留历史记录；不计入汇总）\ntokscale cursor logout --name work\n\n# 注销并删除该账户的缓存使用数据\ntokscale cursor logout --name work --purge-cache\n\n# 从所有 Cursor 账户注销（保留历史记录；不计入汇总）\ntokscale cursor logout --all\n\n# 退出所有账户并删除缓存的使用数据\ntokscale cursor logout --all --purge-cache\n```\n\n默认情况下，tokscale 会**聚合所有已保存的 Cursor 账户的使用情况**（即所有的 `cursor-cache\u002Fusage*.csv` 文件）。\n\n当你注销时，tokscale 会将你的缓存使用历史移动到 `cursor-cache\u002Farchive\u002F` 目录下，这样它就不会被聚合。如果你希望直接删除这些缓存的使用数据，可以使用 `--purge-cache` 参数。\n\n**凭据存储**：Cursor 账户信息存储在 `~\u002F.config\u002Ftokscale\u002Fcursor-credentials.json` 中。使用数据则缓存在 `~\u002F.config\u002Ftokscale\u002Fcursor-cache\u002F` 
目录下（当前活跃账户的数据存储在 `usage.csv` 中，其他账户的数据则分别存储在 `usage.\u003Caccount>.csv` 文件中）。\n\n**获取 Cursor 会话令牌的方法**：\n1. 在浏览器中打开 https:\u002F\u002Fwww.cursor.com\u002Fsettings\n2. 打开开发者工具（按 F12 键）\n3. **选项 A - Network 标签页**：在页面上执行任意操作，找到一个指向 `cursor.com\u002Fapi\u002F*` 的请求，在请求头中查找 `Cookie` 字段，复制 `WorkosCursorSessionToken=` 后面的值。\n4. **选项 B - Application 标签页**：进入 Application → Cookies → `https:\u002F\u002Fwww.cursor.com`，找到名为 `WorkosCursorSessionToken` 的 Cookie，复制其值（不是 Cookie 名称）。\n\n> ⚠️ **安全提示**：请将你的会话令牌视为密码一样对待。切勿公开分享或将其提交到版本控制系统中。该令牌可让你完全访问你的 Cursor 账户。\n\n### 示例输出（`--light` 版本）\n\n\u003Cimg alt=\"CLI Light\" src=\".\u002F.github\u002Fassets\u002Fcli-light.png\" \u002F>\n\n### 配置\n\nTokscale 将设置存储在 `~\u002F.config\u002Ftokscale\u002Fsettings.json` 文件中：\n\n```json\n{\n  \"colorPalette\": \"blue\",\n  \"includeUnusedModels\": false\n}\n```\n\n| 设置 | 类型 | 默认值 | 描述 |\n|------|------|--------|------|\n| `colorPalette` | 字符串 | `\"blue\"` | TUI 颜色主题（green、halloween、teal、blue、pink、purple、orange、monochrome、ylgnbu） |\n| `includeUnusedModels` | 布尔值 | `false` | 在报告中显示零用量的模型 |\n| `autoRefreshEnabled` | 布尔值 | `false` | 在 TUI 中启用自动刷新 |\n| `autoRefreshMs` | 数字 | `60000` | 自动刷新间隔（30000-3600000 毫秒） |\n| `nativeTimeoutMs` | 数字 | `300000` | 原生子进程处理的最大时间（5000-3600000 毫秒） |\n\n### 环境变量\n\n环境变量会覆盖配置文件中的值。适用于 CI\u002FCD 或一次性使用场景：\n\n| 变量 | 默认值 | 描述 |\n|------|--------|------|\n| `TOKSCALE_NATIVE_TIMEOUT_MS` | `300000`（5 分钟） | 覆盖 `nativeTimeoutMs` 配置 |\n\n```bash\n# 示例：为超大数据集增加超时时间\nTOKSCALE_NATIVE_TIMEOUT_MS=600000 tokscale graph --output data.json\n```\n\n> **注意**：若需持久化更改，建议在 `~\u002F.config\u002Ftokscale\u002Fsettings.json` 中设置 `nativeTimeoutMs`。环境变量更适合用于一次性覆盖或 CI\u002FCD 场景。\n\n### 无头模式\n\nTokscale 可以从 **Codex CLI 的无头输出** 中聚合 Token 使用情况，适用于自动化、CI\u002FCD 流水线和批量处理。\n\n**什么是无头模式？**\n\n当你使用 JSON 输出标志运行 Codex CLI（例如 `codex exec --json`）时，它会将使用数据输出到标准输出，而不是存储在其常规会话目录中。无头模式允许你捕获并跟踪这些使用数据。\n\n**存储位置**：`~\u002F.config\u002Ftokscale\u002Fheadless\u002F`\n\n在 macOS 上，如果未设置 
`TOKSCALE_HEADLESS_DIR`，Tokscale 还会扫描 `~\u002FLibrary\u002FApplication Support\u002Ftokscale\u002Fheadless\u002F` 目录。\n\nTokscale 会自动扫描以下目录结构：\n```\n~\u002F.config\u002Ftokscale\u002Fheadless\u002F\n└── codex\u002F       # Codex CLI 的 JSONL 输出\n```\n\n**环境变量**：设置 `TOKSCALE_HEADLESS_DIR` 可自定义无头日志目录：\n```bash\nexport TOKSCALE_HEADLESS_DIR=\"$HOME\u002Fmy-custom-logs\"\n```\n\n**推荐（自动捕获）**：\n\n| 工具 | 命令示例 |\n|------|----------|\n| **Codex CLI** | `tokscale headless codex exec -m gpt-5 \"implement feature\"` |\n\n**手动重定向（可选）**：\n\n| 工具 | 命令示例 |\n|------|----------|\n| **Codex CLI** | `codex exec --json \"implement feature\" > ~\u002F.config\u002Ftokscale\u002Fheadless\u002Fcodex\u002Fci-run.jsonl` |\n\n**诊断信息**：\n\n```bash\n# 显示扫描位置和无头计数\ntokscale sources\ntokscale sources --json\n```\n\n**CI\u002FCD 集成示例**：\n\n```bash\n# 在你的 GitHub Actions 工作流中\n- name: 运行 AI 自动化\n  run: |\n    mkdir -p ~\u002F.config\u002Ftokscale\u002Fheadless\u002Fcodex\n    codex exec --json \"review code changes\" \\\n      > ~\u002F.config\u002Ftokscale\u002Fheadless\u002Fcodex\u002Fpr-${{ github.event.pull_request.number }}.jsonl\n\n# 稍后，跟踪使用情况\n- name: 报告 Token 使用情况\n  run: tokscale --json\n```\n\n> **注意**：无头捕获仅支持 Codex CLI。如果你直接运行 Codex，请按照上述方法将标准输出重定向到无头目录。\n\n## 前端可视化\n\n前端提供了一种类似 GitHub 的贡献图可视化：\n\n### 功能\n\n- **2D 视图**：经典的 GitHub 贡献日历\n- **3D 视图**：基于 Token 使用量的高度等距 3D 贡献图\n- **多种配色方案**：GitHub、GitLab、万圣节、冬季等\n- **三重主题切换**：浅色 \u002F 深色 \u002F 系统（跟随操作系统偏好）\n- **GitHub Primer 设计**：采用 GitHub 官方颜色体系\n- **交互式提示框**：悬停可查看每日详细分解\n- **每日分解面板**：点击可查看各来源及各模型的详细信息\n- **年份筛选**：可在不同年份之间导航\n- **来源筛选**：可按平台筛选（OpenCode、Claude、Codex、Cursor、Gemini、Amp、Droid、OpenClaw、Pi、Kimi、Qwen、Roo Code、Kilo、Mux、Kilo CLI、Crush、Synthetic）\n- **统计面板**：总成本、Token 数量、活跃天数、连续天数\n- **FOUC 防止**：在 React 水合之前应用主题（无闪烁现象）\n\n### 运行前端\n\n```bash\ncd packages\u002Ffrontend\nbun install\nbun run dev\n```\n\n打开 [http:\u002F\u002Flocalhost:3000](http:\u002F\u002Flocalhost:3000) 即可访问社交平台。\n\n## 社交平台\n\nTokscale 
包含一个社交平台，你可以在其中分享自己的使用数据，并与其他开发者竞争。\n\n### 功能\n\n- **排行榜**：查看跨平台使用最多 Token 的用户\n- **用户个人资料**：带有贡献图和统计数据的公开个人资料\n- **周期筛选**：可查看全部时间、本月或本周的统计数据\n- **GitHub 集成**：使用 GitHub 账户登录\n- **本地查看器**：无需提交即可私下查看自己的数据\n\n### GitHub 个人资料嵌入小部件\n\n你可以将你的公开 Tokscale 统计数据直接嵌入到你的 GitHub 个人主页 README 中：\n\n```md\n[![Tokscale Stats](https:\u002F\u002Ftokscale.ai\u002Fapi\u002Fembed\u002F\u003Cusername>\u002Fsvg)](https:\u002F\u002Ftokscale.ai\u002Fu\u002F\u003Cusername>)\n```\n\n- 将 `\u003Cusername>` 替换为你的 GitHub 用户名\n- 可选查询参数：\n  - `theme=light` 用于浅色主题\n  - `sort=tokens`（默认）或 `sort=cost` 用于控制排名依据\n  - `compact=1` 用于使用紧凑布局 + 紧凑数字表示法（例如 `1.2M`, `$3.4K`)\n- 示例：\n  - `https:\u002F\u002Ftokscale.ai\u002Fapi\u002Fembed\u002F\u003Cusername>\u002Fsvg?theme=light&sort=cost&compact=1`\n\n### GitHub 个人资料徽章\n\n你也可以使用 shields.io 风格的徽章来实现更紧凑的展示：\n\n```md\n![Tokscale Tokens](https:\u002F\u002Ftokscale.ai\u002Fapi\u002Fbadge\u002F\u003Cusername>\u002Fsvg)\n```\n\n- 将 `\u003Cusername>` 替换为你的 GitHub 用户名\n- 可选查询参数：\n  - `metric=tokens`（默认）、`metric=cost` 或 `metric=rank`\n  - `style=flat`（默认）或 `style=flat-square`\n  - `sort=tokens`（默认）或 `sort=cost` 来控制排名依据\n  - `compact=1` 使用紧凑数字表示法（例如 `1.2M`、`$3.4K`）\n  - `label=\u003Ctext>` 覆盖左侧标签\n  - `color=\u003Chex>` 覆盖右侧颜色（例如 `color=ff5733`）\n- 示例：\n  - `https:\u002F\u002Ftokscale.ai\u002Fapi\u002Fbadge\u002F\u003Cusername>\u002Fsvg?metric=cost&compact=1`\n  - `https:\u002F\u002Ftokscale.ai\u002Fapi\u002Fbadge\u002F\u003Cusername>\u002Fsvg?metric=rank&sort=cost&style=flat-square`\n\n### 开始使用\n\n1. **登录** - 运行 `tokscale login` 通过 GitHub 进行身份验证\n2. **提交** - 运行 `tokscale submit` 上传你的使用数据\n3. 
**查看** - 访问网页平台查看你的个人资料和排行榜\n\n### 数据验证\n\n提交的数据会经过多项验证：\n- 数学一致性（总计匹配，无负值）\n- 无未来日期\n- 必填字段齐全\n- 重复检测\n\n## Wrapped 2025\n\n![Wrapped 2025](.github\u002Fassets\u002Fhero-wrapped-2025.png)\n\n生成一张精美的年度回顾图片，总结你使用 AI 编码助手的情况——灵感来自 Spotify Wrapped。\n\n| `bunx tokscale@latest wrapped` | `bunx tokscale@latest wrapped --clients` | `bunx tokscale@latest wrapped --agents --disable-pinned` |\n|:---:|:---:|:---:|\n| ![Wrapped 2025 (Agents + Pin Sisyphus)](.github\u002Fassets\u002Fwrapped-2025-agents.png) | ![Wrapped 2025 (Clients)](.github\u002Fassets\u002Fwrapped-2025-clients.png) | ![Wrapped 2025 (Agents + Disable Pinned)](.github\u002Fassets\u002Fwrapped-2025-agents-disable-pinned.png) |\n\n### 命令\n\n```bash\n# 生成当前年份的 Wrapped 图片\ntokscale wrapped\n\n# 生成特定年份的 Wrapped 图片\ntokscale wrapped --year 2025\n```\n\n### 包含内容\n\n生成的图片包括：\n\n- **总 Token 数量** - 你全年的 Token 消耗总量\n- **顶级模型** - 按成本排名的 3 个最常用 AI 模型\n- **顶级客户端** - 你最常用的 3 个平台（OpenCode、Claude Code、Cursor 等）\n- **消息数量** - AI 交互的总次数\n- **活跃天数** - 至少有一次 AI 交互的天数\n- **成本** - 根据 LiteLLM 定价估算的总成本\n- **连续活跃天数** - 你最长的连续活跃天数\n- **贡献图** - 你全年活动的可视化热力图\n\n生成的 PNG 已针对社交媒体分享进行了优化。与社区分享你的编码之旅吧！\n\n## 开发\n\n> **快速设置**：如果你只想快速开始，请参阅上方安装部分中的 [开发设置](#development-setup)。\n\n### 前提条件\n\n```bash\n# Bun（必需）\nbun --version\n\n# Rust（用于原生模块）\nrustc --version\ncargo --version\n```\n\n### 如何运行\n\n完成 [开发设置](#development-setup) 后，你可以：\n\n```bash\n# 构建原生模块（必需）\nbun run build:core\n\n# 在开发模式下运行（启动 TUI）\ncd packages\u002Fcli && bun src\u002Fcli.ts\n\n# 或者使用旧版 CLI 模式\ncd packages\u002Fcli && bun src\u002Fcli.ts --light\n```\n\n\u003Cdetails>\n\u003Csummary>高级开发\u003C\u002Fsummary>\n\n### 项目脚本\n\n| 脚本 | 描述 |\n|--------|-------------|\n| `bun run cli` | 在开发模式下运行 CLI（通过 Bun 启动 TUI） |\n| `bun run build:core` | 构建原生 Rust 模块（发布版） |\n| `bun run build:cli` | 将 CLI TypeScript 编译到 dist\u002F 目录 |\n| `bun run build` | 同时构建核心和 CLI |\n| `bun run dev:frontend` | 运行前端开发服务器 |\n\n**包内专用脚本**（在各包目录下执行）：\n- `packages\u002Fcli`: `bun run dev`, `bun run tui`\n- 
`packages\u002Fcore`: `bun run build:debug`, `bun run test`, `bun run bench`\n\n**注意**：该项目使用 **Bun** 作为开发包管理器。\n\n### 测试\n\n```bash\n# 测试原生模块（Rust）\ncd packages\u002Fcore\nbun run test:rust      # Cargo 测试\nbun run test           # Node.js 集成测试\nbun run test:all       # 两者都运行\n```\n\n### 原生模块开发\n\n```bash\ncd packages\u002Fcore\n\n# 以调试模式构建（编译更快）\nbun run build:debug\n\n# 以发布模式构建（优化版）\nbun run build\n\n# 运行 Rust 基准测试\nbun run bench\n```\n\n### 图表命令选项\n\n```bash\n# 将图表数据导出到文件\ntokscale graph --output usage-data.json\n\n# 按日期筛选（所有快捷方式均适用）\ntokscale graph --today\ntokscale graph --week\ntokscale graph --since 2024-01-01 --until 2024-12-31\ntokscale graph --year 2024\n\n# 按平台筛选\ntokscale graph --opencode --claude\n\n# 显示处理时间基准测试\ntokscale graph --output data.json --benchmark\n```\n\n### 基准标志\n\n显示处理时间以便进行性能分析：\n\n```bash\ntokscale --benchmark           # 显示默认视图的处理时间\ntokscale models --benchmark    # 模型报告的基准测试\ntokscale monthly --benchmark   # 月度报告的基准测试\ntokscale graph --benchmark     # 图表生成的基准测试\n```\n\n### 为前端生成数据\n\n```bash\n# 导出数据用于可视化\ntokscale graph --output packages\u002Ffrontend\u002Fpublic\u002Fmy-data.json\n```\n\n### 性能\n\n原生 Rust 模块带来了显著的性能提升：\n\n| 操作 | TypeScript | Rust 原生 | 加速比 |\n|-----------|------------|-------------|---------|\n| 文件发现 | ~500ms | ~50ms | **10x** |\n| JSON 解析 | ~800ms | ~100ms | **8x** |\n| 聚合 | ~200ms | ~25ms | **8x** |\n| **总计** | **~1.5s** | **~175ms** | **~8.5x** |\n\n*基于约 1000 个会话文件和 10 万条消息的基准测试*\n\n#### 内存优化\n\n原生模块还通过以下方式减少了约 45% 的内存占用：\n- 流式 JSON 解析（无需完整文件缓冲）\n- 零拷贝字符串处理\n- 使用 map-reduce 实现高效的并行聚合\n\n#### 运行基准测试\n\n```bash\n# 生成合成数据\ncd packages\u002Fbenchmarks && bun run generate\n\n# 运行 Rust 基准测试\ncd packages\u002Fcore && bun run bench\n```\n\n\u003C\u002Fdetails>\n\n## 支持的平台\n\n### 原生模块目标平台\n\n| 平台 | 架构 | 状态 |\n|----------|--------------|--------|\n| macOS | x86_64 | ✅ 支持 |\n| macOS | aarch64（Apple Silicon） | ✅ 支持 |\n| Linux | x86_64（glibc） | ✅ 支持 |\n| Linux | aarch64（glibc） | ✅ 支持 |\n| Linux | x86_64（musl） | ✅ 支持 |\n| Linux 
| aarch64（musl） | ✅ 支持 |\n| Windows | x86_64 | ✅ 支持 |\n| Windows | aarch64 | ✅ 支持 |\n\n### Windows 支持\n\nTokscale 完全支持 Windows。TUI 和 CLI 的使用方式与 macOS\u002FLinux 相同。\n\n**Windows 上的安装：**\n```powershell\n# 安装 Bun（PowerShell）\npowershell -c \"irm bun.sh\u002Finstall.ps1 | iex\"\n\n# 运行 tokscale\nbunx tokscale@latest\n```\n\n#### Windows 上的数据存储位置\n\nAI 编码工具会将会话数据存储在跨平台的位置。大多数工具在所有平台上都使用相同的相对路径：\n\n| 工具       | Unix 路径                  | Windows 路径                     | 来源                           |\n|------------|----------------------------|----------------------------------|--------------------------------|\n| OpenCode   | `~\u002F.local\u002Fshare\u002Fopencode\u002F` | `%USERPROFILE%\\.local\\share\\opencode\\` | 使用 [`xdg-basedir`](https:\u002F\u002Fgithub.com\u002Fsindresorhus\u002Fxdg-basedir) 实现跨平台一致性 ([来源](https:\u002F\u002Fgithub.com\u002Fsst\u002Fopencode\u002Fblob\u002Fmain\u002Fpackages\u002Fopencode\u002Fsrc\u002Fglobal\u002Findex.ts)) |\n| Claude Code | `~\u002F.claude\u002F`              | `%USERPROFILE%\\.claude\\`         | 所有平台使用相同路径           |\n| OpenClaw   | `~\u002F.openclaw\u002F` (+ 旧版: `.clawdbot`, `.moltbot`, `.moldbot`) | `%USERPROFILE%\\.openclaw\\` (+ 旧版路径) | 所有平台使用相同路径          |\n| Codex CLI  | `~\u002F.codex\u002F`                | `%USERPROFILE%\\.codex\\`           | 可通过 `CODEX_HOME` 环境变量配置 ([来源](https:\u002F\u002Fgithub.com\u002Fopenai\u002Fcodex)) |\n| Gemini CLI | `~\u002F.gemini\u002F`               | `%USERPROFILE%\\.gemini\\`         | 所有平台使用相同路径          |\n| Amp      | `~\u002F.local\u002Fshare\u002Famp\u002F`     | `%USERPROFILE%\\.local\\share\\amp\\` | 类似 OpenCode，使用 `xdg-basedir` |\n| Cursor   | API 同步                 | API 同步                         | 数据通过 API 获取，缓存至 `%USERPROFILE%\\.config\\tokscale\\cursor-cache\\` |\n| Droid    | `~\u002F.factory\u002F`             | `%USERPROFILE%\\.factory\\`        | 所有平台使用相同路径          |\n| Pi       | `~\u002F.pi\u002F`                   | `%USERPROFILE%\\.pi\\`             | 
所有平台使用相同路径          |\n| Kimi CLI | `~\u002F.kimi\u002F`                 | `%USERPROFILE%\\.kimi\\`           | 所有平台使用相同路径          |\n| Qwen CLI | `~\u002F.qwen\u002F`                 | `%USERPROFILE%\\.qwen\\`           | 所有平台使用相同路径          |\n| Roo Code | `~\u002F.config\u002FCode\u002FUser\u002FglobalStorage\u002Frooveterinaryinc.roo-cline\u002Ftasks\u002F` | `%USERPROFILE%\\.config\\Code\\User\\globalStorage\\rooveterinaryinc.roo-cline\\tasks\\` | VS Code globalStorage 的任务日志 |\n| Kilo     | `~\u002F.config\u002FCode\u002FUser\u002FglobalStorage\u002Fkilocode.kilo-code\u002Ftasks\u002F` | `%USERPROFILE%\\.config\\Code\\User\\globalStorage\\kilocode.kilo-code\\tasks\\` | VS Code globalStorage 的任务日志 |\n| Mux      | `~\u002F.mux\u002Fsessions\u002F`         | `%USERPROFILE%\\.mux\\sessions\\`   | 所有平台使用相同路径          |\n| Kilo CLI | `~\u002F.local\u002Fshare\u002Fkilo\u002F`    | `%USERPROFILE%\\.local\\share\\kilo\\` | 类似 OpenCode，使用 `xdg-basedir` |\n| Crush    | `$XDG_DATA_HOME\u002Fcrush\u002F` (备用: `~\u002F.local\u002Fshare\u002Fcrush\u002F`) | `%USERPROFILE%\\.local\\share\\crush\\` (或若设置了 `%XDG_DATA_HOME%\\crush\\` 则使用该路径) | 使用 XDG 数据目录，并提供备用路径 |\n| Synthetic| 从其他来源重新归因         | 从其他来源重新归因             | 检测 `hf:` 模型前缀 + `synthetic` 提供者 |\n\n> **注意**：在 Windows 上，`~` 会展开为 `%USERPROFILE%`（例如 `C:\\Users\\YourName`）。这些工具有意使用 Unix 风格的路径（如 `.local\u002Fshare`），即使在 Windows 上也是如此，以实现跨平台一致性，而不是使用 Windows 原生路径（如 `%APPDATA%`）。\n\n#### Windows 特定配置\n\nTokscale 将其配置存储在：\n- **配置文件**：`%USERPROFILE%\\.config\\tokscale\\settings.json`\n- **缓存**：`%USERPROFILE%\\.cache\\tokscale\\`\n- **Cursor 凭证**：`%USERPROFILE%\\.config\\tokscale\\cursor-credentials.json`\n\n## 会话数据保留\n\n默认情况下，一些 AI 编码助手会自动删除旧的会话文件。为了保留您的使用历史以便准确跟踪，您可以禁用清理功能或延长清理周期。\n\n| 平台       | 默认设置                 | 配置文件                     | 禁用设置                       | 来源                           
|\n|------------|--------------------------|------------------------------|--------------------------------|--------------------------------|\n| Claude Code | **⚠️ 30 天**            | `~\u002F.claude\u002Fsettings.json`     | `\"cleanupPeriodDays\": 9999999999` | [文档](https:\u002F\u002Fdocs.anthropic.com\u002Fen\u002Fdocs\u002Fclaude-code\u002Fsettings) |\n| Gemini CLI | 已禁用                 | `~\u002F.gemini\u002Fsettings.json`     | `\"general.sessionRetention.enabled\": false` | [文档](https:\u002F\u002Fgithub.com\u002Fgoogle-gemini\u002Fgemini-cli\u002Fblob\u002Fmain\u002Fdocs\u002Fcli\u002Fsession-management.md) |\n| Codex CLI  | 已禁用                 | 无                           | 无清理功能                     | [#6015](https:\u002F\u002Fgithub.com\u002Fopenai\u002Fcodex\u002Fissues\u002F6015) |\n| OpenCode   | 已禁用                 | 无                           | 无清理功能                     | [#4980](https:\u002F\u002Fgithub.com\u002Fsst\u002Fopencode\u002Fissues\u002F4980) |\n\n### Claude Code\n\n**默认**：30 天清理周期\n\n在 `~\u002F.claude\u002Fsettings.json` 中添加以下内容：\n```json\n{\n  \"cleanupPeriodDays\": 9999999999\n}\n```\n\n> 设置一个极大的值（例如 `9999999999` 天 ≈ 2700 万年）可以有效禁用清理功能。\n\n### Gemini CLI\n\n**默认**：已禁用清理（会话永久保存）\n\n如果您启用了清理功能并希望将其禁用，请移除或在 `~\u002F.gemini\u002Fsettings.json` 中将 `enabled` 设置为 `false`：\n```json\n{\n  \"general\": {\n    \"sessionRetention\": {\n      \"enabled\": false\n    }\n  }\n}\n```\n\n或者设置一个极长的保留期限：\n```json\n{\n  \"general\": {\n    \"sessionRetention\": {\n      \"enabled\": true,\n      \"maxAge\": \"9999999d\"\n    }\n  }\n}\n```\n\n### Codex CLI\n\n**默认**：无自动清理（会话永久保存）\n\nCodex CLI 没有内置的会话清理功能。`~\u002F.codex\u002Fsessions\u002F` 中的会话将无限期保留。\n\n> **注意**：目前有一个关于此功能的公开请求：[#6015](https:\u002F\u002Fgithub.com\u002Fopenai\u002Fcodex\u002Fissues\u002F6015)\n\n### OpenCode\n\n**默认**：无自动清理（会话永久保存）\n\nOpenCode 也没有内置的会话清理功能。`~\u002F.local\u002Fshare\u002Fopencode\u002Fstorage\u002F` 中的会话将无限期保留。\n\n> **注意**：请参阅 
[#4980](https:\u002F\u002Fgithub.com\u002Fsst\u002Fopencode\u002Fissues\u002F4980)\n\n---\n\n## 数据来源\n\n### OpenCode\n\n位置：`~\u002F.local\u002Fshare\u002Fopencode\u002Fopencode.db`（v1.2+）或 `storage\u002Fmessage\u002F{sessionId}\u002F*.json`（旧版）\n\nOpenCode 1.2+ 将会话存储在 SQLite 数据库中。Tokscale 会优先从 SQLite 读取数据，如果遇到旧版本则回退到旧版 JSON 文件。\n\n每条消息包含：\n```json\n{\n  \"id\": \"msg_xxx\",\n  \"role\": \"assistant\",\n  \"modelID\": \"claude-sonnet-4-20250514\",\n  \"providerID\": \"anthropic\",\n  \"tokens\": {\n    \"input\": 1234,\n    \"output\": 567,\n    \"reasoning\": 0,\n    \"cache\": { \"read\": 890, \"write\": 123 }\n  },\n  \"time\": { \"created\": 1699999999999 }\n}\n```\n\n### Claude Code\n\n位置：`~\u002F.claude\u002Fprojects\u002F{projectPath}\u002F*.jsonl`\n\nJSONL 格式，包含助手消息及用量数据：\n```json\n{\"type\": \"assistant\", \"message\": {\"model\": \"claude-sonnet-4-20250514\", \"usage\": {\"input_tokens\": 1234, \"output_tokens\": 567, \"cache_read_input_tokens\": 890}}, \"timestamp\": \"2024-01-01T00:00:00Z\"}\n```\n\n### Codex CLI\n\n位置：`~\u002F.codex\u002Fsessions\u002F*.jsonl`\n\n基于事件的格式，包含 `token_count` 事件：\n```json\n{\"type\": \"event_msg\", \"payload\": {\"type\": \"token_count\", \"info\": {\"last_token_usage\": {\"input_tokens\": 1234, \"output_tokens\": 567}}}}\n```\n\n### Gemini CLI\n\n位置：`~\u002F.gemini\u002Ftmp\u002F{projectHash}\u002Fchats\u002F*.json`\n\n会话文件包含消息数组：\n```json\n{\n  \"sessionId\": \"xxx\",\n  \"messages\": [\n    {\"type\": \"gemini\", \"model\": \"gemini-2.5-pro\", \"tokens\": {\"input\": 1234, \"output\": 567, \"cached\": 890, \"thoughts\": 123}}\n  ]\n}\n```\n\n### Cursor IDE\n\n位置：`~\u002F.config\u002Ftokscale\u002Fcursor-cache\u002F`（通过 Cursor API 同步）\n\nCursor 数据会使用您的会话令牌从 Cursor API 获取，并本地缓存。运行 `tokscale cursor login` 进行身份验证。有关设置说明，请参阅 [Cursor IDE 命令](#cursor-ide-commands)。\n\n### 
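统一的令牌字段（示意）

以上几个来源的令牌字段命名各不相同（OpenCode 使用 `tokens.input`、`tokens.cache.read`，Claude Code 使用 `input_tokens`、`cache_read_input_tokens`）。下面是一段假设性的 TypeScript 草图，演示解析器如何将这两种原始记录归一化为统一结构；其中 `NormalizedUsage`、`fromOpenCode`、`fromClaudeCode` 等名称仅为说明用途的假设，并非 Tokscale 的实际内部 API：

```typescript
// 假设性示例：把不同客户端的原始用量记录归一化为统一结构。
// NormalizedUsage 与下列函数名均为示意，并非 Tokscale 的实际内部实现。

interface NormalizedUsage {
  input: number;      // 输入令牌
  output: number;     // 输出令牌
  cacheRead: number;  // 缓存读取令牌
  cacheWrite: number; // 缓存写入令牌
  reasoning: number;  // 推理令牌
}

// OpenCode 消息形如 { tokens: { input, output, reasoning, cache: { read, write } } }
function fromOpenCode(msg: any): NormalizedUsage {
  const t = msg?.tokens ?? {};
  return {
    input: t.input ?? 0,
    output: t.output ?? 0,
    cacheRead: t.cache?.read ?? 0,
    cacheWrite: t.cache?.write ?? 0,
    reasoning: t.reasoning ?? 0,
  };
}

// Claude Code JSONL 行形如 { message: { usage: { input_tokens, output_tokens, ... } } }
// cache_creation_input_tokens 字段在示例中未出现，此处按 Anthropic 用量对象的惯例假设存在
function fromClaudeCode(entry: any): NormalizedUsage {
  const u = entry?.message?.usage ?? {};
  return {
    input: u.input_tokens ?? 0,
    output: u.output_tokens ?? 0,
    cacheRead: u.cache_read_input_tokens ?? 0,
    cacheWrite: u.cache_creation_input_tokens ?? 0,
    reasoning: 0,
  };
}
```

这种"先归一化、再聚合"的思路可以让后续的成本计算只面对一种字段结构，而与具体客户端无关。

### 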
OpenClaw\n\n位置：`~\u002F.openclaw\u002Fagents\u002F*\u002Fsessions\u002Fsessions.json`（同时扫描旧路径：`~\u002F.clawdbot\u002F`、`~\u002F.moltbot\u002F`、`~\u002F.moldbot\u002F`）\n\n指向 JSONL 会话文件的索引文件：\n```json\n{\n  \"agent:main:main\": {\n    \"sessionId\": \"uuid\",\n    \"sessionFile\": \"\u002Fpath\u002Fto\u002Fsession.jsonl\"\n  }\n}\n```\n\n包含模型变更事件和助手消息的会话 JSONL 格式：\n```json\n{\"type\":\"model_change\",\"provider\":\"openai-codex\",\"modelId\":\"gpt-5.2\"}\n{\"type\":\"message\",\"message\":{\"role\":\"assistant\",\"usage\":{\"input\":1660,\"output\":55,\"cacheRead\":108928,\"cost\":{\"total\":0.02}},\"timestamp\":1769753935279}}\n```\n\n### Pi\n\n位置：`~\u002F.pi\u002Fagent\u002Fsessions\u002F\u003Cencoded-cwd>\u002F*.jsonl`\n\n带有会话头和消息条目的 JSONL 格式：\n```json\n{\"type\":\"session\",\"id\":\"pi_ses_001\",\"timestamp\":\"2026-01-01T00:00:00.000Z\",\"cwd\":\"\u002Ftmp\"}\n{\"type\":\"message\",\"id\":\"msg_001\",\"timestamp\":\"2026-01-01T00:00:01.000Z\",\"message\":{\"role\":\"assistant\",\"model\":\"claude-3-5-sonnet\",\"provider\":\"anthropic\",\"usage\":{\"input\":100,\"output\":50,\"cacheRead\":10,\"cacheWrite\":5,\"totalTokens\":165}}}\n```\n\n### Kimi CLI\n\n位置：`~\u002F.kimi\u002Fsessions\u002F{GROUP_ID}\u002F{SESSION_UUID}\u002Fwire.jsonl`\n\n带有 StatusUpdate 消息的 wire.jsonl 格式：\n```json\n{\"type\": \"metadata\", \"protocol_version\": \"1.3\"}\n{\"timestamp\": 1770983426.420942, \"message\": {\"type\": \"StatusUpdate\", \"payload\": {\"token_usage\": {\"input_other\": 1562, \"output\": 2463, \"input_cache_read\": 0, \"input_cache_creation\": 0}, \"message_id\": \"chatcmpl-xxx\"}}}\n```\n\n### Qwen CLI\n\n位置：`~\u002F.qwen\u002Fprojects\u002F{PROJECT_PATH}\u002Fchats\u002F{CHAT_ID}.jsonl`\n\n格式：JSONL — 每行一个 JSON 对象，每个对象包含 `type`、`model`、`timestamp`、`sessionId` 和 `usageMetadata` 字段。\n\n令牌字段（来自 `usageMetadata`）：\n- `promptTokenCount` → 输入令牌\n- `candidatesTokenCount` → 输出令牌\n- `thoughtsTokenCount` → 推理\u002F思考令牌\n- `cachedContentTokenCount` → 缓存输入令牌\n\n### Roo 
Code\n\n位置：\n- 本地：`~\u002F.config\u002FCode\u002FUser\u002FglobalStorage\u002Frooveterinaryinc.roo-cline\u002Ftasks\u002F{TASK_ID}\u002Fui_messages.json`\n- 服务器（尽力而为）：`~\u002F.vscode-server\u002Fdata\u002FUser\u002FglobalStorage\u002Frooveterinaryinc.roo-cline\u002Ftasks\u002F{TASK_ID}\u002Fui_messages.json`\n\n每个任务目录还可能包含 `api_conversation_history.json` 文件，其中包含用于模型\u002F代理元数据的 `\u003Cenvironment_details>` 块。\n\n`ui_messages.json` 是一个 UI 事件数组。Tokscale 只统计：\n- `type == \"say\"`\n- `say == \"api_req_started\"`\n\n`text` 字段是包含令牌\u002F成本元数据的 JSON：\n```json\n{\n  \"type\": \"say\",\n  \"say\": \"api_req_started\",\n  \"ts\": \"2026-02-18T12:00:00Z\",\n  \"text\": \"{\\\"cost\\\":0.12,\\\"tokensIn\\\":100,\\\"tokensOut\\\":50,\\\"cacheReads\\\":20,\\\"cacheWrites\\\":5,\\\"apiProtocol\\\":\\\"anthropic\\\"}\"\n}\n```\n\n### Kilo\n\n位置：\n- 本地：`~\u002F.config\u002FCode\u002FUser\u002FglobalStorage\u002Fkilocode.kilo-code\u002Ftasks\u002F{TASK_ID}\u002Fui_messages.json`\n- 服务器（尽力而为）：`~\u002F.vscode-server\u002Fdata\u002FUser\u002FglobalStorage\u002Fkilocode.kilo-code\u002Ftasks\u002F{TASK_ID}\u002Fui_messages.json`\n\nKilo 使用与 Roo Code 相同的任务日志结构。Tokscale 应用相同的规则：\n- 只统计 `ui_messages.json` 中的 `say\u002Fapi_req_started` 事件\n- 从 `text` JSON 中解析 `tokensIn`、`tokensOut`、`cacheReads`、`cacheWrites`、`cost` 和 `apiProtocol`\n- 在可用时，从同级的 `api_conversation_history.json` 中丰富模型\u002F代理元数据\n\n### Mux\n\n位置：\n- `~\u002F.mux\u002Fsessions\u002F{WORKSPACE_ID}\u002Fsession-usage.json`\n\nMux 将每会话的累计令牌用量存储在 `session-usage.json` 文件中。每个文件包含一个 `byModel` 映射，其中按模型细分了以下用量：\n- `input`、`cached`（缓存读取）、`cacheCreate`（缓存写入）、`output`、`reasoning`\n- 模型名称采用 `provider:model` 格式（例如 `anthropic:claude-opus-4-6`），Tokscale 会去除提供商前缀以识别模型\n- 子代理的用量会由 Mux 自动汇总到父会话中，因此不会出现重复计算\n\n### Kilo CLI\n\n位置：`~\u002F.local\u002Fshare\u002Fkilo\u002Fkilo.db`\n\nKilo CLI 将会话数据存储在一个类似于 OpenCode 的 SQLite 数据库中。每条消息记录都包含按消息细分的令牌用量（输入、输出、缓存读写、推理），并注明模型和提供商。\n\n### Crush\n\n位置：通过 `$XDG_DATA_HOME\u002Fcrush\u002Fprojects.json` 发现的项目级 
SQLite 数据库（备用：`~\u002F.local\u002Fshare\u002Fcrush\u002Fprojects.json`）\n\nCrush 将用量存储在每个项目的 SQLite 数据库（`crush.db`）中。Tokscale 只从根会话中导入会话级别的总成本，因为 Crush 不提供可靠的每条消息或每模型的令牌核算。记录显示为 `model=session-total`，且无令牌细分。\n\n### Synthetic (synthetic.new)\n\nSynthetic 的用量通过对现有代理会话文件进行后处理来检测。当消息使用 `hf:` 模型 ID 或合成提供商（`synthetic`、`glhf`、`octofriend`）时，这些消息会被重新归因于 `synthetic`。\n\nTokscale 还会检查位于 `~\u002F.local\u002Fshare\u002Foctofriend\u002Fsqlite.db` 的 Octofriend SQLite 数据库，并在可用时解析带有令牌信息的记录。\n\n## 定价\n\nTokscale 会从 [LiteLLM 的定价数据库](https:\u002F\u002Fgithub.com\u002FBerriAI\u002Flitellm\u002Fblob\u002Fmain\u002Fmodel_prices_and_context_window.json) 获取实时定价。\n\n**动态回退**：对于 LiteLLM 中尚未收录的模型（例如最近发布的模型），Tokscale 会自动从 [OpenRouter 的 endpoints API](https:\u002F\u002Fopenrouter.ai\u002Fdocs\u002Fapi\u002Fapi-reference\u002Fendpoints\u002Flist-endpoints) 获取定价。这确保您可以从模型的原始提供商处获得准确的定价（例如 Z.AI 对 glm-4.7 的定价），而无需等待 LiteLLM 更新。\n\n**Cursor 模型定价**：对于刚刚发布、尚未出现在 LiteLLM 或 OpenRouter 中的模型（例如 `gpt-5.3-codex`），Tokscale 会包含硬编码的定价，该定价来源于 [Cursor 的模型文档](https:\u002F\u002Fcursor.com\u002Fen-US\u002Fdocs\u002Fmodels)。这些覆盖会在所有上游来源之后、模糊匹配之前被检查，因此一旦实际的上游定价可用，它们就会自动生效。\n\n**缓存**：定价数据会缓存在磁盘上（TTL 为 1 小时），以加快启动速度：\n- LiteLLM 缓存：`~\u002F.cache\u002Ftokscale\u002Fpricing-litellm.json`\n- OpenRouter 缓存：`~\u002F.cache\u002Ftokscale\u002Fpricing-openrouter.json`（缓存来自受支持提供商的模型的原始定价）\n\n定价包括：\n- 输入令牌\n- 输出令牌\n- 缓存读取令牌（折扣价）\n- 缓存写入令牌\n- 推理令牌（适用于 o1 等模型）\n- 分级定价（超过 20 万令牌）\n\n## 贡献\n\n欢迎贡献！请按照以下步骤操作：\n\n1. Fork 仓库并克隆到本地\n2. 创建功能分支 (`git checkout -b feature\u002Famazing-feature`)\n3. 进行更改\n4. 运行测试 (`cd packages\u002Fcore && bun run test:all`)\n5. 提交更改 (`git commit -m 'Add amazing feature'`)\n6. 推送到分支 (`git push origin feature\u002Famazing-feature`)\n7. 
打开拉取请求\n\n### 开发指南\n\n- 遵循现有代码风格\n- 为新功能添加测试\n- 根据需要更新文档\n- 保持提交内容专注且原子化\n\n## 致谢\n\n- 感谢 [ccusage](https:\u002F\u002Fgithub.com\u002Fryoppippi\u002Fccusage)、[viberank](https:\u002F\u002Fgithub.com\u002Fsculptdotfun\u002Fviberank) 和 [Isometric Contributions](https:\u002F\u002Fgithub.com\u002Fjasonlong\u002Fisometric-contributions) 提供的灵感\n- 感谢 [Ratatui](https:\u002F\u002Fgithub.com\u002Fratatui\u002Fratatui) 提供的终端 UI 框架\n- 感谢 [Solid.js](https:\u002F\u002Fwww.solidjs.com\u002F) 提供的响应式渲染能力\n- 感谢 [LiteLLM](https:\u002F\u002Fgithub.com\u002FBerriAI\u002Flitellm) 提供的定价数据\n- 感谢 [napi-rs](https:\u002F\u002Fnapi.rs\u002F) 提供的 Rust\u002FNode.js 绑定\n- 感谢 [github-contributions-canvas](https:\u002F\u002Fgithub.com\u002Fsallar\u002Fgithub-contributions-canvas) 提供的 2D 图表参考\n\n## 许可证\n\n\u003Cp align=\"center\">\n  \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fjunhoyeo\">\n    \u003Cimg src=\".github\u002Fassets\u002Flabtocat-on-spaceship.png\" width=\"540\">\n  \u003C\u002Fa>\n\u003C\u002Fp>\n\n\u003Cp align=\"center\">\n  \u003Cstrong>MIT © \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fjunhoyeo\">Junho Yeo\u003C\u002Fa>\u003C\u002Fstrong>\n\u003C\u002Fp>\n\n如果你觉得这个项目很有趣，**请考虑给它点个 Star ⭐**，或者 [在 GitHub 上关注我](https:\u002F\u002Fgithub.com\u002Fjunhoyeo)，一起加入这场旅程吧（目前已有 1,100 多位小伙伴同行）。我几乎全天候都在编程，并且定期发布令人惊叹的作品——你的支持绝不会白费。","# Tokscale 快速上手指南\n\nTokscale 是一款高性能命令行工具（CLI）和可视化仪表盘，专为追踪和分析多个 AI 编程助手（如 Claude Code, Cursor, OpenCode 等）的 Token 消耗与成本而设计。它基于 Rust 构建，提供极速的数据处理和精美的终端界面（TUI）。\n\n## 环境准备\n\n在开始之前，请确保你的开发环境满足以下要求：\n\n*   **操作系统**：支持 macOS, Linux, Windows (WSL 推荐)。\n*   **运行时环境**（二选一）：\n    *   [Node.js](https:\u002F\u002Fnodejs.org\u002F) (推荐 LTS 版本)\n    *   [Bun](https:\u002F\u002Fbun.sh\u002F) (速度更快，推荐)\n*   **可选依赖**：如果你需要从源码构建或进行二次开发，需要安装 [Rust 工具链](https:\u002F\u002Frustup.rs\u002F)。\n\n> **提示**：国内用户若下载 Bun 或 Node 较慢，可配置相应的国内镜像源加速安装。\n\n## 安装步骤\n\nTokscale 无需复杂的全局安装，推荐使用 `npx` 或 `bunx` 直接运行最新版本。\n\n### 方式一：使用 npx (Node.js 用户)\n\n```bash\nnpx tokscale@latest\n```\n\n### 方式二：使用 
bunx (Bun 用户 - 推荐)\n\n```bash\nbunx tokscale@latest\n```\n\n### 轻量模式\n\n如果只需要表格数据而不需要交互式界面，可以添加 `--light` 参数：\n\n```bash\nnpx tokscale@latest --light\n```\n\n> **说明**：首次运行时，工具会自动下载包含原生 Rust 核心的二进制文件，无需手动编译。\n\n## 基本使用\n\n首次运行后，Tokscale 会自动扫描你本地常见的 AI 工具数据目录（如 `~\u002F.claude`, `~\u002F.cursor`, `~\u002F.opencode` 等），并计算 Token 用量和预估费用。\n\n### 1. 启动交互式界面 (TUI)\n\n在终端直接运行以下命令即可进入主界面：\n\n```bash\nbunx tokscale@latest\n```\n\n**界面功能概览：**\n*   **Overview (概览)**：查看总消耗、成本及贡献热力图。\n*   **Models (模型)**：按模型维度统计 Token 使用情况。\n*   **Daily (每日)**：查看每日消耗趋势。\n*   **Stats (统计)**：详细的输入\u002F输出\u002F缓存\u002F推理 Token 细分数据。\n\n**操作方式：**\n*   使用 **方向键** 或 **鼠标** 进行导航。\n*   按 `q` 或 `Esc` 退出。\n\n### 2. 查看支持的平台\n\nTokscale 默认支持以下主流 AI 编程工具的数据读取（只需它们已在本地产生过数据）：\n\n| 工具名称 | 数据路径示例 |\n| :--- | :--- |\n| **Claude Code** | `~\u002F.claude\u002Fprojects\u002F` |\n| **Cursor IDE** | 通过 API 同步至 `~\u002F.config\u002Ftokscale\u002Fcursor-cache\u002F` |\n| **OpenCode** | `~\u002F.local\u002Fshare\u002Fopencode\u002Fopencode.db` |\n| **Gemini CLI** | `~\u002F.gemini\u002Ftmp\u002F*\u002Fchats\u002F*.json` |\n| **Codex CLI** | `~\u002F.codex\u002Fsessions\u002F` |\n| **Roo Code \u002F Kilo** | VS Code 全局存储目录 |\n\n### 3. 常用命令示例\n\n*   **查看帮助信息**：\n    ```bash\n    bunx tokscale@latest --help\n    ```\n\n*   **仅查看特定平台数据** (例如只看 Claude Code)：\n    ```bash\n    bunx tokscale@latest --platform claude\n    ```\n\n*   **导出数据为 JSON**：\n    ```bash\n    bunx tokscale@latest --export json\n    ```\n\n*   **提交数据到排行榜 (可选)**：\n    如果你想参与社区排名或生成个人年度总结，可以提交匿名化数据：\n    ```bash\n    bunx tokscale@latest submit\n    ```\n\n### 4. 
价格计算\n\nTokscale 内置了来自 LiteLLM 的最新定价数据，支持分层定价和缓存 Token 折扣计算。价格数据会缓存在本地（有效期 1 小时），确保查询快速且准确。\n\n---\n*现在，运行 `bunx tokscale@latest` 开始监控你的 AI 开发 Token 消耗吧！*","某全栈开发团队在日常工作中同时使用 Claude Code、Cursor 和 OpenCode 等多种 AI 编程助手进行快速迭代，亟需掌控整体 Token 消耗以优化成本。\n\n### 没有 tokscale 时\n- **数据分散难统计**：各工具的 Token 记录散落在不同的本地数据库或日志文件中，手动汇总耗时且容易出错。\n- **成本黑盒无感知**：无法实时查看具体哪个模型或哪次会话消耗了大量 Token，往往等到月底收到高额账单才后知后觉。\n- **缺乏可视化分析**：只有枯燥的数字列表，难以直观判断每日用量趋势或识别异常的高峰时段。\n- **团队协作无基准**：成员间无法对比使用效率，缺乏全局排行榜来激励优化提示词或减少冗余调用。\n\n### 使用 tokscale 后\n- **一键聚合多源数据**：tokscale 自动读取并统一展示来自 Claude Code、Cursor 等所有支持工具的用量数据，生成标准化报表。\n- **实时监控与预警**：通过原生 Rust TUI 界面实时刷新消耗详情，精准定位高成本模型和会话，立即调整使用策略。\n- **直观图表辅助决策**：利用 2D\u002F3D 贡献图和每日摘要视图，清晰呈现用量波动规律，快速发现并解决异常消耗问题。\n- **全球榜单驱动优化**：提交数据至全球排行榜，团队可在匿名对比中了解自身效率水平，主动优化工作流以降低单位产出成本。\n\ntokscale 将分散隐蔽的 Token 消耗转化为透明可视的数据资产，帮助开发者在享受 AI 提效红利的同时牢牢掌握成本主动权。","https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fjunhoyeo_tokscale_cbface76.png","junhoyeo","Junho Yeo","https:\u002F\u002Foss.gittoolsai.com\u002Favatars\u002Fjunhoyeo_2a876247.jpg",null,"Seoul, Korea","i@junho.io","_junhoyeo","https:\u002F\u002Fjunho.io\u002Fabout","https:\u002F\u002Fgithub.com\u002Fjunhoyeo",[85,89,93,97,101],{"name":86,"color":87,"percentage":88},"Rust","#dea584",67.9,{"name":90,"color":91,"percentage":92},"TypeScript","#3178c6",31.6,{"name":94,"color":95,"percentage":96},"JavaScript","#f1e05a",0.3,{"name":98,"color":99,"percentage":100},"CSS","#663399",0.1,{"name":102,"color":103,"percentage":100},"Shell","#89e051",1556,118,"2026-04-03T15:14:25","MIT","Linux, macOS, Windows","未说明",{"notes":111,"python":109,"dependencies":112},"该工具是一个基于 Rust 核心的 CLI 工具，主要通过 Node.js (npx) 或 Bun (bunx) 运行。原生 Rust 模块已预包含在安装包中，仅当需要从源码构建时才需要安装 Rust 工具链。它用于监控和分析多个 AI 编码代理的 Token 使用情况，不涉及 GPU 加速或大型模型下载。",[113,114],"Node.js 或 Bun","Rust toolchain 
(可选，用于从源码构建)",[13,14,15,26,53],[117,118,119,120,121,122,123,124,125,126,127,128,129,130,131,132,133,134,135,136],"claude","claude-code","codex","cursor","cursor-ide","gemini","gemini-cli","gpt","opencode","opentui","ai","ai-agents","token-usage","ampcode","droid","oh","oh-my-opencode","clawdbot","moltbot","openclaw","2026-03-27T02:49:30.150509","2026-04-06T05:36:33.189851",[140,145,150,155,160,165],{"id":141,"question_zh":142,"answer_zh":143,"source_url":144},13222,"为什么查询 `gpt-5.3-codex` 的价格时显示错误的数值（如 $0.25\u002F$2.00）？","这是因为工具错误匹配了 LiteLLM 数据中基于订阅的 `github_copilot\u002Fgpt-5.3-codex` 条目（显示为 $0.00），从而回退到了不准确的 `gpt-5.1-codex-mini` 价格。由于 GPT-5.3 Codex 尚未在 OpenAI API 正式上架，工具已更新为注入硬编码的价格覆盖值。当前正确价格为：输入 $1.75 \u002F 1M tokens，输出 $14.00 \u002F 1M tokens，缓存读取 $0.175 \u002F 1M tokens。请升级到最新版本以获取修复。","https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fissues\u002F172",{"id":146,"question_zh":147,"answer_zh":148,"source_url":149},13223,"提交数据时遇到 'Invalid enum value... received kilo' 错误怎么办？","该错误是因为客户端名称枚举值不匹配导致的。维护者已在最新版本中修复了此问题。请尝试升级到最新版 tokscale（例如 v2.0.15 或更高版本）来解决此验证错误。升级命令参考：`bunx tokscale@latest` 或通过 releases 页面下载指定标签版本。","https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fissues\u002F363",{"id":151,"question_zh":152,"answer_zh":153,"source_url":154},13224,"本地 Codex 解析器统计的 Token 数量为何比实际多（重复计数问题）？","旧版本逻辑优先使用 `last_token_usage` 字段，导致在会话日志中出现重复快照（累计总量未变化）时被重复计数。该问题已在 `tokscale@v2.0.10` 中修复。新逻辑优先使用 `total_token_usage` 的增量差值，并跳过累计总量未前进的行。请运行以下命令升级：`bunx tokscale@latest` 或手动更新至 v2.0.10+。","https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fissues\u002F297",{"id":156,"question_zh":157,"answer_zh":158,"source_url":159},13225,"Gemini CLI 的使用量自从 2 月 7 日起停止追踪怎么办？","这是一个已知的解析故障，已在 `tokscale@2.0.5` 版本中修复。请确保您将 tokscale 更新到 2.0.5 或更高版本，之后 Gemini CLI 的活动将恢复正常追踪。如果更新后问题仍然存在，请重新提交 Issue。","https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fissues\u002F233",{"id":161,"question_zh":162,"answer_zh":163,"source_url":164},13226,"在 Linux x64 系统上运行 
`bunx tokscale@latest submit` 报错 'Native module required' 如何解决？","这是由于 `@tokscale\u002Fcore` 包缺少 Linux x64 平台的预编译二进制文件导致的。通常是因为 `optionalDependencies` 配置缺失或未正确发布对应平台的包。解决方法是确保安装最新版本的 tokscale，或者如果是开发环境，可能需要手动构建核心模块：运行 `bun run build:core`。建议检查是否已安装针对 `linux-x64-gnu` 的可选依赖包。","https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fissues\u002F208",{"id":166,"question_zh":167,"answer_zh":168,"source_url":144},13227,"GPT-5.3 Codex 的官方定价来源是什么？","由于 GPT-5.3 Codex 尚未在 OpenAI 官方 API 定价表中列出，目前的定价数据来源于 Cursor 的官方文档（cursor.com\u002Fen-US\u002Fdocs\u002Fmodels）以及 llm-stats.com 的佐证数据。具体价格为：输入 $1.75，输出 $14.00，缓存读取 $0.175（每 1M tokens）。一旦 OpenAI 官方 API 上线，工具将自动切换至官方数据。",[170,175,180,185,190,195,200,205,210,215,220,225,230,235,240,245,250,255,260,265],{"id":171,"version":172,"summary_zh":173,"released_at":174},71895,"v2.0.17","\u003Cdiv align=\"center\">\n\n[![Tokscale](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fraw\u002Fmain\u002F.github\u002Fassets\u002Fhero-v2.png)](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale)\n\n# `tokscale@v2.0.17` 发布啦！\n\u003C\u002Fdiv>\n\n## 变更内容\n* 修复（前端）：由 @stevejkang 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F382 中改进了徽章 SVG 文本渲染和字距\n* 修复（TUI）：由 @stevejkang 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F383 中将主题背景应用到概览页的图表和图例区域\n\n**完整变更日志**：https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fcompare\u002Fv2.0.16...v2.0.17\n","2026-04-01T10:03:53",{"id":176,"version":177,"summary_zh":178,"released_at":179},71896,"v2.0.16","\u003Cdiv align=\"center\">\n\n[![Tokscale](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fraw\u002Fmain\u002F.github\u002Fassets\u002Fhero-v2.png)](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale)\n\n# `tokscale@v2.0.16` 已发布！\n\u003C\u002Fdiv>\n\n## 变更内容\n* 重构(客户端)：将 Kilo 整合为多源客户端架构，由 @junhoyeo 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F370 中完成。\n* 修复(TUI)：将代理名称首字母大写，并移除框架前缀，由 
@junhoyeo 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F372 中完成。\n* 新功能(前端)：添加排行榜用户搜索功能，由 @junhoyeo 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F371 中实现。\n* 新功能(前端)：添加 shields.io 风格的 GitHub 个人主页徽章，由 @stevejkang 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F373 中实现。\n* 新功能(设置)：新增自助提交数据删除功能，由 @IvGolovach 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F365 中完成。\n* 新功能(CLI)：为本地使用添加工作空间感知分组功能，由 @IvGolovach 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F340 中实现。\n* 新功能(CLI)：为本地报告命令添加 --home 覆盖选项，由 @IvGolovach 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F367 中实现。\n* 新功能(Crush)：添加初始会话级本地使用支持，由 @IvGolovach 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F346 中实现。\n* 修复(审计)：保留 Crush 工作空间分组和排行榜搜索排名，由 @junhoyeo 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F377 中完成。\n\n## 新贡献者\n* @stevejkang 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F373 中完成了首次贡献。\n\n**完整变更日志**：https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fcompare\u002Fv2.0.15...v2.0.16","2026-03-31T16:45:52",{"id":181,"version":182,"summary_zh":183,"released_at":184},71897,"v2.0.15","\u003Cdiv align=\"center\">\n\n[![Tokscale](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fraw\u002Fmain\u002F.github\u002Fassets\u002Fhero-v2.png)](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale)\n\n# `tokscale@v2.0.15` 已发布！\n\u003C\u002Fdiv>\n\n## 变更内容\n* 修复(前端)：接受 Kilo CLI 提交，由 @IvGolovach 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F368 中完成\n* 修复(核心)：对扫描路径进行排序，以实现确定性的去重和稳定的刷新，由 @crhan 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F357 中完成\n* 更新定价以支持所有 Composer 模型，由 @RawToast 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F361 中完成\n* 修复：增加未来日期的缓冲时间，以防止因时区问题导致的提交失败，由 @bluzername 
在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F360 中完成\n* 修复(会话\u002Fclaudecode)：为重复条目保留最终的流式传输 token 数量，由 @crhan 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F358 中完成\n* 文档：更新 README.md，由 @junhoyeo (fb165f910a672636cfa8e3582ecda2aeaf192d93) 完成\n\n## 新贡献者\n* @bluzername 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F360 中完成了首次贡献\n* @RawToast 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F361 中完成了首次贡献\n\n**完整变更日志**：https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fcompare\u002Fv2.0.14...v2.0.15","2026-03-29T14:54:33",{"id":186,"version":187,"summary_zh":188,"released_at":189},71898,"v2.0.14","\u003Cdiv align=\"center\">\n\n[![Tokscale](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fraw\u002Fmain\u002F.github\u002Fassets\u002Fhero-v2.png)](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale)\n\n# `tokscale@v2.0.14` 已发布！\n\u003C\u002Fdiv>\n\n## 变更内容\n* 修复(cli)：限制提交时间为当前 UTC 日期，以防止未来日期的请求被拒绝，由 @cuipengfei 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F337 中实现。\n* 修复(scanner)：在扫描时包含 OpenClaw 自动归档的子代理对话记录，由 @baanish 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F342 中实现。\n* 修复(cli)：在限制 UTC 日期后重新计算所有摘要字段，由 @junhoyeo 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F343 中实现。\n* 新增功能(cli)：添加 --kilo 标志，用于 Kilo CLI 的使用情况统计，由 @mine-13-zoom 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F353 中实现。\n* 修复(pricing)：在提供者感知匹配中跳过所有无定价条目，由 @crhan 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F352 中实现。\n* 功能增强：通过 `TOKSCALE_EXTRA_DIRS` 支持额外的扫描目录，由 @crhan 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F348 中实现。\n* 性能优化(core)：缓存已解析的源消息以便刷新，由 @IvGolovach 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F327 中实现。\n* 防止派生摘要中的回归，保护 UTC 提交日期限制功能，由 @junhoyeo 
(3a935d713221f445cf00fac8783dde082f928924) 实现。\n* 防止 OpenClaw 归档对话记录扫描中的回归，由 @junhoyeo (c29e8171b1319cd7c0cbaeceb40cd13307321a26) 实现。\n* 保持 #327 缓存刷新审计分支的 lint 检查通过，由 @junhoyeo (041fce135f804cf3e7eef9423b62591370d142e9) 实现。\n* 证明格式错误的 Codex 行不会导致热缓存刷新结果偏差，由 @junhoyeo (341e7f0bc6ab94486e4ac3314fde3afb501185d5) 实现。\n* 在 v2.0.13 后的审计之后，保留有效的本地扫描和定价查询结果，由 @junhoyeo (1a4b897e02f5a96e0c61313417abe6966422b080) 实现。\n* 解除 Kilo 解析器 clippy lint 的阻塞，以便 CI 能够隔离剩余的定价问题，由 @junhoyeo (9a1aa75e1eb0b05dde22e3080610df2bd9e0f4c0) 实现。\n* 杂项(fmt)：在审计合并后保持严格的格式规范，由 @junhoyeo (8b0f54209e6de08f65efec2be29cd12c52f1d6d5) 实现。\n* 修复(openclaw)：统计重置归档和仅模型快照过渡的情况，由 @PJunhyuk 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F349 中实现。\n* 修复(ci)：为 bun publish 写入 ~\u002F.npmrc 文件，并规范化仓库 URL，由 @junhoyeo (bef1da83f3778097fddcc0a9bbe03c3a0c0f5120) 实现。\n\n## 新贡献者\n* @PJunhyuk 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F349 中完成了首次贡献。\n* @crhan 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F352 中完成了首次贡献。\n* @mine-13-zoom 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F353 中完成了首次贡献。\n* @baanish 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F342 中完成了首次贡献。\n\n**完整变更日志**：https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fcompare\u002Fv2.0.13...v2.0.14","2026-03-25T22:30:57",{"id":191,"version":192,"summary_zh":193,"released_at":194},71899,"v2.0.13","\u003Cdiv align=\"center\">\n\n[![Tokscale](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fraw\u002Fmain\u002F.github\u002Fassets\u002Fhero-v2.png)](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale)\n\n# `tokscale@v2.0.13` 已发布！\n\u003C\u002Fdiv>\n\n## 变更内容\n* 功能（前端）：由 @Yeachan-Heo 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F323 中更新 README 嵌入的 SVG 卡片\n* 修复（定价）：由 @IvGolovach 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F322 中使定价查询具备供应商感知能力\n* 
修复（定价）：由 @junhoyeo 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F332 中处理供应商感知查询的边缘情况\n* 修复（定价）：由 @junhoyeo 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F335 中强化供应商感知查询与推断的健壮性\n* 修复（核心）：由 @IvGolovach 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F333 中在使用过期缓存回退前刷新本地定价\n\n**完整变更日志**：https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fcompare\u002Fv2.0.12...v2.0.13\n","2026-03-17T01:03:33",{"id":196,"version":197,"summary_zh":198,"released_at":199},71900,"v2.0.12","\u003Cdiv align=\"center\">\n\n[![Tokscale](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fraw\u002Fmain\u002F.github\u002Fassets\u002Fhero-v2.png)](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale)\n\n# `tokscale@v2.0.12` 已发布！\n\u003C\u002Fdiv>\n\n## 变更内容\n* 功能（提交）：在个人资料和排行榜中公开新鲜度元数据，由 @IvGolovach 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F314 中实现。\n* 修复（opencode）：规范化代理名称的大小写，由 @minpeter 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F326 中实现。\n* 修复（CLI）：对于无头登录，跳过浏览器自动打开，由 @IvGolovach 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F325 中实现。\n\n**完整变更日志**：https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fcompare\u002Fv2.0.11...v2.0.12\n","2026-03-15T12:45:12",{"id":201,"version":202,"summary_zh":203,"released_at":204},71901,"v2.0.11","\u003Cdiv align=\"center\">\n\n[![Tokscale](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fraw\u002Fmain\u002F.github\u002Fassets\u002Fhero-v2.png)](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale)\n\n# `tokscale@v2.0.11` 已发布！\n\u003C\u002Fdiv>\n\n## 变更内容\n* 功能（前端）：由 @Yeachan-Heo 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F315 中添加了更加精美的个人资料嵌入对话框\n* 更新跟踪标签以反映新值，由 @junhoyeo 完成 (6084a2d04413697da79a6d747a3278d650d105de)\n* 修复（TUI）：由 @minpeter 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F316 
中将 Opencode 代理变体合并到代理统计中\n* 修复：允许在未来日期验证中使用 1 天的时区缓冲，由 @cuipengfei 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F319 中完成\n\n## 新贡献者\n* @cuipengfei 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F319 中完成了首次贡献\n* @minpeter 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F316 中完成了首次贡献\n\n**完整变更日志**：https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fcompare\u002Fv2.0.10...v2.0.11\n","2026-03-13T03:25:01",{"id":206,"version":207,"summary_zh":208,"released_at":209},71902,"v2.0.10","\u003Cdiv align=\"center\">\n\n[![Tokscale](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fraw\u002Fmain\u002F.github\u002Fassets\u002Fhero-v2.png)](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale)\n\n# `tokscale@v2.0.10` 已发布！\n\u003C\u002Fdiv>\n\n## 变更内容\n* 修复：使用本地日期进行每日用量分桶，由 IvGolovach 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F288 中实现。\n* 修复：为排行榜周期使用每日细分数据，由 IvGolovach 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F289 中实现。\n* 文档（README）：添加新的 Discord 链接和徽章，由 junhoyeo (3d8aad66890c5e65174991ff8b38d4fecdc2a873) 完成。\n* 文档（README）：修复表格，由 junhoyeo (678a2b7af1fa739e6e31672ffbb96491442c799b) 完成。\n* 文档（README）：修复表格，由 junhoyeo (79a46563c7d703ef1d8192ac3584d9acbaaaba54) 完成。\n* 文档：从 README 中移除过时的 OpenTUI\u002FBun 相关引用，由 junhoyeo 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F290 中完成。\n* 修复（TUI）：过滤掉按键释放事件以防止重复操作，由 junhoyeo 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F291 中实现。\n* 功能（TUI）：添加 PageUp\u002FPageDown\u002FHome\u002FEnd 导航功能以及三区统计布局，由 junhoyeo 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F292 中实现。\n* 修复：对于重复快照，优先使用累计 token 差值，由 IvGolovach 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F298 中完成。\n* 修复：为合成网关流量保留客户端身份，由 IvGolovach 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F301 中完成。\n* 
功能（CI）：在发布后将发布说明推送到 Discord，由 junhoyeo 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F293 中实现。\n* 修复：为 Composer 1.5 添加 Cursor 定价，由 IvGolovach 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F299 中完成。\n* 修复：修正合成提供商的归一化逻辑并移除死代码，由 junhoyeo 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F302 中完成。\n* 功能（TUI）：添加本地代理成本细分，由 IvGolovach (35afa5d71557ed7e39b5b2529938d9e74a873c1c) 实现。\n* 修复：使旧版 TUI 缓存模式失效，由 IvGolovach (1c8bf184f485ed2f215aeffd776fac8678bc4396) 完成。\n* 样式：移除 TUI 数据测试中的尾部空行，由 junhoyeo (a0b51eb1b9dc80b715d95b871a8235be91a06dcd) 完成。\n* 重构（前端）：集中管理个人 token 认证路径，由 IvGolovach 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F294 中完成。\n* 测试：为设置 token 和设备轮询边缘情况添加路由级别测试，由 junhoyeo 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F305 中完成。\n* 修复：解决深度质量审计中发现的 P1 级别问题，由 junhoyeo 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F306 中完成。\n* 修复（合成）：保留网关客户端身份，由 IvGolovach 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F304 中完成。\n* 修复（缓存）：为升级路径提供向后兼容的 TUI 缓存，由 junhoyeo 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F307 中完成。\n* 修复（Codex）：避免重复计算过时的 token_count 回归问题，由 Soju06 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F308 中完成。\n* 修复（Codex）：将 last_token_usage 作为主要增量来源，由 junhoyeo 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F309 中完成。\n* 修复（Codex）：将会话提供商\u002F代理传播至无头回退，并修复空 slug 的短路问题，由 junhoyeo 在 https:\u002F\u002Fg","2026-03-11T16:44:03",{"id":211,"version":212,"summary_zh":213,"released_at":214},71903,"v2.0.9","\u003Cdiv align=\"center\">\n\n[![Tokscale](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fraw\u002Fmain\u002F.github\u002Fassets\u002Fhero-v2.png)](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale)\n\n# `tokscale@v2.0.9` 已发布！\n\u003C\u002Fdiv>\n\n## 变更内容\n* 修复：在 BreakdownPanel 中居中 CloseButton 
图标，由 @anaclumos 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F286 中完成\n* 修复终端高度较小时 Stats 选项卡出现的 panic 问题，由 @shawnpang 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F287 中完成\n\n## 新贡献者\n* @shawnpang 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F287 中完成了首次贡献\n\n**完整变更日志**：https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fcompare\u002Fv2.0.8...v2.0.9\n","2026-03-07T21:43:58",{"id":216,"version":217,"summary_zh":218,"released_at":219},71904,"v2.0.8","\u003Cdiv align=\"center\">\n\n[![Tokscale](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fraw\u002Fmain\u002F.github\u002Fassets\u002Fhero-v2.png)](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale)\n\n# `tokscale@v2.0.8` 已发布！\n\u003C\u002Fdiv>\n\n## 变更内容\n* 修复（TUI）：由 @junhoyeo 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F285 中实现，支持通过鼠标滚轮事件进行列表导航。\n* 修复：由 @junhoyeo 在 https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F284 中解决，修复了在生成发布说明时 PR 标题被截断的问题。\n\n**完整变更日志**：https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fcompare\u002Fv2.0.7...v2.0.8\n","2026-03-06T04:50:03",{"id":221,"version":222,"summary_zh":223,"released_at":224},71905,"v2.0.7","\u003Cdiv align=\"center\">\n\n[![Tokscale](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fraw\u002Fmain\u002F.github\u002Fassets\u002Fhero-v2.png)](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale)\n\n# `tokscale@v2.0.7` is here!\n\u003C\u002Fdiv>\n\n## What's Changed\n* fix(tui): make stats day breakdown scrollable by @junhoyeo in https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F283\n\n**Full Changelog**: https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fcompare\u002Fv2.0.6...v2.0.7\n","2026-03-05T23:30:37",{"id":226,"version":227,"summary_zh":228,"released_at":229},71906,"v2.0.6","\u003Cdiv 
align=\"center\">\n\n[![Tokscale](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fraw\u002Fmain\u002F.github\u002Fassets\u002Fhero-v2.png)](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale)\n\n# `tokscale@v2.0.6` is here!\n\u003C\u002Fdiv>\n\n## What's Changed\n* fix(pricing): correct >200k tier billing and harden provider-prefixed model resolution by @daymade in https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F277\n* feat(tui): add today jump\u002Fhighlight in daily view while keeping globa… by @cy920820 in https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F278\n* fix(tui): stabilize day breakdown render order to prevent flickering by @junhoyeo in https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F279\n* fix(tui): use BTreeMap for DailyUsage.models to eliminate all HashMap flickering by @junhoyeo in https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F280\n* fix(tui): use Utc instead of Local for today comparison in Daily view by @junhoyeo in https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F281\n\n## New Contributors\n* @cy920820 made their first contribution in https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F278\n* @daymade made their first contribution in https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F277\n\n**Full Changelog**: https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fcompare\u002Fv2.0.5...v2.0.6\n","2026-03-05T10:35:11",{"id":231,"version":232,"summary_zh":233,"released_at":234},71907,"v2.0.5","\u003Cdiv align=\"center\">\n\n[![Tokscale](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fraw\u002Fmain\u002F.github\u002Fassets\u002Fhero-v2.png)](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale)\n\n# `tokscale@v2.0.5` is here!\n\u003C\u002Fdiv>\n\n## What's Changed\n* Update AI agent guidelines in AGENTS.md by @junhoyeo 
(540af8d1763d2f440879b362abcf12a697651b6d)\n* feat: add Mux (coder\u002Fmux) client support by @kacpersaw in https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F269\n* ci: update coverage badge [skip ci] by @github-actions[bot] (1f5c6821a17f49223904e16ba9eb56267acbcb2d)\n* [ImgBot] Optimize images by @app\u002Fimgbot in https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F272\n* feat: warm TUI cache after successful submit by @junhoyeo in https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F273\n* fix(mux): harden Mux parser and fill coverage gaps from #269 by @junhoyeo in https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F274\n* ci: update coverage badge [skip ci] by @github-actions[bot] (9c5a7e2263cc7f19d0c0e4d10cb615fb2007fad9)\n* fix: restore Gemini CLI tracking and handle new session format by @cantalupo555 in https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F275\n\n## New Contributors\n* @kacpersaw made their first contribution in https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F269\n\n**Full Changelog**: https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fcompare\u002Fv2.0.4...v2.0.5\n","2026-03-04T09:14:46",{"id":236,"version":237,"summary_zh":238,"released_at":239},71908,"v2.0.4","\u003Cdiv align=\"center\">\n\n[![Tokscale](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fraw\u002Fmain\u002F.github\u002Fassets\u002Fhero-v2.png)](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale)\n\n# `tokscale@v2.0.4` is here!\n\u003C\u002Fdiv>\n\n## What's Changed\n* Fix Octofriend link in README by @junhoyeo (711a3bece88f448e5eb7e12e445fb9d0349c71cd)\n* Sync translated READMEs (zh-cn, ko, ja) with English source by @junhoyeo (1d0b2370e01218d4e8837f2e97644faf6df9f75c)\n* feat(cli): add OSC 8 clickable hyperlinks for terminal URLs by @junhoyeo in 
https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F263\n* ci: grant write permissions to coverage job for badge push by @junhoyeo in https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F265\n* fix(cache): subset matching for new clients + consolidate lint CI by @junhoyeo in https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F264\n* docs: update v2 note — v2 is shipped, not upcoming by @junhoyeo in https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F266\n\n**Full Changelog**: https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fcompare\u002Fv2.0.3...v2.0.4\n","2026-03-02T20:47:18",{"id":241,"version":242,"summary_zh":243,"released_at":244},71909,"v2.0.3","\u003Cdiv align=\"center\">\n\n[![Tokscale](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fraw\u002Fmain\u002F.github\u002Fassets\u002Fhero-v2.png)](https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale)\n\n# `tokscale@v2.0.3` is here!\n\u003C\u002Fdiv>\n\n## What's Changed\n* feat: add Roo Code and KiloCode sources by @unbraind in https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F224\n* feat: add synthetic.new as 11th source by @ComBba in https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F225\n* feat(core): add Kilocode support by @asf0 in https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F226\n* feat: Add Qwen CLI token usage tracking support (Rust) by @Efan404 in https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F236\n* fix(cache): improve caching reliability and correctness by @junhoyeo in https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F251\n* fix(qwen): use per-test TempDir for cleanup and change hotkey to avoid quit conflict by @junhoyeo in https:\u002F\u002Fgithub.com\u002Fjunhoyeo\u002Ftokscale\u002Fpull\u002F253\n* docs(qwen): add Qwen CLI to READMEs, frontend, and logo assets by @junhoyeo in 
  https://github.com/junhoyeo/tokscale/pull/254
* [ImgBot] Optimize images by @app/imgbot in https://github.com/junhoyeo/tokscale/pull/256
* refactor(core): post-v2.0.2 audit cleanup — remove dead code, consolidate tests by @junhoyeo in https://github.com/junhoyeo/tokscale/pull/257
* ci(fmt): auto-fix and commit formatting issues on PRs by @junhoyeo in https://github.com/junhoyeo/tokscale/pull/258
* fix(cli): restore star prompt caching from v1 and fix gh API command by @junhoyeo in https://github.com/junhoyeo/tokscale/pull/259
* [ImgBot] Optimize images by @app/imgbot in https://github.com/junhoyeo/tokscale/pull/260
* feat(cli): add --synthetic flag, fix client display names, clean up model aliases by @junhoyeo in https://github.com/junhoyeo/tokscale/pull/261

## New Contributors
* @Efan404 made their first contribution in https://github.com/junhoyeo/tokscale/pull/236
* @asf0 made their first contribution in https://github.com/junhoyeo/tokscale/pull/226
* @ComBba made their first contribution in https://github.com/junhoyeo/tokscale/pull/225
* @unbraind made their first contribution in https://github.com/junhoyeo/tokscale/pull/224

**Full Changelog**: https://github.com/junhoyeo/tokscale/compare/v2.0.2...v2.0.3

_Released: 2026-03-02T11:34:50_

---

<div align="center">

[![Tokscale](https://github.com/junhoyeo/tokscale/raw/main/.github/assets/hero-v2.png)](https://github.com/junhoyeo/tokscale)

# `tokscale@v2.0.2` is here!
</div>

## What's Changed
* fix(cli): fix binary path resolution for npm/pnpm/bunx installs by @junhoyeo (6c73f009e6ce6e2fc68984cf90404e2e30b28ebc)

**Full Changelog**: https://github.com/junhoyeo/tokscale/compare/v2.0.1...v2.0.2

_Released: 2026-02-26T19:58:35_

---

<div align="center">

# `tokscale@v2.0.1` is here!
</div>

## What's Changed
* fix(ci): download bumped artifacts to correct packages/ path by @junhoyeo (44b2218451300bf7482ca94fdd35397f5642e9d1)

**Full Changelog**: https://github.com/junhoyeo/tokscale/compare/v2.0.0...v2.0.1

_Released: 2026-02-26T19:20:55_

---

<div align="center">

# `tokscale@v2.0.0` is here!
</div>

## What's Changed
* feat(frontend): add landing page and move leaderboard to /leaderboard by @junhoyeo in https://github.com/junhoyeo/tokscale/pull/177
* fix(frontend): resolve Vercel build failures from dependency type mismatches by @junhoyeo in https://github.com/junhoyeo/tokscale/pull/179
* feat(cli): show provider and raw model in --light table for dedup diagnosis by @junhoyeo in https://github.com/junhoyeo/tokscale/pull/174
* feat(core): normalize model names at aggregation level to consolidate duplicates by @junhoyeo in https://github.com/junhoyeo/tokscale/pull/175
* feat(cli): add --group-by flag for configurable model grouping strategies by @junhoyeo in https://github.com/junhoyeo/tokscale/pull/178
* perf: reduce allocations in JSONL parsing and optimize frontend rendering by @junhoyeo in https://github.com/junhoyeo/tokscale/pull/182
* feat: add comprehensive code coverage infrastructure and test suite (22% → 60%+) by @junhoyeo in https://github.com/junhoyeo/tokscale/pull/184
* fix(core): port missing SQLite+dedup and OpenClaw fixes to rewrite branch by @junhoyeo in https://github.com/junhoyeo/tokscale/pull/193
* fix(tui): resolve TUI freeze/brick when left idle and eliminate all warnings by @junhoyeo in https://github.com/junhoyeo/tokscale/pull/195
* fix(tui): wrap-around list/tab selection at list boundaries by @Yeachan-Heo in https://github.com/junhoyeo/tokscale/pull/215
* chore: sync main → feat/ratatui-rewrite (v1.2.7–v1.3.0) by @Yeachan-Heo in https://github.com/junhoyeo/tokscale/pull/214
* sync: merge main into ratatui-rewrite by @Yeachan-Heo in https://github.com/junhoyeo/tokscale/pull/219
* sync(main→rewrite): merge main v1.4.2 — Kimi CLI as 10th data source by @junhoyeo in https://github.com/junhoyeo/tokscale/pull/222
* fix(frontend): harden postgres connection pool for serverless by @junhoyeo in https://github.com/junhoyeo/tokscale/pull/223
* Ratatui rewrite and Rust workspace migration by @junhoyeo in https://github.com/junhoyeo/tokscale/pull/150
* feat(tui): add modal dialog for source selection by @junhoyeo in https://github.com/junhoyeo/tokscale/pull/228
* refactor: rename source→client for AI client terminology by @junhoyeo in https://github.com/junhoyeo/tokscale/pull/230
* feat(tui): add group-by picker dialog by @junhoyeo in https://github.com/junhoyeo/tokscale/pull/232
* Revise README with new image and follow info by @junhoyeo (5a095f472ee6b2289ade6e6185d54b686169176b)
* Add explanation of Tokscale and its inspiration by @junhoyeo (2a5c77404556f9dbcc19aed6a7b0209490a6556d)
* refactor(core,cli): centralize client definitions with define_clients! macro by @junhoyeo in https://github.com/junhoyeo/tokscale/pull/237
* Deploy v2 to main 🛰️ by @junhoyeo in https://github.com/junhoyeo/tokscale/pull/239
* fix: backwards compat for source→client rename and SSR profile fetch by @junhoyeo in https://github.com/junhoyeo/tokscale/pull/240
* chore: update og-image by @junhoyeo (8d09fe3dd15bc57c04d028ac359f9566687cd589)
* Update README with upcoming v2 announcement by @junhoyeo (f36a34b7825b6a2918fa00db2fd933f3ac7b2bab)
* docs: sync v2 announcement to translated READMEs (ko, ja, zh-cn) by @junhoyeo (83be80170613f2bfdaf549d29307efab1f2d09a5)
* feat(frontend): format large token counts as T (trillions) with styled hover tooltip by @junhoyeo (6ae0974db62062f8762a9fc96c228b7b84f6a46d)
* chore: fix version control by @junhoyeo (8a54542d35fbab925244fd6aa57678797644a6dd)
* feat(frontend): add mobile navigation for ≤520px viewports by @junhoyeo in https://github.com/junhoyeo/tokscale/pull/242
* refactor(frontend): remove @primer/react and replace with styled-components by @junhoyeo in https://github.com/junhoyeo/tokscale/pull/243
* feat(frontend): add GitHub README profile embed SVG widget by @Yeachan-Heo in https://github.com/junhoyeo/tokscale/pull/238
* Fix React Server Components CVE vulnerabilities by @app/vercel in https://github.com/junhoyeo/tokscale/pull/246
* chore(cli): remove dead v1 TypeScript code by @junhoyeo (51da7173b8d2e2cc20e9060b434c939316d6dc5c)
* chore: remove packages/core napi-rs package by @junhoyeo (3b8e2e06dba2ddb87d97b3d038b5a254841c3692)
* chore: regenerate bun.lock after workspace cleanup by @junhoyeo (771bb47bd5e3d1dd4b40b4bdb84a24c72d30b327)
* fix(ci): use cross-compilation strip for ARM64 Linux targets by @junhoyeo (b8b8608daef3ddd99640b1ed9332330aa2dfe5a5)

## New Contributors
* @app/vercel made their first contribution in https://github.com/junhoyeo/tokscale/pull/246

**Full Changelog**: https://github.com/junhoyeo/tokscale/compare/v1.4.3...v2.0.0

_Released: 2026-02-26T18:56:00_

---

<div align="center">

[![Tokscale](https://github.com/junhoyeo/tokscale/raw/main/.github/assets/hero.png)](https://github.com/junhoyeo/tokscale)

# `tokscale@v1.4.3` is here!
</div>

## What's Changed
* feat: add Kimi CLI as 10th data source by @Miss-you in https://github.com/junhoyeo/tokscale/pull/198
* fix(core): add dedup_key to Kimi parser and unify colors by @junhoyeo in https://github.com/junhoyeo/tokscale/pull/220
* sync(main→rewrite): merge main v1.4.2 — Kimi CLI as 10th data source by @junhoyeo in https://github.com/junhoyeo/tokscale/pull/222
* [ImgBot] Optimize images by @app/imgbot in https://github.com/junhoyeo/tokscale/pull/221

## New Contributors
* @Miss-you made their first contribution in https://github.com/junhoyeo/tokscale/pull/198

**Full Changelog**: https://github.com/junhoyeo/tokscale/compare/v1.4.2...v1.4.3

_Released: 2026-02-18T20:18:28_

---

<div align="center">

# `tokscale@v1.4.2` is here!
</div>

## What's Changed
* fix(opencode): fix migration cache edge cases by @Yeachan-Heo in https://github.com/junhoyeo/tokscale/pull/217
* fix(amp): use fallback timestamps for events missing timestamp field by @Yeachan-Heo in https://github.com/junhoyeo/tokscale/pull/218

**Full Changelog**: https://github.com/junhoyeo/tokscale/compare/v1.4.1...v1.4.2

_Released: 2026-02-18T17:56:40_