pkoukk/tiktoken-go
Go version of tiktoken
Implements BPE tokenization for OpenAI models with pluggable dictionary loaders: runtime downloads, cached dictionaries via `TIKTOKEN_CACHE_DIR`, or offline embedded encodings. Provides both encoding-level (cl100k_base, p50k_base) and model-specific (gpt-4, gpt-3.5-turbo) APIs, with built-in helpers for counting tokens in chat-completion messages compatible with the go-openai SDK.
895 stars. No commits in the last 6 months.
Stars: 895
Forks: 102
Language: Go
License: MIT
Last pushed: Sep 10, 2025
Commits (30d): 0
Get this data via API:
`curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/pkoukk/tiktoken-go"`
Open to everyone: 100 requests/day with no key; a free key raises the limit to 1,000/day.
Compare
Higher-rated alternatives
lenML/tokenizers
a lightweight no-dependency fork from transformers.js (only tokenizers)
aiqinxuancai/TiktokenSharp
Token calculation for OpenAI models, using the `o200k_base`, `cl100k_base`, and `p50k_base` encodings.
dqbd/tiktokenizer
Online playground for OpenAI tokenizers
tryAGI/Tiktoken
This project implements token calculation for OpenAI's gpt-4 and gpt-3.5-turbo models, ...
microsoft/Tokenizer
Typescript and .NET implementation of BPE tokenizer for OpenAI LLMs.