VinAIResearch/PhoGPT

PhoGPT: Generative Pre-training for Vietnamese (2023)

Score: 44 / 100 (Emerging)

Built on a 3.7B-parameter decoder architecture trained on 102B Vietnamese tokens with 8K context length, PhoGPT supports inference across vLLM, Text Generation Inference, llama.cpp, and standard Transformers pipelines, with quantization options via bitsandbytes and GGUF formats. The chat variant incorporates instruction-following and conversation tuning on 70K prompt-response pairs plus 290K multi-turn dialogues. Full fine-tuning is facilitated through llm-foundry or alternative frameworks like LLaMA-Factory and lit-gpt.
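As a sketch of the standard Transformers path mentioned above — the model ID `vinai/PhoGPT-4B-Chat` and the `### Câu hỏi: … ### Trả lời:` instruction template are assumptions drawn from the project's typical Hugging Face release layout, so verify them against the repository README:

```python
# Hypothetical sketch: prompt formatting and generation for PhoGPT's chat
# variant via the standard Transformers pipeline. The model ID and prompt
# template below are assumptions, not confirmed by this page.

# Assumed instruction template for the chat variant.
PROMPT_TEMPLATE = "### Câu hỏi: {instruction}\n### Trả lời:"

def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in the assumed chat prompt template."""
    return PROMPT_TEMPLATE.format(instruction=instruction)

def generate(instruction: str, max_new_tokens: int = 256) -> str:
    """Lazily load the chat model and generate a reply (needs transformers + a GPU)."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "vinai/PhoGPT-4B-Chat"  # assumed Hugging Face model ID
    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.bfloat16, trust_remote_code=True
    )
    inputs = tokenizer(build_prompt(instruction), return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)

print(build_prompt("Việt Nam có bao nhiêu tỉnh thành?"))
```

The same formatted prompt can also be sent to a vLLM or llama.cpp server, since those runtimes accept raw text completions.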

798 stars. No commits in the last 6 months.

Badges: Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 18 / 25


Stars: 798
Forks: 74
Language: Python
License: BSD-3-Clause
Last pushed: Nov 12, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/VinAIResearch/PhoGPT"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
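The curl call above can be scripted with only the Python standard library. This is a minimal sketch: the URL mirrors the curl example, and nothing is assumed about the response beyond it being JSON.

```python
# Sketch of calling the quality API from Python using only the standard
# library. The endpoint path is taken from the curl example; the response
# is returned as decoded JSON without assuming any particular fields.
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{API_BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload (network call; rate-limited without a key)."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

print(quality_url("VinAIResearch", "PhoGPT"))
```

Pass an API key however the service documents it (e.g. a header or query parameter) to lift the limit to 1,000 requests/day.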