microsoft/aici

AICI: Prompts as (Wasm) Programs

41 / 100 · Emerging

Implements constrained decoding through lightweight WebAssembly modules that run on the inference engine's CPU during token generation, giving token-by-token control over LLM output. Controllers written in Rust, C++, Python, or JavaScript maintain state and implement diverse strategies, from programmatic decoding to multi-agent coordination, while the Wasm sandbox ensures security: modules get no filesystem or network access. It integrates with llama.cpp, HuggingFace Transformers, and rLLM, and is designed as a portable layer beneath higher-level control libraries such as Guidance and LMQL.
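The core idea, a controller vetoing tokens step by step before the sampler picks one, can be sketched in a few lines of plain Python. This is an illustration of constrained greedy decoding in general, not AICI's actual controller API; the toy vocabulary, logits, and `is_allowed` predicate are made up for the example:

```python
# Illustrative sketch of constrained (greedy) decoding; NOT AICI's real API.
# A controller-style predicate masks disallowed tokens before each pick.

def constrained_pick(logits, vocab, is_allowed):
    """Return the index of the highest-logit token that passes the predicate."""
    best = None
    for i, tok in enumerate(vocab):
        if is_allowed(tok) and (best is None or logits[i] > logits[best]):
            best = i
    return best

# Hypothetical toy vocabulary and per-token scores.
vocab = ["cat", "7", "dog", "42"]
logits = [2.0, 1.0, 3.0, 0.5]

# Unconstrained greedy decoding would pick "dog" (logit 3.0); constraining
# the output to digit-only tokens forces "7" instead.
idx = constrained_pick(logits, vocab, str.isdigit)
print(vocab[idx])  # prints "7"
```

An AICI controller implements this kind of mask inside a Wasm module invoked at every generation step, so the constraint logic keeps state across tokens without leaving the inference engine's process.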

2,064 stars. No commits in the last 6 months.

Stale (6 months) · No package · No dependents
Maintenance 0 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 15 / 25


Stars: 2,064
Forks: 82
Language: Rust
License: MIT
Last pushed: Jan 22, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/microsoft/aici"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
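Judging from the curl example above, the endpoint appears to follow a `/api/v1/quality/{category}/{owner}/{repo}` pattern. A small helper to build such URLs, where the generalization beyond this one repo is an assumption on my part:

```python
# Builds the quality-API URL shown in the curl example above.
# The {category}/{owner}/{repo} generalization is an assumption, inferred
# from the single documented URL; only the exact URL below is confirmed.

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    return f"{BASE}/{category}/{owner}/{repo}"

print(quality_url("llm-tools", "microsoft", "aici"))
# prints the same URL as the curl command above
```

The returned URL can then be fetched with any HTTP client (curl, `urllib`, `requests`) within the stated rate limits.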