node-llama-cpp and LLamaSharp
These are ecosystem siblings: node-llama-cpp provides Node.js bindings for the underlying llama.cpp C++ inference engine, while LLamaSharp provides C#/.NET bindings for the same engine. Both let developers run local LLMs in their preferred language and runtime.
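Since both libraries wrap the same llama.cpp lifecycle (load a GGUF model, create an inference context, prompt it), that shared flow can be sketched abstractly. The interfaces and the in-memory stand-in below are hypothetical, not either library's actual API; the toy "loader" exists only so the flow runs without a model file:

```typescript
// Hypothetical sketch of the llama.cpp lifecycle both bindings expose.
// Not node-llama-cpp's or LLamaSharp's real API.

interface InferenceContext {
  prompt(text: string): string; // generate a completion for a prompt
}

interface LoadedModel {
  createContext(): InferenceContext; // allocate a context (KV cache) for generation
}

// Stand-in "loader": a real binding would load GGUF weights here.
function loadModel(modelPath: string): LoadedModel {
  return {
    createContext: () => ({
      prompt: (text) => `[${modelPath}] completion for: ${text}`,
    }),
  };
}

const model = loadModel("model.gguf"); // 1. load weights
const ctx = model.createContext();     // 2. create an inference context
const reply = ctx.prompt("Hello");     // 3. generate from a prompt
console.log(reply);
```

The same three steps appear in both libraries' real APIs, just spelled in their host language's idioms.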
                node-llama-cpp   LLamaSharp
Maintenance     16/25            20/25
Adoption        25/25            10/25
Maturity        25/25            16/25
Community       20/25            22/25
Stars           1,942            3,572
Forks           176              488
Downloads       4,219,393        —
Commits (30d)   3                14
Language        TypeScript       C#
License         MIT              MIT
No risk flags
About node-llama-cpp
withcatai/node-llama-cpp
Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.
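The JSON-schema point in that description refers to grammar-constrained sampling: at each generation step, the sampler only permits tokens that keep the partial output valid under the schema. A self-contained toy illustration of the idea follows; the token set, template, and helper are invented for the sketch (the real library compiles a JSON schema into a llama.cpp grammar rather than matching against a fixed string):

```typescript
// Toy generation-level constraint: the "model" may only emit tokens that keep
// the partial output a valid prefix of the target JSON. A fixed template stands
// in for a compiled schema grammar to keep the sketch tiny.
const template = '{"answer":42}';
const vocab = ['{"', 'answer', '":', '4', '2', '}', 'hello', 'true']; // invented token set

function allowedTokens(partial: string): string[] {
  return vocab.filter((t) => template.startsWith(partial + t));
}

let out = "";
while (out !== template) {
  // A real sampler would pick the highest-probability *allowed* token;
  // here we simply take the first legal one.
  out += allowedTokens(out)[0];
}
console.log(out); // → {"answer":42}
```

Because illegal tokens (like `hello`) are masked out before sampling, the output is guaranteed to parse as valid JSON, no post-hoc retries needed.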
About LLamaSharp
SciSharp/LLamaSharp
A C#/.NET library to run LLM (🦙LLaMA/LLaVA) on your local device efficiently.
Scores updated daily from GitHub, PyPI, and npm data.