autogluon-assistant vs. jorel

                     autogluon-assistant    jorel
Overall score        71 (Verified)          47 (Emerging)
Maintenance          10/25                  6/25
Adoption             14/25                  11/25
Maturity             25/25                  18/25
Community            22/25                  12/25
Stars                257                    5
Forks                50                     1
Downloads            37                     470
Commits (30d)        0                      0
Language             Python                 TypeScript
License              Apache-2.0             MIT
Risk flags           None                   None

About autogluon-assistant

autogluon/autogluon-assistant

Multi-Agent System Powered by LLMs for End-to-end Multimodal ML Automation

The system uses a node-based manager with Monte Carlo Tree Search (MCTS) to orchestrate multiple specialized agents that collaboratively handle data preprocessing, model selection, and hyperparameter tuning. It supports multiple LLM providers (AWS Bedrock, OpenAI, Anthropic, SageMaker) and exposes functionality through CLI, Python API, WebUI, and MCP interfaces, with a dedicated chat mode for non-code ML guidance.
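The MCTS orchestration described above can be illustrated with a small sketch of the UCB1 selection rule that MCTS implementations commonly use to pick which node to expand next. The node names and scores below are hypothetical stand-ins for pipeline decisions, not autogluon-assistant's actual internals:

```python
import math

def ucb1(total_reward: float, visits: int, parent_visits: int, c: float = 1.4) -> float:
    """UCB1 score: exploitation (mean reward) plus an exploration bonus."""
    if visits == 0:
        return float("inf")  # unvisited nodes are always tried first
    return total_reward / visits + c * math.sqrt(math.log(parent_visits) / visits)

# Hypothetical nodes: each represents a candidate pipeline choice
# (e.g. a preprocessing strategy) with accumulated reward and visit count.
nodes = {
    "impute-median": {"reward": 2.4, "visits": 4},
    "impute-knn":    {"reward": 1.1, "visits": 2},
    "drop-missing":  {"reward": 0.0, "visits": 0},
}
parent_visits = sum(n["visits"] for n in nodes.values())
best = max(nodes, key=lambda k: ucb1(nodes[k]["reward"], nodes[k]["visits"], parent_visits))
print(best)  # the unvisited node wins via its infinite UCB score
```

In a full MCTS loop, the selected node would be expanded (e.g. by running that pipeline step), the observed reward backpropagated, and selection repeated.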

About jorel

christianheine/jorel

LLMs made easy: Multiple providers, images, documents, tools, and agents in just a few lines of code.

JorEl is written in TypeScript with explicit model registration per provider. It streams responses with configurable buffering and automatic tool-call handling via `streamWithMeta()`, returning typed events (chunks, tool calls, reasoning) with message IDs and metadata. It targets Node.js/web developers who need a unified abstraction layer across OpenAI, Anthropic, Groq, Vertex AI, Ollama, and other providers while retaining access to provider-specific features when needed.
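The typed-event streaming with configurable buffering described above follows a common pattern: accumulate raw tokens into chunks, and flush immediately when the provider signals a tool call. Below is a minimal sketch of that pattern, in Python for illustration (JorEl itself is TypeScript); `stream_with_meta`, `Chunk`, and `ToolCall` are hypothetical names, not JorEl's actual API:

```python
from dataclasses import dataclass
from typing import Iterator, Union

@dataclass
class Chunk:
    message_id: str
    text: str

@dataclass
class ToolCall:
    message_id: str
    name: str
    arguments: dict

Event = Union[Chunk, ToolCall]

def stream_with_meta(raw_tokens, message_id: str = "msg-1",
                     buffer_size: int = 3) -> Iterator[Event]:
    """Buffer raw tokens into chunks; flush and emit a ToolCall event
    whenever the provider signals a tool call (modelled here as a dict)."""
    buf: list[str] = []
    for tok in raw_tokens:
        if isinstance(tok, dict):  # provider signalled a tool call
            if buf:
                yield Chunk(message_id, "".join(buf))
                buf = []
            yield ToolCall(message_id, tok["name"], tok.get("args", {}))
        else:
            buf.append(tok)
            if len(buf) >= buffer_size:  # configurable buffering
                yield Chunk(message_id, "".join(buf))
                buf = []
    if buf:  # flush any trailing text
        yield Chunk(message_id, "".join(buf))

events = list(stream_with_meta(["Hel", "lo ", "wor", {"name": "get_time", "args": {}}, "ld"]))
```

Consumers can then branch on the event type (chunk vs. tool call) while keeping the message ID and metadata attached to every event, which is the property the typed-event design buys you.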

Scores updated daily from GitHub, PyPI, and npm data.