autogluon-assistant and jorel
About autogluon-assistant
autogluon/autogluon-assistant
Multi-Agent System Powered by LLMs for End-to-end Multimodal ML Automation
The system uses a node-based manager with Monte Carlo Tree Search (MCTS) to orchestrate multiple specialized agents that collaboratively handle data preprocessing, model selection, and hyperparameter tuning. It supports multiple LLM providers (AWS Bedrock, OpenAI, Anthropic, SageMaker) and exposes functionality through CLI, Python API, WebUI, and MCP interfaces, with a dedicated chat mode for non-code ML guidance.
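The MCTS-driven orchestration described above can be sketched in miniature. The snippet below is an illustrative UCT (upper confidence bound for trees) selection loop, not autogluon-assistant's actual internals; all class and function names here are hypothetical, and the "reward" stands in for whatever validation score the agents report back.

```python
import math


class Node:
    """One candidate pipeline decision (e.g., a preprocessing or tuning choice)."""

    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.children = []
        self.visits = 0
        self.total_reward = 0.0

    def uct(self, c=1.4):
        # Unvisited nodes score infinity, so they are always tried first.
        if self.visits == 0:
            return float("inf")
        exploit = self.total_reward / self.visits
        explore = c * math.sqrt(math.log(self.parent.visits) / self.visits)
        return exploit + explore


def select(root):
    """Walk down the tree, always following the child with the highest UCT score."""
    node = root
    while node.children:
        node = max(node.children, key=Node.uct)
    return node


def backpropagate(node, reward):
    """Propagate a reward (e.g., a validation score) from a leaf back to the root."""
    while node is not None:
        node.visits += 1
        node.total_reward += reward
        node = node.parent
```

In a loop, the manager would select a leaf, have an agent expand and evaluate it (run the candidate pipeline step), then backpropagate the score, balancing exploitation of good branches against exploration of untried ones.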
About jorel
christianheine/jorel
LLMs made easy: Multiple providers, images, documents, tools, and agents in just a few lines of code.
Written in TypeScript with explicit per-provider model registration, JorEl streams responses with configurable buffering and automatic tool call handling via `streamWithMeta()`, returning typed events (chunks, tool calls, reasoning) with message IDs and metadata. It targets Node.js/web developers who need a unified abstraction layer across OpenAI, Anthropic, Groq, Vertex AI, Ollama, and other providers while keeping access to provider-specific features when needed.
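The typed-event streaming pattern behind `streamWithMeta()` can be sketched generically. This is a minimal Python illustration of the idea (tagged event types carrying message IDs, with text chunks buffered and tool calls handled inline), not JorEl's actual TypeScript API; the event classes and consumer function are assumptions made for the example.

```python
from dataclasses import dataclass, field


@dataclass
class Chunk:
    """A buffered piece of streamed model text."""
    message_id: str
    text: str


@dataclass
class ToolCall:
    """A request from the model to invoke a registered tool."""
    message_id: str
    tool: str
    arguments: dict = field(default_factory=dict)


def consume_stream(events):
    """Consume a typed event stream: append text chunks, mark tool calls inline.

    A real client would execute each tool and feed its result back to the model;
    here we only record that the call happened.
    """
    buffer = []
    for event in events:
        if isinstance(event, Chunk):
            buffer.append(event.text)
        elif isinstance(event, ToolCall):
            buffer.append(f"[tool:{event.tool}]")
    return "".join(buffer)
```

Because every event is a distinct type with shared metadata (the message ID), a consumer can route chunks to the UI, tool calls to an executor, and reasoning events to a trace log from a single loop.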
Scores updated daily from GitHub, PyPI, and npm data.