mlx-vlm and Local_LLM_Training_Apple_Silicon

mlx-vlm — overall score 81 (Verified)

                  mlx-vlm     Local_LLM_Training_Apple_Silicon
Maintenance       20/25       0/25
Adoption          15/25       7/25
Maturity          25/25       8/25
Community         21/25       15/25
Stars             2,287       26
Forks             293         5
Downloads         —           —
Commits (30d)     41          0
Language          Python      Python
License           MIT         —
Risk flags        None        No License · Stale 6m · No Package · No Dependents

About mlx-vlm

Blaizzy/mlx-vlm

MLX-VLM is a package for inference and fine-tuning of Vision Language Models (VLMs) on your Mac using MLX.

This project helps you understand images, audio, and video content by describing or answering questions about them. You provide a visual, audio, or multi-modal input and a question or prompt, and the tool generates a textual response. It's designed for anyone working with multimedia content on a Mac who needs to extract information or generate descriptions.
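A minimal invocation sketch of the workflow described above, assuming mlx-vlm is installed on an Apple Silicon Mac; the model name is an example from the mlx-community Hugging Face organization, and the image path is a placeholder:

```shell
# Install the package (Apple Silicon only; MLX has no CUDA backend)
pip install mlx-vlm

# Ask a question about a local image; the model is downloaded on first use
python -m mlx_vlm.generate \
  --model mlx-community/Qwen2-VL-2B-Instruct-4bit \
  --image path/to/photo.jpg \
  --prompt "Describe this image."
```

The same load-and-generate flow is also exposed as a Python API for use inside scripts.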

multimedia-analysis content-understanding image-description audio-analysis document-processing

About Local_LLM_Training_Apple_Silicon

GusLovesMath/Local_LLM_Training_Apple_Silicon

Builds a local LLM training pipeline on Apple Silicon with MLX and the Metal API, working around the absence of CUDA support. The Llama 3 model is fine-tuned on the chip's 16 GPU cores to solve verbose math word problems. The result is a capable, privacy-preserving chatbot that runs smoothly on-device.

This project offers a specialized chatbot designed to solve complex math word problems. You provide a detailed math problem in plain English, and the chatbot delivers a clear, concise solution. It's ideal for students, educators, or anyone needing quick, private assistance with verbose mathematical reasoning, running directly on your Apple device.
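The fine-tuning step described above can be sketched with the generic MLX LoRA workflow from the mlx-lm package; this illustrates the ecosystem's standard approach, not necessarily this repository's exact scripts, and the model name and data directory are example placeholders:

```shell
# Illustrative MLX LoRA fine-tune (Apple Silicon; gated models need HF access)
pip install mlx-lm

# Expects train.jsonl / valid.jsonl of math word problems in ./data
python -m mlx_lm.lora \
  --model meta-llama/Meta-Llama-3-8B-Instruct \
  --train \
  --data ./data \
  --iters 600
```

After training, the resulting adapter weights can be loaded alongside the base model for on-device inference.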

mathematics-education problem-solving homework-assistance personal-tutoring quantitative-analysis

Scores are updated daily from GitHub, PyPI, and npm data.