fguzman82/CLIP-Finder2
CLIP-Finder enables offline semantic search of gallery photos using natural-language descriptions or the camera. Built on Apple's MobileCLIP-S0 model, it provides fast, accurate on-device media retrieval.
No commits in the last 6 months.
Stars: 90
Forks: 11
Language: Swift
License: MIT
Category:
Last pushed: Jul 25, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/embeddings/fguzman82/CLIP-Finder2"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.
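The same endpoint can be called from a script. A minimal Python sketch is below; only the URL is taken from the curl example above, while the `endpoint_url` and `fetch_stats` helper names are illustrative and the response is assumed to be JSON (its schema is not documented here).

```python
import json
from urllib.request import urlopen

# Base path taken verbatim from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/embeddings"


def endpoint_url(owner: str, repo: str) -> str:
    # Build the per-repository endpoint: BASE/<owner>/<repo>.
    return f"{BASE}/{owner}/{repo}"


def fetch_stats(owner: str, repo: str) -> dict:
    # Fetch and decode the payload; assumes the API returns JSON.
    with urlopen(endpoint_url(owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Hypothetical usage for the repository on this page.
    print(json.dumps(fetch_stats("fguzman82", "CLIP-Finder2"), indent=2))
```

The network call is kept behind the `__main__` guard so the URL-building helper can be reused or tested without hitting the service.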
Higher-rated alternatives
unum-cloud/UForm
Pocket-Sized Multimodal AI for content understanding and generation across multilingual texts,...
rom1504/clip-retrieval
Easily compute clip embeddings and build a clip retrieval system with them
mazzzystar/Queryable
Run OpenAI's CLIP and Apple's MobileCLIP model on iOS to search photos.
Ubaida-M-Yusuf/Makimus-AI
AI-powered media search — find images and videos using natural language or visual queries
s-emanuilov/litepali
LitePali is a minimal, efficient implementation of ColPali for image retrieval and indexing,...