david-s-martinez/Dex-GAN-Grasp
DexGANGrasp: Dexterous Generative Adversarial Grasping Synthesis for Task-Oriented Manipulation - IEEE-RAS International Conference on Humanoid Robots (Humanoids) 2024 | DOI: 10.1109/Humanoids58906.2024.10769950
This project lets roboticists and automation engineers program robots to pick up and manipulate objects using advanced multi-fingered robotic hands. Given a single view of an object, the system generates stable grasp configurations in real time, helping robots reliably perform complex tasks such as assembling products or handling delicate items.
No commits in the last 6 months.
Use this if you need a robotic arm with a dexterous hand to reliably grasp and manipulate a variety of objects in real-world or simulated environments, especially for task-oriented operations.
Not ideal if your robot only requires simple two-finger gripper grasps or if you are not working with advanced multi-fingered robotic hands.
Stars
11
Forks
—
Language
Python
License
—
Category
—
Last pushed
Oct 15, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/david-s-martinez/Dex-GAN-Grasp"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
jingyaogong/minimind-v
🚀 Train a 26M-parameter visual multimodal VLM (large model) from scratch in just 1 hour!
roboflow/vision-ai-checkup
Take your LLM to the optometrist.
SkyworkAI/Skywork-R1V
Skywork-R1V is an advanced multimodal AI model series developed by Skywork AI, specializing in...
zai-org/GLM-TTS
GLM-TTS: Controllable & Emotion-Expressive Zero-shot TTS with Multi-Reward Reinforcement Learning
NExT-GPT/NExT-GPT
Code and models for ICML 2024 paper, NExT-GPT: Any-to-Any Multimodal Large Language Model