Unity-Technologies/com.unity.perception
Perception toolkit for sim2real training and validation in Unity
Generates synthetic datasets with ground-truth annotations (bounding boxes, segmentation masks, keypoints) through a modular system of Perception Cameras, Labelers, and Domain Randomization tools. Supports multiple output formats (SOLO, COCO) and integrates with analysis frameworks like pysolotools and Voxel51 for dataset validation. Built as a Unity package targeting versions 2021.3+, enabling GPU-accelerated rendering for scalable sim2real data generation without manual labeling.
990 stars. No commits in the last 6 months.
Stars: 990
Forks: 186
Language: C#
License: —
Last pushed: Nov 08, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/Unity-Technologies/com.unity.perception"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.
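The same request can be made from Python. This is a minimal sketch: the endpoint path is taken from the curl example above, and since the JSON shape of the response is not documented here, the actual fetch is left as a commented step.

```python
import json
import urllib.request
from urllib.parse import quote

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    # Build the endpoint URL shown in the curl example above,
    # percent-encoding each path segment to be safe.
    return f"{BASE}/{quote(category)}/{quote(owner)}/{quote(repo)}"

url = quality_url("computer-vision", "Unity-Technologies", "com.unity.perception")

# To actually fetch the data (requires network access):
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
```

The `quality_url` helper is hypothetical, not part of any published client; it only mirrors the path structure visible in the curl command.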
Related tools
stereolabs/zed-unity
ZED SDK Unity plugin
evo-biomech/replicAnt
replicAnt - generating annotated images of animals in complex environments with Unreal Engine
CMU-Perceptual-Computing-Lab/openpose_unity_plugin
OpenPose's Unity Plugin for Unity users
wtct-hungary/UnityVision-iOS
This native plugin enables Unity to take advantage of specific features of Core-ML and Vision...
Unity-Technologies/SynthDet
SynthDet - An end-to-end object detection pipeline using synthetic data