facebookresearch/theseus
A library for differentiable nonlinear optimization
Embeds differentiable second-order optimizers (Gauss-Newton, Levenberg-Marquardt, Trust Region) and sparse linear solvers directly into PyTorch, enabling end-to-end gradient flow through optimization layers. Supports batched GPU computation with multiple backward modes (implicit, truncated, DLM) and includes Lie group representations and robot kinematics for robotics and vision applications. Designed for hybrid architectures that combine neural networks with domain-specific differentiable models as inductive priors.
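The description above mentions Gauss-Newton among the embedded optimizers. As a plain-Python illustration (not Theseus's actual API), here is the scalar Gauss-Newton update `x ← x − (Jᵀr)/(JᵀJ)` that such a layer iterates and differentiates through; the function names and the toy residual are invented for this sketch:

```python
def gauss_newton(residual, jacobian, x0, iters=20, tol=1e-10):
    """Scalar Gauss-Newton on a least-squares residual r(x):
    repeatedly apply x <- x - (J*r)/(J*J) until the step is tiny."""
    x = x0
    for _ in range(iters):
        r, J = residual(x), jacobian(x)
        step = (J * r) / (J * J)  # normal-equations solve, 1-D case
        x -= step
        if abs(step) < tol:
            break
    return x

# Toy problem: drive r(x) = x^2 - 2 to zero, i.e. recover sqrt(2)
x_star = gauss_newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0)
print(round(x_star, 6))  # 1.414214
```

Theseus wraps this kind of inner loop in a `TheseusLayer` so gradients of the converged solution flow back to upstream network parameters; see the repository's examples for the real batched, GPU-enabled API.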
2,008 stars. No commits in the last 6 months.
Stars: 2,008
Forks: 143
Language: Python
License: MIT
Category:
Last pushed: Jan 16, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/facebookresearch/theseus"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000/day.
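The same endpoint can be called from Python with the standard library. A minimal sketch, assuming only the URL shown in the curl example; the `X-API-Key` header name is hypothetical, so check the API's documentation for how a key is actually supplied:

```python
from urllib.request import Request, urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_request(category, owner, repo, api_key=None):
    """Build a GET request for a repo's quality data.

    URL shape inferred from the curl example above; the api_key
    header name is an assumption, not documented behavior."""
    headers = {}
    if api_key:
        headers["X-API-Key"] = api_key  # hypothetical header name
    return Request(f"{BASE}/{category}/{owner}/{repo}", headers=headers)

req = quality_request("ml-frameworks", "facebookresearch", "theseus")
print(req.full_url)
# To fetch: body = urlopen(req).read()  (response schema not shown here)
```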
Higher-rated alternatives
metaopt/torchopt
TorchOpt is an efficient library for differentiable optimization built upon PyTorch.
opthub-org/pytorch-bsf
PyTorch implementation of Bezier simplex fitting
SimplexLab/TorchJD
Library for Jacobian descent with PyTorch. It enables the optimization of neural networks with...
pytorch/xla
Enabling PyTorch on XLA Devices (e.g. Google TPU)
clovaai/AdamP
AdamP: Slowing Down the Slowdown for Momentum Optimizers on Scale-invariant Weights (ICLR 2021)