UKPLab/starsem2023-arithmetic-based-pretraining
Code and data for the StarSem 2023 paper "Arithmetic-Based Pretraining -- Improving Numeracy of Pretrained Language Models"
No commits in the last 6 months.
Stars: 1
Forks: —
Language: Julia
License: Apache-2.0
Category: —
Last pushed: Jul 23, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/UKPLab/starsem2023-arithmetic-based-pretraining"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
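The same request can be made programmatically. Below is a minimal Python sketch using only the standard library; it assumes the endpoint returns JSON (the response schema is not documented in this listing), so the payload is printed as-is rather than accessed by field name.

import json
import urllib.request

# Same request as the curl command above. The URL is copied verbatim
# from the listing; no API key is needed for up to 100 requests/day.
URL = ("https://pt-edge.onrender.com/api/v1/quality/transformers/"
       "UKPLab/starsem2023-arithmetic-based-pretraining")

with urllib.request.urlopen(URL, timeout=10) as resp:
    payload = json.load(resp)  # assumption: the endpoint returns JSON

print(json.dumps(payload, indent=2))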
Higher-rated alternatives
UKPLab/gpl
Powerful unsupervised domain adaptation method for dense retrieval. Requires only unlabeled...
galilai-group/stable-pretraining
Reliable, minimal and scalable library for pretraining foundation and world models
CognitiveAISystems/MAPF-GPT
[AAAI-2025] This repository contains MAPF-GPT, a deep learning-based model for solving MAPF...
larslorch/avici
Amortized Inference for Causal Structure Learning, NeurIPS 2022
svdrecbd/mhc-mlx
MLX + Metal implementation of mHC: Manifold-Constrained Hyper-Connections by DeepSeek-AI.