CAMeL-Lab/CAMeLBERT
Code and models for "The Interplay of Variant, Size, and Task Type in Arabic Pre-trained Language Models". EACL 2021, WANLP.
No commits in the last 6 months.
Stars: 55
Forks: 13
Language: Python
License: MIT
Last pushed: Jun 21, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/CAMeL-Lab/CAMeLBERT"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
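The curl command above can also be scripted. Below is a minimal Python sketch that builds the endpoint URL and fetches the metadata, assuming the endpoint returns JSON; the response schema and the `Authorization: Bearer` header name for keyed access are assumptions, not documented behavior — check the API docs before relying on them.

```python
import json
import urllib.request

# Base endpoint taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"


def quality_url(owner: str, repo: str) -> str:
    """Build the quality-API URL for a given GitHub repository."""
    return f"{API_BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str, api_key: str = None) -> dict:
    """Fetch repo quality metadata as a dict (schema is an assumption)."""
    req = urllib.request.Request(quality_url(owner, repo))
    if api_key:
        # Hypothetical header for the 1,000/day keyed tier; verify with the API docs.
        req.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)
```

For example, `fetch_quality("CAMeL-Lab", "CAMeLBERT")` requests the same URL as the curl command shown above.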
Higher-rated alternatives
deepset-ai/FARM
Fast & easy transfer learning for NLP. Harvesting language models for the...
extreme-bert/extreme-bert
ExtremeBERT is a toolkit that accelerates the pretraining of customized language models on...
UBC-NLP/marbert
UBC ARBERT and MARBERT Deep Bidirectional Transformers for Arabic
Grenzlinie/MgBERT_LLM_Classification_for_Materials_Science
Source code and result for Paper 'A Prompt-Engineered Large Language Model, Deep Learning...