MoleculeTransformers/moleculenet-bert-ssl
Semi-supervised learning techniques (pseudo-labeling, MixMatch, and co-training) for a pre-trained BERT language model in a low-data regime, using molecular SMILES from the MoleculeNet benchmark.
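Of the three techniques the description names, pseudo-labeling is the simplest: train on the labeled data, predict on the unlabeled data, and fold high-confidence predictions back into the training set. A minimal sketch, with a toy nearest-centroid classifier standing in for BERT so the example stays self-contained (all names, data, and the 0.9 threshold are illustrative, not taken from this repo):

```python
import math

def centroid(points):
    """Mean of a list of equal-length feature vectors."""
    n = len(points)
    return [sum(xs) / n for xs in zip(*points)]

def predict(centroids, x):
    """Return (label, confidence) via a softmax over negative distances."""
    weights = {lab: math.exp(-math.dist(c, x)) for lab, c in centroids.items()}
    total = sum(weights.values())
    label = max(weights, key=weights.get)
    return label, weights[label] / total

def pseudo_label(labeled, unlabeled, threshold=0.9, rounds=3):
    """Iteratively move confident unlabeled points into the labeled set."""
    labeled = {lab: list(pts) for lab, pts in labeled.items()}
    pool = list(unlabeled)
    for _ in range(rounds):
        centroids = {lab: centroid(pts) for lab, pts in labeled.items()}
        kept = []
        for x in pool:
            lab, conf = predict(centroids, x)
            if conf >= threshold:   # accept only high-confidence predictions
                labeled[lab].append(x)
            else:
                kept.append(x)      # ambiguous points stay unlabeled
        pool = kept
    return labeled, pool
```

In the real repo the classifier would be the fine-tuned BERT model and the feature vectors would come from SMILES strings; the loop structure is the same.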
No commits in the last 6 months.
Stars: 2
Forks: —
Language: Python
License: Apache-2.0
Category:
Last pushed: Mar 17, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/MoleculeTransformers/moleculenet-bert-ssl"
Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000/day.
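The same endpoint can be called from Python. A small sketch that builds the per-repository URL shown above; the helper name and the use of `urllib` are my own, only the base URL comes from this page:

```python
from urllib.parse import quote

# Base path taken from the curl example on this page.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a given owner/repo pair."""
    return f"{API_BASE}/{quote(owner)}/{quote(repo)}"

# To actually fetch (network call, so left commented out here):
# import json, urllib.request
# data = json.load(urllib.request.urlopen(
#     quality_url("MoleculeTransformers", "moleculenet-bert-ssl")))
```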
Higher-rated alternatives
rxn4chemistry/rxn-onmt-models: Training of OpenNMT-based RXN models
lamalab-org/MatText: Text-based modeling of materials.
CTCycle/ADSMOD-Adsorption-Modeling: Streamline adsorption modeling by automatically fitting theoretical adsorption models to...
HUBioDataLab/SELFormer: SELFormer: Molecular Representation Learning via SELFIES Language Models
VectorInstitute/atomgen: Library for handling atomistic graph datasets focusing on transformer-based implementations,...