KoELECTRA and KoCharELECTRA
These are complementary approaches to the same task: KoELECTRA operates on subword tokens while KoCharELECTRA operates on character (syllable) units, allowing practitioners to choose the tokenization granularity that best fits their Korean NLP application.
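To make the granularity difference concrete, here is a minimal, self-contained sketch. The character-level tokenizer mirrors KoCharELECTRA's syllable-unit approach; the subword tokenizer is a toy greedy longest-match (WordPiece-style) scheme like the one KoELECTRA uses. The mini-vocabulary is hypothetical, purely for illustration — the real KoELECTRA vocabulary contains tens of thousands of subwords.

```python
def char_tokenize(text: str) -> list[str]:
    """Character (syllable) units, as in KoCharELECTRA: one token per Hangul syllable."""
    return [ch for ch in text if not ch.isspace()]

def subword_tokenize(text: str, vocab: set[str]) -> list[str]:
    """Greedy longest-match WordPiece-style tokenization (toy version of KoELECTRA's scheme)."""
    tokens = []
    for word in text.split():
        start = 0
        while start < len(word):
            end = len(word)
            piece = None
            while end > start:
                # Non-initial pieces carry the WordPiece "##" continuation prefix.
                cand = word[start:end] if start == 0 else "##" + word[start:end]
                if cand in vocab:
                    piece = cand
                    break
                end -= 1
            if piece is None:  # character not covered by the vocab
                piece = "[UNK]"
                end = start + 1
            tokens.append(piece)
            start = end
    return tokens

# Hypothetical mini-vocabulary for demonstration only.
vocab = {"한국", "##어", "처리", "자연"}
text = "한국어 처리"
print(char_tokenize(text))            # ['한', '국', '어', '처', '리']
print(subword_tokenize(text, vocab))  # ['한국', '##어', '처리']
```

The character tokenizer never produces out-of-vocabulary tokens for Korean text (the syllable inventory is closed), while the subword tokenizer trades a larger vocabulary for shorter sequences — the practical axis along which the two models differ.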
Scores (out of 25 each)

                 KoELECTRA    KoCharELECTRA
Maintenance      0/25         0/25
Adoption         10/25        8/25
Maturity         16/25        16/25
Community        25/25        16/25

Repository stats

                 KoELECTRA    KoCharELECTRA
Stars            630          54
Forks            136          10
Downloads        —            —
Commits (30d)    0            0
Language         Python       Python
License          Apache-2.0   Apache-2.0

Flags (both repositories): stale for 6+ months, no published package, no known dependents.
About KoELECTRA
monologg/KoELECTRA
Pretrained ELECTRA Model for Korean
About KoCharELECTRA
monologg/KoCharELECTRA
Character-level Korean ELECTRA Model (syllable-unit Korean ELECTRA)
Scores updated daily from GitHub, PyPI, and npm data.