akoksal/Turkish-Word2Vec
Pre-trained Word2Vec Model for Turkish
Trained on a Turkish Wikipedia corpus using gensim, the model handles agglutinative morphology by learning vector representations across inflected word forms and suffixes. The repository includes downloadable pre-trained embeddings and demonstrates semantic operations such as analogy tasks (king − man + woman ≈ queen) adapted to Turkish grammatical structures. A lemmatization module is planned to improve quality by consolidating morphological variants.
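The analogy operation described above works by vector arithmetic followed by a nearest-neighbor search. A minimal self-contained sketch with toy 3-d vectors is below; the Turkish words and their vector values are hypothetical stand-ins for illustration, not the repository's real embeddings:

```python
from math import sqrt

# Hand-made toy vectors (NOT the pre-trained Turkish embeddings).
vectors = {
    "kral":    [0.9, 0.8, 0.1],  # "king"   (hypothetical values)
    "erkek":   [0.9, 0.1, 0.1],  # "man"
    "kadin":   [0.1, 0.1, 0.9],  # "woman"
    "kralice": [0.1, 0.8, 0.9],  # "queen"
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

def analogy(a, b, c):
    """Return the vocabulary word closest to vec(a) - vec(b) + vec(c),
    excluding the three query words (the standard word2vec convention)."""
    target = [x - y + z for x, y, z in zip(vectors[a], vectors[b], vectors[c])]
    candidates = (w for w in vectors if w not in (a, b, c))
    return max(candidates, key=lambda w: cosine(target, vectors[w]))

print(analogy("kral", "erkek", "kadin"))  # → kralice
```

With real embeddings the same query is typically issued through gensim's `KeyedVectors.most_similar(positive=[...], negative=[...])`, which performs this offset-and-rank computation over the full vocabulary.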
222 stars. No commits in the last 6 months.
Stars: 222
Forks: 31
Language: Python
License: MIT
Category:
Last pushed: Jul 18, 2018
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/akoksal/Turkish-Word2Vec"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000/day.
Higher-rated alternatives
Planeshifter/node-word2vec
Node.js interface to the Google word2vec tool.
nathanrooy/word2vec-from-scratch-with-python
A very simple, bare-bones, inefficient, implementation of skip-gram word2vec from scratch with Python
thunlp/paragraph2vec
Paragraph Vector Implementation
RichDavis1/PHPW2V
A PHP implementation of Word2Vec, a popular word embedding algorithm created by Tomas Mikolov...
tmteam/Word2vec.Tools
.Net Implementation for google word2vec tools.