dccuchile/spanish-word-embeddings
Spanish word embeddings computed with different methods and from different corpora
This project offers pre-computed Spanish word embeddings: numerical vector representations of words that capture their meanings and relationships. The embeddings were trained on large Spanish text collections and are distributed as ready-to-use vector files, one vector per word. This is useful for computational linguists, natural language processing researchers, and data scientists working with Spanish text.
364 stars. No commits in the last 6 months.
Use this if you need high-quality, pre-trained numerical representations of Spanish words for tasks like text classification, sentiment analysis, or machine translation.
Not ideal if you require word embeddings from a very specific or highly specialized Spanish corpus not covered by general-purpose sources like Wikipedia or large web crawls.
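As a sketch of how pre-trained embeddings like these are typically used, the snippet below computes cosine similarity between word vectors with NumPy. The vocabulary and 4-dimensional vectors are toy stand-ins invented for illustration, not values from the repo's actual files (which are much higher-dimensional):

```python
import numpy as np

# Toy "embeddings" standing in for real pre-trained Spanish vectors;
# the words are real, the vectors are made up for this example.
embeddings = {
    "perro": np.array([0.9, 0.1, 0.3, 0.0]),
    "gato":  np.array([0.8, 0.2, 0.4, 0.1]),
    "mesa":  np.array([0.1, 0.9, 0.0, 0.7]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related words ("perro"/"gato") should score higher
# than unrelated ones ("perro"/"mesa") with real embeddings.
sim_animals = cosine_similarity(embeddings["perro"], embeddings["gato"])
sim_mixed = cosine_similarity(embeddings["perro"], embeddings["mesa"])
print(sim_animals > sim_mixed)  # True for these toy vectors
```

In practice, vector files distributed in word2vec text format can usually be loaded with gensim's `KeyedVectors.load_word2vec_format`, which exposes similarity and nearest-neighbor queries directly.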
Stars: 364
Forks: 83
Language: —
License: —
Category: —
Last pushed: Oct 09, 2019
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/embeddings/dccuchile/spanish-word-embeddings"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
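A minimal sketch of consuming this endpoint from Python using only the standard library. The response is assumed to be JSON, and the field names (`stars`, `forks`, `last_pushed`) are illustrative guesses, not confirmed from any API schema:

```python
import json
from urllib.request import urlopen

API_URL = ("https://pt-edge.onrender.com/api/v1/quality/"
           "embeddings/dccuchile/spanish-word-embeddings")

def fetch_repo_stats(url: str = API_URL) -> dict:
    """Fetch and decode the JSON payload for one repository."""
    with urlopen(url) as resp:
        return json.load(resp)

# Parsing a sample payload shaped like the stats shown on this page
# (field names are assumptions, not taken from the API's documentation):
sample = json.loads('{"stars": 364, "forks": 83, "last_pushed": "2019-10-09"}')
print(sample["stars"])  # 364
```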
Higher-rated alternatives
vzhong/embeddings
Fast, DB Backed pretrained word embeddings for natural language processing.
ncbi-nlp/BioSentVec
BioWordVec & BioSentVec: pre-trained embeddings for biomedical words and sentences
CyberZHG/keras-pos-embd
Position embedding layers in Keras
PrashantRanjan09/WordEmbeddings-Elmo-Fasttext-Word2Vec
Using pre-trained word embeddings (fastText, Word2Vec)
mb-14/embeddings.js
Word embeddings for the web