RisticDjordje/personalized-autocomplete-next-word-prediction-task
A personalized autocomplete (next-word prediction) project implementing three architectures from scratch: stacked LSTMs, Seq2Seq with attention, and GPT-2.
This project helps developers explore advanced next-word prediction models for building personalized autocomplete features. It takes text data as input and produces a trained model that suggests the most likely next word in a sequence. This is useful for engineers building predictive text interfaces or smart content creation tools.
No commits in the last 6 months.
Use this if you are a machine learning engineer or researcher interested in implementing and comparing different neural network architectures (LSTMs, Seq2Seq, GPT-2) for next-word prediction from scratch.
Not ideal if you need a ready-to-use, production-ready autocomplete system without deep involvement in model architecture and training.
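To illustrate the next-word prediction task the repo tackles (independent of its LSTM, Seq2Seq, and GPT-2 implementations), here is a minimal bigram-frequency sketch in Python. The corpus, function names, and counting approach are illustrative stand-ins, not the repo's code; the repo trains neural models rather than counting n-grams.

```python
from collections import Counter, defaultdict

def train_bigram(corpus: str) -> dict:
    # Count how often each word follows each other word.
    counts = defaultdict(Counter)
    tokens = corpus.lower().split()
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_next(counts: dict, word: str, k: int = 3) -> list:
    # Return the k most frequent successors of `word`.
    return [w for w, _ in counts[word.lower()].most_common(k)]

corpus = "the cat sat on the mat and the cat slept on the couch"
model = train_bigram(corpus)
print(predict_next(model, "the"))  # 'cat' follows 'the' most often
```

A neural model like the ones in this repo replaces the frequency table with learned parameters, letting it generalize to word sequences never seen in training.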
Stars: 10
Forks: —
Language: Jupyter Notebook
License: —
Category: —
Last pushed: Oct 03, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/RisticDjordje/personalized-autocomplete-next-word-prediction-task"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.
Higher-rated alternatives
Nixtla/nixtla
TimeGPT-1: production ready pre-trained Time Series Foundation Model for forecasting and...
andrewdalpino/NoPE-GPT
A GPT-style small language model (SLM) with no positional embeddings (NoPE).
sigdelsanjog/gptmed
pip install gptmed
akanyaani/gpt-2-tensorflow2.0
OpenAI GPT2 pre-training and sequence prediction implementation in Tensorflow 2.0
samkamau81/FinGPT_
FinGPT is an AI language model designed to understand and generate financial content. Built upon...