sahilfaizal01/Evaluating-the-Performance-of-Episodic-Transformer-Memory-PPO-vs.-Traditional-Transformer-PPO
Designed and implemented an Episodic Transformer Memory (ETM) framework on a Transformer-XL backbone to enhance long-term memory retention in reinforcement learning (RL) agents operating in partially observable environments.
No commits in the last 6 months.
Stars: 1
Forks: —
Language: Python
License: Apache-2.0
Category:
Last pushed: Jan 06, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/sahilfaizal01/Evaluating-the-Performance-of-Episodic-Transformer-Memory-PPO-vs.-Traditional-Transformer-PPO"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
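The same endpoint can be called programmatically. Below is a minimal Python sketch using only the standard library; the URL pattern comes from the curl command above, but the JSON response schema is an assumption and should be checked against an actual response:

```python
import json
import urllib.request

# Base path taken from the curl example on this page.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    # Build the per-repository quality endpoint URL.
    return f"{API_BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    # Fetch and decode the JSON payload.
    # Unauthenticated access is rate-limited to 100 requests/day.
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

url = quality_url(
    "sahilfaizal01",
    "Evaluating-the-Performance-of-Episodic-Transformer-Memory-PPO"
    "-vs.-Traditional-Transformer-PPO",
)
```

Calling `fetch_quality(...)` performs the network request; `quality_url(...)` alone just constructs the URL, which is useful for testing without hitting the daily quota.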
Higher-rated alternatives
transformerlab/transformerlab-app
The open source research environment for AI researchers to seamlessly train, evaluate, and scale...
naru-project/naru
Neural Relation Understanding: neural cardinality estimators for tabular data
danielzuegner/code-transformer
Implementation of the paper "Language-agnostic representation learning of source code from...
neurocard/neurocard
State-of-the-art neural cardinality estimators for join queries
salesforce/CodeTF
CodeTF: One-stop Transformer Library for State-of-the-art Code LLM