NohTow/PPL-MCTS
Repository for the code of the "PPL-MCTS: Constrained Textual Generation Through Discriminator-Guided Decoding" paper, NAACL'22
Implements Monte Carlo Tree Search decoding that combines any Hugging Face language model with a discriminator to enforce constraints during generation, using configurable exploration parameters (c_puct, temperature, repetition penalty) and rollout-based value estimation. The approach supports plug-and-play discriminators, from custom PyTorch classifiers to vanilla transformers, and includes optimizations for unidirectional attention models that reuse cached hidden states for significant speedups.
No commits in the last 6 months.
Stars
66
Forks
9
Language
Python
License
—
Last pushed
Oct 25, 2022
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/NohTow/PPL-MCTS"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
worldbank/REaLTabFormer
A suite of auto-regressive and Seq2Seq (sequence-to-sequence) transformer models for tabular and...
MagedSaeed/generate-sequences
A python package made to generate sequences (greedy and beam-search) from Pytorch (not...
tlkh/t2t-tuner
Convenient Text-to-Text Training for Transformers
styfeng/TinyDialogues
Code & data for the EMNLP 2024 paper: Is Child-Directed Speech Effective Training Data for...
saltudelft/codefill
Contains the code and data for our #ICSE2022 paper titled as "CodeFill: Multi-token Code...