MambaTransformer and HSSS
These are ecosystem siblings from the same author: MambaTransformer takes a hybrid architectural approach that combines two sequence-modeling paradigms (state-space models and Transformer attention), while HSSS provides a specialized hierarchical variant of the pure state-space approach that MambaTransformer partially incorporates.
Scores (out of 25):

              MambaTransformer  HSSS
Maintenance   10/25             0/25
Adoption      10/25             9/25
Maturity      25/25             18/25
Community     12/25             10/25
Repository stats:

               MambaTransformer  HSSS
Stars          215               15
Forks          16                2
Downloads      —                 28
Commits (30d)  0                 0
Language       Python            Python
License        MIT               MIT
Risk flags: MambaTransformer: none. HSSS: stale for 6 months.
About MambaTransformer
kyegomez/MambaTransformer
Integrating Mamba/SSMs with Transformer for Enhanced Long Context and High-Quality Sequence Modeling
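To make the hybrid idea concrete, here is a minimal toy sketch of the pattern the repo's name and description imply: an SSM-style linear recurrence sublayer followed by a self-attention sublayer, each with a residual connection. All function names, dimensions, and constants below are illustrative assumptions, not the repo's actual API, and real implementations operate on vector channels rather than scalars.

```python
import math

def ssm_scan(xs, a=0.9, b=0.5):
    """Toy linear state-space recurrence: h_t = a*h_{t-1} + b*x_t."""
    h, out = 0.0, []
    for x in xs:
        h = a * h + b * x
        out.append(h)
    return out

def attention(xs):
    """Single-head self-attention over scalar tokens: softmax(q*k) weighted sum."""
    out = []
    for q in xs:
        scores = [q * k for k in xs]
        m = max(scores)                          # subtract max for numerical stability
        ws = [math.exp(s - m) for s in scores]
        z = sum(ws)
        out.append(sum(w * v for w, v in zip(ws, xs)) / z)
    return out

def hybrid_block(xs):
    """One hybrid layer: SSM recurrence for long-range context, then attention,
    each sublayer wrapped in a residual connection (hypothetical structure)."""
    h = [x + s for x, s in zip(xs, ssm_scan(xs))]    # SSM sublayer + residual
    return [v + a for v, a in zip(h, attention(h))]  # attention sublayer + residual

print(hybrid_block([0.1, 0.5, -0.2, 0.3]))
```

The design intuition behind such hybrids is that the recurrence scales linearly with sequence length while attention adds content-based token mixing where it matters.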
About HSSS
kyegomez/HSSS
Implementation of a Hierarchical Mamba as described in the paper: "Hierarchical State Space Models for Continuous Sequence-to-Sequence Modeling"
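A hierarchical state-space model can be sketched as two stacked recurrences: a low-level SSM runs within fixed-size chunks, and the final state of each chunk feeds a high-level SSM that summarizes the sequence at a coarser timescale. The code below is a minimal scalar sketch under that assumption; it is not the paper's or the repo's actual implementation, and all names and constants are hypothetical.

```python
def ssm_scan(xs, a=0.8, b=0.6):
    """Toy linear state-space recurrence: h_t = a*h_{t-1} + b*x_t."""
    h, out = 0.0, []
    for x in xs:
        h = a * h + b * x
        out.append(h)
    return out

def hierarchical_ssm(xs, chunk=4):
    """Two-level SSM: a low-level scan inside each chunk, then a high-level
    scan over per-chunk summaries, broadcast back to every position."""
    chunks = [xs[i:i + chunk] for i in range(0, len(xs), chunk)]
    low = [ssm_scan(c) for c in chunks]
    summaries = [c[-1] for c in low]   # last state summarizes each chunk
    high = ssm_scan(summaries)         # high-level SSM over chunk summaries
    out = []
    for ctx, c in zip(high, low):
        out.extend(v + ctx for v in c)  # add coarse context to each position
    return out

print(hierarchical_ssm([0.2, -0.1, 0.4, 0.0, 0.3, 0.1, -0.2, 0.5, 0.2, 0.1]))
```

Output length equals input length, so the module is a drop-in sequence-to-sequence layer; the hierarchy lets distant chunks communicate through the short high-level scan instead of one long recurrence.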
Scores updated daily from GitHub, PyPI, and npm data.