feifeibear/long-context-attention
Fine-tuning Tools
USP: Unified (a.k.a. Hybrid, 2D) Sequence Parallel Attention for Long Context Transformers Model Training and Inference
Documentation: no dedicated docs site. Description length: 119 chars. Stars signal: 664. Contributors: 23. Score: 4.7/10
Community: 664 stars. Contributors: 23. Watchers: 5. Forks: 79. Open-issue ratio: 1.8%. Score: 4.9/10
Maintenance: last commit 91d ago. Weekly commits: 0. Latest release: 0.6.4. Maturity bonus: 2.1y old. Score: 6.1/10
API stability: stars/issues ratio 55. Dynamic language: Python. No dedicated API docs. Permissive license: Apache-2.0. Popularity signal: 664 stars. Score: 6.7/10
Trust: battle-tested at 664 stars. Peer review: 23 contributors. Versioned: 0.6.4. Licensed: Apache-2.0. Age: 2.1 years. Maintenance: last commit 91d ago. Score: 5.7/10
Integration: fork interest 79. Major ecosystem: Python. Integration-friendly license: Apache-2.0. Adoption: 664 stars. Score: 6.5/10