Simple and efficient training framework for long-context models
Updated Jan 12, 2026 - Python
A PyTorch framework for object detection covering six architectures (RetinaNet, Faster R-CNN, SSD, FCOS, and others). It handles the optimizer setup and the training quirks specific to each model.
A comprehensive framework for fine-tuning OpenAI models with streamlined data preparation, training, and evaluation workflows
Intelligent training framework that automatically skips mastered samples and gives 5× more compute to hard ones. Up to 80% compute savings on LLM fine-tuning.
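The skip-mastered / boost-hard behavior described in the entry above can be sketched as a loss-based sample scheduler. This is an illustrative sketch only — the function name, thresholds, and the exponential-style "running loss" input are assumptions, not that framework's actual API:

```python
def build_epoch(losses, mastered_threshold=0.05, hard_quantile=0.8, hard_multiplier=5):
    """Build the list of sample indices to train on next epoch.

    losses: dict mapping sample index -> running per-sample loss estimate.
    Samples below mastered_threshold are skipped entirely; samples in the
    top (1 - hard_quantile) loss fraction are repeated hard_multiplier
    times, giving them proportionally more compute.
    """
    # Drop samples the model has already mastered.
    active = {i: l for i, l in losses.items() if l >= mastered_threshold}
    if not active:
        return []
    # Loss value at the hard_quantile boundary among the remaining samples.
    cutoff = sorted(active.values())[int(hard_quantile * (len(active) - 1))]
    epoch = []
    for i, l in active.items():
        # Hard samples get hard_multiplier replicas; the rest get one pass.
        epoch.extend([i] * (hard_multiplier if l >= cutoff else 1))
    return epoch
```

In practice the per-sample losses would be refreshed each epoch from the training loop, so samples migrate between the skipped, normal, and boosted pools as the model learns.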
A comprehensive framework for developing YOLO-family models, with streamlined training, validation, testing, and deployment workflows driven by easy-to-use config files, allowing flexible customization for a range of object detection tasks.
ML training orchestration for the Crucible ecosystem. Distributed training, hyperparameter optimization, checkpointing, model versioning, metrics collection, early stopping, LR scheduling, gradient accumulation, and mixed precision training with Nx/Scholar integration.
A PyTorch framework for image classification covering 11 CNN architectures (ResNet, EfficientNet, MobileNet, etc.). Handles the optimization setup and training specifics for each model.
Library for config-based neural network training
Fault-tolerant distributed training framework with async checkpointing for LLMs
Modular diffusion model training framework — LoRA, LyCORIS, and full fine-tuning for SD1.5, SDXL, SD3, Flux 1/2, Chroma, Z-Image, LTX2, HunyuanVideo, and Qwen
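LoRA, as used in the entry above, trains a small low-rank update on top of frozen weights instead of fine-tuning the full matrix. A minimal NumPy sketch of the forward pass (shapes and scaling are the common convention, not this repo's implementation):

```python
import numpy as np

def lora_forward(x, W, A, B, alpha=16.0):
    """Forward pass through frozen weight W plus a low-rank LoRA update.

    x: (batch, d_in) activations
    W: (d_in, d_out) frozen base weight
    A: (d_in, r) and B: (r, d_out) trainable adapters, with rank r << d_in.
    The merged matrix W + (alpha / r) * A @ B is never materialized;
    the update is applied as two small matmuls.
    """
    r = A.shape[1]
    return x @ W + (alpha / r) * ((x @ A) @ B)
```

With the standard initialization B starts at zero, so the adapted layer is exactly the frozen layer at step 0 and only gradually diverges as A and B train.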
🚀 Accelerate ML training on the BEAM with CrucibleTrain's unified infrastructure for diverse model types and workflows.