
NVIDIA NeMo Framework

NeMo Framework is NVIDIA's GPU-accelerated, end-to-end training framework for large language models (LLMs), multimodal models, and speech models. It enables seamless scaling of training workloads, both pretraining and post-training, from a single GPU to thousand-node clusters for 🤗 Hugging Face, Megatron, and PyTorch models.
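
For a sense of how that looks in code, the sketch below configures a pretraining recipe through the NeMo 2.0 Python API and launches it on a single node with NeMo-Run. This is a minimal sketch only: the `llm.llama3_8b.pretrain_recipe` call, its arguments, and the paths are assumptions modeled on the quickstart pattern in the User Guide, not verbatim API documentation.

```python
# Minimal sketch (assumed API, modeled on the NeMo 2.0 quickstart pattern).
import nemo_run as run
from nemo.collections import llm


def configure_recipe(nodes: int = 1, gpus_per_node: int = 8):
    # Recipe name and arguments are assumptions; see the NeMo Framework User
    # Guide for the recipes actually shipped with your installed version.
    return llm.llama3_8b.pretrain_recipe(
        dir="/checkpoints/llama3_8b",   # hypothetical checkpoint/log directory
        name="llama3_8b_pretraining",
        num_nodes=nodes,
        num_gpus_per_node=gpus_per_node,
    )


if __name__ == "__main__":
    recipe = configure_recipe()
    # LocalExecutor runs the job on this machine via torchrun; swapping in a
    # cluster executor is how the same recipe scales out to many nodes.
    executor = run.LocalExecutor(ntasks_per_node=8, launcher="torchrun")
    run.run(recipe, executor=executor, name="llama3_8b_pretraining")
```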

This GitHub organization hosts repositories for NeMo's core components and integrations.

Documentation

To learn more about NVIDIA NeMo Framework and all of its component libraries, refer to the NeMo Framework User Guide, which includes a quick start guide, tutorials, model-specific recipes, best-practice guides, and performance benchmarks.

License

Apache 2.0 licensed with third-party attributions documented in each repository.

Popular repositories

  1. Curator (Public)

     Scalable data preprocessing and curation toolkit for LLMs

     Jupyter Notebook · 953 stars · 138 forks

  2. RL (Public)

     Scalable toolkit for efficient model reinforcement learning

     Python · 438 stars · 48 forks

  3. Run (Public)

     A tool to configure, launch, and manage your machine learning experiments (see the sketch after this list)

     Python · 161 stars · 61 forks

  4. FW-CI-templates (Public)

     CI/CD templates for NeMo-FW libraries

     Python · 3 stars · 2 forks

  5. .github (Public)

     Organization profile and default community health files
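
Of these, Run (NeMo-Run) is the launcher: you wrap a Python entry point or a recipe as a configurable task, pick an executor, and submit. Below is a minimal sketch under assumed API names (`run.Partial`, `run.LocalExecutor`, `run.run`); the training function itself is purely illustrative.

```python
import nemo_run as run


def train(steps: int = 10, lr: float = 3e-4) -> None:
    # Stand-in for a real training entry point; purely illustrative.
    for step in range(steps):
        print(f"step={step} lr={lr}")


if __name__ == "__main__":
    # Partial captures the function plus its arguments as a configurable task.
    task = run.Partial(train, steps=100, lr=1e-4)
    # LocalExecutor launches on the current machine; a cluster executor (for
    # example a Slurm-backed one) would run the same task on a remote
    # allocation without changes to the task definition itself.
    run.run(task, executor=run.LocalExecutor(), name="toy_experiment")
```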
