Fast and cost-effective LLaMA 2 fine-tuning with AWS Trainium
AWS Machine Learning
OCTOBER 5, 2023
We review the fine-tuning scripts provided by the AWS Neuron SDK (using NeMo Megatron-LM), the various configurations we used, and the throughput results we saw. For example, to use the RedPajama dataset, download it and run the preprocessing script:

wget [link]
python nemo/scripts/nlp_language_modeling/preprocess_data_for_megatron.py
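In practice the preprocessing script takes several arguments beyond what is shown above. The following is a minimal sketch of a typical invocation that tokenizes a downloaded RedPajama JSONL shard into the binary format expected by Megatron-style training; the file names are placeholders and the flag names follow the common Megatron/NeMo preprocessing interface, so verify them against the script's --help output for your Neuron SDK version.

# Hypothetical example: tokenize a RedPajama JSONL shard for Megatron-style training.
# File paths are placeholders; flag names are assumed from the standard
# NeMo/Megatron preprocessing interface -- confirm with --help before running.
python nemo/scripts/nlp_language_modeling/preprocess_data_for_megatron.py \
    --input=redpajama_shard.jsonl \
    --json-keys=text \
    --tokenizer-library=sentencepiece \
    --tokenizer-model=/path/to/llama2/tokenizer.model \
    --output-prefix=redpajama_tokenized \
    --append-eod \
    --workers=8

The script writes an indexed binary dataset (an .bin/.idx pair named after --output-prefix) that the fine-tuning job can then point to in its data configuration.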