Domestic AI platform usage
Reference links (for reference only)
Catalog
📄️ Image library: mindformers
Image Library
📄️ Introduction to Ascend Documentation and Using MindFormers Images
📄️ Ascend Training Migration and Tuning (PyTorch)
📄️ 12.7 Overall Process of Large Model Migration and Optimization
📄️ 12.8 Baichuan2-7B Inference Container Environment Setup
1. Image acquisition
📄️ 12.9 PyTorch Deployment of ChatGLM2-6B
I. Prerequisites
📄️ 12.10 Introduction to Mindformers
MindSpore Transformers aims to provide a full-pipeline development suite for large model training, fine-tuning, evaluation, inference, and deployment. It offers the industry's mainstream Transformer pre-trained models and SOTA downstream task applications, and covers a rich set of parallelism features, helping users easily carry out large model training and innovative research and development.
📄️ 12.11 How to install Mindformers
● Option 1: Compile and install from source on Linux
📄️ 12.12 Mindformers-chatglm-6b Training, Inference, and Fine-Tuning Summary
1. Inference
📄️ Getting Started with the Large Model Suite
Large model suite usage