Microsoft and Nvidia team up to train one of the world’s largest language models

Training took place across 560 Nvidia DGX A100 servers, each containing eight Nvidia A100 80GB GPUs.

Microsoft and Nvidia claim to have trained one of the world's largest natural-language models, with 530 billion parameters.
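A quick back-of-envelope sketch of the scale implied by those figures (the per-parameter byte size is an assumption; fp16 is typical for training at this scale, but the article does not state the precision used):

```python
# Scale check based on the figures quoted in the article.
servers = 560
gpus_per_server = 8
total_gpus = servers * gpus_per_server  # 560 * 8 = 4480 GPUs in total

params = 530e9           # 530 billion parameters
bytes_fp16 = params * 2  # assuming 2 bytes per parameter (fp16)
weights_tb = bytes_fp16 / 1e12  # weights alone, before optimizer state or activations

print(total_gpus)   # 4480
print(weights_tb)   # 1.06 (TB of fp16 weights)
```

Even the raw weights would not fit on a single 80GB GPU, which is why training at this size requires model parallelism across many devices.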

via Deep Learning Weekly: read the source article
