Here's a list of some large language models (LLMs) and their sizes:
- Megatron-Turing NLG (530B parameters) - Developed jointly by Microsoft and NVIDIA, it is one of the largest dense LLMs publicly disclosed.
- WuDao 2.0 (1.75T parameters) - Developed by BAAI (Beijing Academy of Artificial Intelligence), it is one of the largest publicly known models, built with a sparse mixture-of-experts design.
- Jurassic-1 Jumbo (178B parameters) - Developed by AI21 Labs, it is a commercially available LLM.
- GShard (600B parameters) - Developed by Google AI, it is an experimental sparse mixture-of-experts LLM that uses automatic sharding to train across many accelerators on a massive dataset.
- Switch Transformer (1.6T parameters) - Developed by Google AI, it is an LLM that routes each token to a single expert (a sparse mixture-of-experts architecture) to improve training efficiency.
- PaLM (540B parameters) - Developed by Google AI, it is an LLM trained on a massive dataset of text and code.
- BLOOM (176B parameters) - An open-source LLM developed by Hugging Face and a consortium of companies and organizations.
- WuDao 1.0 - The predecessor of WuDao 2.0, also developed by BAAI; it was among the largest Chinese-language models when it was released.
- T5-XXL (11B parameters) - The largest variant of Google AI's T5 family, a versatile encoder-decoder model that can be fine-tuned for a wide variety of tasks.
- Jurassic-1 Grande (17B parameters) - Developed by AI21 Labs, it is a commercially available LLM.
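
For a quick side-by-side comparison, the sizes above can be ranked programmatically. A minimal Python sketch (parameter counts in billions, approximate, and reported figures vary between public sources):

```python
# Approximate parameter counts, in billions, for the models listed above.
# Figures come from public reports and may differ slightly by source.
MODEL_SIZES_B = {
    "WuDao 2.0": 1750,
    "Switch Transformer": 1600,
    "GShard": 600,
    "PaLM": 540,
    "Megatron-Turing NLG": 530,
    "Jurassic-1 Jumbo": 178,
    "BLOOM": 176,
    "Jurassic-1 Grande": 17,
    "T5-XXL": 11,
}

def rank_by_size(sizes):
    """Return (name, size) pairs sorted from largest to smallest."""
    return sorted(sizes.items(), key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    for name, size in rank_by_size(MODEL_SIZES_B):
        print(f"{name}: {size}B")
```

Note that raw parameter count is not directly comparable across architectures: sparse mixture-of-experts models (WuDao 2.0, Switch Transformer, GShard) activate only a fraction of their parameters per token, while dense models (PaLM, Megatron-Turing NLG, BLOOM) use all of them.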