We are publishing a detailed study of a 280-billion parameter transformer language model called Gopher, a study of ethical and social risks associated with large language models, and a paper investigating a new architecture with better training efficiency.