G42 Group’s Inception, the Mohamed bin Zayed University of Artificial Intelligence (MBZUAI) and Cerebras Systems have announced Jais, a 13-billion-parameter bilingual Arabic-English large language model trained in just 21 days.
The supercomputer Condor Galaxy 1, developed by US-based AI chip maker Cerebras Systems and announced just a few weeks ago, was recently used to train a new 13-billion-parameter bilingual Arabic-English large language model (LLM) called Jais. It allowed researchers to complete the ‘production training’ of the new AI model in 21 days: a process that could have taken several months on alternative high-performance computing systems.
“It’s common for LLMs to take months to train, but Jais was trained in just 21 days.”
Arabian Gulf Business Insight (AGBI) asked me to comment on the development, and on the promise of the Cerebras-G42 collaboration to build the world’s biggest supercomputer network.
It’s actually a complex topic, not only because of the speed of development of new artificial intelligence models and the AI-friendly high-performance computing systems that run them, but also because of the rapid rise of Abu Dhabi’s AI R&D ecosystem. Abu Dhabi-based researchers have now developed a series of LLMs, including Falcon 40B, which was ranked first on Hugging Face’s index of open-source LLMs earlier this year.
It is no wonder that G42 has decided to invest in the latest supercomputers to meet the growing needs of AI researchers. As a result of the expertise gained, both at home and via collaborations such as the one with Cerebras Systems, Abu Dhabi technology organisations are gaining world-class capabilities that they could sell globally. The demand for both AI models and the computers that train them is only going to grow!
Meanwhile, you can read my article on Inception’s new Jais 13B LLM here:
- Will GenAI champion the Arabic language? (Middle East AI News)