Google has introduced PaLM 2, its new large language model (LLM), trained on almost five times as much data as its predecessor: 3.6 trillion tokens, compared with the 780 billion tokens used for the original Pathways Language Model (PaLM) in 2022. PaLM 2 is a general-purpose LLM designed to handle advanced coding, maths, and creative writing tasks. It is trained on text in 100 languages and already powers 25 Google features and products, including the company’s experimental chatbot, Bard. Google claims PaLM 2 is more capable than existing models; for comparison, Meta’s LLaMA was trained on 1.4 trillion tokens.
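
Taken at face value, those token counts imply roughly a 4.6x increase, consistent with the “almost five times” characterisation. A quick check using only the figures cited above:

```python
# Sanity check of the "almost five times" figure, using the token counts
# reported above: 3.6 trillion for PaLM 2 vs. 780 billion for the 2022 PaLM.
palm2_tokens = 3.6e12
palm_2022_tokens = 780e9
print(f"{palm2_tokens / palm_2022_tokens:.1f}x more training data")  # -> 4.6x more training data
```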

Greater Transparency in AI Technology

Google and OpenAI, the creator of ChatGPT, have been unwilling to publish the size or other details of their training data, citing the competitive nature of the business. As the AI arms race heats up, however, the research community is demanding greater transparency. PaLM 2 reportedly has 340 billion parameters, a figure that reflects the model’s complexity. Google says the new model is smaller than prior LLMs, suggesting the company’s technology is becoming more efficient even as it takes on more sophisticated tasks.

Google says PaLM 2 uses a “new technique” called “compute-optimal scaling,” which makes the LLM more efficient and delivers better overall performance, including faster inference, fewer parameters to serve, and a lower serving cost. The transparency concerns are not limited to outside researchers: El Mahdi El Mhamdi, a senior Google Research scientist, resigned in February over the company’s lack of transparency, and OpenAI CEO Sam Altman testified at a hearing of the Senate Judiciary subcommittee on privacy and technology, agreeing that a new system for governing AI is needed.
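
Google’s announcement does not spell out what “compute-optimal scaling” involves. The term is generally associated with the Chinchilla scaling result, which balances parameter count against training tokens for a fixed compute budget rather than simply growing the model. The sketch below illustrates that idea under two explicit assumptions that are not published PaLM 2 figures: training compute of roughly 6 × N × D FLOPs and about 20 training tokens per parameter.

```python
# Illustrative sketch of compute-optimal (Chinchilla-style) scaling.
# Assumptions for illustration only, not figures published for PaLM 2:
#   - training compute C ~ 6 * N * D FLOPs (N = parameters, D = tokens)
#   - a compute-optimal ratio of ~20 training tokens per parameter

def compute_optimal_split(compute_budget_flops, tokens_per_param=20):
    """Split a fixed compute budget between model size N and training tokens D.

    With C ~ 6 * N * D and D = tokens_per_param * N, solving for N gives
    N = sqrt(C / (6 * tokens_per_param)).
    """
    n_params = (compute_budget_flops / (6 * tokens_per_param)) ** 0.5
    n_tokens = tokens_per_param * n_params
    return n_params, n_tokens

# Example: a hypothetical 1e24 FLOP training budget.
params, tokens = compute_optimal_split(1e24)
print(f"~{params / 1e9:.0f}B parameters trained on ~{tokens / 1e12:.1f}T tokens")
# -> ~91B parameters trained on ~1.8T tokens
```

Under those assumptions, a fixed compute budget favours a smaller model trained on more data, which lines up with Google’s description of fewer parameters to serve and a lower serving cost.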

As new AI applications quickly hit the mainstream, controversies surrounding the underlying technology are getting more spirited. Google and OpenAI are rushing to attract users who may want to search for information using conversational chatbots rather than traditional search engines.
