Qualcomm and Meta have announced that, starting in 2024, the social networking company's new large language model, Llama 2, will be able to run on Qualcomm chips in phones and PCs. This is significant because LLMs have so far run mainly on Nvidia graphics processors, given their need for large amounts of computational power and data, a dependence that has driven Nvidia's stock up more than 220% this year. Other leading-edge processors for phones and PCs, such as Qualcomm's chips, have largely been left out of the AI boom; Qualcomm's stock has risen about 10% this year, lagging the NASDAQ's 36% gain.

The announcement gives Qualcomm the chance to position its processors as suited to running AI on devices rather than in the cloud, which could cut the high cost of running AI models and lead to better voice assistants and other apps. Qualcomm will make Meta's open-source Llama 2 models available on its chips and expects them to enable applications such as intelligent virtual assistants.

Meta has also released Llama 2's weights (the sets of numbers that govern how an AI model behaves) openly, allowing anyone to run the model on their own computers without asking permission or paying, in contrast to proprietary models such as OpenAI's GPT-4 or Google's Bard.

Qualcomm has a history of working with Meta: the two companies collaborated on the chips in Meta's Quest virtual reality devices, and Qualcomm has previously demonstrated some AI models, such as Stable Diffusion, running slowly on its chips.