Taipei, Aug 24 (IANS): Chip-maker MediaTek on Thursday announced that it is working closely with Meta's Llama 2, the social media giant's next-generation open-source Large Language Model (LLM), to enhance on-device generative AI in edge devices.
"Utilising Meta's LLM as well as MediaTek's latest APUs and NeuroPilot AI Platform, MediaTek aims to build a complete edge computing ecosystem designed to accelerate AI application development on smartphones, IoT, vehicles, smart home, and other edge devices," the chip-maker said in a statement.
Currently, most generative AI processing is performed through cloud computing. However, MediaTek's use of Llama 2 models will enable generative AI applications to run directly on the device as well.
This will provide many advantages to developers and users, including seamless performance, greater privacy, better security and reliability, lower latency, the ability to work in areas with little to no connectivity, and lower operating costs.
“The increasing popularity of generative AI is a significant trend in digital transformation, and our vision is to provide the exciting community of Llama 2 developers and users with the tools needed to fully innovate in the AI space,” said JC Hsu, Corporate Senior Vice President and General Manager of Wireless Communications Business Unit at MediaTek.
According to the chip-maker, in order to truly take advantage of on-device generative AI technology, edge device makers will need to adopt "high computing, low-power AI processors and faster, more reliable connectivity to enhance computing capabilities".
"Every MediaTek-powered 5G smartphone SoC shipped today is equipped with APUs designed to perform a wide variety of generative AI features, such as AI Noise Reduction, AI Super Resolution, AI MEMC and more," the company said.
Additionally, MediaTek’s next-generation flagship chipset, which will be introduced later this year, will feature a software stack optimised to run Llama 2, as well as an upgraded AI processing unit (APU) with Transformer backbone acceleration, a reduced memory footprint and more efficient use of DRAM bandwidth, further enhancing LLM and AIGC performance.
"MediaTek expects Llama 2-based AI applications to become available for smartphones powered by the next-generation flagship SoC, scheduled to hit the market by the end of the year," it added.