Aizip Teams Up With Renesas to Demonstrate First-of-Its-Kind Ultra-Efficient Small Language Models (SLMs) and AI Agents for On-Device Arm-based Applications

Businesswire ·  07/23 11:10

CUPERTINO, Calif.--(BUSINESS WIRE)--Aizip, in close collaboration with Renesas, announced today the demonstration of ultra-efficient small language models (SLMs) and compact AI agents on Arm-based microprocessor units (MPUs) for a wide range of applications in edge markets. This advancement paves the way for efficient and effective human-AI interactions in home appliances, enterprise kiosks, and many other edge devices.

Large language models (LLMs) have revolutionized the AI landscape, endowing AI systems with logic and reasoning capabilities beyond simple sensing and perception. By leveraging LLMs and recent advancements in multi-modal representation, AI agents can now interact with their environments to utilize tools or perform tasks based on complex and often ambiguous human commands. However, these advanced AI systems typically require substantial effort to train and significant resources to deploy.

Considerable research has therefore been directed towards developing compact models for on-device edge applications. On-device models offer advantages that include enhanced privacy protection, resilient operation, and cost savings. While several companies have successfully reduced the size of language models for mobile phones, ensuring accurate tool calling for automation applications on low-cost edge devices remains a significant challenge for these SLMs.
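
To make the tool-calling challenge concrete, the sketch below shows a minimal guardrail an on-device agent can apply before acting on an SLM's output: the model's response is parsed as a structured tool call and checked against a small registry of allowed tools. This is a hypothetical illustration only; the tool names, argument schemas, and the run_slm() stub are assumptions, not Aizip's or Renesas's actual interfaces.

```python
# Hypothetical sketch: schema-checked tool calling for an on-device agent.
# Tool names, argument schemas, and run_slm() are illustrative assumptions,
# not an actual Aizip or Renesas API.
import json

TOOLS = {
    "set_thermostat": {"temperature_c"},   # required argument names per tool
    "start_wash_cycle": {"program"},
}

def run_slm(prompt: str) -> str:
    # Stand-in for the on-device SLM; assumed to emit a JSON tool call.
    return '{"tool": "set_thermostat", "arguments": {"temperature_c": 21}}'

def dispatch(raw: str):
    # Validate the model's output before touching any appliance control.
    try:
        call = json.loads(raw)
    except json.JSONDecodeError:
        return None                        # malformed output: ask the user again
    tool, args = call.get("tool"), call.get("arguments", {})
    required = TOOLS.get(tool)
    if required is None or not required.issubset(args):
        return None                        # unknown tool or missing arguments
    return tool, args

print(dispatch(run_slm("Make the living room a bit warmer")))
# ('set_thermostat', {'temperature_c': 21})
```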

Aizip's mission is to enable pervasive intelligence by building ultra-efficient, robust, and scalable AI models that can be deployed anywhere, anytime. Aizip pushes the boundaries of efficient AI, uncovering key insights such as data-centric efficiency and AI-design automation. Leveraging its expertise in developing efficient and robust edge models, Aizip has now created Gizmo, a series of ultra-efficient SLMs and AI agents ranging in size from 300 million to 2 billion parameters. These models run on diverse platforms, including MPUs and application processors, serving a broad range of applications.

Recently, Aizip teamed up with Renesas and Arm to demonstrate the most compact and efficient AI agents to date on MPUs. These remarkably small and efficient models have been deployed on Renesas RZ/G2L and RZ/G3S boards; built on Arm Cortex-A55 cores, these Renesas MPUs are already available for commercial use. Despite their compactness, the SLMs and SLM-based AI agents achieve robustness and accuracy comparable to cloud-based large models for domain-specific applications, and an AI design automation pipeline allows them to be rapidly generated for new applications.

"We're witnessing a pivotal moment in efficient AI where on-device AI agents can truly make everything think," described Weier Wan, a founding member of Aizip and head of Aizip's SLM and AI-agent development. According to Weier, on the RZ/G2L with a single A55 core running at 1.2 GHz, they can achieve a response time of less than 3 seconds. "With Renesas's readily available IC, we can offer the AI SLM and agents right now. Renesas devices are known for their high quality and mature development environment, and we're excited to team up with them in offering this breakthrough solution."

Renesas, a global leader in MPUs and MCUs, has been aggressively pushing the envelope in the edge and endpoint markets. "We see AI moving rapidly in our direction, and we are ready for it," said Kaushal Vora, Sr. Director of Business Acceleration and Ecosystem at Renesas. "We are particularly excited about Aizip implementing an on-device agent on our MPU. We can envision, for example, RealityCheck HVAC predicting a potential issue with a component in a heat pump; the AI agent then acts on this real-time operational information, looking up the appropriate OEM part and scheduling a service call with a technician for the replacement. Aizip has been an excellent strategic partner, and we have achieved multiple milestones together," Vora added.
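
The heat-pump scenario Vora describes can be read as a short tool-calling chain: a fault prediction arrives, and the agent invokes a parts-lookup tool followed by a scheduling tool. The sketch below is a hypothetical illustration of that flow under assumed data structures and tool functions; it is not RealityCheck HVAC's or Aizip's actual implementation.

```python
# Hypothetical sketch of the heat-pump scenario: a predicted fault triggers a
# parts lookup and a service-call booking. All names here are illustrative.
from dataclasses import dataclass
from typing import Optional

@dataclass
class FaultPrediction:
    component: str        # e.g. "compressor_capacitor"
    probability: float    # confidence reported by the prediction model

def lookup_oem_part(component: str) -> str:
    # Stand-in for a parts-catalog lookup tool exposed to the agent.
    return {"compressor_capacitor": "OEM-4471-C"}.get(component, "UNKNOWN")

def schedule_service(part_number: str, window: str) -> str:
    # Stand-in for a scheduling tool; returns a ticket identifier.
    return f"TICKET-0001 ({part_number}, {window})"

def handle_prediction(pred: FaultPrediction) -> Optional[str]:
    # Only act on high-confidence predictions; otherwise keep monitoring.
    if pred.probability < 0.8:
        return None
    part = lookup_oem_part(pred.component)
    return schedule_service(part, window="next business day")

print(handle_prediction(FaultPrediction("compressor_capacitor", 0.93)))
# TICKET-0001 (OEM-4471-C, next business day)
```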

"To address the rapid growth of AIoT applications, devices require higher performance and intelligence at the edge," said Lionel Belnet, Senior Director HW Product Management, IoT Line of Business at Arm. "Arm is the platform of choice for AI solutions from cloud to edge, and running Aizip's new SLMs and AI Agents on Arm-based technologies will provide the performance and efficiency needed to scale the edge AI opportunity even further."

"Efficiency is critical for AI systems," explained Yubei Chen, co-founder of Aizip and assistant professor at UC Davis. "Natural intelligence demonstrates remarkable efficiency. From a scientific perspective, there's immense potential for AI to improve, and efficiency is the key to achieving human-level intelligence." The Aizip Intelligent Language (AIL) model series, including AI agents, is an extension of Aizip's efforts to bring efficiency and sustainability to real-world applications. "Aizip is excited to be a part of the Arm ecosystem, utilizing industry-leading design solutions to deliver new AI use cases worldwide."

The partnership between Aizip and Renesas has already produced successful tinyML models for audio, vision, and time-series applications. Now the two companies are collaborating in the rapidly growing market for on-device SLMs and AI agents, reflecting their shared commitment to providing low-cost, high-performance solutions that benefit society at large. For additional information, please contact info@aizip.ai.

About Aizip, Inc.

Situated in the heart of Silicon Valley, Aizip, Inc. specializes in developing superior AI models tailored for endpoint and edge device applications. Aizip stands out for its exemplary model performance, swift deployment, and remarkable return on investment. These models are versatile, catering to a spectrum of intelligent, automated, and interconnected solutions. Discover more at .


Contacts

Nathan Francis, Nathan@aizip.ai
