Eindhoven, The Netherlands – October 23, 2020 – NXP Semiconductors NV (Nasdaq: NXPI) today announced that the company is enhancing its machine learning development environment and product portfolio. Through an investment and exclusive strategic partnership with Canada-based Au-Zone Technologies, NXP is expanding its eIQ™ machine learning (ML) software development environment with easy-to-use ML tools and broadening its offering of silicon-optimized inference engines for edge machine learning.
In addition, NXP announced that it has been working with Arm, as a lead technology partner, to extend the Arm® Ethos-U™ microNPU (Neural Processing Unit) architecture to applications processors. NXP will integrate the Ethos-U65 microNPU into its next-generation i.MX applications processors to deliver energy-efficient, cost-effective machine learning solutions for rapidly growing industrial and IoT edge applications.
Ron Martino, senior vice president and general manager of the Edge Processing business unit at NXP Semiconductors, said: “NXP’s scalable applications processors provide customers with an efficient product platform and a broad ecosystem to rapidly deliver innovative systems. Through these collaborations with Arm and Au-Zone and our in-house technology development, our goal is to continue improving the efficiency of our processors while increasing our customers’ productivity and reducing their time to market. NXP is committed to helping customers lower their cost of ownership, maintain high security for critical data, and deliver enhanced human-machine interaction while ensuring safety.”
Implementing Machine Learning for All Customers
Au-Zone’s DeepView™ suite of machine learning tools provides an intuitive graphical user interface (GUI) and workflow that further enhances eIQ’s capabilities, allowing developers of all experience levels to import datasets and models, rapidly train them, and deploy neural network models and machine learning workloads across NXP’s edge processing portfolio. To meet the demanding requirements of today’s industrial and IoT applications, NXP’s eIQ-DeepViewML tool suite will give developers advanced capabilities to prune, quantize, validate, and deploy public and proprietary neural network models on NXP devices. It provides target-specific, graph-level profiling, giving developers run-time insights they can use to optimize neural network model architectures, system parameters, and run-time performance. By adding Au-Zone’s DeepView runtime inference engine as a complement to the open-source inference technologies in NXP eIQ, users will be able to quickly deploy and evaluate machine learning workloads and performance on NXP devices. A key feature of this runtime inference engine is that it optimizes system memory usage and data movement for each SoC architecture.
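The quantize-and-deploy step described above can be illustrated with the open-source TensorFlow Lite tooling, one of the inference technologies eIQ supports; the sketch below is not the DeepView or eIQ API itself, and the model file, input shape, and calibration data are placeholder assumptions.

    # Illustrative post-training int8 quantization of a trained Keras model
    # for an embedded runtime (e.g. TensorFlow Lite). Names are placeholders.
    import numpy as np
    import tensorflow as tf

    model = tf.keras.models.load_model("my_vision_model.h5")  # hypothetical model

    def representative_data():
        # A small calibration set drives the int8 quantization ranges.
        for _ in range(100):
            yield [np.random.rand(1, 224, 224, 3).astype(np.float32)]

    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    converter.representative_dataset = representative_data
    converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
    converter.inference_input_type = tf.int8
    converter.inference_output_type = tf.int8

    with open("model_int8.tflite", "wb") as f:
        f.write(converter.convert())

In a real workflow, the representative dataset would come from the application’s own training data, and the resulting quantized model would then be validated and profiled on the target device before deployment.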
Brad Scott, CEO of Au-Zone, said: “Au-Zone is honored to receive this investment, and it is exciting to enter a strategic partnership with NXP, especially given NXP’s roadmap of additional machine learning acceleration devices. Our goal in developing DeepView™ has been to provide developers with intuitive tools and inference technology, and this collaboration represents a powerful combination of silicon, runtime inference engine technology, and development environments that will further accelerate the deployment of embedded machine learning capabilities. This partnership builds on more than a decade of engineering collaboration with NXP and will enable us to deliver more advanced machine learning technologies and turnkey solutions as OEMs continue to move inference capabilities to the edge of the network.”
Scaling Machine Learning Acceleration
To accelerate machine learning in a wider range of edge applications, NXP will expand its popular i.MX applications processor portfolio for the industrial and IoT edge with the integrated Arm Ethos-U65 microNPU, complementing the previously announced i.MX 8M Plus applications processor and its integrated NPU. The NXP and Arm technology collaboration focused on defining the system-level aspects of the microNPU, which supports up to 1 TOPS of compute (512 parallel multiply-accumulate operations at 1 GHz). The Ethos-U65 maintains the MCU-class power efficiency of the Ethos-U55 while extending its reach to higher-performance, Cortex-A-based systems-on-chip (SoCs). The Ethos-U65 microNPU works with the Cortex-M cores already present in NXP’s i.MX families of heterogeneous SoCs for improved efficiency.
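The 1 TOPS figure follows directly from the quoted MAC array width and clock, counting each multiply-accumulate as two operations. A minimal back-of-the-envelope check (illustrative only; the figures come from the announcement, the script is not an NXP tool):

    # Peak-throughput check for the Ethos-U65 figures quoted above.
    macs_per_cycle = 512          # parallel multiply-accumulate units
    ops_per_mac = 2               # one multiply plus one accumulate
    clock_hz = 1_000_000_000      # 1 GHz

    tops = macs_per_cycle * ops_per_mac * clock_hz / 1e12
    print(f"Peak throughput: {tops:.3f} TOPS")  # ~1.024 TOPS, i.e. "up to 1 TOPS"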
Dennis Laudick, vice president of marketing for Arm’s machine learning group, said: “AI and machine learning are driving a wave of innovation in industrial and IoT applications. The Ethos-U65 will enable a new wave of edge AI for NXP customers, bringing intelligence to secure, reliable, smart devices.”
Availability
The Arm Ethos-U65 will be integrated into NXP’s future i.MX applications processors. The eIQ-DeepViewML tool suite and the DeepView runtime inference engine integrated into eIQ will be available in Q1 2021. End-to-end software support, including training, validating, and deploying existing or new neural network models for the i.MX 8M Plus, other NXP SoCs, and future devices integrating the Ethos-U55 and U65, will be provided through NXP’s eIQ machine learning software development environment. Register for the webinar jointly hosted by NXP and Arm on November 10 to learn more, and read our blog.