AI-embedded systems will enable enterprises to perform better
Published on: Tuesday, 08-11-2022
Prashant Rao, Head – Application Engineering Team, MathWorks India.
What are the prerequisites for an enterprise to go for AI-embedded devices?

Artificial Intelligence (AI) is increasingly being applied across industries and applications, and as companies look to gain value from their AI models, a natural next step is moving those models into production. This could mean deploying to the cloud, edge systems, embedded devices, or standalone applications.
Enterprises will want to be clear on the final requirements of the system and plan the AI implementation with those requirements in mind.
The primary prerequisite when embedded devices are the deployment option is to understand their resource-constrained nature. Many device options are available, such as NVIDIA Jetson modules and ARM Cortex-A/M processors, and each comes with trade-offs in processing power and memory. The AI and embedded teams will need experience in both domains and will have to work together to understand these constraints and tweak the AI model so that it fits, and deploys successfully on, the proposed hardware.
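As a rough illustration of that constraint check (with hypothetical numbers, not tied to any specific device or to MathWorks tooling), a first-pass sanity check is to budget the model's parameter memory against the target's RAM:

```python
def model_fits(num_params, bytes_per_param, device_ram_bytes, budget_fraction=0.5):
    """Rough check: do the model's parameters fit within the fraction of
    device RAM reserved for inference? This ignores activations, runtime
    overhead, and code size -- a first-pass sanity check only."""
    model_bytes = num_params * bytes_per_param
    return model_bytes <= device_ram_bytes * budget_fraction

# Hypothetical example: a 5M-parameter float32 model on a board with 64 MB RAM.
print(model_fits(5_000_000, 4, 64 * 1024 * 1024))  # 20 MB vs. 32 MB budget -> True
# The same model at int8 (1 byte/param) on a 256 KB Cortex-M-class part.
print(model_fits(5_000_000, 1, 256 * 1024))        # 5 MB vs. 128 KB budget -> False
```

A check like this is where the AI and embedded teams first meet: the embedded team supplies the memory budget, and the AI team decides whether to shrink the model or the requirements.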
How does this go hand in hand with the growing trend of edge computing?
Edge computing has been growing steadily over the past few years, and advances in AI have improved the performance of edge devices deployed in the field. As enterprises seek to capitalise on the gains that AI workflows offer over traditional algorithmic approaches, they will look for ways to deploy AI-embedded devices more efficiently and cost-effectively.
What do you think are the challenges – especially for those enterprises lagging on the digital transformation (Dx) curve?
The challenge for enterprises is in the amalgamation of the AI and embedded product teams. The constraints faced by these teams are quite different – for example, the AI team may prefer larger, more resource-intensive AI models that would be difficult to implement on certain edge devices, especially at scale. Teams will need to understand how best to fit the AI model onto the hardware; techniques such as model compression (for example, pruning and quantization) are used to shrink models without sacrificing accuracy. Teams may also be working on different development platforms, so they need tools that can automatically generate code and help cross-functional teams understand and overcome the constraints of the final solution.
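To make the compression idea concrete, here is a minimal sketch of one common technique, post-training affine quantization of float32 weights to int8 (illustrative pure-NumPy code, not the scheme of any particular MathWorks or vendor tool):

```python
import numpy as np

def quantize_int8(weights):
    """Affine-quantize float32 weights to int8, returning the quantized
    values plus the scale and zero point needed to reconstruct them."""
    w_min, w_max = float(weights.min()), float(weights.max())
    scale = (w_max - w_min) / 255.0 if w_max > w_min else 1.0
    zero_point = np.round(-128.0 - w_min / scale)  # maps w_min near -128
    q = np.clip(np.round(weights / scale + zero_point), -128, 127).astype(np.int8)
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Reconstruct approximate float32 weights from the int8 values."""
    return (q.astype(np.float32) - zero_point) * scale

# A toy "layer" of float32 weights: 4 bytes per value.
weights = np.random.randn(256, 256).astype(np.float32)
q, scale, zp = quantize_int8(weights)

print(weights.nbytes)  # 262144 bytes at float32
print(q.nbytes)        # 65536 bytes at int8 -- a 4x memory reduction
# Reconstruction error stays within roughly one quantization step (scale).
print(np.abs(weights - dequantize(q, scale, zp)).max())
```

The 4x size reduction is exactly the kind of trade the two teams negotiate: the embedded team gains memory headroom, while the AI team verifies the quantization error does not degrade accuracy.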
How does MathWorks help enterprises in the implementation of AI effectively?

MathWorks has expertise in helping customers build domain-specific solutions that cater to their specific applications across all industries. MathWorks provides tools for a complete AI workflow, from data pre-processing to model development to final deployment.
Simulation is also of interest to enterprises looking to move quickly from prototype to production: Simulink offers tools for simulating the entire AI solution and catching costly bugs before moving into production. Finally, MathWorks offers tools to deploy to various edge computing platforms, including COTS embedded platforms from NVIDIA, NXP, and STMicro.
How do you think enterprises benefit from the adoption of AI-embedded systems?
The adoption of AI-embedded systems will enable enterprises to provide solutions that perform better than traditional algorithmic approaches at the edge nodes of the system. This shift will help reduce the latency and bottlenecks of cloud computing systems by moving some of the AI processing to the edge nodes.
What are the latest features and capabilities of MATLAB and Simulink, and the takeaways from the recent in-person MATLAB Expo 2022?
In September, MathWorks unveiled Release 2022b (R2022b) of the MATLAB® and Simulink® product families. R2022b introduces two new products and several enhanced features that simplify and automate Model-Based Design for engineers and researchers tasked with delivering product innovations and breakthroughs for their organisations. Simscape Battery™, one of the top innovations introduced in the R2022b release, provides design tools and parameterised models for businesses designing battery systems. Engineers and researchers use Simscape Battery to create digital twins, run virtual tests of battery pack architectures, design battery management systems, and evaluate battery system behaviour across normal and fault conditions. The tool also automates the creation of simulation models that match the desired pack topology and includes cooling plate connections so electrical and thermal responses can be evaluated.
R2022b also features the new Medical Imaging Toolbox. It provides tools for designing, testing, and deploying diagnostic and radiomics algorithms that use deep learning networks. Medical researchers, scientists, engineers, and device designers can use Medical Imaging Toolbox for multi-volume 3D visualisation, multimodal registration, segmentation, and automated ground truth labelling for training deep learning networks on medical images.
There are several other updates to the existing products too.
India MATLAB EXPO returned in physical form after a gap of three years; this year's EXPO was held in Bangalore in September. Our objective was to complement the Virtual MATLAB EXPO held earlier this year, which attracted thousands of participants from India. The new format of the physical event gave customers and users more time and space to interact with industry as well as MathWorks experts. We had close to 30 technology demo showcases across areas like AI, electrification, Model-Based Design, and wireless, and four master classes covered these technologies in depth. In the panel discussion on the Impact of AI + X in Engineering and Science, six experts from multiple industries discussed the deployment of AI and the challenges and solutions in their areas. The other panel discussion, on Building an Entrepreneurial Mindset in Engineers of Tomorrow, featured panellists from incubators, startups, and corporates, who talked about developing a climate of entrepreneurship. The keynote was delivered by Latha Chembrakalam, Head of Technical Centre, Continental India, on ‘Trends in Mobility.’
Prashant Rao heads the Application Engineering team at MathWorks India. Prashant is a regular contributor at industry forums, sharing his views on megatrends in technology and how Artificial Intelligence (AI) is being adopted across industries and other technologies. He works closely with the academic community to help students develop the analytical and AI-related skills that make them industry-ready. He can be reached at [email protected]