Why Are TinyML Use Cases Becoming Popular?

February 19, 2023



Photo by Nubia Navarro (nubikini)

Machine learning is used by all types of organizations to process and analyze large datasets. TinyML (Tiny Machine Learning) is a branch of machine learning that is growing in popularity for its low-power capabilities, and it is quickly establishing itself as one of the most accessible ways for beginners to get to grips with machine learning.

This article will provide an overview of what TinyML is, its use cases, and why it is becoming more popular.

What is TinyML?

Tiny Machine Learning (TinyML) is an optimized machine learning technique that allows ML models (software) to run on embedded systems that use very low-power microcontrollers.

An embedded system is made of computer hardware and software designed for a specific function. They are developed for one purpose only, which is why they differ from typical computer systems such as laptops, PCs, tablets, and smartphones. An example of an embedded system would be an electronic calculator or an ATM.

TinyML enables machine learning on microcontrollers and Internet of Things (IoT) devices in a very optimized way, meaning vast amounts of data can be leveraged and analyzed while using very little power.

Now let’s look at microcontrollers in more detail.

An Overview of Microcontrollers

Microcontrollers contain the processor, RAM, ROM, and Input/Output (I/O) ports of an embedded system, the usual hardware setup that allows the embedded system to run its software.

The key benefits of using microcontrollers are:

  • Low power - Microcontrollers are low-power hardware, requiring just milliwatts or microwatts to function. This means that microcontrollers consume around a thousand times less power than a regular computer system.

  • Low price - Microcontrollers are very cheap, with over 28 billion units shipped in 2020 alone.

  • Multifunctional use - Microcontrollers can be used in all kinds of devices, gadgets, and appliances.

The Advantages of TinyML

There are three main advantages of using TinyML:

  • Low-latency data processing - Because TinyML allows for on-device analytics without any connection to a server, embedded systems can process data and produce an output with almost no delay.

  • No connectivity required - The device does not need an internet connection for the TinyML model to work.

  • Data privacy - Because all the data stays on the device with no connectivity, the risk of any data being compromised is extremely low.

TinyML Implementation

There are a few popular machine learning frameworks that support TinyML:

  • Edge Impulse - a free machine learning development platform for edge devices (the hardware that provides an entry point into a network, such as a router or routing switch).

  • TensorFlow Lite - a library of tools that allows developers to enable on-device machine learning on embedded systems, edge devices, and other stand-alone devices. A minimal conversion sketch follows this list.

  • PyTorch Mobile - an open-source machine learning framework for mobile platforms that is compatible with TinyML.
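To make that workflow concrete, here is a minimal sketch of converting a small Keras model into an int8-quantized TensorFlow Lite flatbuffer of the kind used on microcontrollers. The tiny Dense model, the random calibration data, and the file name are placeholders, not part of any of the frameworks above.

```python
# Minimal sketch: convert a small Keras model to an int8-quantized
# TensorFlow Lite flatbuffer suitable for microcontroller deployment.
# The model architecture and calibration data are placeholders.
import numpy as np
import tensorflow as tf

# Placeholder model: a tiny classifier over 3 sensor features.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(3,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),
])

def representative_data():
    # A few representative samples drive the quantization ranges.
    for _ in range(100):
        yield [np.random.rand(1, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
print(f"Flatbuffer size: {len(tflite_model)} bytes")
```

The resulting flatbuffer is typically embedded in firmware as a C array and executed with the TensorFlow Lite for Microcontrollers interpreter.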

Why is TinyML Becoming more Popular?

Internet of Things (IoT) networks are gaining prevalence across various industries, so more edge devices are needed to bridge power sources and network endpoints; TinyML meets these requirements cost-effectively.

As discussed, TinyML works on low-power microcontrollers that do not require internet connectivity. This allows the device to process and respond to data in real time without using significant resources.

TinyML models can be used as an alternative to a cloud environment, reducing costs, using less power, and offering more data privacy. Everything is processed on the individual device without latency, ensuring impressive connection and processing speeds.

TinyML grew in popularity throughout 2022, and researchers expect further growth in the future. Technology research and strategic guidance firm ABI Research predicts that 2.5 billion devices using TinyML systems will ship in 2030.

The advantages of TinyML, from on-device analytics to near-zero latency, make it an obvious choice in a world that relies on speed. Furthermore, processing data locally means sensitive information is better protected from cybercriminals than it would be in a centralized data center.

The Challenges TinyML Faces

Although the benefits and potential of TinyML are clear, it is not without challenges that could pose issues for developers.

  • Limited memory capacity - Embedded systems that use TinyML are limited to megabytes, and sometimes kilobytes, of internal memory. This places significant restrictions on how complex TinyML models can be, which is why only a small number of frameworks can be used for TinyML development. A quick size check is sketched after this list.

  • Troubleshooting cannot be conducted remotely - Because TinyML models run only on the device locally, it is much harder for developers to troubleshoot and fix any problems. This is where a cloud environment offers an advantage.
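As a small illustration of working within that memory constraint, the sketch below checks a converted model file against a flash budget before deployment. The 256 KB figure and the model.tflite file name are illustrative placeholders, not properties of any particular microcontroller.

```python
# Minimal sketch: sanity-check a converted TinyML model against a device's
# memory budget before deployment. The budget below is a hypothetical value.
import os

FLASH_BUDGET_BYTES = 256 * 1024  # hypothetical flash available for the model

model_size = os.path.getsize("model.tflite")
print(f"Model uses {model_size / 1024:.1f} KB of a {FLASH_BUDGET_BYTES / 1024:.0f} KB budget")
if model_size > FLASH_BUDGET_BYTES:
    # Typical remedies: quantization, pruning, or a smaller architecture.
    print("Model too large for the target device.")
```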

How is TinyML used in Practical Situations?

TinyML can be applied effectively to almost any industry that uses IoT networks and data. Below are several industries where TinyML has been used to power operations.

Agriculture

TinyML devices have been used to monitor and collect real-time crop and livestock data. A market-leading example of this was developed by Imagimob, an edge AI product company from Sweden. Imagimob has worked with 55 organizations across the European Union to understand how TinyML can provide cost-effective livestock and crop management.

In a study, two tractors were fitted with Dialog IoT Kit (Bosch sensor) devices and Android phones to collect data. This data was either logged in real time by the tractor operator or extracted by analyzing the smartphone’s video stream.

To improve on this method, Imagimob’s AI software was installed on devices with sensors, batteries, and long-range radios, allowing the farmer to monitor crops using the accelerometer and gyroscope sensors and sending near real-time data over the network.

Retail

TinyML has been adopted by the retail industry to monitor inventories, sending alerts whenever stock is running low and needs to be reordered.

Healthcare

TinyML can also be used in real-time health monitoring equipment, providing patients with improved and more personalized care. For example, hearing aids are battery-powered devices built around low-power microcontrollers, meaning they need minimal resources to function effectively. Using TinyML, researchers were able to reduce the latency of these devices without any loss in performance.

In the future, implementing TinyML devices could spread to all industries, helping businesses manage finances, process invoices, better work with clients, collect and analyze data, and more.

Manufacturing

TinyML can be used to power predictive maintenance tools to help minimize any downtime and the added costs resulting from any equipment being out of action.

Conclusion

TinyML is growing in popularity across various industries that use Internet of Things (IoT) devices. This is because TinyML models can run on microcontrollers to provide specific functions while using very little power.

Despite being a low-power solution, TinyML devices can process data with low latency and without needing an internet connection. This lack of network connection also helps to protect collected data from hackers.

TinyML is already being used effectively by the agriculture, manufacturing, and healthcare industries, and its popularity is predicted to grow further.

Nahla Davies is a software developer and tech writer. Before devoting her work full time to technical writing, she managed — among other intriguing things — to serve as a lead programmer at an Inc. 5,000 experiential branding organization whose clients include Samsung, Time Warner, Netflix, and Sony.

11 Billion TinyML Device Installs, 481 Million 5G-Advanced Devices in 2027 and 35 Other Transformative Technology Stats You Need to Know


ABI Research’s latest whitepaper highlights 37 transformative technology stats you need to know for 2023 and beyond

NEW YORK, Feb. 9, 2023 /PRNewswire/ – For the past three years, business leaders and organizations have faced an unyielding procession of challenges. As we usher in 2023, many of those challenges persist, and new ones are emerging. Yet, as unwavering as the challenges have been, technology and innovation have proven to be just as resilient.


In its new whitepaper, 37 Transformative Technology Stats You Need to Know for 2023, global technology intelligence firm ABI Research has identified and highlighted the most impactful forecasts that illuminate the direction in which digital transformation is truly heading.

“From among the many millions of data points ABI Research creates each year, we have focused on the most enlightening stats that will shape the year ahead. The rapid rise of 5G-Advanced, IoT innovation in the supply chain, a big bet on TinyML, and the emergence of the enterprise and industrial metaverse are just some of the many changes on the horizon that are indicative of a more connected, technology-driven world,” Stuart Carlaw, Chief Research Officer at ABI Research explains.

Some stats highlighted in the whitepaper include:

5G Devices, Smartphones, & Wearables:

The migration to 5G-Advanced will reshape the mobile devices and wearables market, as 481.92 million devices with 5G-Advanced will ship in 2027. “The 5G industry has seen phenomenal growth over the past few years, but with the evolution toward 5G-Advanced, there is still much to be done to fully expose the value of the 5G ecosystem and realize its full potential. Many new 5G opportunities have already been unlocked, notably better coverage and system performance, lower latency, further reductions in device power consumption and increases in reliability and efficiency,” explains ABI Research Director David McQueen.

AI & Machine Learning:

TinyML device installs will increase from nearly 2 billion in 2022 to over 11 billion in 2027. “A common theme of the TinyML market is the idea to bring Machine Learning (ML) to everyone, or more accurately, to take ML everywhere. TinyML is most useful in environmental sensors and there are many possible use cases. Indeed, consider any kind of sensorial data from the environment that can be attended to and there is probably an ML model that can be applied to that data. Sound and ambient sensors remain the most prominent environmental sensors and will drive the huge increase of installations of TinyML devices,” forecasts ABI Research Principal Analyst Lian Jye Su.


IoT Markets:

Total active IoT device shipments for the supply chain to increase to 613.5 million in 2027. “No market has attracted as much IoT attention in the past 3 years than the supply chain. In that time, the edge device piece of the puzzle has matured enormously to create a web of device form factors and connectivity technologies for a very large range of use cases, from cargo monitoring and returnable transport asset tracking to container and railcar tracking. Over the coming years, supply chain IoT will continue in two directions: growing scale for the existing web of devices, and continuing innovation from a form factor and technology perspective which will allow more types of assets to be tracked at a lower cost point - driving the feedback loop toward more scale,” ABI Research Analyst Tancred Taylor says.

Metaverse Markets & Technologies:

Led by enterprise and industrial markets, metaverse content and service revenue will exceed US$76 billion by 2030. According to ABI Research Principal Analyst Michael Inouye, “The higher growth opportunities, spurred by further developments in key enabling markets like XR, 5G Advanced, and cloud computing will still be ramping up to a more fully realized metaverse. Given the long road to the metaverse and the stronger starting position and momentum within the enterprise and industrial markets, this segment of the market is expected to remain the larger metaverse opportunity throughout the forecast window to 2030.”

“Nobody has a crystal ball, but we can say with relative certainty that the challenging climate will persist well into 2023. These statistics should provide insights and actionable data needed to chart a successful course in 2023 and beyond,” Carlaw concludes.

Download the whitepaper, 37 Transformative Technology Stats You Need to Know for 2023, to learn more.

About ABI Research

ABI Research is a global technology intelligence firm delivering actionable research and strategic guidance to technology leaders, innovators, and decision makers around the world. Our research focuses on the transformative technologies that are dramatically reshaping industries, economies, and workforces today.


For more information about ABI Research’s services, contact us at +1.516.624.2500 in the Americas, +44.203.326.0140 in Europe, +65.6592.0290 in Asia-Pacific, or visit www.abiresearch.com.

Contact Info:

Global

Deborah Petrara

Tel: +1.516.624.2558

pr@abiresearch.com


View original content to download multimedia: https://www.prnewswire.com/news-releases/11-billion-tinyml-device-installs-481-million-5g-advanced-devices-in-2027-and-35-other-transformative-technology-stats-you-need-to-know-301742527.html

SOURCE ABI Research

Using sensor fusion and tinyML to detect fires



The damage and destruction caused by structure fires to both people and the property itself is immense, which is why accurate and reliable fire detection systems are a must-have. As Nekhil R. notes in his write-up, the current rule-based algorithms and simple sensor configurations can lead to reduced accuracy, thus showing a need for more robust systems.

This led Nekhil to devise a solution that leverages sensor fusion and machine learning to make better predictions about the presence of flames. His project began with collecting environmental data consisting of temperature, humidity, and pressure readings from his Arduino Nano 33 BLE Sense’s onboard sensor suite. He labeled each sample as either Fire or No Fire in the Edge Impulse Studio, which was used to generate spectral features from the three time-series sensor values. This information was then passed to a Keras neural network configured to perform classification, resulting in an overall accuracy of 92.86% on real-world test samples.
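The write-up does not reproduce the project’s exact model definition, but the sketch below shows the general shape of such a pipeline: a small fully connected Keras classifier trained on spectral features extracted from windows of temperature, humidity, and pressure readings. The feature count, layer sizes, and randomly generated training data are placeholders rather than the project’s actual configuration.

```python
# Minimal sketch of a fire / no-fire classifier over pre-extracted spectral
# features (e.g. from an Edge Impulse spectral-analysis block). All values
# below are illustrative placeholders.
import numpy as np
import tensorflow as tf

NUM_FEATURES = 33  # hypothetical number of spectral features per window
X = np.random.rand(500, NUM_FEATURES).astype(np.float32)  # placeholder features
y = np.random.randint(0, 2, size=500)                      # 0 = No Fire, 1 = Fire

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(NUM_FEATURES,)),
    tf.keras.layers.Dense(20, activation="relu"),
    tf.keras.layers.Dense(10, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=30, batch_size=32, validation_split=0.2, verbose=0)
```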


Confident in the now-trained model, Nekhil deployed it as an Arduino library back to the Nano 33 BLE Sense. When a fire is detected, the Nano sends a message over its UART pins to an awaiting ESP8266-01 board, and the ESP8266 in turn triggers an IFTTT webhook to alert the user via email.
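For readers unfamiliar with the alerting step, the sketch below shows the shape of an IFTTT Webhooks trigger call. It is written in Python with the requests library purely for clarity; on the ESP8266 itself the equivalent HTTP request is made from firmware, and the event name and key are placeholders for the user’s own Webhooks applet settings.

```python
# Minimal sketch: trigger an IFTTT Webhooks event that sends an alert email.
# EVENT_NAME and IFTTT_KEY are placeholders for the user's own applet settings.
import requests

EVENT_NAME = "fire_detected"     # hypothetical event name
IFTTT_KEY = "YOUR_WEBHOOKS_KEY"  # placeholder key from the Webhooks service

url = f"https://maker.ifttt.com/trigger/{EVENT_NAME}/with/key/{IFTTT_KEY}"
resp = requests.post(url, json={"value1": "Fire detected by Nano 33 BLE Sense"})
resp.raise_for_status()
```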

If you would like to learn more about the construction of this fire recognition system, plenty of details can be found on the project page.


The post Using sensor fusion and tinyML to detect fires appeared first on Arduino Blog.


STMicro Adds TinyML Developer Cloud



Parallel to its existing offline version, STMicro has put its STM32Cube.AI machine learning development environment into the cloud, complete with cloud-accessible ST MCU boards for testing.

Both versions generate optimized C code for STM32 microcontrollers from TensorFlow, PyTorch, or ONNX files. The developer cloud version uses the same core tools as the downloadable version, but adds an interface to ST’s GitHub model zoo and the ability to remotely run models on cloud-connected ST boards in order to test performance on different hardware.

“[We want] to address a new category of users: the AI community, especially data scientists and AI developers that are used to developing on online services and platforms,” Vincent Richard, AI product marketing manager at STMicroelectronics, told EE Times. “That’s our aim with the developer cloud…there is no download for the user, they go straight to the interface and start developing and testing.”

ST does not expect users to migrate from the offline version to the cloud version, since the downloadable/installable version of STM32Cube.AI is heavily adapted for embedded developers who are already using ST’s development environment for other tasks, such as defining peripherals. Data scientists and many other potential users in the AI community use a “different world” of tools, Richard said.

“We want them to be closer to the hardware, and the way to do that is to adapt our tools to their way of working,” he added.

ST’s GitHub model zoo currently includes example models optimized for STM32 MCUs, for human motion sensing, image classification, object detection, and audio event detection. Developers can use these models as a starting point to develop their own applications.

The new board farm allows users to remotely measure the performance of optimized models directly on different STM32 MCUs.

“No need to buy a bunch of STM32 boards to test AI, they can do it remotely thanks to code that is running physically on our ST board farms,” Richard said. “They can get the real latency and memory footprint measurements for inference on different boards.”

The board farm will start with 10 boards available for each STM32 part number, which will increase in the coming months, according to Richard. These boards are located in several places, separate from ST infrastructure, to ensure a stable and secure service.

Optimized code

Tools in STM32Cube.AI’s toolbox include a graph optimizer, which converts TensorFlow Lite for Microcontrollers, PyTorch, or ONNX files to optimized C code based on STM32 libraries. Graphs are rewritten to optimize for memory footprint or latency, or for some balance of the two that can be controlled by the user.
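On the model side, such a code generator is typically fed a standard exchange file. The sketch below shows one common way to produce an ONNX file from a PyTorch model with torch.onnx.export; the tiny model, input shape, and file name are placeholders, and the STM32Cube.AI import step itself is not shown here.

```python
# Minimal sketch: export a small PyTorch model to ONNX so it can be consumed
# by a code generator such as STM32Cube.AI. The model is a placeholder.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(3, 16), nn.ReLU(), nn.Linear(16, 2))
model.eval()

dummy_input = torch.randn(1, 3)  # batch of 1, three input features
torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",
    input_names=["sensor_features"],
    output_names=["class_scores"],
    opset_version=13,
)
```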

There is also a memory optimizer that shows graphically how much memory (Flash and RAM) each layer is using. Individual layers that are too large for memory may be split into two steps, for example.

Previous MLPerf Tiny results showed performance advantages for ST’s inference engine, an optimized version of Arm’s CMSIS-NN, versus standard CMSIS-NN scores.

The STM32Cube.AI developer cloud will also support ST’s forthcoming microcontroller with an in-house developed NPU, the STM32N6, when it becomes available.
