Why the Future of Machine Learning is Tiny « Pete Warden's blog


51 bookmarks. First posted by moleitau june 2018.


I’m convinced that machine learning can run on tiny, low-power chips, and that this combination will solve a massive number of problems we have no solutions for right now. That’s what I’ll be talking about at CogX, and in this post I’ll explain more about why I’m so sure.
hardware  iot  machinelearning 
5 weeks ago by dunc
Mostly about inference (rather than training), but lots of useful back-of-the-envelope arguments for the particular significance of edge computing in ML

The overall thing to take away from these figures is that processors and sensors can scale their power usage down to microwatt ranges (for example Qualcomm’s Glance vision chip, even energy-harvesting CCDs, or microphones that consume just hundreds of microwatts), but displays and especially radios are constrained to much higher consumption, with even low-power wifi and bluetooth using tens of milliwatts when active. The physics of moving data around just seems to require a lot of energy. There seems to be a rule that the energy an operation takes is proportional to how far you have to send the bits. CPUs and sensors send bits a few millimeters, which is cheap; radio sends them meters or more, which is expensive. I don’t see this relationship fundamentally changing, even as technology improves overall. In fact, I expect the relative gap between the cost of compute and radio to get even wider, because I see more opportunities to reduce computing power usage.

A few years ago I talked to some engineers working on micro-satellites capturing imagery. Their problem was that they were essentially using phone cameras, which are capable of capturing HD video, but they only had a small amount of memory on the satellite to store the results, and only a limited amount of bandwidth every few hours to download to the base stations on Earth. I realized that we face the same problem almost everywhere we deploy sensors. Even in-home cameras are limited by the bandwidth of wifi and broadband connections. My favorite example of this was a friend whose December ISP usage was dramatically higher than the rest of the year, and when he drilled down it was because his blinking Christmas lights caused the video stream compression ratio to drop dramatically, since so many more frames had differences!
machinelearning  hardware 
10 weeks ago by mike
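
To make the compute-versus-radio gap in the excerpt above concrete, here is a minimal back-of-the-envelope sketch in Python. Every figure in it is an illustrative assumption (the op count matches the MobileNetV2 number cited later in the post; the per-byte radio cost and frame size are not from the post):

    # Back-of-the-envelope comparison: energy to classify an image on-device
    # versus energy to radio the raw image off the device.
    # Every number here is an illustrative assumption, not a measurement.

    OPS_PER_INFERENCE = 22_000_000    # a MobileNetV2-class model, ~22M ops per frame
    ENERGY_PER_OP_J = 5e-12           # ~5 picojoules per arithmetic op on an efficient DSP

    IMAGE_BYTES = 100_000             # assume a compressed ~100 KB frame
    RADIO_ENERGY_PER_BYTE_J = 1e-6    # assume ~1 microjoule per byte for low-power wireless

    compute_energy_j = OPS_PER_INFERENCE * ENERGY_PER_OP_J    # ~110 microjoules
    radio_energy_j = IMAGE_BYTES * RADIO_ENERGY_PER_BYTE_J    # ~100 millijoules

    print(f"Classify on-device: {compute_energy_j * 1e6:.0f} microjoules")
    print(f"Send the raw frame: {radio_energy_j * 1e3:.0f} millijoules")
    print(f"Radio costs roughly {radio_energy_j / compute_energy_j:.0f}x more energy")

Even with generous assumptions for the radio, shipping a raw frame off the device costs several hundred times more energy than classifying it locally, which is the core of the argument for doing inference at the edge.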
Because there’s strong overlap between what tiny computers can do and what machine learning needs.
machinelearning  technology  ai 
july 2018 by danielbachhuber
"CPUs and sensors use almost no power, radios and displays use lots"
from twitter_favs
july 2018 by mellowfish
Photo by Kevin Steinhardt. When Azeem asked me to give a talk at CogX, he asked me to focus on just a single point that I wanted the audience to take away. A few years ago my priority would have been convincing people that deep learning was a real revolution, not a fad, but there…
june 2018 by coarsesand
Instead I chose to speak about another trend that I am just as certain about, and will have just as much impact, but which isn’t nearly as well known. I’m convinced that machine learning can run on tiny, low-power chips, and that this combination will solve a massive number of problems we have no solutions for right now. That’s what I’ll be talking about at CogX, and in this post I’ll explain more about why I’m so sure.
ml  scale  compute  blog  toread 
june 2018 by cjitlal
A few years ago I talked to some engineers working on micro-satellites capturing imagery. Their problem was that they were essentially using phone cameras, which are capable of capturing HD video, but they only had a small amount of memory on the satellite to store the results, and only a limited amount of bandwidth every few hours to download to the base stations on Earth. I realized that we face the same problem almost everywhere we deploy sensors. Even in-home cameras are limited by the bandw...
ml  predictions 
june 2018 by elrob
via Pocket - Why the Future of Machine Learning is Tiny - Added June 11, 2018 at 10:09AM
june 2018 by mikele
When Azeem asked me to give a talk at CogX, he asked me to focus on just a single point that I wanted the audience to take away.
ai 
june 2018 by marshallk
Why the Future of Machine Learning is Tiny:
june 2018 by timk
In the last few years it’s suddenly become possible to take noisy signals like images, audio, or accelerometers and extract meaning from them by using neural networks. Because we can run these networks on microcontrollers, and sensors themselves use little power, it becomes possible to interpret much more of the sensor data we’re currently ignoring.

For example, I want to see almost every device have a simple voice interface. By understanding a small vocabulary, and maybe using an image sensor to do gaze detection, we should be able to control almost anything in our environment without needing to reach it to press a button or use a phone app. I want to see a voice interface component that’s less than fifty cents that runs on a coin battery for a year, and I believe it’s very possible with the technology we have right now.
misc  !publish 
june 2018 by zephyr777
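
A rough feasibility check on the "coin battery for a year" claim in the excerpt above. The 2,500-joule coin-cell figure appears later in the post; the 5 picojoules-per-op figure is an assumption in line with the DSP numbers Warden cites:

    # What does "a coin cell for a year" buy in always-on compute?
    # 2,500 J coin cell is the post's figure; 5 pJ/op is an assumed efficiency.

    COIN_CELL_JOULES = 2_500
    SECONDS_PER_YEAR = 365 * 24 * 3600
    ENERGY_PER_OP_J = 5e-12

    power_budget_w = COIN_CELL_JOULES / SECONDS_PER_YEAR    # continuous draw allowed
    ops_per_second = power_budget_w / ENERGY_PER_OP_J       # arithmetic ops that draw can buy

    print(f"Continuous power budget: {power_budget_w * 1e6:.0f} microwatts")
    print(f"Sustainable compute:     {ops_per_second / 1e6:.0f} million ops per second")

That works out to roughly 80 microwatts of continuous draw, or on the order of 15 million ops per second, before counting the microphone or any duty-cycling overhead. Under these assumptions a small-vocabulary keyword-spotting network fits inside that envelope, which is why the fifty-cent, year-on-a-coin-cell voice component looks plausible.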
I don’t know the details of what the future will bring, but I know ML on tiny, cheap battery powered chips is coming and will open the door for some amazing new applications!
ai  iot  machinelearning 
june 2018 by ssorc
This is great (via newsletter) - Why the future of machine learning is tiny:
from twitter_favs
june 2018 by alpinegizmo
Tiny machine learning rather than "AI": the early-1990s use of fuzzy logic is a better analogue.
iot  semiconductors  technology  machinelearning 
june 2018 by renaissancechambara
Pete Warden is thinking small - in both size and energy consumption terms:
I spend a lot of time thinking about picojoules per op. This is a metric for how much energy a single arithmetic operation on a CPU consumes, and it’s useful because if I know how many operations a given neural network takes to run once, I can get a rough estimate for how much power it will consume. For example, the MobileNetV2 image classification network takes 22 million ops (each multiply-add is two ops) in its smallest configuration. If I know that a particular system takes 5 picojoules to execute a single op, then it will take (5 picojoules * 22,000,000) = 110 microjoules of energy to execute. If we’re analyzing one frame per second, then that’s only 110 microwatts, which a coin battery could sustain continuously for nearly a year. These numbers are well within what’s possible with DSPs available now, and I’m hopeful we’ll see the efficiency continue to increase. That means that the energy cost of running existing neural networks on current hardware is already well within the budget of an always-on battery-powered device, and it’s likely to improve even more as both neural network model architectures and hardware improve.

In the last few years it's suddenly become possible to take noisy signals like images, audio, or accelerometers and extract meaning from them, by using neural networks. Because we can run these networks on microcontrollers, and sensors themselves use little power, it becomes possible to interpret much more of the sensor data we’re currently ignoring. For example, I want to see almost every device have a simple voice interface. By understanding a small vocabulary, and maybe using an image sensor to do gaze detection, we should be able to control almost anything in our environment without needing to reach it to press a button or use a phone app. I want to see a voice interface component that’s less than fifty cents that runs on a coin battery for a year, and I believe it’s very possible with the technology we have right now.

As another example, I’d love to have a tiny battery-powered image sensor that I could program to look out for things like particular crop pests or weeds, and send an alert when one was spotted. These could be scattered around fields and guide interventions like weeding or pesticides in a much more environmentally friendly way.
Machinelearning  power 
june 2018 by charlesarthur
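
The picojoules-per-op estimate quoted above can be written out as a short calculation; the only inputs are the figures Warden cites (22 million ops per inference, 5 pJ/op, a 2,500-joule coin cell):

    # The picojoules-per-op estimate from the quoted paragraph, written out.

    OPS_PER_INFERENCE = 22_000_000    # MobileNetV2, smallest configuration
    ENERGY_PER_OP_J = 5e-12           # 5 picojoules per op
    COIN_CELL_JOULES = 2_500

    energy_per_inference_j = OPS_PER_INFERENCE * ENERGY_PER_OP_J  # joules per frame
    power_at_1_fps_w = energy_per_inference_j * 1                 # one frame per second -> watts

    lifetime_days = COIN_CELL_JOULES / power_at_1_fps_w / 86_400

    print(f"Energy per inference: {energy_per_inference_j * 1e6:.0f} microjoules")  # ~110
    print(f"Power at 1 frame/s:   {power_at_1_fps_w * 1e6:.0f} microwatts")         # ~110
    print(f"Coin-cell lifetime:   {lifetime_days:.0f} days")                        # ~263, nearly a year

Running the numbers confirms the paragraph: about 110 microjoules per frame, about 110 microwatts at one frame per second, and roughly 260 days on a 2,500-joule coin cell.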
When Azeem asked me to give a talk at CogX, he asked me to focus on just a single point that I wanted the audience to take away. via Pocket
Pocket 
june 2018 by LaptopHeaven
RT : Why the Future of Machine Learning is Tiny:
from twitter
june 2018 by davidvc
A display might use 400 milliwatts.
Active cell radio might use 800 milliwatts.
Bluetooth might use 100 milliwatts.
Accelerometer is 21 milliwatts.
Gyroscope is 130 milliwatts.
GPS is 176 milliwatts.
A microcontroller itself might only use a milliwatt or even less, but you can see that peripherals can easily require much more. A coin battery might have 2,500 Joules of energy to offer, so even something drawing at one milliwatt will only last about a month. Of course most current products use duty cycling and sleeping to avoid being constantly on, but you can see what a tight budget there is even then.
ml  ai  nn  trend 
june 2018 by euler
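
A sketch of the duty-cycling arithmetic behind that budget. The Bluetooth figure comes from the list above; the sleep/active powers and the duty-cycle fractions are assumptions chosen only to illustrate the shape of the calculation:

    # Rough duty-cycling estimate for a coin-cell device with a radio.
    # Bluetooth power is from the list above; everything else is assumed.

    COIN_CELL_JOULES = 2_500
    SECONDS_PER_DAY = 86_400

    mcu_sleep_w = 10e-6       # assume ~10 microwatts in deep sleep
    mcu_active_w = 1e-3       # ~1 milliwatt with the microcontroller awake
    radio_active_w = 100e-3   # Bluetooth at ~100 milliwatts when transmitting

    active_fraction = 0.01    # assume awake 1% of the time
    radio_fraction = 0.001    # assume transmitting 0.1% of the time

    avg_power_w = (mcu_sleep_w * (1 - active_fraction)
                   + mcu_active_w * active_fraction
                   + radio_active_w * radio_fraction)

    lifetime_days = COIN_CELL_JOULES / avg_power_w / SECONDS_PER_DAY

    print(f"Average draw:       {avg_power_w * 1e6:.0f} microwatts")  # ~120
    print(f"Estimated lifetime: {lifetime_days:.0f} days")            # ~240

Even transmitting only a thousandth of the time, the radio dominates the average draw under these assumptions, which is exactly the tight budget the excerpt describes.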
Apps? Where we’re going, we don’t need ... apps. That future kitchen is looking awfully ‘clean’ - no buttons, knobs or sliders. Just invisible microphones waiting on your every instruction. “Because we can run these networks on microcontrollers, and sensors themselves use little power, it becomes possible to interpret much more of the sensor data we’re currently ignoring. For example, I want to see almost every device have a simple voice interface. By understanding a small vocabulary, and maybe using an image sensor to do gaze detection, we should be able to control almost anything in our environment without needing to reach it to press a button or use a phone app.”
ifttt  facebook 
june 2018 by alexmel
Why the Future of Machine Learning is Tiny: on machine learning with small devices:
from twitter_favs
june 2018 by tjweir
Why the Future of Machine Learning is Tiny (via ). A very insightful prediction: the future of machine learning lies in low-power MCUs plus sensors.
from twitter
june 2018 by jcxia43
Hmm... so... (via ). So models could listen out for ad related keyw…
IoT  NN  ML  embeddedAI  AI 
june 2018 by psychemedia
Why the Future of Machine Learning is Tiny:
data-science  learn 
june 2018 by fototropik
Why the Future of Machine Learning is Tiny:
from twitter_favs
june 2018 by moleitau