Why the Future of Machine Learning is Tiny « Pete Warden's blog


41 bookmarks. First posted by moleitau 14 days ago.


Photo by Kevin Steinhardt. When Azeem asked me to give a talk at CogX, he asked me to focus on just a single point that I wanted the audience to take away. A…
from instapaper
yesterday by matttrent
Instead I chose to speak about another trend that I am just as certain about, and will have just as much impact, but which isn’t nearly as well known. I’m convinced that machine learning can run on tiny, low-power chips, and that this combination will solve a massive number of problems we have no solutions for right now. That’s what I’ll be talking about at CogX, and in this post I’ll explain more about why I’m so sure.
ml  scale  compute  blog  toread 
7 days ago by cjitlal
7 days ago by dylan
A few years ago I talked to some engineers working on micro-satellites capturing imagery. Their problem was that they were essentially using phone cameras, which are capable of capturing HD video, but they only had a small amount of memory on the satellite to store the results, and only a limited amount of bandwidth every few hours to download to the base stations on Earth. I realized that we face the same problem almost everywhere we deploy sensors. Even in-home cameras are limited by the bandw...
ml  predictions 
10 days ago by elrob
via Pocket - Why the Future of Machine Learning is Tiny - Added June 11, 2018 at 10:09AM
11 days ago by mikele
When Azeem asked me to give a talk at CogX, he asked me to focus on just a single point that I wanted the audience to take away.
ai 
11 days ago by marshallk
Why the Future of Machine Learning is Tiny:
12 days ago by timk
In the last few years it's suddenly become possible to take noisy signals like images, audio, or accelerometers and extract meaning from them, by using neural networks. Because we can run these networks on microcontrollers, and sensors themselves use little power, it becomes possible to interpret much more of the sensor data we’re currently ignoring.

For example, I want to see almost every device have a simple voice interface. By understanding a small vocabulary, and maybe using an image sensor to do gaze detection, we should be able to control almost anything in our environment without needing to reach it to press a button or use a phone app. I want to see a voice interface component that’s less than fifty cents that runs on a coin battery for a year, and I believe it’s very possible with the technology we have right now.
misc  !publish 
12 days ago by zephyr777
I don’t know the details of what the future will bring, but I know ML on tiny, cheap battery powered chips is coming and will open the door for some amazing new applications!
ai  iot  machinelearning 
13 days ago by ssorc
This is great (via newsletter) - Why the future of machine learning is tiny:
from twitter_favs
13 days ago by alpinegizmo
- and not AI - the early 1990s use of fuzzy logic is a better analogue
iot  semiconductors  technology  machinelearning 
13 days ago by renaissancechambara
13 days ago by bdeskin
Pete Warden is thinking small - in both size and energy consumption terms:
I spend a lot of time thinking about picojoules per op. This is a metric for how much energy a single arithmetic operation on a CPU consumes, and it’s useful because if I know how many operations a given neural network takes to run once, I can get a rough estimate for how much power it will consume. For example, the MobileNetV2 image classification network takes 22 million ops (each multiply-add is two ops) in its smallest configuration. If I know that a particular system takes 5 picojoules to execute a single op, then it will take (5 picojoules * 22,000,000) = 110 microjoules of energy to execute. If we’re analyzing one frame per second, then that’s only 110 microwatts, which a coin battery could sustain continuously for nearly a year. These numbers are well within what’s possible with DSPs available now, and I’m hopeful we’ll see the efficiency continue to increase. That means that the energy cost of running existing neural networks on current hardware is already well within the budget of an always-on battery-powered device, and it’s likely to improve even more as both neural network model architectures and hardware improve.

In the last few years it's suddenly become possible to take noisy signals like images, audio, or accelerometers and extract meaning from them, by using neural networks. Because we can run these networks on microcontrollers, and sensors themselves use little power, it becomes possible to interpret much more of the sensor data we’re currently ignoring. For example, I want to see almost every device have a simple voice interface. By understanding a small vocabulary, and maybe using an image sensor to do gaze detection, we should be able to control almost anything in our environment without needing to reach it to press a button or use a phone app. I want to see a voice interface component that’s less than fifty cents that runs on a coin battery for a year, and I believe it’s very possible with the technology we have right now.

As another example, I’d love to have a tiny battery-powered image sensor that I could program to look out for things like particular crop pests or weeds, and send an alert when one was spotted. These could be scattered around fields and guide interventions like weeding or pesticides in a much more environmentally friendly way.
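The back-of-the-envelope arithmetic in the quote above can be sketched directly, using the figures it cites (5 pJ per op, 22 million ops per inference, a coin cell holding roughly 2,500 J):

```python
# Energy-budget estimate for running a neural network on a low-power chip,
# reproducing the figures quoted above.

PJ_PER_OP = 5                    # picojoules per arithmetic op (quoted assumption)
OPS_PER_INFERENCE = 22_000_000   # smallest MobileNetV2 config; multiply-add = 2 ops
COIN_CELL_JOULES = 2_500         # rough coin-cell energy capacity from the post

# Energy for one inference: picojoules -> joules.
energy_joules = PJ_PER_OP * OPS_PER_INFERENCE * 1e-12
print(f"{energy_joules * 1e6:.0f} microjoules per inference")  # 110

# At one inference per second, average power equals energy per second.
power_watts = energy_joules * 1.0
print(f"{power_watts * 1e6:.0f} microwatts")  # 110

# How long a coin cell sustains that constant draw.
seconds = COIN_CELL_JOULES / power_watts
print(f"{seconds / 86_400:.0f} days")  # 263, i.e. nearly a year
```

The same arithmetic makes it easy to plug in other op counts or per-op energies to see how much headroom a given model has on a given chip.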
Machinelearning  power 
13 days ago by charlesarthur
Pocket 
13 days ago by LaptopHeaven
13 days ago by jpfinley
13 days ago by davidvc
A display might use 400 milliwatts.
Active cell radio might use 800 milliwatts.
Bluetooth might use 100 milliwatts.
Accelerometer is 21 milliwatts.
Gyroscope is 130 milliwatts.
GPS is 176 milliwatts.
A microcontroller itself might only use a milliwatt or even less, but you can see that peripherals can easily require much more. A coin battery might have 2,500 Joules of energy to offer, so even something drawing at one milliwatt will only last about a month. Of course most current products use duty cycling and sleeping to avoid being constantly on, but you can see what a tight budget there is even then.
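The battery-life math behind those figures can be sketched as follows, assuming the ~2,500 J coin cell mentioned above and a constant draw (no duty cycling):

```python
# Battery-life estimate from the component power figures listed above,
# assuming a ~2,500 J coin cell and constant (non-duty-cycled) draw.

COIN_CELL_JOULES = 2_500
SECONDS_PER_DAY = 86_400

def battery_days(milliwatts: float) -> float:
    """Days a 2,500 J coin cell lasts at a constant draw of `milliwatts`."""
    watts = milliwatts / 1_000
    return COIN_CELL_JOULES / watts / SECONDS_PER_DAY

# A microcontroller alone at 1 mW lasts about a month...
print(f"MCU at 1 mW: {battery_days(1):.0f} days")  # 29

# ...but the peripherals blow through the budget far faster.
for name, mw in [("display", 400), ("cell radio", 800), ("Bluetooth", 100),
                 ("accelerometer", 21), ("gyroscope", 130), ("GPS", 176)]:
    print(f"{name} at {mw} mW: {battery_days(mw):.1f} days")
```

Real products stretch these numbers with duty cycling and sleep modes, but the calculation shows why an always-on peripheral at hundreds of milliwatts is simply out of reach for a coin cell.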
ml  ai  nn  trend 
13 days ago by euler
IFTTT  Pocket 
13 days ago by michimaurer
Apps? Where we’re going, we don’t need ... apps. That future kitchen is looking awfully ‘clean’: no buttons, knobs or sliders, just invisible microphones waiting on your every instruction. “Because we can run these networks on microcontrollers, and sensors themselves use little power, it becomes possible to interpret much more of the sensor data we’re currently ignoring. For example, I want to see almost every device have a simple voice interface. By understanding a small vocabulary, and maybe using an image sensor to do gaze detection, we should be able to control almost anything in our environment without needing to reach it to press a button or use a phone app.”
ifttt  facebook 
13 days ago by alexmel
Why the Future of Machine Learning is Tiny: on machine learning with small devices:
from twitter_favs
13 days ago by tjweir
13 days ago by liebo7
Why the Future of Machine Learning is Tiny via A very insightful prediction: the future of machine learning lies in low-power MCUs plus sensors.
from twitter
13 days ago by jcxia43
Hmm... so... (via ). So models could listen out for ad-related keyw…
IoT  NN  ML  embeddedAI  AI 
13 days ago by psychemedia
data-science  learn 
14 days ago by fototropik
14 days ago by danbri
14 days ago by one1zero1one
14 days ago by moleitau
14 days ago by davidorban