Accelerating AI with the Raspberry Pi Pico’s dual cores

I’ve been a fan of the RP2040 chip powering the Pico since it launched, and we’re even using it in some upcoming products, but I’d never used one of its most intriguing features: the second core. It’s not common to have two cores in a microcontroller, especially a seventy-cent Cortex-M0+, and most of the system software for that level of CPU doesn’t have standardized support for threads or other typical ways to get parallelized performance from your algorithms. I still wanted to see if I could get a performance boost on compute-intensive tasks like machine learning though, so I dug into the pico_multicore library, which provides low-level access to the second core.
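The API is pleasantly small. Here’s a minimal sketch (not my actual port code) of the basic pattern: launch a function on core 1 with multicore_launch_core1(), then pass values back and forth over the hardware inter-core FIFOs.

```c
#include <stdio.h>
#include "pico/stdlib.h"
#include "pico/multicore.h"

// Entry point for the second core: wait for a value from core 0,
// square it, and send the result back over the inter-core FIFO.
static void core1_entry(void) {
    while (true) {
        uint32_t n = multicore_fifo_pop_blocking();
        multicore_fifo_push_blocking(n * n);
    }
}

int main(void) {
    stdio_init_all();
    multicore_launch_core1(core1_entry);  // start core 1 running core1_entry

    for (uint32_t i = 0; i < 10; ++i) {
        multicore_fifo_push_blocking(i);                  // hand work to core 1
        uint32_t squared = multicore_fifo_pop_blocking(); // wait for the answer
        printf("%u^2 = %u\n", (unsigned)i, (unsigned)squared);
    }
    return 0;
}
```

The catch is that this particular round trip is still serial: the win only comes when core 0 does useful work of its own between the push and the pop, which is exactly where I tripped up.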

The summary is that I was able to get approximately a 1.9x speed boost by breaking a convolution function into two halves and running one on each processor. The longer story is that I actually implemented most of this several months ago, but got stuck thanks to a silly mistake: I was accidentally serializing the work by calling functions in the wrong order! I was in the process of preparing a bug report for the RPi team, who had kindly agreed to take a look, when I realized my mistake. Another win for rubber-duck debugging!
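For the curious, here’s a simplified sketch of the structure that ended up working (the real code is in the CMSIS-NN file mentioned below; conv_rows and the globals here are hypothetical stand-ins for illustration). The key is that core 0 has to kick off core 1’s half before starting its own, and only wait on the FIFO afterwards.

```c
#include "pico/multicore.h"

// Hypothetical stand-in for the convolution inner loop: computes
// output rows [row_start, row_end) of the layer.
static void conv_rows(int row_start, int row_end) {
    for (int row = row_start; row < row_end; ++row) {
        // ... multiply-accumulate the filter against the input for this row ...
    }
}

static volatile int g_split;  // row where core 1's share of the output ends

// Persistent worker loop for core 1, launched once at startup.
static void core1_worker(void) {
    while (true) {
        multicore_fifo_pop_blocking();        // block until core 0 sends "go"
        conv_rows(0, g_split);                // compute the top half
        multicore_fifo_push_blocking(1);      // tell core 0 we're done
    }
}

static void conv_dual_core(int num_rows) {
    g_split = num_rows / 2;

    multicore_fifo_push_blocking(1);          // 1: kick off core 1's half
    conv_rows(g_split, num_rows);             // 2: compute the bottom half here
    multicore_fifo_pop_blocking();            // 3: wait for core 1 to finish
    // Swapping steps 1 and 2, or waiting at step 3 before doing step 2,
    // serializes the two halves: the kind of ordering mistake that stalled
    // me for months.
}

int main(void) {
    multicore_launch_core1(core1_worker);     // start the worker once
    conv_dual_core(64);                       // e.g. a 64-row output layer
}
```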

If you’re interested in the details, the implementation is in my custom version of an Arm CMSIS-NN source file. I actually ended up putting together an updated version of the whole TFLite Micro library for the Pico to take advantage of this. There’s another long story behind that too: I did the first TFLM port for the Pico in my own time, and since nobody at Google or Raspberry Pi is actively working on it, it’s remained stuck at that original version. I can’t make the commitment to be a proper maintainer of this new version either; it will be on a best-effort basis, so bugs and PRs may not be addressed. I have at least tried to make it easier to update, with a sync/sync_with_upstream.sh script that currently works and is designed to be as robust to future changes as I can make it.

If you want more information on the potential speedup, I’ve included some benchmarking results. The lines to compare are the CONV2D results. For example, the first convolution layer takes 46ms without the optimizations and 24ms when run on both cores, roughly the 1.9x improvement you’d hope for. There are other layers in the benchmark that aren’t optimized, like depthwise convolution, so the overall time for running the person detection model once drops from 782ms to 599ms, about a 1.3x end-to-end speedup. This is already a nice boost, and in the future we could do something similar for the depthwise convolution to increase the speed even more.

Thanks to the Raspberry Pi team for building a lovely little chip! Everything from the PIOs to software overclocking and dual cores makes it a fascinating system to work with, and I look forward to diving in even deeper.

Explore the dark side of Silicon Valley with Red Team Blues

It’s weird to live in a place that so many people have heard of, but so few people know. Silicon Valley is so full of charismatic people spinning whatever stories serve their ends that it’s hard for voices with fewer ulterior motives to get airtime. Even the opponents of big tech have an incentive to mythologize it; it’s the only way to break through the noise. It’s very rare to find someone with deep experience of our strange world who can paint a picture I recognize.

That’s a big reason I’ve always loved Cory Doctorow’s writing. He knows the technology industry and the people who inhabit it inside and out, but he’s not interested in either hagiography or demonization. He’s always been able to pinpoint the little details that make this world simultaneously relatable and deeply weird, like this observation about wealth from his latest book:

I almost named the figure, but I did not. My extended network of OG Silicon Valley types included paupers and billionaires, and long ago, we all figured out that the best way to stay on friendly terms was to keep the figures out of it.

Red Team Blues is a fast-paced crime novel in the best traditions of Hammett, but taking inspiration from the streets of 2020s San Francisco instead of the 1920s. His eye for detail adds authenticity, with his forensic-accountant protagonist relying more on social media carelessness than implausible hacking to gather the information he needs. There’s a thread of anger running through the story too, at the machinery of tax evasion that lies behind so many industry facades and contributes to the homelessness that is the mirror image of all the partying billionaires. He’s unsparing in his assessment of cryptocurrencies, seeing their success as driven by money laundering for some of the worst people in the world.

I love having an accountant as the center of a thriller, and Cory’s hero Martin Hench is a lot of fun to spend time with. The plot itself is a rollercoaster ride through cryptography, drug gangs, wildfire ghost towns, and ex-Soviet grifters that will keep you turning the pages. I highly recommend picking up a copy; it’s enjoyable and thought-provoking at the same time.

To give you one last taste, here’s his perfect pen portrait of someone I’ve met a few too many times:

I’ve known a lot of hustlers, aggro types who cut corners and bull their way through the consequences. It’s a type, out here. Move fast and break things. Don’t ask permission; beg forgiveness. But most of those people, they know they’re doing it. You can manage them, tack around them, factor them into your plans.

The ones who get high on their own supply, though? There’s no factoring them in. Far as they’re concerned, they’re the only player characters in the game and everyone else is an NPC, a literal nobody.

How can AI help everyday life?

Video of an AI-controlled lamp

There’s a lot of hype around AI these days, and it’s easy to believe that it’s just another tech world fad like the Metaverse or crypto. I think that AI is different though, because the real-world impact doesn’t require a leap of faith to imagine. For example, I’ve had a long-time dream of being able to look at a lamp, say “On”, and have the light come on. I want to be able to just ask everyday objects for help and have them do something intelligent.

To make it easier to understand what I’m talking about, we’ve built a small box that knows when you’re looking at it and can make sense of spoken language, and we’ve set it up to control a lamp. We’ve designed it to work as simply as possible:

  • There’s no wake word like “Alexa” or “Siri”. You trigger the interaction by looking at the lamp, using a Person Sensor to detect that gaze.
  • We don’t require a set order of commands; our AI models pick out what you want from a stream of natural speech.
  • Everything runs locally on the controller box. This means that not only does all your data stay private (it never leaves your home), there’s also no setup needed. You don’t have to download an app, connect to wifi, or even create an account. Plug in the controller and lamp, and it Just Works.
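Under the hood, the control loop is conceptually very simple. Here’s a hypothetical sketch of what it looks like; the helper functions, the GPIO pin, and the two-word command set are all invented for illustration, standing in for the real box’s Person Sensor reads over I2C and its local speech models.

```c
#include <stdbool.h>
#include "pico/stdlib.h"
#include "hardware/gpio.h"

#define LAMP_PIN 15  // assumption: lamp relay driven from GPIO 15

// Hypothetical stand-in: the real box reads the Person Sensor over I2C
// and checks whether a detected face is looking at the camera.
static bool someone_is_looking(void) {
    return false;  // replace with a real Person Sensor read
}

// Hypothetical stand-in: the real box runs a local speech model and
// reports whether the word appeared in the last few seconds of audio.
static bool heard_word(const char *word) {
    (void)word;
    return false;  // replace with the speech model's output
}

int main(void) {
    stdio_init_all();
    gpio_init(LAMP_PIN);
    gpio_set_dir(LAMP_PIN, GPIO_OUT);

    while (true) {
        // Gaze replaces the wake word: only act on speech while
        // someone is actually looking at the lamp.
        if (someone_is_looking()) {
            if (heard_word("on"))  gpio_put(LAMP_PIN, true);
            if (heard_word("off")) gpio_put(LAMP_PIN, false);
        }
        sleep_ms(100);  // poll at roughly 10Hz
    }
}
```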

All of this is only possible because of the new wave of transformer models that are sweeping the world. We’re going to see a massive number of new capabilities like this enter our everyday lives, not in years but in months. If you’re interested in how this kind of local, private intelligence (with no server costs!) could work with your products, I’d love to chat.