A few days ago I got an email from a journalist asking about the Starshot project. Of course he was looking for my much more famous namesake, Pete Worden, but I've been fascinated by the effort too. Its whole premise is that we'll soon be able to miniaturize space probes down to a few grams and have them function on tiny amounts of power. Over the past few years I've come to realize that's the future of computing.
Imagine a self-contained system that costs a few cents, is only a couple of millimeters wide, and includes its own battery, processor, and basic CCD image sensor. Using modern deep learning techniques, you could train it to recognize crop pests or diseases on leaves and then scatter a few thousand across a field. Or sprinkle them through a jungle to help spot endangered wildlife. They could be spread over our bridges to spot corrosion before it gets started, or used for any of the Semantic Sensor applications I've talked about before.
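To give a flavor of what that training flow might look like, here's a rough sketch using today's TensorFlow and TensorFlow Lite tools. It's purely my own illustration, not a recipe for these devices: the class count, image size, and layer sizes are all placeholder assumptions, and actually fitting a model onto hardware this small would take far more aggressive optimization than this.

```python
# A minimal sketch of a tiny vision classifier: a few thousand parameters
# trained on low-resolution greyscale patches, then converted to a
# TensorFlow Lite flatbuffer with quantization to shrink the footprint.
import tensorflow as tf

NUM_CLASSES = 4          # e.g. healthy leaf plus three pest/disease types (illustrative)
IMAGE_SIZE = (32, 32)    # low-res input keeps the model within a tiny RAM budget

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=IMAGE_SIZE + (1,)),   # single-channel sensor data
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_images, train_labels, epochs=10)  # training data not shown here

# Dynamic-range quantization cuts weight storage roughly 4x, which matters
# when flash is measured in kilobytes rather than gigabytes.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
with open("pest_classifier.tflite", "wb") as f:
    f.write(converter.convert())
```

Even a toy model like this ends up only a few tens of kilobytes after conversion, which is why I think recognition on gram-scale hardware is within reach.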
I know how useful these systems will be once they exist, but there are some major engineering challenges to solve before we get there. That's why I'm excited to be going to the Embedded Vision Summit in a couple of weeks. Jeff Bier has gathered a fantastic group of developers and industry leaders who are working on making this future happen. We'll also have a strong presence from the TensorFlow team, to show how important embedded devices are to us. Jeff Dean will be keynoting, and I'll be discussing the nitty-gritty of using the framework on tiny devices.
If you're intrigued by the idea of these "nano-computers" and want to find out more (or, even better, if you're already working on them like several folks I know!), I highly recommend joining me at the Summit in Santa Clara, May 2nd to 4th.