Why did Apple buy 3D sensor technology vendor PrimeSense? Before I speculate, let me make a necessary digression as to the nature of the so-called "iWatch".
I don't know what Apple is building. I can guess as well (or as poorly) as the next person, and my guess is that they're going to do to the wearable biometrics market with the iWatch what they did to the tablet market with the iPad. Pre-iPad, tablets were terrible, and a common industry question about Apple's widely anticipated entry was, "What's the killer app?" In fact the killer app was the entire iPad experience. Apple took their time, and when they came out with their own tablet, it was so much better than anything that had come before that it single-handedly defined the category.
The world is full of biometric wearables now, and I've owned two myself: the Jawbone Up and (now) the Fitbit Force -- four if you count my two heart rate monitoring Garmin GPS watches. Most of the current crop (the Up, the Force, Nike's FuelBand, et al.) are glorified pedometers. My guess is that where the Up, Force, and FuelBand each collect one or two streams of data, the iWatch will collect half a dozen or more: motion, location (via tethered iPhone), heart rate, respiration, blood pressure, blood oxygenation... the list goes on. And my guess is that Apple will tie all this data together in a coherent way that makes it incredibly compelling -- and, dare I say, fun -- to track one's health and achieve personal health-related goals. The killer app will be the entire experience.
What would I like the iWatch to look like? In my dreams, it would look something like this incredible design concept from Todd Hamilton. But who knows? In my experience, Apple's new products tend to be less fanciful than our imaginations (unencumbered as they are by reality), yet more useful on a quotidian basis.
In any case, to return to the opening question: why did Apple buy PrimeSense? Remember, PrimeSense is the Israel-based company that Apple bought for a reported $360 million late last year. PrimeSense developed the 3D sensing technology used as the basis of the first version of Microsoft's Kinect system.
When the acquisition was announced, I kept waiting for a coherent explanation of why Apple would buy them. The answers seemed to be that Apple wanted to add Kinect-style capabilities either to AppleTV or to the iPhone. Neither explanation holds much water, managing to seem at once both too obvious and insufficiently useful.
But there is an explanation that makes sense to me, and I haven't heard it anywhere else, so I'll put it forward here. What if Apple bought PrimeSense for the iWatch? One of the limitations of biometric wearables is that, being attached to the body at a single point, they typically don't have a great idea of what the wearer is doing. They can measure motion through space and time, but they're only measuring motion of one body part (usually the wrist). That's incredibly limiting. You'd like your wearable to know that you're running uphill, not downhill; that you're doing dumbbell presses, not bench presses; that you're walking on a treadmill, not on a trail. (I'd just like my Fitbit Force to know that it's not on my wrist as it thinks, but sitting on the floor of the aircraft cabin where it fell without my knowledge the other day, where it proceeded to rack up thousands of phantom steps due to turbulence.)
Remember that one of PrimeSense's key product areas was mobile. Their Capri 3D sensor is claimed to be the smallest in the world. It's nowhere near small enough for a wrist-based wearable, but with Apple's silicon design expertise, one could imagine this changing rapidly, along with power requirements coming down. (Also, PrimeSense first showed Capri over a year ago; who knows how much smaller and more watt-frugal it is by now?)
So why embed a Kinect-style sensor in the iWatch? Because it would give Apple an incredible amount of information about what its wearer was doing. It wouldn't just know you were doing dumbbell presses; it would be able to critique your form. It wouldn't just know you were running uphill; it would know the slope and your stride length. It wouldn't just know you were sitting at your desk; it would know your posture. Combined with the appropriate iPhone- or iPad-based software, a PrimeSense-equipped iWatch could be an all-encompassing activity coach, ready to help you run faster, lift more, even hit a tennis ball better than you ever have. And combining that motion data with biometric sensing capabilities would give Apple unprecedented accuracy in calculating energy usage and body efficiency. If Apple could solve the technical challenges, it would be a home run for the iWatch.
I don't underestimate the challenges in pulling this off. As noted above, the Capri technology would need to be substantially miniaturized, and put on a serious energy diet (whatever power the version shown consumed would almost certainly be too much for a wearable). Further, all the 3D sensing solutions I can think of use fixed sensors -- think of the Kinect sitting on your TV stand. An iWatch-based sensor would need to cope with its own movement through space. All this is no small amount of work -- maybe too much even for Apple in the short term. I don't know.
It's just a theory for now. But it's the best one I've heard.