Does the iPhone X signal the end of the smartphone?

This blog post was inspired by a text from my friend Nicholas Borge: “iPhone X, Apple Watch 3 thoughts?” To which I responded, “iPhone X is ehh, Apple Watch 3 is a game changer.” Nick agreed: “There’s really not much more that can be done with a phone. Watch with no phone required… well that’s a different story. The next great mobile innovation will be making the phone disappear.” Here’s why we think that’s just around the corner.

Let’s take a quick trip back in time to 2013, when Apple acquired PrimeSense, the company behind the depth-sensing technology in the original Microsoft Kinect. PrimeSense created hardware designed to “see” in three dimensions and capture gestures; it used an “RGB camera with depth sensor and infrared projector with a monochrome CMOS sensor which sees the environment not as a flat image, but as dots arranged in a 3D environment.”

At the time, many people thought PrimeSense’s technology might end up in an Apple TV set (something we are still waiting on). Now fast-forward to 2015, when Apple purchased an augmented reality company called Metaio. At the time of the deal, “Metaio executives felt the offer was low, [but] Apple’s vision for the technology convinced them to sell.”

Being careful not to mess with the timeline, let’s fast-forward to January 2017, when the team at Patently Apple uncovered the following patent (originally filed in 2014):

According to the present invention, the information is capable of scanning the user’s environment for interesting objects (e.g. interesting pieces in the exhibition). Apple’s patent figure 4 shows a user taking a photo of a building in 3D Depth mode. The system could also be used in Augmented Reality mode and the building could be identified for the user and put into the photo automatically.

Fast-forward to April 2017, when Patently Apple uncovered another patent, filed in 2015, describing:

A method for representing a virtual object in a real environment, comprising: using a recorder to capture a real environment in two dimensions; determining position information regarding the recorder relative to at least one component of the real environment; providing three-dimensional information relating to the real environment based on the position information; using the three-dimensional information to identify texture information within the captured real environment; and concealing the removed part of the real environment using the identified texture information.
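To make those five steps concrete, here is a minimal, hypothetical sketch of that “diminished reality” pipeline. None of these types or functions are Apple APIs; they are placeholders standing in for the capture, pose-estimation, reconstruction, texture, and concealment stages the claim describes.

```swift
import simd

// Placeholder types for the patent's stages -- illustrative only, not Apple APIs.
struct Frame {                                              // 1. a 2D capture of the real environment
    var pixels: [Float]
    let width: Int
    let height: Int
}
struct Pose { var transform = matrix_identity_float4x4 }    // 2. position of the recorder
struct SceneGeometry { var points: [SIMD3<Float>] = [] }    // 3. 3D info about the environment
struct Region { let x: Int, y: Int, width: Int, height: Int }  // the part to be removed

// Stubbed stages; a real system would run tracking, reconstruction, and texture synthesis here.
func estimatePose(of frame: Frame) -> Pose { Pose() }
func reconstruct(from frame: Frame, at pose: Pose) -> SceneGeometry { SceneGeometry() }
func sampleTexture(around region: Region, in frame: Frame, using geometry: SceneGeometry) -> [Float] {
    // 4. texture identified from geometry-consistent surfaces around the object
    Array(repeating: 0.5, count: region.width * region.height)
}

// 5. conceal the removed part of the environment using the identified texture
func conceal(_ region: Region, in frame: Frame) -> Frame {
    let pose = estimatePose(of: frame)
    let geometry = reconstruct(from: frame, at: pose)
    let texture = sampleTexture(around: region, in: frame, using: geometry)
    var output = frame
    for row in 0..<region.height {
        for col in 0..<region.width {
            output.pixels[(region.y + row) * frame.width + (region.x + col)] =
                texture[row * region.width + col]
        }
    }
    return output
}
```

Real texture synthesis is far more involved, of course, but the structure mirrors the claim: 2D capture, recorder position, 3D reconstruction, texture identification, and concealment.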

Now let’s come back to the present day. Apple has already released AirPods (or “hearables”), which many consider critical to improving the real-world usefulness of Siri and AI, and at yesterday’s keynote it announced a slew of updated products: the iPhone X, the iPhone 8, and the Watch Series 3. Much of the conversation around the announcement has centered on Face ID.

While that’s a great recap of past events, you may be asking how all of this signals the beginning of the end of the smartphone. Two clues from the keynote (and a third from the team at Patently Apple) point to it.

The first was the announcement that the Apple Watch 3 will have its own cellular connection. Nicholas and I both agreed that this is a game changer because it signals that Apple has found a way to miniaturize cellular connectivity in a battery-friendly-ish way.

Now, the harder clue to spot is in how Face ID works. The camera “uses a set of sensors, cameras, and a dot projector to create an incredibly detailed 3D map of your face.” More specifically, “It involves an infrared camera, flood illuminator, front camera, dot projector, proximity sensor and ambient light sensor. The dot projector beams out more than 30,000 invisible infrared dots, and the infrared camera captures an image.”

Sound familiar? Basically, Apple took what PrimeSense made for the Microsoft Kinect and shrank it down to fit in the top of your phone. The technology used to enable Face ID (which still has a couple of bugs) is the same technology that will be required to make augmented reality work in real-world environments.
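That depth data is already reaching developers: ARKit in iOS 11 exposes face tracking built on the TrueDepth camera. As a rough illustration (a minimal sketch, assuming an ARSCNView wired up in a storyboard; the class and outlet names are mine), here is how that live 3D face mesh surfaces in code:

```swift
import UIKit
import SceneKit
import ARKit

// A minimal sketch: render the live 3D face mesh that ARKit's face tracking
// (iOS 11, TrueDepth camera required) produces from the depth sensors.
class FaceMeshViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        guard ARFaceTrackingConfiguration.isSupported else { return } // needs TrueDepth hardware
        sceneView.delegate = self
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // Attach a wireframe of the detected face geometry when a face anchor appears.
    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        guard anchor is ARFaceAnchor,
              let device = sceneView.device,
              let faceGeometry = ARSCNFaceGeometry(device: device) else { return nil }
        faceGeometry.firstMaterial?.fillMode = .lines
        return SCNNode(geometry: faceGeometry)
    }

    // Keep the mesh in sync as the depth camera refines the face model.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor,
              let faceGeometry = node.geometry as? ARSCNFaceGeometry else { return }
        faceGeometry.update(from: faceAnchor.geometry)
    }
}
```

The same session-based ARKit machinery drives world tracking on the back camera, which is exactly the kind of plumbing a pair of smartglasses would lean on.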

The final nail in the smartphone’s coffin is an Apple patent that surfaced in July 2017 covering “AR Centric Smartglasses / Head-Mounted Display and Rendering Semi-Transparent UI Elements.” For this patent to be useful, you’d need a low-powered miniature camera that can sense and position objects in 3D and capture hand gestures (PrimeSense), a miniature LTE chip that doesn’t rapidly drain the battery (Apple Watch 3), and advanced augmented reality capabilities (Metaio). The pieces are all there; it all fits!

Ten years ago, Steve Jobs took to the stage to introduce a new device he described as “an iPod, a phone, and an internet communicator.” Little did we know it was all one device: the iPhone. In a few years, could we get a similar presentation that starts, “A smartphone, a camera, and a headset?” Only time will tell.

The wall between the digital world and the real world is falling. Jacob, Payal, and I plan to demonstrate how we’ll use some of the features announced in iOS 11 for Magnet. Sign up here and we’ll email you a link to our demo when it’s ready.

Scott Salandy-Defour