
The iPhone X camera tech may go well beyond phones

Face ID and dancing emoji today, but what about tomorrow? Apple's TrueDepth camera on the iPhone X could be capable of even more.

Scott Stein, Editor at Large

That front-facing camera array on the iPhone X could be a step in many directions.

James Martin/CNET

After sitting in the Steve Jobs Theater at Apple's Sept. 12 event and looking at the iPhone X and its new technology for myself, I kept thinking about that front-facing camera array. TrueDepth, it's called. It's a bundle of sensors. It can detect faces and track moving facial muscles. It can see 3D objects. It can enable more advanced augmented reality.
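
For a sense of what that sensor bundle exposes to developers already, here's a minimal sketch using ARKit's face-tracking API, which Apple shipped alongside the iPhone X. ARFaceTrackingConfiguration and the blend-shape readings are real ARKit API; the class name and what's done with the values are just illustrative:

```swift
import ARKit

// Minimal sketch: reading TrueDepth face data through ARKit.
// ARFaceTrackingConfiguration only works on devices with the
// TrueDepth camera (the iPhone X, at launch).
class FaceTracker: NSObject, ARSessionDelegate {  // "FaceTracker" is an illustrative name
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else {
            print("No TrueDepth camera; face tracking unsupported")
            return
        }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // ARKit calls this as the tracked face anchor updates.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // blendShapes expresses facial muscle movement as 0...1
            // coefficients -- the data behind the animated emoji.
            if let jawOpen = face.blendShapes[.jawOpen] {
                print("jawOpen:", jawOpen.floatValue)
            }
        }
    }
}
```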

And, I can see it living on well beyond what we think of as an iPhone.

A larger screen is great. A new design is welcome. But if there's one piece of tech that defines that new iPhone, it's TrueDepth. And it's Apple's first attempt at a more advanced type of magic camera: one similar to Google's Tango, or Intel's RealSense, or Microsoft Kinect.

I see this tech being used in a bunch of different ways, from mixed reality headsets to car dashboards.

Mixed reality headsets need cameras like these. I think of HoloLens, a mixed reality headset with a special set of cameras and sensors for tracking and recognizing the environment well enough to place virtual objects realistically. Or other emerging, experimental headsets, like Meta's, or Avegant's light-field headset. That realistic blending of virtual and real objects into everyday space is what Magic Leap promised but still hasn't delivered.

Could more advanced sensors like TrueDepth start to pave the way for higher-end applications? It's hard to know TrueDepth's range (I suspect it's pretty short, hence the camera being front-facing rather than rear-facing), but it's a start. Or maybe TrueDepth could sit on the inside of future headsets and track faces and eyes as a form of input.

Biometrics everywhere. Face ID is mostly hands-free. Maybe that frees it to work on door locks, or laptop screens, or car dashboards, or maybe even the Apple Watch someday. It seems like a technology that could be flexibly installed in more places, because reaching a finger up to scan is the more awkward gesture. Maybe we're heading for that Minority Report future of facial scanning everywhere.
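
Today, apps reach Face ID through Apple's LocalAuthentication framework, and the flow hints at how portable the idea is: prompt, scan, verify. A minimal sketch, using real LocalAuthentication calls (the function name and reason string are illustrative):

```swift
import LocalAuthentication

// Minimal sketch: the prompt-and-verify flow apps use for Face ID.
func unlockWithBiometrics() {
    let context = LAContext()
    var error: NSError?

    // Confirm biometric auth (Face ID or Touch ID) is available.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        print("Biometrics unavailable:", error?.localizedDescription ?? "unknown")
        return
    }

    // On iPhone X this reports .faceID, backed by TrueDepth.
    if context.biometryType == .faceID {
        print("Face ID available")
    }

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your data") { success, evalError in
        print(success ? "Authenticated" : "Failed: \(String(describing: evalError))")
    }
}
```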

Robots, drones, vehicles and location-aware devices. TrueDepth is meant for photos, Face ID and fun AR tricks right now. But what if it were employed to help robots navigate, or to let a vehicle see a parking garage better? Cameras and SLAM (simultaneous localization and mapping) technologies are what allow robots like Kuri to find their way around, or help an autonomous vehicle drive. TrueDepth might be short-range, but what if future sensors could scan at longer ranges and help with obstacle avoidance?
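
In fact, ARKit's world tracking already exposes the kind of SLAM-style output a robot or drone would want: a continuously updated position and orientation for the device. A minimal sketch, assuming you just want to log that pose (the class name is illustrative; the ARKit calls are real):

```swift
import ARKit

// Minimal sketch: ARKit world tracking as a SLAM-style pose source.
// The session fuses camera and motion data into a running estimate
// of where the device is -- the same kind of output a robot or
// drone navigation stack consumes.
class PoseReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        session.delegate = self
        session.run(ARWorldTrackingConfiguration())
    }

    // Called every frame with an updated camera pose.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let t = frame.camera.transform  // 4x4 pose matrix in world space
        print("device position:", t.columns.3.x, t.columns.3.y, t.columns.3.z)
    }
}
```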

A foot in the door for future computer vision. Cameras are the doorway to a next wave of AI that can analyze and "see" the world, and put what it sees to work across all sorts of use cases. Perhaps the next wave of Siri will gain eyes, and finally deliver the sorts of tricks that Bixby Vision and Google Lens promised. Maybe next year?
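
Some of those eyes already exist in software: Apple's Vision framework, which shipped with iOS 11, runs image analysis like face detection entirely on-device. A minimal sketch (the function name is illustrative; the Vision calls are real):

```swift
import UIKit
import Vision

// Minimal sketch: on-device face detection with Apple's Vision
// framework (iOS 11).
func countFaces(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    // Ask Vision to find face bounding boxes in the image.
    let request = VNDetectFaceRectanglesRequest { request, _ in
        let faces = request.results as? [VNFaceObservation] ?? []
        print("Found \(faces.count) face(s)")
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```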

As it is now, TrueDepth on the iPhone X may mostly be about face unlocking. But as Apple said at its event, neural engines can adapt. Maybe, down the road, this will be about a whole lot more.