Apple Reveals Fruit of PrimeSense and Faceshift Acquisitions
On Tuesday, Apple revealed its fall product lineup. iPhones get wireless and fast charging across the board, while the Apple Watch gains cellular capability. Of particular interest to the VR community is the new iPhone X, the first iPhone with an OLED display and an array of new sensors. The new iPhone’s display is supplied by Samsung (for now), so the device should theoretically be capable of low persistence, and the screen’s 2436 x 1125 resolution is comparable to the Galaxy S8’s. Apple did not announce a Daydream or Gear VR competitor at the event, but the phone is perfectly suited should the company decide to support one. Sadly, Apple’s 120 Hz ProMotion display technology, which debuted on the iPad Pro, did not make it to the iPhone this year.
We have long wondered why Apple purchased PrimeSense and Faceshift, and whether it intended to apply their technology to AR or VR. PrimeSense developed the technology that powered the original Microsoft Kinect, while Faceshift developed a system that analyzed a user’s face in real time and mapped their expressions onto an animated character. Many people expected Apple to put those sensors on the back of the phone, the way Google did with Project Tango, to enable advanced 3D-mapping capabilities.
Instead, Apple put the sensors on the front. PrimeSense’s technology primarily powers the Face ID facial-recognition security system, which replaces Touch ID. An infrared projector allows Face ID to work in the dark. Apple calls this the “TrueDepth” sensor.
Faceshift’s technology makes an appearance in the Camera app (for Snapchat-style 3D face masks and selfie relighting) as well as in “Animoji”, 3D emoji that you animate by making faces at the phone. This led to the unforgettable moment in which Jonathan Ive appears in a design video as a sad poop emoji.
It remains unclear how much Animoji relies on the TrueDepth sensor, since other face-tracking solutions on the market work without a depth sensor, but Apple appears to be making the feature exclusive to the iPhone X.
We’ve seen some tweets lamenting the fact that Apple’s use case for this technology is so selfie-focused, but take heart — if developers get a TrueDepth API, some really interesting use cases will still be possible. A 3D-scanning app could stream its data to a companion application on an iPad, for example, so the iPhone X could still be useful as a handheld scanner. The sensor should also enable some great 3D avatar-creation systems like the one Sony showed off at IFA (incidentally, the Sony phone’s sensors are on the back, so you need someone else’s help to create an avatar; there are definite advantages to the front-facing setup). Indie game developers could also benefit from a setup-free facial performance-capture app. So the future looks bright for the X’s sensor package.
Apple also announced that iOS 11 will be released on September 19, finally bringing ARKit officially to the public along with the first wave of ARKit app releases. [Demos: The Machines; IKEA]
4A Games, creators of the Metro series, are now offering their VR experience ARKTIKA.1 for pre-order on the Oculus Store.