Apple has added a feature called “FaceTime Attention Correction” to the latest iOS 13 developer beta, and it could go a long way toward making FaceTime calls feel more like talking to someone in person. The feature, spotted in the third beta of the new software update that went out this week, apparently does a remarkable job of making it look like you’re gazing directly into the camera even when you’re actually looking at the screen during a FaceTime call.
That’s a significant improvement: during a FaceTime call, most people look at the screen rather than the camera, since the whole point is to see the person or people you’re talking to, not the small black lens at the top of your device.
The catch so far is that the feature appears to be available only on the iPhone XS and iPhone XS Max, which could mean it requires the latest camera tech in Apple’s hardware. One likely reason is the new image signal processor in the A12 chip that powers the iPhone XS and XS Max, which also provides improvements over previous-generation phones in HDR and portrait lighting effects.
As with any update or feature that arrives in an iOS beta release, it could expand to other devices, or vanish entirely, before the public launch of iOS 13 this fall. But here’s hoping this one stays in place, because it really seems to make a big difference in providing a sense of “presence” on FaceTime calls, which is one of the core values of Apple’s chat feature overall.
Source: TechCrunch