
A missed opportunity with FaceID

When Apple released the iPhone X with the all-new FaceID, I was amazed at how good FaceID was. It almost felt like magic, and I never had any problems with it besides COVID-19 masks¹. Soon after the iPhone X was released, I tried different apps that used the FaceID sensors. There was a browser app that used the sensors' API to let users control the browser with their eyes. You could look at a button and hold your gaze on it for a moment to "click" it, you could look at the bottom of the page to scroll down; you could navigate a whole page and interact with it using only your eyes. The experience wasn't production ready yet; after a few minutes it was tiring for the eyes. Still, it was an unprecedented showcase of what might be possible with these sensors in the near future.
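For the curious: the building blocks for this kind of dwell-based selection are still exposed today through ARKit's face tracking. Here is a minimal sketch of how it could look, assuming hypothetical `project` and `hitTestButton` helpers (screen projection and hit testing are app-specific, so they are stubbed out) and an arbitrary 0.8 second dwell threshold:

```swift
import ARKit
import UIKit

// Sketch of dwell-based "gaze clicking" on top of ARKit face tracking.
// ARFaceAnchor.lookAtPoint estimates where the user is looking in face
// coordinate space; mapping it to screen coordinates needs camera
// projection, which is elided behind a hypothetical helper here.
final class GazeClickController: NSObject, ARSessionDelegate {
    private let session = ARSession()
    private var dwellTarget: UIView?
    private var dwellStart: Date?

    /// How long a fixation must last before it counts as a "click"
    /// (an assumed value, not from any Apple guideline).
    private let dwellThreshold: TimeInterval = 0.8

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        let screenPoint = project(face.lookAtPoint)   // hypothetical helper
        updateDwell(at: screenPoint)
    }

    private func updateDwell(at point: CGPoint) {
        guard let view = hitTestButton(at: point) else {   // hypothetical helper
            dwellTarget = nil
            dwellStart = nil
            return
        }
        if view !== dwellTarget {
            // Gaze moved to a new target: restart the dwell timer.
            dwellTarget = view
            dwellStart = Date()
        } else if let start = dwellStart,
                  Date().timeIntervalSince(start) >= dwellThreshold {
            // Fixated long enough: treat it as a tap.
            (view as? UIButton)?.sendActions(for: .touchUpInside)
            dwellStart = nil
        }
    }

    // Stubs for the app-specific parts.
    private func project(_ p: simd_float3) -> CGPoint { .zero }
    private func hitTestButton(at p: CGPoint) -> UIView? { nil }
}
```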

Eight years later (I had to look it up; the iPhone X was released in 2017), there's not much left of this promising direction. Yes, FaceID is still being used and works great (without masks), but apart from authentication, the only other use I know of is that notification previews on the lock screen are revealed when the device owner looks at the phone. There hasn't been much innovation beyond that.

I have no deep knowledge of the inner workings of these sensors, but I wish for more innovation there. For example, consider a push notification arriving while my phone is unlocked and next to me: why isn't it shown until I look away, or until I've read all of it? When I receive a four-digit 2FA code via SMS, most of the time I have to open the Messages app or swipe down from the top to bring up the notification list with the code. Instead, the code could stay on screen until I move my eyes away from it. There could also be quick replies with the same button-focus functionality I experienced in the browser app in 2017: when I receive a push notification for a chat message, show me a list of quick responses (like on the Apple Watch or in many email clients) that I can focus and select with my eyes.
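Sticking with the gaze stream from the sketch above, the "visible until I look away" behavior is really just a small state machine. The one-second grace period and the `gazeDidUpdate` entry point below are assumptions for illustration, not any real notification API:

```swift
import UIKit

// Sketch of a notification banner that stays on screen while the
// estimated gaze point remains inside its frame, and dismisses itself
// once the eyes have been elsewhere for a short grace period.
final class GazeAwareBanner {
    private let banner: UIView
    private var lookedAwaySince: Date?
    private let gracePeriod: TimeInterval = 1.0   // assumed value

    init(banner: UIView) { self.banner = banner }

    /// Feed this from whatever produces projected gaze points,
    /// e.g. the GazeClickController sketched earlier.
    func gazeDidUpdate(to point: CGPoint) {
        if banner.frame.contains(point) {
            lookedAwaySince = nil                  // still reading: keep it up
        } else if lookedAwaySince == nil {
            lookedAwaySince = Date()               // eyes just left the banner
        } else if Date().timeIntervalSince(lookedAwaySince!) >= gracePeriod {
            banner.removeFromSuperview()           // reader moved on: dismiss
        }
    }
}
```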

The sensors could also assist during everyday use of the phone: it's faster to focus a button in the browser with your eyes than to move your finger all the way across the screen.

The Apple Vision Pro integrates this kind of eye tracking. Apple has clearly learned a lot from it over the years and is doing excellent engineering work there. However, it is unfortunate that this capability has been neglected on the iPhone.


  1. In fact, this was the time when I, and probably every other iPhone user, wanted TouchID back. ↩︎
