Apple Adds Eye Tracking Feature to iOS 18 and iPadOS 18, Along with New Accessibility Features

According to TapTechNews on June 11, Apple announced last month that eye tracking would be coming to iPhone and iPad, and the feature has now been officially added in the newly released iOS 18 and iPadOS 18.


Eye tracking is part of this accessibility update. According to Apple, the built-in option lets users control the iPhone with just their eyes. The feature is designed for users with physical disabilities, and setup and calibration via the front camera take only a few seconds. Thanks to on-device machine learning, all data used to set up and control the feature is stored securely on the device and is not shared with Apple. Eye tracking works across iPadOS and iOS apps and requires no additional hardware or accessories. With it, users can navigate through the elements of an app and use dwell control to activate each one, performing actions such as physical button presses, swipes, and other gestures using only their eyes.
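To make the dwell-control idea more concrete, here is a minimal sketch of gaze-hold activation: an element is triggered once the gaze rests on it for a set duration. The GazeSample and DwellController types, the 1-second threshold, and the element names are assumptions made purely for illustration; they are not Apple's implementation or API.

```swift
import Foundation

// Hypothetical gaze sample, e.g. produced by a calibrated front-camera pipeline.
struct GazeSample {
    let elementID: String?   // element currently under the gaze, if any
    let timestamp: TimeInterval
}

// Illustrative dwell-control logic: activate an element after the gaze
// has stayed on it for `dwellDuration` seconds.
final class DwellController {
    private let dwellDuration: TimeInterval
    private let onActivate: (String) -> Void
    private var currentElement: String?
    private var dwellStart: TimeInterval?

    init(dwellDuration: TimeInterval = 1.0, onActivate: @escaping (String) -> Void) {
        self.dwellDuration = dwellDuration
        self.onActivate = onActivate
    }

    func process(_ sample: GazeSample) {
        guard let element = sample.elementID else {
            // Gaze left all elements; reset the dwell timer.
            currentElement = nil
            dwellStart = nil
            return
        }
        if element != currentElement {
            // Gaze moved to a new element; restart the timer for it.
            currentElement = element
            dwellStart = sample.timestamp
            return
        }
        // Gaze has stayed on the same element long enough: activate it once.
        if let start = dwellStart, sample.timestamp - start >= dwellDuration {
            onActivate(element)
            dwellStart = nil   // require the gaze to leave and return before re-activating
        }
    }
}

// Example: a gaze that settles on a "PlayButton" element for over one second.
let controller = DwellController(dwellDuration: 1.0) { id in
    print("Activated element: \(id)")
}
controller.process(GazeSample(elementID: "PlayButton", timestamp: 0.0))
controller.process(GazeSample(elementID: "PlayButton", timestamp: 0.5))
controller.process(GazeSample(elementID: "PlayButton", timestamp: 1.2)) // prints activation
```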

TapTechNews noted that in addition to eye tracking, Apple has introduced two other accessibility features: Music Haptics, which gives users who are deaf or hard of hearing a new way to experience music through the iPhone's Taptic Engine, and Vocal Shortcuts, which lets users perform tasks by speaking custom phrases.

