Apple is set to bring Eye Tracking, Music Haptics and more to iPhone and iPad

Sanjana Dhar · May 18, 2024
Updated 2024/05/18 at 6:49 PM

Apple is set to bring several new accessibility features, including Eye Tracking and Music Haptics, to iPhone and iPad later this year, most likely with iOS 18.

Apple brings in Eye Tracking

Designed for users with physical disabilities and powered by artificial intelligence, Eye Tracking will give users a “built-in option for navigating iPad and iPhone with just their eyes,” according to Apple. Users will be able to navigate through the elements of an app and use Dwell Control to activate each element, performing actions such as physical button presses, swipes and other gestures using only their eyes.

The feature uses the front camera to set up and calibrate in seconds, and all data used by Eye Tracking is stored on the device itself. Apple says Eye Tracking works across apps and does not require any additional hardware to function.

Music Haptics to offer a new experience

For users who are deaf or hard of hearing, Music Haptics will offer a new way to experience music on iPhone, using the Taptic Engine to “play taps, textures and refined vibrations to the audio of the music.” According to Apple, Music Haptics will work with millions of songs on Apple Music and will also be available as an API for developers.
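Apple has not yet published details of the Music Haptics developer API. As a rough illustration of what audio-style haptics look like on iOS today, the sketch below uses the existing Core Haptics framework to play a few sharp taps over a softer continuous vibration on the Taptic Engine. The class name HapticSketch and the timing and intensity values are purely illustrative assumptions, not part of any Apple Music Haptics API.

```swift
import CoreHaptics

// Minimal sketch of taps plus a continuous "texture" on the Taptic Engine,
// using the existing Core Haptics framework. This is NOT the Music Haptics
// API itself; it only illustrates the kind of haptic output described above.
final class HapticSketch {
    private var engine: CHHapticEngine?

    func start() throws {
        // Core Haptics is only available on devices with a Taptic Engine.
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
        engine = try CHHapticEngine()
        try engine?.start()
    }

    func playTapsAndTexture() throws {
        // Three sharp transient taps, one every 0.25 seconds.
        let taps = (0..<3).map { i in
            CHHapticEvent(
                eventType: .hapticTransient,
                parameters: [
                    CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                    CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8),
                ],
                relativeTime: Double(i) * 0.25
            )
        }
        // A softer continuous vibration ("texture") running under the taps.
        let texture = CHHapticEvent(
            eventType: .hapticContinuous,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.4),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.2),
            ],
            relativeTime: 0,
            duration: 1.0
        )
        let pattern = try CHHapticPattern(events: taps + [texture], parameters: [])
        let player = try engine?.makePlayer(with: pattern)
        try player?.start(atTime: CHHapticTimeImmediate)
    }
}
```

A shipping Music Haptics feature would presumably drive patterns like this from the audio actually being played; how Apple's API exposes that is not covered here.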


For more information, please keep reading techinnews.
