Apple Introduces New Accessibility Features for iOS Devices

TapTechNews reported on May 15 that Apple announced new accessibility features coming later this year, including Eye Tracking, which lets users with physical disabilities control an iPad or iPhone with their eyes. In addition, Music Haptics will offer users who are deaf or hard of hearing a new way to experience music on iPhone; Vocal Shortcuts will let users perform tasks with custom phrases; Vehicle Motion Cues will help reduce motion sickness when using iPhone or iPad in a moving vehicle; and visionOS will gain further accessibility features. The new features are expected to arrive in updates such as iOS 18, iPadOS 18, and macOS 15, although Apple did not say so explicitly. For more information, see the Apple website.

Eye Tracking debuts on iPad and iPhone

Powered by AI, Eye Tracking gives users a built-in option to navigate iPad and iPhone with just their eyes. Designed for users with physical disabilities, it can be set up and calibrated in seconds using the front-facing camera, and thanks to on-device machine learning, all data used to control the feature is stored securely on the device and is not shared with Apple. Eye Tracking works in iPadOS and iOS apps without any additional hardware or accessories. Users can navigate through the elements of an app and use Dwell Control to activate each one, accessing functions such as physical buttons, swipes, and other gestures with only their eyes.

Music Haptics enables more people to experience music

Music Haptics is a new way for users who are deaf or hard of hearing to experience music on iPhone. With this feature turned on, the haptic engine in iPhone plays taps, textures, and subtle vibrations along with the music. Music Haptics works across millions of songs in Apple Music and will be available as an API so developers can make music more accessible in their own apps.
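Apple has not published how Music Haptics maps audio to vibrations, but the basic idea — deriving a haptic intensity from the loudness envelope of the waveform — can be illustrated with a short sketch. Everything here (the window size, the RMS envelope, the 0–1 intensity scale) is a simplified assumption for illustration, not the actual API.

```python
import math

def haptic_envelope(samples, window=4):
    """Conceptual sketch: reduce an audio waveform to per-window haptic
    intensities in [0, 1] by taking a normalized RMS loudness envelope."""
    rms = []
    for i in range(0, len(samples), window):
        chunk = samples[i:i + window]
        rms.append(math.sqrt(sum(s * s for s in chunk) / len(chunk)))
    peak = max(rms) or 1.0
    return [round(v / peak, 3) for v in rms]  # 1.0 = strongest tap

# Hypothetical waveform: a quiet passage followed by a loud beat.
wave = [0.1, -0.1, 0.1, -0.1, 0.9, -0.9, 0.9, -0.9]
print(haptic_envelope(wave))  # → [0.111, 1.0]
```

A real implementation would also separate frequency bands (bass beats versus melody) before driving the haptic hardware; the envelope above only captures overall loudness.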
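The Dwell Control mechanism described above for Eye Tracking — activating an element once the gaze rests on it long enough — can be sketched as a small timer-based state machine. This is a conceptual illustration only, not Apple's implementation; the element bounds, dwell threshold, and gaze samples are all hypothetical.

```python
class DwellActivator:
    """Conceptual sketch of dwell control: fire an activation when the
    gaze point stays inside one element's bounds for `dwell_seconds`."""

    def __init__(self, dwell_seconds=1.0):
        self.dwell_seconds = dwell_seconds
        self._current = None      # element the gaze is resting on
        self._enter_time = None   # when the gaze entered that element

    def update(self, gaze_xy, elements, now):
        """elements: dict of name -> (x, y, w, h) bounds.
        Returns the name of an activated element, or None."""
        gx, gy = gaze_xy
        hit = None
        for name, (x, y, w, h) in elements.items():
            if x <= gx <= x + w and y <= gy <= y + h:
                hit = name
                break
        if hit != self._current:          # gaze moved: restart the timer
            self._current = hit
            self._enter_time = now if hit is not None else None
            return None
        if hit is not None and now - self._enter_time >= self.dwell_seconds:
            self._current, self._enter_time = None, None  # reset after firing
            return hit
        return None

# Hypothetical usage: a "Play" button at (0, 0) sized 100x50,
# with gaze samples arriving roughly twice a second.
buttons = {"play": (0, 0, 100, 50)}
dwell = DwellActivator(dwell_seconds=1.0)
print(dwell.update((10, 10), buttons, now=0.0))  # None: timer just started
print(dwell.update((12, 11), buttons, now=0.4))  # None: only 0.4 s elapsed
print(dwell.update((11, 12), buttons, now=1.1))  # 'play': threshold met
```

In a production system the gaze samples would come from the eye tracker's calibrated output and would be smoothed before hit-testing, but the dwell timer itself works the same way.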
More new speech features

iPhone and iPad users can use Vocal Shortcuts to assign custom phrases that Siri understands to launch shortcuts and complete complex tasks. Another new feature, Listen for Atypical Speech, broadens the range of speech recognition by using on-device machine learning to recognize a user's individual speech patterns. These features are designed for users whose speech is affected by conditions such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke, and they build on the capabilities introduced in iOS 17 to offer a new level of personalization and control to users who cannot speak or are at risk of losing the ability to speak.

Vehicle Motion Cues help reduce motion sickness

Vehicle Motion Cues is a new feature for iPhone and iPad that helps reduce motion sickness for passengers. Research shows that motion sickness is commonly caused by a sensory conflict between what a person sees and what they feel, which can make it uncomfortable for some users to use iPhone or iPad while riding in a moving vehicle. Vehicle Motion Cues displays animated dots at the edges of the screen that represent changes in vehicle motion, helping to reduce this sensory conflict without interfering with the main content. Using the built-in sensors in iPhone and iPad, Vehicle Motion Cues recognizes when the user is in a moving vehicle and responds accordingly. The feature can be set to appear automatically on iPhone, or it can be turned on and off in Control Center.

CarPlay updates with Voice Control and more accessibility features

Accessibility features coming to CarPlay include Voice Control, Color Filters, and Sound Recognition. With Voice Control, users can navigate CarPlay and control apps with just their voice.
With Sound Recognition, drivers or passengers who are deaf or hard of hearing can turn on alerts to be notified of car horns and sirens. Color Filters make the CarPlay interface visually easier to use for color-blind users, joined by additional visual accessibility features such as Bold Text and Large Text.

Accessibility features coming to visionOS

This year, visionOS will add accessibility features including systemwide Live Captions to help all users, including those who are deaf or hard of hearing, follow spoken dialogue in live conversations and in audio from apps in real time. With Live Captions for FaceTime in visionOS, more users can easily connect and collaborate using their Persona. Apple Vision Pro will also add the ability to move captions using the window bar during Apple Immersive Video, as well as support for additional Made for iPhone (MFi) hearing devices and cochlear hearing processors. Updates for vision accessibility will include Reduce Transparency, Smart Invert, and Dim Flashing Lights for users who have low vision or want to avoid bright lights and frequent flashing.

These features join the dozens of accessibility features already available in Apple Vision Pro, which offers a flexible input system and an intuitive interface designed for a wide range of users. Features such as VoiceOver, Magnifier, and Color Filters give users who are blind or have low vision access to spatial computing, while features such as Guided Access support users with cognitive disabilities. Users can control Vision Pro with any combination of their eyes, hands, or voice, and accessibility features such as Switch Control, Voice Control, and Dwell Control can also assist users with physical disabilities.
Other updates

For users who are blind or have low vision, VoiceOver will include new voices, flexible volume controls, and the ability to customize VoiceOver keyboard shortcuts on Mac. Magnifier will offer a new Reader Mode and an easy way to launch Detection Mode at the touch of a button. Braille users will get new ways to start and stay in Braille Screen Input for faster control and text editing, Japanese language support for Braille Screen Input, and support for entering multi-line braille with a braille keyboard along with a choice of input and output methods. For users with low vision, Hover Typing will show larger text when typing in a text field, displayed in the user's preferred font and color. For users at risk of losing their ability to speak, Personal Voice will be available in Mandarin, and users who have difficulty pronouncing or reading full sentences will be able to create a Personal Voice using shortened phrases. For users with speech disabilities, Live Speech will include categories and compatibility with Live Captions.

Virtual trackpad for users with physical disabilities

A new virtual trackpad for AssistiveTouch will let users control their device using a small area of the screen as a resizable trackpad. Switch Control will add the option to use the cameras in iPhone and iPad to recognize finger-tap gestures as switches, and Voice Control will support custom vocabularies and complex words.
