Apple Unveils New Accessibility Features for iPads and iPhones

On the eve of Global Accessibility Awareness Day, Apple announced several new accessibility features coming to its iPad and iPhone devices. The features are designed to serve a wide range of user needs, including those of people with disabilities.

Eye-Tracking Technology Without Extra Hardware

Apple already supports eye tracking in iOS and iPadOS, but it requires dedicated eye-tracking hardware. The company is now introducing a built-in eye-tracking option that lets users navigate apps using only the front-facing camera. The technology uses AI to work out what the user is looking at and which gesture they want to perform, such as swiping or tapping.

This marks the first time Apple has offered eye-based control of an iPad or iPhone without extra hardware or accessories: the front-facing camera alone handles navigation.
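Apple has not detailed how the new eye tracking works under the hood. As a rough illustration of how a front-facing camera can yield a gaze estimate, ARKit's existing face tracking already exposes a lookAtPoint on each ARFaceAnchor; the sketch below (the GazeEstimator class name is ours, not Apple's) simply logs that estimate.

```swift
import ARKit

// Illustrative sketch only: ARKit face tracking (TrueDepth camera) provides a
// gaze estimate via ARFaceAnchor.lookAtPoint. This is not Apple's new
// eye-tracking feature, just one way a front camera can infer where a user looks.
final class GazeEstimator: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // Point (in face-anchor space) that the eyes are estimated to converge on.
            print("Estimated gaze point:", face.lookAtPoint)
        }
    }
}
```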

Dwell Control: A Revolutionary Feature

The new built-in eye-tracking option also includes Dwell Control, a feature that can sense when a person’s gaze pauses on an element, indicating they want to select it. This intuitive feature allows users to interact with their devices in a more natural and effortless way.
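Apple has not published how Dwell Control decides that a gaze has paused, but the core idea is a simple dwell timer: if the gaze stays on the same element beyond a threshold, treat it as a selection. A minimal sketch of that idea follows, with names of our own invention (DwellDetector, onSelect), not Apple API.

```swift
import Foundation

// Minimal sketch of the dwell idea: a selection fires when the gaze stays on
// one element past a threshold. DwellDetector and onSelect are illustrative names.
final class DwellDetector {
    private let dwellThreshold: TimeInterval = 1.0
    private var currentElement: String?
    private var gazeStart: Date?

    var onSelect: ((String) -> Void)?

    /// Call repeatedly with the identifier of the element currently under the gaze.
    func update(gazeOn element: String?, at time: Date = Date()) {
        guard element == currentElement else {
            // Gaze moved to a different element: restart the dwell timer.
            currentElement = element
            gazeStart = element == nil ? nil : time
            return
        }
        if let element, let start = gazeStart,
           time.timeIntervalSince(start) >= dwellThreshold {
            onSelect?(element)
            gazeStart = nil // require the gaze to move away before selecting again
        }
    }
}
```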

Vocal Shortcuts: Assigning Sounds or Words to Launch Shortcuts

Another new feature, ‘Vocal Shortcuts,’ builds on Apple’s existing voice controls. It lets people assign custom sounds or words to launch shortcuts and complete tasks; for instance, an utterance as simple as ‘Ah!’ could trigger Siri to launch an app.

This feature offers users greater flexibility and autonomy in controlling their devices using their voice.
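Apple has not documented how Vocal Shortcuts is wired up internally; conceptually, it maps a user-chosen utterance to an action. The sketch below shows that idea only (VocalShortcut and handleTranscription are illustrative names, and the transcription would come from whatever speech recognizer is in use).

```swift
import Foundation

// Conceptual sketch only: map a user-defined utterance to an action and match it
// against a speech transcription. These types are illustrative, not Apple's API.
struct VocalShortcut {
    let trigger: String        // e.g. "ah" or "launch notes"
    let action: () -> Void
}

func handleTranscription(_ text: String, shortcuts: [VocalShortcut]) {
    let spoken = text.lowercased().trimmingCharacters(in: .whitespacesAndNewlines)
    for shortcut in shortcuts where spoken == shortcut.trigger.lowercased() {
        shortcut.action()
    }
}

// Example: the sound "ah" runs a hypothetical action.
let shortcuts = [VocalShortcut(trigger: "ah") { print("Launching Notes…") }]
handleTranscription("Ah", shortcuts: shortcuts)
```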

Listen for Atypical Speech: Improving Accessibility for Users with Conditions Affecting Speech

Apple has also developed ‘Listen for Atypical Speech,’ which uses machine learning to recognize unique speech patterns. This feature is designed for users with conditions that affect speech, including cerebral palsy, amyotrophic lateral sclerosis (ALS), and stroke.

By recognizing atypical speech patterns, this technology enables users to interact with their devices more easily, promoting greater accessibility and inclusivity.

Personal Voice: An Automated Voice That Sounds Like the User

In addition to ‘Vocal Shortcuts’ and ‘Listen for Atypical Speech,’ Apple has improved its existing ‘Personal Voice’ feature, which lets users create a synthesized voice that sounds like their own, offering a more personalized experience.

Music Haptics: A New Feature for Users Who Are Deaf or Hard-of-Hearing

For people who are deaf or hard-of-hearing, Apple has introduced ‘Music Haptics,’ which enables users to experience the millions of songs in Apple Music through a series of taps, textures, and vibrations.

This innovative feature will also be available as an API, allowing music app developers to provide users with a new and accessible way to experience audio.
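Details of the Music Haptics API have not been published yet. As a rough sketch of the underlying idea, the existing Core Haptics framework can already turn a list of beat intensities into transient "taps"; the function below is our illustration, not the forthcoming API.

```swift
import CoreHaptics

// Illustrative only: uses the existing Core Haptics framework to map beat
// intensities to transient taps. The actual Music Haptics API may look different.
func playBeatHaptics(intensities: [Float]) throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
    let engine = try CHHapticEngine()
    try engine.start()

    // One transient tap per beat, spaced 0.5 s apart, scaled by intensity.
    let events = intensities.enumerated().map { index, intensity in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: intensity),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5)
            ],
            relativeTime: Double(index) * 0.5
        )
    }
    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```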

Vehicle Motion Cues: A Feature to Help with Motion Sickness

Apple has also announced a new feature to help with motion sickness in cars. Motion sickness often arises when the eyes are fixed on stationary content while the body feels the vehicle moving; with the ‘Vehicle Motion Cues’ setting turned on, animated dots appear at the edges of the screen and sway in the direction of the vehicle’s motion to reduce that conflict.
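Apple has not described the implementation, but the idea of visual cues that follow the motion the body feels can be sketched with Core Motion and SwiftUI: read the device's lateral acceleration and sway a row of dots accordingly. MotionDotsView and the scaling factor below are our own illustration.

```swift
import SwiftUI
import CoreMotion

// Illustrative sketch of the Vehicle Motion Cues idea: edge dots sway with the
// device's measured acceleration so what the eyes see matches what the body feels.
struct MotionDotsView: View {
    @State private var offset: CGFloat = 0
    private let motion = CMMotionManager()

    var body: some View {
        HStack(spacing: 12) {
            ForEach(0..<6, id: \.self) { _ in
                Circle().frame(width: 8, height: 8)
            }
        }
        .offset(x: offset)
        .onAppear {
            guard motion.isAccelerometerAvailable else { return }
            motion.accelerometerUpdateInterval = 1.0 / 30.0
            motion.startAccelerometerUpdates(to: .main) { data, _ in
                guard let data else { return }
                // Lateral acceleration (in g) mapped to a small horizontal sway.
                withAnimation(.easeOut(duration: 0.1)) {
                    offset = CGFloat(data.acceleration.x) * 40
                }
            }
        }
    }
}
```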

CarPlay Update: New Accessibility Features

Apple is also updating CarPlay with several new accessibility features, including:

  • Voice Control: Users can control CarPlay and its apps with voice commands alone.
  • Color Filters: Makes the CarPlay interface easier to use for colorblind users, with additional options for bold and larger text.
  • Sound Recognition: Notifies drivers or passengers who are deaf or hard of hearing when car horns or sirens are detected.

These updates aim to make CarPlay a more accessible experience for all users.

VisionOS: Live Captions During FaceTime Calls

Finally, Apple revealed an accessibility feature coming to visionOS, which will enable live captions during FaceTime calls. This innovative technology allows users with hearing impairments to participate in video calls with greater ease and understanding.

These new accessibility features from Apple demonstrate the company’s commitment to inclusivity and its efforts to create a more accessible world for all users. By leveraging AI, machine learning, and other technologies, Apple is pushing the boundaries of what is possible in terms of device control and user interaction.

As technology continues to evolve, it is essential that we prioritize accessibility and inclusivity in our devices and services. Apple’s latest announcements are a significant step in this direction, promoting greater autonomy and independence for users with disabilities.

By making these features available on its iPad and iPhone devices, Apple is creating a more accessible world, one device at a time.
