Apple has introduced new accessibility tools for the iPhone and iPad to help people with disabilities. Personal Voice lets users train their device to replicate their voice for phone calls after roughly 15 minutes of recorded audio. Another tool, Live Speech, speaks the user’s typed text aloud during phone calls, FaceTime conversations, and in-person interactions, and lets users save frequently used phrases for quick access. These features aim to make Apple devices more accessible to people with vision, hearing, and other disabilities, and Apple worked with the disability community to develop them.

To protect privacy and security, Personal Voice relies on on-device machine learning, which also addresses concerns about “deepfakes” generated by artificial intelligence. Apple has additionally introduced Assistive Access, a simplified interface that combines the Phone and FaceTime apps into a single Calls app and streamlines core apps such as Messages. It features large buttons, clear text labels, an emoji-only keyboard option, and the ability to record video messages. Apple is also updating the Magnifier app for users with visual impairments so that it reads aloud text captured by the iPhone camera when they interact with physical objects.