News
Tech Spotlight: iOS Magnifier app
Note: Guide Dogs does not endorse Apple or Apple products. We have compiled some useful information about the iOS 18 Magnifier app update for you to read below.
You may not know all the features included in the Magnifier app, so this brief article will run you through them. The latest operating system update, iOS 18, introduces a couple of changes, which we discuss below.
The app includes a “detection mode” button, which combines scene and text detection into one feature accessible directly from the main Magnifier interface. Pressing the button activates “detection mode”, allowing the device to perform three functions:
- Scene detection: uses artificial intelligence to describe what the camera sees.
- Text detection: instantly reads aloud any text it detects.
- Point and speak: the device recognises your finger pointing at text and reads it aloud from that point.
You can use all three modes simultaneously or in any combination, with the option to pre-select specific modes by long-pressing the “detection mode” button.
Note: iPhones with LiDAR scanners (all Pro models from the iPhone 12 Pro onwards) also have door, people and furniture detection.
Here are some important notes to keep in mind for using “detection mode” on any iPhone:
- Priority hierarchy: “detection mode” follows a sequence when activated. It will first describe the scene, then read all visible text out loud and finally prepare to detect the user’s finger for the point and speak feature. If a new scene is introduced before the mode completes its sequence, it will restart the sequence immediately.
- Motion sensor activation: this mode uses motion sensors to trigger rescanning. If you move the phone’s camera away from the object being analysed, it will automatically stop and begin scanning the new scene. The motion sensor is quite sensitive and may be set off by small hand movements, so it is best to stabilise your hands while using it: keep your wrists in a neutral position and rest your forearm on a stable surface, such as the edge of a table.
- Point and speak feature: while using point and speak, the device emits an audible clicking sound as it detects text. When the clicking stops, the text has been scanned and is ready for finger-based interaction. The device scans entire blocks of text rather than individual words, so point to the middle of a paragraph or block of text to start reading. The screen reader won’t be disrupted by your finger blocking the camera’s view, as the text has already been processed. You can slide your finger across the page to read different sections, but lifting your finger will cause the mode to stop, rescan the page and wait for further input.
Currently, the Magnifier app’s text detection feature displays recognised text over the live camera view and reads it aloud instantly. In iOS 18, users gain the ability to capture the screen and convert the detected text into digital form without “detection mode” being active, expanding it into a full-screen “reader mode”. This new mode enhances readability by letting you customise the background colour, font style and font size for a more personalised reading experience. You can also activate the text-to-speech reader while in “reader mode”, with the option to adjust the speaking speed within the app interface.
Note that “detection mode” and “capture” in the Magnifier cannot be used at the same time: activating one deactivates the other button. This means you cannot access “reader mode” while in “detection mode”.
iOS 18’s Magnifier adds the “activities” feature to the main interface, allowing users to save and customise settings like magnification, colour, contrast and lighting for specific tasks. This makes it easy to switch between pre-set configurations, such as going from reading large print in a newspaper to viewing small text on medication labels, without needing to adjust each setting manually.
VoiceOver users can now access these features right from the rotor, without having to open the Magnifier app. To set this up, go to Settings > Accessibility > VoiceOver > Rotor > Rotor Items. Navigate to the bottom of the list and select Live Recognition. You can then set your rotor to Live Recognition from any screen, flick up or down to move through the detection modes and double-tap to turn them on or off. Use the scrub gesture or the “end live recognition” button at the bottom of the screen to exit. You can access Live Recognition from your lock screen as well, so you don’t even have to unlock your iPhone to use it.
Stay tuned for more Tech Spotlight articles in the months to come.
Questions
Have questions about these new features or assistive technology in general? Call our Assistive Technology Help Desk between 9.30 am and 12.30 pm, Monday to Friday, on 1800 484 333.