An App for All Abilities

Assistive technology is an umbrella term for devices, software, and equipment that support people with disabilities: anything from eyeglasses to wheelchairs to text-to-speech software. Assistive technology helps level the playing field for everyone.

In Bloomberg Connects, assistive technology can take the form of device settings, tools, and external technology to which a person connects their device. 

In this topic, assistive technologies are grouped by the primary ability that they support or augment.

Remember that accessibility tools can serve many different needs across ability categories and that each person builds their toolkit in a unique way. People might use more than one assistive technology at a time and may use different tools depending on the circumstance.


Vision

Thoughtful, scalable design can make digital experiences more accessible to people with low or no vision. This section summarizes the main ways in which the Bloomberg Connects app can adapt to meet different visual needs.

Screen Reader Support

People with impaired vision, or with permanent or temporary difficulty reading, might use a software program called a screen reader to read on-screen elements aloud.

Common mobile screen readers include VoiceOver (iOS) and TalkBack (Android). On-screen elements include buttons and other controls, text, headings, input fields, and specific content types like images (announced via their alt text) and logos. Screen readers can be connected to braille displays so the content can be read instead of listened to, which is especially useful for people who are both deaf and blind.

Principles:

  • Screen reader users can interact with content and features to the same extent as non-screen reader users. 
  • There is a clear, logical, and predictable path for assistive technology to follow, so the software can announce and activate the app’s features (this path is called the focus order).
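
To give a concrete sense of how this works behind the scenes, here is a minimal sketch in SwiftUI (not Bloomberg Connects source code) of how an item screen might label its elements and keep them in a predictable focus order for VoiceOver; the artwork title, image asset, and button are made up for illustration.

```swift
import SwiftUI

// A minimal sketch, not Bloomberg Connects source code: exposing an item
// screen's elements to VoiceOver with labels, a heading trait, and a
// predictable focus order. All names and assets here are hypothetical.
struct ItemScreen: View {
    var body: some View {
        VStack(alignment: .leading, spacing: 12) {
            Text("Water Lilies")
                .font(.title)
                .accessibilityAddTraits(.isHeader)  // announced as a heading

            Image("water-lilies")                   // hypothetical asset
                .accessibilityLabel("Impressionist painting of water lilies on a pond")

            Button("Play audio tour") {
                // hypothetical action
            }
        }
        // Keep the children individually accessible so assistive technology
        // moves through them in the visual, top-to-bottom order.
        .accessibilityElement(children: .contain)
    }
}
```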

Watch a short demonstration of using VoiceOver on an Item screen in the app:

Alt Text for All Images

Alternative text (or alt text) helps people who can’t see an image understand the purpose, context, and key visual information through words. Alt text does not appear in the app but is read out loud by a screen reader.

Principles:

  • All app images are designed with unique alt text needs in mind. The three types of images are functional, decorative, and informative.
  • Functional images (like the navigation tab icons) and decorative images (like the Wi-Fi icon that appears at the top of the "Unable to load this screen" message) are the same for all guides in the app, and the appropriate screen reader behavior is already built in. As a cultural partner, you don't need to worry about these images.
  • Informative images include most of the images that you add to your CMS for use in your guide. Since these are images that you own, you are responsible for providing helpful alt text so all visitors can enjoy your content. 

For complete details on creating alt text for your images, see Image Alt Text.
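
As background on how the app consumes that alt text, the sketch below (SwiftUI, not the app's actual code) shows the typical difference in screen reader behavior between an informative image, which gets a spoken label, and a decorative image, which is hidden from the screen reader; the image names and label are invented for illustration.

```swift
import SwiftUI

// A minimal sketch, not Bloomberg Connects source code: how informative and
// decorative images typically differ for a screen reader.
struct GalleryRow: View {
    var body: some View {
        HStack {
            // Informative image: the partner-supplied alt text becomes the
            // label that VoiceOver or TalkBack reads aloud.
            Image("sculpture-hall")  // hypothetical asset
                .accessibilityLabel("Visitors gathered around a bronze sculpture in the main hall")

            // Decorative image: hidden from the screen reader so it adds no noise.
            Image(systemName: "chevron.right")
                .accessibilityHidden(true)
        }
    }
}
```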

Linear Presentation of Maps

When the Bloomberg Connects app detects a screen reader connection, the display format of a guide’s maps defaults to a linear, text-based version of the map.

Principles:

  • A linear, text-based version of the map is easier to navigate via a screen reader.
  • The map defaults to the linear map version when a screen reader is detected. 
  • Both versions of the map are available for all users via an icon at the top of the screen.
The graphical map view (left) and linear, text-based map view (right).
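
As a rough illustration of the general technique (this is not the app's actual implementation, and the stop names and map asset are invented), an iOS app can check whether VoiceOver is running and default to the text-based view while still letting anyone switch:

```swift
import SwiftUI
import UIKit

// A minimal sketch, not Bloomberg Connects source code: default to a linear,
// text-based map when VoiceOver is running, and keep both views available.
struct MapScreen: View {
    @State private var showLinearMap = UIAccessibility.isVoiceOverRunning
    private let stops = ["Entrance", "Gallery 1: Impressionism", "Gallery 2: Modern Art", "Café"]

    var body: some View {
        VStack {
            // Toggle so every user can switch between the two versions.
            Button(showLinearMap ? "Show graphical map" : "Show map as a list") {
                showLinearMap.toggle()
            }
            if showLinearMap {
                List(stops, id: \.self) { Text($0) }  // linear, text-based version
            } else {
                Image("floor-plan")                   // graphical version (hypothetical asset)
                    .resizable()
                    .scaledToFit()
            }
        }
    }
}
```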

Text Scaling

People can choose to change the size of the text on their device to make it more comfortable to read.

Principles:

  • Text scaling is supported to maximum legible sizes.
  • Enlarged text is readable and generally does not truncate. 
  • App users control their preferred text size via their device settings (iOS, Android).
  • Text size for closed video captions/subtitles is controlled separately (iOS, Android).
The same Item detail screen with text size set to the normal (left) and maximum supported size (right) (iPhone).
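
For context, this kind of scaling generally comes from using the platform's dynamic text styles. The sketch below, in SwiftUI with invented content and not the app's actual code, shows text that follows the device text-size setting and wraps rather than truncating:

```swift
import SwiftUI

// A minimal sketch, not Bloomberg Connects source code: text set in system
// text styles scales with the user's preferred text size and wraps instead
// of truncating.
struct ItemDetail: View {
    var body: some View {
        ScrollView {
            VStack(alignment: .leading, spacing: 8) {
                Text("The Starry Night")                          // hypothetical content
                    .font(.title)      // scales with the device text-size setting
                Text("Vincent van Gogh, 1889. Oil on canvas.")
                    .font(.body)       // also scales
                    .lineLimit(nil)    // wrap rather than truncate
            }
            .padding()
        }
    }
}
```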

Screen Orientation

People can choose to hold their phone in portrait or landscape orientation, depending on what's comfortable and practical for the content they're accessing and any other relevant needs or considerations.

Principles:

  • Landscape and portrait orientation are supported on every screen. 
  • Neither display orientation is privileged.
  • Users do not need to rotate their device to match a given orientation.
The same Item detail screen in portrait orientation (left) and landscape orientation (right) (iPhone).

Color and Contrast 

The app has been designed to ensure sufficient color contrast to make on-screen elements as readable as possible.

Principles:

  • Higher contrast between the foreground and background colors makes it easier to distinguish characters and symbols. 
  • Sufficient contrast between the color of text and its background color ensures readability. 
  • Color is not used as the only indicator of information.
Select app elements before (left) and after (right) meeting color and contrast requirements. 
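
If you want to check your own brand colors, the commonly used yardstick is the WCAG contrast ratio, which calls for at least 4.5:1 for normal-size text and 3:1 for large text under WCAG 2.1 AA. The sketch below shows the standard formula; the example color pair is chosen purely for illustration and is not taken from the app.

```swift
import Foundation

// A minimal sketch of the standard WCAG contrast-ratio calculation
// (not Bloomberg Connects source code).
func relativeLuminance(r: Double, g: Double, b: Double) -> Double {
    // r, g, b are sRGB channel values in 0...1
    func linearize(_ c: Double) -> Double {
        c <= 0.03928 ? c / 12.92 : pow((c + 0.055) / 1.055, 2.4)
    }
    return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b)
}

func contrastRatio(_ l1: Double, _ l2: Double) -> Double {
    let lighter = max(l1, l2), darker = min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)
}

// Example: near-black text (#1A1A1A) on a white background.
let text = relativeLuminance(r: 26.0 / 255.0, g: 26.0 / 255.0, b: 26.0 / 255.0)
let background = relativeLuminance(r: 1.0, g: 1.0, b: 1.0)
print(contrastRatio(text, background))  // ≈ 17:1, well above the 4.5:1 minimum
```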

Hearing

Providing text-based alternatives to video and audio content makes that content more accessible to people with a permanent, temporary, or circumstantial inability to hear or understand audio.

Audio/Video Transcripts

Audio and video transcripts make content accessible to a wider audience of people who cannot, or choose not to, listen to the audio.

Principle:

  • All audio and video files support transcripts so people can read along with, or instead of listening to, the audio content.

For complete details on creating transcripts for your audio and video files, see Audio and Video Transcripts.

A transcript for an audio file that provides more context on an artwork.

Video Captions/Subtitles

Captions and subtitles are two types of synchronized text that can be overlaid on a video to make the audio content more accessible. 

Captions convey both dialogue and non-speech sounds (e.g., laughter, music, sound effects) and are intended for people who are deaf or hard of hearing. Subtitles convey only the spoken dialogue and are intended for people who can hear but are not proficient in the spoken language or people who prefer to watch a video on mute while reading the spoken dialogue. 

Principles:

  • All videos support real-time closed captions/subtitles via uploaded VTT files, so people can turn the captions/subtitles on as needed (a sample file is shown after this list). We suggest that you create closed captions instead of open or "burned-in" captions to allow for this flexibility.
  • The availability of captions/subtitles in a guide is indicated using the standard “CC” icon on the video player.
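
For reference, a VTT (WebVTT) caption file is plain text made up of timed cues. A minimal example with invented timings and dialogue might look like this:

```
WEBVTT

1
00:00:00.000 --> 00:00:04.000
[Gentle piano music]

2
00:00:04.500 --> 00:00:09.000
Welcome to the gallery. This room traces the artist's early career.
```

Note that the first cue describes a non-speech sound, which is what distinguishes captions from subtitles.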

For complete details on creating captions/subtitles for your video files, see Creating Video Captions and Subtitles.

An example of video captions.

Motor Control/Dexterity

Some people do not operate mobile apps by touch and instead use an external controller such as a keyboard or switch control.

External Controller

People may benefit from an external controller if they have certain motor or dexterity impairments or if they require discernible physical keys.

Principles:

  • Intuitive navigation and interaction are available for people using external controllers.
  • People using external controllers can interact with content and features to the same extent as touchscreen users. 
  • There is a clear, logical, and predictable path for assistive technology to follow, so the software can activate the app’s features (this path is called the focus order).
A wireless keyboard controller is connected to this device and can be used to navigate the screen using the keys. The keyboard controller visually indicates focus on the selected element (in this case, the Filter icon) but does not announce it aloud to the user.