I have been using Apple tech on and off since I bought my first piece of Apple hardware, a green third-generation iPod Nano, in early 2008. One thing that always struck me about Apple, having briefly used a friend’s MacBook in 2007 before purchasing my own the following year, was how accessible they made their devices: from the iPod shuffle, which could read out the names of tracks, to iOS, which over the years has developed a whole raft of accessibility options so that as many people as possible can use the iPod Touch, iPhone and iPad, to the Mac lineup, all of which come with assistive technology out of the (ever-shrinking) box. This technology was great in that it allowed people to access their devices. What happens, however, if you want to access something bigger? What if you want to access… life?
Since the introduction of Siri on the iPhone 4S back in 2011 (one of which I owned and loved, despite scoffing at it when it came out. That’ll show me!), Apple, along with Amazon, Google and even Microsoft, have been trying to turn a simple digital assistant into something that can help with almost any aspect of your life. Now, as dystopian as that may sound, certain “skills” (as Amazon might call them) that you can add to a digital assistant can genuinely make life easier for people with disabilities.
Over the past couple of years I have installed a plethora of smart devices in my home, including a few smart plugs from Meross, some smart bulbs from LIFX and VOCOLinc, as well as a heat diffuser. I bought an Apple HomePod mini at the beginning of this year to use as a home hub. Apple Home made these devices very easy to set up, as they are all HomeKit compatible and can often be set up with nothing more than a quick scan of a QR code (which, admittedly, is an inaccessible way of doing things if you have limited vision). The result is a home that is easier to use, with the ability to turn off those hard-to-reach plugs instead of just leaving them on (which is very important given current energy prices).
I find this useful for people with limited vision, as they might not always be able to tell whether plugs (and even lights) are switched off. While the Apple HomePod has fewer “skills” than an Amazon Echo, the ones it does have can be useful. I’ve found the Intercom feature to be very handy for communicating with people when I’m not home, even for a simple “That’s me away back, please can you put the kettle on?”
Last year, Apple launched the AirTag™. This wee device can be placed in, or clipped on to, any valuable item, which can then be pinged from any Siri-enabled device, causing the AirTag™ to play a sound that you can home in on. If you have an iPhone 11 or higher, you can also be guided to the missing AirTagged item by the U1 chip in the iPhone, which will vibrate to let you know how far away you are from it.
As I said in my last blog post (I promise I’m not being paid by Apple or any of their affiliates, honest!), I can’t even begin to tell you what an absolutely fantastic invention the Apple AirTag™ is. Being visually impaired and on the autistic spectrum means I’ll sometimes absent-mindedly put things down (yes, even important items) and promptly lose them. Now, with a quick enquiry to Siri on my iPhone, I can be reunited with any lost AirTagged items and hopefully avoid any unpleasant and unnecessary meltdowns.
Going back to the iPhone, a swipe down from the top right of the screen on my iPhone 11 presents me with, among other things, the Magnifier feature. Not so long ago you’d have had to spend hundreds of pounds on a handheld video magnifier capable of magnifying text clearly, or even more if you wanted a full-sized CCTV camera system. Now, though, Apple have you covered thanks to the iPhone. Supplement that with the Seeing AI app from Microsoft, and your iPhone can even tell you about your surroundings, read documents, and tell you what colour an item is (though every blonde-haired person I’ve tried it on has been unceremoniously told that they have brown, grey or even green hair).
Speaking of colours, the smart bulbs I have installed in my home are very easy to set up and use through HomeKit, though their respective apps will give you more flexibility. I’d like to commend the LIFX app not only for being accessible with VoiceOver, but also for respecting the text size set in iOS; it is an excellent app to use if you are visually impaired. While LIFX has made their app accessible, I must confess that I prefer the behaviour of the VOCOLinc bulbs. This is a shame, because I have found the VOCOLinc app to be quite inaccessible: it uses small font sizes even when iOS’s text size is turned up, and it has no support for VoiceOver. I hope this is something that VOCOLinc will address in a future update.
Anyhoo, I have been able to make scenes that incorporate both the LIFX and VOCOLinc bulbs using the Apple Home app. That said, I have ironically found it easier to do this in the VOCOLinc app, as it shows you the current colour, temperature or brightness each bulb is set to while it is switched on, which makes building scenes easier if you’ve found a colour you like while messing with the in-app colour wheel. Apple Home, by comparison, initially shows you a grid of six colours to choose from when setting a bulb’s colour. While you can access a colour wheel by tapping on a colour to select it and then tapping it again, the bulb’s current colour will not be the one selected, which means you have to pick it manually and then preview the entire scene to see if it is indeed the colour you want. I hope this is something that could be added to Apple Home.
I am aware that the Amazon Echo works with a wider array of smart IoT devices, but I would still choose the Apple HomePod over an Amazon Echo or Google Nest, as Apple takes privacy seriously, which makes me feel safer.
So I’m sat here in my Apple HomeKit-enabled smart home and want to kick back and relax. With an Apple Music subscription I can ask Siri to start playing my favourite music on my HomePod mini. That’s fantastic, but what if I want to watch TV? I bought an Apple TV 4K in late January and have found it to be extremely accessible. I previously owned an Amazon Fire TV 4K, and while it worked beautifully when I first set it up, it became quite glitchy over time, and when I enabled the screen reader, it would not shut up while I was starting a video through Plex. My Apple TV 4K, by comparison, talks when I want it to and stays quiet when I don’t, thanks to the Accessibility shortcut.
The interface of the Apple TV is much easier to navigate. Sure, you can wind up on the Apple TV+ screen, which shows advertisements for recommended programmes, but a quick click of the TV button takes me back to the Home Screen, which is laid out very much as it would be on iOS, save for rectangular icons in place of iOS’s square ones. I’ve also found that turning on audio description in the Apple TV settings will make any app that observes the setting deliver audio description. I was pleasantly surprised, for example, to hear a Disney film being audio described on Disney+, a service on which I had never previously been able to turn on audio description.
So while Apple still strives to make the devices themselves as accessible as possible, they also seem to be trying to make it so that you can use those devices to make more of life accessible, and that, I feel, is an amazing thing.
UPDATE: Because of every other device manufacturer’s unhealthy desire to copy everything Apple does, we have seen various accessibility features implemented in other devices running Android, and Windows Mobile when it was still relevant, and I think this is a good thing. The only issue is that some of these accessibility solutions can be a wee bit clunky at best, but we can live in hope that they are updated and improved as time goes on.