Transcripts of Living Blindfully are made possible by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at http://PneumaSolutions.com.

You can read the full transcript below, download the transcript in Microsoft Word format, or download the transcript as an accessible PDF file.


Contents

Welcome to 258

A Third-party Screen Reader for Mac May Be on the Way

Is Blindness-aware AI Customer Service the Future? Be My AI Comes to the Microsoft Disability Answer Desk

A Funny Thing Happened to Me on My Way Home

Can’t Hear VoiceOver on a Call

Transitioning to Windows for Music Production

Is This an Ableist Job Description for a Disability Role?

Keypad Problem When Updating iPhone

Comments on Selena Mills Interview

Not Happy with the Victor Reader Stream 3

Any Tips for a New Mantis Q40 User?

Uber Accessibility and Self-identification

All the Light We Cannot See, and Apple Accessibility Bugs

My iPhone 15 is Still the Bane of My Existence

Closing and Contact Info


Welcome to 258

[music]

Voiceover: From Wellington, New Zealand, to the world, it’s the Living Blindfully podcast – living your best life with blindness or low vision. Here is your host, Jonathan Mosen.

Hello!

This week: there’s enthusiasm about a third-party screen reader for Mac; Be My Eyes introduces its first attempt to use AI for blindness-aware customer service, and we ask how well it actually works; a funny thing happened to me on the way home; and some Victor Reader Stream 3 users have buyer’s remorse.

A very warm welcome. It’s episode 258 this week.

Area code 258 in the North American numbering plan is, once again, not assigned. [Aww sound effect] So one day, area code 258 could be coming to you, if you’re in the United States. Who knows?

Meanwhile, country code 258 is Mosenbique. No no, that’s not it. Mozambique. I thought I had a country named after me for a while there.

They’ve just completed a census in 2023, and there are about 33 million people in Mozambique. So if one of them is you, it is absolutely epic to have you listening to Living Blindfully. A warm welcome to you.

[music]

Advertisement: As you all know if you’re a regular listener to the show, transcripts of Living Blindfully are brought to you by Pneuma Solutions. And they are doing some pretty cool things at the moment. I want to tell you about them.

There are bargains to be had, you see, and Pneuma Solutions is joining the party.

There’s a new plan coming to RIM. This is Remote Incident Manager. We’ve talked about this extensively on the podcast over the years. A great way to help people out.

Of course, the holiday season is coming up for many people towards the end of December. But in the US, lots of people are about to head home for Thanksgiving, and you may be the tech person who gets roped into fixing family members’ technology. This is where RIM can be very useful, because you can remote into someone else’s computer and get stuff done.

Just in time for the holidays, RIM is introducing a new plan that will give you unlimited time for interactive sessions during the weekend for a year. So this is obviously geared at people who might be assisting others during the weekend, or trying to do a bit of work at home on things.

This plan is going to be available as a $50 add-on for anybody who’s got a current RIM plan. And this begins on Black Friday, the 24th of November, and it ends on Cyber Monday which I think is the next Monday, isn’t it? – the 27th of November.

If you haven’t tried RIM yet or you don’t have a current plan, you’re not left out. It’s available for $99 for anyone without a plan.

Now when Cyber Monday is over, the $50 price for existing customers will be gone, and the plan will be $99 regardless of whether it’s an add-on or a stand-alone plan. So this is one sweet deal, even if you’re not using RIM at all yet. Pay $99 and RIM your heart out for a year every weekend.

If you would like to find out more, keep checking the Pneuma Solutions website. (Things will be going live in the next few days as I record this. I’m mindful that this episode goes public just a little bit before Black Friday, so I wanted to get this in there.) PneumaSolutions.com is where you go. That’s P-N-E-U-M-A Solutions.com.

A Third-party Screen Reader for Mac May Be on the Way

When Apple has special events showcasing Mac hardware like the one that they hosted at the end of October, my mouth waters. The geek in me gets very excited about this.

The gap between the performance of Mac and Windows hardware may narrow next year, but for now, Apple Silicon has a considerable edge.

I used an Intel Mac as my primary computer for 4 years – from 2012 to 2016. There is simply no way that I could have done the work that I needed to do with the same degree of productivity, had I not been able to run Windows in a virtual machine.

VoiceOver has some cool and innovative features. But it’s also lacking in some key areas which I’ve blogged about, and which many of us have discussed on this podcast over the years. There’s also been mounting frustration at the relatively low level of resources Apple seemingly allocates to VoiceOver on macOS.

Now, the extent to which a blind person perceives this as a problem is going to vary depending on the work that you do. If you’re an audio engineer, working with macOS and VoiceOver may well give you not just an adequate experience but, in some ways, a vastly superior one to Windows. Go into any professional recording studio producing music in particular, and chances are good that you’ll be working in Pro Tools for Mac. The audio support in macOS is robust and second to none.

But if you’re working in an office environment using Microsoft Office, things become far more marginal and in my view, considerably less efficient. And if you need to customise proprietary applications for efficient access on the job, well frankly, request a Windows machine.

I have an original M1 Mac which we bought to keep tabs on the platform. Even that original hardware is very impressive. Being able to run many iPhone apps on the Mac, doing FaceTime and iMessage, all of these things are useful. It’s super fast, all while maintaining excellent battery life. I just wish there were a considerably better screen reader.

So like many, I was intrigued to learn of a project in its infancy from a developer in Portugal whose name is João Santos (and I apologise if I’m mispronouncing that name). Now, as you would expect, I have of course reached out to him to see if he’d be willing to talk about his project here on Living Blindfully.

I haven’t heard back from him yet. I hope very much that we can have that conversation on a future episode.

But here’s what we know based on his posts to an AppleVis forum:

He began thinking about the viability of a third-party screen reader for macOS because of his frustration over what he calls the “general neglect” of macOS accessibility from Apple.

He’s at the very early stages of the project and there’s much to do, but he has reached the point where he has produced a YouTube video showing Vosh (the name of the screen reader, though at this point it’s a working title) working in Safari. It already works at the system login screen as well.

He hopes to emulate NVDA’s look, feel and command structure.

He intends for the project to be open source, so developers with the time and skills can collectively contribute to its improvement.

He appreciates the need for scriptability and hopes to implement support for AppleScript, and possibly other languages.

One of the most frequent criticisms we hear of VoiceOver for macOS is the plague of “Safari not responding” messages that at least some VoiceOver users hear when trying to use the web. To try to alleviate this, he’s looking at a cache (a static copy, if you like) of a web page, which the user can then navigate through. If the screen reader determines through a notification event that something has changed on the page, then that snapshot will be refreshed. This could create a more responsive experience and reduce, if not eliminate, the “Safari not responding” messages.
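
To make that idea concrete, here’s a toy sketch of the pattern in Python. This is my own illustration of a cache with event-driven invalidation, not the developer’s actual code, and every name in it is invented:

# A toy sketch of the caching idea: navigation reads come from a cheap
# local snapshot, and an accessibility "changed" notification invalidates
# the snapshot so the next read rebuilds it. All names are hypothetical.
class PageSnapshotCache:
    def __init__(self, fetch_accessibility_tree):
        # fetch_accessibility_tree is the expensive call into the browser,
        # the one that can otherwise trigger "not responding" stalls.
        self._fetch = fetch_accessibility_tree
        self._snapshot = None

    def get(self):
        # Navigation keystrokes read the local copy; only a missing or
        # invalidated snapshot pays the cost of a fresh fetch.
        if self._snapshot is None:
            self._snapshot = self._fetch()
        return self._snapshot

    def on_page_changed(self):
        # Called from the notification event; the cache is simply marked
        # stale rather than being rebuilt immediately.
        self._snapshot = None

The interesting design choice, as I read it, is that the screen reader stops querying the live page on every keystroke, which appears to be where the stalls come from.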

Already in the spirit of open source, another developer who’s been working on building a suite of frameworks for building a Mac screen reader in Apple’s Swift programming language has offered to collaborate.

A number of people have asked whether this could be ported to iOS. My understanding is that while a third-party screen reader is technically possible on macOS, iOS is too locked down. This also appears to be the view of the developer working on Vosh.

Clearly, this project is one to watch. As the developer acknowledges, building a screen reader which can be used in real-world scenarios is a massive undertaking. And what we have now is a rudimentary proof of concept after a week of work, so it’s important that we stay realistic about what’s been done so far and how long something truly viable might take to come to fruition.

And time will tell exactly how much information a third-party screen reader can access. In other words, what application programming interfaces and other frameworks exist for a third-party screen reader? It could be that we end up with a screen reader that has a user interface more familiar to Windows users, but with the same fundamental flaws under the hood because of operating system restrictions.

A screen reader is a critical piece of technology. It can determine whether we can do a particular job or not.

I hope it succeeds and thrives.

But even the fact that it’s being attempted, and that it’s being received so enthusiastically by the blind community, should send a very strong message to anyone at Apple who cares about the way the blind community perceives the company, the quality of what they’re producing, and the arm’s length approach Apple takes to its customers – an approach which is not culturally appropriate. I say that because “nothing about us without us” is the catch cry of the disability sector.

Yet, as we discussed in last week’s episode when we learned of the call by a non-disabled developer to boycott Apple’s Feedback Assistant until such time as Apple changes its ways, Apple is squandering a lot of goodwill. People are willing to give generously of their time and expertise to file detailed reports and enhancement suggestions, yet Apple provides no mechanism for customers to track the status of the bugs they file, or to search to see if similar bug reports already exist.

So the cultural problems that have led to the creation of Vosh are not uniquely directed at blind people, but they affect us in a unique way. If a screen reader is subpar, then we can’t effectively use the hardware we paid for. So it’s a subject about which blind people are becoming increasingly frustrated and understandably passionate.

There’s a potential downside of Vosh’s creation as well.

Apple didn’t make VoiceOver for macOS out of the goodness of their hearts. They made it because the last third-party screen reader for what we now call macOS was discontinued, and that put the lucrative education and government markets at risk, because Apple’s products would have been completely inaccessible and therefore illegal for many organizations to procure.

So if Vosh becomes viable, what might Apple do? Will they finally appreciate how quickly they’re losing the goodwill of the blind community and resource VoiceOver properly? Or will it be neglected even further because Apple is happy to pass this problem on to blind volunteers? Volunteers who, in the future, may burn out. Time will tell.

But for now, even the concept of Vosh is a much-needed shot across Apple’s bow. And I will always support an initiative built upon the principle that we, the blind, are the best people to create our own solutions. I wish it every success.

What are your thoughts about Vosh? Do get in touch. opinion@LivingBlindfully.com, or give me a call – 864-60-Mosen in the United States. 864-606-6736.

Is Blindness-aware AI Customer Service the Future? Be My AI Comes to the Microsoft Disability Answer Desk

I guess next month, many of us will be thinking about the year in review, and about how the history of 2023 in blindness technology will be written.

I think one of the key developments of this year will be Be My AI, formerly called Virtual Volunteer, from Be My Eyes. On this podcast, we were the first to give you a comprehensive demo of it just as it came out. It was a little bit sluggish then, but it has improved beyond measure since. And for me, it’s become an essential tool in the toolbox.

And things just keep on evolving.

I’ve got a press release here from Be My Eyes, and they say:

“Today, Be My Eyes is announcing the successful deployment of Be My AI, a state-of-the-art visual assistance tool powered by OpenAI’s ChatGPT vision model.

Beginning this month, Microsoft Disability Answer Desk callers who are blind or low vision can now use Be My AI to handle all types of customer service calls, involving highly complex use cases on everything from Excel spreadsheet formulas and interpreting product instructions and diagrams, to rebooting a laptop or installing and updating software, and much more.

This is the first use of AI accepting image inputs to augment traditional customer service for people with disabilities.

In fact, the deployment, which is rolled out globally, is producing stunning early results that meet what Be My Eyes refers to as the 3S success criteria.”

Now, the first of these 3Ss is success. “90% successful resolution rate by Be My AI for Microsoft customers who try it. Put another way, only 10% of consumers using AI interactions are choosing to escalate to a human call center agent.”

And the second S is speed. “Be My AI solves customer issues in 1/3 the time on average compared to Be My Eyes calls answered by a live Microsoft agent. 4 minutes on average for Be My AI, 12 minutes on average for live agent support.”

And the third S is satisfaction. “Customer satisfaction ratings have improved with the implementation of Be My Eyes in Microsoft’s Disability Answer Desk, with interactions averaging 4.85 out of 5 stars.”

And a quote here, “Our new AI-powered accessibility customer service model is producing remarkable results for not only one of the world’s largest and most innovative companies, but also for a community of blind and low vision consumers.”, said Mike Buckley, CEO of Be My Eyes.

“Be My Eyes is the customer service gold standard for companies that want to assist blind and low-vision consumers, and we are incredibly thankful to Microsoft for their leadership, willingness to partner on accessibility CX solutions, and their help communicating the benefits of Be My AI to other enterprises so we can advance our mission to make the world more accessible.”

And another quote, “OpenAI is proud to work with Be My Eyes. They’ve used our AI models to significantly enhance the daily lives of people with low vision or blindness.”, said Brad Lightcap, COO at OpenAI.

“Their commitment to making technology more accessible showcases a remarkable blend of empathy and innovation, underscoring the potential of AI when directed by an understanding of community needs. We’re glad to see Be My Eyes deploy Be My AI successfully.”

How it works

With an addition of Be My AI at the front end of the experience, blind and low vision consumers can receive rapid, automated support for Microsoft-related products and services through AI-based natural language conversations.

Be My AI can also provide contextual advice-based guidance on the request at hand. For example, Be My AI can guide you through the steps to install a new version of Windows, describe a PowerPoint presentation, and provide suggestions about the best setup based on your computer’s preferences and hardware.

Further, Be My AI responses are given in text, so for the first time, customer service is accessible through AI text for people who are deaf-blind via a refreshable Braille display.

And if Be My AI isn’t enough to solve the issue at hand, calls can still be routed to a live agent from Microsoft’s Disability Answer Desk.”

“Microsoft is proud to be the first company to integrate Be My AI into our customer service with the Disability Answer Desk. The integration of Be My AI creates an innovative, fast, and efficient way for disabled customers to receive assistance from Microsoft.”, says Jenny Lay-Flurrie, Microsoft’s Chief Accessibility Officer.

“By combining Microsoft’s expertise and the innovative solutions from Be My AI, we’re empowering independence for people with disabilities. Thanks to the collaboration between Microsoft and Be My Eyes, users worldwide will now be able to use Be My AI to resolve issues specific to Microsoft’s suite of applications, hardware, or general questions and inquiries.”

“Up until now, Microsoft offered verbal and visual assistance via agents through their disability answer desk.

Now, Be My AI is a first line of automated support that’s proven to resolve the majority of requests without human assistance in initial beta testing. This first-of-its-kind integration meets the constantly increasing demand for accessible customer service.”

“The global population of people who are blind or have low vision is approaching 300 million, and is predicted to increase dramatically over the next 20 years. The large and growing number of blind and low vision people globally, roughly equivalent to the population of the United States, means every company, non-profit, academic institution, and public sector organization has both a moral and business imperative to better serve our community.”, said Mark Riccobono, President of the National Federation of the Blind.

“The deep commitment of Be My AI to have their product development driven by the lived experience of blind people is truly transformative.

As a result, Be My AI is one of the most exciting technological assistance solutions we have ever evaluated. And given the initial results and the customer service environment, which has historically been a significant source of frustration for blind consumers, we encourage every organization to evaluate this remarkable tool as an important option.

Microsoft’s Be My AI-powered Disability Answer Desk is available 24/7.

While blind users of Be My AI will always have the option to use traditional human conversations with a Microsoft agent, the addition of an AI option gives users a new and accessible choice for many requests.

With Be My AI, Microsoft continues to create a more accessible environment for their customers who are blind or have low vision.

Microsoft’s implementation of Be My AI demonstrates its dedication to providing innovative customer support and paves the way for more efficient and effective support solutions.”

That’s the press release from Be My Eyes, and it represents a significant new venture for blind people and Be My Eyes itself.

The Disability Answer Desk from Microsoft is a very good testbed for this, but I think it’s even more interesting to think about other use cases.

Many of us dread calling some customer service representatives where blindness may play a role in how we engage with them, because we know that if we tell them that we’re a blind person, it’s going to freak them out and we might get an apology. “I’m so sorry to hear that.”, or “Have you got a family member there?”.

And over the years on this podcast, we’ve heard stories of people calling their internet service provider and saying, “My internet’s not working. I need a resolution.”, and the first thing they’re asked is, “What kind of light is flashing on your router or your modem?”. And then you say, “Well, I can’t tell you, because I’m a blind person.”. And that’s when you start getting the apology and the family member thing.

And it doesn’t matter how tech-savvy you are, you can say, “I know my router’s working in general because I can access things on my local area network and the Wi-Fi’s all up, so it does seem like an internet connectivity issue.”, but they’ve got a script and they’re fixated on all the light we cannot see.

[laughs] See what I did there? Good cultural reference.

But if you have an AI that is actually trained to understand assistive technology and how to engage in an appropriate way with a blind person, then this could be very significant in a wide range of customer service scenarios.

We’ll give you a quick demo of this. We’ll open the app first.

Open Be My Eyes.

We’re at the top of the screen now.

VoiceOver: Get live video support.

Jonathan: And we’ll flick right.

VoiceOver: Call a volunteer, button.

Browse specialized help, button.

Jonathan: We want to go in there.

VoiceOver: In progress. Search. Search field.

Jonathan: I’ll just flick right for now.

VoiceOver: Assistive technology, button.

Careers, button.

Personal health, button.

Technical, button.

Jonathan: Sounds like that’s where we should go.

VoiceOver: Be My Eyes. Closed, button.

Google. Open, button.

LinkedIn. Open, button.

Microsoft. Open, button.

Jonathan: We’ll double tap.

VoiceOver: Microsoft. Back, button. Microsoft.

The Microsoft Disability Answer Desk is where customers with disabilities get support with Microsoft Office and Windows.

Always open.

Info, button.

Supported languages: English dimmed, button.

Add to favorites, button.

Chat with Be My AI, button.

Jonathan: And this is what’s different. You can, if you want to bypass this completely, …

VoiceOver: Call Microsoft, button.

Jonathan: Call Microsoft, so you don’t have to go through this process if you don’t want to.

I’ll flick back, though.

VoiceOver: Chat with Be My AI, button.

Jonathan: And double tap.

VoiceOver: Back, button.

Please wait. Be My AI is writing a message.

[silence]

Be My AI: Welcome to the Microsoft Disability Answer Desk.

This is Be My AI from Microsoft. How can I help you today?

We can try some basic troubleshooting steps.

You can add a picture of your device to show the technical issue, and also describe what you’re trying to accomplish.

If we can’t arrive at a solution, then you can use the Call Microsoft button to contact the Microsoft Disability Answer Desk.

Jonathan: Well, this is quite timely.

And I realize this might be a bit of a thorny problem to throw at this AI. But I was about to do a wee rant on the podcast [laughs] about a problem I have had since installing the latest version of Windows 11, which is 23H2. And I will explain this problem to you because actually, aside from this demonstration of Be My AI, I would be really interested to know if others are experiencing it. I’m seeing it on 2 computers.

I used to be able to press Windows E, and I would get right into my quick access. And I had quite a few folders, still do have quite a few folders that are pinned to that quick access. And I’d be able to use first letter navigation to get to the quick access folder that I wanted to go to. For example, Living Blindfully, and press enter, and I would be there.

It’s sweet. It worked the same way in Windows 10. It’s been working similarly in Windows 11 until there’s been some sort of refactoring of File Explorer in this latest build of Windows.

When I installed it, what I found was that I was going to this home tab instead. And it seemed to be taking me into my OneDrive, which I definitely did not want to go into. And it was also showing me recent files, which I also wasn’t interested in.

So I went into the preferences for File Explorer, and I did manage to uncheck some of these new things like show recent files that I’m not interested in seeing. All I want is my old behavior back to get the quick access.

Now, I sort of have, except that when I press Windows E, focus doesn’t land in that nice little list view of folders anymore. Instead, I find I have to shift tab away, and tab back in, and then everything’s like the way it used to be. But having to do that every time I press Windows E is a horrible user experience. And I hope that it’s something I’m doing or not doing.

But when I go into preferences for File Explorer, there is no longer an option to open File Explorer to Quick Access.

There are all sorts of other things that you can open File Explorer to, but Quick Access is no longer in the list, at least not for me.

So I’m going to see if Be My AI can help me resolve this because every time I have to try and find something on my Quick Access list, it is frustrating.

So I’m gonna flick right.

VoiceOver: Call Microsoft, button.

Add picture, button.

Message, text field.

Jonathan: The Add Picture button could be very handy. If you’ve got some sort of problem where you’ve lost speech altogether, you could take a picture of your screen and upload that, and the AI will be able to process that.

And obviously, this is a targeted Microsoft AI. It knows about screen readers, it knows about blindness, and keyboard commands. So it may be able to assist you based on that picture.

But now, we’re at the text field.

VoiceOver: Message, text field. Is editing. Insertion point at start.

Jonathan: And I am going to type in, “In the latest build of Windows, how do I make File Explorer open at the Quick Access screen?”

Hopefully, that’s descriptive enough.

And we’ll press Enter.

VoiceOver: Please wait. Be My AI is writing a message.

[silence]

Be My AI is still writing a message.

Be My AI: To make File Explorer open at the Quick Access screen in the latest build of Windows, please follow these steps:

  1. Press the Windows key to open the Start menu and type File Explorer. Press Enter to open File Explorer.
  2. In File Explorer, press the Alt key to reveal the menu bar at the top. Use the arrow keys to navigate to the View tab and press Enter.
  3. Now, navigate to the Options button, which is usually located on the far right of the toolbar. Press Enter to open the folder options dialog.
  4. In the folder options dialog, you will find a drop-down menu labeled “Open File Explorer to”. Use the tab key to navigate to this drop-down menu, and use the arrow keys to select “Quick Access”. Press enter to save the changes.
  5. Press Tab until you reach the OK button, and press Enter to close the folder options dialog.

Now, whenever you open File Explorer, it should open with the Quick Access screen by default.

Please let me know if you need further assistance, or if you are using any assistive technology tools that I should be aware of.

Jonathan: A very keyboard-centric explanation. You are not told to click on anything.

And if you were to talk to a human at the Disability Answer Desk at Microsoft, in my experience, they are exceptionally good. They’ve been trained about screen readers and keyboard access, so they probably would have given you that explanation anyway.

But what is really cool for me is the idea of, say, HP, or Lenovo, or any PC manufacturer, really, where you have an issue, and they’re just not able to tell you, as a blind person, how to engage with your computer, because they don’t have the expertise when it comes to keyboard navigation.

So that was a very good, clear explanation.

Unfortunately, in the context of the version of the operating system I’m running, it’s wrong. When I do go into the options, there are 10 choices relating to where File Explorer should be opened to, and Quick Access is no longer one of them.

So I’m going to see what happens if I push back.

“This used to work in older versions of Windows, but it no longer works because Quick Access is not offered as one of the list of choices to open File Explorer to.”

VoiceOver: Please wait. Be My AI is writing a message.

[silence]

Be My AI is still writing a message.

Be My AI: I apologize for the confusion.

In the latest build of Windows, the option to set File Explorer to open to quick access has been changed. Here are the updated steps.

  1. Press the Windows key, type File Explorer Options, and press Enter.
  2. In the File Explorer Options window, use the Tab key to navigate to the drop-down menu labeled “Open File Explorer to”.
  3. Use the arrow keys to select this PC, and press Enter.
  4. Press Tab until you reach the OK button and press Enter to close the File Explorer Options window.
  5. Now, open File Explorer by pressing the Windows key + E.
  6. In File Explorer, use the Tab key to navigate to the left pane, where you will find Quick Access at the top. You can use the arrow keys to navigate to it and press Enter to open it. This way, you can quickly navigate to Quick Access whenever you open File Explorer.

Please let me know if you need further assistance, or if there’s anything else I can help you with.

Jonathan: Okay. It needed a nudge, but I believe those are the steps. What it’s essentially saying is that this is what I now have to do.

You can’t press Windows + E anymore and just land in that list of Quick Access things. You’ve got to go to the other pane.

A lot of change is positive progress. Some of it is not. I have no idea what the logic is to make this harder to get to, but we got there in the end. That’s accurate information.

So can somebody recommend a third-party file manager for me? [laughs] One that doesn’t cause me to have to do this every time I want to get to my list of frequently used folders.
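
In the meantime, here’s an aside for the technically inclined. The “Open File Explorer to” choice is stored in the registry as a value called LaunchTo under HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\Advanced, where 1 means This PC and 2 has historically meant Quick Access. Whether 23H2 still honours the Quick Access value now that the dialog hides it is an assumption you’d want to test yourself, so treat this minimal Python sketch as an experiment rather than a guaranteed fix:

# Minimal sketch (Windows only): set File Explorer's launch target via the
# LaunchTo registry value. Historically, 1 = This PC, 2 = Quick Access,
# 3 = Downloads. Whether 23H2 still honours 2 is an assumption to verify.
import winreg

KEY_PATH = r"Software\Microsoft\Windows\CurrentVersion\Explorer\Advanced"

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY_PATH, 0,
                    winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "LaunchTo", 0, winreg.REG_DWORD, 2)

print("LaunchTo set to 2 (Quick Access). Restart File Explorer to test.")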

Let’s give it another question, and this one is an assistive technology question.

“How do I use my Mantis Q40 with Narrator?”

VoiceOver: Please wait. Be My AI is writing a message.

[silence]

Be My AI is still writing a message.

Be My AI: The Mantis Q40 is a Braille display that can be used with Narrator on Windows 10. To use your Mantis Q40 with Narrator, follow these steps:

  1. First, make sure your Mantis Q40 is turned on and charged.
  2. On your Windows 10 PC, press the Windows logo key + Ctrl + N to open Narrator settings.
  3. In the Narrator settings, select Add Bluetooth or other device.
  4. On the Mantis Q40, go to the main menu, select Terminal and then USB or Bluetooth, depending on your connection preference.
  5. Back on your Windows 10 PC, in the Add a device window, select Bluetooth.
  6. Your PC will start searching for available devices. Select your Mantis Q40 from the list of devices.
  7. Once connected, Narrator should automatically start sending output to your Mantis Q40.
  8. You can now use the Braille keyboard on the Mantis Q40 to control Narrator.

Please let me know if you need further assistance or if you have any questions regarding the steps.

Jonathan: Well, a couple of things there.

There is no Braille keyboard on the Mantis Q40. The Mantis Q40 has a QWERTY keyboard.

And second, to be fair, I didn’t actually tell it what operating system I was using. But it assumed I was using Windows 10 without asking me for clarification.

I will type, “I am not using Windows 10. I am using Windows 11. Are the instructions any different?”

VoiceOver: Please wait. Be My AI is writing a message.

[silence]

Be My AI is still writing a message.

Be My AI: The instructions for connecting the Mantis Q40 to Windows 11 are very similar to Windows 10. Here are the steps for Windows 11:

  1. First, make sure your Mantis Q40 is turned on and charged.
  2. On your Windows 11 PC, press the Windows logo key + A to open the Quick Settings.
  3. In Quick Settings, click on the Bluetooth icon to make sure Bluetooth is turned on.
  4. Now, press the Windows logo key + I to open the Windows Settings.
  5. In the settings, select Bluetooth and devices from the left-hand menu.
  6. Click on Add device, and then select Bluetooth.
  7. On the Mantis Q40, go to the main menu, select Terminal, and then USB or Bluetooth, depending on your connection preference.
  8. Back on your Windows 11 PC, your PC will start searching for available devices. Select your Mantis Q40 from the list of devices.
  9. Once connected, Narrator should automatically start sending output to your Mantis Q40.
  10. You can now use the Braille keyboard on the Mantis Q40 to control Narrator.

Please let me know if you need further assistance or if you have any…

Jonathan: It sort of stops there. That could be a 17.2 beta issue. I’m running a beta of iOS at the moment.

But what is quite concerning there is that it is missing out a very important step, which is that you’ve got to download and install Braille in Narrator for the Mantis to work as a Braille display, or, for that matter, for Narrator to work with any Braille display. There’s no reference in those instructions to that fact.

So if you follow the steps, you will probably get the Mantis working as a QWERTY keyboard (even though it erroneously tells us that the Mantis has a Braille keyboard), but you won’t get Braille unless you go through the steps of downloading and installing Braille in Narrator. And this is absolutely silent about how to do that.

What happens, though, if you’ve really got a problem?

And the one that I have called the Disability Answer Desk for, on Bonnie’s behalf, actually, is this one.

“My PC has no sound, so I cannot hear my screen reader, which is JAWS.”

VoiceOver: Please wait. Be My AI is writing a message.

[silence]

Be My AI is still writing a message.

Be My AI: I understand that not having sound can be a significant issue, especially when relying on a screen reader like JAWS.

Let’s try some basic troubleshooting steps to resolve the issue.

  1. First, please make sure that your speakers or headphones are properly connected to your PC.
  2. Press the Windows logo key + I to open the Windows settings.
  3. Use the Tab key to navigate, and the Enter key to select System.
  4. Then, navigate to Sound in the left-hand menu and press Enter.
  5. In the Sound settings, make sure that the output device is set to your speakers or headphones.
  6. Check the volume level to make sure it’s not muted or set too low. You can use the volume keys on your keyboard or the volume slider in the Sound settings.
  7. If you’re still not getting sound, it’s possible that there might be an issue with the audio driver. Press the Windows logo key + X, and select Device Manager.
  8. In Device Manager, navigate to Audio inputs and outputs and expand it.
  9. Right-click on your audio device, and select Update driver. Follow the prompts to update the driver.
  10. If updating the driver doesn’t resolve the issue, you can try right-click…

Jonathan: Right. We’ve got that problem again where it has just stopped there.

I mean, the point I would make is that all that information would be very useful for me because I have a Braille display. But if your sound’s on the blink, then there’s not really much that can be done, I suspect.

And this is where you’ve got to use your judgment and say, alright. The AI has done what it can. There will always be times when you’ve got to have somebody come in and log into your computer remotely (if you can make that happen), and take charge, and hopefully get those things restored for you.

So it seems to me that if you’ve got a serious technical issue, you probably just want to get on to Microsoft directly through Be My Eyes right away.

But if you’ve got all sorts of “how do I” questions when you’re using an application or a feature in an operating system, then this could save you a lot of time. And of course, you can ask it questions to your heart’s content. You know you’re not holding up an agent, and the AI never gets tired of you. Isn’t that comforting?

But as we’ve seen with these examples that I thought of at random (not to trip it up; these are just everyday things that I immediately thought of trying), the AI can give quite convincing answers that are either wrong (as it was the first time we asked about File Explorer, because it picked an older version of Windows), or incomplete (as was the case with the Mantis question, where it didn’t give the critical step of enabling Braille in Narrator).

So this is Be My AI in a customer service context for Microsoft. I’m sure there’ll be more in future.

Have you been kicking the tires of Be My AI in the context of the Microsoft Disability Answer Desk? I’d love to know your experience. How’s it worked out for you?

Drop me an email with an audio attachment, or write it down. opinion@LivingBlindfully.com.

You can give me a call on our US number if you prefer, and leave a voice message. 864-60-Mosen is that number, 864-606-6736.

[music]

Voiceover: Stay informed about Living Blindfully by joining our announcements email list. You’ll receive a maximum of a couple of emails a week, and you’ll be the first to learn about upcoming shows and how to have your say. You can opt out anytime you want.

Join today by sending a blank email to announcements-subscribe@LivingBlindfully.com.

Why not join now? That’s announcements-subscribe@LivingBlindfully.com, and be in the know.

A Funny Thing Happened to Me on My Way Home

A lot of people are going to be traveling in the United States over the next week because of Thanksgiving.

And I should report that Bonnie made it safely to the United States. She’s got the Amex out, and she and her friend have gone for a facial and they’ve gone to a steakhouse and all sorts of scary things. I’m delighted that she’s having such a good time over there.

And on this podcast, we sometimes hear about travel things that have gone wrong, usually where airlines haven’t treated us particularly well, and it’s right that we point those things out.

But sometimes, there are just interesting, odd, cute, curious little stories. And this one happened to me last week.

And since there is so much travel going on, even if you’re not in the United States, (you may well be traveling for the holiday season coming up in a month or so), I thought I would recount this because it was really interesting. [laughs]

I won’t disclose too much that might divulge the name of the individual. I’m going to be careful about that because this person did me a favor. But it was interesting.

I was in a city where I had taken a day trip. I left home at 7.

I caught an 8:15 flight.

I was back in Wellington again by about 7 PM, so it was a very quick trip.

And I’d done what I came to do. And I went into the Air New Zealand Kōru Lounge, which was the frequent flyer lounge, to get a bit of work done.

And I said to the person at the desk, “If you wouldn’t mind getting some assistance for me when it’s time to board, I really would appreciate that.”

And she said, “That’s no problem. We’ll get that organized.”

So the boarding call was getting closer, and I was listening carefully for it.

And the boarding call came, and there was no assistance. So I thought, “No big deal. I’ll just head towards the counter, head towards the door. Either I’ll find somebody who’s also on the plane, or maybe the assistance will turn up.”

So I stood up. I went towards the counter.

And sure enough, there was a passenger who said to me, “I’m on that flight. How can I help?”

And I said, “If you wouldn’t mind me taking your elbow, and I’ll just walk with you, that’d be super.”

He said, “No problem at all.”

So we chatted away, got to the plane, and then of course, the flight attendant said, “Oh, you two aren’t sitting together. Would you like me to fix that?”

I said, “No, it’s fine. We’re just walking on board together. We don’t even know each other.”

She said, “Okay, no worries.”

So I got my window seat. I can assure you it’s for the sleep, not the view. I can nod off on any plane at the drop of a hat because I’ve done so much flying. So I like the window seat where I can curl up and not be annoyed.

And I did sleep most of the way back because actually, I had been doing my Beatles show on Mushroom FM which started at 3 in the morning [laughs], so it had been a very long day by the time I caught my flight just after 6.

And we got to Wellington, where I live, and I got off the plane, and there was no meet and assist at the other end.

And I thought, “OK, well, I’ll just follow the throng.”, you know.

And then the same passenger said, “Would you like me to guide you into the terminal?”

I said, “Thank you, I would really appreciate that.”

We got into the terminal, and I said to him, “Look, depending on where you’re going, if you’re just heading out to a cab, I’ll walk with you.

If you’ve got baggage to collect or whatever, just drop me off here, and I’ll sort it out.”

And he said, “Have you got a bag?”

I said, “No, I’ve just been up for the day.”

He said, “Where are you heading now?”

I said, “Home.”

And he said, “Where is home?”

And I thought, “Hmm, that’s an interesting question.”, but I gave him the suburb that we live in.

And then we’re walking along. And remember, this is a terminal I’m very familiar with. And we start going down a flight of stairs I know I have never been down before.

I’m thinking, “What is this? This is a new part of the terminal. It’s actually quite quiet.”

And this is the point when I tell people the story, and they say, “What? He might have been wanting to kidnap you or something.” But he didn’t seem like the kidnapping type. I’m not really sure what the kidnapping type is.

Anyway, we walked down these stairs I’ve never been down before, and we’re outside now.

Then there’s a vehicle there with its door open, and it’s kind of idling.

And I said, “Oh, thank you very much. Is this a Wellington Combined Taxi?”

And he said, “Oh no, this is a Government Crown Limousine. I’m taking you home.”

So we get in the back of this Government Crown Limousine with a chauffeur in the driver’s seat.

And now that it was clear we were going to be spending a few minutes together or more, I thought, “Well, I want to find out who this guy is who’s so important, that he’s got a Government Chauffeur-Driven Limousine taking him home.”

I said, “What do you do?”

And he told me. [laughs]

And I thought, “Oh my goodness, this is just as well I’ve been on my best behaviour.”

So I turned up at home in a Government Crown Limousine, and the chauffeur opens the door, and gets me out, and makes sure I’m at the front door. And I do wonder what the neighbours might have thought of me, turning up in this chauffeur-driven Government Limousine.

Isn’t it extraordinary, the things that can happen to you when you travel?

I bet you’ve got some travel stories.

Maybe good, maybe bad? It’s a good time to talk about travel and fun blind experiences in general, actually. It could be quite a light topic, you know, so do be in touch. opinion@LivingBlindfully.com and 864-60-Mosen, 864-606-6736.

Bonnie had a good laugh when she heard that story, and thought it was pretty cool the way that I had arrived home.

[music]

VoiceOver: Living Blindfully is the podcast with big dreams and no marketing budget.

Oh dear, dear, dear.

This is where you come in. If Living Blindfully is something you value, we really would appreciate you taking a few minutes to write a positive review of the podcast.

Positive testimonials help spread the word, and encourage new listeners to give us a try.

If you have a moment to give us a 5-star review, we really would appreciate that.

Thank you for supporting Living Blindfully.

Can’t Hear VoiceOver on a Call

Caller: Hello, Jonathan. Dave from Oregon here.

I was wondering if you or our listeners might have a solution for this.

I’m running an iPhone 14 Pro. And when I called my service provider last night to ask about equipment (and that’s Verizon in this case), I was greeted by the typical electronic voice.

It said, “Press 1 for this, 2 for that, 3 for that.”

So I pressed 3.

And then it said, “I am sending you a text message. Please click on this message and go through the link and hit accept, and I will look up your account information.”

This is all mechanical voice.

When that stopped speaking, suddenly, some very loud, annoying hold music cropped in. So loud, that I couldn’t hear VoiceOver.

So I guess my question is, is there any way that you or others might know of that I can dial down the sound coming from the phone itself so that I can hear VoiceOver?

Maybe this has been discussed before, I don’t know. But it’s really frustrating.

Jonathan: Dave, my first question for you is, are you wearing hearing aids? Are you wearing made-for-iPhone hearing aids?

Because if you are, this is a problem that we’ve talked about quite extensively on the podcast in recent times. When you’re on a call, VoiceOver is so quiet for some made-for-iPhone hearing aid wearers in some circumstances that it’s impossible to hear as soon as you get hold music.

I count myself very fortunate that I’m a Braille reader and I have a Braille display. And that means that when this happens, I can at least, if I need to, check a text message or even just do some work while I’m on hold for an interminable period of time.

The way you used to be able to fix this is that on the Rotor, you could add a feature called Audio Destination.

And if you go into VoiceOver Settings, and then you choose Rotor, you will still find Audio Destination on the Rotor.

The trouble is, there is a very long-standing bug which I reported to Apple after we had a long discussion on this subject, and we isolated it. We sort of crowdsourced the bug diagnosis, and we have narrowed it down to a problem for made-for-iPhone hearing aid wearers.

I gave them diagnostics, and we did all that kind of stuff, and the bug has vanished into the bit bucket.

What you can sometimes do, if you’re lucky, is take advantage of the fact that the Rotor sometimes loses its focus. I can’t make it do it reliably. But every so often, you can flick up and you are adjusting the Audio Destination, even though you don’t see it on the Rotor anymore.

And it seems that if you can manage to set it to External Audio Route and then make a call, VoiceOver is nice and loud.

Now, there are some things you can do to mitigate this a little bit. One thing you can try is putting the VoiceOver volume on your Rotor, if it’s not there already (or it may be in your VoiceOver quick settings), and making sure that VoiceOver is adjusted from its default volume of 80% to the full volume of 100%. That will change the balance between VoiceOver and other media.

If by chance you don’t wear hearing aids and you’re using AirPods or something similar, that could be enough to fix the problem for you. But I suspect that given what you’re describing, it is the good old made-for-iPhone hearing aid bug that has been around for a long, long time now, and Apple continues not to fix it.

When I’m working here in the studio which also serves as my home office, I connect my iPhone’s USB-C port to my mixer. And the good thing about that is that if I’m making phone calls during the day, I never have the problem because I’m not using my made-for-iPhone hearing aids as made-for-iPhone hearing aids at that point. Everything’s coming through the mixer.

So I do sometimes, when I’m going to make a call where I know I might be on hold for a while, actually have to come down here and plug into the mixer through the USB-C port. Or of course, you can just use a pair of wired headphones or something if you can manage that, and plug them into the USB-C port. Then, it’s all resolved.

So for me, it’s definitely related to made-for-iPhone hearing aids. It is just beyond staggering that Apple inflicts this sort of serious bug on made-for-iPhone hearing aid wearers for such a long period.

Now, Dawn Davis is writing in about this, but it sounds like her problem is a little bit different. It starts off the same, but then, I’m not so sure because she says:

“I have not heard any other comments on this topic, but I have an issue with my iPhone 15 Pro and iOS 17 where if I am on a call or listening to something and need to press a button, say, in the menu of a phone call, VoiceOver is so quiet, that I cannot hear it. This means that I have lots of trouble ringing certain companies which rely on menus to navigate to the right department, etc.”

Now, this is where it gets different. She goes on to say:

“If I am reading a book or listening to a podcast and a spoken notification comes in, I have to stop what I am reading to hear what is on the screen.

I find this extremely annoying, and have never had it in any other version of VoiceOver or on any other iPhone.

Has anyone else been experiencing this problem, or is it just me?”

So Dawn, the first part of your email does sound like the problem I was describing, which Dave called in about.

If you are not able to hear notifications when you’re just listening to a book, then that’s something different.

I would check two things.

First, you might want to go in and make sure that the volume of VoiceOver itself is cranked up to 100%.

And second, you might want to check that audio ducking is enabled.

And that should work. I have not had any issues with the volume of VoiceOver beyond when you’re on a call.

Transitioning to Windows for Music Production

Matthew Whitaker writes:

“Dear Living Blindfully team,

I am reaching out for some expert advice, as I transition my music production settings from Mac Studio to a Windows environment.

I intend to utilize NVDA for screen reading, and will be connecting my PreSonus StudioLive 64S system, which facilitates 65 channels of audio through a single USB cable to my computer.

I plan on using Reaper for recording, which has worked well with my current gear on my Mac.

Connected to my PreSonus StudioLive 64S are all the instruments in my studio including mics for drums, percussion, mics for my Hammond organ, a couple of stereo keyboard inputs, bass and guitar setup, and vocal mics.

Here are a few questions I have regarding this transition:

  1. Audio routing for streaming.
If I need to route audio from NVDA speech or Reaper to OBS for streaming, how can I achieve this on a Windows setup? Are there Windows applications similar to Loopback by Rogue Amoeba for audio routing?”

Well, I’m not familiar with this particular audio interface, Matthew. But certainly with the Audient Evo 16 which we demonstrated in episode 257, you could just create a particular pathway for all those things. So you could have your screen reader being sent to a particular set of outputs, and you could also have those outputs being bounced back to certain inputs.

Now, there is software that will do this. I think Virtual Audio Cable is one.

There’s another one that is pretty commonly used now whose name I forget, because I have a mixer to do these things.

But yes, I believe that there is such software, and it is possible to get this done.

Perhaps, others can comment on options for doing this.
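
For what it’s worth, the common recipe on Windows is a virtual audio device such as VB-Audio’s Virtual Cable: you point NVDA’s synthesizer output (or Reaper’s master bus) at the cable’s input, then add the cable’s output as an audio input capture source in OBS. If you want to confirm that Windows actually sees the virtual device before wiring everything up, here is a minimal Python sketch. It assumes the third-party sounddevice package (pip install sounddevice), and the “CABLE” names are simply what VB-Audio typically registers:

# List every audio device Windows exposes, so you can confirm a virtual
# cable (typically named "CABLE Input" / "CABLE Output") is installed.
import sounddevice as sd

for index, device in enumerate(sd.query_devices()):
    directions = []
    if device["max_input_channels"]:
        directions.append(f"{device['max_input_channels']} in")
    if device["max_output_channels"]:
        directions.append(f"{device['max_output_channels']} out")
    print(f"{index}: {device['name']} ({', '.join(directions)})")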

“2.”, says Matthew, “Windows laptop recommendations.

I am also on the lookout for a sturdy Windows laptop suitable for music production, streaming, and gaming, which would smoothly integrate with my PreSonus StudioLive 64S system and manage plugins from Native Instruments, iZotope, Roland Cloud, Waves, UJam, and Universal Audio.

I am open to expanding my plugin collection depending on the accessibility improvements over time.

Any recommendations?”

It sounds like you want a pretty intense gaming type machine with plenty of processing power, so that will probably mean that you won’t get the smallest or lightest laptop in the world.

I am using a Lenovo ThinkPad X1 Carbon, but I don’t think it will have the grunt that you need for serious production.

It’s fine for the spoken word stuff that I am doing. I can apply a few plugins and do editing, and I really enjoy being able to edit anywhere. And because it has built-in 5G, everything stays in sync with the cloud.

But I suspect you will want something a bit more punchy, and maybe somebody has some recommendations for you in that category.

Now we’re on to Matthew’s item 3, “Computer keyboard recommendation.

I am in search of a full-sized computer keyboard with an application key.

Any suggestions?”

I guess it depends on whether you like mechanical keyboards or not.

I’ve got a SteelSeries mechanical keyboard. It is great to type on. And of course, it’s noisy. Mine’s got Cherry switches, I think, and it’s just brilliant to type on for long periods.

That might not be the best option for you. If you’re mainly doing production, you need something a bit quieter. And I guess anything will do in that regard.

But maybe others have some recommendations.

“4. Backup solutions.

I am considering options for reliable backup solutions. I’ve heard that utilising a NAS (Network Attached Storage system) could be effective, but I am curious about its accessibility with NVDA.

Are there other backup solutions you would recommend that are accessible and reliable?”

Well you know, there is always the cloud. You could point Reaper to a folder on a cloud service like Dropbox or OneDrive.

I like doing that because I have my desktop and my laptop in sync. And no matter what project I’m working on, it’s always there on either machine in its current form, and that’s pretty brilliant.

For full system image backups, there are a number of options. We did have somebody recommend something quite a lot of episodes ago, so you might want to search for backup, or image or something like that, and it might come up.

But I’m using the old Windows 7 backup utility which at least still ships with Windows 10, and we back up the entire Mushroom FM computer via that Windows image to my Synology network attached storage drive, which is kind of idiosyncratic, but it is usable.
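
And if you want something scriptable alongside an image backup, one simple approach is to mirror your project folders to the NAS with robocopy, which ships with Windows. This is only a sketch: the paths are hypothetical, and note that the /MIR switch makes the destination an exact mirror, so anything you delete locally is deleted on the NAS too:

# Sketch: mirror a Reaper projects folder to a NAS share with robocopy.
# The paths are hypothetical. /MIR mirrors deletions as well as additions.
import subprocess

SOURCE = r"C:\Users\Matthew\Documents\REAPER Media"  # hypothetical path
DEST = r"\\nas\backups\reaper"                       # hypothetical share

result = subprocess.run(
    ["robocopy", SOURCE, DEST, "/MIR", "/R:2", "/W:5"],
    capture_output=True, text=True,
)
# Robocopy exit codes below 8 indicate success (0 means nothing to copy).
print("Backup OK" if result.returncode < 8 else result.stdout)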

He says:

“I am optimistic about a smooth transition to a Windows setup, given my prior experience with Reaper on my Mac Studio.

Your insights on the aforementioned enquiries would be invaluable in ensuring a seamless transition and optimised setup.”

Well, let’s also open it up for people who might have recommendations on really nice, grunty laptops, on keyboards, and, I guess, on virtual audio cable-type technology as well.

And good luck with the switch back to Windows!

Is This an Ableist Job Description for a Disability Role?

Caller: Hey, Jonathan. It’s Jim from sunny Florida.

Wanted to share something with you.

I just was playing around on LinkedIn. I was actually doing it more for an organisation, non-profit that I work with, volunteer with should I say. And I got a job advertisement sent to me about Nova Southeastern University.

Now, I’m not trying to start a big cabal about Nova Southeastern because maybe, just maybe, whoever wrote this didn’t give it a lot of thought. Maybe, you know, obviously, we’ve had this conversation, all of us as a community before. There’s a lot of work we have to do.

And I also want to say they are a non-profit. So, you know, they may not be required, most likely not be required, (I’m not a lawyer, as you know, my background is in rehabilitation), they may not be required to subscribe to the tenants of the Americans with Disabilities Act.

But that’s where my “but” ends, as I say, but what about?

So they have an advertisement for their assistant director of disability services at Nova Southeastern University in Davie, Florida.

When reading the listing, basically what I heard in various ways is: only the able-bodied need apply. They talk about being able to speak. They actually gave a visual description of what you need to be able to see to do this job. You need to be able to stand to do this job. And there are many other things that just got really frustrating.

So if you get any copious free time and you want to look it up, … I was going to try to share it with you. But from LinkedIn, there wasn’t a way to do that.

But what I would say is this: I went to the University of Florida, as some of you already know, and we had, you know, different directors in the different leadership roles with different abilities and disabilities. And so I really became frustrated with this.

So again, I’m not trying to start a cabal on Nova Southeastern.

What I would ask, though, is that if anyone does work with them, or with other organizations like them, they take an opportunity to politely educate them about things like this.

I’m hoping it’s not just an HR AI or something crazy like that that wrote this. I doubt it. But I wanted you to know that this particular job description was quite frustrating for me, because the essential functions were listed not just as skills and abilities; there were some actual physical abilities in there. And I know people have done jobs like this all over the world without being able to stand or see, and in some cases, speak or hear. And so I don’t know, it’s kind of frustrating.

Jonathan: Well yes, I can understand why you’re frustrated with this, Jim, having heard your description of it.

And now, I have read the actual advertisement.

Let’s read it together. And it’s gonna take me a while to read this. But I want to do this because as you listen, I want you to think, could I do this job? Or do I know a blind person or a low vision person who could do this job? ’Cause the hammer’s gonna fall a bit later, I promise you.

It says:

“We’re excited that you are considering joining Nova Southeastern University.

Nova Southeastern University, NSU, was founded in 1964, and is a not-for-profit independent university with a reputation for academic excellence and innovation.

Nova Southeastern University offers competitive salaries, a comprehensive benefits package including tuition waiver, retirement plan, excellent medical and dental plans, and much more.

NSU cares about the health and welfare of its students, faculty, staff, and campus visitors, and is a tobacco-free university.

We appreciate your support in making NSU the preeminent place to live, work, study, and grow.

Thank you for your interest in a career with Nova Southeastern University.”

I feel like I should be playing some really cool music. You know, do the whole voiceover thing, yeah.

Anyway, …

“NSU considers applications for all positions, without regard to race, color, religion, creed, gender, national origin, age, disability, marital or veteran status, or any other legally protected status.”

We move on to the job title, which is Assistant Director, Student Disability Services.

It is located in Fort Lauderdale. That is a nice part of Florida, isn’t it?

The work type is full-time, with full benefits.

It pays about 76,000 US dollars a year.

Let’s look at the essential job functions.

“1. Plans, directs, supervises, and coordinates work activities of subordinates and staff including hiring, coaching, evaluating, and terminating, all in accordance with university policies and procedures.

2. Supports the daily activities of the office, including but not limited to triaging calls and walk-in times, responding to email inquiries, and referring student concerns to the appropriate resources.
3. Reviews and interprets relevant documentation and student accommodation requests for students with disabilities.
4. Responsible for housing accommodation requests, works with the director on any facility or accessibility requests. For example, classroom access, program access, regional campus, etc. Processes accommodation verification forms for registered students.
5. Participates in interactive process with students to review accommodation requests, determines and grants reasonable accommodations, monitors, adjusts, and evaluates individual accommodation plans, and informs students on the accommodation process.
6. Participates in the weekly file review process.
7. Maintains confidential notes on the office’s overall case management system and workflow, maintains files and documents of students with disabilities on campus.
8. Interprets requests and determines an action plan for essential and emergency matters.
9. Contacts and responds to faculty, staff, and campus constituents inside and outside of the university using judgment and discretion.
10. Prepares special or one-time reports and summarizes, replies to inquiries, and selects relevant information from various sources.
11. Collaborates with director and case manager(s) to create and implement social and educational programs. Develops campus-wide awareness on disability, accommodations, and accessibility-related topics.
12. Arranges testing accommodations for students with disabilities. At times, may be responsible for sitting in as a proctor and/or reader and scribe.”

Okay. I’m checking off my little list here thinking, yes, a blind person could do this, a blind person could do this. And it’s not until we get to point 12 that we have one issue which might require some sort of accommodation for it to be done.

Now we’re on to point 13.

“13. Provides assistance to students with disabilities at commencement and convocation ceremonies.

14. Attends and participates in meetings and special events as assigned, including staff, division, and college functions.
15. Assists with overall supervision of case managers, graduate assistants, and student employees within the office.
16. Represents the office and shares personal information with internal and external contacts in the form of training sessions and presentations and at university events.
17. Learns and maintains current knowledge of federal, state, local laws and regulations related to disabilities.
18. Contributes to the continuous improvement of case management and the Office of Student Disability Services.
19. Completes special projects as assigned.
20. Performs other duties as assigned or required.”

Now, we’re on to the marginal job functions.

“1. Serves on college and university committees.

2. Contributes towards creating a positive and respectful workplace.
3. Uses access to sensitive and/or not yet public university-related information only in the performance of the responsibilities of the position, and exercises care to prevent unnecessary disclosure to others.
4. In the absence of the director, assumes the leadership of the office.
5. Continually seeks efficient ways of providing services by minimizing procedural requirements and bureaucracy.”

So this is looking pretty good, right? This is looking like a job that a blind person could generally do, other than that one function for which there might be an accommodation.

Now, we go on to required knowledge, skills, and abilities.

“1. Proficient, in-depth knowledge of federal, state, local laws, rules, regulations and college and university policies related to disabilities.

2. Advanced knowledge of MS Office, including Word, Excel, PowerPoint, Outlook, Database, Maxient, and Data Entry Skills.
3. Customer and personal service. Proficient, in-depth knowledge of principles and processes for providing customer and personal services. This includes needs assessment, meeting quality standards for services, and evaluation of user satisfaction.
4. Knowledge of best practices and innovations in the field of higher education disability services.”

Alright. So far so good.

Now, let’s move on to the skills required.

“1. Proficient skills in talking to others to convey information effectively.
2. Proficient skills in communicating effectively in writing as appropriate for the needs of the audience.
3. Critical thinking. Proficient skills in using logic and reasoning to identify the strengths and weaknesses of alternative solutions, conclusions, or approaches to problems.
4. Judgment and decision making. Proficient skills in considering the relative costs and benefits of potential actions to choose the appropriate one.
5. Complex problem solving. Proficient skills in identifying complex problems and reviewing related information to develop and evaluate options and implement solutions.
6. Service orientation. Proficient skills in actively looking for ways to help people.
7. Strong organizational and attention to detail skills.”

Apparently, that one requires no further explanation, because we move on to point 8.

“8. Demonstrated creativity, initiative, and follow-through.

9. Excellent interpersonal communication skills, with the ability to maintain good working relationships with students, faculty, staff, and other constituents.
10. Strong supervisory training and leadership skills.”

What abilities do you need?

The first ability you need is the ability to work under pressure in a demanding academic environment, demonstrate decisive insight, and develop novel approaches to achieving access.

2. Deductive reasoning. The ability to apply general rules to specific problems to produce answers that make sense.
3. Inductive reasoning. The ability to combine pieces of information to form general rules or conclusions, including finding a relationship among seemingly unrelated events.
4. Problem sensitivity. The ability to tell when something is wrong, or is likely to go wrong. It does not involve solving the problem, only recognizing there is a problem.
5. Ability to interpret and explain guidelines, policies, and procedures.
6. Ability to work effectively and collaboratively with faculty, staff, administrators, students, and partners.
7. Ability to monitor case management using a database.
8. Ability to work in sensitive and stressful situations professionally and confidentially.
9. Ability to work flexible hours, including late nights and weekends.”

So you would think that a disabled person would be ideal for a role like this, wouldn’t you? Because they can be a kind of a mentor, they can be a bit of an inspiration, and also they’ve had lived experience, first-hand experience of some of the challenges that disabled students might be experiencing.

But get this. This is where things really start to go off the rails with this position description.

We’re going on to the physical requirements and working conditions.

“1. Speech recognition. Must be able to identify and understand the speech of another person.

2. Speech clarity. Must be able to speak clearly so others can understand you.”

So if you have something like Apple’s Personal Voice, which we’ve been talking about on here in recent months, or some older technology, then apparently not for you, either.

“3. Near vision. Must be able to see details at close range within a few feet of the observer.

4. May be required to work nights or weekends.
5. Must be able to travel on a daily and/or overnight basis.
6. May be exposed to short, intermittent, and/or prolonged periods of sitting and/or standing in performance of job duties.
7. May be required to accomplish job duties using various types of equipment/supplies, to include, but not limited to, pens, pencils, mobility devices, and computer keyboards.”

The ad does go on a bit further. But I wanted to read that because first of all, I think it’s important for people to hear this and say, “Yeah, I could do this. I could do this. This is a job I could do most of”, perhaps with the exception of the proctor-type arrangement, for which surely some other arrangement might be made for a candidate who can do so much of this work.

The ad does conclude by saying that the university is committed to observing the Americans with Disabilities Act. So it might be that if you as a qualified person applied for this, then they would make some sort of arrangement regarding the bit you might not be able to do like the proctoring.

But it begs the question, is putting all of this stuff in the ad itself a huge disincentive for disabled people to apply at all?

One of my pet peeves has always been this driver’s license requirement. We’ve talked about this on the show before, where an employer concludes that because somebody needs to get from A to B in a timely manner, they must have a driver’s license, whereas we know that there are plenty of alternative techniques we can use to do that thing in a different way.

And this seems all the more puzzling because it is a disability-specific job. If these offices can’t write inclusive job ads, we can hardly expect the rest of the world to be friendly to us, can we?

What do you think of this? Have you seen job ads like this, where organisations or offices that should be welcoming disabled people with open arms in fact are not? There’s a lot of coded language in this ad.

opinion@LivingBlindfully.com, if you’d like to comment on this.

You can also call our listener line in the United States – 864-60-Mosen, 864-606-6736.

Advertisement: Living Blindfully is brought to you in part by Aira, and I thank them for their sponsorship of the podcast.

You know we’ve become used to some businesses offering free Wi-Fi. It’s a nice touch, and it makes us feel valued whenever we come across it.

And I know similarly that when I learn about a business that has purchased Aira Access, it’s like putting out a massive “Blind people are welcome here.” sign. I know that if I need it, I’ve got a trained professional agent available to me to provide assistance, and that means that the business cares enough to pay for that. I appreciate that.

From airports, to Starbucks, to Target, and more, Aira Access can assist you to navigate, shop, browse and not be reliant on friends, family or others who may not understand our needs as well. And don’t forget that as well as the offerings in physical locations, there are other businesses providing Aira Access that can be used from home.

So you should check out the app to find out what’s available. To do that, just open the Aira Access section in the Aira Explorer app on your favorite device. You can also visit Aira’s website to find out more at Aira.io. That’s A-I-R-A.I-O.

Keypad Problem When Updating iPhone

In the UK, Brian Gaff has put fingers to keyboard.

Actually, no, I think that might be misrepresenting what he’s done. I think he may be dictating this message on his iPhone.

He says:

“I just tried to do an update from 17.1 to 17.1.1, (although it doesn’t benefit me, using an iPhone XR).

And when it asked for a PIN before it verified the update, there was no keypad on the screen, even with all Bluetooth devices in the house switched off. It seems to be looking for a keypad or something that it can’t find, and the only thing I could do was to actually enable the Bluetooth keyboard and type the PIN in on that.

So if you haven’t got a keyboard with you when you do the update, it’s not going to work, is it? I just couldn’t get past it because obviously, with no keypad to type on, you can’t enter the PIN.”

Sorry, I’m being a bit hesitant there because there’s no punctuation, and that was quite a difficult message to read. [laughs] But I do get the gist, Brian.

And I have seen this too, actually.

I was just doing an update on the iOS 17.2 beta today, as I put this podcast together, and I had this very thing. Normally, the keypad is visible. Today, it was not, and I had to use the Mantis keyboard to enter my pin.

So not sure what’s going on there, or whether the problem even occurs if you’ve never connected any kind of Bluetooth keyboard to your device. But it was a curious one.

Comments on Selena Mills Interview

Voice message: Hi, Jonathan! Edvard here again. Hope you’re doing well.

I keep listening to your episodes, and they are great as always.

Just a little feedback on your last episode, when you were talking to Selena about the definition of the word “blind”.

Now, I personally think that the word shouldn’t be banned. People can use it if they want to.

But it’s really down to each person’s preference, how they like to identify themselves.

So for example, me. As I said in my last message, I can still see lights, and I have light perception. So me, I identify myself as vision impaired because again, that’s seeing something.

So I don’t say I’m totally blind, even though people do say, “Oh, you’re still totally blind.” But no, that’s just my personal opinion.

So all I want to say is, it’s really up to each individual how they like to identify themselves. It was very interesting, anyway.

The other thing I want to say is I’ve been experiencing this weird bug in Microsoft Teams. I don’t know if you use Microsoft Teams, or have you had any experience with it?

But basically, if I connect to a meeting on my computer, people are hearing this weird echo. And when they speak, they hear the echo of their own voices as well. And it’s only when I connect to the meeting. When I disconnect, it’s back to normal.

But when I reconnect, they go like, “Ah, Jesus. Here’s Edvard here again. Oh, here’s that echo back.”

Well, they don’t really act that way. I’m just saying that to be a little bit funny.

But anyway, if you have any solutions on how to fix that, please let me know because when I reconnected on my phone, I had absolutely no problem. It’s just my PC that does that.

Now, there was an audio setting to turn off to prevent echo. But that’s muting my microphone too, so people wouldn’t be able to hear me if I turn this off.

So if you have any tips on how to fix that, please let me know.

Jonathan: Sounds like a nice excuse not to turn up to all those Teams meetings at work.

Yes, I do use Teams very regularly.

And if I were to make a diagnosis based on what you’ve said, it would be that you’ve got the wrong input device selected in Windows.

If you go into the settings in Teams by pressing Control+Comma, you can arrow through until you get to Devices, as it’s called in Teams. See what your microphone input is set to, because it could be set to some sort of device that’s echoing your output back. That would cause people in Teams to hear what they’re saying coming back to them with a little bit of latency, and that’s your echo.

So I would go in and try to select a different input device in that dialog. And I suspect that if you just select one that’s purely the microphone, with no special name like “What U Hear” or similar, then that will do the job.

I don’t know what audio interface you’re using, whether you’re using on-board sound on your PC. But I suspect that that’s where your issue lies.
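If you like to tinker, you can also check exactly which input devices Windows is exposing before you even open Teams. Here’s a quick diagnostic sketch, nothing official from Teams, and it assumes you have Python installed along with the third-party sounddevice package:

    # list_inputs.py: print every audio input device the system exposes.
    # Loopback-style entries ("Stereo Mix", "What U Hear", virtual cables)
    # are the usual culprits when other people hear their own voices echo.
    import sounddevice as sd

    for index, device in enumerate(sd.query_devices()):
        if device["max_input_channels"] > 0:
            print(f"{index}: {device['name']} ({device['max_input_channels']} inputs)")

If a loopback or virtual cable device shows up in that list and Teams happens to be pointed at it as the microphone, that would produce exactly the echo described here.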

Good luck! What a shame you’ll then have to attend all those Microsoft Teams meetings again.

Not Happy with the Victor Reader Stream 3

Voice message: Hi there, Jonathan, and everyone out in Living Blindfully land. It’s Pam MacNeill here from Silver Stream in Upper Hutt, as opposed to Silver Stream, Maryland.

Hi, everyone.

I purchased a Victor Reader Stream 3 when they first came out, and I took it on holiday with me to the Gold Coast in Queensland, Australia in September, and I had lots of books loaded onto it that I was looking forward to reading.

However, I found very quickly that the Stream lost its place, so that when you turned your book back on, you were at a different place from what you expected (normally, it would be at the place you were the last time you’d turned it on), which was really frustrating.

I managed to get the books off the Stream. I had taken my laptop with me so I put them onto my phone, onto Voice Dream Reader instead. And so I was able to read them.

But I was so disappointed that I contacted Humanware, and was told that an update would be coming out and I thought, well, I’ll hang on then for the update, which has just come out.

Now, what I’m finding is that when I press the key to read the book, half the time, nothing happens. I can press it 5 times, and nothing will happen.

And then, I’ll walk away and sort of curse a bit, and then come back and press it, and it will play.

But I mean, that’s ridiculous. And I’ve tried it several times, and it just keeps doing this.

I must admit, I’ve been so frustrated so far that I haven’t actually tried anything else with it.

So I will be interested to see what others think, but I’m very unhappy with it. And yes, I do think that they rushed the release unnecessarily. It would have probably been better to have slowed it down and made sure that there weren’t any bugs.

And the thing that upsets me the most is that I have had every iteration of the Victor Reader, and none of them have had any bugs. So why is this so buggy?

Jonathan: Thanks for sharing that, Pam. And in the email that came with the recording I just played you, she also says:

“As an appendix to this, I must say I dislike the removal of the vibration on powering up and down the unit.

Also, the placement of the speaker is counterintuitive since if holding the unit in one hand, the speaker is covered up by the fleshy part of one’s palm.”

And Rich Beardsley’s writing in, and starts off on the same theme. He says:

“I hope everyone is well, and I want to wish everyone here in the States who celebrates it a happy Thanksgiving.”

Well, the same goes for me, too. I hope everybody in the States who’s listening has a wonderful Thanksgiving.

Rich says:

“I can’t believe the holidays are here already. It feels like we just celebrated Thanksgiving and Christmas.”

Not to me, mate. [laughs] Not to me. I’m really looking forward to the Christmas break. That, I can tell you.

Rich continues:

“I wanted to comment on some of the points raised in the last 2 episodes.

First, the Victor Reader Stream 3rd generation.

I had one for a little over a month. But after talking with some friends and looking at the SensePlayer, I decided to sell my Stream 3 and purchase that device instead.

When it comes to the Stream 3, there are things that I like, but there’s a lot I don’t like. There are even some things that I feel the Stream does better than the SensePlayer, but I’ll get to that later.

Just like every other Humanware product I’ve seen, the Stream 3 feels very high-quality.

Sometimes, it’s hard to type with the rubber buttons, but it could still be done.

I appreciate that they created a good menu system. On the older Streams and the Trek, if you miss an option, there’s no backing up. You have to keep pressing 7 until it loops back around, or exit the menu and go back in.

Although it was in mono, the Stream 3’s speaker wasn’t bad, and the mic was also pretty good.

When it comes to downloading, I appreciate that downloads can go on in the background while you’re doing other things on the device. Not being able to do this on the SensePlayer isn’t a deal-breaker since things typically download fast, but it is something I would like to have.

Another positive about the Stream is that you get notifications for downloaded files and software updates. I can understand why you don’t get notifications for downloads on the SensePlayer, but not software updates. The only way to find out about SensePlayer updates is to subscribe to the HIMS newsletter, or manually check for updates.

Now for things I don’t like, and there’s quite a bit.

The first thing is that there’s no user-replaceable battery. I understand that most smartphones today do not have this either, but here’s the thing. If I need to get the battery in my iPhone replaced, I can bring it into an Apple-certified repair place, and they’ll replace the battery and give me my device the same day.

With a product like the Stream, you have to ship the device to them, wait for them to receive it and start the work, then you have to wait for them to send it back to you. Depending on shipping delays and how long their process takes, who knows how long you may have to wait to get your device back?

When it comes to the internal storage, Humanware could have done better. It’s great to see that they made the internal storage accessible, but why only 16GB? The Trek has 32, and the SensePlayer has 64.

When I asked someone about this, they said it was because of the maps that you had to download on the Trek. While that makes sense, 16GB is not enough space for a lot of people. Yes, you can put an SD card in. But if someone doesn’t want to do that, they should have more space.

The next problem is not with the Stream, but with Humanware as a company. Credit where credit is due: assistive technology would not be what it is today if it weren’t for Humanware.

Recently though, the company has demonstrated that they do not care about their user base or their products.

When the Trek came out in 2018, the company had given us a list of features we were going to get, like an FM radio, support for indoor navigation, and access to Galileo. But none of them were added, and the Trek hasn’t been updated for 3 years at this point.

With the Stream 3, people have been reporting bugs, and it seems like nothing is being done to address them. I’m still subscribed to Humanware’s newsletter, and I haven’t received anything about an update, or anything saying that the company is still working on the product, since version 1.1 was released.

The only thing they’re doing now is sending information on sales they’re doing, and occasional updates to their Connect 12 magnifier tablet.

When Humanware released the BrailleNote Touch Plus in 2019, they claimed that the device was going to leave Android 8.1. And 4 years later, we’re still running that version of Android, and it hasn’t received any security updates since 2019.

I’ve spoken with some of the tech support reps there, and they said that the Touch will not be leaving Android 8.

Not only does this pose a security risk; it means that sooner or later, the device that you spent $6,000 on will become a paperweight when Google changes APIs and apps no longer run. Not sure if you saw, but Perkins Institute here in the United States made a post detailing some of the issues caused by running such outdated software.

Humanware has continued this pattern with 3 of their devices now, and that’s why I’m done with the company and I’m considering selling my Touch.

From my observation over the past couple years, it seems like the company almost forgets that they have these products, then they suddenly remember them and push out an update.

KeySoft for the BrailleNote Touch is still updated, but nothing on the Android side.

The SensePlayer is running a version of Android 11, and the BrailleSense 6 is getting Android 12 soon. So unless Humanware steps up their game, they’re going to be left behind.”

Thanks, Rich.

Just before I go on to the next topic, a comment from me.

I don’t have any particular skin in this game. I don’t own any Humanware products at the moment. Well, except for the Mantis, I guess, which is produced by Humanware, and I am happy with that. It is getting updates.

I’ll never own a Stream or a SensePlayer, for that matter. Because for me, my iPhone does all those things. I just want one device that does everything well. The operating system updates come out like clockwork, and for all the bugs, at least it’s one single modern device that does so much, one less thing to carry, one less thing to charge, one less thing potentially to lose when you travel so much, and I don’t really want a notetaker either.

But I would just say that I think the trouble Humanware has is that they’re a small company with a small engineering team, and these products are expensive to manufacture and maintain. Maybe they are a little bit over-committed. The education market is still very much calling for these sorts of devices, and it may well not really care about the version of Android being run.

But that, in no way, negates what you’re saying.

I have heard, though, that the Stream 3 is still actively being worked on. We know Audible support, for example, is in the works. So at least, I think we can say with confidence that that one is very much alive and actively being maintained.

Continuing with Rich’s email, though, he says:

“The next topic I wanted to discuss is Apple and the Feedback app.

When it comes to minor issues, Apple is usually quick to fix them.”

Well, that’s not my experience, Rich, but okay. [laughs]

“For the critical ones, like VoiceOver randomly deciding it doesn’t want to talk anymore, nothing ever got done about that, and I gave up on trying to report it. I still experience it occasionally on iOS 17, but it’s not as bad as it was with iOS 16, and it’s much easier to make it start talking again when it decides to do that.

The problem with this type of reporting system is that they always ask you to reproduce the problem. With the VoiceOver issue, there was no way to reproduce it. You will be using your phone normally, then it decides it doesn’t want to talk.

I’m also having problems with VoiceOver having a major delay in Safari, but that’s not as critical as losing speech.”

Yes, I can confirm that one as well, Rich. Safari has become quite sluggish in places, especially when you’re on a site (only some sites, though) and you want to enter something into the address bar.

Rich continues:

“On the topic of personal voice, it’s not bad for what it is. While I think it’s a feature that could benefit a lot of people, I think they should have waited and fixed the issues with their existing accessibility features before deciding it was a good idea to make new ones.

I agree that the stuff ElevenLabs is doing is good technology, but it’s also scary. You could potentially run into a lot of issues with this tech if bad actors got their hands on it.

Finally, I want to respond to Chris. I get where you’re coming from with the whole JAWS script and NVDA add-on thing. But until every developer cares to learn about accessibility and how to make sure their stuff is accessible, some of these solutions will be needed.

VoiceOver recognition on iOS may not be perfect, but it works. I know you said you don’t use it. But sometimes, you may not have much choice, and using screen recognition is your best option.”

I mean, I think accessibility is definitely a big part of it, Rich, but it’s also efficiency, as I was saying in response to Chris’s email. Efficiency is so critical if we’re going to succeed on the job and be as productive as we can, and as productive as our sighted peers.

“To Jonathan,” concludes Rich, “thanks as always for a great show, and keep up the good work.”

Well thank you, Rich, for a very thought-provoking email. I really appreciate the time you took to put it together.

Any Tips for a New Mantis Q40 User?

Let’s talk Mantis Q40, the Braille display with a QWERTY keyboard.

“Hello, Jonathan,

My name is Rene Nielsen, and I’m from Denmark, Europe, country code 45.”

Oh man, we’re well past your episode, Rene. [laughs]

Good to hear from you.

“Today, I received my new Mantis Q40, and so far I’m very impressed. I’m looking forward to working with it.

It goes without saying that I’ve read the manual before turning on the device.”

See, it is worth saying that, Rene, because a lot of people don’t. I’m one of those geeky people who reads the manuals for everything, cover to cover. But a lot of people don’t.

Anyway, and Renee says:

“And it is now up to date. Hooray!

My question for you is, do you, as a longstanding user, have any additional tips and tricks to share regarding Mantis as such, and in connection with JAWS and iOS? I intend to use Mantis, both as a Braille display and external keyboard.

Looking forward to hearing your response.”

Well, it’s all about personal preference, I think, Rene. But I like to change the thumb key assignments, both on the Mantis itself and in screen readers that work with the Mantis. My personal preference is to have the leftmost thumb key advance the display. This is because I’m used to reversing the panning controls on other Braille displays. And I find that with my reading style, it speeds me up: as my right hand reaches the end of the line, I can just tap that leftmost thumb key with my left thumb, and the display advances. I really like that.

And if you go to LivingBlindfully.com and you search for Mantis, you will find an episode in which I demonstrate how to do this, particularly on iOS. Because in iOS, it’s not particularly obvious.

You can change the assignments in Keyboard Manager in JAWS. But unfortunately, in my view, JAWS 2024 has taken a serious step backwards, at the moment at least. They have assigned the thumb keys to accommodate the new split Braille feature, which is a feature I like very much. But the way it’s been done means that those functions are in scripts that are not directly controllable from Keyboard Manager.

You can go into Keyboard Manager and reassign the default commands to those keys. You just can’t, at the moment, assign split Braille commands from Keyboard Manager. And I think that’s highly regrettable because I’d like to use split Braille, but I need to be able to change my thumb key assignments so that I can be as productive as possible.

You can go in and mess around in the other script file. It’s a bit messy, but you can do it.

But I’m hoping that now that the pressure’s off (I presume the pressure’s off), and JAWS 2024 is released, that we will get those scripts in Keyboard Manager so we can assign them to the keys that we like.

So that would be the only thing I can immediately think of, assuming you’ve read all the manual and you know all the power of the Mantis, is to play with those thumb key assignments and find out which works best for you.

Others may also have some hints that they want to pass on to you. Please feel free to share them. opinion@LivingBlindfully.com, or on the phone at 864-60-Mosen in the United States.

I’m sure you will enjoy the Mantis. It’s a great product.

[music]

Advertisement: Mastodon is the social network where the Living Blindfully community is most active. Join us there for conversation about the most recent episode, and items of news to help you live your best life with blindness or low vision.

All you have to do from any Mastodon instance is follow podcast@LivingBlindfully.social. That’s podcast@LivingBlindfully.social.

Uber Accessibility and Self-identification

Let’s return to a subject that comes up quite a bit on the show. This relates to Uber.

Catherine writes:

“Hi, Jonathan,

I listened with interest to the discussion on episode 256 about your recently discovered ability to self-identify in the Uber app as someone using a service dog.

I live in the US, and eagerly loaded up Uber to check and see if this feature was available for me.

Sadly, it doesn’t seem to be.

I went to Help, Accessibility, and clicked on all the possible submenus there that seemed like they might be a place where I could do this, but no such luck.

I am not currently a service dog user. But I would like to be able to identify myself as a cane user, both to save me the trouble of texting the driver about this each time I call a ride, and to allow for better tracking on Uber’s end if a driver engages in discriminatory behavior.

Perhaps if many cane and service dog users can self-identify as such, and a driver engages in a pattern of canceling rides as soon as they are made aware of their rider’s status, that will make it easier for Uber to bar drivers who demonstrate that pattern from the platform.

I did see one interesting thing when I was touring the accessibility section of Uber – a blurb about drivers who are deaf or hard of hearing. I of course firmly believe that deaf and hard of hearing drivers should have equal access to the Uber platform. But I did have an experience a few years ago where a deaf driver picked me up.

I was going to a place I had never been to before, in a town I was just visiting for work. I was completely unable to communicate with the driver, and he actually dropped me off at the wrong place.

I had to seek sighted assistance after the drop-off to get to my destination, which luckily was at least within walking distance of where the driver dropped me off.

This was a very scary situation, and I did not feel safe in the car because I could not effectively communicate with the driver. I think I even tried to text the driver from the app while I was in the vehicle, but that didn’t work.

I saw today in the accessibility menu that Uber has some helpful ASL symbols that they show sighted riders when the driver is deaf to aid with communication. But of course, this would not help blind riders, and the driver may feel equally unsafe or concerned if they have no effective means of communication with the rider via visual hand signals or other gestures.

So Jonathan, a philosophical question for you and your listeners.

If Uber gives us the ability to self-identify as blind and we call for a ride, should the app at least make an effort to match the blind rider with a hearing driver?

This feels discriminatory in my gut, but it’s also a safety issue. That incident with the deaf driver happened years ago, but I still think about it and wonder if there’s something else I could do to ensure that I get to my destination safely if I should have a deaf driver again.

Thanks for your thoughts, and congrats to Bonnie on her New Zealand citizenship.”

Ooh! You do have the conundrums to give us, don’t you, Catherine? Quite a conundrum.

Thank you very much for that very thoughtful email.

I can see a scenario where I get into an Uber and I don’t realize the driver’s deaf, and wonder why they’re not speaking to me.

I guess they might be able to text me. But then, you kind of think, do I want my driver looking at their phone as opposed to concentrating on the road? I mean, it’s not so bad if you’re at the beginning of the journey. The destination does appear in the Uber app, though.

But if you need to give instructions as you get closer to the destination or something like that, then the driver should be concentrating on the road, not looking at your texts. So it is a bit of a dilemma.

On balance, I would say it might be good for both parties, as you say, who may both be feeling a bit unsafe, if that match didn’t take place.

But I would be really interested in others’ thoughts on this, and their experiences with deaf ride share drivers. Please feel free to get in touch.

Now, in an effort to be helpful, I will show you exactly where I have found this feature relating to service animal disclosure, and then people will know whether it’s available where they are or not. I’m curious about why it might be available in some markets, but not others.

So we’ll open the Uber app.

Open Uber.

Shall we have a look at my rating? Oh boy!

VoiceOver: Account tab, 4 of 4, button.

Jonathan: Okay. What am I rated?

VoiceOver: Jonathan Mosen. Edit account. Average rating from drivers: 4.96 out of 5.

Jonathan: I mean, I suppose that’s not too bad.

But what infuriates me is that I notice that sometimes, my rating goes down after I’ve been in an Uber with Bonnie and Eclipse, the seeing eye dog to eclipse all dogs. And that’s just fundamentally dodgy in my opinion, because Eclipse is a very well-behaved, well-groomed dog.

Anyway, I will flick right.

VoiceOver: Personal, button.

Help, button.

Jonathan: And we want to go into help.

I’ll double tap.

VoiceOver: Jonathan Mosen.

Jonathan: And handily, if I perform a 4-finger tap on the bottom half of the screen to get to the last item on the screen, …

VoiceOver: Accessibility, button.

Jonathan: There’s the accessibility button, which I shall double tap.

VoiceOver: Select accessibility.

Jonathan: Now, I’ll flick right.

VoiceOver: Resources for riders with disabilities, button.

Using TalkBack and VoiceOver, button.

Identify yourself as a passenger with an assistance animal, button.

Jonathan: And here it is in our version of the Uber app.

So I’ll double tap.

VoiceOver: Possibly, back.

Help, heading.

Identify yourself as a passenger with an assistance animal, heading.

Jonathan: And there’s quite a bit of information here.

VoiceOver: Service assistance program (SAP) information page.

You may choose to identify yourself as a person who travels with an assistance animal. This unlocks certain features designed to improve your experience when riding with Uber, including:

• Tracking of driver partner-initiated cancellations involving an assistance animal.
• Getting priority support by our dedicated customer service team, in the event that your trip does not go to plan.
• Reminding driver partners of their legal obligation to accept riders with an assistance animal.

If in the future you no longer wish to be identified, you may opt out at any time via this help article. If you do this, Uber will no longer involve you in the program, and the above features will no longer apply to your Uber profile.

You may opt in again at any time.

Once you’ve toggled one of the options below and clicked submit, your Uber account will be updated. You’ll also receive a confirmation email when program features are live and active on your account.

For more information about Uber’s privacy practices, please visit this website.

Our team has received some questions about the program, so we thought we’d collate the frequently asked questions below.

FAQs

Uber’s service assistance program

Will you share my accessibility needs with driver partners?

No. Accessibility needs of program participants will not be published onto any public forums or made visible to driver partners at any time.

Will I need to wait longer for a driver partner to accept my request?

No. None of the program features will impact wait times, or reliability of the service.

We’ve received some feedback around trip experiences involving longer wait times or increased cancellations since opting into SAP.

The launch of the program in October 2021 coincided with higher demand, traditionally seen over the end of year period, and so it was observed that riders in general across Australia/New Zealand experienced longer wait times and higher levels of trip cancellations.

To date, therefore, we’ve confirmed that this experience was not isolated to SAP participants.

We understand that this can be frustrating, and can assure you that we’re consistently working hard to improve the rider experience more broadly, as well as for SAP participants.

What is the support phone number?

Australia: +61 1800 990 532

New Zealand: +64 800 005 374

How do I report a service denial?

Tap through the below in your rider app and follow the instructions from there: Help > Accessibility > Report a service denial, assistance or assistance animal issue.

How do I submit feedback?

You can write in to our support team by following this link.

Jonathan: So that’s what we have, and it sounds like this is an Australia and New Zealand initiative, and that it may have been around for a while.

So I’m not sure, Catherine, that it would suit your purposes because it’s not designed to have you self-identify as a cane user, only a service animal user, and perhaps that could be one suggestion that could be made to expand the program.

But when Uber were talking with me about this, I said I don’t really feel comfortable switching this on because most of the time, I’m not traveling with a service animal. And as you can tell, it’s very much service animal specific.

So that’s how it works, at least in Australia and New Zealand. I don’t know whether it’s available in some other countries. It would be good to hear.

All the Light We Cannot See, and Apple Accessibility Bugs

A couple of items from Jeanie Willis here in New Zealand.

She is a fan, a big fan, of All the Light We Cannot See, which we had a caller tell us about in last week’s episode.

Jeanie says:

“Well, I had a pretty late night last night, as I got captivated by this drama that Netflix had recommended to me.

I haven’t read the book so can’t comment on that, but I think the TV series is very well done, and can’t wait to finish it tonight. I think it might be another late one.

I was fascinated to hear this morning that the actress herself is blind. I’m assuming we are talking about the adult actress, as there is also a child actress playing the main character in flashbacks.

This certainly explains how natural and real this character is, and leads me to wonder if the critics don’t think it is good acting because she isn’t the groping-around or, conversely, superhuman can-do-anything blind portrayal they are used to.

The audio description is beautifully done, and gives a small, appropriate amount of focus to some of her movements and navigation, such as a mention of her lightly brushing the back of her hand along the back of a table as she passes it. But it all seems very natural.

It is an absolutely gripping drama set in World War II France, in a coastal town towards the end of the war, where she broadcasts each night on a local radio from her attic, reading pages from 20,000 Leagues Under the Sea.

My only initial criticism was that I haven’t come across any Braille readers who are quite as fast and fluent as she is when reading aloud. I guess I’ll have to take that back, now that I know she isn’t just pretending to read the Braille.

There is a scene in an underground, partly sea-filled cavern where she has gone to find oysters, as food is very scarce, where she is almost captured by an SS soldier who has been trying to find her.

I was impressed with the realism of what she would and would not be able to do to defend herself as a blind person, the disadvantage, and in other ways, some advantages of not needing to see to get away in the dark, and also the advantage of lack of expectation from the soldier that she would be able to do anything to defend herself.

So far, this is definitely a thumbs up for me, a great drama regardless of the fact that the main character is blind. In fact, the same story could have been told just as well if she wasn’t. She just happens to be.

And it is very everyday, just another French girl aspect of this portrayal that I am impressed with.

And a few little things, like the main title coming up in Braille dots and then changing to letters, are a nice touch.”

You make me want to watch it, Jeanie. I hope to get around to doing that.

And if you have seen All the Light We Cannot See, what do you think of it? Do let us know.

The only thing I got to watch last week was on Apple TV Plus, Season 4, Episode 1 of For All Mankind because Bonnie and I love that show, and we’ve been on tenterhooks, tenterhooks we’ve been on, waiting for the next season to come out.

But while I’ve got all this time on my own, while Bonnie’s in the US, maybe I can fill a bit of it with All the Light We Cannot See, because you can’t work 24/7.

And wait, there’s more.

Jeanie Willis is going to talk about Apple stuff.

She says:

“Oh, great.

After hearing the letter from the developer boycotting the Apple process, I am realising what the various emails I have back from Apple Accessibility in my inbox probably mean. They keep asking me to resend, yet again, various screen recordings and diagnostic logs of the ongoing problems: handwriting, glitchy-sounding VoiceOver on my iPhone, and a watch that doesn’t want to wake properly. It probably means that they closed my initial reports because they couldn’t reproduce them and didn’t tell me, and/or that they now need new logs each time they update, and have closed my old reports.

So I am starting to think it is unlikely that it is even being looked into or investigated. I’m wondering if it is even worth bothering emailing them anymore.

Such a huge difference to the process I am currently working on, reporting on accessibility issues with the MuseScore music notation software.

MuseScore is an open source, volunteer-developed program. And when I log an issue on GitHub, anyone can search for it and see that it has been logged, and where work on it is at.

They can come back to me via that at any time for more info, and I’m updated with notifications whenever there is new activity on an issue I have either logged or commented on. This is how it should be.

The collaborative nature has helped them find some issues they couldn’t initially replicate, as a few of us using it who are blind were able to quickly work out what might be different in our setup to theirs and make suggestions for things to try. They wouldn’t have got there without this back and forth, as they had no reason to have the live Braille panel open, and it turned out tabbing across it caused the item in question to lose selection. And of course, I wouldn’t have worked that out because I couldn’t see the selection had been lost.

The point being that working together gets things sorted.”

Well, that is a maxim for life as well, isn’t it, Jeanie?

I completely agree. It is such a shame that Apple is burning goodwill at such an alarming rate with this community, a community which really does appreciate the marvels that Apple has delivered.

But the fact that they will not properly engage with this community in a way that transparently chronicles and tracks bug reports, and allows us to search for and contribute to them, is an opportunity lost. And it is also a very serious issue, given how frequently we use these devices in vocational settings these days.

My iPhone 15 is Still the Bane of My Existence

Avnish is writing in again, and says:

“Dear Jonathan,

Thank you very much for responding to my last email, where I had listed 4 major problems with my shifting from the SE 2022 to the Apple iPhone 15 Pro.

Taking your great advice into consideration, and with my never say die spirit, I am now able to slowly crawl back into using my iPhone like I did previously with the SE 2022.

However, …” Oh no! I knew there was a but coming.

“However, I must confess. Every day brings a new host of problems.

However, we shall overcome. Yes, overcome, with your advice and my doggedness.”

Righty.

“Latest major problem number 1: As I usually do, I fully charge my phone at night, lock the screen, and put it on my bedside table. Perhaps I did wake up in the middle of the night and pick up the phone and place it in the drawer of my bedside table so that I would not knock it off accidentally.

I usually wake up in the morning when my mother calls me about 9am. However, I got no calls and I woke up at 10am today.

I was a bit concerned because my mother is 97 years old, and ailing. I was worried why she had not called me today.

Quickly, I opened my bedside drawer and grabbed my new Apple iPhone 15 Pro, the bane of my life.

Nothing. No noise. No VoiceOver. Nothing.

I tried Raise to Wake, and presented my sleepy face to the camera. Nothing.

I pressed the lock button and expected it to say Face ID not working, but nothing. No audio. The phone was as quiet as a church mouse.”

Wouldn’t a church mouse squeak, though? Anyway, I’ll carry on.

“I clearly remembered I charged it to full 100%, so what was wrong?

I tried every trick I knew to wake up the phone, but it was totally dead.

I quickly put in the charging cable to see if the phone would charge. However, I was sure it was charged to 100% just before I went to sleep and did not use it during the night.

Sure enough, the phone woke up as soon as I plugged in the juice. It showed battery percentage 1%.

I quickly charged the iPhone to 30%, and then headed over to Settings and Battery. The battery settings showed me that my phone had discharged between midnight and 5:30 AM. It went from 99% to 0% in 5.5 hours.

What was the app which took up all the juice?

Camera! It confirmed that the camera was on from midnight till 5:30 AM. But I am absolutely sure I didn’t turn on the camera at midnight.

So what would have happened?

The way I figure it is that when I picked up the phone and dropped it in my bedside drawer, I may accidentally have touched the camera button, and the camera came on. Is that possible? To me, that seemed the only probable explanation for an iPhone battery going from 100% to 0% overnight.

Phone dead.

Important calls could not be missed.

So what is the solution?

I figured the only way was to remove the camera and flashlight buttons from my lock screen. It seems there is no way to do that. At least, the Apple software engineers have given us no clear method to do this.

When I went to Screen Time and tried to remove the camera from the lock screen, it removed it completely, even from my home screen. I could not access the camera at all if this was done. In fact, even my Be My Eyes app could not access the camera when I had removed it in the Screen Time settings.

So obviously, this is not the solution.

I have searched the web and found no solution to this problem.

As is usual, when there is no apparent solution to something, you have to ask Jonathan.”

[laughs] No pressure.

“By this lengthy email, I am doing this exactly.

Please help!

I wish my problem emails were only Now and Then.”

That’s a good song.

And Avnish has sent another email after further investigation, and makes the following points:

“1. It’s important to note that as of now, there is no official way to remove these buttons from the lock screen without jailbreaking the iPhone, which is not recommended due to potential security and warranty issues.

2. If you’re concerned about accidentally activating the flashlight or camera, you can also consider turning off Raise to Wake in the Settings app, under Display and Brightness, to prevent the flashlight from turning on unintentionally.”

Yes, a comment from me. I would highly recommend turning off Raise to Wake. It’s annoying, and I don’t really see many benefits for a blind person in having it on, because sometimes you lift the phone up and it starts to chatter away. It’s not a feature I like at all. So one of the first things I do when setting up a new iPhone is turn off Raise to Wake.

Avnish says:

“I am still very keen to know your solution to this problem, and how you would resolve it. Please do elaborate in your next podcast.”

Jonathan: Alright, I’ll have a go at this.

My guess, Avnish, is that you’ve left the torch on. It’s called the flashlight in some countries; we call it the torch here.

It’s pretty easily done because you can just double tap flashlight or torch, whatever it’s called where you are, on the lock screen and then it’s on.

You’re obviously not the first person to have done this, because there’s a VoiceOver feature just for this purpose.

If you go to Settings, Accessibility, VoiceOver, Verbosity, there’s an option there that relates to torch/flashlight alerts.

And when you enable this, you will apparently get an alert if you leave the flashlight/torch on for too long. I’ve not actually heard this work because I’ve never accidentally turned it on. So I don’t know how long you have to wait and what the alert consists of.

I actually have just switched it on, and am waiting here to see if anything happens. And so far, nothing has.

So maybe somebody who’s used this can tell me exactly how it works, how long you have to wait, and what kind of notification you get when you’re told that the flashlight or the torch is enabled?

But it sounds like that’s a definite must have, if you don’t have that on already, to switch it on.

Actually, a quick edit here in real time. I just locked my phone screen and unlocked it. And as soon as I did that, VoiceOver said “Torch on”. So it’ll probably say “Flashlight on” if you’re calling it flashlight in your market.

So that may be all that it consists of, but I guess it’s better than nothing. And it’s critical to make sure you don’t accidentally tap that button.

Another thing I do as well is disable the Tap to Wake feature. Some people like this. I do not. It’s where you just tap the phone and it causes it to wake up.

That is definitely going to increase the likelihood of turning that flashlight on.

To find that setting, go to Settings, Accessibility, Touch, and the Tap to Wake option is there. That way, if you’re putting the phone in the drawer and you accidentally touch the screen, it’s not going to wake up and potentially inadvertently enable that flashlight.

It shouldn’t make a difference but, just to be absolutely sure, you might want to close any apps in the app switcher that have been using the camera as well.

“But wait, there’s more,” Avnish continues.

“I also mentioned earlier a host of other problems with the iPhone 15 Pro.

Another one which I have discovered now is when I am using text shortcuts. I use the text shortcuts extensively. In fact, I have at least 500 shortcuts in my phone.

Say for example, I use a phrase very frequently like best wishes and many blessings.

I have assigned the shortcut key BW for this phrase. Whenever I type BW, the whole phrase is printed.

In all my previous iPhones, when I typed BW, the whole phrase would be printed, and VoiceOver would read the whole phrase to me.

Lo and behold, here comes iPhone 15 Pro.

Now, when I type BW, it does type the phrase, but it doesn’t read it out to me so I don’t know if it’s printed or not.

To some, it may seem a simple problem. But when I use 500 shortcuts and most of my emails are typed so quickly, this becomes a major deterrent in fast typing.

Could you find me a quick fix for this problem? It would really help me save so much time and energy.”

I’m not able to reproduce this one.

I use shortcuts as well. It is a great feature of iOS.

I have a whole bunch of Uber ones, for example, that I send to Uber drivers depending on my circumstances. And when I type them, the moment I’ve typed the final part of the shortcut, the full phrase appears. I press the spacebar to insert it, and I’m done.

The only thing I can think of is that this doesn’t relate to the iPhone 15 Pro at all (which is what I have), but in some way relates to some of the new prediction features in iOS 17. This one, I think Apple should be able to help you with.

I don’t have a fix for you because, as I say, it’s working fine for me as it always has. And like you, that’s exactly how it’s always worked.

The caveat here is that I am running the beta of iOS 17.2. It’s possible that there was a bug before and I never noticed.

So if you’re not running the 17.2 beta, you might want to wait until 17.2 comes out and see if that resolves it.

[music]

Advertisement: Transcripts of Living Blindfully are brought to you by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at PneumaSolutions.com. That’s P-N-E-U-M-A solutions dot com.

And be sure to check out that sweet deal that is briefly available to RIM subscribers – 50 bucks for a year’s worth of weekend unlimited RIM access.

It’s $99 for everybody else.

And you get unlimited access to RIM over the weekend for a year.

Awesome sauce, as I hear some people say. Awesome sauce!

Closing and Contact Info

Alright. I’m going to leave it there.

Thank you very much for your company this week. It’s been a pleasure as always. I appreciate very much you listening.

And a special thank you to our Living Blindfully plus subscribers.

Also, thanks to Derek and Hannah, working behind the scenes on the Living Blindfully team.

Remember that when you’re out there with your guide dog, you’ve harnessed success. And with your cane, you’re able.

[music]

Voiceover: If you’ve enjoyed this episode of Living Blindfully, please tell your friends and give us a 5 star review. That helps a lot.

If you’d like to submit a comment for possible inclusion in future episodes, be in touch via email. Write it down, or send an audio attachment, to opinion@LivingBlindfully.com. Or phone us. The number in the United States is 864-60-Mosen. That’s 864-606-6736.

[music]