Transcripts of Living Blindfully are made possible by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at

You can read the full transcript below, download the transcript in Microsoft Word format, or download the transcript as an accessible PDF file.



Welcome to 289

Send a Text or Voice Message to us on WhatsApp

Outstanding technical support from Pulseway

Accessibility of X for iOS Badly Broken

Beatles Book

The Lenovo ThinkBios Utility

Dr. Nicholas Giudice Discusses How We Ensure Blind People Can Use Autonomous Vehicles, and Robot Guide Dogs

How to Use the Capslock Key in JAWS Desktop Layout

My Recent App Advocacy Experience

TV Apps for Deaf-Blind People

Orbit Writer

Beware of Y2Mate

The Bonnie Bulletin Ahead of Convention Time

Closing and Contact Info




Welcome to 289


Voiceover: From Wellington, New Zealand, to the world, it’s Living Blindfully – living your best life with blindness or low vision. Here is your host, Jonathan Mosen.


Coming up on the show this week: now you can send your thoughts to us via WhatsApp; we review Lenovo’s ThinkBios utility – an accessible way to configure the BIOS of your ThinkPad; and Dr. Nicholas Giudice discusses blind people using autonomous vehicles and even robot guide dogs.

Thanks for joining me for another episode! And if this is the first time you’ve joined us, a special welcome to you.

It’s our 289th episode, and our famine is over in terms of episodes that link to area codes. Because this is episode 289, and area code 289 belongs to Ontario. My understanding is not the main bit of Toronto, but I believe it does reach some of the outer suburbs of Toronto, so nice part of the world. And if you’re listening to us from that part of Canada, enjoy your moment in the sun, and a happy Canada Day to you on the 1st of July.

Send a Text or Voice Message to us on WhatsApp

Now, I’ve got something exciting to tell you.

As I mentioned in last week’s episode, I’ve got a bit more time on my hands these days. And one of the things I’m enjoying doing is just playing with new technology and thinking about ways that we can enhance this podcast and make it easier for you to contribute to Living Blindfully. And one thing we’re going to trial, to see how it works out, is integrating WhatsApp with Living Blindfully.

So from now on until further notice, you’ll be able to send a voice message on WhatsApp. And if you prefer to send me a text message via WhatsApp as well rather than an email, you can do that, too.

If you do send a message on WhatsApp, do make sure you let us know who you are and where you’re from, even just your first name and where you’re from. It’d be great.

There are several advantages of integrating WhatsApp with Living Blindfully.

One is that some people do find it a bit of a chore to record a voice memo, or a recording in some other recording app, and then attach it and email it off. There’s just too much friction, as they say. Whereas if you have WhatsApp installed, it’s second nature to record a voice message.

The second is it’s free internationally, so you don’t have to call a US number if you want to make a contribution.

And the third is the audio quality is so much better than the jolly old landline telephone. So everybody’s winning.

If you don’t have WhatsApp yet, it might be a good idea to get it. It’s an incredibly popular messaging app. It’s cross-platform, so iOS, Android, Windows, and Mac users can all communicate with each other. There are quite a few blindness-related groups and other things going on on WhatsApp. You can contact many businesses these days on WhatsApp, and it is actually super accessible. I think you’ll be very impressed.

You can get WhatsApp for free from all the usual places, and install it. If you have it integrate with your contacts, then you may find that quite a few people in your contacts are already on WhatsApp and would be delighted to hear from you on it.

If you would like to contact us on WhatsApp, (we’ll have a link to this in the show notes, of course), but you can add this number to your contacts as well. And whenever you want to get in touch with Living Blindfully, you can, if you choose, text us or send us a voice message. We don’t take calls, of course, because this isn’t a live show. So just text or voice messages.

Our new shiny WhatsApp number is +44-7874-464-152. So I’d suggest you add that to your contacts, then you can be in touch with Living Blindfully whenever you want. There’s also a link on our website, and I’ll come to that in a minute. So that number again, +44-7874-464-152 to get in touch on WhatsApp.

You can, of course, still use email, and you can also give us a call on 864-60-Mosen if you want to use our landline voicemail number, which will give much inferior quality. But we want to be as inclusive as possible, so everyone who wants to can have their say.

Now, because there is this increasing plethora of ways to get in touch, (because I am thinking we might even extend to tools like Signal in the future), there is now a single page where all the contact methods are listed in one place. And I’ll be referring to that frequently from now on. So you can go to that web page, and choose the contact method that suits you best. That page is an easy one to remember. And if you go there, there’s actually a link that will cause WhatsApp to pop up if it’s installed on your phone. It’s all filled in for you, and you can send us a message after choosing that link. The email and phone number are also there. So that’s, for all the ways you can get in touch with the show. That new WhatsApp number, +44-7874-464-152.

Advertisement: Now, Living Blindfully is brought to you in part by Pneuma Solutions, who sponsor our transcripts. We’re very grateful for that.

But I’ve got to ask, what the dickens, what the heck is going on at Pneuma Solutions? Have they completely lost the plot?

The reason why I ask you this is because I’ve got to tell you that right now, RIM has gone absolutely free from the 1st to the 18th of July.


Yes. I know you could be heading away to the conventions, or just going away on holiday or whatever, and you might need to remote back into your computer while you’re away. There could be all sorts of reasons why you might want to use RIM over this particular period, and you can do it absolutely free. So this period covers NFB, ACB, and Sight Village in the United Kingdom as well.

You could be quite entrepreneurial with this, you know. I mean, if you come across somebody who’s in need of a little bit of tech support when you’re at one of these conventions, you’d be able to RIM into their machine and assist them, and it won’t cost you or them anything. This is madness, I tell you, and you want to be in on this.

If you don’t have RIM already, this is a great opportunity for you to kick the tires. If you do have RIM installed already, you can use it without worrying about it timing out or anything like that. It is the RIM free-for-all from July 1st through to July 18th.

To find out more and to get RIM, if you don’t have it already, just go to

Remember, it’s available for PC and Mac. And if you’re on one, you can control the other. So a PC user can assist a Mac user and vice-versa. is where you go.


Outstanding technical support from Pulseway

We live in a world where accessibility problems are far more common than they ought to be, so we spend quite a bit of time on this podcast talking about those. But to balance it out a little bit, I wanted to share a really positive tech support experience I had this week.

It started off in a pretty dodgy way.

For years now (I mean, it might be over 10 years, I think), I’ve been using a tool called Pulseway. And when I started using it, it was very simple. You would just install this little client on your PC, and then you’d install an app on your iPhone. And it would allow you to do things like check for Windows updates, check whether the computer was up or not, and restart the computer. If you’ve got a computer that supports Wake on LAN, then you can even power the computer up remotely. It’s a very handy little tool.

I have it installed on the Mushroom FM computer, and that’s what I used it for. And because the Mushroom FM computer is still running Windows 10, every Sunday my time, when we have a live broadcast on Mushroom FM, I give the Mushroom FM computer (which we affectionately call the Mushroom Pot) a quick reboot, and all’s well for another week, usually.

Now, for quite some time, I’ve been getting these emails from Pulseway saying, we want you to upgrade to our new Pulseway Teams plan. I’m on this thing called a Pulseway Pro plan, and I’ve just ignored them, to be honest, because what I’ve got works. They kept saying we don’t really officially support this anymore, and I kept thinking, I don’t care. It does what I need it to do, it still works. Leave me alone.

Well, I went to reboot the Mushroom Pot on Sunday like I do (after my invoice had just been processed for another year. It’s really cheap this thing, by the way. I think it’s like $17 a year, or something), and it wouldn’t work. I logged into the iOS app, and it couldn’t see my system anymore. It looked like the system was still running on the Mushroom Pot side.

I wrote to them and I said, “It’s not working anymore. Help!”

And they said well, we told you, we told you that this wasn’t supported anymore.

And I said, “Alright, you got me. What do I have to do? I need this to be working again because I’m heading off to the convention of the NFB.”

Now, I use RIM most of the time. But sometimes, it was just really handy to pull out my iPhone and reset the machine or whatever, so they served two quite different purposes for me.

It turned out that they had to migrate my license, and it was a little bit complex. And they said, “We’re going to get on a Zoom call, and we’re going to have your account manager and an engineer on the Zoom call.”

And I thought, this is extraordinary! Because I’m paying these guys $17, or whatever it is, a year. I mean, it’s just one device that I’m controlling, and they’re treating me like some sort of major enterprise customer, giving me an account manager and getting an engineer on there. And you never know how these things are going to go, because some people can’t really cope terribly well with the fact that you are a screen reader user.

But I got on this call, and they were absolutely fantastic. There were one or two accessibility challenges with the migration. I was on my main office PC, and we actually used RIM. So I shared the screen with Zoom. And then, we got into the Mushroom Pot via RIM on this machine that I’m on now. And they controlled it, and they did the migration for me. It was a slightly complex thing to get done.

But now it is done, and I was just so impressed with their demeanor, their cheerful way of handling this. The fact that they gave me all of that resourcing, and they’re not charging me any more, either, because I’m a legacy customer and I’ve been with them so long.

They’ve given me this much upgraded thing now where I can do things like get notifications when the machine goes offline, or when a critical update’s waiting to be installed. Any number of things that I need to know about. If someone was to log on or anything like that, I get a push on my iPhone now. So I’m getting a substantially better service than I got before, but they’ve locked in the price. And the service that they gave me when I migrated was just outstanding. I can’t recall the last time I had such a good tech support experience.

Now, there are a couple of things I still need to get to the bottom of. For example, I find the web page of the new instance that I have and the client on Windows easier to control, for some reason, with NVDA than JAWS. I’m not quite sure why that is. I’m sure there’s a reason, and we’ll probably get to the bottom of that when I get a moment to draw breath.

But we’re back up and running now. The iOS app is as accessible as it ever was. And just in case I need to restart the machine while I’m away and I just have my iPhone with me, I can now do it.

So thank you to Pulseway for going to all that trouble for a tiny customer paying them very little. That is super impressive.

Accessibility of X for iOS Badly Broken

Well, as the old saying goes, from the sublime to the ridiculous, because this is the complete opposite of the story I just mentioned. And I’m prioritizing this one because it is important.

It’s come in from Francisco Crespo. He says:

“This morning, I upgraded X”, (that’s the artist formerly known as Twitter) “to version 10.46, and found out that the Musk administration broke the accessibility of the most essential feature, the timeline. So now, when you scroll through the timeline, VoiceOver will only speak the user who sent the tweet, but you’ll have to open the post to read its text. It’s a terrible regression, and we don’t have an accessibility team we can report this to.

Is someone else experiencing this same issue?

Unfortunately, switching to Mastodon is not an option for me. I would lose access to many news sources if I do.

I hope they fix it soon. But I’m thinking, who could I reach at X to report the problem? Hope someone in the audience will have an idea.

As always, thanks for putting the podcast together. It’s a great resource.”

Thank you very much, Francisco!

I can confirm this problem.

I count myself lucky that when I do want to use X, I still have Spring for Twitter, which is still working. Don’t tell Elon, but it is still working. So that’s not affected, and that might be an option for you in the short term.

I see quite a bit of chatter about this around the place, and that an X engineer has responded and apologized, and said that a fix is coming. But I’ve also heard other people say that some of the problems with the app go a bit deeper than even the timeline. Not everybody’s confirming this.

I don’t use the native X app myself. I never have. I think it’s a pretty dodgy user experience, compared to some of the other ones that were out there.

But let’s hope that this gets addressed in a prompt manner, and that people will be up and running as soon as possible.

This is actually a timely conversation, really, because it gives me a chance to press a new button.


Voiceover: X marks the spot where you can follow Living Blindfully. Receive show previews and alerts the moment we publish new audio or a transcript.

Because of the short usernames on X, we’re LiveBlindfully. That’s all one word.

Follow LiveBlindfully on X, and stay in touch.

Beatles Book

Voice message: Hello, Jonathan!

I know you’re a Beatles fan. As am I. And this is why I am interested to know what you think of a book that I read.

The book is called “How the Beatles Destroyed Rock and Roll”, (And if you’re like me, you’re gonna hate the title. I’ll tell you more about that in a minute.), and the author is Elijah Wald. E-L-I-J-A-H W-A-L-D.

I think the title is controversial. I think the author meant for the title to be controversial. I don’t know if that makes it more irritating or what, because I’m irritated by the title.

I think it could have been called How the Beatles Destroyed Rock and Roll and Created Rock. That’s my personal title for the book.

Even so, I really liked the book because there was a lot of information about music from the 1890s (I think that’s when it started), to the 1960s that I didn’t know about. A lot of pre-rock and roll kind of stuff.

So if you want to read it and give your opinion, I’d really be interested. And maybe other listeners would, too.

It is on Bookshare here in the US, and it’s also on Kindle. It’s also on BARD, too.

But I don’t know of a commercial audiobook version, and I think, based on the topic, it might actually work. Or maybe I think that just because I love reading books about music, and I almost want to hear someone narrating about it and be able to imagine it.

Jonathan: Oh well, speaking of imagine, yeah, imagine there’s no critics. It would be a much better world.

I mean, look. I’ve read this book. It’s some time in my life I won’t get back.

I do agree with you that it does have some interesting facts about music dating way back. And it’s trying to trace the origins of rock and roll. And then, essentially saying that a bunch of white kids hijacked rock and roll and turned it into something else.

And I mean, that’s true. You can’t really fault that. Music does evolve. Everything evolves.

But I think there’s an element of hypocrisy on the part of the author in the sense that he uses the Beatles in the name of the book to sell lots of books. Why? Because the Beatles are popular. And if you can get the Beatles into your title, you’re guaranteed to sell to the Beatle crowd.

And then, it seems like he spends a lot of time telling us how wrong we were for liking the Beatles so much. You get a certain kind of literary critic and music critic that kind of thumbs their nose at the masses for liking what they like.

But to some degree, John Lennon would probably agree with this guy, in the sense that John Lennon felt that the Beatles had lost track of their roots. He had different views on this at different times. So he felt like the Beatles had sold out to get signed by EMI and to get Brian Epstein to manage them. He later seemed to express regret about the Sgt. Pepper era, which, at some points in his life, he thought was a bit pretentious.

And that was one of the reasons why the Get Back project was conceived. He wanted to get rid of all the studio wizardry and get back to good old rock and roll roots.

But the Beatles were like sponges. They were soaking up so many influences and bringing them into their music. And when you consider the way that these kids’ music evolved so quickly when they were in their 20s, it’s just extraordinary.

So sure, maybe the Beatles did, as you say, destroy rock and roll, and they created something else.

The thing about the Beatles is, and Billy Joel made this point in a fantastic lecture I heard from him, how do you define the Beatles? How would you pigeonhole them in today’s niche radio market? These are the people that did Yesterday and Helter Skelter. They’re the people that did Here, There, and Everywhere and Come Together. They were incredibly gifted musicians just soaking up so many influences.

As for me, I am focusing on the fact that if you’re in the United States, you’d have just passed the anniversary of the release of the A Hard Day’s Night album, which was actually released first in the United States on the 26th of June, 1964. It had a different track listing, of course, from the official Parlophone release, and it contained some George Martin instrumental work from the movie.

The A Hard Day’s Night Parlophone album was released on the 10th of July, and that’s the one that I tend to celebrate because I consider the Parlophone releases the official releases, even though I know they’re not the releases Americans grew up with.

So on the 10th of July, on Mushroom FM, I will be going through A Hard Day’s Night. We’ll hear from the Beatles themselves, who will be talking about the album. We’ll play all the tracks from the album. I’ll give you some backstories about the tracks, and we’ll also play some other tracks that were recorded at around the same time but did not feature on the album because there was another EP that came out about that time.

So if you are a Beatles fan, you might want to catch that A Hard Day’s Night special at 2 AM Eastern, and then again at 2 PM Eastern, on the 10th of July.

If you want to find out when that is in your timezone, head over to, and the schedule will be displayed in your own timezone.

The Lenovo ThinkBios Utility

Caller: Hello, Jonathan! This is Larry Reba.

I am taking classes to become CompTIA IT certified. I’m taking my CompTIA A+ classes, and one of the things we had to learn in the class was about the BIOS/UEFI screen, which in most computers is not accessible to a totally blind person.

Now, I heard you say that the Lenovo ThinkPad is accessible for the BIOS, you see.

So I was wondering. Could you, at some future point, give a demonstration of the computer’s accessibility on that screen? I personally would like to hear it and see how it goes, but I’m sure there are other totally blind techies who may or may not know that this menu actually does exist.

Jonathan: Alright, Larry. Why not? Why not?

And to make sure we try and take as many people on this journey as possible, let me start with some basic information about what we’re doing here.

First, what does BIOS stand for? It stands for Basic Input Output System. It’s a small piece of software stored on a chip inside your computer’s motherboard. When you turn on your computer, the BIOS is the first software that runs. Its main job is to start up the computer and get it ready to use. It checks all the components like the keyboard, mouse, and your hard drive to make sure that they’re working properly. And then, it tells the main operating system, like Windows, to start up.

So because the BIOS is running before Windows is running, your screen reader isn’t running at this point, and that’s what makes it inaccessible to a blind person.

When a user enters the BIOS setup, they can change various settings to control how your computer works. Let’s go through a few of the things that you can change if you get into a BIOS utility.

Probably the big one is the boot order. This decides which device the computer is going to look at first to find the operating system. You might choose to boot from a hard drive, which is usually the case. But you could change it to a USB stick, or a CD or DVD – some sort of removable media. This can be quite useful for trialing alternative operating systems.

This is also a way to set the computer’s date and time at a deeper level.

You can enable or disable certain pieces of hardware, like USB ports, or built-in Wi-Fi.

Particularly on laptops, many computers allow you to toggle whether the function keys perform their standard Windows functions, which is generally what those of us who use screen readers want, or whether they perform functions relating to the system like increasing the brightness, changing the volume, or toggling Wi-Fi on and off. On some computers, you can toggle the behavior of the function keys with a special key when you’re using Windows, but not all computers allow this. And if that’s the case for the one you have, you may only be able to toggle the function keys for optimal use by a screen reader user by changing a setting in the BIOS configuration utility.

It’s a powerful thing, this BIOS utility. So you can, if you want to, set a password to prevent unauthorized people from changing your BIOS settings.

You can also adjust settings that affect the speed of your computer like the speed of your processor, and also memory settings.

While configuring the BIOS can be useful, it’s important to be really careful in here. If you change the boot order incorrectly, for example, your computer might not even start up properly. I mean, if your computer’s looking for a USB drive and that USB drive isn’t there, then your computer’s not going to boot.

Disabling important hardware accidentally can also cause parts of your computer not to work at all. For example, turning off USB ports means that you can’t use USB devices.

Changing CPU or memory settings can improve performance. But if you set them incorrectly, you can cause your computer to become unstable, or even not start at all.

Setting a BIOS password is a great idea for security. But if you forget it, you might have trouble getting back into the BIOS to make changes.

Historically, it’s been necessary for a blind person to ask for sighted assistance to make BIOS changes. You may be able to get it done by yourself using tools like Seeing AI, but it’s a really cumbersome process.

Lenovo offers an accessible way for a blind person to configure BIOS on ThinkPad devices, thanks to the ThinkBIOS Utility. Now, this isn’t quite as good as the access sighted people have because if you make a change that locks you out of Windows, it also locks you out of the ThinkBios utility, because it’s a Windows application. But it’s certainly better to have access to configuration in most circumstances on your ThinkPad device, compared to no circumstances at all, which is the case for most computers.

To install this, use your search engine of choice and find ThinkBios Configurator. You can download that and go through the process.

There’s also a user guide, should you need that.

I’m running it now. When you run it, it takes maybe a few seconds to load the settings from BIOS into the utility. And then you’ve got some material at the top, and a very handy accessible table with 50 rows in it, and we’ll have a look at that in just a moment.

But I’m at the top of the screen now.

JAWS: ThinkBios configurator. ThinkBios config tool v1.35, Lenovo commercial deployment readiness team.



Select a previously created file of settings.

Browse, … button.

Jonathan: The fact that this is accessible is a very nice byproduct of what this tool is actually for. And that is that because ThinkPads are so reliable and good business workhorse machines, you will find ThinkPads deployed in many businesses. And this gives the IT department of a business the opportunity to configure all BIOSes on ThinkPad devices the same way. So you can get it the way you like it, save it, and then upload those settings to other machines. It’s a very straightforward process.

JAWS: Apply config file button, unavailable.

Jonathan: So once you’ve chosen a file to upload to the system, you can apply the file, and all the changes will be applied to the BIOS at once.

JAWS: Create a .ini file containing the current settings of the target machine in the working directory.

Export settings, button.

Jonathan: You can export your own settings. And it might be a good idea to do that, if you really do want to play in here. This can be a dangerous place, kind of like the registry editor. And if you inadvertently make a change, it could be dodgy, as I tried to indicate in the introduction. So at least if you’ve got a .ini file containing a good set of settings, you may be able to get back to a good place, if you can get Windows to run, of course, and get in here.

JAWS: Security actions.

Supervisor password set on the target machine checkbox, not checked.

Use different credentials to connect the target machine checkbox, not checked.


Target local button, unavailable.

Target remote, button.

Accessing settings on LocalHost X1 Carbon Gen 9.

Save changed settings, button.

Jonathan: Once we start exploring this big table of options, if you make a change, you will need to save it. And then, the utility will upload the change to the BIOS of your ThinkPad.

JAWS: Restore BIOS defaults, button.

Jonathan: This is a handy button to know. If you really just want to get things back to their default behavior, this is how you do that.

JAWS: Reset to current settings, button.

Table with 4 columns and 50 rows.

Jonathan: Now, you do have this big table here, and this is one way to navigate. You can investigate settings this way.

It also works surprisingly well just by tabbing through. So if I go back to the top, and I’ll just tab, …

JAWS: File upload, edit.

Export settings, button.

Supervisor password…

Use different credentials…

Target remote, button.

Save changes…

Restore BIOS default…

Reset to current settings…

Table column 2, row 2.

Wake on LAN, combo box, enabled.

Jonathan: Now, I don’t have any auto forms mode malarkey going on on JAWS. So if I just turn forms mode on, …

JAWS: Wake on LAN, combo box, enable.

Jonathan: And it’s a standard combo box. So if I want to disable Wake On LAN, I just up arrow, …

JAWS: Disable.

Jonathan: and it’s now disabled. Or it would be, if I saved this.

JAWS: Enable.

Jonathan: I’ll press tab.

JAWS: IP V6 network stack, combo box, enable. 2 of 2.

Jonathan: So this is amazingly accessible.

JAWS: UEFI boot priority, combo box, IP V4 first. 2 of 2.

Mac address pass through, combo box, disable. 1 of 2.

Always on USB, combo box, enable. 2 of 2.

Track point, combo box, automatic. 2 of 2.

TouchPad, combo box, automatic. 2 of 2.

Jonathan: If you want to, you can disable the TouchPad. But you can also do that from the Windows TouchPad utility.

I personally find myself accidentally bumping the TouchPad sometimes, and getting somewhere where I don’t need to be. So I disable it from within Windows. And then if I’m getting someone to assist me, or someone wants to use the computer, they will whine at me about the fact that the TouchPad’s disabled. And I go into Windows and re-enable it again. I don’t think I’d want to disable this at the BIOS level.

JAWS: FN control key swap, combo box, enable. 2 of 2.

Jonathan: They made a change, I think it was with the 10th generation ThinkPad. But before that, ThinkPads had the FN key on the very far left, where most Windows users expect to find the control key, and the control key would be to the right of the FN key. It used to trip a lot of people up, and users would complain about it a lot, so they provided a BIOS setting to swap them over, which I certainly used. Because while I don’t think I changed the default when I last used a ThinkPad, (I’m not sure if you could back then) I’ve used so many other laptops where the control key’s at the far left. It was tripping me up all the time. So it was great to be able to get in here and swap that over.

Now, ThinkPad have succumbed to the will of the people, and the control key’s where you expect it by default, in current models.

JAWS: FN sticky, combo box, disable. 1 of 2.

FN key as primary, combo box, enable. 2 of 2.

Boot display device, combo box, LCD. 1 of 2.

Total graphics memory, combo box, 256 mb. 1 of 2.

Boot time extension, combo box, disable. 1 of 6.

Speed step, combo box, enable. 2 of 2.

Jonathan: Now, there are a lot of settings here, and I’m not going to explain them all, or go through in great detail. But what I’m proving here is that this is a 100% accessible way to configure your BIOS, as long as you can get into Windows. So be aware of that caveat.

But the fact that this utility does exist and it’s so accessible is a big feather in the cap of Lenovo because it’s better than nothing, which is what we’ve historically had.

So that’s a quick overview of the ThinkBIOS Utility for ThinkPads from Lenovo.

Advertisement: Living Blindfully is brought to you in part by TurtleBack. For over 2 decades, TurtleBack have been making leather cases for the products we use every day.

Many of us love the Orbit Writer because it’s a handy little device that allows you to input Braille on the go on a range of devices, and TurtleBack have a leather fitted case for it. It features a protective magnet-secure cover, a zipper compartment, and a non-skid base. Because the case is fitted, all ports and buttons are fully accessible. But the device that you purchased is protected.

There are plenty of other products at their store as well. So why not check them out? That’s

And when you’re there and you make a purchase, if you use the code LB12 (that’s LB for Living Blindfully, and the number 12), you get 12% off all your purchases at check out.

If you prefer to make a phone call, they’re standing by to receive that call at 855-915-0005. That’s 855-915-0005.

And remember that coupon code, LB12 on the website at

Dr. Nicholas Giudice Discusses How We Ensure Blind People Can Use Autonomous Vehicles, and Robot Guide Dogs

When was the last time you were inconvenienced by your Uber or taxi not turning up on time, or your guide dog got refused, or your friend who kindly offered to drive you somewhere forgot about the commitment that they made?

I want my self-driving car, and I want it yesterday.

Ready or not, the world is changing fast. Autonomous vehicles have the potential to introduce some of the most impactful change for blind people. So how will it work? What’s the current state of the industry? And how do we influence this new technology to ensure that it’s useful to, and usable by, blind people?

Dr. Nicholas Giudice of the University of Maine has been giving considerable thought to these questions. And he’s blind himself, and I think that’s relevant to the discussion. Nick, it’s a real pleasure to talk with you. Thanks for coming on the show.

Nick: Yeah. Thanks for having me.

Jonathan: You’re the director of the VEMI Lab. Can you give me a bit of an overview of the kind of work that you’re doing in your role?

Nick: Yeah. So I started the lab, though I don’t actually direct it now.

I’m just focusing on the research side of things. My role here is thinking about topics of interest to me and our group (we have a large group here), overseeing lots of different projects, looking at ways to fund those projects, as is always the case for an academic, and trying to talk with people.

There’s this term, siloed labs, in academia. You hear it all the time, but it’s all too often the case. You get someone studying something. It may be interesting. And they write their paper, and give their conference talk. And then, they’re done.

And I really believe that, you know, if you’re going to make a difference in what you’re doing, how do we get it out to people? – people that can actually use it. How do we get input from those people to make it work better?

It’s this idea of, I have some ideas. I certainly don’t have all the best ideas. I get them from the participants that come in and say hey, Nick, that’s kind of a cool idea, but you’re on the wrong path. Why don’t you think about this, or why don’t you do that? And so I think that’s really important for designing stuff that actually works.

And so that’s another big part of what I’m doing, and why I’m excited to talk with you and your community today. Hopefully, we’ll have questions, and can give some feedback as well.

Jonathan: Having heard a bit about your work and read some of it, I know that you’re passionate about the whole concept of “Nothing about us without us”, which has kind of been the catch cry of the disability sector for some time.

Jacobus tenBroek, who was the first president of the National Federation of the Blind, had a great line. He said, “My road to hell is paved with your good intentions.” [laughs]

And having been in the product management business for quite some time, I have seen all sorts of people come to my door with products that they thought were the next big thing for blind people, but they hadn’t actually thought to ask an actual blind person about the viability of those products. This is a real issue, isn’t it? – actually getting us involved in that conceptual phase of a piece of work.

Nick: You know, this idea of lived experience, I think it’s kind of a cliche or catch word that initially, I was like you know, you hear this and it’s important. Until I really started doing research and realizing how many people, as you said, may have great ideas and great intentions, but don’t have any idea of… like, “Oh, I saw a blind guy have trouble crossing the street, so I came up with a solution.”

I’m like, “Well, have you talked with the person?”

“Well, no. I saw they had a problem, and so I solved it for them.”

And I realized just how often that happens.

I call it the engineering trap. Like it’s not driven maliciously. It’s mostly driven by, you know, people have some idea. They think that they have a solution, but they don’t actually find out about user needs. And they get stuck in their own intuitions, which are often uninformed.

And at the end of the day, it’s the end users, the people that really know what the issues are, what works, and what doesn’t work.

And so I think if you don’t bring that into a team, whether it’s having someone that’s on the team that’s blind, or you talk with blind folks and get that input, you just aren’t going to end up making stuff that people actually want to use.

And as you know, we have various mobility devices and electronic travel aids. And a lot of what I do is in the navigation space, broadly defined.

And these things have been around since the 1960s, but the majority of them haven’t been accepted. They’ve just mostly failed because they’re not actually solving real problems that people want solved. I guess I’ll put it that way. And I think that’s something that people are beginning to realize, but it’s a slow process.

Jonathan: Yeah. I’ve come to the view over time that there is actually a culture about the way that the blind community likes to be interacted with. And in the same way that a company like Apple, for example, spends a lot of time thinking about how do you engage with a distinct geographical market like China, and they will behave differently in that market, I think behaviour needs to be different when you’re catering to the blind community. Because we are very sensitive about having things done to us, for good reason.

It happens far too often. It seems to me that a lot of the successful products embrace blindness culture. They’ll turn up to the blindness conventions. They’ll be very accessible in terms of how they engage with people. And they’re the products that tend to work.

Nick: Yeah, I would agree with that.

Although with the caveat that I think that there are, you know, …

98% of blind people have very similar problems around, let’s say, navigation. The challenges are the same. How do we figure out where something is? You want to make sure you’re crossing a street safely.

But sometimes, the small percent of differences makes it hard for people to know how to make that connection. And I guess what I mean is, I think of blindness as not being about vision loss. Most of the challenges, it’s really about information loss and how to get that information.

Jonathan: Yes.

Nick: And I find people have very different views on it.

For me, I’m willing to pay for information, for instance, if it’s going to help me do what I want to do.

Some people really object to that. They say well, I shouldn’t have to pay for this. A sighted person is not paying for that information, so I don’t want to have to pay for it.

So I think sometimes, these ideologies get in the way, especially with technology design. And I think then, people get stuck in this rut of saying well, I’m on this side of the equation and you’re on that side.

And what I’m interested in is how do we kind of bridge that to say okay, we’re still dealing with an issue that most of us have. And maybe, there’s solutions.

There’s not a one-size-fits-all solution for anything. But for me, in a lot of cases, I’m willing to put resources out to get information, if I think it’s going to get me what I want, or something that I’m having a problem with.

Jonathan: When you talk about autonomous technology, what do you mean by that?

Nick: It’s a broad topic. I mean, this whole idea of intelligent systems and autonomous technology, it can be, you know, you can have an autonomous vacuum, you can have autonomous machines, robots that go around our campus now delivering food.

I’m interested in it in terms of autonomous transportation, particularly autonomous vehicles. Mostly, it’s because it’s such a pain in the butt to get around when you don’t see, especially when you live in an area that doesn’t have a ton of public transportation. And I’ve always liked driving. I’d always enjoy driving my friends’ cars in a parking lot, and they’d be calling out left, right, straight, until we just missed something.

But the independence and agency that driving provides most sighted people, I think that they most often take for granted, just because it’s something that they just do.

And when you’re relying on buses, or trains, or friends, or Ubers or whatever to get around, often, there’s challenges involved with that.

The level of independence that an autonomous vehicle would give me – something that would let me drive when I want, where I want, how I want – has always excited me. So when this started coming into popular culture, where the auto industry is now putting billions and billions of dollars into this, and we started realizing this is happening (although slower than Elon would have told us), it made me realize that the driving-limited populations – people who currently either can’t drive or have challenges driving, and older adults – are some of the people that are going to get the most benefit from these new types of vehicles.

But they also have some of the biggest challenges. If you look at a Tesla or most cars with any type of automation now, there’s a big touchscreen, and there’s no real clear access to that information. And the models of what these things are going to be are completely touchscreen-based vehicles, no human driver, and they’re going to be ride-shared. I mean, that’s what pretty much all manufacturers are saying.

So now, you’re going to have an Uber that you’re going to need to go out and find, but there’s no human to talk to to figure out where it is. There’s no way to communicate with it because it’s all a visual-based cockpit.

And so this has really driven my interest in saying well, here’s the potential for something that’s really amazing and could really change the landscape of transportation and independence, but it’s being developed in a way that’s not going to work for a lot of people. And so that’s really driven a lot of my interest in saying well, how do we build trust, what we call human-vehicle collaboration? How do you get the humans and the vehicles to work together? And how do you do this in a way that’s inclusive and accessible?

Because so many of the people that are going to benefit the most are the ones that need to be thought about, and aren’t being thought about in current designs.

Jonathan: So it’s profoundly concerning, isn’t it? Because I think people have been holding out hope that eventually, we will get to the self-driving vehicle nirvana and a lot of our problems will be over. But it sounds like there is the potential, without intervention, for us to end up in a worse situation because of the way that these devices are being designed.

Nick: Yeah, I think that’s exactly it. I mean, I think there’s a huge amount of interest. But if you ask 10 random people you stop on the street, 8 of them are going to say no, I’m not going to get into these things. I don’t trust them.

The majority of blind folks that I’ve talked to, and older adults as well are like, yes, I’m really excited about it, you know, this is going to make a huge difference.

But absolutely, if the kind of trajectory of how these are being designed now doesn’t change, then there’s going to be huge problems.

I think one of the benefits, though, is that auto manufacturers are starting to realize oh, we have all these potential users, passengers that we never used to consider before. We didn’t think about blind people as customers. But now, they will be, and we have to start thinking about them, especially if these are going to be ride-shared situations.

We’ve interacted with a couple of companies. I’m heartened that they are realizing that this is an issue, and realize that something needs to happen. They don’t know what, they don’t know how, but they’re at least saying you know what, we see this as a problem.

Jonathan: And we’re in this period of transition, it seems, where even fully autonomous self-driving vehicles (and perhaps it would be helpful for us to talk about the 5 levels that are defined there in a second) seem, in some cases, to require someone to have a driver’s license, even though they don’t have to control the vehicle. There’s kind of a “we need human control in case of emergency” override.

Nick: Yes, [laughs] that’s a problem. So if we talk about a fully autonomous vehicle, there’s kind of 2 levels – level 4 and level 5. These are levels that describe how autonomy can vary, from no autonomy up to level 5, which is fully autonomous with no ability to manually operate the vehicle. So there’s no steering wheel, there’s no pedals. So you can’t drive that vehicle.

But currently, at least in the US (and this is true in most places), there’s kind of this patchwork of policies and licensure requirements out there, and as you said, a lot of them require a license, even though it’s kind of dumb, because even if there was a problem, you couldn’t drive the vehicle. There’s no steering wheel. And so it doesn’t really make any sense.

But the technology is way ahead of the policy, and the policy is not at all integrated across different states. And so you have this whole weird patchwork.

It seems like there needs to be something like an operator’s license, which is what we’ve kind of argued for, where you maybe have something about age or other factors that verify who you are. But it has nothing to do with whether you have a current driver’s license, or whether you can drive, because that’s irrelevant to these future cars.

Jonathan: Self-driving vehicles are taking a lot longer to roll out than I had anticipated. Is the problem there fundamentally societal, or technological? I just get the feeling that there are a lot of human beings who are very reluctant to surrender control of their vehicles, and that’s really what’s holding things up.

Nick: Yeah. You know, it’s the $64 million question. There’s a lot of reasons, and I don’t know.

I would say that everyone thought the technology was better than it is. That’s part of it.

Part of it is that cars are being tested in places that often have clean environments, for lack of a better way of thinking about it. So in Maine, where we are, it can be snowy a lot of the year. The lines get rubbed off the road by snow plows. These are things that cars would really need to use. And so they’re not being tested in these types of environments.

But I think it is that trust issue.

A big part of it is that auto manufacturers want these things to happen because they will reduce accidents to near 0. And there’s obviously going to be some issues during this transition through semi-autonomous, which we can talk about, which you hear about in the news.

Jonathan: Mmm-hmm.

Nick: But it’s an issue I talked about earlier. Especially in the States, driving is like this fundamental right that people think they have. And they don’t want someone driving them without them having any control. Like, you may have to actually go the speed limit, or follow traffic laws. And those aren’t things that people want to be told they have to do.

It’s like the classic trolley problem, if you’ve taken an ethics class. Like, how will the car decide whether it’s going to hit a kid that runs out in front of you, or swerve into a brick wall and kill you? And people don’t want to get into a car when they don’t know how it’s going to deal with that situation.

So I think there’s a lot of kind of this black box that happens when you automate something.

I think one of the things that would be really helpful is to try to visualize that black box. So how do you get these vehicles to show the decision space (and I want to do this in an accessible way)? So when the car is going to do something – it’s going to swerve – it explains why. If the sensors are picking something up and it starts slowing down, it tells the passengers why.

I think right now, that’s all just kind of happening. And I think until that’s done better, people aren’t going to trust them.

Jonathan: I mean, that classic trolley problem is such a difficult thing, isn’t it? Because this is a computer we’re talking about, and it will make a rational decision that it’s programmed to make. It won’t be a gut instinct thing. It will make a judgment call about whether you are more worthy of saving than the person who’s sort of jumped out at you from nowhere, and that is an incredibly complex philosophical issue.

Nick: And it’s going to need to be implemented in code. I mean, I can just imagine the lawyers lining up to think about how this is going to fall out when something goes wrong.

Jonathan: So how safe is this technology at the moment, in your assessment? If we switched to self-driving vehicles at the level that they are right now, would the world be a safer place? Would there definitely be fewer accidents?

Nick: No. I’m saying this as my opinion.

But the laboratory for autonomous vehicles is the road. Normally, when we run an experiment, we do something in the lab, we develop it, we iterate on it, we get input. And when we ultimately put it out there, it’s pretty well-developed and good. And you’ve done your error checking and usability, hopefully, and all these other things.

But that’s happening on the road. The big data that’s being collected is feeding all these machine-learning algorithms. They’re getting data as they drive, and that’s refining and optimizing the system.

But it’s happening as a process that doesn’t take into consideration so many factors. Like a car in San Francisco recently just stopped and wouldn’t go because it got confused by steam, for instance. It didn’t know what to do with steam. That just wasn’t something that was part of the training data. So there’s going to be so many things like that if you think about the real world.

And then there’s the issue of autonomous cars being driven alongside manually driven cars. Humans are the most error-prone mammals you can think of, and we operate vehicles and do things that aren’t rational.

I think what will be needed is that all cars, even if they are manually driven, will need to have some baseline sensor suite so the cars are able to talk to each other. I think that will reduce problems.

But the path to full autonomy through semi-autonomous, I think is certainly going to have some hiccups and have some dangerous situations. No doubt.

Jonathan: Yeah. And then, there are some cybersecurity questions, I guess.

I read a sci-fi novel that a listener recommended to me actually, 2 or 3 years ago. And the whole hypothesis was somebody with nefarious intent hijacked a bunch of self-driving vehicles. So cybersecurity, I guess, must be a consideration there as well.

Nick: I think I read that same one. Yeah.

Jonathan: [laughs]

Nick: I mean, it’s scary, you know. It’s easy to spin out different ways of how this would be dealt with, and absolutely going to be something that needs to be thought of more, especially as we start getting these things out on scale.

Jonathan: So I have heard from a couple of blind people in fairly recent times who have been in San Francisco, and taken a self-driving vehicle. I think they may be the Waymo vehicles, but I could be wrong about that. And it sounds like there is some work going on, at least with Waymo and some of these companies, in terms of thinking about what the user experience for a blind person would be like.

And in your work, you’ve essentially dissected this journey into different component parts about what is required for blind people to successfully use these autonomous vehicles.

Nick: Yeah. I mean, I think this is true. And I’m sure, it’s probably resonating with what you found, too.

I mean, with information access technology, assistive technology, often there’s some problem that it addresses, but it only addresses part of the problem. And in the navigation domain, that’s so often the case.

We get systems that people want to help with some aspect of outdoor navigation, some aspect of indoor navigation, some aspect of transportation access.

But a successful trip is only successful if you can start somewhere and get somewhere on the whole trip.

And what’s happening with autonomous vehicles – and there’s a little bit of work going on with the human side of that, and even less on the accessibility part – is that it’s often looking at okay, how do we deal with in-cabin access? Like I was talking about before, how do we make controlling the vehicle and interacting with the AI accessible?

But that’s only one part of it. How do you get to the car if it’s going to be ride-shared, like we talk about? How do you find it?

When it drops you off, how do you get information about where you are? You’re going to be sitting in different orientations. There’s no reason to sit facing forward anymore, and so a lot of the prototypes have people in these kind of C shapes or different types of seat orientations. So if you have natural language saying oh, your destination is on your left, what does that mean? If I’m sitting in one orientation and you’re in another orientation, there needs to be some situational awareness of where you are to give that information correctly.

And so I’m really interested in looking at this whole idea of the complete trip: getting to the vehicle, being able to interact with the vehicle, understanding not only the seating but the controls, and then being able to get out and get to where you want to go in a way that’s seamless and that works – not some hacked-together thing where it’s like oh yeah, we made the vehicle work, but we haven’t even thought about what to do when we dump you off at the end, so good luck.

Jonathan: [laughs] So let’s say then that I’ve called an Uber, and I get the notification on my iPhone that says your vehicle is now arriving. How do I find it?

Nick: So there’s a bunch of different possibilities. One of the ones that we’re looking at is combining a couple of different technologies.

So we’ve done a lot of work here in the lab on combining navigation information that you normally get through natural language (go straight – kind of turn-by-turn route navigation) with what we call vibro-audio maps. This is using your phone. And when you’re touching the map on the phone, the phone vibrates. But you’re touching it with one finger, so it really feels like you’re touching a line, even though it’s just a flat touchscreen. And so you can trace that line, and then augment that information, which is spatial, so it’s similar to what you would be seeing if you could see it, or if you had an embossed map.

But it’s real-time, and it’s multisensory. So you can then augment that with semantic information – names of things, or directions in different language.

And so in this autonomous vehicle assistant (AVA) app that we developed, there’s a vibro-audio map that’s kind of showing you the path to where you need to go. There’s language. And then, it’s using computer vision. And this is all at a very early stage, but computer vision, as you’re walking, is also trying to give you information about objects that it’s detecting that might complement what your cane or your dog might tell you about. We did it with overhanging branches, cones, and other obstacles, because you don’t know what the car is going to pull up behind.

One of the things that people kept saying to us is well, it may show up. But how do we know it’s not behind a set of low-hanging branches, or one of these wires that comes down at an angle, which I’ve been knocked out by before?

And then, we’re also using UWB (ultra-wideband) sensors. These are radio frequency sensors, kind of like Bluetooth beacons, but they’re directional. And so that’s giving you proximity information. So as you get closer, you can move your phone around and you can hear when you’re pointing at it, you know. It kind of calls out to you. And as you get closer, it gets louder.

And so we’re trying to play with a bunch of different types of cues – spatialized audio, vibration, visual for people that have either low vision or can see – and then combining that with several types of sensors. And I think really, that’s where we’re going to see a solution here: kind of trying to play with as many types of sensory information as possible.

That’s kind of my mantra of don’t just make assistive technology work with one non-visual modality. Oh, you can’t see? Fine, we’ll give you language. That’s great, but the brain actually uses all types of sensory information, so why don’t we make devices that do that? And then, let’s use as many sensors as possible to try to be redundant and solve these really difficult problems of where you are, and the error that’s inherent in GPS, these types of issues.

Jonathan: So in today’s world, I would imagine that a Pro iPhone would be of benefit because that’s got the LiDAR sensor, and the other ones do not.

Nick: Yeah, and LiDAR is really good for distance. We’ve had some challenges, and others are kind of using it effectively. Getting to the code level can sometimes be very difficult. So you can access the sensors, but you can’t control them as much as you can on other types of devices. It’s that whole closed versus open ecosystem thing. But yeah, ultimately, that’s going to be, I think, very helpful because you get distance information much more easily.

Jonathan: What’s a bit troubling for me about some of this is that the majority of people go blind later in life. Most blind people are over the age of, say, 80, and they have legitimate cause to get into an Uber – to get to medical appointments, or just to socialise with friends, or whatever it might be. And they might not want the learning curve of having to understand VoiceOver and other technologies just to be able to get from A to B. What do we do about that large group of people?

Nick: It’s a really important question, you know. And the stock answer is oh well, older people are starting to use smartphones way more. True, but it doesn’t really address what you’re saying.

I think that there’s going to need to be, for these groups of people, simplified devices that can allow… What we really need to have is 2-way communication between the user and the vehicle, however that happens. Maybe it’s a push button using Wi-Fi, or Bluetooth, or NFC, so that when the person comes out, they have a very simple interface with a button, and when they push it, the car honks, or calls out, or blinks its lights.

But I think it’s a bigger concern than that, because these things are going to be so controlled by an app, especially if they’re ride-shared. And so what’s that going to be like? Maybe there’s a simplified app.

But if someone really is uncomfortable using this at all, I think there’s going to be some real issues that are not being addressed. It’s really easy to throw out a technological solution, even if it’s not solved yet, when you know kind of where it’s going. But then you look at the situation that you bring up, and everyone just kind of shrugs their shoulders. What do we do?

Jonathan: You may be aware of this device called Glide, which a lot of people are talking about, if you’ve come across it. It promises to have all sorts of sensors and other technology in it. And it’s a mobility aid that would essentially replace the white cane, and is targeted at older people. Maybe something like Glide would be an answer here.

Nick: So yeah, I talked with Amos Miller, the guy behind Glide, about this. And that’s exactly what I was thinking, Jonathan – like, helping to get to the car. If there was a way to have something like the Glide device interact with the onboard computer in the vehicle, then it could help just kind of guide the person there.

I was skeptical, and I told him, when I heard about it.

I did try it at CSUN, and we were going through the exhibit hall, which is just a huge cluster. And it was surprisingly good.

I think it didn’t have a lot of sensors. There were a lot of things that I found to be… It’s early on, and the UI could use a lot more work.

But I find that to be really promising. And to be able to couple that with other sensors and other interfaces, like if he could take the Soundscape stuff or something like that using spatialized audio, … I mean, spatialized audio is something that’s been around forever.

The guy that I did my post-doc with, Jack Loomis, really did the early research on this with blind folks in the 80s and 90s, but it was never commercialized. And then, Soundscape made that happen. And so I think it’s people like that who are taking interesting ideas and trying to put them into tech in a navigation context.

I think the question now is just how do we marry those technologies to make them work better?

Jonathan: And it’s even something as simple as getting in and out of the vehicle.

I must confess. Every so often, I get a Tesla driving me when I request an Uber, and I have the devil of a job finding the door handle on those things. [laughs]

Nick: The damn door handle, yeah. [laughs]

Jonathan: I still can’t find them very easily.

So I mean, it’s just something as basic as that. Once you’ve found the vehicle, you’ve got to be able to get in and out of the vehicle.

And then, of course, when you’re all settled in and you’ve fastened your safety belt, you’ve got to be able to have confidence that you are going where you think you’re going.

Nick: Yeah. I mean, I’m glad you brought it up, because we spent a lot of time having that computer vision help identify where the door handle is. A number of blind people that we talked with mentioned this.

And others said well, that seems silly. You can obviously find the door handle. But it’s not always easy.

But also, it doesn’t feel dignified in a lot of ways. I don’t know another way to put it. You get there, you start kind of feeling around all over the car, like where’s the damn handle? I want to get there, and get in, and have it just be a seamless process, and not have that be a concern.

So I think little things like that are what can make a big difference.

I mean, but the trust of knowing that it’s doing what you want, and if it’s not, how to interact with it? That goes beyond an accessibility issue. It gets to what I call the HI-AI connection. The human intelligence needs to connect with the artificial intelligence in a much more consequential way than with my Alexa, where if something goes wrong when I’m talking to it, you know, I don’t get the right ingredients. But when you’re in a car, this can be life or death, or put you in a totally wrong place. So how do we make these types of interactions not only accessible, but also usable in a way that makes sense?

And that hasn’t worked out yet. You get a lot of AI people that know nothing about human intelligence. And you know, people that study human intelligence often don’t know about AI.

What we need is that bridge. That’s true for all these autonomous and kind of intelligent technologies.

Jonathan: And I suppose, there will be varying degrees of interest that a blind person has, in terms of knowing about their surroundings as they’re being driven by the vehicle, and it may vary from journey to journey. So there will be times when I just want to curl up in the back seat with my ThinkPad and get a bunch of work done when I’m on a time crunch, and there’ll be other times, perhaps when I’m at a different destination on holiday or something, where I want quite descriptive scenes of what we’re passing and what roads we’re taking. So that’s another factor to consider.

Nick: Yeah, absolutely. I mean, I can’t tell you how often I’m in a car asking people, what are we going by? What are we going by? What are we going by?

Jonathan: Yeah.

Nick: And my partner, sometimes, she’s just like, I need to drive. [laughs]

Jonathan: Yeah. [laughs]

Nick: And then, other times, I don’t want to know anything. I’m like, would this thing stop telling me stuff?

How to get that right is a challenge, and it’s going to be really individualistic. I mean, it’s not a one-size-fits-all. And I think, you know, you can do focus groups, and you can talk to people and they can say, I want to know this, I want to know that. But you’re going to always want to know something different than me. That’s just what makes us humans.

Jonathan: Yeah.

Do you imagine a voice UI?

I mean, I used to be quite skeptical of voice UIs like Alexa, and Siri, and those kinds of things.

But now that I see what ChatGPT is able to do, it’s becoming increasingly sophisticated. You may well be able to have a conversation and say hey car, what are we passing? Tell me about this destination, or whatever.

Nick: Yeah. I really do think that that’s something that gets around a lot of different abilities, disabilities, …

I mean, there’s dialects and language issues that could happen.

But I really do think that that is going to be a big part of what happens. And the underlying large language models and whatever the AI that’s being used is going to obviously make a huge difference. But I don’t think there’s any way around that that is going to be a big part of how we move forward.

I mean, sometimes, I’m amazed by these things that will describe a scene. And I’m like oh, that’s actually really impressive.

Jonathan: Yeah.

Nick: And then, another time, it’ll do something, and you’re like, impressively dumb. Why did it do that? We have no idea.

Jonathan: [laughs]

At the risk of oversharing, now that we’re talking about being in the vehicle, I have this recurring dream. It’s not like every night, but it’s every so often that I’m in a vehicle, and I’m with a friend who is driving that vehicle. And suddenly, I realize, oh my God! This friend of mine is blind, and they’re driving the vehicle. And I wake up in a cold sweat. I think, I’m in a vehicle with a blind guy, and it wakes me up.

I don’t know what that dream is telling me, but I do wonder what my first experience in an autonomous vehicle would be like. I have not been in one yet, not a fully autonomous vehicle. But is there some sort of human adjustment required, particularly maybe for a blind person who realizes there is not a human being in this driver’s seat driving this thing?

Nick: I don’t know the answer. I think there absolutely will be. I mean, I think a lot of people are going to have that. I don’t know. I mean, I feel like blind people are so used to having other people drive them.

And I know that I am sometimes critical of like, oh, I could just tell by how someone’s driving. I’m like, oh, I don’t think that’s how I’d drive. [laughs]

I mean, there’s just going to be a lot of trust issues, I think.

An interesting study that we just did that I find very surprising (and this isn’t published yet, but we’re working on it): like I told you, 8 out of 10 people don’t trust these vehicles, and that’s pretty common in the literature.

So we did this big survey. I forget the exact question, but something like would you be willing to take an autonomous vehicle at the end of the day? And a large percent of them said no.

And of the people that said no, we then had them come into the lab, and we said oh, we have this app that we’ve made, that we’re testing autonomous vehicles. We have one on loan. It’s safe, and it’s been approved and all that. Do you want to take this vehicle, or do you want to take a manually driven vehicle, and we’re going to take you over to the coffee shop? And you just pick, it’s like an Uber app. And we want to see how people like this.

And the people that came in that said that they absolutely would not use an autonomous vehicle, as soon as they had that opportunity, they chose that they would. And then, they walk out to the car, and it’s a fake, and we stop the study.

We were just studying this. We don’t actually have an autonomous vehicle.

But we were trying to pit this idea of human trust (and your ideas about trust may conflict with your actual perception and your behavior). And so I think that this kind of shows that people may say they don’t trust these things, but their interest, or their kind of intrigue or what have you, actually is quite different. And they actually want to do it. They want to believe that it works, even if they’re kind of skeptical. And so I don’t know what that means for large-scale deployment, but I think that the idea of these naysayers saying there’s too much of a trust problem, these aren’t going to work, I think it’s certainly going to be a part of it. But I think a lot more people are going to do it just because they’re interested.

Jonathan: So then, you get to your destination. And we know from GPS applications that those last few meters have always been the challenge for blind people because GPS is not so accurate that it can help you find exactly where the door is, for example, of an unfamiliar building.

There may be parking issues. It may be that the vehicle can only park some distance away around the corner from where you need to be. So there’s another complicating factor there.

Nick: Yeah. I mean, it absolutely is.

But I think that this is where that idea of sensor fusion, or kind of being able to use different types of sensing information… So the GPS will have some aspect of this. I think there’s going to be a part of this that’s computer vision. I feel like kind of everyone always says oh, computer vision is the answer to everything. I don’t think that’s the case, but I think it’s going to be really helpful if you can have a camera, either on the car, or on the person, or some way that can look around, and these underlying models that can describe what’s there, and then can connect that up with the nearest GPS fix to be able to kind of reduce that error of that last meter problem.

So I think it’s going to be kind of a handoff of a number of different technologies, but I feel like that’s a solvable problem. I think it’s not yet, but I kind of see how it could happen.

Jonathan: Yeah. Because again, there are people with a wide range of mobility comfort levels.

And you do have people who use services. We have a category in New Zealand that I don’t think is available in the United States for some reason, called Uber Assist. And when you call an Uber Assist driver on Uber, they understand that if you need to be assisted into a building and taken to a particular place in that building, they will do that.

So there is the potential to alienate a lot of people who have mobility challenges, or just don’t have competent orientation and mobility skills.

Nick: Yeah. I mean, absolutely. I can imagine there’ll be like a little robot that jumps out and guides you.

Jonathan: Yeah. We can come on to that, actually, because that’s another interesting topic.

Let me ask you about the way that self-driving vehicles might be implemented in the community. Because I guess when I started on this journey hearing about self-driving vehicles, I thought I want to be the first person in New Zealand to own one.

It doesn’t look like that’s the way it’s going. It looks like these vehicles are utilities that you would summon through an app. Is that correct?

Nick: Yeah. I mean, I agree. I want to own one, too.

But I mean, pretty much all the projections that we’ve read and the car companies that we’ve talked with, everyone is pretty much talking about these things being kind of a ride share on steroids type model.

I think, part of it is that they’re going to have a lot more expense. I think, part of it is they need to make sure that they’re being maintained, and everything is being updated. And you don’t want to have some cars where you didn’t do your updates, so it’s not communicating with other vehicles. I think it’s just having control. Again, I don’t have the crystal ball, but that is pretty much what the current consensus of thought is.

Jonathan: How far away are we, do you think, from self-driving vehicles being the norm? I mean, it really struck me as disappointing that after channeling a lot of money into this, Apple has now abandoned its own self-driving car plans. So it kind of feels like this thing is not moving nearly as quickly as many of us hoped it would.

Nick: Yeah, I’ll certainly agree with that. I mean, my answer to when it will be, we might as well just go play roulette at the casino.

I think it’s going to be sooner than others. I mean, you hear everywhere from 2 years to 25 years.

There are cars that are out there. There’s scenarios that are in kind of constrained situations that they’re already doing this.

The norm? I don’t know. I think within 5 years, we’re going to see a lot more autonomous vehicles. And perhaps within 5 to 10 years, we’re certainly going to see that most cars, even previous older manually driven cars, are going to be mandated to have a suite of retrofitted (if it’s an old vehicle) sensors, so they can at least communicate with each other.

When it’s fully autonomous, I think this is going to take some legislation. There’s going to be huge fights about it. I don’t know.

Jonathan: Yeah, and there are lots of public policy questions, right, that still have to be resolved. It seems like, as you said earlier, there are some states where this is really moving ahead quite quickly in urban areas, and others just aren’t touching it.

Nick: Yeah, don’t want to go anywhere near it. I mean, you talk to DOTs (Departments of Transportation) in some of these states, and they’re like, We have no ability to do all the infrastructure that’s needed to make this work. We’re petrified of this.

And then, the other ones are saying, we can’t wait. This is going to cause so many fewer pedestrian hits and all these things that are on the rise that are really scary.

But yeah, it’s a huge policy nightmare, and I do really believe that technology will be ready to do this far before the kind of legislative landscape is in place.

Jonathan: It sounds like San Francisco is a fascinating place, that blind people are now actively able to try these vehicles out. How well is that working?

Nick: I’ve only talked to a couple of people that have done it. And like you, I think there’s been some real interest, but also some skepticism.

I mean, in general, you hear all the stories of when there’s an accident, when there’s a problem. But if you compare the mile-for-mile driven of an autonomous vehicle versus a manually driven vehicle, I mean, they’re just a heck of a lot safer.

I’m not there, so I’m only going by friends that I’ve asked. I mean, I think people are kind of interested, but skeptical.

Jonathan: Because we’re far less forgiving about machines making an error, aren’t we?

Nick: Yeah, absolutely. I mean, humans will make the error 10 times and you’re like oh, whatever. Machine does it, and you’re like, stupid thing.

Jonathan: Yeah. [laughs]

Let’s talk about robot guide dogs. Because obviously, these self-driving vehicles are only one form of autonomous device.

This has actually come up on the show before. Many listeners have expressed a strong dislike of the concept. They like the companionship that accompanies having a guide dog.

As we record this, you’ve just been training with another guide dog of the traditional furry 4-legged variety. Would you give up your cute, cuddly, affectionate ball of energy for a machine?

Nick: Absolutely not.

But I’ve kind of had this epiphany around robotic guide dogs. And you know, it seems like on one hand, you’re like oh, it seems very different than autonomous vehicles. But they’re both robots, and both deal with different aspects of navigation.

And they’ve been around since the early research in the 80s. I remember reading about these when I was in graduate school in the 90s and thinking oh man, what a dumb engineering idea! Like, these people, what else are they going to come up with?

But I was involved in a project recently of a group. And I told them when I started hey. If you want me to be involved, I’m going to come in and I’m going to give everyone a hard time because I mostly think these are dumb ideas.

And they’re like no, no, that’s great. And what they really wanted to do is not come at this from the engineering trap standpoint (just from the engineering mind focus), but to think about it from, what can we learn from understanding the human-dog team and how that partnership works (by interviewing blind guide dog users, by interviewing guide dog trainers, by going under blindfold, by watching people), and how can that be part of the modeling, the interaction that might be needed in a robotic guide dog system?

And I mean, that’s the right way to do it. So you’re not going to probably sit around and pet your robotic guide dog. I wouldn’t, at least.

Jonathan: [laughs]

Nick: And my dog, I have this bond. It is a strong emotional bond. It’s not just trust. I have this very strong bond with the dog.

But there’s also a lot of situations where you don’t want to take the dog.

It’s been hot as hell here. Last week, it was like 100 degrees. The dog, its feet touch the… When we go out on the blacktop, they could get burned.

I live in Maine. It’s a very snowy part of the country. The dog’s feet can get really cold.

If I’m going to a firework show, or a really loud concert, or some bars. They’re really crowded. I kind of don’t want to bring the dog there.

So I think there’s a lot of situations where a robotic guide dog makes sense to complement a dog. Sometimes, when I go out, I decide to go with a sighted guide, or I take my cane. I don’t want to take the dog.

But in a lot of those cases, I like using a dog. I like the guidance that it provides.

It’s a very different situation than using a cane. They both can do the same task, but it’s done in a very different way.

And I think that a robotic dog can add a lot of things that an animal dog doesn’t. So you can have a built-in navigation system where it’s telling you not just we’re at the curb, but what the street is, or where I’m passing a bench. Or I can give it information because it can have all those sensors that the autonomous vehicle can. So it can be doing computer vision in the environment that I’m walking by, it can tell me things about the environment, it can have a lot of other aspects of navigation systems that are built right into the dog.

So we’re combining mobility and orientation in one device. And again, that’s another one of these kind of fragmented ecosystems, where we have tools that are like, okay, these tools help with mobility. These tools help with orientation. But for the complete trip that I talked about earlier, you need to do mobility and orientation, or however you want to think of these words. You need to be able to find and avoid obstacles and know about your local environment, as well as orient in the environment and do spatial updating, and build cognitive maps and these things. And they’re both relevant. So if you could collapse those tools into one device like a robotic dog, I think there’s a lot of potential there.

So yeah, it’s not an either or at all for me. It’s a choice of in this situation, I’d rather use my animal dog, and that would probably be most cases. But when I don’t, I want a robotic dog that works in a similar way. I don’t want it to just grab on and have it tug me around, and I have no agency or autonomy over what’s going on because that’s the whole point of a guide dog is you’re in control, and I think that’s what’s been missed in so many of the last decades of this robotic guide development.

Jonathan: It may be that there’s some sort of protocol developed that allows a self-driving vehicle to send a signal to your robotic guide dog, so your robotic guide dog knows which is your vehicle, and guides you to that specific vehicle because there’s been some sort of data exchange.

Nick: [laughs] Yeah, and that’s what we were talking about with the Glidance device.

Jonathan: Yeah.

Nick: Something like that could absolutely work. And figuring out the devil’s in the details of the logistics of how those handshakes happen. But yeah, that’s exactly how I imagine it being something that’s useful.

Jonathan: This is an incredibly confronting discussion for a lot of people because if this technology gets better and cheaper, and even lasts a lot longer or is upgradable, we’ll surely get to a point where the only reason to have an animal guide dog is nothing to do with mobility. It’s all about sentiment.

Nick: Yeah, it’s certainly a fair point.

I don’t know. I mean, I think there’ll be people that just will always have the view that they can’t build the relationship that you need to, and the trust, the bond. That’s so important. This is what we’ve been doing with my new guy in the last couple of weeks.

With a machine, I don’t know. I think we’re a long way from that.

I’m excited about seeing how they can be complementary. And I think there will be a lot of people that wouldn’t get a dog who could benefit from a robotic guide dog, in terms of different types of independence that it would bring – whether it’s because of allergies, or they don’t have the mobility, or they don’t have other maintenance things, or they don’t want to deal with losing a dog. When a dog retires, it’s tough.

Jonathan: It is, yes, yes.

But then, of course, there may be some public pressure brought to bear on guide dog handlers who are making that choice to keep their dogs.

And you might have wider society saying well, they were fine once before this technology came along.

But you know, guide dogs can make inappropriate messes. They can shed hair. There are allergy issues. You take them into a hotel when they haven’t been groomed, they leave a mess on the hotel room floor when they’ve shed.

Nick: Exactly. Yeah, yeah, yeah.

Jonathan: And so society will start saying these guide dogs have had their day. We shouldn’t have to accommodate them anymore.

Nick: Jonathan, you’ve obviously been in this field for a long time. You ask all the right questions. [laughs]

Yeah. I mean, I have to admit, I feel I have a little, guilt is the wrong word. But I mean, there’s some feeling of that that I’ve had with…

You started this talking about Ubers. Like, when was the last time, you were asking the audience.

Jonathan: Yeah.

Nick: And for me, it was 2 or 3 days before I got my new dog, I had an Uber not pick me up.

And there’s just something where it’s like, the path of least resistance. It just would be so much easier to get my Uber and not have to stress about oh, what happens if it doesn’t get me? What happens if I miss my bus or my plane because it doesn’t want to let a dog in? And it somehow feels easier, but it also feels like a cop-out. And the societal thing, I haven’t really thought of it, but I could definitely see that happening.

Jonathan: Yeah, it’s the surrender thing that concerns me. It would be a shame if guide dog handlers felt some pressure to give up their guide dogs because of these constant refusals and problems.

I was reading some really interesting data from The Seeing Eye that they’ve just published, which indicates that people do have some ride share refusal issues and other guide dog refusal issues, but the vast majority of guide dog handlers are still determined to stick with it. There are a few who feel defeated, but most are still willing to stick with it.

Nick: Yeah. But I mean, a lot of people including myself are also doing things to just avoid the stress. So I’ll pay for Uber Pet.

Jonathan: Yeah.

Nick: And you don’t need to, you shouldn’t have to. And to many people, that makes it prohibitive already, something that’s expensive.

But I’m just like, you know what? I don’t want to deal with it. I just want to know that when this person comes out, I’m not going to have that problem. Because I have enough other things that I’m trying to fight, or trying to figure out. I don’t want to deal with something dumb because of some driver that won’t…

And so I already feel like in some ways, people are waving the white flag. And in a lot of ways, I feel like I shouldn’t do that.

That’s not actually solving the problem. I’m just getting around it through a loophole that I’m willing to pay for.

Jonathan: So the whole concept of robot guide dogs could be a bit provocative, right, because these could be any form factor you want.

Nick: Yeah.

Jonathan: They don’t have to be dogs.

Nick: Yeah, exactly.

And that’s something where talking to people has surprised me, because I fell right into that myself. With the people that I’ve been working with, there’s excitement over these new legged robots as opposed to wheeled ones. Because, you know, we live in a world that isn’t flat. That’s fine for vehicles. But a lot of places that we go, you need to be able to go up steps. And so having legged robots makes sense. [laughs]

But a good friend of mine challenged me. He’s like well, why are you even thinking about these things as a form factor of a dog? It could be something totally different. And I think there’s some interesting debates about that.

I think there’s also some aspects of, what do you get from guiding? Are there certain biomechanical and kind of physical aspects of working with something that’s having tension on a harness versus… Because people say well, why don’t I just have the parrot on your shoulder type approach?

Jonathan: [laughs] Now, I’ve got a Monty Python sketch in my head.

Nick: [laughs] You should let it out. Let it out.

Jonathan: [laughs] Yes. Or even some sort of robotic humanoid that you’d be going sighted guide with, or something.

Nick: There’s so many things that are not worked out. But I think what excites me is that at least, a lot of people that are thinking about them are starting to realize, you know what? We can’t do this without understanding the human-dog relationship.

And I mean, I’m someone that is in the human-computer interaction field. And pretty much, I’ll always come back to, at some point, if you’re going to have computers that humans use, you need to know something about how the human’s going to interact with the computer, because that’s what actually makes it work.

But that’s beginning to at least happen, and I think that’s the turning point for me, kind of how these other things fall out. There’s a lot more that needs to still be figured out, but it at least excites me that it is an option that maybe something that could be a benefit to people. Where those negatives are, I think, still need to be talked about and thought about.

Jonathan: Yeah. And it takes us back full circle, really, to what we began discussing which is that whatever the future holds, we as end users absolutely need to be influencing that future. How do you suggest we do that?

Nick: Well, talking to people that can give you ideas as a person. Be involved in discussions. If you know that there’s someone giving a talk on something, listen to it and give input, challenge what’s being said. But also, think about how it could be useful.

I’m biased. I’m a scientist. I want people to be involved in experiments.

But the things that we’re developing won’t get better unless we get, as you said when we started, people with lived experience, blind people that are trying this out, giving input, giving data.

But also, giving qualitative feedback on, this works. This doesn’t work. Because that’s the only way that this stuff will really actually get out there in a way that it needs to, and also be adopted.

We don’t need another bunch of devices that someone thinks about, and puts out there, and then never get used.

And I think it’s important that on the flip side, people that are involved are compensated. I think, sometimes, there’s kind of this pushback of people that have been part of studies, or part of experiments, and feel kind of like the proverbial mouse, you know, like I’m just giving in data, and it goes into this black hole. I have no idea what happens to it. So I think it’s really important to let people know that they’re the grassroots kind of evangelists that are kind of making the difference by being part of something. But compensating people when possible, I think, is also important from a researcher’s standpoint.

Jonathan: And there are 2 broad categories of things we’ve discussed. One is obviously these devices that might be blindness-specific – some sort of mobility tool, robotic guide dog, or whatever it might be. And then, you’ve got the mainstream devices – the self-driving vehicle.

Are you confident that manufacturers are listening, and that there is dialog, and we’re not going to be shut out of these devices and actually end up in a worse predicament than we were before?

Nick: Confident? That may be a strong word. Okay. I am confident that there are people in decision-making roles at some big companies that are thinking about this. And given kind of the interactions that I’ve had with them, I know that places like the DOT here in the States are putting a lot of money into supporting groups that are studying this type of stuff. They had this inclusive design challenge that we were a part of, looking at autonomous vehicles and other types of ways that these vehicles can be used by different groups of people. So I think I’m confident that there is thought, and research, and interest going into this.

How it gets deployed and how it gets implemented is sometimes a little bit more dicey.

Uber knows lots of things that they could do to make the guide dog denial problem better, but it’s not happening.

Jonathan: No, no. We even struggle to get their app in a consistently accessible form.

Nick: Yeah, yeah, yeah. So that also speaks volumes to what this means.

Jonathan: Well, we’ll watch it with considerable interest. And it does hearten me that somebody of your intellect and ability is on the case.

This has really been fascinating, and I hope we can keep in touch over the years as this technology evolves. I appreciate you coming on the show.

Nick: Absolutely. My pleasure. Reach out at any point, and we’ll do it again.

Jonathan: If you’d like to be a part of research on this fascinating issue, mate, have I got an opportunity for you. In the show notes, I’m including a link to the sign up page for the lab that Nick runs, and you can sign up there if you are interested in being involved with research.

Now sometimes, that research is done remotely, so it’s not too onerous. And Nick is trying to build a larger user base of blind folks who are interested in being potential research participants, as it makes a big difference.

So if you are interested, please do check out the show notes and sign up, and you may have influence over the future of some of this exciting tech.

So are you convinced? Would you like to use a robot guide dog instead of a furry 4-legged one that might do all sorts of interesting things from time to time?

Are you, like me, chomping at the bit to get your hands on a self-driving vehicle and gain some liberation from all of the nonsense that we have to put up with? Have you been in a self-driving vehicle?

I’d love to hear your stories of how it went for you. Do be in touch.

To find out all the ways that you can be in touch – WhatsApp, phone, email, just head over to, and you’ll find a good summary of how to be in touch that way. That’s on the web at We’d love to hear from you.

Advertisement: I have no doubt that this time of year, with the convention season in full swing, is a very busy time for our friends at Aira who are a sponsor of Living Blindfully, and we thank them for that.

One of the cool things they’ve introduced of late is Access AI. This gives you the best of both worlds. You can use the latest in artificial intelligence to take a picture and have that picture analyzed. If you have any questions that the AI doesn’t seem able to answer, or maybe you’re concerned about the accuracy of what the AI is telling you (because we know that the stuff still does hallucinate), you can ask a human professional agent to take a look at the image for you, and confirm whether what the AI is telling you is correct or not. Now, that is a fantastic tool to have in the toolbox when you’re traveling, such as many people are at the moment.

Make sure that you’re enrolled in Access AI from Aira. Grab the app from the Google Play Store or the App Store, and find out more by visiting Aira.IO. That’s A-I-R-A.I-O.

How to Use the Capslock Key in JAWS Desktop Layout

A few episodes ago, we were talking about the joys of laptop layout. And Stefanie was talking about how she wanted to be able to use the capslock key in desktop layout.

I was sure that you could do this. And actually, when I talked to Glenn Gordon about it, so was he. But I couldn’t find it anywhere.

However, Matthew Horspool has. He says:

“Hi, Jonathan,

I’m just catching up on Living Blindfully episodes, and heard the listener contribution about wanting to use the capslock key as the JAWS key in desktop layout.

It is possible to do this, but the setting is well-hidden.

Are you ready for the numbered steps? Here are the numbered steps.

  1. From the JAWS window, go to the Options menu, then Basics, and make sure Use Keyboard Layout is set to Desktop.
  2. Open Settings Center. If you do not have a functioning JAWS key, you can do this from the Utilities menu of the JAWS window.
  3. Press Ctrl + Shift + D (D for Delta) to load the default file.
  4. Tab once to the treeview.
  5. Press K for Keyboard, and right arrow to open.
  6. Down arrow to General, and right arrow to open.
  7. Down arrow 3 times to JAWS Key For Desktop Layout.
  8. Press the spacebar to toggle this to Capslock.”

[applause sound effect]

Woohoo! Woohoo!

But wait, there’s more. Wait, wait, wait. You’re not finished yet.

“Press enter.”

What are we up to now?

“9. Press enter to save settings, and close Settings Center.

Hope this helps.”

You’re a genius, Matthew, a genius. Thank you!

My Recent App Advocacy Experience

Voice message: Hello Jonathan, and hello, everyone else listening to Living Blindfully and in this community. My name is Monique, also known as the Crazy Dutchie here in the Netherlands.

So I just, first of all, want to say thank you, Jonathan, for all the hard work you do week in, week out, next to a full-time job. And I’m pretty sure podcasting on its own is a full-time job, the way you do it.

And also, thank you to everyone in the community who is always willing to participate, comes up with suggestions if someone is looking for something, or is having a problem.

I really enjoy listening to the podcast. Or if I don’t have time for that, reading the transcription. So that’s really cool.

What I really wanted to talk about today was advocacy. This has been talked about a lot. And I will honestly and freely admit, I’m not the best at it.

But it is important. I do recognize that.

So to anyone who’s thinking why should I bother? You should bother because if something isn’t working the way you need it to work, and you suffer from it, or you can’t get stuff done in a timely manner, that’s really not good for you.

Of course, you have to stay polite. That’s always important. But you can communicate clearly what’s wrong.

I finally decided yesterday afternoon, as I was sitting on my balcony, to take up an issue I had been struggling with for the last 6 or 7 months or so. Yeah, that took me a while.

We have an online shop here, a grocery store, the Picnic supermarket. And you can order there through the app online, and they deliver it at home. A really great service.

But in the last 8 months or so, their app has been degrading a little bit more each time. And yesterday, I updated to the latest, and oh my God! There were really a lot of weird accessibility issues in it. It wasn’t entirely unusable. But well, there’s a difference between unusable and accessible.

So you would have this list of items. And it would say name of item, name of brand. And then, it would say the price and the amount of items, but it wouldn’t tell you the specific flavor of a breakfast cake, for example.

We have breakfast cake from different brands, and they have different flavors. But it would say breakfast cake from Picnic, such and such price, 5 slices, and you wouldn’t know the flavor until you put it in your basket or looked at the details.

There were other weird issues as well.

And so yesterday, I finally took the time to write a very long, detailed email explaining all the ways in which I think the app was really not working well, and what trouble it was causing me. I sent it off.

And today, I actually got a reply. Or actually, within a few hours, I got 2 replies.

Reply 1 was, “Thank you for sending your feedback. We really appreciate you sending it. I’ve passed it on to our app developer team, and they will go work on it as quickly as they can. And if there’s any more problems, please let us know and we will do our best to fix it for you.”

And then later, I got this message that said, “Thank you once again for sending your feedback. I’ve passed on your messages to the developers, and I got a response from them thanking you for your very detailed feedback that they’re going to work with.”

He couldn’t give me an estimate as to when they will actually get it done, but they will get to work on it.

So now, of course, it’s a matter of waiting and seeing what they do with the feedback. But at least it’s been passed on, and they’re going to work with it. And if there are more issues, they’ve asked me to get in touch again.

I think that’s a really good, really nice response. I know it doesn’t always work that way. But unless you communicate nicely, in a friendly way, and clearly say what you need and what isn’t working, you won’t know how they’ll respond, and you might miss an opportunity.

So thank you, Jonathan, for inspiring me, for encouraging me. Thank you all for doing so.

Keep up the great work and the great community. Thank you. Bye-bye!

Jonathan: Well, good for you, Monique.

And you see, the thing is if you don’t try, you’ll always be left wondering what might have happened if you had. Nothing ventured, nothing gained, and all that.

So that’s very good follow-through from them. Hopefully, that will continue into something tangible.

And if you don’t hear anything in a month or two, I would give them a gentle nudge – re-forward the message to them and say, could I just please get a progress update on this?

But well done for taking that big step. It is an effort to write it all down, to not be too grumpy and defensive and put people on their guard, and to just methodically chronicle the areas where you think improvement is needed, and it sounds like you did a brilliant job. So well done!

TV Apps for Deaf-Blind People

This email comes from Coffee Dream, and it says:


“I look forward to listening to your podcast sometimes. Thank you for providing valuable and useful information.

I am Japanese, I am completely blind, and use a hearing aid in one ear and a cochlear implant in the other ear. The manufacturer of the hearing aids is Phonak, and the manufacturer of the cochlear implants is Med-El.” (That’s M-E-D-E-L.)

“We, an organization of the deaf-blind in Japan, request that television be made accessible to the deaf-blind.

In the process, I learned that there is an app called All4Access” (that’s all, the number 4, and then access) “that is a TV-viewing app for deaf-blind people that displays TV subtitles in Braille.” (with an uppercase B).

“It seems that it is being put into practical use in parts of Spain and the United States.

There also seems to be an app called GoCC4All that notifies you of emergency alert information.

Is there anyone using it? If anyone has used it, please let me know your thoughts, as I would like to use it as a reference.”

That’s a great question.

Thanks for writing in from Japan.

I hope that anybody who has used those apps can be in touch and tell us how they’re working out.

I don’t believe we have those in this country, and I haven’t come across them before.

I know that Scott Davert tunes in from time to time, and he may well have. He’s keeping his finger literally on the pulse of this stuff.

And others may know, too, so let’s hope we get some feedback for you.

Orbit Writer

Caller: Hey, Jonathan! It’s Dennis Long. Just wanted to call and give a recommendation.

I know some time ago, I had said the Hable One was a good product. But I found a better one. And that is the Orbit Writer.

Now, let me explain what makes the Orbit Writer better.

It has proper integration of Braille tables, such as the old North American Braille – what some people refer to as AEB; I forget what the actual name is. But it has that properly integrated, because iOS uses the Liblouis tables.

It also works with JAWS and NVDA, and it works on iOS, iPadOS, and with the Apple Watch.

The Orbit Writer also supports multi-device pairing, so you can pair it to more than one device, and quickly and easily switch between the different channels. It gives you 5 different Bluetooth channels, and one hardwired USB channel.

Want to use your Orbit Writer while on the go? No problem. Just grab the new TurtleBack case for your Orbit Writer, which lets you do exactly that.

It has a strap that can be detached, if you wish to have it detached. You are able to put it around your neck, carry it on, have it on your chest, flip open the top, type away. You’re done, close it.

It also has a non-skid thing, so it doesn’t slide around when you put it on a desk, and you can use it while on a desk.

And if you use the Living Blindfully code LB12, I believe it is, you will get 12% off your purchase.

So again, this is a nice little device. And it just works with iOS 18, which you’ll be getting in the fall. Because I have tried the developer beta, and it is awesome, the number of improvements they’ve made. So looking forward to that in the fall.

Jonathan: Thanks, Dennis!

I’ve used an Orbit Writer for a year now. There are times when I just want something portable to take with me for writing things down. It is great for that.

And I think the key thing that you didn’t mention about the Orbit Writer is that there will be some people who just feel more comfortable with a Perkins-style keyboard.

The Hable is laid out like a Braille cell. So you’ve got dots 1 2 3 down the left-hand side, and dots 4 5 6 down the right-hand side. And many people may be absolutely comfortable with that. Hable is a fantastic product.

But I’ve been using Perkins-style devices all my life, so I do find the Orbit Writer’s Perkins-style layout just more intuitive. I do it without thinking. It is a really good device.

If there’s one thing I would like changed in it, it’s that I’d like a USB-C port. Pretty much everything else I work with now has USB-C, but the Orbit Writer does not, and you have to remember to take an older cable with you.

Beware of Y2Mate

This could be an important advisory, and it’s come through from Howard. He says:

“In episode 283, a contributor recommended a site called Y2Mate for downloading YouTube videos.

While trying it out this morning, Windows Defender warned me that the site was unsafe.

This, of course, had me a little concerned, so I checked online to see what I could find about the site. I read several articles, and the consensus seems to be that while the site itself is legitimate, it is full of misleading links and pop-ups that could easily trick an unwary user into ending up on other unsafe sites, or downloading malware.

I would suggest you consider very seriously before using this site to download a YouTube video. If you do decide to use it, be absolutely sure you know exactly what you’re clicking on.”

Thank you, Howard!

And it’s not as if there aren’t any other great alternatives out there without that kind of thing to contend with.

The Bonnie Bulletin Ahead of Convention Time


Jonathan: It has been a long time since that music has graced the podcast. And down from 0 G, it’s the famous Bonnie Mosen.

Bonnie: Kia ora!

Jonathan: Kia ora! Welcome!

What do you think of this 0 G business? Because we haven’t talked about 0 G on the podcast.

Bonnie: Well, they do know it’s the bed, right?

Jonathan: Well, no, they don’t, because we’ve never talked about it on the podcast. That’s what I’m saying.

Bonnie: So it’s kind of the new thing in sleep technology, I guess you might say, where it’s supposed to keep your head and legs above your heart.

Jonathan: Yeah.

Bonnie: So it helps with circulation, and sleep, and all that good stuff.

Jonathan: So we’ve got this cool new bed, is what we’re saying. And it’s got this really nice remote control, which is fully accessible with the buttons on it.

Bonnie: Yeah.

Jonathan: And you can elevate the head, you can elevate the foot of the bed. And then, there’s a 0 G button that puts it in this magic position determined by science.

We did quite a lot of shopping around for these beds. And one of the things that I found that I thought was quite intriguing initially was that some of them do a massage thing. And I thought, this has got to be good. And Bonnie said ah, it’s a gimmick.

Bonnie: It’s a gimmick. Anyone that grew up in the 70s… I don’t know if it’s still around today, but if you ever went on road trips and stayed at the Days Inns and Holiday Inns and those places, you could put a quarter in the bed, and it would vibrate. [laughs]

Jonathan: Really?

Bonnie: Yes. [laughs]

Jonathan: That’s interesting.

Bonnie: It’d give you a massage.

Jonathan: I wasn’t prepared to believe you. I said, be gone with you, woman. It’s a great idea.

And then, we went to the shop, and even the guy there who was selling them said ah, it’s a gimmick. Mind you, he didn’t like the 0 G either, did he?

Bonnie: No.

Jonathan: But I said I have to try one.

So we got on the 0 G bed with the massage button, and I have to confess, yeah, it wasn’t worth the price. It’s just vibrating.

Bonnie: No. Instead of a quarter now, it’s $2,000. [laughs]

Jonathan: Yeah, yeah.

Bonnie: I mean when you were a kid, you thought it was hysterical because you’d just sit on the bed, and it would just jiggle back and forth.

Jonathan: Yeah. So we ended up just getting the one without the massage option.

And we didn’t get a split one either. Because some of them, you can split either side. So one of you can have a different level of elevation from the other.

Bonnie: Mmm-hmm.

Jonathan: But the trouble with that is you get this kind of bit in the middle. You know what I mean?

Bonnie: Yeah.

Jonathan: Like the split. I don’t like that.

But we’re quite happy with this. We do the 0 G thing all the time now.

Bonnie: Yeah, it’s great!

Jonathan: Yeah. It’s very nice.

Bonnie: It can be hard to get out of it though, sometimes. [laughs]

Jonathan: It’s incredibly relaxing having this 0 G thing.

Bonnie: Yeah. Are you sleeping better?

Jonathan: Yeah, I think so.

Bonnie: That’s good.

Jonathan: And we also, you know, got this new memory foam thing. I mean, you spend 1/3 of your life in bed.

Bonnie: Yeah.

Jonathan: That’s basically the calculation. So you may as well…

Bonnie: And the commercials.

Jonathan: What about the water beds in the 70s?

Bonnie: I never had one.

Jonathan: My sister had one, so I got to sleep in one a few times.

Bonnie: Yeah, I never slept in one. The only time I ever lay on one was when a friend of mine had one. I knew a couple of people that had them, but we never had one. They can be hard to maintain, and they’re heavy, apparently.

Jonathan: They are. They can go through the floor when you fill them up and stuff like that.

Bonnie: Yeah. I mean, there were apartment buildings where you couldn’t have a waterbed. And you know people did it. So hopefully, there’s not someone somewhere who was minding their own business one day when a big bed came through the ceiling. [laughs]

Jonathan: But why did they go out of fashion? I mean, one minute, they were everywhere.

Bonnie: They’ll probably come back.

I always wanted a water chair. I always thought that would be kind of cool.

Jonathan: Huh.

Bonnie: I don’t know. I mean, why did shag carpet go out of fashion?

Jonathan: Did it?

Bonnie: Yeah, I think so. It’s probably coming back soon because everything comes back.

Jonathan: So when you put money in the meter or whatever for the vibrating bed, that reminds me of how my parents used to take us to the camping grounds when we were kids. And they’d have these pool tables where you’d put 20 cents or whatever, and the balls would all come rolling out the thing.

Bonnie: Oh yeah, I remember that.

Jonathan: But the white ball was different. It was a slightly different size. And if you accidentally sunk the white ball, it would come back out another little slot, you know? They were fun, those coin-operated pool tables.

Bonnie: I remember those, the coin-operated pool table.

Jonathan: Yeah.

Bonnie: Yeah, you’d hear people go chink in the campground.

Jonathan: Yeah. But we are not going to see, I don’t think, a coin-operated pool table in the near future because we are off to Orlando. And I thought we’d do a bit of a preview.

We’ll take various Zoom devices to record.

Bonnie: Yeah.

Jonathan: And last year, we did a kind of a convention diary, which we played.

Bonnie: Yeah.

Jonathan: So we may well try that again. This time, with the H1 Essential.

Bonnie: Yeah.

Jonathan: But I thought I’d just get you into the studio to tell us about what you’re looking forward to, all those things.

Bonnie: It’s always great to be back in the States, you know, and see everybody. We won’t be able to do any visiting this time, unfortunately. But probably don’t want to visit too much anywhere because it’s so blooming hot right now, so you’d be stuck in the house.

But it’ll be nice to see people. And it’s just nice being around blind people, you know, just networking, and professionalism, and seeing all the new technology, and just kind of getting away for the week.

The traveling part’s not much fun. But you know, getting there is fun, and being away, and just kind of being in that bubble for a week.

Jonathan: It’s a harder trip for us compared to Houston because it’s not a direct flight.

Bonnie: Oh, yeah.

Jonathan: So we go from Wellington to Auckland. And then, we change terminals to go to the international terminal.

Bonnie: And then, we go to Houston.

Jonathan: And then, we’ve got a long one. We do the flight to Houston. And then, we get off at Houston and have to go to Orlando from there. So it’s a very long trip.

Bonnie: Yeah, it is longer than long.

Jonathan: I think it’s going to be about 24 hours of travel all up.

Bonnie: Yeah. I mean, I’ll sleep, which is good because I have my magic potion. So I’ll be ready to rock and roll hopefully when we get there, because we won’t get into Orlando until like midnight.

Jonathan: Yes, it’s very late on the 2nd we get in.

So we’re looking forward to capturing some interesting interviews, and just catching up with people, and it should be very exciting.

Bonnie: And my friend is coming that I’ve known since I was very little.

Jonathan: Aww!

Bonnie: She’s coming down to the convention. She’s never been to NFB before, so she’s coming down from DC.

Jonathan: She’s never been to an NFB convention?

Bonnie: Mmm-hmm, so I’m kind of looking forward to just having stupid girl time.

Jonathan: Oh boy! Watch out.

Well, I’m looking forward to having stupid political geek time because on the evening of the 4th of July, I am going to be cosseted in my hotel room following the election in the UK.

Bonnie: That’s Thursday?

Jonathan: That is Thursday, yes. Elections in the UK are always on a Thursday.

Bonnie: So Sarah and I can go storm the exhibit hall.

Jonathan: You’re not going to be glued to the exit polls?

Bonnie: Because we go to every single booth. [laughs] It’s sort of like a weird tradition.

Jonathan: Yeah, but don’t the exhibits close at 5 o’clock? That’s when the election starts.

Bonnie: Yeah? Oh, darn!

Jonathan: Yeah.

Bonnie: We’ll have to do something.

Jonathan: Well, anything you’re hoping to see, or buy, or anything at the exhibits, dare I ask?

Bonnie: Probably not. I mean, they always have the NFB tables with their merch. You know, they’re selling the snack packs again this year. Might get one of those. You know, you like to contribute to the little groups. And they’re good salesmen, too. Louisiana will probably have their hot seasoning, which I don’t think we can bring back.

Jonathan: The Louisiana spices, yeah.

Bonnie: The Louisiana spices.

Jonathan: I really hope they have GoodMaps again, because I just found that an extraordinary experience last year.

Bonnie: I think they do, actually. I think I heard they were.

Jonathan: Yeah. I love that. Just knowing what booth you wanted to visit, and getting absolutely accurate information about how to get through the exhibit hall to the particular booth you wanted. It was a fantastic experience.

Bonnie: One thing I had forgotten because I hadn’t been to a convention in many years was how loud…

Jonathan: It is, really.

Bonnie: When you go in that exhibit hall, it’s just like a wall of noise, and I had just forgotten about how overwhelming that was.

Jonathan: Yes. Well, it would certainly be a good test for my new hearing instruments.

Bonnie: [laughs] You might be able to hear better than me in there.

Jonathan: Oh, one never knows.

Bonnie: Yeah, I’m looking forward to it. It looks like there’s some really neat places to eat in the hotel, like a deli. And I do miss delis living here – getting a deli sandwich like a turkey and cheese, and that sort of thing.

And there’s a spa in the hotel.

You have to go across the sky bridge to get to the convention center. I haven’t done that in a long time.

Jonathan: Yeah, I understand the main hotel filled up pretty quickly.

Bonnie: Yeah, there’s apparently something else going on in the hotel, so that’ll be interesting. It’s always kind of interesting to see what else is there. I know the last time I was in a convention in Louisville, there was a judges’ convention.

Jonathan: Didn’t they have an Amway convention?

Bonnie: Oh yeah, that was at the end. There was the Amway convention, there was an undertakers’ convention. I can only imagine what that exhibit hall looks like. There was some sort of priest convention. I think that may have been 2008.

What are you looking forward to seeing?

Jonathan: I really enjoy the Resolutions Committee. The first highlight of any NFB convention for me is the debates and things that you see at the Resolutions Committee.

I enjoy a lot of the convention general sessions and the presenters, and finding out about some of the key issues.

I’m also speaking to the Deaf-Blind division, and that will be on the 5th of July in the afternoon.

Bonnie: Which is a Friday?

Jonathan: Yeah. I’m looking forward to doing that and talking about how important the relationship is between a blind hearing aid wearer, certainly, and their audiologist.

Bonnie: Yeah.

Jonathan: So that will be good.

Bonnie: It’s funny because I used to love all the technology. That was always like the highlight. You know, there was something new that you wanted to have. Like I remember the Victor Reader stream that everybody was going to…

Jonathan: You bought a Maestro.

Bonnie: I did, in 2008.

Jonathan: Did you get a VoiceNote?

Bonnie: I didn’t get the VoiceNote at the convention. I got that from The Seeing Eye auction. Mike May of Sendero donated that.

But yeah. I got a Victor Reader, and they weren’t shipping them yet because it was new. So everybody was getting one. That was 2007, the NFB in Atlanta.

Jonathan: Yup.

Bonnie: And then, when did I get the Maestro? That must have been 2007, too. Gosh! I got a lot of stuff at that convention.

Jonathan: [laughs]

Bonnie: But there’s not really anything that I necessarily want.

Jonathan: No.

Bonnie: Because I have my Mantis, and…

Jonathan: We’re fortunate like that.

Bonnie: Yeah. And I don’t think you see as much as you used to. I mean, I can remember the first few conventions I went to. There was always something, and the majority of them never saw the light of day. But usually, there was some company there that had some sort of thing, you know, the next greatest thing.

Jonathan: Yeah.

Bonnie: And they were usually the ones whose table you got stuck at, and could not get away from.

Jonathan: Because they were evangelizing.

But I think what was the case then was that with the internet not quite as developed as it is now, they would try and time releases for those conventions because they were the big selling opportunities.

Bonnie: Yeah.

Jonathan: But now that people are so connected online, it’s not so important anymore, you know, with all the demos and things. But there’s nothing quite like seeing a product hands-on.

For example, I have not put my hands on a BT Speak from Blazie Technologies, and I’m interested in doing that.

Bonnie: Yeah.

Jonathan: So there’s still merit in actually going and getting your hands on one of these things.

Bonnie: Yeah, and those are always the crowded tables. Like Humanware, you can never get near their table, or Vispero, you can never get near their tables. [laughs]

Jonathan: Yeah. Believe me, imagine what it’s like being behind the table.

Bonnie: I can’t imagine. Yeah, it’s sort of like working The Seeing Eye booth.

And on that note, Eclipse will not be making the trip with us again this year, sadly, for a few reasons. First of all, convention is hard on dogs, and Eclipse has never seen that many people in one room before. She’s been to romance writers’ conferences, but she’s never encountered anything remotely like NFB, and I don’t think she’d be very happy there.

And second, the paperwork to get in and out of New Zealand for that short a time is just not worth the hassle, particularly trying to get back in.

So she is staying with someone who boards guide dogs here – at the Silver Stream Hilton, as they call it. She’ll have fun for a week.

Jonathan: I’m sure she will. She’ll enjoy her convalescence or whatever.

Bonnie: Yeah, her staycation.

Jonathan: Yes.

Bonnie: So if anyone’s guide dog needs a pet, …

Jonathan: Yeah. I’m sure there’ll be a few that will.

Well, we will keep people updated with our progress, no doubt. And of course, I don’t know about you, but I will certainly be tracking and participating on the Mastodon. So I look forward to seeing people there.

Bonnie: Yeah, maybe I will. Maybe I won’t.

Jonathan: Tremendous!

Alright. Well, thank you for appearing on the Bonnie Bulletin. We look forward to maybe putting a convention diary together.

Bonnie: Yeah.


Advertisement: Transcripts of Living Blindfully are brought to you by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at That’s P-N-E-U-M-A

Closing and Contact Info

And if you are doing some traveling over the next little while, I hope it all goes well for you. I can’t say I’m looking forward to being cooped up on planes for so long, but I am looking forward to the rewards of doing it.

So if you are in Orlando this year, I may well run into you. I mean, I may literally run into you, you know what I mean? [laughs]

And also, all the very best to the ACB convention, which is taking place at the same time in relatively close proximity in Jacksonville. So I’m sure that’ll be a fantastic convention as well. All the very best to our friends at ACB.

We’ll see you back next week.

Remember that when you’re out there with your guide dog, you’ve harnessed success. (I mean, even if it’s a robot one, I guess.) And with your cane, you’re able.


Voiceover: If you’ve enjoyed this episode of Living Blindfully, tell your friends. Spread the word on social media.

And if you’d take the time to give us a 5-star review on Apple Podcasts, we’d appreciate it.

We love hearing from you. Be a part of the show by getting in touch via WhatsApp, email, or phone.

For all the ways to share your thoughts with us, visit That’s