Transcripts of Living Blindfully are made possible by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at http://PneumaSolutions.com.


Contents

Welcome to 283

Comments on recent episodes

New Hearing Aids Purchased

For Users of the Rode Smartlav Interview Kit Needing USB-C Connectivity

Wanting a Podcast App for My Chromebook

Blind Yoga?

Broadcasting Systems for Visually Impaired Accessibility

Downloading From YouTube

Microsoft CoPilot

Tax and Identity Verification

Microsoft Office on My New Laptop

The Security of Scanning Mail With My Phone

New Undesirable Requirements From Instacart

The Best Case for My Mantis

Ray-Ban Meta Smart Glasses

Audible on the Victor Reader Stream 3

Home Automation Experiences

Blind Audio Describer Christine Malec Discusses Her Craft, the Luminato Festival Toronto, and More

Thoughts on Glide

Google Podcasts Closing Down

Closing and Contact Info


Welcome to 283

[music]

Voiceover: From Wellington, New Zealand, to the world, it’s Living Blindfully – living your best life with blindness or low vision. Here is your host, Jonathan Mosen.

Hello!

On the show this week: what’s your favorite smartphone app for scanning these days and what’s the most secure, more home automation geekery, the Ray-Ban Meta Smart Glasses, is Instacart becoming less accessible, and blind audio describer, Christine Malec.

“Baby, if you ever wondered,

wondered whatever became of me.

I’m living on the air in Cincinnati,

Cincinnati area code 283.”

Maybe you’ll get that reference, and maybe you won’t. But WKRP in Cincinnati is one of my favorite TV shows of all time, and area code 283 belongs to Cincinnati, so hello, Cincinnati.

I would make the observation that most people who watch WKRP think it’s a comedy. Unless you’ve worked in commercial radio, and then you know it’s a documentary.

Did they give WKRP the call letters to a station in Cincinnati in the end? I’ll ask the GP.

Hey, GP, mate. What’s up?

GP: Hey, Jonathan. Not much. Just here and ready to chat. How’s your day going?

Jonathan: It’s going well.

Do you realize you are talking live on the Living Blindfully podcast, going to thousands of people all over the world? Show me some enthusiasm for that.

GP: That’s fantastic, Jonathan. Hello to all the amazing listeners of the Living Blindfully podcast. I’m thrilled to be here with you all. Let’s make this an awesome conversation.

How can I help you today?

Jonathan: I was just talking about WKRP in Cincinnati. It was originally a TV show, but did they ever allocate the call letters to a radio station in Cincinnati, WKRP?

GP: WKRP in Cincinnati was indeed a popular TV show, but the call letters WKRP were fictional and created specifically for the show. There isn’t an actual radio station in Cincinnati with those call letters. The show ran from the late 1970s to the early 80s, and became quite iconic for its portrayal of the behind-the-scenes antics at a radio station.

Are you a fan of the show?

Jonathan: I am. Are the call letters WKRP allocated to any station in the United States?

[sound of ChatGPT generating an answer]

Jonathan: That’s sounding like the old Tellme thing.

GP: While the call letters WKRP are most famously associated with the fictional TV show WKRP in Cincinnati, there have been a few real-life uses of those call letters. For example, there was a low-power FM radio station in Raleigh, North Carolina, designated WKRP-LP. Additionally, a radio station in North Vernon, Indiana held the call sign WKRP from 1989 to 1997, and a station in Atlanta, Georgia also used the WKRP call sign from 1979 until 1989.

There have also been a few television stations that used the call letters WKRP, including a station in Cincinnati branded as WKRP-TV from 2008 to 2023.

[music] “I’m at WKRP in Cincinnati”

Advertisement: We bring you transcripts of every episode of Living Blindfully. And that’s possible, thanks to sponsorship from Pneuma Solutions.

One of the cool things about the Internet is that it connects us with the wider world. But another cool thing about the Internet is that it can create places just for us. Of course, Living Blindfully is one such place. And another one is Sero.

Sero (spelled S-E-R-O) is a social network designed by us, for us.

Sero is available everywhere. It’s on your smartphone, your Apple TV, your Amazon Echo, and of course, on a fully accessible website.

If you download the Sero mobile app from wherever you get your apps for your mobile device, you’ll be able to sample some of the content free. And that includes this podcast and Mushroom FM.

But paying a subscription to Sero gives you access to a treasure trove of information, including newspapers, forums where blind and low vision people can discuss a wide range of issues, a handy accessible email client, and so much more.

You have to check out all the features. You’ll be amazed at how much is there.

Go to PneumaSolutions.com. That’s P-N-E-U-M-A Solutions.com. Access the products link, and then choose Sero for more information.

Comments on recent episodes

Imke says:

“I am responding to various topics that came up in recent episodes.

Melatonin in the United States.

I have been taking 1mg of Melatonin every evening, around 1 hour before bedtime, for about 18 months now. Before that, I never suffered from complete insomnia, but my sleep pattern clearly seemed to follow the non-24 pattern, and I often had difficulty avoiding nodding off during a late afternoon or evening presentation.

Since I started taking Melatonin, I have been sleeping through the night with no or few short wake times, and listening to presentations in the late afternoon or evening is no longer a problem. I experience no unusual grogginess in the morning.

I find that it works best when I take Melatonin at the same time each evening. Maintaining a constant time for taking the pill seems to be more important than the exact time at which I actually go to bed after taking it.

While 1mg options are more difficult to find than higher doses, I have taken over-the-counter brands in tablet form that I purchased from Amazon.com, from Target, or from a local pharmacy. Thank you for mentioning Source Naturals as the brand you were taking. It looks like they have 1mg lozenges.

Swimming in a straight line.

For swimming in casual practice, one way to swim in a straight line is to ask the pool staff to set up a single rope for demarcating a single lane on one side of the pool.

For competitive swimming, when I was in 6th grade or so, my mother organised to have a type of harness made for me. That harness was hung onto a rope that was strung across the pool, a bit above the water. When I put on the harness and swam across the pool, the harness slid across the rope, guiding me across in a straight line, and I had freedom of movement for my strokes. Constructing such a harness might require assistance from someone skilled in doing so, and setting up the rope would again require help from the pool staff.

Hearing aids.”

And this came through before I did the big demo of the Phonak apps, and Roger and things. So I’m going to skip some of this because he has questions I think I’ve now answered. But he says:

“Regarding app accessibility, I now find the Widex app to be quite accessible.”

Well, it’s very good to hear because it was extremely dodgy when I last looked at Widex in 2019. He says:

“It is not perfect, but usable with VoiceOver.

For example, at the top of the home screen, VoiceOver says a percentage. This is the percentage of charge remaining in one of the hearing aids. Double tapping that percentage reveals a new screen that clearly identifies the charge remaining on each hearing aid.

I have not used the adjustment options much, in part for fear of messing something up, and in part because I have gotten along well with the programs I have.

Regarding rechargeable hearing aids, I recently purchased an extra travel charger for my Widex aids. With this, I was able to charge the aids on the airplane’s USB port from my portable power bank while on an international trip.

However, I always bring along an older spare pair of battery-operated hearing aids to use when my primary ones are charging, or otherwise malfunctioning. I think that, in the future, I always need to have at least one battery-operated pair of hearing aids around, or the primary and secondary aids both need to be chargeable with the same travel charger.

I am following your reports about your search for your next hearing aids with interest, as I, too, will likely be in the market for new ones next year.”

Now, I’ve been thinking about this rechargeable business. And if they were to make one where you could somehow recharge the aids while they’re in your ear, I might be able to just live with that. I mean these days, you’re getting quite good battery life from rechargeable hearing aids. But it’s on those long-haul international flights from New Zealand that they’re not feasible. And also, as I said earlier, those disaster situations. But if you could somehow have something dangling from your ear that connected to a power bank, then I would live with that, I suppose, if the feature set was good enough. But having to take the hearing aids out of your ear and not be able to use them while you’re charging them, that, for me, is a very serious concern.

What you’re talking about, Imke, is a good workaround of sorts. But I find that your brain does adapt to the hearing aids that you wear. So if you put old hearing aids in, even those you used to wear for years, it sounds jarring because your brain’s changed. It’s adapted to the way that you now hear, so it’s not an optimal solution.

New Hearing Aids Purchased

Speaking of hearing, for those who’ve been following my own hearing aid journey, I should report the conclusion to this, which is that I have purchased the Phonak Lumity 90 behind-the-ear hearing aids with disposable batteries. Overall, I decided it was going to be the best option for me at the moment. If I weren’t in the studio recording podcasts like this for you, I might have made a different choice. But I may not have.

And there are things I don’t like about these aids. I don’t like the Bluetooth 4.2 that’s a bit old and laggy. I don’t like the fact that it doesn’t have AuraCast support. But I did need to make a change, and I decided that this is the best option for now.

I did seriously investigate trying another set of aids before taking the plunge. And if you are in the market and you’re not in New Zealand, I’d recommend taking a look at these. They’re from ReSound (that’s R-E, and then sound with a capital S). It’s a pun, you see. And the new one is called the Nexia. They were only released in February 2024. They do apparently have a disposable battery version of this already, which is very impressive.

But the New Zealand distributor does not have it here, and didn’t seem particularly interested or engaged about getting it here. I was so intrigued by the technology (they do have AuraCast already) that I was even considering lowering my standards with regard to rechargeable hearing aids, just to see what they sound like.

Because the thing about this hearing aid technology stuff is that all the marketing sounds so impressive, doesn’t it? You hear the marketing about these hearing aid products and you think, these are going to change my life. And hopefully, you get a bit of incremental improvement. But often, they are way overhyped (at least, that’s my experience).

But I wanted to try this ReSound because AuraCast is definitely the future when it comes to hearing aids. The idea is that you could be in an airport, or some sort of public place. And when there’s an announcement that comes over the PA or whatever, that announcement could come straight to your hearing aid. You could be on a bus, even. And when they announce the stops, you could hear that directly through your hearing aids as well, in theory. And it’s also a much better alternative to hearing loops in theatres, and that kind of thing.

So this is going to be the future. It’ll take some years to roll out. But certainly, when I was considering new hearing aids, AuraCast was a factor. Phonak isn’t offering it yet.

And ReSound just weren’t interested in engaging in a timely manner. ReSound were very slow to respond to my audiologist’s queries, and they basically didn’t seem particularly keen to make the sale. I made the point to my audiologist. Look, if they’re this slow to respond when there are thousands of dollars up for grabs, how quickly will they respond if there’s a problem with the aids?

So I reluctantly gave it away because I just need really good quality service. If something breaks, I want the company concerned to be in a position to do whatever it takes to keep me hearing. And I know that Phonak will do that. They’ve got technicians here in New Zealand. They don’t have to be sent overseas, which is the case with some hearing aids, and they’ll look after me. So that was a significant consideration.

But if you are in another country where ReSound has very good service, then it’s something worth considering, I think. Because on paper, the technology sounds impressive. But for the wide range of needs that I have – recording this podcast, being out and about, hearing in background noise, all of those things, I think we have got it pretty right.

And I’ve worked extensively with my audiologist over the last month or so. We’ve done plenty of tweaking. And one of the things I do like about the Phonak Target software that they give audiologists is that there are a lot of parameters you can tweak. So we’ve been having a lot of fun, and we geek out during these sessions.

I am just so incredibly grateful that I have an audiologist who I’ve worked with on and off for the last 30 years. I haven’t lived in Wellington continuously for 30 years, so there have been times when I’ve worked with other audiologists. But actually, I’ve been in Wellington the majority of the last 30 years. I’ve worked with the same audiologist, and I’m so grateful. She is incredibly patient with me and she treats this as a bit of an adventure [laughs], when we start a process like this.

And it’s stressful, isn’t it? It is stressful because you can make a tweak, and you go out in the real world and you find that that tweak has done something that you didn’t expect, so you have to make another appointment. It is really cool these days that audiologists can remote into so many hearing aids and make tweaks. So if something really bad happens, you can initiate a remote session a lot of the time and roll back something that may be particularly disastrous.

But our hearing is so important. Our needs are complex, as has been discussed during this process. Sometimes, you’ve got to hear in traffic. Other times, you want that traffic or other stuff filtered out as irrelevant, annoying background noise. It takes a lot of skill to fit hearing aids for blind people well. I’m very lucky that I struck gold with my audiologist.

And I also want to thank Phonak and their audiologist here in New Zealand, who has been incredibly responsive and helpful, offering ideas when I’ve been whining on about things that aren’t working the way I want them to.

Now things are working, I think, as best as they can, perhaps with the exception of a couple of tweaks, but we’re doing okay. So here we are with the Phonak Lumity 90 hearing aids, now actually mine.

[music]

Voiceover: If you’re a member of Living Blindfully plus, thanks for helping to keep the podcast viable.

If you haven’t yet subscribed, why not do it today?

Get access to episodes 3 full days ahead of their release to the public, you’ll get advanced notice of some of our interviews so you can have a say in what we ask, and you’ll help keep the podcast viable by helping to fund the team to do their work.

Our guarantee to you is that everyone who works on the podcast is blind or low vision, so we’re keeping it in the community.

Find out more. Visit LivingBlindfully.com/plus. That’s LivingBlindfully.com/P-L-U-S.

Pay what you can. It all helps. Thanks for your support of Living Blindfully plus.

For Users of the Rode Smartlav Interview Kit Needing USB-C Connectivity

For a long time, I resisted getting one of the Zoom recorders because they weren’t accessible.

But back in 2018, when I started AiraCast (when I was working for Aira), I did have a need to record on location. So I got one of these Rode SmartLav interview kits that just plugs into the Lightning port of your phone, and you get 2 microphones that clip to people. One goes on the left channel and one goes on the right. I’m happy I’ve finally got an accessible Zoom recorder now, but that kit wasn’t bad at all.

Matthew Bullis is writing in about what you can do if you’re switching to a USB-C iPhone and you’ve got one of these gadgets, and you want to keep using it. He says:

“Hello,

I wanted to be prepared for when the time comes to change over to USB-C when I upgrade my iPhone. So I called Rode here in the States to ask about an upgrade for the interview kit.

Several years ago, I heard about this set of microphones that you plug into an adapter, which then plugs into your iPhone. The adapter is the SC6L, and it has the earphone out to hear your audio as you go. Quite a nice kit, that is, and I wanted to keep using it.

“The Rode representative I spoke with looked up this product, and let me know what I’ll need to buy to continue using this set of SmartLavs. Rode has the AI Micro,” (Oh boy! AI’s everywhere.) “which is a small device. But rather than having a hardwired Lightning plug on one end, it comes with several cables which you can switch out as needed for the connection to computer or phone. It’s $80 US. This new product is powered by the bus of whatever device you’re using.

So your 2 microphones go in with the earphones on one end, and your cable to the device on the other end. This AI Micro will work with the various Rode apps like Rode Reporter, which I configured for left and right sound capture. The SmartLav microphones give quite a good stereo soundscape when I’ve used them on several projects.”

Thank you very much, Matthew. That is really useful intel. Much appreciated.

Wanting a Podcast App for My Chromebook

“Hi, Jonathan and fellow Living Blindfully listeners,” says David Van Der Molen.

“I got, from my wife, a Chromebook for my birthday.”

Well, what a wonderful present. What a wonderful wife to give it to you.

He says:

“I’m trying to find a podcast playing app that works well with the ChromeVox screen reader. I’ve tried PocketCasts, which works fairly well with my Smart Vision 3 phone, but it isn’t user-friendly with ChromeVox, at least in my opinion. I’ve also tried Podcast Addict, with no success.

The app I’m looking for would need to allow me to take advantage of my Living Blindfully plus membership, allowing me to enter the RSS code that comes from Pinecast.

Looking forward to any suggestions anyone has on this.”

I wonder, David, whether The Old Reader might be worth a look, at TheOldReader.com. It’s an RSS-based feed reader, so it’s not just a podcasting client; it works with any RSS feed. And for distribution, podcasting is just RSS under the hood. The Old Reader is a great, accessible RSS aggregator, and it works well with podcasts. You can chuck the Living Blindfully RSS feed into it. And the cool thing is that not only would it work on your Chromebook, but you could also use it in any other web browser as well.

Rock on over to TheOldReader.com. Give it a go. Let me know how you get on with it.

If anybody else has any suggestions for David’s Chromebook and rocking the podcast, be in touch. opinion@LivingBlindfully.com. 864-60-Mosen is the phone number.

Blind Yoga?

Carrie Francis is in touch.

And thank you for your condolences, Carrie. She says:

“I was wondering if you could help me with something.

I am currently participating in a yoga class at my local YMCA. While the instructors are very good and have tried to help me master some of the poses, I have trouble remembering how they are formed since the class is only held once a week. It would be very helpful if I could practice the poses at home, so I could master them before the next class.

Do you know of any free mp3 tracks I could download which describe yoga poses for someone who is blind/visually impaired?”

Good question, Carrie.

I remember a long time ago (and I’m talking probably 10, 15 even years ago), there was a very professionally produced yoga tutorial produced for the blind community. I have no idea if that’s still around. My memory of it is very vague, but I do recall it being talked about quite a bit on social media at the time.

But I do have a low-tech solution. What if you took your iPhone, or some sort of recorder, along with you and asked the yoga instructors to talk into it and describe the poses to you? You could then take that home and play back the recordings, and you’d know you were doing the poses exactly as the instructor wants.

There is a Voice Memos app built into the iPhone, so it’s right there. But you could also get a third-party app like Just Press Record. There are plenty of options that would do this for you.

Alternatively, perhaps somebody can tell us of a website which describes common yoga poses.

ChatGPT could be your friend here, too. If the class uses standard yoga poses with common names, you could ask ChatGPT. I do this all the time. I say, “I’m a blind person. Explain to me in a non-visual way, how do I do this?” And ChatGPT gives me remarkable results. Most of the time. Not all the time, but most of it.

If anyone has any recommendations for Carrie, yoga’s good for you, isn’t it? 864-60-Mosen, or opinion@LivingBlindfully.com.

Broadcasting Systems for Visually Impaired Accessibility

Let’s go to Norway, where we’re hearing from Kurt. He says:

“Firstly, I want to express my appreciation for your informative and insightful podcasts.”

Thank you so much, Kurt.

“A bit about myself. I am blind, using VoiceOver for Mac. I work as a journalist and musician, and I’m a big fan of Mac, Logic, and Amadeus Pro.

Thank you for the excellent introduction to Amadeus. Your teaching method was both efficient and effective.

I currently work for a small Christian interest organization in Norway called KABB.” (That’s spelled K-A-B-B.) “Our mission revolves around advocating for the rights of the visually impaired within both the church and society.

I have been given responsibility for establishing an online broadcasting system, and this is where your expertise comes into play. I am seeking advice on what considerations to keep in mind when planning universally accessible broadcasting systems tailored for the visually impaired. I have heard of a system called Myriad, which operates entirely in the cloud. Do you have any experience with this system?”

I don’t, Kurt. Not only do I not have experience with it, it’s not one I’ve even heard of.

Kurt continues:

“Additionally, many have spoken highly of Station Playlist.”

Station Playlist is a fantastic suite. It’s what we use on Mushroom FM.

I’ve got a really low serial number for one of the Station Playlist products. I think I was about the second person to buy it, so I’ve been there right from the ground floor of Station Playlist.

I’ve seen it evolve. The developer, who is actually based here in New Zealand, has a strong commitment to accessibility. He knows he’s got quite a sizable base of blind users, and he tends to look after them. And they’re supplemented by some very good Station Playlist scripts for JAWS, which are produced by Hartgen Consultancy. Those scripts have just moved to a subscription model. They are worth every penny, and they are maintained. Most importantly, the Station Playlist scripts for JAWS ensure efficiency. When you’re on the air, especially in a busy environment, you may have a whole lot going on: you’re thinking about timing, maybe responding to email and social media. Efficiency matters.

And it’s not just the studio product as well. Station Playlist comes with a complete suite for broadcast automation and remote voice tracking. And if you want to hear Station Playlist in action, you can hear a whole bunch of blind and low vision broadcasters using it on MushroomFM.com. It really is a superb suite.

Now, the downside for you, Kurt, as a Mac fan: there is no Station Playlist for Mac. And I’m not aware of anything accessible on the Mac that comes close to its efficiency and feature set.

If somebody knows of something that is as good as Station Playlist that works with Mac, please let me know. I’d be interested in hearing about that myself.

“Furthermore,” says Kurt, “I am in need of recommendations for a portable mixer that is easy to carry and operate for individuals with visual impairments.

I appreciate any insights or recommendations you can provide.”

Well, I hope the project goes well, Kurt. Very best of luck.

In my experience, particularly in recent times, what I’ve found is that a lot of people think they need a mixer, but can actually get away with an audio interface. There are some very good accessible audio interfaces on the market where you can plug in multiple microphones. They have basic mixing functionality. Products from Focusrite and Audient are both very accessible and very powerful.

But if it is a mixer that you actually want, then we get into the question of, do you want an analog mixer? Do you want one with digital features that presents itself as multiple channels in Windows or Mac? Mac, by the way, has a much better sound subsystem than Windows does.

If you’re talking analog mixers, then I don’t think you’ll have too much trouble finding something that you need.

If you want something portable, then you want to think about the number of channels you require because the more channels you have, the larger the mixer’s going to be. Companies like Yamaha, Allen & Heath, Behringer are often highly regarded in the mixer space. There are other players as well.

And this is a matter of personal preference, I suppose. But from my perspective as a blind person using a mixer with a lot of channels, I will always go for faders rather than knobs. And the reason for that is that you can run your finger across the channels and see which faders are up, which are down, and their relationship to one another in terms of the balance between all those faders.

So it would be useful to know a bit more about what it is that you need in a mixer. How many channels does it need to have? Is purely analog okay? There are plenty of good products.

You might want to jump on The Blind Podmaker group because although you’re doing radio and not podcasting, there are good people on that group who can give you further advice about the mixers that they’re using, microphones, that kind of thing. To subscribe, you send a blank email to creators-subscribe@TheBlindPodmaker.com. That’s creators-subscribe@TheBlindPodmaker.com.

You can also find out more about the Station Playlist scripts for JAWS by going to HartgenConsultancy.com. That’s H-A-R-T-G-E-N Consultancy.com.

And I understand there’s a good Station Playlist add-on for NVDA as well. So lots of options for Station Playlist out there.

Downloading From YouTube

Ali Kazi is writing in and says:

“Hi, Jonathan,

“To download from YouTube, either as an audio or video file, I use a website called Y2Mate.” (That is spelled Y, then the number 2, and then mate.)

“The nice thing about this site, apart from it being available on all platforms, owing to not needing to install an app, is that you can carry out the following easy steps to download from YouTube.

  1. Go to YouTube in the normal way, and find a video using search or whatever way you ordinarily would find a video.
  2. Press enter to load the video.
  3. Find the address bar, and the URL of the video should be highlighted.
  4. Amend the URL slightly by adding the letters PI after YouTube and before .com, so the URL will read YouTubePI.com, etc.
  5. Press Enter, and the download page will appear from which you can select video or audio, and the quality of the download. Video is selected by default.

From there, follow the basic instructions to convert and download the video. Conversion will take longer if the video is longer, but perusing the screen will show an accessible percentage indication.

I hope people find this useful.”

Microsoft CoPilot

“Hey, Jonathan,” says Christian Bertling.

“I was listening to episode 279, and I’ve been really interested in CoPilot and just AI in general for quite a while now.

First off, what was the feature in Outlook that you really want to use that’s not in the classic Outlook, so I can look into it and learn more about it?”

Well basically, Christian, if you have Microsoft CoPilot for Office 365 (which is something that we’re evaluating and I may deploy in the organization of which I’m CEO in my day job), you can do all sorts of things like summarize emails and craft responses.

So a question might be asked: can CoPilot look at your organization’s information and draft a response for you? My understanding is that, over time, it will even learn how you like to write. It can look at your sent items, for example, and start analyzing your writing style. It’s very good, but it does require the new Outlook, and that’s a bit of a show-stopper for screen reader users right now.

More on this in a future episode of Living Blindfully, I am hoping. I’m hoping that episode is not too far away.

“Also, do you have an idea when you’re gonna talk more about CoPilot?”

Oh, there we go.

“Since I’m really interested in it, I want to catch the episode or episodes where you talk about it as soon as they’re available. And I might even become a Living Blindfully Plus subscriber just to get the episode or episodes where you talk about CoPilot right when they’re dropped.”

I mean, dude! That’s an incentive right there, Christian. Whoa.

It should be, as I say, sometime in June. Talks are continuing. I promise you it will be worth the wait because we’re going to dig into CoPilot in a very informed way, and just ensure that people understand what power is at people’s disposal with that. So hang on in there. I hope to get the episode published in June.

And of course, thank you very much for considering subscribing to Living Blindfully plus. I really appreciate everybody who does so.

Tax and Identity Verification

Stefanie says:

“Hello,

I realize this is a bit late for the gentleman who was filing his taxes, but I hope this will work for next year.

I use TurboTax and have been doing so for the past few years, based on the recommendation of someone else who is blind. When it comes to verifying my identity, all I have to do is enter my ID number, dates of issue, and expiration. While these are written in print, as long as I can get them from someone who can see, usually my parents or someone else I trust, these pieces of information don’t change until I get a new one. I haven’t had to submit pictures or anything like that.

Furthermore, I attempted to make an ID.me account and abandoned it after I saw what I would need to do to complete this process. However, I did not need to have an ID.me account when I created a social security account. I don’t remember what I had to provide but it was created, and I do use it to look at information regarding benefits.”

Thank you, Stefanie. Much appreciated.

Advertisement: Living Blindfully is brought to you in part by Aira, and I thank them for their sponsorship of the podcast.

You know we’ve become used to some businesses offering free Wi-Fi. It’s a nice touch, and it makes us feel valued whenever we come across it.

And I know similarly that when I learn about a business that has purchased Aira Access, it’s like putting out a massive “Blind people are welcome here.” sign. I know that if I need it, I’ve got a trained professional agent available to me to provide assistance, and that means that the business cares enough to pay for that. I appreciate that.

From airports, to Starbucks, to Target, and more, Aira Access can assist you to navigate, shop, browse and not be reliant on friends, family or others who may not understand our needs as well. And don’t forget that as well as the offerings in physical locations, there are other businesses providing Aira Access that can be used from home.

So you should check out the app to find out what’s available. To do that, just open the Aira Access section in the Aira Explorer app on your favorite device. You can also visit Aira’s website to find out more at Aira.io. That’s A-I-R-A.I-O.

Microsoft Office on My New Laptop

Let’s go shopping once again.

Carolyn Peat says:

“I am in the market for a new laptop, and just have a question. I know the specs I should be looking for, and I know I want Windows 11.

I currently have Microsoft Office Home 2019. I recently heard comments on your podcast about the inaccessibility of Outlook, and this concerns me.

Does anyone know if the latest version of MS Office with Outlook is accessible? I can’t afford the subscription for Microsoft Office 365, so will be purchasing it instead of subscribing. Any feedback is appreciated.”

Yes, Carolyn. If you buy the Office suite, you will get the classic Microsoft Outlook built in. That’s not going to be going away anytime soon.

Meanwhile, Microsoft is working on a new Outlook product which is heavily HTML-based. It’s one of those universal apps that are becoming so common these days.

But if you buy Microsoft Office, you’ve still got the Outlook we all know and, if not love, kind of tolerate and have become familiar with over many years.

Good luck with a new laptop. Toys are good.

The Security of Scanning Mail With My Phone

Caller: Hi, Jonathan and Living Blindfully listeners. My name is Tim, and I want to introduce a topic regarding scanning mail.

Years ago, I used OpenBook to scan books, and I did some scanning of mail, and that was great. I did it with my PC.

Well, for a while then, I stopped scanning mail, and I received help from others.

But recently, I decided to start scanning mail again. However, now I wanted to see how the phone would work with scanning mail. I’m using an iPhone, and I was very pleased. It worked very well using the various programs – Seeing AI, Voice Dream Scan, and other scanning programs, the AI programs.

I began though to be concerned because the question I have is, unlike on the PC where everything was local, is this scanning going somewhere on the Internet? And is it safe, for example, to scan mail? And I know that you’re advised for Be My Eyes and perhaps for Be My AI not to use the scanning for materials such as bank statements or other statements that are sensitive materials.

So what do you all think about that? How do you scan mail? Do you try to get assistance to review your mail that is secure, you know, that is sensitive, such as using FaceTime with someone you trust? Or do you go ahead and use these scanning programs to scan statements? And is it safe or is that dangerous to scan because it’s captured on the Internet?

Any suggestions and input would be much appreciated.

Thank you so much. Bye-bye!

Jonathan: Thank you for calling into the podcast, Tim. It’s good to hear from you.

My understanding is that VD Scan, as I think it’s called now, the old Voice Dream Scanner, is all on device. When you scan, it’s just essentially like it was in the old days of OpenBook. The optical character recognition is right there on your phone. It gets installed when you install the app.

I think there may be some other apps like this. KNFB Reader, which I think is now built into NFB Newsline, may well be one of them.

My understanding regarding Seeing AI, and for that matter, Envision, is that if you use their short text feature where you just hold the camera above a document and you get instant text, that’s all happening on your device. If you go to the document mode and it lines you up, and you take a full picture of the page and it reads the full page to you after analysing it, that’s going out to the internet so that it gives you the best possible recognition that it can.

Now, people’s level of tolerance for this will vary. My personal view is that using an app like Seeing AI or Envision to do that is not a problem because it’s an encrypted connection. I don’t believe data is being stored. It just takes an image that you’ve sent, processes it in the cloud, and then sends it back and it doesn’t keep that image. It doesn’t store it in any way. If you want to store it, you can save it on your own device, but they don’t keep it. That’s my understanding.

Now, when you get onto large language models where you might take a picture of something, I don’t actually find that they’re very good for reading mail anyway, because their primary function is not to read a document to you verbatim. I guess you could ask it to, and sometimes it will, but it prefers not to because it’s kind of giving you a summary. It’s giving you a picture.

That is where I personally start to get nervous, because you're not sure whether any information is being used to train that large language model, and I don't think everybody's being particularly transparent about that. You can give one of these large language models pictures of things that aren't particularly sensitive, and they do an amazing job on many occasions. But I would think twice about sending it a bill or something that's personally identifiable. It's about using the right tool for the job.

That’s my own perspective. Others may have different ones, and it’s a very good topic that you’ve raised. I hope we get some contributions on this.

New Undesirable Requirements From Instacart

Caller: Hi, Jonathan! It’s Jim from North Florida. I wanna talk briefly to you and the listeners about something very important. We talk about how important it is for the merchants and vendors we work with to be accessible and inclusive. I wanna talk briefly about a service called Instacart, which is a delivery service for stores in the United States. I use an iPhone to access it.

The long and short of it is, I’ve used this service for a long time, but they started a new practice where they want you to take a photograph of your picture ID and your form of payment. But they also tell you in the email they send you that you must block out certain confidential information to protect your identity, and they advise you to do so.

Quick summary. Last week, I placed an order. Everything went fine. This week, I placed an order, but I wanted to use a different form of payment.

They have 2 forms of payment on file – my visa card, and my MasterCard. Yes, I know that’s very trusting to be given both of them.

The system defaulted, for some reason, to my MasterCard. I didn’t want to use that. So then after 3 times of trying to select the Visa card (because I knew that that other card is a debit card, and there wouldn’t be enough money in that account, so I wanted to use the other card) well, it locks me out.

Then when I called customer service to let them know the problem and see if they could talk me through how to fix it, they said oh well, sir, you need to do this. And he sent me this long, complicated email. And I went through the complicated email, and it said this description about how you handle this picture and stuff. He promised me that he would have someone in leadership call me.

So I waited several days. No one called me back.

Then I called yesterday, had a conversation again with another representative. They told me they'd call back within 30 to 40 minutes. Imagine that! I'm sure you have a cool sound effect for that.

But anyway, so they didn't call me back within that time, which I didn't think they would. So I gave them about 18 hours, because I need to get food for myself and my guide dog, and there's a service dog in the home. And unfortunately, that didn't happen, either.

So I called today, and imagine that! They said that there was no request for an escalation.

So then, the way it finally ended up was that this gentleman that I talked to tonight said he would escalate the call.

So what I’m asking people to do is to reach out to Instacart politely and respectfully, just like we did with the people on the last app. I’m not going to name it here, but everyone knows what I’m talking about, the reading and scanning app that many of us use. And I think that was very productive. So I ask people to please reach out to Instacart and encourage them to go back to being accessible and inclusive.

They also used to have an accessibility line – if you had a problem with the app, placing your order. They took that off. A couple of other companies have done that, too. I guess it’s a cost-saving measure. I don’t know.

They have 2 departments. But the figures are getting worse. Wanting this camera and picture and all this stuff is a little much.

Sorry my message was a little lengthy, but I wanted to give all the details.

I gave them information that they have, so they can use the information they already have about me. I don’t know why they need me to take photos of cards they already have and my picture ID. But that’s not accessible, and they didn’t need it before. Why do they need it now?

There's also no disclaimer or anything in the app saying oh, by the way, we've changed our terms. We need new information. So there's nothing there.

Thanks for your help, Jonathan! You take care, and keep up the good work.

Jonathan: Thank you, Jim!

Instacart is not a service that we have in New Zealand, but I’ve certainly heard people on social media referring to it regularly. I know it’s been a popular product in the blind community.

So what are other people experiencing? Is this a problem for everybody? 864-60-Mosen. Would love to get your thoughts on this, if you are an Instacart customer. 864-606-6736. Or you can drop me an email with an audio attachment or just written down to opinion@LivingBlindfully.com.

The Best Case for My Mantis

Anne Murray put it best when she said, we all could use a little good news today. And here’s Joe Norton with some.

“First,” he says, “some more good news.”

See, told you.

“I’m now the proud owner of a new Mantis Q40, which, appropriately, I’m using to type this letter. I’ve only had it for a day, but I’m already loving it.

I had to take a trip out of town yesterday, and the device was going to be delivered at around 10 AM. However, we were also supposed to leave about that time. So I’m on the front porch with bated breath, waiting for the delivery. It arrived before the people who were going with us on the trip, so I breathed a sigh of relief.

Now, I had my first dilemma – how to play with this new device? As you know, the device doesn’t come with a protective case, just the bumper case supplied with the unit. For something this new and expensive, you would want to protect it. My youngest daughter had a laptop case that I borrowed for the day because I couldn’t stand leaving the Mantis at home for a whole day. It would be lonely if I just left it there, wouldn’t it?

So with that problem temporarily solved, I had a play with my new Mantis on the road. I couldn’t do a lot, but I did connect it to my iPhone. I even connected it to my iPhone’s hotspot, so I could connect to Bookshare and NFB Newsline. The NLS Bard service came later last night, when I got home and updated the firmware in the unit to take advantage of it.

Anyway, I can tell this thing is going to be another tool in my arsenal of equipment for my job search and future employment. I had a feeling I would enjoy using the Mantis since I already have the NLS Humanware eReader. It appears to use much of the same technology, but the Mantis has a few things the eReader does not.

I thought I would have an easy time using the Mantis. And for the most part, it has been fairly easy.

However, over the past, oh, half a year, I've been used to all the chord commands on iOS, so I'm having to relearn things.

There is, however, a nice document published by Humanware or APH, I’m not sure which, that details using it with iOS, so I’ll get into that directly.

Using it with JAWS is almost effortless, since I just use most of the keyboard commands I’m already used to. Got to check if there are any QWERTY keyboard shortcuts for Braille.

Now, for the second dilemma – which case to buy? Humanware sells a case from Executive Products for $160 US. However, one of your sponsors, Turtleback, sells a case for, I believe, $145 US. Does that discount code still work?”

It does, so you’d get another 12% off, Joe. [laughs]

“And do you know if each case has pros and cons? If you have tried both, do you know and do you dare answer?

Or can any of your listeners weigh in on this? That would take you off the hook.

It is an important question since my house gets dusty, and I’m going to keep this thing covered when not in use.

Thanks, and all the best.”

Thanks, Joe! Good to hear from you.

I sent my Mantis away for a scheduled cleaning this time. You might recall I accidentally dropped my Mantis on its keyboard and broke the escape key a few months ago. But this time, I just wanted a good old clean of the Braille display, having used it for so long.

It is a great device. I really do enjoy the QWERTY input and the Braille output.

I'm very glad you raised this question for 2 reasons. First, it gives me a chance to make it really clear that when we sign a contract with a Living Blindfully sponsor, we are explicit that it doesn't give a sponsor any kind of priority in terms of the regular content of the show, nor does it make a sponsor immune from criticism. So I'm super grateful to Turtleback for sponsoring Living Blindfully, but they know and accept without any reservation at all that somebody might say that another company's case is better. So no problem at all there.

But in actual fact, I have not seen the Turtleback case for the Mantis yet. When I got my Mantis many years ago, the Executive Products one was what was on offer, so that's the one I have.

My only criticism of it is that it's really difficult to get the Mantis in and out of the case. Man! So much so that when I sent my Mantis away recently for maintenance, I just sent it in the case, because it's a very, very tight fit.

But I don’t have any other experience to comment on. Others may. opinion@LivingBlindfully.com, or 864-60-Mosen.

And Joe, good luck with your Mantis. You’re going to love it, I’m sure.

Ray-Ban Meta Smart Glasses

Peg Jolene says:

“Hi, Jonathan,

Attached are samples of my experimentation with the Look and Ask feature of the Meta glasses. I’ve also included a description of the glasses via CoPilot and ChatGPT.

It is as follows:

The Wayfarer Meta Ray-Ban Glasses are a modern twist on the classic Wayfarer design, integrating advanced technology to enhance everyday experiences. Here’s a detailed description of these smart glasses.

Design.

The Wayfarer design is timeless, featuring the signature trapezoidal frame that has been a symbol of style since its introduction in 1956. The frames are made with high-quality materials, ensuring both durability and comfort for all-day wear.

Technology.

These glasses are equipped with dual integrated 5 megapixel cameras for capturing photos and videos, discrete open-ear speakers, and a 3-microphone audio array for clear calls and voice commands.

Look and ask feature.

One of the standout features is the look and ask capability, which allows users to access Meta AI by simply looking at an object and asking a question about it. For instance, you could look at a landmark and ask for historical information, or look at a product in a store and ask for reviews or prices.

Meta AI.

The Meta AI assistant is activated with the voice command Hey Meta, providing hands-free assistance. It can send messages, make calls, set reminders, and more, all without needing to take out your smartphone.

Connectivity.

The glasses can connect to your smartphone via Bluetooth, and are compatible with the MetaView app, which allows for easy sharing and editing of your photos and videos.

Battery life.

They can offer a day’s worth of battery life under typical use, which includes capturing and sharing content, taking calls, and using the voice assistant.

Privacy.

Meta has emphasised privacy with an LED capture light that turns on when recording and a power switch to turn off the cameras and microphones for added peace of mind.

The Wayfarer Meta Ray-Ban glasses are a fusion of iconic fashion and cutting-edge technology, offering a seamless way to interact with the digital world while staying present in the moment.”

And here’s Peg’s demo.

Peg: Hey, Meta. Look at the screen and tell me what’s on it.

Meta (male voice): The screen appears to be displaying a desktop background with various icons and windows open.

Peg: Hey, Meta. What item is highlighted?

Meta: The item highlighted appears to be a Dell desktop computer.

Peg: Hmm.

Hey, Meta. Read and tell me what icon is highlighted on the screen.

Meta: The highlighted item is a Dell desktop computer. There are various icons and windows open on the screen, but it does not appear to be focused on any specific icon or window in particular.

Peg: Hey, Meta. Look and read the computer screen.

Meta: The computer screen displays various software and file names, including Reaper x64, Studio Recorder, Document 1.

Peg: Hey, Meta. What does this microphone look like?

Meta: The microphone appears to be a black, cylindrical device with a silver grill on the front. It has a flat base, and is connected to a computer via an audio interface.

Peg: Hey Meta, look and tell me about this microphone.

Meta: The microphone appears to be a high-quality condenser mic, designed for capturing clear and crisp audio. It has a sleek design with a silver body and black grill.

Jonathan: Thanks, Peggy!

I take it that’s a Blue Yeti it’s describing. Could be wrong, because it didn’t actually say the brand, did it?

I really appreciate that demo. I know there’s a lot of talk about these Meta glasses out there at the moment.

Audible on the Victor Reader Stream 3

Let’s talk about Audible on the Victor Reader Stream 3. Curtis Chong is in touch and says:

“Hello, Jonathan,

While I am truly glad that Audible has stepped up to the plate and made it possible for users of the Victor Reader Stream 3 to read content from Audible, I should alert you to an outstanding problem for those of us whom one might call legacy Audible users. That is, people who signed up for Audible before Amazon acquired the service.

Here is a short synopsis. In order to read content from Audible, a person has to install the Audible Sync program, and then register the Victor Reader Stream 3.

However, if you are someone who signed up with Audible before it became part of Amazon, and if your email address when you signed up is no longer in use, you cannot sign into Audible Sync using your Amazon credentials. Even if you have some inkling about what your original Audible username and password are, you cannot invoke the forgot password function because you don’t have an email where the link to the reset function can be sent.

Humanware has acknowledged this to be a problem. I guess some of us just have to wait for the fix to be released for Audible Sync, or use a third-party program to convert Audible files to the mp3 format.

I can’t tell you how frustrating it is for a blind person to call Audible support, press the key that indicates you’re a person who is visually impaired, and then talk to a first-level support person who knows absolutely nothing about what Audible has done to enable Victor Reader Stream 3 users to read Audible content.”

Thank you for writing in, Curtis.

That does sound frustrating, and I hope some progress can be made on that very shortly.

Home Automation Experiences

I enjoyed our chat with Dustin Boggus from the My HomeKit Home podcast because home automation can be truly practically useful. It’s also got the cool creds right there, hasn’t it?

Elijah Massey is writing in on this. He says:

“My first experience with home automation was when I won an Echo Dot and some Amysen” (that’s A-M-Y-S-E-N) “smart plugs in the Braille Challenge when I was in high school.”

Well congratulations, Elijah.

He says:

“I mostly used the Echo for getting notifications, asking for things like the weather, and my alarm, and I used the smart plugs for turning off chargers when I wasn't using them, since I didn't have anything else to use them for yet.

When I lived in a dorm during my first year of college, I bought the Amazon Basics microwave. It works with the Soup Drinker, and also has a Braille” (with a lowercase b) “overlay for the buttons.

In addition, I added an IFTTT automation that would notify me when my iPad or iPhone battery went below 50, 20, or 10% using Apple Shortcuts to trigger it. It would also send notifications to my Echo, in addition to the app.

IFTTT” (if this, then that), “is a service that integrates with tons of different platforms and allows you to create simple automations.

I also created an automation that I would trigger when I went to bed. First, I would trigger an iOS shortcut with Siri which would turn on sleep focus, silencing notifications, and enabling sleep tracking on my Apple Watch. Then, this shortcut would trigger an IFTTT applet which would then trigger a Soup Drinker routine.”

That’s Amazon’s service. We don’t say its name because it will trigger them all.

“The Soup Drinker routine would turn on Do Not Disturb on my Echo, and set its notification and alarm volume to maximum. I had an IFTTT shortcut and a Soup Drinker routine that were triggered when I dismissed my alarms to undo these changes.

And when I got a few Android devices, I had this IFTTT automation trigger Tasker, using AutoRemote, to turn on Android's Do Not Disturb. Tasker is an Android app, similar to Apple Shortcuts, although it has fewer limitations.

Near the end of the year, I bought an Echo Show 8, Google Nest Hub, and HomePod Mini because I wanted to see what the other assistants can do, and if the smart speakers with screens were useful. I have found that each assistant has some things it’s better at than the others.

For example, the Echo is the only smart speaker that can receive notifications for things like Amazon packages being delivered, or a weather alert, the HomePod is the only one that supports sending and reading text messages, and the Google Nest Hub can track my sleep. The Echo Show also has a feature called Show and Tell, where you can hold a product in front of the camera, and it can try to identify it. It worked pretty well. They also sometimes give different weather forecasts, which can be helpful for knowing what to expect.”

I’ll just pause Elijah’s email there to say that at Christmas last year, we bought Heidi and Henry, the Wonder Son-in-law, an Echo Show, because it’s what they wanted, and I was looking forward to trying Show and Tell. But it doesn’t seem available in New Zealand. It just didn’t work.

Elijah continues:

“When I moved into an apartment, I started buying more smart home devices. I bought a Govee” (I presume that’s how it’s pronounced. It’s spelt G-O-V-E-E), “tower fan, a Govee air purifier, a smart plug from TP-Link, and a Coreddy” (that’s C-O-R-E-D-D-Y) “R650 robot vacuum and mop.

Everything was easy to set up with the Soup Drinker and Google, except for the robot vacuum because of poor directions and accessibility issues in the app, but I eventually figured it out.

I used the smart plug for a small aroma diffuser that I got for free from Walmart, and I chose not to use the smart plugs I had before because they were hard to set up and the app had serious accessibility issues.

The only device that worked with HomeKit was the smart plug. But I used Homebridge to add my fan and air purifier. And after that, they worked in HomeKit just fine.

Homebridge also brought the Bluetooth thermometer that came with the fan into HomeKit, which was very helpful because the thermometer didn’t work with any other assistant. This allowed me to check the room temperature and humidity with Siri. I added most of these devices to my bedtime and morning Soup Drinker routines, and I created HomeKit scenes for leaving and arriving home to turn the fan off and back on, and change some other things, because HomeKit seems to be able to track my location using my iPhone more accurately than the Soup Drinker can.

Later on, I bought a smart evaporator cooler from Evapola” (that’s E-V-A-P-O-L-A) “called the EVASmart” (that’s EVA, E-V-A, and then another capital Smart) “that works with the Soup Drinker and Google.

However, even though the cooling is extremely effective,” [It’s cool, baby. sound effect] Yes, thank you, thank you. “it’s a pretty small device, and cools a space about the size of someone’s head.

So I decided to purchase another one. But the only bigger smart evaporator cooler I could find, the Princess Smart Air Cooler, was only available from the UK.

I ended up using a package forwarding service to ship it to the US, and I bought a voltage converter so I could power the device. It’s a tower fan with a built-in water tank so it cools a much bigger area, although the evaporative cooling seems less effective. Both these devices only support the Soup Drinker and Google, and not Siri, since there is no Homebridge plug-in for them, and I needed sighted assistance to connect the EvaSmart to Wi-Fi since you have to do that using a touch screen on the top of the device.

In addition, I was able to connect the Nest thermostat in my apartment to my Google account and to both the other assistants using Matter.

For a while, I controlled it with the Soup Drinker routines to set it a little cooler at night, until my apartment complex put a portable air conditioner in my room when the other students in my apartment moved out, which uses less electricity than running the main AC.

A few months ago, I bought a small space heater from Govee when it got really cold during the winter, and it works with the Soup Drinker, Google, and Homebridge, although it’s a little more complex to control since it shows up as both a fan and a thermostat.

Since I live in a 4-bedroom apartment that is usually shared between multiple people, I probably cannot get very many more smart home devices. But one reason I look forward to moving into a private house or apartment in the future is so I can add more smart devices such as locks, doorbells, cameras, washers and dryers, security systems, different sensors in every room, etc.

However, I just found an amazing project called Home Assistant that I have been moving all my smart home and automation stuff to. Home Assistant is basically a system for managing smart home devices, and it has tons of integrations for different devices and services. You can even use devices that have Homebridge plug-ins but no Google Assistant integration by importing the HomeKit devices created by Homebridge.

It goes beyond other smart home platforms in several significant ways.

First, you can control devices in ways not possible with other platforms. For example, I can add my Nest Hub and HomePod Mini as media players which means that I can play sounds on them from automations, and even play text-to-speech messages. This lets me trigger iOS shortcuts from Home Assistant by using the Sound Recognition trigger, which I could not do with IFTTT. I can also control my Echo Show and Echo Dot, including controlling Soup Drinker routines, performing custom commands, playing TTS announcements, opening skills, etc.

Second, its automation system is way more powerful than IFTTT or Soup Drinker routines. You can trigger automations from pretty much anything you can think of, and actions can literally do anything you can do in Home Assistant. There are also powerful scripting features like templates, conditions, and loops.

Third, you can add things into Home Assistant which aren't normally thought of as smart home devices at all. For instance, there's an iCloud integration that creates sensors for the location and battery status of all your Apple devices, and the Android and iOS apps add numerous sensors that track different aspects of your mobile devices. All of this means I can replace my IFTTT, Apple Shortcuts, and Tasker automations that notify me about the battery status of my devices with a few Home Assistant automations that track the battery sensors from iCloud and the Android app and send notifications to the companion app and my smart speakers. This is even better than my previous setup, since when my batteries get really low, I can make all my smart speakers and my Android devices speak this notification out loud. I could also add complications to my Apple Watch that track the battery status of all my devices.

You can also create custom sensors and devices by combining data and actions from several different devices, or connecting other platforms with webhooks. For example, if I bought a smart button pusher like the SwitchBot, I could create a custom thermostat that takes the temperature from my Bluetooth thermometer. And when the temperature goes above the temperature I set, it could press the button on my non-smart AC to turn it on. I could control the desired temperature with this thermostat, just like any other smart thermostat.

Another use for this is controlling devices that have remotes but no smart home functionality, by using an IR blaster like Broadlink and making them appear as regular smart home devices. You can also import all devices in Home Assistant into the Soup Drinker, Google Home and HomeKit, including custom devices you create. You could even have Home Assistant perform actions on your Windows, Mac, or Linux PC when you say a voice command or generate a weather forecast using ChatGPT and have your smart speakers read it out loud in the morning. All of this is only part of what Home Assistant can do, and I will probably continue to discover new things this incredibly powerful system can do for a long time.

The iOS app and web app are extremely accessible, although the Android app can be a little harder to use. However, you do need to edit text files written in YAML to configure certain things. You also need to host it yourself, and I run it on my Raspberry Pi. They do sell stand-alone devices that run it, but I would still say it’s probably mostly for more technical users.

They also have a cloud service you can subscribe to for a fee that makes some things easier, such as importing devices into the Soup Drinker and Google, and accessing Home Assistant when away from home. However, if you are willing to put in more work to set things up, you can do all of this without it.”
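If you're curious what the kind of automation Elijah describes actually looks like, here's a minimal sketch of a Home Assistant low-battery automation in YAML. The entity, notify service, and media player names here are hypothetical; yours will depend on which integrations you've set up.

```yaml
# Minimal Home Assistant automation sketch, along the lines Elijah describes.
# Entity and service names below are hypothetical examples.
automation:
  - alias: "Phone battery low announcement"
    trigger:
      # Fires when the battery sensor (e.g. from the iCloud or
      # companion-app integration) drops below 20%
      - platform: numeric_state
        entity_id: sensor.phone_battery_level
        below: 20
    action:
      # Push a notification to the companion app
      - service: notify.mobile_app_my_phone
        data:
          message: "Phone battery is below 20%."
      # Speak the same message aloud on a smart speaker
      # that has been added as a media player
      - service: tts.google_translate_say
        data:
          entity_id: media_player.bedroom_speaker
          message: "Phone battery is below 20%."
```

The same pattern extends to any sensor Home Assistant knows about, which is what makes it more flexible than IFTTT's one-trigger, one-action applets.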

Happy geeking out, Elijah. Who knows what your home will be like in 10 years' time? It may well be bossing you around and telling you what it requires you to do.

Advertisement: Living Blindfully is brought to you in part by Turtleback.

They’ve been working with our community for decades now, and they understand our requirements when it comes to leather cases for the devices that we use. They need to be durable, protective, and most of all, functional. You can be confident that when you buy a leather case or accessory from Turtleback, you’re buying quality.

Visit TurtlebackLV.com today to find out more, and be sure to use the coupon code LB12, that’s LB for Living Blindfully, and the number 12 for 12% off your order at checkout.

Or you can phone their friendly team. The number is 855-915-0005. That’s 855-915-0005.

Make the most of your device by putting it in a Turtleback case. TurtlebackLV.com, and don’t forget that coupon code, LB12.

Blind Audio Describer Christine Malec Discusses Her Craft, the Luminato Festival Toronto, and More

There is no shortage of things we still must do to advance equality of opportunity for blind people around the world. But I also find that taking strength from the progress we’ve made gives us the energy to keep improving the world.

And one area where we have made commendable progress in recent years is audio description. There’s much more of it now than there ever has been before. Most of us have access to it on domestic television, and all the major streaming media services have a growing collection of titles available.

And there’s more! We now find audio description in movie theaters, at live plays, and in museums.

It’s now rightly considered a key component of accessibility. Accessibility to all information is important, and that absolutely includes entertainment, the arts, and culture.

Christine Malec’s name and voice may be familiar to you. She leads a busy life, but just one of the things that she does is co-host the popular podcast Talk Description to Me. And she’s involved in other podcasting efforts as well, which we will talk about.

Christine, it’s a pleasure to have you here on Living Blindfully. Welcome!

Christine: Thank you. I’m very happy to be here.

Jonathan: Tell me about yourself and how you got into audio describing. I’m sure there’s a great story behind that.

Christine: Well, it started when I took one of those sort of get-acquainted-with-a-new-career type jobs.

I had been a massage therapist for 20 years, and I really just wanted to do something new.

And so I started working at the local CNIB, which is the Canadian National Institute for the Blind here in Canada. It’s one of the leading blindness organizations.

I took a job as basically, sort of a receptionist in a community space. And what it let me do was to just kind of keep my finger on the pulse of things, and find out what venues and opportunities were available to make new programs for the blind community.

And in doing that, I started making programs that involved audio describers. I got to meet some audio describers who are based in Toronto, and do some really fun in-person things like okay, we’re going to take 60 seconds of Cirque du Soleil, you’re going to sit with your finger on the pause button, and we’re going to spend an hour unpacking what a circus performance looks like. Or, you know, we’d get together a group of people and let’s talk audio description. Or this was one of my favorites. We took the footage of Chris Hadfield, the astronaut in space, performing the song Space Oddity.

Jonathan: That was amazing.

Christine: And there’s a very famous video, right? But never audio described. So I got 2 audio describers in the same room, we separated them, and they each did an improv description and audio described that piece of video. And so, doing fun things like that got me connected with audio describers who were working.

And during the pandemic, I got to be closer pals with JJ Hunt and we started doing podcast work, which was the Talk Description to Me podcast.

But at the same time, as things began to open up in the real world, I began to do consulting with other audio describers where they would be producing a script for a live performance of some sort, and they would want feedback from a blind person who was an audio description user about their language choice, what they were choosing to describe, the approach they were taking. And so that work was incredibly satisfying because I’d been an enthusiastic audio description user for years. And to have input on how it gets created is incredibly rewarding and fun. So I still do that work as part of my work life as well. So if it’s a live performance or a script for a film, and I also do voiceover for audio description scripts for a few Netflix projects as well.

Jonathan: I know that people’s careers change over the years these days. It’s rare to see somebody stay in the same career all their working life. But that’s one big change for you.

Christine: [laughs] Yes, it’s very, very dramatic. And I guess the link is that when I was doing massage therapy, I was always, of course, focused on the client. But it also gives you a lot of mental downtime in a way because it’s very sort of meditative and physical. And ideally, you probably shouldn’t be talking unless the client wants to. And so I had a lot of time to think.

So I’ve always been someone who is processing a lot of the world around me. And when I was ready for a career change, apparently, I was ready to go 180 from a really introverted profession to something that’s turned into extremely extroverted work.

Jonathan: And Canada’s in pretty good shape with audio description because you’ve got federal mandates there, and all of the networks are required to provide it in prime time.

Christine: It’s true. And even for the average person who doesn’t care about it, watching television, at the beginning of many programs there’ll just be this little pop-up that says this program has audio description available. Or this program is described, has video description, you know, described video, and this is how you access it. So it is mandated, it’s built in.

And also, we’re close to the US, which also, you know, is a major media producer, and they’re producing a lot of stuff, too. And also, Toronto being a very big city, we have lots of opportunity for live stuff, too. So plays, parades, museum installations, stuff like that. So it’s increasingly available in places where we never thought of it being available before.

Jonathan: One of the arguments that’s sometimes advanced is that if you can draw some sort of line between the core audience, in this case, those who can’t see and a wide audience, then you’re more likely to get some traction. Do you find that many sighted people find audio description valuable in some way? I mean, is it really true that a lot of sighted people switch on audio description when they’re preparing dinner, for example, so they can enjoy their favorite TV show or Netflix series?

Christine: It is really true. I have friends who keep it on because maybe their hearing isn’t that great, or they have trouble keeping track of what the character’s names are. Maybe they’ve looked away.

And there’s another fascinating overlap which I got to dive into a little bit recently in one of the podcasts that I was helping to produce, which is people on the autistic spectrum find it helpful for several reasons. One could be that facial expression can be hard for someone on the autistic spectrum to understand. It could be that they’re not sure where their visual focus should be when they’re watching something on a screen.

I had someone recently describe how they had the audio description device. They were checking it out. They were using it at a performance because their job was the tech run. They’re accessibility consultants. And she found that it was a dance piece with no dialogue, and just a few percussive sounds on the stage. And as a neurodiverse person, she was finding it difficult to be in the moment because the sensory stimulation was just not appropriate for her. But having the describer’s voice as a continuous audio input into her brain made it way more comfortable for her to experience the performance.

So there’s all these overlaps that, as usual, when you make something more accessible for one group, it always overflows into another group. So audio description is a classic example of that.

Jonathan: What are the qualities, do you think, that make a good audio describer? And I guess even more slightly controversially, what are the qualities that make a good audio description?

Christine: Hmm. One of the important things that I didn’t think about until I started working with audio describers is that they must have a very good sense of story. And I never thought of this but if you are deciding what to describe and when, you have to understand the story arc and you have to understand what’s important, what the consumer needs, what information do we need for the story to make sense.

An audio describer also needs to have a really good command of language because you have to be pithy. You have to distill as much information into as little time as possible. Because often, in a piece, you don’t have a lot of time.

So good sense of story, good sense of language.

If you’re the narrator as well (it’s not always the same person who writes the description as the person who narrates it), hopefully you have a clean, crisp delivery that has some inflection.

So it leads into… The second part of your question is what makes good audio description, which isn’t necessarily the same as a describer. So the description should be audio engineered well into the piece if it’s TV, or however it’s being presented.

But I feel that the narrator should have some inflection.

I was watching something recently, and I was completely distracted from it because the audio describer had a very monotonous voice, and they did not have natural pauses built into their sentences where you might have commas or periods.

And this is something that I think about a lot in image descriptions as well: if you’re using a screen reader, lots of commas, lots of periods. Because if the brain is taking in information in an audio way, for me, at least, I need those little natural pauses for my brain to process information in an organized way. So I’d say those are the things that are important for a describer and for good audio description.

Jonathan: Television has evolved. In earlier times, it was more radio with pictures. So when you hear audio descriptions of older movies, for example, classic movies, it can sometimes be difficult because there’s a lot of dialogue and not that much silence in a lot of those older movies.

These days, though, it seems to me that there are longer gaps in actual dialogue, which give audio describers a bit more room. Is fitting in all that you need to say a constant challenge in these things?

Christine: Yes, 100%.

I was doing the narration for a Netflix series called The Magic Prank Show. And I was totally stoked for this because it talks about a topic that I’ve been diving into in other contexts, which is illusion and stage magic. So I thought, awesome! I’m going to learn a ton about how this is done, and it’s going to be great.

But the series had its own voiceover narration, which is a great feature. We all love it. But it meant that there was very little time for an audio describer to write in the script.

And so I know, for sure, I missed a lot in describing it. But in watching it, because I was interested, I missed a lot because there just was not the time.

And so an innovation that some describers use, but it’s really hard to find the context, is something called extended description. And this is something I’ve been involved with occasionally if, say, an organization is producing its own video. This isn’t network TV or something, but it’s an organization that has some latitude.

So what they’ll do is give, you know, “Okay, there’s 45 seconds of intro notes before the film begins. The film might pause at some point, to really expand on something.”

There’s specific use cases for that. But fitting things in is definitely a challenge.

And so at the other end of the spectrum is… I watched this great movie called Spaceman with Adam Sandler, of all people. It’s a really nice sci-fi.

Jonathan: [laughs]

Christine: I know, I know, Adam Sandler. He’s doing a serious role.

There were vast stretches of no dialogue. And it’s one of those things where you go, “I would never watch this without description.” So you just have to work with the piece that you’re given, and work with the time that you have.

Jonathan: Getting a bit geeky for a bit, because there are geeks like me who listen to this. How do you do the audio describing, technically? Are you able to produce those from home, or what’s the process like?

Christine: Each organization has their own process.

As a narrator, the way I work is I get a script, which is the audio description script. I don’t get the series. I don’t get the show script. I don’t get any audio at all. I just get a simple script of text that is meant for me to read. And occasionally, there’s notes like brisk or pause. Sometimes, there’s a pronunciation guide.

What I do is I keep one earbud in. So I’m listening to JAWS, I’m arrowing down, and then I’m reading the description narration into a mic and recording it. I edit it for clarity, make sure I’ve got everything right. I send it back to them.

Then, there’s a technician (the script has been written by a sighted person), and they take my audio narration, they splice it up, and they fit in the little sentences and phrases at the appropriate place, and then bang! You’ve got a produced audio described piece.

Jonathan: Have there ever been situations where you’ve been following a series that you get to audio describe so you get to know how this cliffhanger resolves itself before the rest of us?

Christine: [laughs] I’ve only had one instance where I was brought back for season 2, but it’s a documentary series. So no, I haven’t had that experience yet where I’m like, what’s going to happen? [laughs] Maybe, maybe that will happen in the future.

Jonathan: I’d love to do the audio description for For All Mankind. That is just one of my favorite Apple TV Plus shows and favorite shows, generally. Great stuff.

Christine: It’s a good one. I enjoy it, too.

Jonathan: It really is. I’m so relieved they’ve renewed it for a season 5.

Christine: Yes.

Jonathan: Looking forward to that.

How did the Talk Description to Me podcast come about? Because it’s a very interesting concept, really. For those who don’t know about it, perhaps you could tell me a little bit about what its objectives are, and how you thought of starting the podcast.

Christine: Of course.

So JJ and I are friends sort of outside professional life. And we would just talk, and he’d fill in gaps about things. We realized how much of the world is not described, and it started with thinking about news. Because in news coverage, say on TV or on screens, there’s constant voiceover, and there are images. But because there’s constant voiceover, you can never audio describe that. And so the images that are involved in prominent news stories are never described for blind and low vision people.

This is where we started with it. It was during the pandemic at the beginning, and there were just so many weird visuals. People wearing masks was new, empty streets were new, boarded-up stores; the atmosphere was very eerie and unfamiliar. And JJ and I would talk about this, but we’d also say like, the blind community is not getting these images. And, you know, podcasting is ideal for working from home, which we were all doing. So we just got to thinking like there’s a huge gap in the news.

But also, then, we just started expanding out into thinking physical gestures that people use, body language, coins, currency, landscapes, you name it. There’s no end of things to describe.

So we started out covering topics of the day, like topical news stories. And sometimes, that was very intense because the news is often not very cheerful.

Okay. We’d sometimes do an episode on the Northern Lights, or the constellations. I’m a big astronomy geek, so I’m always like, “Can we do something on the Mars rover, or the James Webb telescope unfolding in space?”

And so the podcast includes a really big range, and I love the idea that someone could just go to the website and cherry-pick – just go, “I’m not that interested in dance, but I’m super interested in the James Webb telescope,” or, “I don’t care about the visuals of Star Trek, but I really want to know about the explosion in Beirut and what that looked like,” or the buildings collapsing on 9/11. We did an episode on that as a memorial, or COVID memorials.

So there’s such a wide range of things on the site for people to look at. It’s called Talk Description to Me, and it’s TalkDescriptionToMe.com.

And the thing is, they live there as a library and as archives. And something I’ve really come to realize is I can get something described to me, and it makes perfect sense, and I’m so grateful. But 6 months down the road, I’ve probably forgotten.

A sighted person sees certain iconic images all the time. They’re sort of burned into their brain – say, the moon landing. We did an episode on that. And for sighted people, those images are sort of, they’re just there. They live in your mind. But if you’ve only heard the description once, maybe it’s not so entrenched there. So they’re worth a second or third listen because I’m really proud of the work that we have done and continue to do with it. It’s really filling a gap for the blind community.

Jonathan: Do you get input from listeners in terms of what they want you to discuss and describe?

Christine: We do. We’ve always invited that.

We’ve done episodes on the look of baseball, say, which is something that neither of us might have thought to do. But people are interested in that, so we covered that.

And some of the space stuff. I’m very outspoken on the show. “I’m such a space geek.”

Jonathan: [laughs]

Christine: I get pretty excited. So I get lots of emails. “I want more space stuff, too.” So of course, I focus in on those. [laughs]

But you know, we did one on the 4th of July – imagery for the United States because again, that’s not something maybe JJ and I would have thought of. But a lot of our listeners come from the US, and that’s something that they would be interested in. So yeah, feedback has been super great from listeners.

Jonathan: I love the space geekery.

Christine: [laughs]

Jonathan: I am going to take some annual leave or do something whenever we get a human back on the moon. I’m so looking forward to that.

Christine: Oh my gosh! It’s true.

Jonathan: And the audio is so much better now from space than it was then.

Christine: Well, there’s these Mars recordings too, from the rover where there’s actually a recorder on another planet, and you can listen to sounds that were generated in the very thin atmosphere of Mars. That just sent shivers down my spine.

Jonathan: Yes, it’s quite amazing.

The Chris Hadfield video also sent shivers down mine. Hearing him sing in space was amazing to me.

Christine: Oh, I’m so happy that there’s a guitar in space.

Jonathan: Yes. [laughs]

Christine: It just pleases me so much. I’m a musician and a singer, too, so to know that music lives outside of Earth’s atmosphere is just so heart-warming to me.

Jonathan: Yeah.

People’s appetite for audio description varies a lot. We had a very interesting discussion during the pandemic, I guess it was, when there was this little movement that seems to have peaked and died off quite a bit where, at the beginning of meetings, people were describing themselves in quite vivid detail to blind people on the call. And what people found, at least in some cases, was that if you were the only blind person on the call, you felt like you were holding up the works while everybody said hi, my name is so-and-so, and today I’m wearing, … you know, and this is what I look like. It made them stand out in a way that made them feel uncomfortable.

And some people said look, I don’t care what you’re wearing or what you look like.

Christine: [laughs]

Jonathan: And other people like me were saying I’m actually kind of curious about this.

These days, I guess the middle ground might be that we have AI, which we didn’t really have in common usage when that debate got started. So now, for example, I can be on a Teams call with one of my staff, and I can do PictureSmart in JAWS and actually find out that this person I’ve worked with for 5 years has a beard, and I had no clue that they had a beard. And I don’t need to know they have a beard, but I just find it curious as a curious person.

[laughter]

Where do you stand on that debate?

Christine: It’s such a good question.

The short answer is I’m not really in favor of it for the reason that you gave. And also, it’s very time-consuming. I don’t need to know those things.

And is someone going to tell you everything? If someone’s got, I don’t know, a really conspicuous birthmark on their cheek, are they going to tell you that? Probably not. People tell you what they want to tell you. And so it’s not really that accurate.

But what I will say is that there was an aspect of it that I absolutely loved.

So in Toronto, we’re very multicultural. It’s very diverse. Most people in the city were not born in Canada. And when I hear accented speech, I know someone probably was born somewhere else. That’s cool.

But there are so many people who were born in Canada to immigrant parents. And I love to hear on a call, just people say, I’m a la la la of Asian descent, or I’m, you know, I’m African Canadian, and they don’t sound it. And I just love that because the diversity of my city is something that I love so very much. And I can hear it sometimes, but there’s so much more of it that I can’t take in because it’s visual.

So in general, I’m not a fan of self-descriptions. But secretly, I do kind of like them. Although, it often happens, someone will say, “I’m wearing John Lennon sunglasses,” and I go, that’s not helping me. [laughs] So I just have to kind of smile and go oh, yeah, and just move along. So the short version is no, I’m not really in favor of it, but I will say I enjoyed it while it was a bit of a trend.

Jonathan: And then, it creates a cultural dilemma because you don’t want to seem ungrateful, …

Christine: Yes.

Jonathan: but you might want to say look, enough with the personal self-descriptions already. Can we just get on with the meeting?

And people think you’re being ungrateful.

Christine: Yes, yes, yes. It’s awkward. It’s awkward. [laughs] I’m glad the trend moved away.

Jonathan: Yeah.

Christine: They are good. It’s true.

Jonathan: Yeah, it’s true.

And if all this isn’t enough, you are also involved with a podcast for the Luminato Festival in Toronto. And when I heard about this, I thought, “Is this like some sort of secret society or something?”

Christine: [laughs] It does sound like that, doesn’t it?

Jonathan: It does, a little bit. But what is it actually, really?

Christine: What it really is is a festival that’s about 11 days in June each year, and it’s a festival of arts and culture that’s been going for many years in Toronto. And it’s live events. It could be performance art, a play, a concert.

There’s a public square that they’ll take over every night for 11 nights where there’ll be music, and activations, and artists and all kinds of things to do. It’s really citywide, like the whole Greater Toronto Area. There’s installations all over the place. There’s mural installations in subway stations, like on the transit. There’s all this diverse stuff going on culturally.

And I really want to note, they did this before the pandemic. So the pandemic was a real turning point in access for many organizations. But Luminato was starting this before the pandemic.

They started having something called an access advisory, and this was a group of people from wide-ranging parts of the disability community who came together to talk about how the festival and the organization could be more inclusive and accessible. And one of the things that came out of that is something called Radio Lumi. So L-U-M-I, short for Luminato. This is an audio-only experience of the Luminato Festival, and it’s sort of led by members of the blind community and contains lots of audio description. And it’s from sort of a blindness perspective, but it’s a brilliant audio-only experience.

If you want to knead your sourdough bread, or drive to your parents’ house, or whatever you’re doing where you need to not be looking at a screen, this is also content that’s really valuable. And so this year’s iteration of Radio Lumi is a standard podcast feed. So anywhere you get podcasts, look for Radio Lumi, L-U-M-I. It’s a series of podcasts that’s heaviest during the festival season. So we’re releasing 2 a week at the moment. After the festival, we’ll still be producing, but a little less frequently.

It’s artist interviews. It’s interviews with producers. It’s audio-described content of visual art. There’s conversations where one of the blind or low vision hosts like me will talk with an audio describer about public art. Tell me about public art in the city. Where would I see it? What does it look like? How do I know that I’m looking at public art? How do people around us, how are the general public reacting to it?

We have conversations with audio describers about things like urban parks and public squares. Those are such a vibrant part of city life. But if you’re blind, for me, they’re just navigation hazards that I’m most concerned to avoid because they’re sort of unpredictable, and they’re sort of blank spaces.

And one of the most lovely parts for me is I get to interview people from a blindness perspective. Say I was working for a mainstream radio network or something; I would sort of have to act like anybody else, you know, ask questions that they’ve been asked a million times before. But what happens when you ask questions from a blindness perspective is the artist has to think about their work in a different way. And it almost always happens that the artist says, “I’ve never been asked that before,” or “That is such a great question,” or “That’s one of the most interesting interviews I’ve ever done.”

For example, a couple of years ago, I got to interview Edward Burtynsky, who’s a fairly world-renowned photographer, and he makes enormous photos of the impact humans have on the natural world. And they’re devastating photos, but they’re completely visual, of course.

So what Radio Lumi did was had an audio describer describe the images. Then, I got to chat with Edward Burtynsky as a blind person. So I could say something like, “Okay, what makes a perfect photo?” And he’s got to think about it more deeply than he normally would.

One of my interviews this year was with a modern dancer who, you know, he’s a dancer. That’s his art. That’s what he does. And I got to say, “Okay, I know you’re not an audio describer. This is probably new to you. But take the first 30 seconds of your piece, and please describe it for me, and what do you want me as the description user to know?”

And that, for me, is one of the greatest parts: it’s not just getting that part of arts and culture that everyone gets. I get to ask the questions that the members of the blind community want to know. And as far as I can tell, it’s pretty unique.

And Luminato has really stepped up to do some fantastic stuff with the festival. For example, they have whisper guides. This is brilliant. For the live events that I mentioned in a public square every night for 11 nights, you can reserve a whisper guide. So what that means is you can say, “Okay, I’m getting off at X subway station. Could you please meet me there?”

You have a guide who has some training with being a sighted guide. They’ll walk you through, they’ll describe stuff for you, they’ll answer your questions. You want to get some street food, you want to buy some art, you want to know what the musicians look like. So this is the level of access that Luminato has stepped up for.

And I think all of their offerings this year have audio descriptions, so I’ll be attending several audio-described performances like, live-described. And for sure, I’m going to reserve myself a whisper guide for one of the nights at the Luminato in the Square as well.

Jonathan: That Whisper Guide concept is amazing because even for very well-seasoned, competent blind people, those environments are very difficult.

Christine: They are. It’s true.

My partner and I love music, and we have gone to many music festivals. And every time we’re on our own, it’s always, “Okay, how’s it going to work out this time?”

And so we try and find someone to contact, and we always work it out. It always ends up happening, and there’s always people ready to help and be kind. Boy, it’s a lot of problem-solving.

This just takes the whole thing right out. And even the sign-up process, this is so beautiful. They have this concierge service. And so if you want to go to something, there’s an email address. You don’t have to go to the website and spend 15 minutes trying to figure it out because it’s so different from the last ticket-purchasing thing you did.

And it’s not like we can’t do it. Of course, we can. I can get my tickets online.

But there’s this cognitive load, right? It takes a long time. It takes probably longer than a sighted person would take.

So they’ve got this concierge service. So I emailed into EMaxwell@luminato.com, and I said, “Here’s the shows I want to go to. Could you please get me a ticket and reserve me an audio description device? And here’s the date I’d like a whisper guide for.”

And this is what happens when you have people with disabilities at the front end of the process, is we know what the challenges are, we know what the barriers are, we probably have good ideas about how they could be ameliorated. And there’s the festival organization with people who can do that stuff. And so that’s what’s happened at the festival, and it’s really amazing.

Jonathan: That does sound amazing. It significantly lowers the barrier to entry for participation. And that’s really important because these events should not only be available to the most capable, well-adjusted blind people.

Christine: Yes, yes.

And I often say it’s the last 50 meters, right? It’s getting from wherever you get dropped off. Maybe you’re getting a car, or an Uber, or you’re taking transit, or okay, now I’m at the closest subway stop. Now what?

And like we said, we can work it out. We’ve got GPS, we’ve got technology, I can do that. But boy, it takes a lot of stress out of my life if I can know that someone’s going to meet me there.

And so not only with Luminato, but when I consult on live shows for audio description, I always try to find a way to mention to the venue, “This is a big problem.”

And honestly, you probably feel the same. When I find out about a cultural event, my first question is, am I interested? Second question is, how much of a pain in my behind is it going to be to get from my house to my seat?

Jonathan: [laughs]

Christine: And it’s usually, I can plan a trip on the transit, or I can choose to take an Uber if I can afford that. But it’s that last 50 meters.

So more and more, I’m advocating for theaters or production companies to be thinking about that part, because that’s the really hard part that technology can help with, but ooh! Is it ever easier to just have someone who can help you through the last 10 minutes of getting into your seat?

Jonathan: For sure.

Have you always been blind?

Christine: I have, yeah.

Jonathan: Right.

I ask because I want to come back to the visual arts. I wonder how easy it is for someone who’s been blind since birth, like you and I have, to appreciate things like classic works. You know, when people talk about the Impressionist era, things like the Mona Lisa, really important, impactful works of art, they just don’t resonate with me at all. But I’m sure that others perceive it differently.

Christine: Yeah.

So when I was referring to the interview with the dancer that I did, he worked really hard at it. He gave me some good description. But in the end, he said, “But I’m a dancer. If I could put it into words, I’d be a poet.”

And I just laughed and I said, “That’s what I was afraid of, right? That some arts just are maybe not translatable.”

And I feel exactly the way you do about visual art, especially. Like I try, I’m interested. I listen to descriptions, and I try to retain it. But I don’t remember details of the Mona Lisa. I’m sure I’ve had it described, but it’s just, it’s not a form that’s going to resonate for me.

And so as a, I like to think I’m a bit of a renaissance woman, so I want to know as much as I can about arts and culture. But am I ever really going to care about art? Is modern dance really going to speak to me, or evoke an emotion? Not so far. Not so far.

Jonathan: Photography, on the other hand, is interesting. I mean, when I was a lot younger, I never thought that blind people would be taking great photos and sharing this. And I still find some discomfort about sharing content that I can’t personally evaluate. That’s why I’ve never jumped onto YouTube in a big way, or taken a lot of photos that I’ve posted on social media, because I’m such a perfectionist that I worry that people are saying oh, yeah, this is okay for a blind guy. You’ve taken a good photo.

Christine: [laughs]

Jonathan: But I don’t want that, you know. I want a really good photo, and I want to know for myself that it is.

Christine: Yeah.

Jonathan: But a lot of people are really getting into photography now. That’s fantastic.

Christine: I believe it’s all because of AI. So there would always have been some experimental people as blind people who would like, here, I’m going to take a photo. And some sighted people go oh, that’s so cool. How interesting. And that was a bit of a, you know, a little bit of a side interest thing.

But with AI description now, … I posted on Mastodon recently, I’ve become this person who takes photos of their flowering plant because I can, right? I can take a photo of my succulent plant that’s flowering because I think it’s really cool, I can post it, I can put some hashtags on it, and I can use AI to know that I’ve got the plant in the foreground.

Jonathan: Yes.

Christine: So I don’t think it’s so much that we’re gonna love photography, it’s that we can now participate in the public sphere in a way that we couldn’t before because especially since smartphones, photos are everywhere. They’re the common language that people use on social media. So what I feel about photography is that it’s not that I’m ever going to love it for its own sake, but I love that it helps me to participate in the common culture.

And it’s funny that you say that. I never felt self-conscious about posting photos. But where I feel weird, I do not like being photographed.

Jonathan: Ah.

Christine: I do not like it. I want someone to check, ask my permission first. And I really believe it’s because it’s a language I can’t speak. I can’t evaluate my own photo of myself, and I can’t compare it to anyone. And I’m sure you know people who, as soon as someone snaps a photo, they’re like, okay, let me see it. And they want to evaluate, how do I look? Do I look okay? And it’s not that I’m vain, but I feel that it’s a language I can’t understand, so I don’t want to speak in it, if I can use that metaphor.

Yeah. I don’t like being photographed. I don’t like being on video, but I’ll snap a picture and put it on social media. But that’s funny that you and I come to have different discomforts around the same topic.

Jonathan: Yes, it is.

Christine: Yeah.

Jonathan: And I will do that, too. Or I might take a picture if we’re on holiday somewhere to share with the kids, or different things like that. And I’m very happy to do that. It’s more the sort of public side, but yeah, I get into it.

And I agree with you. AI is providing some of that verification now.

I got a new device recently, and I wanted to take a picture of it for my daughter who helped me to locate this device and recommended it. And I was able to take a picture, verify with AI that I’d got the device in the picture, and sent it off without any kind of comment or anything and got the appropriate “OMG! It arrived.” response. So that’s kind of satisfying.

But on the other hand, we went to Europe a couple of years ago for a pretty cool time, and I just did a lot of audio recording because that’s memorable to me. And every so often we’ll go through those audio recordings of, you know, a boat ride we took on the River Thames, or the Abba Museum or whatever. And we have those. And that resonates a lot more.

Christine: Yeah. I’ve done that too on trips. I’ve done some audio recording. I have a pair of binaural mics, so you get fabulous immersive audio. And we’ve got stuff from Ireland and from a cruise we were on, and that’s really sweet stuff to have.

Jonathan: I wanted to ask about career paths. There will be a few people who were saying gosh! I wish I had the job that Christine does. Audio description is such an important component of people’s lives these days. Is it easy to break into this industry?

Christine: I would say not. I would say not. I was so lucky. I feel like I was in the right place at the right time and happened to meet the right people.

Patience and persistence are, you know, invaluable things. And the older I get, the more I realize. The more I realize I have them, which I didn’t think I did, but apparently I do. But I realize how important they are. And so I just kept saying yes to a lot of things that maybe I was, you know, I was like iffy, like, am I going to get paid for that? Or is it worth doing without pay? Because I’ll meet some interesting people and learn some stuff. So I don’t know that it’s an easy field to break into, but it’s obviously doable.

And there’s a movement, which is good for audio description production to include blind people wherever it can. So obviously, narration is something we can do, audio editing, something we can do, consulting, for sure. Even Robert Kingett, he’s a writer, and he writes audio description. And I’ve heard other blind or partially sighted people say they’re involved in the writing process. So it’s a collaborative process with a sighted person. And so there’s definitely way more scope for us to be involved.

And I’m a huge advocate of that because media companies, they have a lot of money, and we should get some of that. [laughs] I want more money, too. And I want members of my community to have more money. And if the media is spending money, especially on stuff for our community, we should have a piece of that as well.

Jonathan: Do you ever come across a piece of text that you’re asked to read in your audio description work where you think, I’m not really sure this phraseology is right, or I wonder if this truly is an accurate description of what’s going on? Because I guess you’re not able to verify that visually for yourself. So you ever go back and question what you’ve been given?

Christine: Often, and it’s difficult when you don’t have the show, or the script, or any audio at all. [laughs]

So the first episode I got of The Magic Prank Show, I didn’t know anything about the show. I just get a script. One of the first description lines is, “She opens the freezer, and sees her husband’s severed head.”

Jonathan: [laughs]

Christine: I’m like, okay, I need some context here. How do I deliver that? [laughs]

So when you get a script just out of nowhere, it’s often difficult to know what inflection to use, because maybe they’ve been talking about the red car the whole time. And then, there’s a description line that says, “And a blue car drives up.” And if I knew the show, I should say, “And a blue car drives up.”

Jonathan: Right.

Christine: But I don’t know that. And so my inflection, it’s often guesswork. And I trust the writers to do the right thing. But it’s often an isolating process, just because I don’t know the vibe of the show.

Like I’ve described something called Chad and JT Go Deep. And you could tell it was a pretty fun, fun-loving, goofy show.

But as I was listening to the finished product, I thought I should have jazzed this up more because these guys are really out there.

And so it’s kind of difficult not to have the context when you’re doing the work. But it’s also a fun challenge to, as a storyteller, think, can I read this tiny skeleton and understand what the story is trying to be, and what the tone of the story is trying to be? Sometimes, I get it right. Sometimes, I don’t. But it’s a challenge that’s kind of fun.

Jonathan: I spoke with Joel Snyder early on, and we were talking about how there were standards emerging relating to audio description because it’s not so much the technical quality I’m thinking about, it’s more what do you describe and how do you describe it in a way that’s impartial and effective.

How is work going in terms of guidelines for the industry as a whole relating to how you describe?

Christine: There are guidelines, but they are guidelines. And each production house kind of makes their own policies on it.

And so, as part of Radio Lumi this year, we had a roundtable discussion about audio description and the current state of it. And one of the things we talked about was, do you describe race, skin tone, body size, and ethnicity? And some organizations don’t want you to touch it. Some organizations really want you to use the appropriate language, which is continuously evolving. And so those things are really hot topics. So every organization kind of makes its own choices around it.

But it’s true that sometimes, if you choose not to describe skin tone because you think you’re being more equitable, one thing that happens is marginalized groups don’t know that they’re being represented on film. And maybe later in the story, skin color becomes incredibly relevant, but the audio description user doesn’t know the skin tone of the character.

Yeah. So I don’t know if it’s possible to have, you know, regulations that apply to everybody. I think organizations are just going to continually sort of evolve and make their own choices around those issues.

Jonathan: As a consumer, my immediate response is look, if that information is visually available to a sighted viewer, I should not be deprived of it.

Christine: Yes, I agree. And then, it also might come down to the actor is of Asian descent. It’s irrelevant to the character. There’s a lot going on in the plot. I don’t have time to describe all the actors. And so it becomes a process of what information can I weed out because I don’t have enough space? So describing the actors on screen may become a lower priority.

And it’s not necessarily that someone says I don’t want to talk about skin tone. It just becomes a matter that that information is less important to the player.

But if you’re someone of Asian descent, or you’re someone who’s African-Canadian, or of a different skin tone, you want to know that you’re represented on screen or on stage.

So it’s a tough one, and I have no firm opinions about how it should be done. And I’m completely in sympathy with describers who have to make hard choices about it.

Jonathan: We’re also seeing automated audio description, and there’s been a lot of debate about this. And I guess it’s one thing if you’re hearing eloquence describing something. I mean, that’s a bit of a giveaway right there. [laughs]

But these days, with technology like ElevenLabs and various high-quality voices that you can essentially license and throw anything at, do you see that as a danger? Or is it an inevitable and positive thing that more content might be available to us that way?

Christine: It’s inevitable, for sure. I guess it’s positive. But of course, I don’t want to be put out of work, either.

I’ve checked out many things on TV where it’s a synthetic voice. Do I like it? Not really. But you know, I want more content.

I worry that it becomes an excuse for media production companies to pay less, or to not pay humans. And they have enough money. They can afford to do it right, most of them.

The distinction is interesting too between, is it the narrator in synthetic audio, or is it being described by AI?

And so one of the things I’m watching with intense interest is the evolution of AI describing images with respect to the following, which is at the beginning, and I still find this, AI is dying to tell me a story. It wants to use words like cozy, and it wants to have a lot of editorial control over it. It’s dying to tell me a story. I can just feel that that’s what it wants to do. It’s not very good at it yet, but it’s moving pretty fast.

Where this fits in with audio description is, when will an AI be able to describe a movie? Is that ever going to be possible?

I don’t think that’s an impossible idea. I think that if an AI had a script and some guidelines about plot, I think it’s a doable thing. Is it a desirable thing? Yeah, it’s the inevitable, like, we could get more content, but it’s going to be of more questionable quality.

I hope it’s always going to be vetted because as we know, AI makes mistakes, sometimes egregious ones. And so if, when we get to the point where video is being described by AI, I really hope that process is vetted.

But you know, the synthetic voices, they’re getting more and more chillingly realistic all the time.

Jonathan: Yeah.

Christine: So sometimes, I can’t even tell. So I’m not as fussed about that part, except that, of course, I want to keep earning a living. So I want to keep having humans do the narration. [laughs]

Jonathan: The voice artist for Living Blindfully who does the sweeps and that kind of thing, I had a number of people say to me, I’m glad you used him, you know. They know this is Mike Moran, and he’s pretty well known in the blind community.

And when I say to them, you realize that he’s AI, right? We’ve licensed his voice with his permission. And it means that I can just type what I need him to say when we need new trailers done, that kind of thing. And he says them, and he’s given me permission to do that.

But most people, maybe I’ve had one or two who say they can tell. Most people say, I had no idea that was AI.

Christine: Oh, definitely. The blind community is your toughest audience, to be honest.

Jonathan: Yeah. [laughs]

Christine: We’re connoisseurs of synthetic speech. [laughs] So if anyone’s going to catch it, it would be your listeners, for sure.

Jonathan: Technical quality of audio description really pushes my buttons. You often have audio description really badly ducking the audio, or it is so difficult to hear the audio description that it’s next to useless, or you’ve got audio description that’s way louder than the original audio. So basically, level fluctuations.

And the other thing that really frustrates me as somebody who has, sadly right now, invested in a number of Sonos devices, is we have Dolby Atmos…

Christine: Oh, snarl snarl!

Jonathan: [laughs]

Christine: I know, I know this is a thing. Jason, my partner, is very technical. He’s been talking about this, too.

Jonathan: Yeah.

Christine: Lots of growling.

Jonathan: Yeah, I mean, we have Dolby Atmos. And with the exception of Apple TV+, when we switch audio description on on Disney Plus or Netflix, usually what happens is it goes down to stereo at worst, maybe 5.1 if we’re lucky, but that’s unfortunate. There’s no need for that.

Christine: It’s so true. There is no need for it. And it’s just a case of like, they’ve checked the boxes, right? They’ve got audio description.

Jonathan: Yeah.

Christine: That’s what their priority was. And so the more blind people we get into roles of production, the better that’s going to be.

Jonathan: Exactly. Because you obviously know people that I don’t, you’re in the industry. And so to have blind people like you doing this and just being able to have a word with people on the inside, it makes a world of difference. I think a lot of people don’t fully appreciate the difference that blind people who are just working in various industries can make, and they don’t get a lot of applause for it. And they can’t tell you they’re doing it, but they’re making that quiet difference from the inside.

Christine: I was asked to do a little promotional video about why it’s good to hire people with disabilities. And the main thing I said is, if you’ve got someone in front of you who has a disability, you better believe that they’re an amazing problem-solver. They’re incredibly resourceful, because they’re sitting in front of you. How’d they get there? You have no idea how they got there, because you don’t live their life. But they have 100 problems to solve just to get to this interview.

And so I’m huge into that, that our resourcefulness as a community is underappreciated, undervalued, and there’s tons of ways that we could be involved in the audio description process.

Jonathan: I have enjoyed this so much. And as we wrap, can you just give me some links, particularly since Radio Lumi is the main reason why we’ve got you on the show now, but how can people continue to catch up with your podcasting and other work?

Christine: Yeah. So on your podcast feed, if you look for Radio Lumi, (L-U-M-I). Also, Talk Description to Me. We’re not producing episodes all that often because of funding issues, but there’s still a trickle coming out, and the archive is still there. So Talk Description to Me and Radio Lumi are in your podcast feed.

If you’re curious about more of the accessibility for Luminato, you can go to LuminatoFestival.com, and there’s an accessibility hub there. All of those things have archives and stuff to check out as well.

Jonathan: And when does the festival begin this year?

Christine: It begins June 5th. We’ve been releasing episodes steadily, though. So there’s some amazing content up already, with lots more to come. So go to the podcast page. If you’re interested in any way in arts and culture, and you’re in the blind community, you’re going to find something fun.

Jonathan: I’ve enjoyed this so much.

Thank you very much for doing it, and we look forward to staying in touch.

Christine: Thank you so much.

[music]

Advertisement: Transcripts of Living Blindfully are brought to you by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at PneumaSolutions.com. That’s P-N-E-U-M-A solutions.com.

Thoughts on Glide

Caller: Hi, Jonathan. This is Walt Smith in Seminole, Florida.

I really enjoyed the article on the Glide technology, and I was hoping that perhaps at some future date, as the technology is expanded, he might look into making it available for those of us that are required to use a walker. A blind person with a walker is almost an accident waiting to happen, and it would really be very useful if that technology could be applied in that specific situation.

A walker is very easy to get off-center, and the glide technology sounded like it would be something that would keep a person centered. It would have to be dramatically different in terms of the way it would work. You wouldn’t be able to hold on to a harness like you would with a guide dog. It would have to be built into the walker itself. But it is an area that I would like to see at least, considered.

Jonathan: Thanks, Walt! It’s a very important point. And I’m sure Amos will be aware of this because he’s well aware that most people develop a vision impairment later in life. The vast majority of blind people are over the age of 80, in fact.

Google Podcasts Closing Down

Voice message: Hello, Jonathan and all the listeners! Edvard here, and I’m here with not such good news for the Google Podcasts users.

I got an email saying that Google Podcasts will be going away soon. And after the 23rd of June 2024, you will no longer be able to listen to podcasts. And after the 29th of July 2024, you will no longer be able to migrate your subscriptions.

So just a bit of warning before those dates. For anyone that is using Google Podcasts, I would strongly advise you to migrate your subscriptions someplace else. And whenever you do that, best of luck.

I never used Google Podcasts, and it’s probably actually for the best that I never did. But just a little bit of warning for any listeners out there who are using this, do it before those dates.

Jonathan: This happens a lot in Google land, doesn’t it? You can use a service that you really depend on, you’ve become familiar with, it’s become part of your daily routine. And all of a sudden, Google giveth, and just as quickly, Google taketh away.

Google Reader is probably the most shining example that I can think of, of a service that I used all the time. And then, they just decided, oh, we’re not going to do this anymore.

But Google does this an awful lot. And it’s one of the things that makes me nervous, because there can be some sort of compelling thing that they’re offering. And then all of a sudden, they’re not. The world is littered with Google-abandoned projects.

Thanks for the heads up. That will be very useful to anyone who hasn’t heard about this yet.

Closing and Contact Info

It’s just like George Harrison says, right? All things must pass. And so must this episode of Living Blindfully. Really appreciate all your very interesting contributions this week.

Loved talking to Christine. It was great to hear from a blind audio describer, wasn’t it?

We’ll be back next week with more.

Remember that when you’re out there with your guide dog, you’ve harnessed success. And with your cane, you’re able.

[music]

Voiceover: If you’ve enjoyed this episode of Living Blindfully, please tell your friends and give us a 5 star review. That helps a lot.

If you’d like to submit a comment for possible inclusion in future episodes, be in touch via email. Write it down, or send an audio attachment: opinion@LivingBlindfully.com. Or phone us. The number in the United States is 864-60-Mosen. That’s 864-606-6736.

[music]