Transcripts of Living Blindfully are made possible by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at

You can read the full transcript below, download the transcript in Microsoft Word format, or download the transcript as an accessible PDF file.



Welcome to 282

A Lot Is About to Change in Our Lives, Thanks to ChatGPT 4O

Sonos Comments on Recent Accessibility Regressions

Listener Comments on Sonos

Essential Series Button Layout Descriptions Available

The Importance of Blindness Agencies Leading by Example When It Comes to Accessibility

New Hearing Aids

Improving the Quality of iPhone Calls in Challenging Environments

Judy Dixon, President of the International Council on English Braille, Discusses Their Forthcoming General Assembly and Braille Today

Interview with Yalim Gerger, Creator of

A Demonstration of

Closing and Contact Info




Welcome to 282


Voiceover: From Wellington, New Zealand, to the world, it’s Living Blindfully – living your best life with blindness or low vision. Here is your host, Jonathan Mosen.

Get ready. AI is about to change our lives again, more on Sonos and accessibility, Judy Dixon talks about the General Assembly of the International Council on English Braille, and turn all those articles you want to read into your own personal podcast with

We must brave this frontier alone because yet again, there is no North American area code 282.

And as we’ve already covered when you get to the 2 8s in the country code, there is nothing. There’s a bit of extra capacity there, in case they have to introduce some more area codes to Africa. So in this instance, it’s me, you, and 282.

But we also have Pneuma Solutions to tell you about, who are, of course, among other things, the people who bring you RIM (Remote Incident Manager).

I was talking with Judy Dixon (You’ll hear Judy later in the show.) about the International Council on English Braille, and we were talking about RIM.

And I said, “Do you like RIM?”

And she said, “I love RIM.” And she said, “And I enjoy the way you talk about RIM, and you clearly are enthusiastic about it.” And I am.

But if Judy points out where you’re falling short, you listen, right? I mean, you listen. And one thing she said to me is, “You never mention the thing that I like about RIM the most.”

I said, “What do you like about RIM the most, Judy?”

And she said, “I like the fact that when you’re connected on a RIM session, if you’re helping someone or someone is helping you, you’ve got a voice channel there.”

And she’s right, you know. I don’t think I’ve really emphasized this when I’ve talked about the fact that Pneuma Solutions sponsor the transcripts of Living Blindfully. So this is a very cool feature. If somebody is assisting you to get a task done and you need to be able to communicate with them, or they need to impart something to you (ask you questions about what you want filled in on an inaccessible website, or with a particularly bad app or whatever), you’ve got a voice channel right there built into RIM. You don’t need to use any other tool, so it really is a complete accessible solution for remote access.

If you want to find out more about RIM and take it for a spin, you won’t regret it. I don’t know where I’d be without this product. It just is something I integrate into my life daily. You can go to the Pneuma Solutions website to find out more. There you go, Judy.


A Lot Is About to Change in Our Lives, Thanks to ChatGPT 4O

I have something incredibly consequential to start us off with this week.

You might remember that last year, I spoke with Mike Buckley, who is the chief executive of Be My Eyes, when they were rolling out the Be My AI product. And Living Blindfully was the first podcast to bring you what was then an exclusive demonstrating how Be My AI worked in a whole lot of scenarios.

That was a little over a year ago. And yet, in that short time, Be My AI has become a part of the toolbox for many of us.

We know that AI sometimes hallucinates, but it’s also incredibly useful a lot of the time.

And when I spoke with Mike Buckley back then, I said, “Won’t the game changer be when one day, the AI gets so good and things speed up so much that we’ll have a real-time video feed of what’s going on around us?” And I think we may have speculated about how long it might take to get there. I did not anticipate that we would be talking seriously about this a little over a year after we had that conversation.

The pace of AI is rapid right now. Some people find that a bit frightening. We’ve obviously got to watch that, but it’s also incredibly empowering.

On the 13th of May, OpenAI, the people who make ChatGPT, unveiled a new version of it. It’s a little bit confusing from a screen reader user’s point of view. It’s called ChatGPT 4O, but it is the letter O for Oscar. And I believe in this instance, the O stands for Omni.

Now, if you’ve not caught up with this, I’d encourage you to go to YouTube and search ChatGPT (which is all one word), 4O (just the number 4, and then the letter O), and see if you can get some of the demos, including the 26-minute presentation where they unveiled this. It is a remarkable leap forward.

It’s multimodal. What that means is that you can talk to it. It understands emotions. It will respond in more human-like ways. You can hold things up to it and say, “What am I looking at?” You can give it documents to analyze.

There will be apps that essentially read your screen, and that is going to change the nature of screen reading, because this will all be available through an application programming interface. And what that means for the non-geekily inclined is that third-party apps will be able to take advantage of all of this technology as well. So you will see screen reader manufacturers integrating this in some interesting ways. Of that, I have no doubt. Because if the demos are to be believed, another big feature of this innovation with ChatGPT is that it is fast. It is very fast. And if it is as fast as the demos indicate, and there’s not some sort of naughty Wizard of Oz type thing going on behind the scenes, it means that you’d be able to open a spreadsheet, or a PowerPoint presentation, or whatever, and ask your screen reader to give you an overview of it.

We’ve had a taste of this with the brilliant integration that Vispero has done with JAWS and PictureSmart, but it takes a while. You have to actively invoke it, wait for the image to go and be processed, and come back and you get the text description. At the risk of lapsing into jargon, there is a bit of friction there.

But with the speed, the lack of delay in this thing, you could have a screen reader monitoring in real time what is going on. Because JAWS is a commercial product, they have the option, I would imagine, to license this and build it into their costs.

You may well see it in NVDA first, and it may be that you will require an API key that you might have to pay for. So there’ll be probably a little bit of geekdom, I would imagine, involved in getting it going with NVDA. But if you don’t mind that geekdom, or you have someone who can help set it up, or there’s really good step-by-step instructions to set it up, this is going to be a game-changer for screen reading. We can get out of this linear way of looking at the screen and query the screen for even more efficient access.

That’s exciting in itself. But let me come back to Be My AI, because what I want to play for you now is a video that’s not on YouTube, it’s on Vimeo.

I’ve got to tell you, this is one of those flashbulb moments. All the way back in episode 2 of this podcast, we talked about flashbulb moments. [laughs] These are big things where you will always remember where you were and what you were doing when they happened. People who are old enough to remember it put the Kennedy assassination right up there. The assassination of John Lennon, for many, is similar. The Challenger disaster.

They’re not all necessarily bad things, of course. You might remember, for example, in great detail, the first conversation you ever had with the love of your life, all kinds of things like that, but they’re consequential.

So I will always remember that I was in a hotel in a city in New Zealand called Dunedin when I first came across this video.

I’ve been involved in the development of a bit of technology at a product management level. I’ve used a lot of technology. It’s not that I’m cynical, but I’m, I guess, hard to truly impress, I think is how I would describe it, because I see things evolving.

But every so often, you see something come along and you think, “This is such a sea change. This changes everything.” And I think the original Be My AI was like that.

This is just a completely new level. I had shivers down my spine when I first saw this video, but I was also skeptical.

So I dropped an email to Mike Buckley, the Be My Eyes CEO, and I said to him, “Mike, we talked about this only a year ago. Is this a proof of concept, or is this thing actually real right now?”

Mike says it’s real right now. This is coming, folks. This is coming. And it’s soon. I mean, this is not some sort of futuristic thing you can expect a year or 2 from now. I think we’re months away from something being available.

I’m going to play you this and explain what it is that you’re hearing, because one of the challenges we now have is that it’s getting increasingly difficult to tell what’s text to speech and what’s a human.

The female voice you hear on this is ChatGPT. This individual is not talking to a human. This is using ChatGPT 4O. It’s Be My AI, and I’m going to play it to you now.


Male voice: Is the king in residence at the moment?

Be My AI: Yes, it looks like the king is in residence right now. The royal standard flag is flying above Buckingham Palace, which is the signal that the monarch is present. How exciting to be there while the king is home!

Male voice: Try and tell me exactly what they’re doing right now, please.

Be My AI: Um, right now, the ducks are gently gliding across the water. They’re moving in a fairly relaxed manner, not in a hurry. Occasionally, one of them will dip its head under the water, probably looking for food, and then pop back up.

Male voice: I even know when a taxi is coming with its orange light on. I think I’ll hail it to get home.

Be My AI: Yes, I spotted one just now. It’s heading your way on the left side of the road. Get ready to wave it down.

Great job hailing that taxi. It looks like you’re all set to go.

That’s a good dog right there, leading the way into the taxi. Safe travels!

Jonathan: Alright. So we’re going to have to kick the tires on this. We’re going to have to see what it’s like in the real world.

And of course, AI hallucinates. We know that. So I have no doubt that those who are going to have access to the API, to use in the sorts of situations where you’re out and about doing things like hailing taxis, are going to have to consider the ethical and moral questions around this. Because if it gets something wrong, there could be serious consequences.

But we’re very close now to something being available to you and me that will be a remarkable travel enhancer. The weak link, I think, especially for iPhone users, is that Apple has the camera app very locked down, so there aren’t wearables like glasses and things that can connect straight to the iPhone and just see what you see.

You can do chest harnesses and things like that, and people have done that for Aira. And if you want to strap your phone to yourself in that way, I guess that’s one way of being hands-free, because we’re already without one hand, which is holding our guide dog harness or white cane, right? So we don’t want to have the other hand holding the phone out in front of us for too long.

Hopefully, Apple might, at WWDC, come up with some sort of API that allows some wearable devices to use the camera technology, the infrastructure that’s built into the iPhone. Otherwise, that is another very attractive feature of Android where these things just aren’t so locked down.

You can find out more about all of the things that ChatGPT 4O can do if you, as I say, search on YouTube. There’s a lot of very impressive video material out there demonstrating all kinds of use cases for this, and many of them are going to be quite beneficial for this community. We will obviously be right on top of this as it evolves over the next few months.

Sonos Comments on Recent Accessibility Regressions

Let’s return to Sonos, which has been a hot topic over the last couple of weeks after they seriously broke accessibility through a complete rewrite of their app.

Earlier in the week, I participated in an Ask Me Anything (AMA) session that was on the Sonos community. It was a bit chaotic, but I logged in right at the beginning and got my question in, in the hope that it would be addressed.

I’ve put a link to the summary of the AMA – the questions that were answered – in the Sonos blog post that I am maintaining. So if you’d like to check that out and also keep up with what’s happening at Sonos, you might want to bookmark that page. If you’re interested in all that’s going on at Sonos at the moment, it’s an informative read.

I’ll read you the question in full that I posted to that AMA.


“Thank you for hosting this AMA today.

As a blind person who owns 15 Sonos devices and has respected the company for its commitment to accessibility over the years, I am appalled by the way it has shown such disregard for accessibility.

I, and many other blind people, reached out to Sonos ahead of the app’s launch and received a response that basic accessibility was in the initial release.

Fortunately, a blind tester saved access to many of our Sonos systems when he blew the whistle and spread the word that the app was not accessible at all and that it was impossible to perform essential and basic functions.

Sonos misled us either deliberately, or because they didn’t have blind people advising them on accessibility at critical stages. It is Accessibility 101 that accessibility is a non-negotiable part of an app’s initial spec, and that you build it in as a foundational component of any new app.

Sonos now claims that some of the most serious defects will be corrected in the 21st of May release. But hopefully, the panel can understand that there are a lot of blind people who can’t trust Sonos anymore.

Given that Sonos got it so horribly wrong with this current release, why should we expect anything better in the next? Will Sonos offer an apology to its blind users and accept that it got this wrong? And will Sonos commit to creating a chief accessibility officer as a tangible commitment to ensuring this never happens again?”

Now Diane Roberts, who is the senior director of software development at Sonos responded to this, and this is what she said.

“Thank you for your heartfelt feedback.

We invested our user experience and engineering energy on supporting VoiceOver throughout this project. Unfortunately, near the end, we took our eye off the ball and missed a couple of key bugs.

Those bug fixes have been shipped in a release today.”

And by today, she’s meaning the 14th of May, US time.

She continues:

“That doesn’t mean we’re done. We have more that we want to do, and will do, to fine-tune the experience. This is the same kind of fine-tuning we are doing for the visual experience. In a visual UI, that means adjusting the gutter between items on screen. In a spoken UI, that means adding more hints about how to navigate. We look forward to tweaking those and making the experience get continually better.

I understand that we have to rebuild your trust. We will only be able to do that by improving the experience. Any words we say will be incomplete. I am sorry that we missed this.

Our next step involves building a hearty beta community of vision-impaired users. Today, we have 30 visually impaired users on the beta of the next version of the app. The next version already has several improvements beyond the bug fixes we shipped today.”

That’s Diane Roberts’ response, and I appreciate her taking the time to respond.

I have a couple of reactions to this.

The first is that I think she’s grossly understating the mess that this app is in. Now, I do acknowledge that they have released an update that allows you to explore by touch. It does allow some functionality to happen, but it’s kind of an unstructured mess.

However, we have made a bit of progress, and they say that the first really substantial progress that we will see is on the 21st. So we may be making more still by then.

Thanks to the efforts that everybody made to contact Sonos and make their displeasure heard, I think that what’s happened now is that we’ve gone beyond the first-line tech support stuff and the PR spinning people, and we do have the ear of engineers. That wasn’t guaranteed, so people can feel proud of the letters that they wrote and the constructive feedback that they gave about the hurt that they feel and the frustration that they feel.

I am optimistic that better things are ahead. It is great that they’re building a team of blind beta testers. If there are a bunch of capable Sonos users who are beating up the app, then I’m very happy about that. And I’m sure that they will work some magic if Sonos is willing to listen.

But that should have been done from the beginning. And I think that’s the theme that’s running through a lot of the criticism from sighted people too that Sonos is receiving. What on earth was the hurry?

The thing that both heartens me and puzzles me is that we learned as part of that AMA that the iOS version of the Sonos app is written in Swift, Apple’s own preferred programming language. It heartens me because it means that good accessibility is possible if Sonos is willing to listen and allocate the resources to fixing this. So it’s not, for the most part, some sort of clunky cross-platform framework that they’re developing in. And that means that if we keep putting the pressure on and keep offering constructive feedback, they should be able to make this a great experience. That needs to happen sooner rather than later. So that was heartening.

But it’s also perplexing because you have to really go off the beaten track to create an app with Swift that is this bad from an accessibility point of view. Apple makes it pretty tough to release an app like this that’s so inaccessible. I don’t know what they’re doing. I don’t understand why they would have departed so much from standard practice.

But the good news is it is actually technically fixable. We just now have to make sure that they do those fixes.

I’m not as familiar with what’s going on in Android, but they do seem to be developing natively for some sort of platform on Android, so hopefully we can see the same kind of improvements happening to that as well.

I just really don’t understand what the rush was. Obviously, they’ve got their new Sonos Ace headphones coming out next month. But that’s next month.

If they’d held it back a few more weeks, they could have got the alarms working. Apparently, the issue with the alarms was only discovered on the morning of the release. They found a whole bunch of bugs, and they decided, oh well, we’ll ship it anyway, and just disable the alarm function altogether. Extraordinary, really. Extraordinary.

But hopefully, there’s light at the end of the tunnel, in that there is nothing inherent in the architecture of this app, as I understand it, that would prevent us from getting good quality accessibility back.

Our mission, should we choose to accept it, is to constructively ensure that we don’t go away, and that we don’t settle for second best, and that we get the accessibility we deserve as paying customers.

Listener Comments on Sonos

Jesper Holton is writing in from Denmark and says:

“I have just been listening to your latest installment of Living Blindfully, which, alas, brings very worrisome news in regards to Sonos, and which has prompted me to write to the Sonos Accessibility Address. Maybe they will answer where their CEO has failed you so far.

As mentioned in my email, I hope we can bring the EAA” (that’s the European Accessibility Act) “and compliance to consumer electronics to bear in this case. It could be argued whether Sonos’ current line-up falls under the Act, as it is not a set-top box, nor a TV. Still, it could be worth a try. The EAA is unfortunately far from as comprehensive as we could have wished for, but it will have some merit to level the field of accessibility to consumer electronics in the years to come.”

Jesper, if you got a reply from the Sonos accessibility address, that would be interesting because what people are telling me is that they have not received any reply, and they wonder whether it is monitored or not.

Caller: Hey, Jonathan! It’s Dennis Long. I’m calling about the Sonos topic.

Sadly, Sonos is like a lot of companies today.

There are very few companies (I can almost count them on one hand) that bake accessibility in as one of their core principles, Apple being chief among them.

You’re seeing this more and more and more today, where companies don’t give a crap, and we’ll get around to accessibility when we get around to it. And Sonos is no different. I will never buy any of their products.

The sad thing is, before this broke, I had called Sonos in the last couple of weeks and gotten information on speakers for my computer. I was going to buy a set. So they have now not only lost my business permanently (because why am I going to go with a brand that I can’t trust, when they’re going to release a new app that’s going to render it inaccessible?), but they clearly don’t care. They don’t get it. They don’t understand what they’ve done. The fact that the CEO doesn’t have the decency to respond to you, or have somebody from within senior leadership respond, no, I’m not going to endorse that product.

We need to take a stand as a community. It worked with Voice Dream Reader, and maybe it’ll work here. But we need to take a stand.

Jonathan: Thanks, Dennis!

And just to be clear for the audience, that message was sent before I mentioned in the previous episode that I’d had some dialogue with a senior leader at Sonos. I think what this shows us is that accessibility is very hard won and incredibly precarious.

Constant vigilance. Which Harry Potter character said that? I forget now. [laughs] Constant vigilance. Accessibility can break at any moment.

And in all fairness, Dennis, I can recall you calling into this podcast, or perhaps posting somewhere that I read where you were very frustrated by something that Apple had done at one point, and you threatened to go to Android because Apple, in your view, had lost the accessibility plot. So nobody’s perfect. Apple has a good track record, but it, by no means, has a perfect track record.

And if anyone wants my views on that, you can read or listen to the address I gave to the National Federation of the Blind last year.

We can reel off the examples. The one that frustrates me the most at the moment is that for years, the audio destination rotor option has not been reliably available by rotating. It’s available sometimes by fluke, but it’s not available consistently where it should be. This is a longstanding bug, and it has a significant impact on hearing aid users.

There will be other bugs that annoy other people just as much as that one annoys me. And Apple just ignores us.

And people will have different approaches. My personal approach to this is to keep calm, but be firm where firmness is required.

But if somebody, even after doing something like Sonos has done, is willing to get around the table and talk, blind people are going to gain nothing if we are belligerent, if we’re rude, if we yell. If someone’s willing to sit around a table and have a discussion genuinely about the issues that are affecting us and how we get out of whatever mire it is that we’re in, then we should have that talk. As Teddy Roosevelt said, “Speak softly and carry a big stick.”

If someone is not in the Sonos ecosystem, I completely understand why you would be reticent about getting into it after an incident like this. I would be.

I should say though, I don’t think Sonos is a good choice for computer speakers. Even if this issue hadn’t occurred, I wouldn’t recommend Sonos for computer speakers. It’s a different use case entirely, and there are much better options. Because with Sonos, you’re always going to get a little bit of latency. You’d be far better with other options that connect, say via USB or even a 3.5 jack, and will serve you better for the computer speaker use case.

But if you were looking for a multi-room home system, and you’re blind, and you’re looking at the situation with Sonos right now, you’d have to think, man, I might want to steer clear of these guys.

Now, I don’t know about the state of accessibility for competing products. In this consumer category, Bose is a competitor, and I don’t know how their apps are doing these days.

If you have a Bose soundbar or one of the speakers that competes with Sonos and you can tell us how that’s going from an accessibility point of view, I would be interested. Because when I got into Sonos about 10 years ago now, the Bose product was decidedly inferior from an accessibility point of view. And I know Bose has lifted its game in some areas since then, largely thanks to Aira, actually, because we were promoting, I say we, I was working for Aira at the time. We were promoting the Bose frames and the Bose sound collar product, which is pretty cool. So you could talk to an agent hands-free. And Bose really did lift their game with accessibility to some degree. So it may be that Bose is doing a little better now.

But the thing is, many blind people have invested a serious amount of money in Sonos. Their whole houses are kitted out with Sonos gear all over the place, like is the case for us here. I like Sonos when it’s accessible. It works pretty well for me. It’s reliable. I like the audio. It meets my needs. So I want to see this work.

I don’t particularly want to have to sell my Sonos products and start again with a competing product. Obviously, if I have to, I will.

It’s right that we express outrage at where we are, but we are where we are. Gosh! That’s a terrible sentence, [laughs] but it’s true.

Now, we need to get to a better place as soon as possible by being pragmatic and being willing to talk if they’re willing to listen.

Now, I am told that they have reached out to a number of capable blind Sonos users and said, hey, you want to check out the app that we’re building now, the new version of this app that’s supposed to be better from an accessibility point of view and let us know how it’s going? I don’t know whether people are on that beta track yet, but there has been some reaching out. So if they’re making amends, then I guess all we can hope is that they’ve learned their lesson.

But it’s right that people who aren’t in this ecosystem yet are wary, and that trust is going to take a long time to rebuild.

Voice message: Hello, Jonathan! It’s Darry Lawler here from Dublin. Hope you’re keeping well.

Really enjoying your podcast. Just want to chime in with Sonos.

I’m hoping that it will be fixed by the end of June. It’s a real disappointment. I don’t have that many. I have a Play 5 and an Era 100. And connected to the TV, I have the Sonos Ray with the subwoofer, but only use that for the TV, etc.

Quick question now. You’ve done great reviews on the Zoom H1 Essential, H4, and H6 Essential. I must say they’re great reviews. I enjoyed all of them.

I’m currently recording on the Zoom H1 Essential, and I have a clippy microphone. Not connected to one, just using the Zoom recorder in my hand.

Can I ask you what do you use? What’s your favorite one? Do you still like the Zoom H6 Essential, or the Zoom F3? Just to get your point of view on this.

Keep up the great work and talk to you all soon.

Oh. And by the way, think of us at episode 353. That’s the Irish area code. Take care now.

Jonathan: Great to hear from you, Darry.

I guess if I had to pick one recorder, it would be the H6 Essential for my particular needs. Because there are times when I go out and I interview people, and I don’t want to be passing a recorder around, or pointing a microphone at people, which I’d have to do with the H1 Essential, unless I got some sort of microphone plugged into the jack. But I just picked up 4 of these Sennheiser microphones (I talked about that on the Bonnie Bulletin) for when I am interviewing maybe not just one person, but a bunch of people. And on that basis, I think the H6 Essential gives me enough scope to do most of the things I want to do.

I do like the F3. It’s a lovely small device, very solid construction, and I think the preamps are a bit better. The F series has better preamps in it. But when you’re just recording spoken word in a good environment with some good mics, it’s kind of academic. I really don’t think there’s going to be that much difference that anybody will notice.

And the accessibility just wins hands down for me. Being able to change anything in the menus, have it talk, that is just a win. I will always use an accessible option over an inaccessible option, unless there was such a vast difference in quality that it didn’t make sense to.

Now back on to the Sonos topic, Stefen Hudson writes:

“I’ve been following your coverage of the unfortunate situation with Sonos, and it’s made me realize that there’s a rather unsettling trend in software development that goes beyond this particular company. It seems to have started when web-based apps became popular. The new Sonos app doesn’t sound web-based since it’s written in Swift, but many native apps are being replaced with what are basically glorified stand-alone browsers these days. I’m not a fan of being forced to use a sluggish, bloated web interface for every single thing, but that’s another topic entirely, and I won’t go into that right now for everyone’s sanity.

I’ve seen a couple of examples of this with Microsoft. They did something similar with Skype, and they’re now doing it with the new Outlook app they want to force everyone to use by the end of 2024.

Not unlike Sonos, both apps were rebuilt from the ground up. Both started in an obviously unfinished state with missing features, and both forced users and screen reader developers to rethink how to make these new apps more accessible.

This is the problem. When companies decide to build a new app from scratch, they effectively discard any and all improvements that were made during the lifetime of the previous app. This includes not only features that people have enjoyed for years, but years of accessibility improvements as well. Sometimes, the result is much worse than what’s being replaced, and it often feels like progress really means taking giant steps backward just to make things more modern.

To Microsoft’s credit, they do have accessibility teams, and they’re in frequent contact with the blind community. I believe that they do try to accommodate us as best they can.

I do wonder, however, if any of these developers, and even those higher up the chain, truly understand how frustrating their apps can be to use compared to what they’re replacing and how much influence the accessibility teams really have over individual apps.

As for Sonos, I wonder how they would feel if they had an incomplete app forced on them like this and could do nothing about it. But I hope things improve. It does seem like they’re doing their best to remediate the situation, but this shouldn’t have happened in the first place.

On a more positive note, I do have some good news regarding the accessibility of Duolingo.

Some may remember when I talked about the increasing degradation of the accessibility of Duolingo’s website. I’m pleased to report that in February, I was able to get in touch with the head of the web research and development team. He assured me that they want to make the site more accessible, admitted that they’ve been failing in that area, and apologized that it took this long to get me connected to the right people. While there haven’t been any changes since then, he did get back to me and told me that he assigned the issues I outlined to the appropriate people.

Other people must be making waves as well, because a Duolingo staff member, who frequents the Duolingo subreddit, posted and asked for feedback about accessibility. I commented on it and reiterated what I said about the accessibility issues with the site, but they’re also asking for feedback about the accessibility of the mobile apps.

So if any of us use Duolingo now or have in the past and want to give feedback on the site and mobile apps, we can head over to the subreddit r/Duolingo” (that’s D-U-O-L-I-N-G-O), “and search for the post titled Staff Post Asking for Feedback on Accessibility.

Time will tell if Duolingo is serious about this. But it’s a step in the right direction, and I sincerely hope something good comes out of it.”

Thanks, Stefen. See, this just goes to show that sometimes, it’s just getting past the frontline and onto someone who has some influence and some nous.

We know that sticking with the current Sonos app is only a temporary solution, not a good long-term one, and here’s an example of why.

Kayde Rieken says:

“Hi, Jonathan,

I hope you’ve been well.”

I’ve been super well. Thank you, Kayde. Hope you have been, too.

“I ordered a Sonos Era100 a couple of weeks ago, before I found out about the inaccessible app situation.

Just getting around to setting it up today. Life happens,” she says, “and it is trying to force me to update to the new app before it will let me set up the new speaker. I don’t know if that means that there’s some kind of software update on the speaker itself that forces you to update to the new version.

But whatever it is, I refuse the update and now have a lovely, shiny, useless Era100 sitting on top of my piano.

Like you, I’ve invested rather a lot of money into the Sonos system, and I hope they fix this mess soon.”

Thank you, Kayde.

She then let me know that she has put the new app on a new phone, and managed to get her Era100 up and running that way. And presumably, now it will just work with the existing Sonos app.

Based on what we’ve heard, Kayde, I am confident that we will see some improvements. Obviously, I regret very much that we didn’t have parity with sighted people on day 1, although sighted people have had a rough experience, too.

But they have published a timetable. They say that we’ll see some substantial improvements on the 21st, so help may well be on the way quite soon with this new app, and that by the end of June, we should have something comparable to what we have in the old Sonos app.

I think then, the question will be how do we get our heads around this new paradigm? Because the way they’re structuring this new app is quite different – everything on one big screen.

So I think good use of the actions rotor will be critical here, and good use of heading navigation will be helpful. But I am quite confident that we will start to see some improvements in the near future.

Advertisement: Living Blindfully is brought to you in part by Turtleback, the original manufacturer of leather cases for notetakers since 2003.

Now back then, I was managing blindness products at Pulse Data International, working with Russell Smith. And we never regretted choosing Turtleback because the name’s synonymous with quality manufacturing, quality service.

There’s a wide range of leather cases available for the products you use every day.

Check them out – that’s all one word – and be sure to use the coupon code LB12. That’s LB for Living Blindfully, and the number 12, and you’ll get 12% off at checkout.

If you prefer, give their friendly team a call at 855-915-0005. That’s 855-915-0005.

That coupon code again, LB12, for 12% off at checkout at

Essential Series Button Layout Descriptions Available

There’s yet more good news from Zoom. That’s the audio recorder people Zoom.

I got an email from Samuel Greene from Zoom today to say that they have published layout descriptions for all 3 of the Essential Series recorders. These documents have been written specifically for blind people so that you can go to their website, grab the one you want for the H1 Essential, H4 Essential, or H6 Essential. And when you get your recorder, this document will describe in a very blind-friendly way how the recorder is laid out. They also go into detail about some of the operations of the Essential Series recorders. So if you don’t want to listen to my lengthy tutorials, this is a quick cheat sheet to get you up and running.

It’s fantastic that Zoom has taken the time to do this, thanks to Samuel Greene, and everybody involved at Zoom for showing that they take these products seriously and their engagement with the blind community.

You can find the one for the recorder that you’re interested in by going to the support and downloads page on the Zoom website for the recorder in question.

The Importance of Blindness Agencies Leading by Example When It Comes to Accessibility

Here’s a message that came through on the weekend of the 27th and 28th of April that’s relevant because of the subject matter. It’s from Darrell Hilliker, and he says:

“This weekend, the Perkins School for the Blind is holding their Early Connections 2024 conference for family and parents of blind children. As our 5-year-old and 8-year-old daughters are blind, Alison thought it might be a good idea to attend the conference to listen to their sessions and find out about any new resources about which we may not yet be aware for our family.

Perkins offered a track for in-person attendees, and another version for virtual attendees. As we live in Arizona, Alison chose to participate online. She purchased a ticket for $35. Perkins chose to use the VFairs app for managing their event and providing the virtual experience. I have not seen this app before, but if you want to find out more, you can go to to learn more about this app.” (That’s

Darrell continues:

“They do actually have an accessibility site. Check out for their public statement.

Alison decided to participate on her iPhone, as that permits a great deal more flexibility and mobility than does being tied to her computer.

She found logging into the conference through the app difficult to accomplish, followed by an invitation to participate in chat when none of the controls had roles, and it was extremely challenging to get past that dialog.

After all that, after selecting the agenda, she was unable to scroll past the first one or two events. Continuing to flick right would just move focus back to the top of the screen. Subsequent attempts yielded the same results.

She eventually asked me for help, and I was also unable to move past this task-blocking accessibility barrier.

Asking for help from Perkins, Alison was told it was much more accessible on the website and that she should access it through her PC instead of the mobile app. We feel that some organisations like Perkins only think about the needs of sighted parents of blind children, and they give short shrift to making sure that blind parents and those with other disabilities are as equally included.

I have some questions for which I hope you might be able to get answers, and we might be able to get opinions from other listeners.

  1. What is the responsibility of organisations of and for the blind to ensure accessibility and inclusivity?
  2. Do they have a higher bar to make sure they are accessible and to go beyond the accessibility statement to ensure inclusion?
  3. Do organizations serving parents of blind children leave out their blind and other disabled parents, and what should we do to correct that tendency?”

And Darrell subsequently sent me an update which says:

“After some additional trial and error and a recommendation from Perkins, we were finally able to join using Safari on iOS. It is a workable kind of accessibility filled with low-hanging fruit, obvious accessibility barriers, and difficult-to-finish situations, but it does, ultimately, work. Let’s also consider how much time we have spent this morning providing free accessibility services.

I still maintain our initial concern, and my questions still stand. What are the responsibilities of organisations in our own field to ensure they are accessible and inclusive, not only to blind people, but to those with other disabilities as well?”

Thanks, Darrell!

I’d like to just sound the community out on this a bit, and we may get Perkins on. But I think it’s probably symptomatic of a wider issue we need to discuss.

My personal opinion on this subject is that organisations of and for the blind have not just a legal obligation, but a moral obligation to be absolutely best in class in this area. They have to be the change that they want to see in the world. Otherwise, what moral authority can they possibly have to go out to others and demand accessibility and talk about how important it is if they don’t have their own house in order?

It would be my hope that every blindness agency has a 100% accessibility policy and zero tolerance for inaccessible solutions, and that if an inaccessible process comes to light, either the vendor works quickly to resolve it, or that solution is replaced with something that’s more accessible. And we know that there are accessible options out there in this space, so it is very disappointing.

I’d be interested to know if you’ve had any further feedback, or whether they’ve said to you, well, you can do it on Safari for iOS. What’s the problem? We’ve given you a way to do it.

True accessibility in the context of a blindness organization means that the choices available to blind participants are no fewer than the choices available to sighted participants. Anything less is a cop-out, in my view.

So it is very concerning. You’ve certainly got your hands full with kids that age, too. So you don’t want to be running around tracking down accessibility barriers when all you want is information.

So I’d like to hear from other blind parents, or people who’ve tried to participate in these sorts of events run by organisations for the blind that are putting up clearly removable barriers to actual blind people participating. Let me hear what you think about this. 864-60-Mosen, if you want to get in touch by phone. 864-606-6736. And on the email.

New Hearing Aids

Let’s talk about hearing aids.

“Good morning, Jonathan! It is Don Roszmann here again about hearing aids.

Like everyone else, I am totally impressed with your podcast and all the wonderful information that is shared. You never cease to amaze me with all your IT knowledge.”

Well thank you, Don. I appreciate that.

“I have always used ITC aids.”

For those who aren’t hearing aid wearers, that stands for In The Canal, by the way.

“I was looking forward to trying the Lumity. So far, they are not available as ITC.

Like yourself, I found the MyPhonak app quite accessible.

I got the Signia AX7 ITC hearing aids on April the 19th. And by the 26th, I had decided they are keepers, so I bought them. They pass every test I was able to apply to them over the week.

There are 6 programs in the aids, and the audiologist has a number of choices about which programs one can use.

When the Signia iX7 come out in ITC style, I can update for about $200.

I was quite impressed to discover that when the company comes out with new software for the aids, it can be installed right from the MySignia app. The My Starkey app is the same with updates.

As you know, it is ever so important to take the time to make sure whatever aids one chooses are the best for the person’s needs. I hope this helps.”

It does indeed. I really appreciate your perspective, Don.

I use BTE (behind the ear) hearing aids. Isn’t it interesting how all these industries have so many acronyms? In The Canal hearing aids just aren’t powerful enough for my hearing loss these days.

It’s good to hear that Signia have that updateability right in the app. There does not appear to be a way to update firmware for Phonak without going into your audiologist, and that is a pity.

Caller: Hey, Jonathan! This is Jose in Kansas City, Missouri.

Looking forward to listening to your demo of Phonak hearing aids because you’re listening to them right now. Actually, I have 2, but only the right one uses the microphone when you’re on the phone, which I don’t have as a problem.

What I have as a problem, unless I raise my voice, especially if there’s noise behind me, I can’t be heard, and it always sounds like I’m on speakerphone.

I did reach out to Phonak customer service, and twice I went through the unpairing and re-pairing. Once, I was told that there was a setting, and they helped me change whatever setting it was, saying it would improve the connectivity.

My question, if I had the ear of Phonak, other than to politely, professionally say, what were you thinking? I want to know, are they ever going to come out with an MFI hearing aid? And hopefully, that’s what you have, made for iPhone. I didn’t know there was such a thing until I got these.

Overall, they’re okay. The program doesn’t seem to change, unless the app is open. And there’s some connectivity issues with it disconnecting the left hearing aid from time to time from the phone.

So there you go. You take care.

And I still want to know. How did you get this number? It’s a way cool number.

And I intend to take a dip in the podcasting pool soon, or internet radio or something. Who knows?

Take care.

Jonathan: Well, I’ll answer the second bit first because that’s quickest and easiest.

The service that we use for the voicemail line here is called Voicemail Depot. Or as Americans like to say, Depot, but it’s spelled the same way. And it’s available at, and you can choose your number. So if it’s available, then you can get it. And you can also type in a string and have numbers that they have in their pool come up to match. So it’s a pretty cool process.

It’s not a free service, but it’s a reliable service. And it allows me to get the voicemail messages into the podcast.

Now, back to the subject of hearing aids.

The microphone issue you highlighted and demonstrated so well there is one of the weakest links of Phonak, as far as I’m concerned. And I think you’re right. I think the reason why it’s a problem for Phonak is that they’re choosing not to adhere to the MFI spec. They are seeing this as some sort of advantage.

I don’t know whether it is. They’re using Bluetooth 4.2 in the Lumity, and that’s quite old now. The latency is quite apparent. And it does sound like a speakerphone. I mean, even when you’re in a pretty decent room, it does sound like a speakerphone.

Now, you can get around that if you are a Roger On user, and you can talk into your Roger On when you’ve got the device connected via a USB cable. But that’s a lot of hoops to go through, when you could just use the microphones of your iPhone.

I believe that the latest version of the MFI spec does allow you, as a hearing aid user, to specify whether you want hands-free calling via the hearing aids microphones, or whether you want to use the microphone of the iPhone. That’s how it should be.

I have no idea whether this is just about licensing costs. Is it that Phonak doesn’t want to go through the MFI certification process? I presume there’s a cost to that. Or is there some technical advantage that they perceive?

But I agree. This issue with phone calls is highly annoying. It’s a major downside.

Caller: Greetings again, Jonathan. This is Jose in Missouri.

I obviously pulled the trigger too soon on my comment.

I was really impressed on the demo you did with the Phonak Lumity, I guess is what it’s called. Well, I have to go re-listen to it, because there’s a lot of cool stuff happening there. I intend to try to figure out what it costs and how I can get my hot little hands on it, as well as the Roger microphone.

I actually bought a microphone with these Phonak hearing aids. They’re LEG or something. Anyway, but then, I was told by the audiologist that that wasn’t exactly a Roger mic.

My only real-world experience with it, actually, I had two. One is I was at a restaurant with my friend, I had Reg clip it to his lapel, and I was able to hear myself talking to him, a bit off mic. But when he spoke, I couldn’t hear him. I’m like wait, that’s not gonna work.

And I did use them at a meeting. I attend some meetings regularly, and they have the dreaded ice machine, which makes it hard to hear. And it’s just a big old room, no walls close by. And I guess, it’s a high ceiling, and it’s just a mess acoustically, and everyone has trouble hearing from time to time.

What I find annoying though is if others are having trouble hearing, nobody will say nothing. They’ll just sit there. And it’s like, I guess they’re trying not to be rude. And I get a little rude, right? I’m like, excuse me, can you speak up a bit, please? And then I’ll get others that go, yeah, yeah, yeah.

Hey. I wanted to, as an aside, give my condolences about the loss of your mom. I lost mine about 3 years ago, and she had Alzheimer’s, so we really lost her way before she passed.

So anyway, keep your chin up, my brother. Here for you. Praying for you, basically. Y’all take care. Bye.

Jonathan: Thank you so much! I really appreciate the condolences and the interesting message about the hearing aid technology.

And what you highlight there is just how complex it is to fit hearing aids for blind people. Because in an environment like that where you’ve got a loud noise going on in the background, …

For me, actually, when I was chairing a large meeting of our wider leadership team, it was being in a hotel with an old heating system that was making a hell of a racket. At least, my hearing aids were interpreting it as a hell of a racket. [laughs] And my hearing aids, the Oticon ones, were not filtering it out.

So we do have situations where filtering is dangerous, such as traffic, and we’ve talked about that in recent episodes. But we also have situations where not doing filtering is frustrating.

And I think that a lot of people lip read more than they even realize.

Caller: Jonathan, Jose in Kansas City.

Your description was phenomenal of the hearing aids. I was inspired to peruse the app to see if things had changed since I was in there last. I did find an enhanced mode for phone calls, phone call microphones, which their tech support people told me didn’t exist. One helped me fix it, the other one told me it didn’t exist. And so I didn’t get around to calling them a third time to play the 2 out of 3 game.

So I believe my audio should be pretty decent, in comparison to the other message.

Jonathan: Jose, thank you for that because that’s quite informative.

That shows us 2 different messages – one recorded when you don’t have the enhanced bandwidth enabled in the MyPhonak app, and the second when you do. And while I think that they still need to give us the option to use the iPhone microphone, the second, when you have the enhanced bandwidth turned on, is definitely an improvement. So I’m glad to hear firsthand the difference that that setting actually makes.

Improving the Quality of iPhone Calls in Challenging Environments

Now, I want to talk about something I had completely forgotten about. I read a lot of technology news, and I guess it didn’t seem super relevant to me at the time. But this may help further if you have hearing aids where you use the microphones of the hearing aids when you’re on a phone call.

Now, for Phonak users, you don’t have a choice about this. When you pair your Phonak hearing aids with your phone, callers are going to hear you from the hearing aids microphones.

With made-for-iPhone hearing aids, at least the newer ones, you can go into the settings for the MFI hearing aid that’s connected, and disable the microphones of the hearing aids as the audio that people hear on a call. And then, you can just hold your iPhone up in the normal way to your mouth. You’ll still hear the audio over the hearing aids, but your caller will hear the iPhone microphone, which I think in a lot of cases is preferable. But with these Phonaks, we don’t even have that choice.

So here’s something that was introduced back in iOS 16.4, which I promptly forgot about.

And to give you a commentary on what a sad life it is that I lead, I’m going to call myself to demonstrate this, get into the voicemail system, and just record a kind of nonsense voicemail, because you do have to be on a phone call for this to work. So I’ll just do that and edit all that out.

And the second thing that has to be true, and you’ve got to be aware of this, is that you can’t be using your hearing aids when you make the change I’m about to demonstrate. So the best thing to do is probably to turn your Bluetooth off, then you can put your phone on speakerphone, and make this change.

The good news is that when you’ve made the change once, it’s sticky. Even when you restart the phone, the change will stay intact.

I’m going to go into control center by swiping down from the top of the screen just a little way.

VoiceOver: Control center.

Jonathan: Now, I’m going to touch around the top of the screen.

VoiceOver: Double tap and hold to show more controls.

80% battery.

Mic mode.

Selected. Phone button.


Jonathan: Now there’s the mic mode, and it’s currently set to standard, which is the default. We can change this. This does make an appreciable difference in certain environments, particularly environments that are noisy with a lot of background noise, or environments that are echoey, you know. You don’t answer your phone in the bathroom, do you?

Anyway, here we go. We go left, …

VoiceOver: Selected. Phone, button.

Jonathan: flick down, …

VoiceOver: Open controls.

Jonathan: and we’ll double tap this.

Now, we can choose the mode that is active for your microphones when you’re on a phone call.

VoiceOver: Wide spectrum is currently unavailable.


Selected. Standard, button.

Jonathan: It’s currently on standard.

VoiceOver: Voice isolation, button.

Wide spectrum, button.

Jonathan: And you don’t want to choose wide spectrum, even if it is available.

So I’ll flick left to…

VoiceOver: Voice isolation, button.

Jonathan: I’ll double tap voice isolation.

VoiceOver: Selected. Voice isolation.

Jonathan: Now, I can exit control center by going back home, and I can end the phone call.

From now on, when I make a phone call, voice isolation mode will be active. And it makes an appreciable difference to the way that callers hear you if you are in a sub-optimal audio environment, and if you’re using your hearing aid microphones.

This might be a trick to try, even if you don’t wear hearing aids and you just want to cut out a lot of background noise in difficult listening situations.

Judy Dixon, President of the International Council on English Braille, Discusses Their Forthcoming General Assembly and Braille Today

200 years ago, a 15-year-old blind kid changed everything when he invented his system of raised dots that ultimately, after robust debate, I have to say, gave blind people true literacy. And what’s remarkable about all of this is that the core of the system has not fundamentally changed in 200 years, although there has been plenty of innovation and evolution.

English-speaking countries have Braille authorities that administer the code, and they get together through an organization called the International Council on English Braille.

They are about to hold their General Assembly, which is held every 4 years. This year, it is taking place in Auckland, in New Zealand.

Its current president is someone whose contribution to Braille and to blindness in general cannot be overstated. I appreciate that the word legend gets overused these days, but not in this case.

Judy Dixon, welcome. It’s good to have you back on the podcast.

Judy: Thank you, Jonathan. I’m really happy to be here.

Jonathan: We’re going to be talking about the ICEB General Assembly a bit over the next few weeks. What does ICEB do?

Judy: ICEB meets every 4 years. We have delegates from our member countries. We have 8 member countries at the moment, and the delegates meet to talk about Braille for 5 days. We will have papers presented. We will have committee reports. We will have discussions about many different aspects of Braille.

At its very core, ICEB is responsible for Unified English Braille, and that means listening to people about it, talking with people about it, updating it as necessary.

One of the things that’s going to happen at this year’s conference is we’re going to release the 3rd edition of the rulebook. The second edition was released in 2013. There have been changes since then, and those changes have been in effect as rules, but we’re compiling all the little changes since 2013 into a new edition of the rulebook.

Jonathan: This all sounds very geeky. But discussions from time to time can get quite animated, can’t they?

Judy: Yes, very much so. [laughs]

Jonathan: So why is that? Why is all of this so contentious?

Judy: Well, the good thing is that so many people care passionately about Braille, have opinions about Braille, and care.

When UEB was adopted by its various countries, it brought change to each country for various reasons. I mean, people in the UK hadn’t been using capital signs very much. Oh my! Now, they do. And some people don’t like seeing capital signs.

That didn’t bother Americans. We were used to capital signs, but we weren’t used to having contractions in words like reduce. You say that red-uce, is it? We expect Braille to be how you pronounce it. Well no, it’s not the way it is in UEB.

So that’s why people had to endure a certain amount of change. And it was different change for different people, but it was change nonetheless.

Jonathan: You make a good point. I think that more blind people feel a much stronger sense of ownership of the Braille code than, say, most sighted people do about print itself, and there are some valid cultural and sociological reasons for all of that.

One of the concerns I hear from some people when you talk about UEB (I heard this a lot during the heat of the UEB debate), is that you guys always seem to be tinkering with the thing. They say that that causes confusion, and sighted people aren’t subjected to the same kind of tinkering. How do you respond to those concerns?

Judy: There’s two things that speak to that. One is technology.

And one of the major reasons for the change to UEB is that kids these days (maybe I’ll speak for the US, because this is what I know best), but kids in the US, they actually have children going to school without any teacher of blind kids having anything to do with their education. They’re in a school, they give them an iPad, a BrailleNote – kid, you’re on your own.

And kids have to be able to create documents, output them through some kind of back translation tables, and get accurate results. And UEB is so much better able to produce those accurate results than our former Braille code was able to do. So that’s a major reason for the changes.

The other thing is print is changing. Print is changing all the time. And there are new symbols. There are new ways of formatting print. I mean, it is not a static world by any means. And Braille has to keep up.

Jonathan: Yeah. What’s interesting to me is that individuals can now format printed documents in a way that was really only available to desktop publishers or official printers 40 years ago or so. And now, the variation in type fonts, the various things that people can do is just remarkable. It’s completely different now.

Judy: I wrote a book about that.

Jonathan: Right. Yes, indeed.


Yeah. And we need to understand, particularly those of us in professional scenarios, we need to understand what does a good document look like, right? And Braille has to convey that.

Judy: Yes. But now, in UEB, we have the tools that we can know the difference between bold, italic, underline and all these various ways of emphasis.

And so many Braille readers will say, “Why do I need to know that?” Well, you might not need to know that. But the kid in school who has a textbook where the teacher says, “Define all the words in bold,” that kid does need to know that.

Jonathan: Are screen readers keeping up with that, though?

Judy: That’s a good question. There are ways to determine if a word is bold, but it is not always the most straightforward.

Jonathan: Right. They don’t seem to have implemented, in translation software, the actual UEB symbols for those different type fonts.

Judy: That’s true.

Jonathan: So why is that, given how long UEB’s been around for? And does that concern you?

Judy: It does concern us. And actually, that’s something that’s of concern to the Braille Authority of North America. We frequently communicate with screen reader manufacturers on this particular topic.

Jonathan: With any progress?

Judy: Minimal. Incremental. Let’s say that.

Jonathan: I think there was a consumer organization that passed a resolution on this topic, too. So it is something where people are aware that the ball is being dropped.

Does the issue lie with the screen reader vendor, or does it lie with the back translation software that they’re using?

Judy: Now, the back translation software they’re using is pretty capable.

Jonathan: I guess there’s no reason why most people should have to think about this very much. But the fact that ICEB exists at all provides a vehicle for establishing international consistency, uniformity across English-speaking countries. Why does that matter? Why is it important?

Judy: Well, at the very same time that we were worrying about UEB, we were also getting the Marrakesh Treaty. And now, the United States is exchanging books with countries like Canada and the UK, and we’re getting books that look exactly like the Braille that we read ourselves. So it’s making the international exchange of materials very fluid and very nice.

Jonathan: And that wasn’t always the case. There was a time there when the US had a thing called Braille grade 1 and a half, I believe it was. And this was in the 1920s in particular.

So what would happen is that you would get books sent in from the UK, for example, which was using what we called grade 2 then. And there would be abbreviations that Americans simply could not decipher because they hadn’t been taught them. They weren’t in common usage.

And on the other hand, you’d get a book sent to the UK from the United States, and they would be readable, but there were a whole lot of contractions missing and it kind of seemed weird to the reader.

Judy: That’s true.

Jonathan: When the UEB project was conceived, (and I do remember going to a conference in Auckland, I think it was, in about 1993), during the very early stages of UEB, there was discussion about this. The primary objective was to unify the codes across the world, but also the secondary objective, as I recall it, was to make sure that there was a unified Braille code for everything with the exclusion of music.

That didn’t really happen, right, because you’re still using Nemeth in the United States for maths?

Judy: But the United States is the only ICEB country that is using Nemeth.

Jonathan: Legitimately, do you think? I mean, the objections that people… I did a documentary on this a long time ago, and I actually spoke to Abraham Nemeth and a few other people. And the objection was, look, the UEB implementation for mathematics is just bulky. It takes too much space, and that affects processing time.

Judy: It does take more space. There are those who believe that it’s more comprehensible for people who are not mathematicians, who are not technical people. So it’s a matter of who is the technical code appealing to – the people who have major technical expertise and technical needs, or the general public who needs to use the math code in everyday life?

Jonathan: Do you think one day that BANA will revisit that and consider implementing UEB math, or is that just too contentious in the United States?

Judy: No. Actually, we are using UEB math. It’s a state-by-state thing, and states have the option. They can use UEB, and we have some states in the US where UEB technical materials are the law of the land. APH and other entities are producing materials in both codes. So we have our own little mini War of the Dots going on here.

Jonathan: Yeah.

Judy: And it seems to be sustainable, if not ideal.

Jonathan: I mean, is that not the worst possible of worlds? Because you’ve got a kid in one particular state who knows how to read their math text. And then they go to another state, and they have to relearn how to read their math text.

Judy: And if they have to take a high stakes test in a code that they don’t know or haven’t been exposed to, that’s a problem, too.

Jonathan: You and I remember when some educators and actually, funders who are always looking for efficiencies were gleefully saying, when talking computers arrived on the scene, that Braille would just die a natural death, that it had outlived its purpose. What’s your assessment of the acceptance of the need for Braille instruction and its availability now?

Judy: I think if anything, it’s getting better because it’s a little like the paperless office. There’s more paper in offices today than probably ever. Now, everybody prints things out. There’s just paper all over.

And it’s the same sort of thing with Braille. Braille was going to go away. No, all this technology has enabled people to use Braille displays with their computers, and use them very well.

So I think more people are using a Braille display with their screen reader than ever before.

Jonathan: And in the United States, you’ve got the NLS eReader program. That’s really exciting because these are not cheap, these devices, particularly for blind people who are not employed. And that’s the majority of us, unfortunately, who are seeking work. So the eReader program sounds absolutely brilliant.

Judy: It is wonderful. And the eReaders can be used with a computer as a Braille display.

Jonathan: On this podcast in the past, we talked about the EBRF project. It’s been renamed the eBraille project. This is a new file format. It’s portable. It’s extensible. Is that making progress?

Judy: It is. It’s slow going, but it’s very very very technical. But it is making progress.

Jonathan: Yeah. Do you think that there will be universal adoption of that? Because universality, when it comes to these sorts of things, is always the tricky bit.

Judy: I don’t think it’s going to be an issue that’s going to affect the user so much because once the Braille is formatted, it’s kind of like HTML for Braille. It’s about organizing the document and making it navigable. And, you know, it’s kind of like Daisy in that way. And how should Braille be organized so that this can happen? But when the user is reading the Braille, the Braille is going to be the same as now.

Jonathan: You mentioned the Marrakesh Treaty earlier. It showed a lot of promise, and I just want to check in on whether it’s delivering on those promises.

If someone, say, is in New Zealand or Australia and they know of a Braille book on BARD that they would like to get, is that possible? Is it delivering in that way?

Judy: It is. If the country has signed on to the Marrakesh Treaty and the agencies in the country are participating in an active way, then it’s a relatively straightforward process.

Jonathan: With devices like Monarch and other multi-line devices that are coming out, I guess people are starting to feel quite positive about getting more blind people involved in STEM subjects. You feeling good about that?

Judy: Yes, isn’t that great?

Jonathan: Yeah.

Judy: I think it’s just fabulous.

Jonathan: Yeah.

And any other items on the agenda that might interest listeners?

Judy: One of the exciting things that’s happening in ICEB these days is we have our newest member, which is Nepal.

Jonathan: Wow!

Judy: And Nepal’s not always thought of as being an English-speaking country, but there are a sizable number of people in Nepal who do use English, and it’s a country that’s using UEB.

So ICEB is welcoming members who use UEB for English in their country, even if it’s not the primary language.

Jonathan: That’s excellent.

Will you be seeking another term? How do terms work at ICEB?

Judy: Yes, I am seeking another term. I’m eligible to serve 2 terms, and this will be my second.

I’ve been president for 4 years, and I am up for re-election, so I cannot take anything for granted.

Who knows? Somebody may jump up and want to be president. Right now, I’m not aware of anyone who’s running against me, but you never know.

Jonathan: And that is Judy Dixon, President of the International Council on English Braille.

As I say, stay with Living Blindfully. We’re going to be giving you some detailed coverage of this over the next little while.

And I’m deeply honored to say that I’ve been invited to present the keynote address at the General Assembly of the International Council on English Braille to mark the 200th anniversary of the invention of the Braille Code. It is a long address. It runs about 55 minutes, and it’s called Safeguarding the Legacy, Investing the Inheritance. And it takes you on a romp through the history of Braille, talks about its place in today’s society, and my thoughts on some of the things that we might do to ensure that the legacy that we’ve all inherited is passed on to future generations.

That address will be delivered live at 9 AM New Zealand time on Sunday, the 26th of May. If you’re into UTC, that equates to Saturday at 21:00 UTC. If you’re in US Eastern Time, it equates to Saturday at 5 PM.

And ICEB will be streaming their conference if you want to check out not just my keynote, but all of the deliberations and debates that will be taking place on Braille. You can find them at That’s And I do hope to get the audio for you of the keynote, so that you can hear that on a future episode of Living Blindfully.

Advertisement: I want to thank the wonderful people at Aira for sponsoring Living Blindfully.

Aira has this cool new feature called Access AI. You can go back in the Living Blindfully archives and hear all about this.

It can be very useful because, as we know, AI can hallucinate sometimes, so it pays to get a bit of human confirmation from time to time.

The other day, I was going to Parliament for an important meeting. I was being super cautious about my appearance, so I took out a shirt and I asked the AI whether the shirt had any stains on it.

It came back and told me that it did.

I was perplexed by this because it was a clean shirt, and I thought it’s highly unlikely it’s got a stain. I thought I was being super cautious.

So without incurring any extra cost at all, I handed it over to an Aira agent right from that same screen to confirm whether what the AI was saying was true.

The agent came back and actually said that what I thought was true was true. There wasn’t a stain on it. It was probably picking up some sort of shadow or something.

You’ve got the best of both worlds with Access AI – a very good user interface for taking all kinds of pictures, and then the ability to hand it on to a human if you need that extra bit of reassurance or confirmation. If you don’t have Access AI yet, request access to it from the Aira app. Access AI from Aira.

You can find out more by going to Aira, that’s A-I-R-A.I-O.

Interview with Yalim Gerger, Creator of

Services that allow you to save an article and read it later are nothing new. Some browsers even have that built in. But today, we’re going to be talking about a cool twist on the read later concept that may have some particular appeal to the blind community.

What if you could take all those articles that you want to read while you’re doing your chores, working out, or you’re on your commute, and you have your own personal podcast feed of your favorite articles? And when you hear that podcast, it’s like your articles are being read by a human with very high quality speech. That’s exactly what does.

I’ll give you a demo of this after the interview. But to discuss how it came to be, how it works, and how you might integrate this as part of the tools that you use to consume information, I’m joined by’s creator, Yalim Gerger, who is in Istanbul. Welcome to you, Yalim.

Yalim: Hello! Hello, Jonathan! Great to be here. Thank you for inviting me.

Jonathan: It’s a pleasure.

Tell me about yourself, first of all. I take it you have a strong background in software development to be doing things like this?

Yalim: Yes, I’m a software engineer. God, am I 48 years old? Yeah, I am. That’s a lot.

Jonathan: [laughs]

Yalim: I started in 1999, so almost 25 years now. Mostly, I’ve worked in the Oracle Database world. I worked as a consultant. Then, I also worked as a team manager, an outsourcing company owner. Mostly worked for an American company throughout my career. I also built enterprise products for the Oracle developers. One is still available online. I still sell it, Kitora, a version control tool for the Oracle database.

So until ListenLater, which came out this year, I never really did consumer software. I always wanted to, but I just never had the chance. And then, I didn’t have any good idea to build a product.

But then, ListenLater occurred to me a few years ago. It lingered in my head for a while. I couldn’t find a similar service. Close, but not quite exactly the way I want it.

And finally, I think it was around September, October of last year, I decided to bite the bullet and do it myself.

Jonathan: And did you build it because you figured well, I want this. And if I want it, maybe other people do, too. Is that the rationale?

Yalim: I did it for myself, but I hope there are other people. I mean, I thought that’s something sensible to want. And there were similar services, but it’s mostly for myself.

Because as you experience today, in the mornings, I drop my kids off to school. It’s quite a drive. So 40 minutes one way, 40 minutes the other way. And there can be traffic, just like today. And I love listening to podcasts.

Also, I walk a lot. Istanbul is a great city to walk. We have beautiful shores and beautiful walkways next to the Bosphorus and the Marmara Sea. I walk a lot, and I listen to podcasts while I walk.

But I run out of podcasts, and they’re not always quite information dense, if you will. Sometimes, speakers can talk about stuff that is not really on topic. I don’t care about it that much, so I like to read, but I don’t have the time to read anymore. So I was like, I want to listen to these articles instead.

And as you mentioned in the beginning of the show, I started using the narration services of the browsers first. But they’re awful and don’t really work. Like I can’t manage tabs just to listen to what’s supposed to be a podcast. I always wanted these things in my podcast feed.

So I looked around. Everything I found had an issue. The voices were awful. They didn’t really clean up the text well. I had to listen through a lot of nonsense. Share, click, buy. Or a lot of, you know, unrelated text, disclaimers, references, things in parentheses. Those need to be cleaned up.

I didn’t want to pay for a subscription because I don’t know how much I’m going to use this thing. Maybe sometimes, an article a day. Sometimes, it’s an article a week. I don’t know. It changes. And I hate the subscription part of software. Depending on what it is, sometimes it makes sense. But for this kind of product, it didn’t really make sense because I don’t know if I’m going to use it constantly. And I don’t want to be on a subscription treadmill, if you will, all the time.

So I wanted to pay for what I use. I want it to be cleaned up first, then narrated really really well. Didn’t exist.

I was like heck with it! I’m going to do it. But it took about 3 to 4 months to build the initial version, then another month or two to go through the feedback and fix the bugs. And now, it’s pretty good. I mean, it still has its quirks. Now, I have extensions and stuff. Sometimes, some stuff doesn’t work, and I have to look up. But it’s pretty stable, especially the email part is pretty stable.

Jonathan: If you’re a smartphone user, you do have access already to Instapaper, and Pocket, and several other read-later services. Although I think that has significant benefits for everybody.

But what caught my attention about this when you got in touch with me and told me about this service is that there are a lot of blind people out there using devices specifically created for us, and even some people who do have smartphones like to use those devices to consume a lot of information. The Victor Reader Stream from Humanware is probably the most popular one. It doesn’t have a touchscreen. It has physical buttons. You can take it everywhere with you. And it occurs to me that as long as that Victor Reader Stream user has a computer, they can send their content to and subscribe to the podcast feed on the Stream, and they will get this service there. And it will open up the whole concept of reading things later to many more people.

Yalim: Yeah. Well, the way I started interacting with the blind community was again, through a customer. I mean, I thought that while building it, I realized that hey, this could be useful for the blind community. It occurred to me. But I didn’t really do anything specific about it other than trying to make sure that the page is accessible. Even then, I still have some work to do there.

But the feedback, even for the accessibility portion of the webpage was positive, I’d say.

After my initial ad run on an Apple blog, Sabahattin (I don’t think he would mind that I mention his name), he goes with Sabi for short. He contacted me. He was one of the first customers who contacted me. I didn’t know initially that he was blind. We just started talking. He was asking very insightful questions. It was obvious he deeply understood the product. He was very intrigued by it. He asked intelligent questions, and he helped me a lot because those questions made me think about the product myself. Like he asked, why did you choose this? Why did you not do this? Why did you do that? And I was like hmm. Why did I not do that? And I actually stopped, thought about it, and wrote back to him what I hope were thoughtful answers.

So he was my first interaction with the blind community. He liked it. He complained about a product going paid or something. Voice Dream, DreamVoice?

Jonathan: Voice Dream Reader. That’s a big controversy.

Yalim: Yeah, Voice Dream. Yeah, I learned about that controversy, thanks to Sabi.


In some cases, instead of using Voice Dream, he started using ListenLater.

We still exchange emails. I’ll definitely mention that I was on this podcast. He’s been very insightful.

And then, there was a second one. I don’t think he’s still signed up. He asked a bunch of questions about his blog.

Again, until towards the end of these email exchanges, I don’t even know that they are blind. I have no idea, so I just talked to them. And eventually, he mentioned that he is, and was like, the voices were amazing, and he always wanted to put those voices to his own blog, which is something he can do with ListenLater.

So it’s being received well. And hopefully, it will be helpful and useful. And it’s cool.

Jonathan: Sure.

There are various ways to get content to And the easiest one is just to send it an email, and include a link to the article, right?

Yalim: Yes. And originally and initially, and for the most part, that’s the only way that I intended to do this. I didn’t want to build an app because I think people have enough apps. And for a long while, I thought about what’s the best way for people to send content to this service without learning a new app, because it’s a huge barrier.

I mean for tech people, they think it’s easier with just another app. Just download it and use it, goddammit.

But I know for a fact that it’s not. Like each app for my parents, even for my wife who’s not technical, each app is like a mental hurdle they need to get over. Like, there’s an app. I have to get used to that app. I have to install the extension. I have to remember the extension. I have to know. I have to sign up. I have to do this. I have to remember the username and password. I have to subscribe.

Took me a while to figure out. I was like, wait a minute. Let’s use email. Everybody knows how to send an email. Everybody has that installed on their device. It’s easy. Everybody knows how to do that. They don’t have to remember anything. Like, no username, no password. No extension, no share sheet where you click your extension and something pops up. Nothing. Email is always there. It’s already installed on millions of devices. I was like, heck with it! I’m going to do it with email. And the original and still the current main way to use this is send an email. You can either send a link. Hey, I want to read this, send it. I’ll just email it.

Or if you have a newsletter, (it’s very popular these days), Substack paid subscriptions. I think you mentioned that you have one. I’m just going to forward it. I get an email. I want to read it. I’m going to forward it.

Or there’s a PDF. I want to read this PDF. I’m just going to forward it. And I think that’s still the 90%, 95% of the way people interact with it.

I built extensions because people wanted it.

And after the service is available, after the bugs are fixed, I still feel like I need to do things and get this customer feedback. So I built a bookmarklet. And I think that’s another simple way to use ListenLater. It’s on the main page. There’s a bookmarklet. If the user drags and drops it to their bookmarks bar, that will just be there. And whenever they’re on a page, they can just click the bookmarklet. And unless the website restricts the – because it’s possible. It’s called, they may have content policies to restrict how their content is used. So assuming most sites don’t do that, in most sites, the bookmarklet should work.

And even fancier, I recently released a Chrome extension where you can just click the Chrome extension, and it’ll send the content of the current tab to ListenLater.

But I view them as secondary ways. Most people are mobile, and they’re on their mobile devices all day. So I view the bookmarklet and Chrome extension as secondary ways for people who spend more time on desktops.

Jonathan: Right. Because as I see it, the big advantage of using the extension is getting behind paywalls legitimately. I mean, if I’m subscribed to the New York Times, or the Wall Street Journal, or any number of paid for publications, if I email you a URL, then you’re not going to be logged in as me. So you won’t have the access that I do. But if I’m on the page, I’m signed in as me and I invoke the extension and it sends that material, then you will get it. So that’s a good thing.

But I guess you could also do it by just copying all the text from the webpage, pasting it into the email, and sending the email, right?

Yalim: Yeah. Also what most people do, and I didn’t come up with this. Well, did I? I don’t know. I might have. [laughs]

PDFs. Like Wall Street Journal. I’ve got customers who are Wall Street Journal subscribers. The way they send their content is on iOS, let’s say on an iPhone, they tap share, then they tap print, which opens the print dialog. And as it turns out, there’s another share button on the print dialog, which surprised me when I first saw it. I was like hey, let’s press this, see what happens. So I press that second share button on the print dialog, and another share sheet shows up, which has the mail app. So you click the mail app, and what iOS does is it creates a PDF from the current webpage and attaches it to a new email. So all of a sudden, you’ve got an email ready to go, and your Wall Street Journal or whatever paywalled content you have, created as a PDF and attached to that email. And you just email it to ListenLater, and voila! And this works really really well.

Jonathan: And is there a size limit to what I can send?

Yalim: There is a size limit, but I don’t think you can reach it with an article because we use a mail server, and their PDF limit is 20 megabytes. That’s a lot of content. And I can always talk to them to increase it, but I haven’t been able to generate an article that is greater than 20 megabytes.

Jonathan: Thinking about this, I think blind people are actually at an advantage here because when we load a web page, certainly in Windows-based screen readers, it puts it into a virtual buffer. So we can press Control A to select all, Control C to copy, and then just Control V to paste it into an email to

Yalim: Alright, yeah. In a Windows PC, that works perfectly well.

Jonathan: Yeah, yeah.

Yalim: Because on iOS and even macOS, they can use a shortcut that works on iPhones and iPads. For macOS, I believe I have another shortcut, and I don’t think I put it online yet, but I have it available. And if somebody asks, I share it. Maybe I should put it online, actually, because for some reason, Apple Shortcuts is a second-class citizen on macOS, so the same shortcut may not work on macOS, so I ended up creating a new one.

Jonathan: What can you tell me about the text-to-speech that you’re using?

Yalim: That’s actually what makes this all possible. AI. The whole thing is AI, and I use OpenAI, the creators of ChatGPT.

Jonathan: Yeah.

Yalim: So, well, let me explain the whole process. Maybe it’s of interest to the listeners because it includes the audio part as well.

First of all, I get the content. Okay? And then, I put it through a piece of open-source software that cleans it all up, gets rid of the HTML. And if it’s a PDF, I extract the text out of it. There is open-source software that does this.

Then, I send that text (whatever I have, because software is not perfect; it cannot clean up an HTML page that well) to OpenAI’s AI, and I tell it to get rid of all the redundant stuff that doesn’t belong to the main article. So that’s actually the thing that takes a while, makes you wait, because if it’s a long text, it’s multiple round trips.

And then finally, I again use OpenAI’s text-to-speech service. They have 6 voices. I think they’re excellent. They’re amazing. Compared to the other voices around, I think it’s no contest, really. [laughs] I don’t know how they do it, but they do it so well, and I decided to use them. I think it was a no-brainer.
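As an aside for technically minded listeners, the "multiple round trips" Yalim mentions imply splitting a long article into pieces before synthesis. Here is a minimal Python sketch of that chunking step. This is purely an illustration, not ListenLater's actual code: the sentence-based splitting strategy is an assumption, and the commented-out API call at the end just shows the shape of OpenAI's speech endpoint, which caps each request at roughly 4,096 characters.

```python
import re

def chunk_text(text, limit=4096):
    """Split an article into chunks no longer than `limit` characters,
    breaking at sentence boundaries where possible. (OpenAI's TTS endpoint
    accepts about 4,096 characters per request, so long articles have to be
    synthesized in pieces and the audio stitched together afterwards.)
    A single sentence longer than `limit` is kept whole in this sketch."""
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    chunks, current = [], ""
    for sentence in sentences:
        if current and len(current) + 1 + len(sentence) > limit:
            chunks.append(current)
            current = sentence
        else:
            current = f"{current} {sentence}".strip()
    if current:
        chunks.append(current)
    return chunks

# Each chunk would then be sent off for synthesis, e.g. (hypothetical sketch;
# requires an OpenAI client and API key):
# audio = client.audio.speech.create(model="tts-1-hd", voice="alloy", input=chunk)
```

The stitched-together result is what lands in your podcast feed as a single episode.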

I don’t know if you tried, but there’s also an HD voice setting. Did you try it?

Jonathan: I did. I found the differences quite subtle. What do you think the key differences are?

Yalim: The difference is subtle, but it’s noticeable, I think.

Jonathan: Yeah. What I might do is try sending the same article twice, one with the high-quality speech on and one without, to hear the difference more.

Yalim: Well, did you try the examples on the web page?

Jonathan: I did, yes. Yeah, I did try those.

Yalim: Okay. So the difference is it gives the AI more time to synthesize a better sound. That’s the difference. So instead of trying to be quick and real-time, the AI tries to really produce the best it can do.

I did a big experiment, independent study, in the Gerger family. I had the 2 versions, and I have 2 kids. I talked to my son, my sister, and my wife, and I go, “Here’s the 2 sound bits, exactly the same ones that you listen on the web page.” That piece from, I think it’s from Moby Dick.

Jonathan: Yes.

Yalim: And I love that piece.

And each one of them, unequivocally, is that the word?

Jonathan: Yup, unequivocally.

Yalim: unequivocally chose HD.

Jonathan: Right.

Yalim: But they all said that it’s subtle, but it’s obviously better. That’s why I decided to put it on the website. Again, not everyone can hear it. For some people, there is no difference. And then you should not use it. But I see that several of the customers have it turned on. It’s a little bit more expensive, unfortunately, but it’s there if people care about it.

Jonathan: And what expectations should people have of turnaround time once you’ve sent an article? I take it if you turn the HD on, it’ll take a bit longer to process?

Yalim: Yes, but that part won’t be noticeable. I don’t think so. That’s not the bottleneck.

Jonathan: Okay.

Yalim: The bottleneck is with AI processing and the busyness of OpenAI servers, especially during American business hours. They are very busy. So busy that sometimes the requests I send to their servers fail, so I have to send it again. And I had to write some code that manages this because I was surprised that it failed so often. I wasn’t expecting it. That’s the main bottleneck, the busyness of the servers.

You’re in New Zealand, so it should be fairly quick for you. A regular CNN article. I don’t know. 500 words, 600 words. That should be fast.

Jonathan: Right.

Yalim: But longer articles, I mean, I’ve seen articles that took an hour or 2 hours because they’re long, and the OpenAI servers are busy.

Whatever I do, I do quickly. But the current state of AI, it’s slow. It’s 15 minutes, 10 minutes. And especially if you send like 5 articles back to back.

ListenLater only processes one article at a time per user. And I think this fits well with the way people use podcasts.

I never, maybe I should have, but I never designed this for like real-time screen reading. That wasn’t the idea.

Jonathan: Right.

Yalim: The idea was okay, I like this article. Before you go to bed, you scroll your iPad. Oh, I like this article. Yeah, yeah. I don’t want to read it now. I’ll listen to it later. That was the idea.

So I don’t think this is for people who are in a hurry to read something. That’s not the design. If there is a need for that, I have to rethink the design and turn it to real-time because technically, it is possible to make it real-time.

Jonathan: As I was looking at this, I thought there might be one area where it gets a little bit bamboozling for some people, and that is that you’ve got to paste your private podcast feed into your podcast player. That’s a one-off setup process.

I’ve had a bit of experience with this because we run Living Blindfully plus where every subscriber who generously supports the podcast gets their own RSS feed, and it’s private to them, and they have to paste it in. And some people find that a bit difficult.

What feedback have you had about setting up the podcast feed bit of this process?

Yalim: Well, it started badly because initially, I only had a URL there and nothing else. They had to copy the URL. It’s a custom URL, which meant multiple manual steps.

Jonathan: Yeah.

Yalim: But I first advertised this on a software podcast, and that was on purpose. So the users, the first customers I got were software engineers or technical people. So the first onboardings were easier.

Then, I added a button where you can click that button, it shows you a list of the common podcast players, and you pick yours. And then, it automatically adds it to your podcast app. So your podcast app comes to the screen, like becomes the main app on an iPhone or Android phone, and it automatically adds it to your podcast feed. And I haven’t had any complaints, so I’m guessing it works pretty well.

Jonathan: Absolutely. I found it a snap actually to add it to Overcast on my iPhone. It was really simple.

Yalim: Yeah, Overcast is the best. That’s the one I use as well. And Overcast is very friendly to custom URLs.

Jonathan: Yeah.

Yalim: And so it’s just a tap. It’s Overcast.

Jonathan: Are there things that you hope will do in the future, or are you pretty much feature-complete at this stage?

Yalim: Actually, I had a list of things to do, like the basic things that I had to get done. I’m almost done with it. So there are a few little things here and there that I’ll add, not too visionary or anything.

Right now, it can read Google Docs or anything from Google Drive like PDFs, images. That’s all done now.

It translates, which is pretty cool.

Obviously, I want to add Dropbox support, OneDrive support, maybe EPUB support, those kinds of little things. And from there, the product will be feature-complete and anything else will be gravy.

There are some cool things that one can do with AI, like I can let people give their own instructions to the AI. You can send a PDF, or an article, or a book, or whatever. And actually in the email, you can write what you want the AI to do, like summarize, or in this article, get rid of these parts, keep these parts. Or maybe there is no article. You just write an email, you know, create me a podcast about this topic that researches this and this and this.

Now, these are all cool, just nice, but is there a real need for these? Will this resonate with customers? I’m not sure. I mean, these are just gravy, you know cherry on top kind of things, so I don’t know. We’ll see. Sometimes, people come up…

Oh. Here’s something interesting. An Italian teacher. So he’s someone who teaches Italian to English-speaking people. He contacted me and told me that he wants to use ListenLater while he’s teaching Italian to English-speaking people, but he couldn’t really use it because ListenLater translates the text. He wanted to send a piece of text, but the text is mixed language. So some of it is English, some of it is Italian. He wanted it to stay like that.

So that was an interesting use because the AI’s pronunciation is really good. He wants to use that to help his students.

So that’s something I did. He still hasn’t experimented with it, but it’s actually available.

It’s a different email address, but if you send an email to, … well, this is the first time I’m telling it to anyone other than the Italian teacher. If you send an email to and if it’s a mixed language text, ListenLater will read each part in the correct language with the best pronunciation it can deliver.

Jonathan: And does that auto-detect, or do you have to mark up the language in some way?

Yalim: No, no. The AI does the markup first.

Jonathan: Right, okay.

Yalim: So if you want to, say, learn Italian, and you have a piece of text that reads one sentence in English, then the same sentence in Italian, AI processes text first and labels each section where a new language starts. And then I send that in chunks to the text-to-speech engine, and then they are produced correctly.
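To make that labeling step concrete, here is a hedged Python sketch of how tagged text might be split into per-language chunks before synthesis. The `[en]`/`[it]` tag format is invented for illustration; ListenLater's internal labeling almost certainly differs.

```python
import re

def split_by_language(labeled_text):
    """Split text that an AI pass has tagged with language markers into
    (language, text) pairs, so each piece can be sent to the TTS engine
    with the right pronunciation. The tag format here is hypothetical."""
    # re.split with one capture group yields:
    # [text-before-first-tag, lang, text, lang, text, ...]
    parts = re.split(r'\[([a-z]{2})\]\s*', labeled_text)
    chunks = []
    for lang, text in zip(parts[1::2], parts[2::2]):
        text = text.strip()
        if text:
            chunks.append((lang, text))
    return chunks
```

Each `(language, text)` pair would then go to the speech engine as its own request, and the audio gets joined back together in order.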

Jonathan: Sure, that’s very handy.

What about images? Because you mentioned images before. What’s it doing there? Is it reading the text from images, or can it also describe a photo to me? If I just send it a photo, it’ll tell me what’s in it.

Yalim: It won’t. It just captures text from images.

Jonathan: Right. Okay.

Yalim: I didn’t do that because that exists. ChatGPT does that. I didn’t see the value of ListenLater describing an image as a podcast.

Jonathan: No, I agree.

We haven’t talked pricing yet. What’s your pricing model for

Yalim: The pricing model is pay-as-you-go. Again, I tried not to do the subscription.

The way OpenAI prices its APIs is not very conducive (I guess that’s the word) to subscriptions anyway, because you pay as you go to OpenAI. The text-to-speech engine charges by character, and the AI piece charges by tokens, which are just pieces of words.

So I literally took the same model they use and reflected it to the end user. The longer your article is, the more it costs. I believe right now, for AI processing, the cost is what? 3 cents per thousand tokens. And text-to-speech generation is also 3 cents per thousand characters.

And of course, this makes it a little bit confusing to guess a price for an article. Did you notice the price estimator?

Jonathan: I did, and I saw you can refer people for credits as well.

Yalim: Yeah, that’s new. Sabi was actually the first person I talked to about the referrals. He said, “I’ll use it.” But so far, nothing. We’ll see. It’s just new. It’s been like 2 days. That’s brand new.

Because, you know, I’m a one-developer shop. I’ve always been a single … For the last several years, 4, 5 years, I’ve been alone, and I love it. Basically, this is a one-developer shop, and I don’t have this big budget to advertise this.

So my biggest advantage here is my users. If I make them happy, I hope that they invite their friends, and it will spread with word of mouth. And if people are not happy, they’re not going to recommend it, and this thing is just going to dwindle down, I guess. So yeah, that’s what the referral program is.

I try to do my best and hopefully, it’ll be good enough for people to refer other friends. And then, they’ll also get their free credits back. So every new user who makes a purchase through a referral will give you back $2 in free credits, which you always get when you start anyway.

Jonathan: Right, so you get some free credit. So anybody right now, if they just want to check this out without any obligation, can sign up and give it a spin.

Yalim: Yes. And then, if you like the service, you can add credit to your balance. You can do it just whenever you want.

And when you run out of it, you get an email. Or, I mean, the way I imagine it to be, it’s like having a subscription, but not actually having a subscription. So best of both worlds.

You can toggle on the auto-refill, which will automatically charge your credit card when your credit balance goes below $0.10. This way, you’ll never run out of credits, but you’ll never pay more than you use.

Jonathan: On the site, you can select the voice that you prefer, and you can also set the speed if you want. But it’s worth pointing out that since this is a podcast, you’re listening on a podcast player and that is likely to have a speed control. So you can actually influence the speed there, and perhaps set it on a per article basis.

Yalim: Yes. And honestly, I think the speed feature belongs to the podcast app. It does not belong to ListenLater.

So if you ask me, why did you then put it there?

Because people asked for it. [laughs]

Jonathan: Yeah.

Yalim: And I thought to myself, well, I already have the API capability to do this. Why don’t I just put it there?

This reduces support emails. And everybody’s happy because after you try it, it becomes crystal clear that the real place to adjust the speed is your podcast player, not the ListenLater servers. That doesn’t make sense at all. But fewer support emails, I’ll take it.

Jonathan: In my limited trials of this, I am a fan, and I’m using it quite a bit. I’ll be interested to hear how others integrate it into their lives.

People can find out more by visiting, all one word.

This is cool. It has a lot of potential.

Yalim, I really appreciate you letting us know about it, and coming on the show today.

Yalim: Thank you, Jonathan! I truly appreciate it.

And I look forward to interacting with the blind community. If anyone has any questions, they can always send me an email. There’s also a form on the website after you sign in, or just, you know, send an email without signing in. I’m quite responsive to emails, and I’m looking forward to hearing from you all.


Advertisement: Transcripts of Living Blindfully are brought to you by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at That’s P-N-E-U-M-A

A Demonstration of

There are several setup steps that you will need to go through if you want to use

The first is to go to the website and sign up. It’s an accessible, straightforward process. You give it an email address and a password. The email address is important because you’ll want to use the email address from which you want to send articles to ListenLater, if you want to use their email service.

I’ve been using ListenLater for a wee while, and I’m signed in. And if I go to the top of the page and just tab through the website, you’ll find it’s very accessible.

JAWS: ListenLater – Microsoft Edge beta. Listen later – Microsoft Edge beta page.

Listen later.

List with 9 items.

Visited current page, link.

Buy credit, link.

Settings, link.

Transaction history, link.

Support, link.

How does it work, link.

Tips and tricks, link.

Pricing, link.

End user agreement, link.

Jonathan: And there are other important elements on this page as well, such as your personal podcast feed, a display of how much credit you have left, a link to download the Chrome extension, and much more.

When I went into on my iPhone in Safari, I did find a button to open the podcast feed, and Overcast was right there. It was a very simple process to get this feed into Overcast, so I was able to listen to the articles that I sent to

I’m not a Victor Reader Stream user, but I have had some experience of private podcast feeds through assisting with Living Blindfully plus, so I am aware that what you’d have to do is use the Victor Reader Stream companion and go through its one-off setup process for adding a manual podcast feed. If you want to find out more about that, you can go to the Living Blindfully plus page at because we have detailed instructions there about how to do this for Living Blindfully plus, and it’ll be possible to apply those instructions to your private podcast feed from Once you’ve done it once, then you’re good to go and you shouldn’t have to worry about it again.

One of the important setup steps you’ll want to go through is choosing your voice. And you’ll probably want to get this set up before you send your first article, so that it’s going to sound to your liking when you check out your own personal podcast.

I’m going to bring up the list of links, …

JAWS: Links list, dialog.

Jonathan: And we’ll press S for settings.

JAWS: Sign out.

Send to ListenLater.

Settings. 4 of 15.

Jonathan: I’ll press enter.

And the page has refreshed, although we don’t hear that.

We’ll go to its top, and now we’ll navigate via heading.

JAWS: Settings, heading level 2.

Heading level four, voice selection.

Select one of the 6 voice options to narrate your articles.

Voice. Combo box collapsed, Alloy.

Jonathan: This is the one that I’m using, and I believe it is the default. I quite like that one.

JAWS: Listen, button.

Jonathan: And we can listen to each voice. So here’s the first one.

JAWS: Main region.

Listen, button.

American male voice: Hello! I’m Alloy, your steadfast companion in the world of audio. My versatile voice is engineered to adapt to a wide array of content, from technical explanations to everyday conversations, ensuring clarity and reliability in every word.

Jonathan: We’ll go back to the combo box.

JAWS: Narrator voice. Selection, combo box collapsed. Alloy.

Jonathan: And we’ll make the next selection, …

JAWS: Echo.

Jonathan: and press tab.

JAWS: Listen, button.

Jonathan: Have a listen to that one.

American male voice: Hi! I’m Echo, designed to articulate your words with precision and clarity. Whether it’s guiding through complex instructions or illuminating educational content, my voice ensures your message resonates clearly and effectively.

Jonathan: I’ll shift tab, …

JAWS: Narrator voice. Selection, combo box collapsed. Echo.

Jonathan: down arrow, …

JAWS: Fable.

Jonathan: press tab, …

JAWS: Listen, button.

Jonathan: and have a listen to this one.

British male voice: Welcome! I’m Fable, your narrator for the enchanting world of stories. With a voice crafted for warmth and engagement, I bring tales to life. From ancient myths to modern fables, inviting listeners into a world of imagination.

JAWS: Narrator voice. Selection, combo box collapsed. Onyx.

Listen, button.

American male voice: Greetings! I’m Onyx, the voice of authority and professionalism. My confident tone is perfect for delivering impactful presentations, authoritative content, and formal discourse, ensuring your message is received with the gravity it deserves.

JAWS: Narrator voice. Selection, combo box collapsed. Nova.

Listen, button.

American female voice: Hello! I’m Nova, vibrant and full of energy, ready to bring dynamism to your content. From lively marketing campaigns to engaging podcasts, my voice adds a spark of enthusiasm making every message memorable and exciting.

JAWS: Narrator voice. Selection, combo box collapsed. Shimmer.

Listen, button.

American female voice: Hi there! I’m Shimmer, here to make your day brighter with a friendly and inviting voice. Ideal for casual dialogues or content, aimed at sparking joy. I ensure a lighthearted and accessible listening experience for audiences of all ages.

Jonathan: And those are the 6 voices. So if you like human-sounding speech, pretty impressive.

Since I recorded that interview with Yalim, I have been doing a little bit more A-B comparison of the premium versus non-premium voices. And what I’m finding is that the premium voices, or the HD voices, I think they’re called, because they are using more tokens, actually get the emphasis right more often than not. In other words, it’s really taking a look at what it’s reading, and using appropriate inflection that makes contextual sense. It’s quite remarkable.

I’ll press tab. I haven’t made any changes yet.

JAWS: Listen, button.

Choose this voice, button.

Toggle HD voice switch. Pressed, on.

Jonathan: I’ve got the HD voice on. I think I will keep it on.

Bear in mind that you’ll burn through your credit slightly faster if you have the HD voice on, but I really like the experience.

If I press tab, …

Play HD voice, button.

Play standard voice, button.

Jonathan: What I’ll do is play you the standard voice sample first. And then, I’ll play you the HD voice, and you can see if you can hear the difference. Here is the standard voice.

Standard American Female voice: Call me Ishmael. Some years ago, never mind how long precisely, having little or no money in my purse and nothing particular to interest me on shore, I thought I would sail about a little and see the watery part of the world. It is a way I have of driving off the spleen and regulating the circulation.

Whenever I find myself growing grim about the mouth, whenever it is a damp, drizzly November in my soul, whenever I find myself involuntarily pausing before coffin warehouses and bringing up the rear of every funeral I meet, and especially whenever my hypos get such an upper hand of me that it requires a strong moral principle to prevent me from deliberately stepping into the street and methodically knocking people’s hats off, then I account it high time to get to sea as soon as I can.

Jonathan: That’s a little bit lengthy, but I think the context is important.

So I’ll go back, and this is the same text read with the HD voice.

JAWS: Play HD voice, button.

HD American female voice: Call me Ishmael. Some years ago, never mind how long precisely, having little or no money in my purse and nothing particular to interest me on shore, I thought I would sail about a little and see the watery part of the world. It is a way I have of driving off the spleen and regulating the circulation.

Whenever I find myself growing grim about the mouth, whenever it is a damp, drizzly November in my soul, whenever I find myself involuntarily pausing before coffin warehouses and bringing up the rear of every funeral I meet, and especially whenever my hypos get such an upper hand of me that it requires a strong moral principle to prevent me from deliberately stepping into the street and methodically knocking people’s hats off, then I account it high time to get to sea as soon as I can.

Jonathan: So as Yalim and I both said in the interview, it is subtle, sure, but you get the feeling it understands more what it’s reading and slightly changes emphasis accordingly. So it just makes it a bit more pleasurable, and you can decide whether you think it’s worth the price.

Speaking of price, you do get $2 of credit in your account when you sign up to And the easiest way to get an article to it is to copy it to the clipboard, or copy the URL to the clipboard if it’s not behind a paywall, or you could attach a supported file format like a PDF.

So you can do all sorts of things with an email, and you send that email to Note the different address there.

The way that I’ve been using this is if I see an article that comes through during the day when I’m busy in my day job, for instance, and I think I’d like to read that article but I certainly don’t have time now, then I send the article, and it’s waiting there in my podcast feed perhaps at the end of the day, when I’ve got some time to go through podcasts.

So there is going to be some delay between when you send the article and when it pops up in your podcast feed. Some of that will be due to the processing involved, but some of it may also be because of your podcast client. Overcast, for example, to save battery life, does not have each user’s device check the RSS feeds individually. It polls on all of its users’ behalf, and this works very well when you have a common, popular podcast feed.

Living Blindfully, for example. When we release a new episode into the public feed, it comes down very quickly, within a minute or so of me making the episodes public. But the Living Blindfully plus feeds, on the other hand, which are individual, you will often have to refresh those manually to get the episodes in a timely manner.

So if you think an episode should be in your podcast feed that is not, it may be your podcast client at fault and you might want to pull to refresh manually when you’re in the podcast feed, which is private to you.

If you’re browsing a newspaper site and it’s not behind a paywall, you can include up to 10 URLs in a single email. So you could be browsing and think, I’d like to read that. You can bring up the context menu for the link, choose copy to clipboard, open up a new email, paste it in, and keep doing that as you go through the articles. Once you’ve finished, or you have 10 links in the body of your email, send that email. It’ll process them one at a time, and eventually they’ll all turn up in your podcast feed.

If you use sites that are behind a paywall, this is where the extension for Chromium browsers really comes in handy. It does work with Microsoft Edge. It works with Google Chrome as well. And I don’t see why it wouldn’t work with any Chromium-based browser. So that would also include Brave, which has a bit of a following in the blind community.

You will need to make sure, if you’re using a browser that is not Chrome but is using the Chromium technology, that you’ve allowed extensions from other stores. You have to expressly turn this on in Microsoft Edge, for example. But once you’ve done that, by choosing the link to get the extension, the Chrome Web Store will pop up and the extension will install into Microsoft Edge and other browsers.

What I’ve done is to set up a hotkey for that extension. I’m not going to go through how you do that here because how you do it differs from browser to browser.

My default browser is Microsoft Edge. I like that one. But you could be using Chrome, or Brave, or any number of other options. So you can hopefully find out how you assign a hotkey to an extension because in the case of, this is really handy.

The hotkey that I’ve chosen is Alt-Shift-L for ListenLater.

To give you an example of just how effortless this is, I am now on the webpage for the New Zealand Herald, which is a newspaper in this country, and they have paywalled content. I’m logged in.

I pay for the Herald, even though their iOS app is a debacle. It’s all right on the web. And if I were to send the URL of an article to via email, I wouldn’t get the full article, understandably, because it can’t be signed in as me.

So I will navigate by heading and find an article that interests me.

JAWS: Grid emergency notice issued in NZ over severe space storm, heading level 2, link.

Jonathan: Ooh, that sounds pretty intense, actually. I want to read that one, so I’ll press enter.

JAWS: NZ Herald, this site can access your location.

Jonathan: And it should be loading that article now.

Are we going to get the link announcement? I’m not sure that we are.

But it’s loaded now. I’ll just confirm that. Yeah, it’s all there.

Because I’ve taken the trouble to set up a hotkey for the extension in a one-off setup process that pays dividends of efficiency in the long run, all I have to do is to press Alt-Shift-L.

JAWS: 1 heading and no links. ListenLater.

ListenLater icon graphic, heading level 1.

Confirm submission. Send this article to ListenLater?

Send, button.

Cancel, button.

Jonathan: And that’s all we have. So we’ll go back.

JAWS: Send, button.

Jonathan: And I’ll press the spacebar.

JAWS: ListenLater.

Send, button.

Transpower issues grid emergency notice due to weather and space NZ Herald – Microsoft Edge beta.

Jonathan: And that sound is a Windows notification. If I read it, …

JAWS: New notification from ListenLater. Listen later article submitted successfully. Via Microsoft Edge. 1 of 1.

Jonathan: Now, I have to tell you, in the interests of full disclosure, that at the moment, this works beautifully for me on the ThinkPad, which I have connected to my mixer to record this demo. Works every time.

On my desktop PC, I cannot get this to work. I’ve got the extension installed. But every time I press the submit button, I do not get that Windows notification telling me that the extension has submitted the article successfully, and I don’t know why. It’s actually not doing it in multiple browsers. So it’s something to do with that particular machine, rather than any browser that I am using.

Hopefully, that won’t happen to you because it is a really cool feature for just quickly getting material from a webpage to

What I’m going to do now is pause the recording, and we will wait for that article to come through in the podcast feed, and you can hear what it’s like.

I’m back now. And by the time I set everything up, switching from my laptop to my iPhone and checking the levels and that sort of thing, which took me a couple of minutes, the article was already there. So that was pretty quick. However, I did have to manually refresh the feed. So keep in mind that depending on the schedule that your podcast app uses or how it retrieves episodes, you may have to do something similar.

So once you’ve done the initial setup, ListenLater will just come up as one of your podcasts. So if I flick through here, …

VoiceOver: The Lead with Jake Tapper.

Jonathan: Oh well, this is an opportunity for me to point out that for blind people using smartphones, surely that podcast should be called The Lead with Jake Double Tapper, shouldn’t it?

VoiceOver: Legacy, 0 unplayed.

ListenLater podcast. Now playing, 2 unplayed.

Jonathan: I’ll double tap, and flick right.

VoiceOver:, button.


This is a podcast created with, button.

Selected. Current, button.

Settings, button.

Jonathan: That’s iOS telling me that actions are available. I’m going to double tap, and we’ll hear at least some of this article.

Transpower issued a grid emergency notice due to a severe storm in space.

A spokesperson for the agency, which operates the New Zealand Power Grid, said a geomagnetic storm is likely to affect Earth over this weekend because of significant solar activity. G4 severe level events are rare, with only 3 having been observed this solar cycle since December 2019.

Our control rooms are monitoring the event, the spokesperson said.

As part of our contingency plan, we are removing some transmission lines from service across the South Island as a precaution. In order to do this, we have to issue a grid emergency notice. However, this initial action should not impact supply of electricity to consumers. At this stage, the spokesperson said they do not anticipate any impact on New Zealand’s electricity supply to consumers.

AP reported the storm could produce northern lights in the US and potentially disrupt communications over the weekend. The site reported a US National Oceanic and Atmospheric Administration forecaster said this storm, ranked 4 on a scale of 1 to 5, poses a risk for high-voltage transmission lines for power grids…

Jonathan: I’m just going to pause it there. You can hear that’s very well read. It trimmed beautifully all the stuff that is often at the top of newspaper websites. So not only is it nice to be able to take your news with you in your podcast app of choice, but it also has some accessibility benefits in just taking away the clutter.

So give this a shot. I love this. I’m using it a lot now. If you would like to take it for a spin, go to

And here’s a very cool thing that I can tell you. Yalim has kindly agreed that when you do your first top-up, when you use your credit card to add a bit of credit to your ListenLater account, if you contact him via the contact form on the page to let him know that you are a Living Blindfully listener who has just done their first top-up, he will give you an extra 20% on top of whatever you paid for. So a 20% deal on credit is pretty sweet, and it’s available to all Living Blindfully listeners just by telling him, right after you’ve done your first top-up, that you heard about on Living Blindfully. Hope you enjoy it.

Closing and Contact Info

Harold Wilson said that a week is a long time in politics.

A week is a very long time in blindness events and technology, and we’ll be back with you next week for another exciting installment.

Thank you so much to everybody who makes contributions, and we look forward to bringing you some more of them in a week. Until then.

Remember that when you’re out there with your guide dog, you’ve harnessed success. And with your cane, you’re able.


Voiceover: If you’ve enjoyed this episode of Living Blindfully, please tell your friends and give us a 5 star review. That helps a lot.

If you’d like to submit a comment for possible inclusion in future episodes, be in touch via email. Write it down, or send an audio attachment. Or phone us. The number in the United States is 864-60-Mosen. That’s 864-606-6736.