Transcripts of Living Blindfully are made possible by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at http://PneumaSolutions.com.
Voiceover: From Wellington, New Zealand, to the world, it’s the Living Blindfully podcast living your best life with blindness or low vision. Here is your host, Jonathan Mosen.
This week, we’re learning more about Apple’s Vision Pro, and it may not initially be the game-changer we hoped for. Vispero comments on quality issues with their Focus Braille display products. Get access to accessible traffic signals with Oko, and there’s a new Beatles song coming. Yeah yeah yeah, or no no no?
Welcome! We are indeed in Wellington, New Zealand, and a pretty chilly Wellington, New Zealand actually, as we head into the winter months.
This is episode 235, and there’s no area code 235 in use at the moment in the United States. That may change in time.
However, if you were to use country code 235, you could make a call to Chad. Now, this isn’t an individual called Chad. I mean, wouldn’t it be great if you were just so important that you had your own country code? Imagine the kind of chad that we all became familiar with in Florida in 2000. You remember those? The dimpled chads, the pregnant chads. What was it, the hanging chads? [laughs] Yeah, that’s right. It was hanging chads. It’s not even that. This is a country called Chad, and it’s in Africa, as all of these country codes have been since we started noting the country codes as well as the US area codes, following some suggestions that I do that when we had that big announcement about Living Blindfully. They’ve counted the people in Chad this year, and there are 18.3 million of them. So if you are listening from Chad, a very warm welcome to you, whether you’re pregnant, dimpled, hanging or otherwise.
As you know, Pneuma Solutions sponsor the transcripts for Living Blindfully. We really appreciate that. And we also appreciate the incredible Hannah for doing all the transcribing every week.
We initially started this because we wanted to be accessible to the deaf-blind community. But I know, like many accessibility things, the benefits extend well beyond the original target group. Some people listen to the episodes, but then they check the transcripts to refer to a link, or a particular product name, or whatever. So if you’re using the transcript, I know that you’ll join me in thanking Pneuma Solutions for making them possible.
We talked some time ago with Mike Calvo and Matt Campbell from Pneuma Solutions about the work they were doing on RIM for Mac. This is Remote Incident Manager, a very accessible, simple way of getting or giving remote assistance. And you can also remote unattended into a machine when you set that up.
And I’ll certainly be making good use of that when I head off to the NFB convention in Texas, as I’ll still have MushroomFM maintenance to do, and I’ll be able to log in to the MushroomFM computer as if I was sitting right in front of it. Genius!
Now the RIM for Mac beta is over. It is officially available for both Windows and Mac.
And it’s important to emphasize that these things are interchangeable. What that means is if you are using Windows and you want to remote into a friend’s Mac machine and help them out, you can do that, and vice versa. If you’re using a Mac and you want to remote into someone’s Windows machine, or you have a Windows machine that you want to log into that belongs to you and control it, you can do all of these things.
There are so many ways that you can use RIM in terms of price points. You’ve got the day passes, you’ve got a monthly subscription, you’ve got the community pass. Check it out.
With the recent work on RIM, Pneuma Solutions are onto a winner that is making a massive difference in the blind community. And of course, it’s created by blind people for blind people, but it’s also fully accessible. RIM can be deployed anywhere, and it is a viable, competitive remote incident solution.
If you want to know more, head on over to getrim.app. That’s G-E-T-R-I-M.app, whether you use a PC or a Mac.
Voiceover: Stay informed about Living Blindfully by joining our announcements email list. You’ll receive a maximum of a couple of emails a week, and you’ll be the first to learn about upcoming shows and how to have your say. You can opt out anytime you want.
Join today by sending a blank email to email@example.com.
Why not join now? That’s firstname.lastname@example.org, and be in the know.
Exactly, and who doesn’t want to be in the know in this ever-changing world in which we’re living?
I did want to follow up on the piece we did in episode 234 with MostlyBlindGamer, who is involved in the administration of the r/blind subreddit, because by any definition, the Reddit blackout that occurred in the past week has been an enormous success.
Thousands of subreddits (these are the forums that make up Reddit), were switched off last week. Ironically, switching them off was not accessible. [laughs] Isn’t that just beautiful? So the r/blind subreddit people had to have a sighted person go in and flick the inaccessible switch so that they could participate in this boycott.
But this impacted millions of users in a very widespread protest on the service. And I was encouraged that the Verge, which is a well-read tech publication, gave quite a bit of coverage not just to the whole blackout and third-party apps, but specifically to the impact that this was having on the blind community.
The CEO of Reddit appeared in an AMA. We talked about AMAs with MostlyBlindGamer; that stands for Ask Me Anything. And if anything, [laughs] he made things worse. He did acknowledge that Reddit does have to do much better with its own offerings in terms of accessibility, but he gave no indication of timeframes for when these issues would be fixed.
As I mentioned in 234, they are offering free access to some non-commercial apps that are specifically designed to provide Reddit access to those who have accessibility needs not currently being met by the first-party tools. In other words, the tools that Reddit provides. This includes the Dystopia app, which has been in TestFlight a very long time [laughs], but apparently might be coming out of TestFlight. The Dystopia developer has spoken with Reddit, and Reddit is on board with making the API available without any kind of punitive charges.
And while that is undoubtedly something, and there will be many people who are grateful for that, it doesn’t change the fundamental issue which is Reddit is powered by many, many thousands of volunteers who moderate content, who run these subreddits (these forums). And every time we post to Reddit, we are making Reddit more viable, more compelling, hopefully, with the quality of our posts.
Now, without any kind of compensation, Reddit wants to take our content and the work of volunteers, commercialise this, and charge a bomb for third-party app developers who make the provision and the monitoring of this content possible.
And that circles me back to the Fediverse, things like Mastodon, which we’ve been talking a lot about on Living Blindfully over the years, but particularly since November when Elon Musk fired his whole accessibility team at Twitter. And that kind of woke a lot of people up, myself included, and many of us have moved to Mastodon where there are no algorithms exploiting our content, no one thinking that they own our content, and there has been a mass migration to the Fediverse as a result of this blackout. I’m heartened by that. There is going to be some competition in this space on the Fediverse.
But if they’re all ActivityPub-based (ActivityPub being the protocol that powers all of these tools, not just Mastodon but many others), then they will all be able to talk to each other, and it shouldn’t be a big issue. The APIs are open, it’s a very different mindset, no one’s trying to commercialise our content, and the people-driven social media revolution continues at pace.
I have an account that I follow on Mastodon where I can look at how many new accounts have been added to Mastodon every hour. And since the Reddit blackout happened, there has been a massive spike in the number of new accounts that are being created on Mastodon. Because when something happens that really affects something that you care about and that you use daily, you realise why this social media revolution is important, why we need to go back to the roots of communication online, actually, like the bulletin boards that used to be run way back in the 80s when I first got online, and even into the 90s as well. It is a very important principle we’re fighting for, and it’s just so cool to see this gaining traction all the time.
There are a couple of ActivityPub-based Reddit alternatives that I’ve become aware of. One is called Lemmy, which I understand is a bit controversial, but I believe it has got quite a lot of traction nonetheless. There’s also one called KBin.
I think there is some work to be done on good quality accessible third party apps, but I’m confident they will emerge in the same way that Mastodon third-party apps emerged.
We have choice in that space of some very accessible, highly functional third-party apps, my favourite of which of course on iOS is Mona, which we’ve covered here on Living Blindfully.
So bring on the revolution is what I say.
Oh, speaking of revolutions, the chief executive of Reddit sent a memo to staff saying this will pass, don’t worry about it too much. All these revolting users, it’s going to pass. And I wonder which will pass first: the revolt, or the tenure of this particular chief executive of Reddit?
While the blackout 48-hour period has now long ended, there are some subreddits that are not coming back until Reddit reverses course on the API changes, and some of those subs have millions of subscribers. So they really have opened up a can of worms.
I wonder when these tech companies will learn that when they start to try and monetize our content in ways that are detrimental to the provision of that content, it never ends well.
I do want to send a shoutout to the creators of one of the big Beatles subreddits, because we’re going to be talking a bit about the Beatles later, a very big story this week pertaining to the Beatles. And yet, the subreddit held the line and did not come back for one of the biggest musical stories of the decade. So good on you!
Advertisement: Mastodon is the social network where the Living Blindfully community is most active. Join us there for conversation about the most recent episode, and items of news to help you live your best life with blindness or low vision.
Let’s talk Apple. Rebecca Skipper says:
“I commend Apple for its commitment to accessibility, despite some of the bugs that have not been addressed for Braille and VoiceOver users. Apple has a lot of products and services, and assistive technology is not their main focus, though they always seem to address the diverse needs of the disability community, unlike Twitter.
However, Envision has a maturing product and is dedicated to assisting the visually impaired. Their app is freely available. You do not need to connect to an external battery pack.
If something breaks on the Vision Pro, a blind user could have an unusable device. That is not a risk I’m willing to take right now.
Let’s see how things go. My hope is that the companies in the assistive technology space will continue to innovate.
In addition, I do not like how Apple kills third-party apps.”
[laughs] I think that the word there is Sherlocking. That’s the word that people use regarding what Apple does when they come along with a WWDC update, and they announce their own version of a popular app.
“It will be interesting to see how this shakes down.”
Thank you for your email, Rebecca. You know what they say about assume, right? That it makes an ass out of you and me. And I had assumed that the Vision Pro would have all of its cameras and sensors and data available to third-party apps.
But based on an article in a publication called Upload VR, I’m not sure that that is happening. It is a very important thing, so I will link to this in the show notes.
But to paraphrase what this article is saying, it says, Apple touted capturing spatial photos and videos as a selling point for Vision Pro. But in a WWDC 23 developer talk, an Apple engineer confirmed third-party developers don’t get access to camera feeds citing privacy concerns.
For the selfie camera, visionOS will return a virtual webcam view of you as your persona, Apple’s name for its realistic face-tracked avatars. This means you’ll be able to use iPad video calling apps like Zoom with no specific developer integration needed. Your persona will be shown to iPhone and iPad apps requesting the selfie cam.
For the rear camera, though, visionOS will return a black feed with a no camera icon in the center. This ensures the app works without tracking, but obviously renders any in-app photography experiences useless. It also prevents developers from building their own custom computer vision solutions.
To be clear, developers can build apps that use real-world pass-through as the background and the headset’s APIs tell them about surfaces and furniture in the user’s space, but they won’t get access to the actual camera imagery as they do on iPhone and iPad.
The article goes on to point out that this is consistent with approaches taken by Meta and HTC who are also playing in this space.
It continues: as headsets become more widely adopted and accepted in society, though, this may change, just as iPhone developers didn’t get raw camera access until iOS 4 in 2010.
Now, I don’t want to sound the alarm unnecessarily here, but that is a pretty credible article, and it’s based on data that came from WWDC.
What that means, unless there’s some sort of blindness exception, which there could be and we may just not know about it, is that Aira and Envision and any other startup that might want to assist blind people with all the very impressive camera hardware on this device will not be able to do so. They won’t have a direct camera feed.
It is quite disappointing. I’d like to hope that this is something that the consumer organizations in the United States can work with Apple on.
I completely understand the privacy ramifications of this. But is it really that much different from granting camera access on your smartphone?
When you run an app for the first time, you are asked if it’s okay for the app to access the camera. Now if you are running an app and you think there is absolutely no reason for this app to be able to access the camera, you can decline permission. But if it makes sense to you as it would if you were trying to run Aira on your Vision Pro, you would obviously grant access. So what really is the difference here?
This first version of Vision Pro is hugely expensive. No doubt about that. And it’s going to be out of the reach of most of us, unfortunately.
But we are looking long-term. And long-term, it seems to me, that being able to wear a device like this with the LiDAR capabilities, all the other sensors, the cameras as well, it’s so convenient for a blind person. Because we’ve got a guide dog harness or a white cane in one hand, we need another hand free often to just engage with our environment. So not having to hold anything is optimal.
And when you consider the M2 chip in this device, all the things add up. It’s a very compelling use case for these visual type applications.
But if Apple, at least for now, is not going to allow third party access to these cameras, it’ll be a dream for quite a bit longer. And maybe eventually, we will see relaxation of this.
But I’d like to think that maybe, we will reach some sort of compromise with Apple. Or maybe, that compromise is already happening because all apps do have to be reviewed by the App Store. And if you clearly create a blindness-specific app that’s going to make an enormous difference to someone’s quality of life, why wouldn’t you approve it with the appropriate privacy safeguards in place?
There are two other possibilities pertaining to the Vision Pro and blind people at launch, though.
The first is that Apple may believe that with all the sensors and the artificial intelligence on this device and its speed, there’s no need for any third-party app. We know that when you go into the magnifier of your iPhone, you can get information about doors you’re approaching. We’ve demonstrated this here on Living Blindfully. There’s people detection, which can be handy when you’re finding a seat on the bus. There are various other things that Apple is telling us. It will read signage, for example, and identify objects.
Now if you’ve got a very powerful chip in this thing and they develop some of these things even further, they may say, look, you don’t actually need any kind of human assistance or any third-party app. We have got this covered for the blind community.
The second possibility is that they could do the ultimate, ultimate accessibility Sherlock and create their own visual interpretation service or buy one. [laughs] I’m not saying that I know anything about this. I do not. But imagine what would happen if they bought a visual interpretation service that’s out there, for example, and integrated that into their offerings, made it available in an affordable subscription model. That’s another way of getting around the issue.
And Rebecca, regarding your comment on the Envision smart glasses (and Envision are doing a fantastic job with those), I would observe, though, that they can’t last forever. The Envision glasses are based on Google Glass Enterprise Edition, which is now a discontinued product. This came up late last year, I believe it was. Maybe it was early this year.
And Envision rushed out a statement that essentially said, fear not, we have sufficient stock. We’re committed to developing on the platform. They say there is sufficient scope with the hardware they have for further software development.
And indeed, since the discontinuation of the Google Glass Enterprise Edition was announced, there have been more features added to Envision, including their ChatGPT implementation. So I’m certainly not suggesting that it’s in any way threatened, or anything like that.
But you just look at the Vision Pro (which really is not that much different in price, a bit more expensive than the Envision glasses), and all the additional capabilities.
But you’re right. I mean, the battery life is shorter. You’ve got this battery that you have to strap to yourself.
So there are some compromises. And I think in the end, what will determine whether people go for this or not is how many things can you get done with each respective platform.
The Bookshare discussion continues. Dan Teveld starts us off. He says:
“I understand your concern that Bookshare does provide a specialized preferential service to blind individuals, which sends the message that blind people need special treatment.
I disagree with you about its necessity. Some mainstream solutions like Kindle books are not completely accessible. I can’t count the number of times when Apple or Amazon break the ability to pan a Braille display when turning pages. Kindle books must also include features like Word Wise and automatic page flip to work with Braille displays. Amazon gave in to the Authors Guild to allow authors to choose not to enable the reading of books using synthetic speech.
Bookshare transcribes older books which are available only in hard copy.
Some people would argue that blind people should buy books from Audible and other commercial audiobook services. I use Audible, but would like to have the choice in listening to a book or reading it in Braille. A sighted person can choose to buy a hard copy or Kindle version of a book, or buy a commercial recording. We don’t always have that choice.
Mainstream book platforms lack the sophisticated navigation which is available in books transcribed by Bookshare using Daisy markup. Mainstream books also lack tactile maps, charts, and diagrams. Until we have complete access to mainstream book platforms and books, I see the need for specialized libraries like Bookshare.”
George White is writing in. Thank you, George, for being a Living Blindfully plus subscriber. He says:
“I’ve never been a Bookshare member, but I now have a unique opportunity to become a member as a South African citizen. I haven’t got around to filling in the form yet because I don’t know how long I will have free access to it.
While I completely agree with paying for services, as someone with little employment, it can be difficult to come up with a large amount of money for unnecessary expenses. Furthermore, due to currency conversions, things can increase in price. For instance, my Orbit Writer, which cost 100 USD, cost me 3,430 ZAR due to import fees.
If I had to choose a preferred reading service, I’d probably choose Audible. It has almost all books, though there are occasions when books, even those published by South African authors, are not available for purchase in South Africa. I also love how easy it is to exchange a book that you don’t like.”
Thank you very much for writing in, George.
Audible is a commercial audiobook provider, and it has a tiny fraction of the number of books that are published every year. So there’s no doubt that we do miss out on many, many books, even with a service like Bookshare.
Angela Griffith is writing in, and one of the points she makes pertains to Bookshare so I will include this here. She says:
“I wanted to offer corrections to a couple of statements made on episode 234.”
“1. Bookshare actually does purchase hardcopy books to be scanned, which are not already in the collection and are requested for school use.
2. Bose Frames most definitely are still available. Since the first 2 sets of frames were released, there have been 3 additional frames released, offering various levels of improved sound quality and extended battery life. Of course, more information about all of the models can be found on their website, and it looks like prices have even been reduced for those who may be interested.”
Thank you very much, Angela. They are pretty cool devices. I remember seeing the first Bose Frames in 2019 at CSUN. I ended up not buying them because I wear behind the ear hearing aids and I just really couldn’t make the Bose Frames coexist with them very well, but they are pretty nice.
Right. Back to Bookshare, where there is a Flor in the argument this week. It’s Flor Lynch writing in from Ireland, and he says:
“In Ireland, we have a similar subscription model to that of Vision Australia.
NCBI is the organisation for blind people running it. Once you’ve been registered blind, you are known to NCBI, so all you need to do is request the librarian there to add your name, email address, and a password.
You can go to www.bookshare.ie. The www.bookshare.ie collection went live in November 2019. It’s open to adults registered blind and resident in Ireland, and blind/vision-impaired students/professionals.
Wonderfully, it’s being funded by the government’s higher education department, so we don’t have to pay a subscription.
The books now are submitted by publishers, with the result that the output files in various formats are now of publisher quality. Thus, in some ways, it’s been better than a public library, even though in Ireland, these libraries are now free to borrow books from.
We haven’t mentioned that people with other reading disabilities can, and do also use Bookshare.
A few other points that may be worth making. There is a Kindle for PC, which, as far as Amazon Kindle’s policy allows, is accessible with JAWS.
Another ebook service is that of Rakuten Kobo, and you can go to www.kobo.com, where the output format is usually DRM-protected EPUB, and is usually readable with a screen reader or via the Kobo iOS app. Although, I don’t know how accessible that is.
Not every book is available on Bookshare, so there’s as much room as a book’s publisher wants to make available through ebook services such as Kindle and Kobo for our purchase.
Finally from me for now, I’m a Living Blindfully plus subscriber.”
Thank you very much, Flor. I truly appreciate your contribution, both to the show and to the continuation of the podcast.
Some reaction to the interview with Shir Ekerling, the Chief Executive Officer of AccessiBe. Robin Kristopherson writes:
“Thanks for the interview with AccessiBe. I just wanted to flag the below, in case you felt it added any context to the timing of their apology or the story in general.
The overlays are still very much out of favour, so any progress the likes of AccessiBe may have made in the way they position, market, and implement their solutions has obviously been insufficient to allay fears among the experts.
One question I’d love you to put to the listeners is whether today’s implementation of AccessiBe is still proving detrimental to screen reader users.
I’ve now moved away from Podbean because until recently, it all but broke my ability to publish a podcast with the overlay enabled, whilst without it was very doable.”
For those not familiar, Podbean is a podcasting host. It’s one of those technologies that you can use to distribute your podcast to the world. And they’ve been around for quite a long time.
I can confirm this, Robin, because I used to be on Podbean as well. And every so often, I check in to see what accessibility is like on there and what services they’re offering.
And we see this where if you have AccessiBe on a website and you enable it, you end up with a detrimental experience, in some cases. You actually end up, in some cases, with a dysfunctional experience.
And it’s interesting that Podbean is still a problem because when we did that mammoth 3-hour episode on accessibility overlays, I specifically called out Podbean when we were talking with Michael Hingson, because it was a debacle then, and based on what you’re saying, it’s been a debacle, at least until very recently.
And that’s why I found it interesting that when I asked Shir Ekerling to give me a website that we could all go to that demonstrated the benefit of these overlays, the only website he offered was their own, a kind of demonstration page. You can code things in certain ways to make your demonstration page look great.
But I am interested in this question, too. I mean, are there now examples out there of websites using accessibility overlays (not just AccessiBe, because there are other ones out there as well), where if you turn them on, you get a demonstrably better experience than when you don’t have them enabled? Because I can’t think of one, personally, in my own web travels. In fact, just personally, when I find a website that has an accessibility overlay on it, if there is an alternative company that I might use, I use it. And I made this point to Shir in the interview.
There’s actually a backlash. This accessibility overlay thing is working in the wrong way, because I know there are blind people who think, “Look. If there are these companies that can’t be bothered doing accessibility right and think there’s a quick fix, if that’s how they treat blind people, then I’ll take my business elsewhere, thank you very much.” And there is a bit of this going on. I’m not overstating it.
I’m not saying that there’s a massive groundswell because I have no proof of that. But it’s certainly what I hear from some people, and it’s certainly what I think. If I go to a company and it’s got an accessibility overlay on it and I can go to another company that’s fully accessible and doesn’t, I will always choose that other company.
Now the below that Robin refers to in his email is a statement from the 17th of May, and the subject is EDF and IAAP EU joint statement on accessibility overlays. We will decrypt those acronyms as we move through. It says:
“End users with disabilities and accessibility experts are increasingly worried by the fact that public sector bodies across Europe have started to use fraudulent and/or harmful so-called overlays on their websites, making the interface less accessible.
This is why a joint statement on overlays has been published today by the European Disability Forum (EDF), and the International Association of Accessibility Professionals (IAAP).”
Let me read to you the statement in full. It reads:
“Digital accessibility is a fundamental part of an inclusive society. Accessible websites and online services ensure that people’s individual preferences and needs can be met, using mainstream devices, operating system features or the user’s chosen assistive technology and browser plugins.
Artificial intelligence and other emerging technologies have great potential to improve accessibility and assistive technology. However, to date, none of these technologies can instantly fix an inaccessible website.
There are several technology products on the market claiming to improve website accessibility. Some of these products are called ‘accessibility overlays’.
It can be difficult for people who are not digital accessibility specialists, including users, developers, designers and website owners who purchase technology, to understand what accessibility overlays can and can’t deliver, and the limitations and potential risks of any new technology.
An accessibility overlay is a type of technology that enables users to make changes to the way content appears on a website. Accessibility overlays add extra functionalities to individual websites, such as text-to-speech, contrast, magnification, or the ability to control colour to improve readability.
While some of these functionalities can help some users, most are redundant as they are already available in browsers or provided by users’ chosen assistive technology.
Other types of accessibility overlays try to automatically ‘repair’ accessibility problems directly on the website when it is used. However, this is not always reliable and can interfere with assistive technology.
It is not acceptable for overlays, plugins or widgets to impede access to users’ assistive technology, choice of browsers and/or operating system features.
Accessibility overlays can cause significant problems:
Users of assistive technology already have their devices and browsers configured to their preferred settings. The overlay technology can interfere with the user’s assistive technology and override user settings, forcing people to use the overlay instead. This makes the website less accessible to some user groups and may prevent access to content.
Most of the functionalities offered by accessibility overlays are copying built-in mainstream features that already exist in today’s browsers and operating systems. There is no point replicating these features with an overlay. Making users aware of mainstream features would be a far more efficient way of providing support where needed.
Some overlays automatically detect when an assistive technology is running on a user’s device. This can be an issue if there is no ability to opt out of such tracking, as assistive technology is often associated with a disability, and information about disability is sensitive personal data.
Website owners who are not digital accessibility specialists may be led to believe that overlays can ‘fix’ the accessibility of a website, which is not the case. Overlays do not make the website accessible or compliant with European accessibility legislation. They do not constitute an acceptable alternative or a substitute for fixing the website itself.
EDF and IAAP agree that innovation should be encouraged but it should always be carefully developed.
Digital content accessibility is not the responsibility of the user, it is the responsibility of the website owner. User needs should be met by ensuring that code on the website is compliant with existing laws and digital accessibility standards.
We therefore strongly advise public and private sector buyers of technology to actively engage with digital accessibility experts, persons with disabilities, and their representative organisations to understand user needs and how these can be met.
Only this will ensure that content is accessible on all devices and compatible with any assistive technology. No technology should be added to a website if it is likely to prevent access for some users.”
So reads the statement from the European Disability Forum and the International Association of Accessibility Professionals.
Got any comments on this? Let me know. opinion@LivingBlindfully.com is my email address. Attach an audio clip, or write something down if you would like to. You can also use the listener line in the US 864-60-Mosen. That is 864-606-6736.
We’ve been talking about ad blockers on the podcast recently, and Christopher Wright has some advice on this subject. He says:
“I use Brave on Windows, and it’s great! It automatically blocks ads everywhere.
And also, has settings to get rid of those annoying Facebook, Twitter, and LinkedIn embedded frames on pages. I think it can also hide the sign in with Google prompts as well.
It also works well on macOS, and I haven’t gotten nearly as many “Brave busy, busy, busy” messages.
I don’t know how ad blockers work with Safari on macOS anymore. I think you download extension apps from the Mac App Store just like on iOS, though I quit using Mac around Big Sur because mine is no longer supported for future updates, and I’m still waiting for Apple to continue making significant VoiceOver improvements in future releases before even considering new Apple Silicon. If and when I do, it will most likely be an M3 or later chip.
Never go with first generation anything.
If you want free blockers on iOS, try AdGuard and Firefox Focus. I’m not sure what the difference is between free and paid solutions. Maybe the paid ones block more. Download, install, enable and forget about it. If you have automatic App Store updates enabled, you’ll never have to worry again.
Brave is also available on iOS and Android, and I believe it blocks ads just like the desktop versions.
I haven’t tested the mobile versions that much, but this is particularly useful on Android, where there aren’t any other ad blockers, to my knowledge.
I don’t bother with other browsers on iOS because there’s no point when Apple forces developers to use WebKit, the same engine that powers Safari. Hopefully, this will change if the rumours about Apple being forced to open the platform to third-party browser engines in iOS 17 are true.
I think the Blink engine (which powers Chromium browsers), is faster than WebKit, so we could potentially see a large speed boost. Of course, correct me if I’m wrong.
For other browsers like Chrome, Edge and Firefox, I recommend installing the uBlock Origin extension. This is particularly useful on Chrome OS as it automatically blocks ads every time I sign in with my account.”
[sings] “Hey, good-looking, what are you cooking?”
It’s funny how you get songs in your head, and you’ll see why I got this in my head in a moment.
We got an email from Anexis Santos, who writes in and says:
“I hope you’re doing well.”
Thank you, Anexis, I am doing so well. I’m positively dangerous.
“First, thank you for the book recommendation. I currently have the WWW trilogy on my Bard wishlist.”
Oh, that is such a good series.
“I can’t wait”, says Anexis, “to read it.
After listening to the interview with Adrian Spratt, I added his book to my wishlist. I enjoy literary fiction sometimes. And based on the description, I have a feeling I’ll really enjoy it.
I found your conversation regarding blindness representation very interesting.
Unfortunately, you can’t please everyone when it comes to representation. I think there will always be something readers find they don’t like. And because blindness is a spectrum, you can’t represent everyone. Also, experiences vary, no matter your vision.
However, you can always represent yourself. Your experience, that’s enough.
Speaking of books, do you have any good recommendations for cookbooks? I’ve been teaching myself how to cook for some time, and I’m interested in new recipes. I haven’t had a great experience with blindness services teaching me to cook. If you have any other resources for learning to cook, I’d appreciate those as well.”
Ah. Now this is an interesting question, and I hope we get some answers here. I am not a great cook. When I was living by myself, you know, between marriages and all that sort of thing, I did get better at it. But it’s never going to come particularly naturally to me.
I did realize, though, that cooking is a bit like audio production. It’s kind of fun the way you can add different ingredients, you can vary the recipe, and I kind of got into it a bit. Bonnie probably wishes I would get even more into it, [laughs] but it’s just not something that I enjoy.
So hopefully, somebody can give you some tips on this in terms of resources that help a blind person learn how to cook.
I would imagine, like a lot of these things, just getting in there and making mistakes and treating those mistakes as learning experiences would help considerably.
“Second”, says Anexis, “I wanted to mention my excitement over the fact that Apple will eventually release a flip phone version of the iPhone. I grew up with flip phones, and I love that they’re coming back. I’m a fan of small phones. Flip phones are very convenient.
I originally planned to get a new phone soon, but I’m going to wait until Apple announces their new line of products for 2024. Maybe I’m just being very hopeful, but it’s possible for their flip phone to be announced.
Third, I planned to get the Voice Dream Reader app. But by the time I looked into it, it had started the subscription model.
I was planning to use it as an editing tool. I use a combination of Braille” (with an uppercase B) “and text-to-speech to edit my books. My last edit is usually for proofreading, so I use text-to-speech to catch typos I may not catch in Braille, since I’ve read it so many times.”
That’s a really interesting point, actually. Having written many books, I find that you get so familiar with your own content that you miss bleeding obvious mistakes. It’s incredible.
And then, you hand it over to a proofreader. I used to have a series of people who would proof my books very kindly for accuracy and that kind of thing. And the things they would come back with that were so obvious that I’d missed because I’d just read the book so many times, it’s astounding.
Anyway, this email continues.
“Because I don’t plan to use the app a lot, paying for a subscription isn’t worth it for me. I still would have gotten the app if, instead of a subscription model, the Voices had to be purchased instead.
I plan to try using Dolphin’s EasyReader. Can you tell me how accessible it is to import your own files to the app?
Also, does the app have familiar text-to-speech voices like Heather?”
I don’t know, Anexis. I suppose I, too, should get a bit more familiar with Dolphin EasyReader. But I just like Voice Dream Reader so much. So the only time I’ve used Dolphin EasyReader is for reading some content from our local library because it integrates quite well with that.
So perhaps somebody more familiar with the product can help you out.
More on the SensePlayer and the Victor Reader Stream.
Now, this email from Rich Yamamoto, who says:
“I’m writing to give my impressions of the SensePlayer, which has been a hot topic of discussion, along with the third generation Victor Reader Stream on your podcast. I currently own both, and must say that I find myself using the SensePlayer a whole heck of a lot more than the Victor Reader Stream at the moment.
I really like the way that the SensePlayer handles playing content. I like that the players for media, DAISY, and documents are separate. And I like that the players will play the last file that was in the queue from before I exited the program in question.
However, I will say that I don’t like that the SensePlayer doesn’t hold your place in a particular playlist in the media player. It allows you to store the position of a single file, but not the entire playlist. This makes it weird when navigating between, say, songs in an album or chapters in an MP3 audiobook.
Sometimes, when clicking over to a file, it plays in the middle of it instead of starting at the beginning, and it’s very difficult to get the file back to the start. Or rather, it’s difficult for me to do it. I could be doing something wrong, of course.
Other than that, the Smart Connect features are very cool. I like that I can control my iPhone from the keypad on the device instead of using the screen.
Is it something that I would use on a daily basis? Well, that’s debatable. But is it something I find incredibly useful? Absolutely it is. It’s cool that I can record phone calls, type out text messages in T9, and essentially, not use the screen for almost everything I’m doing.
Finally, (and I have absolutely no idea who will care about this), I love that the power on/off sounds have been updated, remixed, and remastered. It’s a small thing, but it always puts a smile on my face to hear those lovely, happy, HIMS-specific noises when I power my device on and off. Never gets old.
Overall, I’m really impressed with my SensePlayer. I tried to hold off on purchasing it. But once I found out that it was shipping in the States, my impulsiveness got in the way.
I think that part of the reason why I’ve always gravitated towards products from HIMS is because my late grandfather, Dawn Bishop, was an avid user of their services and devices. I’ve always been more comfortable with them, and I feel like he would be proud, knowing that I’ve got my hands on one of these beauties now.
If anyone is considering purchasing the SensePlayer, I would say to definitely do it. While Bard still isn’t supported at this time, I know this will be soon rectified in a future update that should be coming out very soon.”
Thanks for the email, Rich.
And if I may say, I think your grandfather would not only be proud that you have a SensePlayer. I know that he’d be proud that you’re carrying on the legacy of being so articulate and capable with your technology. You are definitely, as they say, a chip off the old block.
And I remember Dawn with a lot of fondness because he and I were online from the very early days. I remember corresponding on many occasions with Dawn over the years. He also loved his Window Eyes.
He was a great guy, and there’s obviously a lot of him in you. So well done!
But wait, there’s more, because Rich has now phoned in with his impressions of his Victor Reader Stream 3.
Rich: Hey, Jonathan. This is Rich Yamamoto from Andover, Kansas, and I wanted to give my impressions of the Victor Reader Stream.
I like it. I’ve always been a Victor Reader user since I was younger. I’ve had the first generation, second generation, and now third generation. And I’ve played with the Trek, and I like it to a point.
There’s a lot of issues with stability at the moment in regards to things like podcasts and other Wi-Fi features.
When you try and copy podcasts with more than 10 episodes, or even less than 10 episodes that are downloaded to the internal storage, if you try to copy those to the SD card, sometimes, I don’t know why, but the podcast app will fail and crash and cause the copy process to stop in the middle. And I learned the hard way that this corrupts the data on the SD card and internal flash disk, which is problematic at best.
So it was time-consuming to fix those issues because I basically had to redo my SD card from scratch. Doing that made me very unhappy.
There are some instabilities with the voices. So in version 1.1, you can now download and use various Acapela voices on the Stream, which is great. But they can be, and tend to be, slightly unresponsive. Or rather, very unresponsive. So typing things in, or navigating the system, can be really sluggish.
Actually, it reminded me of using my 2010 MacBook Pro 32-bit with 4 gigs of RAM on macOS Catalina. And I would not wish that upon anybody.
The playback, though, is very good. The playback is pretty stable, at least, audio playback. I haven’t really tried TTS stuff yet, but it’s pretty stable.
And I do like that you can use podcast chapters in version 1.1 when podcasts work.
As opposed to the SensePlayer, I feel like there’s a lot of other instabilities with the Victor Reader, which is really annoying and very unfortunate.
But I’m optimistic and hopeful that these issues will be fixed in version 1.2, when they hopefully add Audible support and TuneIn support, because yeah, they let that slip on accident on the Victor Reader list.
But compared to the SensePlayer, I would say that the Victor Reader is a lot less stable. The SensePlayer had a much more stable release, at least here in the States it did. And I don’t know why that is the case. I don’t know if Humanware rushed it. I don’t know.
And I would be really interested to hear if anybody else has had these issues with the Victor Reader Stream. To be honest, there’s been a lot of negative stuff going across the Victor Reader list, so I haven’t been looking at it very much because I hate playing mediator for them. Because my thing is, if somebody doesn’t say anything, who will? So I’d be interested to hear from anybody on the podcast if they’ve had these issues.
Jonathan: Ah, the joys of being an early adopter, eh?
“Hello, Jonathan and Living Blindfully listeners,” says Scott Erichsen in Australia.
“I hope everyone is well.
I was just browsing the Executive Products website, and noticed there is now an Executive Products case for the SensePlayer, available at $92.95 plus shipping.
I’ve ordered one, as this will protect the device extremely well when traveling.
There is also one available for the Victor Reader Stream 3 at the same price.”
Now, that is a very long link. You can search the [ExecutiveProductsInc.com](https://executiveproductsinc.com) site for it. I will do my best to remember to put a link in the show notes.
Scott continues, “Both devices have been discussed widely on the show, and the topic of no case for the SensePlayer has come up, particularly in episode 224 where Earle Harrison was interviewed, and when Vaughn Benison conducted his comprehensive review of the device.
Humanware also do not supply a case for the Stream 3. However, the Executive Products case for the Stream 3 has been out for a few months now.
The Executive Products case for the SensePlayer apparently started shipping on the 21st of May.
Both devices have great features and possibilities, particularly the SensePlayer, with the possibility of a screen reader, or at least, the ability to use extra apps later in the year.
That said, Humanware are working on some good features and updates, as was detailed in episode 212.
Come on, Audible, support these players!
I find I use the SensePlayer more now because of the OCR and SmartConnect features. But I also think the audio reproduction on the SensePlayer is louder than the Stream 3. This could be speaker placement, or other factors.
But one thing to note is that the SensePlayer is certainly better at handling large SD cards. The SanDisk 1TB card I have for my Stream 3 is hardly ever recognized, if at all, whereas the 1TB microSD card I have in my SensePlayer has no problems at all. Both cards are brand new. Humanware is aware of the issue.
In episode 228, Jenny Axler from HIMS indicated that if you want to get a manufacturer’s attention, contacting them directly or through the dealer network, if they use one, is the best way to have your voice heard and your problems addressed.
I’m happy to say I contacted my local dealer for HIMS, Pacific Vision, about the fact that if you have an audiobook as one long MP3 file, such as could be converted from Audible or another source, or your own library of archived cassettes, etc., it refuses to play on the SensePlayer. By a long file, I mean 700 minutes. I provided a few books that didn’t work as a Dropbox link to illustrate the problem.
I have been told this has been fixed, and it was fixed within 24 hours.
I personally have not seen the fix yet, but that is serious dedication and commitment from HIMS.
I look forward to seeing what both HIMS and Humanware do in their upcoming firmware updates for their respective devices.
Thanks, and have a great week.”
Thank you, Scott, for that informative message.
Caller: Hi, Jonathan! My name is Linda, and I’m calling from Lancaster, Pennsylvania. I’m a new listener.
I have a second generation Victor Reader Stream, and I was wondering, is it possible to put Audible on your Victor Stream? And if so, how would I go about doing that?
Also, I have Bard on my Victor Stream. And sometimes, I get a thing on my Victor Stream that will say I have 1 book, but it’ll say book error. I’ve tried deleting it and everything, but it’ll come up and say book error. But I don’t have any books on the particular Stream when I downloaded it.
Also, you were talking about songs that freak you out or that give you nightmares. I don’t necessarily have a song, but certain sounds, like I used to have, if you remember, one of those Vox Clocks with the little buttons and they had the alarms and the volume. And there was one alarm I could not use because of the sound, or whatever. I didn’t know the certain volume. It would really scare me.
And if you remember the show Battlestar Galactica and the Cylon voices, I couldn’t watch that show because the Cylon voices just give me nightmares.
Jonathan: Stay away from that show then, Linda.
Great to hear from you. Glad you’ve discovered the podcast, and thank you so much for taking the time to listen and to call in.
It is possible to use Audible with the Victor Reader Stream second generation. Somewhere in the podcast, we have talked about this. And I did a quick search and couldn’t immediately find where it is because we’ve done so much talking of late about the Victor Reader Stream and the third generation. But memory says it’s something to do with the Audible companion software.
But what I would do is call Humanware and see if tech support can help you through that, because it sure is possible with the second generation.
The third, at the time of recording this podcast, is still a work in progress.
Audible are talking to Humanware about supporting the third generation. But since you have the previous one, you should be able to make it work.
And I’ll leave others to comment on the Bard issue that you identified because I don’t own a Stream, so don’t have any experience. And of course, I don’t have Bard either, being in New Zealand.
So let’s see if anyone can help. opinion@LivingBlindfully.com, if you can help Linda out. 864-60-Mosen is the number in the United States. You can call like Linda did. 864-606-6736.
Voice message: Hello, Jonathan and all you Living Blindfullians. Hmm. Living Blindfullians. I don’t know.
Anyway, hope you’re having a good day whatever you’re doing, if you get to listen to this.
I just wanted to say that I loved hearing the Pennsylvania 65,000 call on episode 229. That was awesome.
The voicemail sounded exactly like something you would hear in a good sort of eerie scene of like one of those apocalypse movies, right?
So you call an iconic institution of some sort, and it says, “This office is no longer functioning,” or something like that. [laughs]
And then, we heard the ringing of the telephone and I thought, “Is there a telephone sitting on a desk somewhere in the Hotel Pennsylvania office? Like, what is the Hotel Pennsylvania now? [laughs] Like, does nobody ever go there?”
And you just kind of hear this phone ringing. I mean, we heard the phone ringing, but did the phone actually make a sound? I mean, there’s that whole philosophical question. If nobody actually heard it ringing in the actual office, I mean, it was really creepy and interesting.
Just a few little thoughts there from me on episode 229. I have to go finish listening to it now.
Jonathan: Good idea.
That’s Sara Hillis with those musings.
Yes. You can check out anytime you like, but you can never leave.
Let’s check in with Devin Prater who says:
“I’ve had a Focus 14 for the last few years and really started loving it until the spacebar broke. I think something about the spring inside it broke because it’s stuck in the down position, and I can’t get it to come back up.
I don’t even want to know how much it costs to get it repaired, and don’t want to see how long after repair it’ll be before I have to send it in again.
Now, about Braille on mobile devices. When I read Braille with an uppercase B, I have to say”, (Good man!), “I feel better. I feel like I’m interacting with the content eating with a fork and spoon, rather than having it stuffed into my face, as it were.” That’s a lovely analogy, I must say. [laughs]
“The iOS Braille catastrophes and Google’s lack of support for my NLS e-reader, which is the only working display I have right now, mean that I’m just going to have to learn to use JAWS with Braille.
Hopefully, Apple fixes that bug or Google really puts time into Braille support. I really hope both companies listen to more of the community, and not just like 20 trusted testers where 2 of them use Braille occasionally.”
And Reginald George writes,
“I have removed all personally identifiable information from this email.
However, a vendor tells me that he has had 3 clients with Braille displays with intermittent charging port issues as described below.
It is charging on the FS USB cord I provided him with. When it is connected via USB to the computer, it goes in and out/disconnects easily.
My thinking is this, the charge port is sort of working, but not completely. It is working enough to charge, but there are issues to do with a full connect.
Connect it over Bluetooth, and just charge it using the cord. That may resolve things right now. But the port needs looking after, or it may reduce even further in functionality.
The problem is with the cases and the layout of the Focus devices. The charge port is hard to get at, and I have 3, yes, 3 clients who have had similar issues.
One workaround was to get a short dongle that plugs into the display, and sticks out about 1.5 inches, and gives easier access to the port. This way, there is no undue force on the charge port with people pushing on an angle.
Intermittent problems like this can cause repair technicians to wrongly assume that there is no issue with the device, and send it back without replacing the offending part. Tech support representatives also rightly question these claims, as at least 50% of the time, it’s the computer’s USB port that has the problem, or the cord itself. So one must be articulate, persuasive, and have tried everything before sending in a device with a partially functioning port, especially if said device is being shipped out of the country.
I have not heard a definitive answer to that question, and whether this is a temporary or permanent change.
While on the subject, there are so many things I love about the Focus. Braille study mode for new students, for one, being able to invoke that on the fly. The quality of the Braille and the way it feels smooth between the cells. The instant switching between devices.
I never liked the keyboards on FS Notetakers.
I liked my BrailleNSpeak keyboard. It had good key travel, was quiet, and the keys felt like they had some give when they hit bottom. They weren’t so clacky.
I realise these are all subjective feelings. It will be quite interesting to see what the next generation of Braille displays bring us if they ever arrive.”
Now, an email from Freedom Scientific.
“My name is Mark, and since January, I have been the product manager of Blindness Products within Vispero.
For several years now, I have known you as a podcast maker and an advocate for Braille” (with an uppercase B, which is always good to see), “and accessibility.
Of course, it concerns me very much to hear reports of quality issues. As a Braille user myself, I understand all too well how much impact it has on your professional and personal life if you are dealing with a Braille display that is not reliable, especially if you must send your unit in for repair.
The good news is, I believe these quality issues are long behind us. Vispero adjusted our manufacturing processes a couple of years ago, and the failure rate is now much more in line with what we would expect: very low single digits.
Please keep me informed if you continue to hear issues, so we can take corrective action with any legacy users. The failures were typically experienced between 12 to 24 months, so we also believe we are largely clear of legacy issues, too.
Regarding repairs, because we are in a transitional phase with regard to our repair policy, Braille display repairs will temporarily take place in the Netherlands. However, this has no consequences for our customers in the US, because we will send them a replacement unit so there is no reason to worry about longer repair times.
I hope I have informed you sufficiently. Of course, I am always available for questions or comments.”
Thank you very much, Mark!
That’s an official statement there from Vispero regarding the Focus issues that we have been hearing listener feedback on for some time now on Living Blindfully. As always, when a manufacturer chooses to take the time to go on the record and respond, I truly do appreciate that.
Advertisement: Living Blindfully is brought to you in part by Aira. I thank them for their sponsorship of the podcast.
You know we’ve become used to some businesses offering free Wi-Fi. It’s a nice touch, and it makes us feel valued whenever we come across it.
And I know similarly that when I learn about a business that has purchased Aira Access, it’s like putting out a massive “Blind people are welcome here” sign. I know that if I need it, I’ve got a trained professional agent available to me to provide assistance, and that means that the business cares enough to pay for that. I appreciate that.
From airports, to Starbucks, to Target, and more, Aira Access can assist you to navigate, shop, browse and not be reliant on friends, family or others who may not understand our needs as well. And don’t forget that as well as the offerings in physical locations, there are other businesses providing Aira Access that can be used from home.
So you should check out the app to find out what’s available. To do that, just open the Aira Access section in the Aira Explorer app on your favorite device. You can also visit Aira’s website to find out more at Aira.io. That’s A-I-R-A.I-O.
If sighted people have access to a piece of information, it’s only fair that blind people have access to that same information in an accessible format.
And today, we’re talking about the information provided by traffic lights. Surprisingly, this is still an issue in parts of the world in 2023.
Making inaccessible traffic signals accessible is one of the tasks that the Oko app is seeking to address. And to tell me all about the app, I’m joined by Willem Van de Mierop from Belgium. Welcome to you, Willem. Good to have you on Living Blindfully.
Willem: Hi, Jonathan! Thanks for having me.
Jonathan: What inspired you to develop the Oko app?
Willem: Myself, I’m an AI engineer, so computer science, and my 2 co-founders are as well.
And during COVID, we started talking with a mutual friend of ours who mentioned that because of COVID, he could go on less walks with his friends or, for example, not ask people for directions. So as a blind person, it was harder for him to go on his walks and just be out.
And so we started thinking as AI engineers, if these autonomous driving cars can drive at 60 miles per hour on the highway, why can’t we help out a person at 3 miles per hour on the sidewalk? Or even with a guide dog, sometimes even faster, but similar techniques, of course, can be used to help people with outdoor navigation.
Jonathan: We don’t talk often enough, I think, in the mainstream about the impact of COVID on the disability community. And I read some data recently that suggested that it’s been shown that blind people are 8 times more likely to catch COVID than the general population because the amount of contacts they are having with surfaces and the proximity to people tend to be greater.
Willem: Yeah, I can imagine that. Yeah.
So with fewer people around, and fewer people to help him out with his walks, we decided to create kind of like a hacky device at the beginning with a camera, some wires, a very big computer, and then go out and test it with a few people. And people loved the idea of really using AI to help with some of their biggest challenges.
But we quickly realized if we want to make this affordable and scalable for everyone, we need to be able to run it on the devices that they already own. But that’s a very complex task because AI algorithms, especially those that help out with outdoor navigation, those are very complex algorithms. If you want to run them on a device, then they need to be optimized. For example, for a smartphone.
The moment we were able to make very accurate algorithms that were optimized for a smartphone is the moment that we decided to quit our jobs and go for the company full time.
Jonathan: You mentioned navigation and the use of AI. It’s an area that’s developing.
And I think there’s an area where there’s a lot of human resistance. I’m not sure whether the human race quite wants self-driving cars in general. Blind people do. Or at least, some do. But a lot of people are saying, I don’t want to hand over my ability to drive to an AI.
How close do you think we actually are to seeing these self-driving vehicles being widely available on the car lot to buy? And do you think that ultimately, the problems are going to be human adoption or technical that might slow this down?
Willem: I think AI is definitely catching up. I don’t think we’re as close as Elon Musk likes us to be.
There are definitely techniques where you can, for example, memorize a certain city. It’s called mapping, where you do, for example, as Waymo, where you try to stitch an area together to pinpoint, “Okay. Where exactly am I?”
To be able to make a super smart, generalized AI that is able to cope with all of these different tasks of outdoor navigation is a very complex task, but we’re definitely getting there. So AI is evolving at a rapid pace. And by using these state-of-the-art algorithms, you can create these kind of autonomous driving cars.
But I think that in the beginning, they will be good at certain tasks. For example, staying in the highway, even in traffic jams, going into different lanes. It’s only when you’re, for example, going to cities that it becomes much, much harder, I think.
Jonathan: Do you think that people will own these vehicles? Because the other scenario that I’ve heard is that it might be kind of like an Uber app that we all have where when you want a vehicle, you summon one, and they’ll be idling around waiting to be summoned, and you will just get into the vehicle and go.
So this argument goes that the concept of car adoption is something that will go by the wayside, ultimately.
Willem: I think that’s a very good way of adapting to the new technology.
So indeed Waymo, that’s one of the applications that already is active actually, where they have memorized a city. And then, they try to navigate autonomously using that technique.
But I think another way that’s very interesting is by using freight transport. Those are pretty easy things if you stay on the highway on its own.
So I think those are very easy ways of adopting this new technology.
But in a sense, some people like to drive. Some people hate to drive. So I think the adoption will differ based on that, of course.
Jonathan: Going back to the Oko app, it sounds like the things you can do will differ depending on where in the world you live. Can you talk with me a bit about the different functionality it offers?
Willem: Yeah. So the application uses the back camera of the phone and then the computing power on the phone to translate visual information into audio, haptic, and visual feedback for the blind and visually impaired user.
Our algorithms all run locally on the smartphone. So that means no Wi-Fi, no cellular connection required. You can run it in airplane mode if you like.
And we do two types of things already. So the first thing in Belgium is that we recognize the pedestrian traffic light. We inform the user of the red and the green light, but we also help them to recognize the destination of the bus and the tram. The way that we do that is we read out the destination that is displayed at the front of these trams or buses.
In the US, we only recognize pedestrian traffic lights at the moment.
Jonathan: So I hear about the “Walk” and “Don’t Walk” signs in the United States, but I don’t think many other countries have those, do they?
Willem: No, that’s exactly right. So that’s why the team actually had to create a whole new AI model to be able to understand these different traffic lights.
So in Europe, we have a red man and a green walking man, and those are vertical traffic lights. Those are stacked up on each other.
In the US, it’s one square traffic light that changes between a red hand, a white walking man, and then a red hand with seconds beside it. So that basically means the don’t walk state, the walk state, and then the countdown state.
To be able to get the AI to understand all of these things, we had to teach it from scratch what a traffic light looked like, and that’s specific to the US market.
The nice thing about the European market is that, more or less across the world, traffic lights are very, very similar. So that means that we can quickly expand into different areas and, that way, easily use that general ability of the AI to go into different markets.
Jonathan: Yes. Your traffic lights do sound like our traffic lights.
Although in New Zealand, at least in most of the major cities that I’m aware of, almost every traffic light is accessible because it’s making a sound, and many of them also have a tactile pin that pops out when it’s time to cross the street.
Is that a common thing in Europe as well, the audible signals?
Willem: I can say that you’re pretty lucky that you have all of your traffic lights installed.
Generally, in Belgium, about 15% of the traffic lights are equipped. In the US, it can be much, much worse.
For example, in New York City, it’s only 3% of the traffic lights. In Chicago, it’s only 1%. So it really differs from city to city. But most of the time, it’s very low numbers.
Very important to say is that the application actually does more than just tell you the status of the traffic light. Because the application runs locally on the device, you can imagine holding it against your chest with the back camera pointing outwards. The moment you rotate your chest within 10 to 15 degrees and receive that feedback, you instantly know, “Okay, that’s also the direction I want to go in.” And that’s something that’s very useful on these big, big wide streets in the US, where we stop people from veering off into traffic, because the moment they veer off into traffic, we become silent. And then if they reorient, then they know again, “Okay, this is the direction I want to go in.” So that’s something very important about the app.
And as well, you can imagine if you have these audible speakers or accessible traffic lights installed, but not at every intersection, and sometimes, they’re very badly maintained, so not working, or just badly installed. That means that by default, it becomes a service that you can hardly rely on. So if you go out, will there be an accessible traffic light? Will there not be? Will it be working? Will it be installed right?
That’s the beauty of our app. It just works at every intersection, regardless of physical infrastructure. And that becomes something that you can rely on.
Jonathan: Yes, it’s extraordinary to many blind people outside of the United States that the whole question of audio signals has become this major philosophical, political debate. [laughs] So that might account for the very low numbers of accessible signals over there.
How do you train people to make good use of this app? Because I imagine, as the developers, you must be a little bit nervous about liability issues if the app’s sending an erroneous signal that it’s okay to cross when in fact, it is not.
Willem: That’s definitely true. It’s important to note that in Europe, we’ve already helped cross 350,000 streets so far, and we’ve never made a mistake between red and green. So that’s always important.
We have multiple safety measures in place to not make these kinds of mistakes. But there is a way that we teach people to use it. And what’s very important is that we say that we’re an extra addition to everything that they’re taught during orientation and mobility, or just everything that they’re used to; we’re just trying to bring back more information so that you can make an informed decision.
How we usually recommend people to use it is that you take out your phone, hold it against your chest with the back camera pointing towards the direction that you want to go. You then rotate your chest between 10 and 15 degrees. And the moment that we see that pedestrian traffic light is the moment that you’re instantly going to get that feedback. And the feedback will be vibrations, audio, and a visual overlay. And that way, you know, “Okay, this is the direction I want to start going in.”, if the walk signal is on.
The don’t walk signal would be a slow vibration, slow sound. The walk signal would be very fast vibration. And then the countdown would be kind of like an intermediate sound.
Jonathan: Obviously, hands are at a premium for blind people because you’ve got a white cane or a guide dog harness in one hand, and that only leaves the other hand free.
Have you investigated gadgets that might fix the phone to someone’s chest in the right position like a harness or something like that?
Willem: Definitely. So we understand that hands are indeed at a premium for blind and visually impaired people because they have the white cane, or maybe a guide dog, or maybe both. We do see that some people use just the one hand to take out the phone because it’s so useful to get that extra information. And we do see some people creating special kinds of harnesses.
So for example, we’re testing in Japan, and there, some people have special leather cases that let the phone hang loose around the chest area with the back camera pointing towards the direction that they want to go. There are also lots of cases with a cord attached to them, or a lanyard. These are kind of fashionable in Europe, actually. And people use those things all the time.
There is always a possibility that in the future, we might implement the software on some kind of hands-free device.
At the moment, unfortunately, that’s not possible yet because we have 3 very important constraints.
We need to have a very powerful camera, a very powerful battery, and a very powerful phone to be able to run all of these algorithms on. And the current state of the art of, for example, Bluetooth cameras or wireless devices is not powerful enough yet. So it may be that we are implemented in these things later on.
The reason that we really focused on the smartphone as well is because you don’t need to carry another thing on your walk. You don’t need to charge another thing. You don’t need to buy another thing. You already have the phone, and you install the software on it.
Jonathan: I used to be involved with a company that produced an app (or still does produce an app) that helps blind people to navigate a wide range of environments.
And so one of the things I came to understand as part of that process was that there are some blind people who are quite concerned about taking their phone out in public in these sorts of situations, and having it snatched off them. And for some reason, the country where I got the most overwhelming amount of feedback about this was the UK. I have no idea why that is.
But is that also feedback that you’re getting, and how would you suggest people cope with that?
Willem: Yeah, there is some feedback that we are getting. We were quite stunned, actually, to get that feedback in the beginning.
Some people use, as mentioned, a lanyard case, or small little wires to keep the phone secured around their wrists or just around the chest. So that could be some way of coping with that fear.
But in general, that’s unfortunately a problem with society that we, as a company, of course, can’t change. That is something that’s too bad to hear, of course.
Jonathan: Can you tell me a bit more about the looking for buses and knowing where, I guess, a bus or a train is going? Is that something that you hope to roll out elsewhere?
Willem: Yeah, definitely.
So all the features that we have will be expanding across the globe. And more features will be added down the line as well.
You can imagine: we’re using that camera, we’re taking in visual information, and we can translate so much information into audio and haptic feedback for the blind or visually impaired user.
The buses and trams will definitely be expanding as well. It’s just a matter of seeing, for example, how similar the buses are, how similar the destination displays on these buses are. We just need to look at different markets and examine the similarities.
But also, of course, start testing before we fully launch there. That’s something that we did here in Belgium as well. We did it in the US as well. We truly believe in being close to the community, first receiving lots of feedback about these initial tests, tailoring the application, improving it lots and lots of times before we fully launch in an area.
Jonathan: I realize you might not want to talk about the roadmap. But are you able to talk about any of those additional features you’re looking at adding?
Willem: As mentioned, we really believe in being close to the community. So that means that we listen lots and lots of times to some of the feedback that we’re getting.
So for example, finding a bus stop, but also just helping out to do lots of different tasks.
For example, in the US, (I think that’s something specific to the US), people want us to help them find the push button that requests the walk indication.
But there’s lots of stuff that can be added down the line. Taking in visual information in general and translating it into audio and haptic feedback is a very powerful method of portraying the outdoor world to blind and visually impaired people.
Jonathan: Are you on iOS only, or is it Android, too?
Willem: Currently, we are iOS-only. That’s for these 3 big constraints that I mentioned.
The company is a year and 8 months old. The team has been working very hard on this iOS application, but we will be expanding to Android in the future.
The initial focus on the iPhone was of course because of these 3 big constraints, but also because the majority of blind and visually impaired users that we know use iOS. I don’t know if it’s similar with you, Jonathan. Most of the people you know, do they use iOS?
Jonathan: Absolutely. Yeah.
Willem: I do see that there’s an increase in Android. Android is definitely catching up with TalkBack. But currently, the focus is iOS.
Jonathan: What markets is this app available in right now?
Willem: Currently, it’s available to download in Belgium and the US. But we are expanding into different regions.
On our website, the Oko page, there is an option to leave your information to become one of the early testers. And that assures that if we’re going to your area, that you will be one of the first early testers to start using the app and testing it out.
Jonathan: I’ve heard the app’s name pronounced as Oko, but you say Okko, correct? How did it get its name?
Willem: We pronounce it Oko, yes.
And the reason it’s called that is actually a fun fact. We started looking for different names for the application.
As you might know, the company is called Ayes A-Y-E-S. But if you say Hey Siri, open Ayes, then it’s just going to say like, what do you mean Eyes? What kind of eyes?
But Oko is just such an easy name to listen for, and so distinct in so many different languages, that you can just easily say Hey Siri, open Oko. And then you’re good to go. It starts working and it starts opening up the application. My phone was actually already trying to open it up.
Jonathan: [laughs] Yeah, that’s right. I’ve learned the hard way. You’ve got to be careful on podcasts like this when you do that sort of thing.
So if the traffic lights are fairly similar around the world (with the exception of the United States), why isn’t it just widely available now in a lot of app stores? If the little green man’s popping up all over the world, why isn’t the app popping up all over the world yet?
Willem: That’s a very good question. As a young company, of course, we needed to focus on certain things. But we will indeed be expanding rapidly.
The thing is that we always believe in high quality first. That means we want to assure, especially as we’re helping out with this type of challenge, that it’s working very, very powerfully and very safely before we expand into a certain region.
Jonathan: You’ve got to make a living, of course. How do you intend to monetize this?
Willem: I can tell you that in Belgium, we’re actually the first application ever for blind and visually impaired people to be fully reimbursed by our healthcare system.
So that’s really nice. Users can use the application for free. And then, the government pays for the license.
In the US, we’re actually looking towards B2B healthcare, and towards cities and governments, where they have an enormous cost to install these accessible intersections and an enormous cost to maintain them. We can be such an easy software layer on top of the city to increase their accessibility, but also provide them insights. For example, which intersections are not accessible enough, and where certain blind and visually impaired people can report issues about accessibility, so the city knows that it’s coming from the community and knows, “Okay, this data is super valuable.” And that way, we can monetize the application and keep providing it for free for the end users.
Jonathan: For those who strongly believe in audible traffic signals, is there a danger that municipalities might become overly dependent on Oko, when what blind people actually want, at least some in the United States and, I think, everyone everywhere else, is to have the signals accessible as of right? If sighted people have this information, blind people should just have it too, without the need to use an app to get at that information. So is there a concern that municipalities might just say, “Well, we’ve got Oko now, so we don’t need to do audible signals.”?
Willem: There are definitely some people that have been saying that there might be a reason for municipalities to stop installing these things, but that’s definitely not our vision. Our vision is exactly the same as you say: we want to make every intersection accessible. Similar to how a sighted person does not have to pay to see that traffic light, a blind or visually impaired person should not have to pay for that, either. And that’s where that data layer on top of the city comes in.
So for example, if we know from the Oko app that certain intersections are very frequently crossed, then we can report to that city saying, “Hey, this intersection is very frequently crossed. You might want to install one of these audible speakers or definitely maintain it very well, because some people that might not use our app still need that audible speaker.” And we actually see that, for example, our tool can also be used to report mistakes of these accessible traffic lights if they’re not working properly, or not maintained properly. So it’s really an extra addition to the city, rather than an alternative to these accessible speakers.
But very important to know is that in lots of areas where our application and the physical infrastructure work simultaneously, lots of people actually prefer our app, because it gives so much more orientation and it works so powerfully that you’re not reliant on that physical infrastructure that’s sometimes just not installed or not maintained properly. And that’s really the power of the app.
Jonathan: So where can people find out more information? I guess right now, they can search for Oko in the app stores in Belgium and the United States and it will pop up. But can people find out additional info, perhaps keep track of when the app might be available where they are if it’s not right now?
Willem: Yeah, definitely. So that will be at www.ayes.ai.
Jonathan: Willem, it’s great to have you with us. Congratulations on creating this, and we look forward to finding out what comes next.
Willem: Thanks, Jonathan! I look forward to the next podcast.
Jonathan: I did take this out for a spin, and it seems to work really well. I’m probably not the best test case because we have so many accessible street crossings here, but it worked exactly as they said it would work.
Lena is very happy with this, and she sent in a testimonial, actually. She says:
“I have been using Oko, Co-Pilot for the Blind, for a few months and lots of street crossings. This iOS app is not available everywhere, but I would like to let listeners know about it because it is a nice addition to our travel toolkits.
What Oko does is provide audio and sound feedback to indicate the status of the pedestrian traffic light.
Many traffic signals in my city have a walk light, in addition to the traditional red stop, yellow caution, and green go lights. The walk signal usually gives pedestrians a few extra seconds to cross a street.
But blind people often miss out on this pedestrian advantage because there is no audible indicator, and the traffic that might cue us is not yet moving.
There are 3 sounds to know, and they can be learned in a short tutorial. They are walk, countdown, and don’t walk.
When the walk light is on and 15 seconds or so remain, the light shows the number of seconds counting down, which is another advantage lost to blind folks.
In over 200 city miles I have walked, Oko has made one kind of mistake, 2 times: it provided no information when it should have.
I reported this using the report a mistake button, and the developers fixed the problem in a few days.
My favorite use cases for Oko are Saturday and Sunday mornings when there is too little traffic to be certain about the traffic light’s status, and multi-street busy intersections where every second counts.
I was a complete skeptic when I began testing this app. But now, I give it 5 stars and highly recommend it.
No app takes the place of good O&M skills, but this app can be helpful. It does what it claims to do reliably.
Thanks for a wonderful podcast!”, says Lena.
Thank you very much for that great review, Lena.
I can confirm it does seem to work really well, and the sounds it makes are the same sounds as those we hear on our accessible traffic signals here in Wellington.
If you’ve given this app a go (assuming it’s available where you are), what do you think? opinion@LivingBlindfully.com is the email address. Attach an audio clip, or write something down, whatever you prefer.
You can also call the listener line that is a US number 864-60-Mosen. That’s 864-606-6736.
Advertisement: Transcripts of Living Blindfully are brought to you by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at PneumaSolutions.com. That’s P-N-E-U-M-A solutions dot com.
Caller: Hey, Jonathan! This is Jim from Florida. Hope you are well.
Thanks again for Living Blindfully. I’m loving it.
A couple of thoughts.
First off, great job advocating as a parent. No matter what, disabled or not, people should still advocate as parents, especially when you’re dealing with someone like, as you said, the airline.
And pick your battles. I’ll be interested in listening to that Jernigan speech. I’ve not listened to it, so I’d be interested in hearing that, because I know for educators of the blind in the US and a lot of people who are blind and visually impaired, he’s a mentor. Some of his philosophy, whether he made it or others have interpreted it, or maybe misinterpreted it, I’ve had some questions about. But overall, I do believe he and another one, Pervis Ponder, are definitely sages in the blind community, whom we have a lot to thank for their work and their dedication.
I wanted to say that I had an incident myself when I was in elementary school, where I received assistance from a bus monitor and it was misinterpreted by a teacher who thought I wasn’t using my skills. She scolded me big time, and put me in the developmentally and physically impaired classroom for most of the day as a punishment.
And about 2 years later, she did some summer tutoring for me. And I got a chance to get together with her and explain that the way I was raised, my father encouraged me that if someone that you knew or that meant well wanted to offer assistance, and it might make your life a little easier without costing you anything, that it was okay. And when she’d taken me from the bus to the steps or whatever it was, it was kind of a busy area and it was a little quicker. And then of course, I used my cane and went to class and did everything else. But it was misinterpreted by someone who thought I wasn’t using the skills in which I had been trained.
You know, it was one of those things. But I will tell you, I prefer working with my guide dog more than sighted guide, even from family members because I’ve gotten hurt sometimes. And you know, with a guide dog, they’re amazing, and that’s what they do all day.
Caller: Hi, Jonathan! This is Raz over in Colorado, USA.
And I was just reading the transcript of your latest podcast. Absolutely love those. Thank you for having those.
And I quite enjoy your podcast. I found a lot of good things from there.
And I was reading the part where you’re talking about well-meaning, but kind of frustrating sighted people. And it got me to thinking because I have a guide dog and when I’m around other blind people and sometimes with my therapist, I will absolutely go on a tear about all the sad dead pet stories I get to hear as a guide dog handler. And there’s a lot of them because, you know, people see you in public and you know, you’re a stranger to them. They don’t really know you. But they see the dog, and they think of their dog, and they want to tell you about their stories. And they don’t know just how many times we hear these terrible, sad, dead pet stories.
So you know, when I’m with other blind people, I will absolutely go on a rant about them, and be kind of rude about them, and stuff because it’s frustrating. It’s annoying.
But in my day to day life, every time it happens, I’ve never like yelled at anybody for telling me one of these stories, even though they do drive me crazy. And some days, I’m having a bad day and I don’t want to, like, you know, I don’t want to be made sadder than I might already be. But you know, in those moments, it’s just kind of like I get why they do it. And I know they would not understand if I were to yell at them or anything.
So usually, I will talk to them about it. And there’s kind of a nice way to do it. That’s a little example.
But I think a bigger example I have frequently, in dealing with well-meaning sighted people who are actually kind of being disruptive and getting in our way, is I have a lot of people who will try to grab me or offer an elbow. And I’ve started to do sort of what I call coming up with a script.
So if anybody ever touches me, first and foremost, I just absolutely freeze up like no yelling, no getting upset or anything. But I don’t want to be touched by a stranger. And I do the same if they grab my dog or my cane, any kind of mobility device, just full stop, no go, you know, you touch me, I’m not moving.
But then, I will just say, “No, it’s okay. Just keep talking. I would rather follow you.”
And in every instance this has happened, this usually works. People will be like, “Okay.” Like they still want to help.
So they’ll let go, and I’ll just be like, “Hey, let’s start up a conversation.”, and we’ll walk and talk. Or I can just be like, “No, it’s okay, I got it.”
But yeah, so like, that can be a good way to just sort of politely get people to sort of get off the case because most of them, like they’re well-meaning and I think, they panic when they see us. [laughs] So if you kind of like rehearse and even practice sometimes these things to do in a situation where somebody is helping in the most unhelpful way possible, [laughs] you can find ways to sort of get around it.
And then too, I think I did want to share like my favorite, like the worst thing that ever happened that a sighted person did. I think we’ve all had the experience.
I was going downstairs, back when I had my cane. I’m halfway down the stairs, and there were little flicks, so I’m really concentrating.
From behind, no, he was like in front of me. I just hear this guy like kind of shriek, panic, like, “Oh my God! Look out for the stairs.”, you know, and it scared the snot out of me.
So I slipped a little, and kind of stumbled my way down the stairs. I almost fell. I was not thrilled with this, but things were okay.
And he comes running up and you know, I could hear the grin in his voice and just says, “See, I told you about the stairs.”
And like, I don’t think I said anything. This was a long time ago, but I think that’s like the most annoying thing that’s ever happened.
So for sighted listeners, don’t yell at people. We know the stairs are there. [laughs] Trust me. We got this.
But yeah. For most things though, just kind of having a polite (even when you don’t want to be polite), just have a polite way to brush off the help, or redirect the help. You can use that to sort of educate them.
Or if you’re just kind of tired of it that day, you can get them off your case by kind of being polite about it. [laughs]
Caller: Good morning, Jonathan! This is Shawn Thiel here.
Just wanted to provide you an update on the state of Castro.
I know that you and others have gone in other directions. But I’ve chosen to stick with them for, I mean, I still like a lot of the features that they have.
And also, their history data on a per podcast level is helpful. There’s one podcast I’m very behind on, but it’s nice to be able to look and see which episodes I’ve heard and which ones I have not. That data is very valuable to me.
So Castro released an update this past Friday, and did a very good job of telling what they had fixed in their release notes, which, as we all know, is a rare thing these days.
They had not, however, fixed the issue of VoiceOver not reading episode titles in the inbox. And for me, it’s not really an issue because the podcasts I subscribe to are so wildly different that the show notes tell me which one I’m listening to. But I recognize your issue for news-related podcasts, where the headlines are all going to be similar.
So in their release notes, they said that they’d put a link to Twitter in the app, and we could follow them there.
So what I chose to do on Saturday evening was to write to them on Twitter, re-explain the issue, and indicate that many VoiceOver users would appreciate it if it were to be fixed.
And I got a response fairly early my time this morning, which is the morning of Memorial Day as I record this, which I believe is the 29th of May then. And they said, “Thanks. We’ve seen your emails. We’re still looking into it, and thanks for the patience.”
So I’m thinking maybe they didn’t write me back because they wanted to have something more definitive, but they are still responding on Twitter.
And that is the case with some companies now: Twitter is the most efficient way to raise an issue. Sometimes, they catch that more often.
But I am encouraged that they are not abandonware, and are still looking to correct VoiceOver issues. So I just wanted to share that.
And if and when that bug resolves, I will of course let the Living Blindfully listeners know.
Speaking of Living Blindfully, I’m enjoying the rebrand. I would love a Living Blindfully t-shirt, probably to wear over the weekend. I’m in an office casual environment. I could probably get away with it during the work week, but I’m not quite sure about that. But I love the idea. I love the name, and wish you continued success. Keep doing the wonderful work that you are doing.
Jonathan: Thank you very much, Shawn! I have to tell you, I am so ready to switch back to Castro, it’s not funny. And when they do fix this bug, I am, as the Americans like to say, “totally like so there”.
I do still have the TestFlight builds of Castro. So the moment they make some changes, I see it before the release in the App Store.
And I also contacted the Castro people a few weeks ago about this VoiceOver bug.
And just to be clear about what this bug is for those who weren’t listening then, what happens is if you’ve got a lot of podcasts in your inbox (and I subscribe to a lot of news-related podcasts), you skip through, you hear the headline, and Castro used to say the source of the podcast. Now, it does not.
They acknowledged me by email pretty quickly, which is positive, because they went through a not responding phase for a while there. They seem to be past that now. That’s really encouraging.
They say that they are on it, that they’re going to fix it. And I really look forward to when they do because Overcast makes my phone so sluggish when it’s playing.
We’ve been talking about this on Mastodon. And other people are saying, “Yes, it’s awful! It’s like turning my state-of-the-art iPhone into a crawling thing.” [laughs]
And other people are saying, “What what what are you on about? It’s perfectly fine.”
And I don’t know what the variable is. It’s not to do with phone type. I’ve got the iPhone 14 Pro Max here. You know, fast phone.
And I’ll show you this. People don’t believe me, so I’m going to show you how bad it is.
I use Eloquence a lot of the time when I’m not demonstrating things on the podcast, and it’s particularly noticeable with Eloquence. But switching text-to-speech engines doesn’t make it go away.
What I’ll do is start playing a random bit of a random Living Blindfully episode. But before I do that, I’ll just flick through my home screen. And you can hear my normal behavior and responsiveness.
So I’m flicking forward very quickly. And now, I’m going back.
You can hear it’s quite responsive.
Now, I’m going to perform a 2-finger double tap, and this bit of the podcast will start playing.
And I’ll do exactly the same thing. I’m going to flick through in exactly the same way as I just did.
So you can hear, there’s very much a lag there between when I’m flicking around and when it’s responding. And when you are perhaps listening to a podcast in the background and you’re trying to edit a document or do anything, it is so sluggish, you may as well not bother.
Castro does not do that. In fact, I have not found another podcast app that does it.
I thought that it might be worth trying to disable the effects in Overcast: turn off the audio compression, turn off the smart speed thing.
None of it seems to make a difference. The moment I press play in Overcast, my phone may as well be an iPhone 3GS, or something. Actually, I don’t recall the iPhone 3GS ever being as bad as this when Overcast is playing. So I am very keen to get back to Castro as soon as they fix this bug.
Oh, and as for the merch, yes, I want the merch, too.
As they say, it’s on the roadmap. We’re doing a few other things, like spinning up the advertising, at the moment.
And I will get to the merch because I want the Living Blindfully T-shirt, and all sorts of other accessories as well.
There are times when something happens in your life that’s so impactful, whether it be for good or for ill, that you know you will always remember where you were and what you were doing when it happened. These are called flashbulb moments. We’ve talked about these over the years.
I have had such a flashbulb moment, and I will tell you what was going on.
I was listening to an interview that seemed to me to be a pretty routine interview on BBC Radio 4’s Today program. They actually published an extended version of it on their podcast feed.
It was a lovely, meandering little interview about an exhibition that Sir Paul McCartney has put together containing photos that he took in 1963 and 1964, when the Beatlemania thing was really kicking off, and there were these 4 young guys in the bubble.
And Martha Kearney, who did the interview on BBC Radio 4, did a fantastic job. Well, for the most part, and I’ll come back to that later.
But it was kind of an audio description. They walked through the gallery, and Sir Paul was describing some of the photos. I found it very interesting. And then, they got towards the end of the interview and the end of the exhibition.
And I have no idea whether this was contrived or not, or whether Martha Kearney just happened to ask the right question. But she asked Sir Paul about the impact of artificial intelligence (AI) on music. And she referenced the fact that some Beatles fans who are tech savvy and up with this audio technology have been creating new Beatles songs.
There was, (until it was taken down for copyright infringement), a very interesting version of New, the Paul McCartney song released in 2013, that featured a younger-sounding Paul doing the vocals, and John joining in, and it’s a very Beatlesque kind of song. You can also hear what is supposed to sound like John and Paul doing John’s song, Grow Old With Me, which is another thing that they were looking at doing for the anthology project.
And Sir Paul said they’re really coming to terms with it, wondering what to make of it. It’s both exciting and scary.
And then, very casually, he dropped a bombshell.
Sir Paul: We were able to use that kind of thing when Peter Jackson did the film, Get Back, where it was us making the Let It Be album.
And he was able to extricate John’s voice from a ropey little bit of cassette, where it had John’s voice and a piano. He could separate them with AI. They tell the machine, that’s a voice. This is a guitar. Lose the guitar.
And he did that. So he has great uses.
So when we came to make what will be the last Beatles record, it was a demo that John had that we worked on, and we just finished it up. It’ll be released this year. We were able to take John’s voice, and get it pure through this AI, so that then we could mix the record as you would normally do.
Jonathan: [laughs] It was so casually done, that I thought I’d misunderstood what was being said, and I wound the podcast back to play it again.
Because initially, when he started down this track, I thought that he was going to make a comment that the Beatles were ahead of the curve in using AI for Free as a Bird and Real Love.
I wonder. Did the BBC realise the scoop that had just landed in their lap? Was there some sort of agreement that Martha wouldn’t probe further? I guess we may never know.
But if you go into the show notes for that podcast episode, there’s a lot of talk about the exhibition. Nothing at all about one of the biggest musical stories of the decade: a new Beatles song. Extraordinary!
So she didn’t follow up and say, “Okay, Paul. That’s really exciting! What’s this song called?”
But I’m pretty confident that we know the answer. We can only speculate at this stage because interestingly, the Sir Paul McCartney account on Twitter, which I did check into for something as important as a new Beatles single, did retweet the story that covers this on BBC Radio 4, and that was picked up by many, many other media outlets around the world. I mean, a new Beatles single is a hell of a big deal.
But the Beatles Twitter account is completely silent on this.
As I put this show together, there has been no confirmation by the Beatles of this. But I think the Beatles fans who know their history can make an educated guess. I’d be very surprised if it’s anything else. I don’t think it could be anything else. Because remember that we’re not only dealing with the need to get John’s performance on the record. But to make it a Beatles track, we also need George, who died in 2001.
So we go back in history to 1978. John had temporarily left the music business. He always thought he might come back at some point when the time was right.
He had a 2-year-old, Sean, who turned three in October of 1978, and he was spending a lot of time being a house husband, raising him.
From time to time, the music bug would grab him and he’d sit down at his piano. Sometimes, he’d play the guitar. At other times, he had this keyboard, very 1970s-sounding basic keyboard, (because this predates the era of sampling). And sometimes, he would play that.
They didn’t have a fancy studio at the Dakota, so he would just sit in the living room and play his piano. And in case he came up with anything useful, he would record that on a cassette boom box with a built-in microphone, so there’s ambience. Sometimes, there’s electrical hum on some of these recordings because of the circuitry at the Dakota. They are not good quality recordings at all.
By 1978, a lot of the heat had gone. All the legal argy-bargy that was associated with the Beatles splitting up had been resolved.
John and Paul would run into each other sometimes. John didn’t leave the US after he moved to New York. It took him a long time to get his green card. And when he did, he was pretty settled with Sean.
He was planning to go back to the UK in 1981 to catch up with people, in particular his auntie Mimi, who raised him for the most part, and was getting old.
Of course, that didn’t happen. He was assassinated in December of 1980.
But he was a pretty mellow guy by 1978. And from time to time, he did contemplate maybe the Beatles will do something one day, or maybe he would record with Paul one day, even if the 4 of them didn’t get back together. It wasn’t a burning desire, but it wasn’t something that he ruled out.
There is a story that Paul tells from 1976, when Paul was over at John’s apartment and Saturday Night Live was on. It was just called Saturday Night then, and Lorne Michaels was running this gag wanting the Beatles to get back together, and he offered a $3,000 check for that to happen.
John and Paul looked at each other apparently, and said, “Shall we do it?”
And then, they thought, “Yeah, we may as well do it.”
But then, they decided they were too tired. They did contemplate getting a cab to the studio. So they were very close to just turning up out of the blue and playing something together on Saturday Night Live.
So things had thawed considerably by 1978, and there were some songs that John was working on that he thought could be potential collaborations with Paul, or even potential future Beatles songs. He put some of those on a cassette, and he wrote “For Paul” on the cassette’s label.
Some considerable time after his death, when the Beatles were looking at the Anthology project, Yoko gave to Paul that cassette, and it had things like Free as a Bird and Real Love, and they were used in the Anthology project of 1995.
So in November 1995, we got the first new Beatles song in 25 years, Free as a Bird. It was done by using the best technology available at the time to use John’s vocal. You couldn’t get rid of the piano playing in the background. You could clearly hear that it was recorded in a very suboptimal environment. Paul wrote some new lyrics for the middle, and the Beatles played around this cassette tape. They also did a version of Real Love, which was widely available on bootlegs.
Mind you, Free as a Bird was pretty widely available on bootlegs as well. They changed the speed and did some interesting things to that track.
So Free as a Bird was released on Anthology 1, to a lot of hoopla, because it was new Beatles music.
We had Real Love starting Anthology 2, and there’s a wee bit of instrumental stuff at the beginning of Anthology 3, where a third song might have gone. That third song is called Now and Then, and the consensus is that Now and Then is going to be the new, and what Paul is calling the final, Beatles song.
The recording quality of the Now and Then cassette that the Beatles had was absolutely atrocious. There’s some electrical interference, John is way off mic, and it just sounds awful.
And George put his foot down. They worked on Now and Then for an afternoon, and he said, “It’s rubbish! We cannot release this as a Beatles song.”
And I think he was right. If they had released Now and Then with the technology that was available then, it would have been a massive bust. In fact, there are some who argue that Free as a Bird and Real Love shouldn’t have been done because you can clearly hear the effect of John sounding so different from the rest of the Beatles given how John recorded his contribution.
There’s also some debate about whether these things should happen at all. Some people say you just don’t touch the Beatles legacy.
Was Free as a Bird a good idea? Was Real Love a good idea? I mean, we wanted it at the time.
And here we are again, now in 2023. And I think many of us want this, too. Things have changed a lot with audio technology since 1995, and I’m proud to say that a lot of that change has taken place here in New Zealand. Thanks to Sir Peter Jackson and the technology that he and his engineers have created.
We’ve heard this technology in action with respect to the Beatles twice now. The first was when the Get Back documentary was released, and the second was when the Revolver remix was released.
And Giles Martin, I think, put it best when he said that what they’re really doing is unbaking the cake. So imagine you’ve got a cake in front of you, and you know that it’s made up of flour, and sugar, and eggs, and icing. But it’s a cake. It’s not possible to get just the flour out of it. It’s not possible just to get the egg back again once it becomes part of the cake. And that’s the problem that the Beatles could not solve in 1995.
Now, they can. So they can take John’s vocal from Now and Then. They can not only extract it and leave the piano behind if they want to, (which they probably do; they probably want a new piano part), but there’s also digital audio restoration technology that can clean the audio right up so that John’s vocal is likely to sound very close to, if not exactly like, the rest of the Beatles, as if he recorded it in the studio. There’s no doubt that Now and Then should sound a lot better when it gets this treatment.
There’s another thing, too, and that is that another recording of Now and Then has surfaced that is quite a bit better. It’s still a cassette recording, but it’s nowhere near as dodgy as the recording that the Beatles were working with in 1995.
What I personally would love to see is them go back and remix Free as a Bird and Real Love using the technology that’s available today, because I think that could sound significantly better than what is out there.
But in the meantime, we’re on standby for new Beatles music.
Caller: Hey, Jonathan! This is Peter Donahue calling.
I want to bring up an issue, and I think we should open it up for discussion on Living Blindfully, concerning features that should have been present from the start in PDAs like Alexa, Google Assistant, Siri, Windows Narrator, Roku Audio, and all that.
There are a lot of great things that these companies have done over the years to improve the accessibility and usability of their products, while consistently neglecting what I believe are commands and skills that are basic to blind people and people with other disabilities being able to use their products independently.
For example, the ability to speed up or slow down your speech, to adjust your speech volume and rate, and to enable things like hands-free mode or closed captioning.
Right now, you have to go into settings, menus, and all that to change these. It would be much more convenient if, when I pull up a movie on Amazon Prime Video that I know includes audio description, you could just say, “Tell Prime Video to turn on audio description.”, instead of making us jump through a bunch of hoops to enable or disable these features.
Closed captioning is another one that should be controllable with a PDA.
This is something that has really gotten us mad and has cost my wife and me unnecessary inconvenience and some expense. We shouldn’t have to go down to T-Mobile just so we can get help slowing down the speech on my wife’s iPhone, for example.
Because of that and some other issues we were having with our Apple login, a couple of months ago, we ditched the iPhone and are now the proud owners of Samsung Galaxy phones. Adjusting the speech rate is a lot easier, even without TalkBack being able to do it. But it would be much easier still if it could.
Anyway, I think this is a great topic for discussion, and one that needs to be raised with these companies. We should insist that these capabilities finally be added to all PDAs to make them easier for blind and disabled people to use.
Jonathan: Good to hear from you, Peter. Hope you’re doing well.
A couple of things.
You can adjust the speech rate of your iPhone using the rotor. And I appreciate that some people do find the rotor gesture a little bit troubling. But you can rotate the rotor until you get to speech rate. And then, flick up and down.
Regarding speeding up the speech, it’s interesting the old Soup Drinker, the Amazon thing, can speed up speech and a lot of people don’t seem to know that that’s possible. So I will just try that. Soup Drinker, speak faster.
Now sometimes, we don’t get any feedback. But I believe, it should work. So let’s just check the weather.
Soup Drinker, what’s the weather forecast?
Soup drinker: The current weather is 7 degrees Celsius, with cloudy skies.
Soup drinker: Today, you can look for clouds and showers with a high of 9 degrees and a low of 6 degrees.
Jonathan: Now, I’m not sure if you can go any faster than this.
Soup Drinker, speak faster.
Soup drinker: Okay. I will speak at this speed from now on.
Jonathan: Okay, it has responded that time. So let’s check the weather.
Soup Drinker, what’s the weather?
Soup drinker: The current weather is 7 degrees Celsius, with cloudy skies. Today, you can expect clouds and showers with a high of 9 degrees and a low of 6 degrees.
Jonathan: Soup Drinker, speak at your fastest speed. [silence] No feedback.
Soup drinker: I will speak at this speed from now on.
Jonathan: Okay. I think that is the fastest.
Soup Drinker, tell me an interesting fact.
Soup drinker: Here’s a fact about Europe from Wikipedia. The wolf, the second largest predator in Europe after the brown bear, can be found primarily in Central and Eastern Europe and in the Balkans, with a handful of packs in pockets of Western Europe.
Jonathan: And how slow can we go?
Soup Drinker, speak at your slowest speed.
Soup drinker: Okay. I will speak at this speed from now on.
Jonathan: Oh no! Okay.
Soup Drinker, speak at normal speed.
Soup drinker: Okay, I will speak at my normal speed from now on.
Jonathan: Now, let’s see if we can go slower.
Soup Drinker, speak slower.
Soup drinker: Okay, I will speak at this speed from now on.
Jonathan: Soup Drinker, speak slower.
Soup drinker: Okay, I will speak at this speed from now on.
Jonathan: Soup Drinker, speak slower.
Soup drinker: Sorry, I am already at my lowest speaking rate.
Jonathan: Right, thank you for that.
Soup Drinker, speak at normal speed.
Soup drinker: Okay, I will speak at my normal speed from now on.
Jonathan: I take the point that you’re making, Peter, about how interesting it is that certain things that you expect to be able to do with some of these voice services just aren’t possible. You ask Siri to go to a particular part of settings, for example, and it can’t. And it certainly would be so much easier if you could just toggle audio description and closed captions on in the way that you talk about.
So interesting discussion. Maybe we can find out from others what else they would like to see in their digital personal assistants.
But for now, it is time I was out of here. Thank you so much for all your contributions and your support of the podcast.
Remember that when you’re out there working your guide dog, you’ve harnessed success. And with your cane, you’re able.
Voiceover: If you’ve enjoyed this episode of Living Blindfully, please tell your friends and give us a 5 star review. That helps a lot.
If you’d like to submit a comment for possible inclusion in future episodes, be in touch via email. Write it down, or send an audio attachment: email@example.com. Or phone us. The number in the United States is 864-60-Mosen. That’s 864-606-6736.