Transcripts of Living Blindfully are made possible by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at http://PneumaSolutions.com.
Voiceover: From Wellington, New Zealand, to the world, it’s the Living Blindfully podcast – living your best life with blindness or low vision. Here is your host, Jonathan Mosen.
Whether it’s the new operating system or the new phones, plenty of people have things to say about new Apple things, Be My AI is available to everyone on iOS, the audio interface company Focusrite has some good accessibility news, and getting serious about reading Braille.
Great to have you back for episode 251.
Now, area code 251 is for Southwestern Alabama, and it includes Mobile, Alabama, which is spelled the same way as mobile, which is very interesting because I remember I was using a text-to-speech engine some time ago. I forget which one it was. And every time there was a reference to mobile phones, it would say “MO-bil” instead of “MO-bile”, or as Americans like to say, “MO-bil”, like the Mobil petrol station. But wait. As Americans like to say, gas station. Oh man, we are divided by a common language.
But anyway, welcome to you if you are in 251, especially the people of mobile, where maybe you got a new mobile phone recently on account of Apple releasing a whole bunch of new stuff.
Meanwhile, country code 251 belongs to Ethiopia where apparently, there are around 151 million people. This is according to the 2023 census, so data just recently obtained.
Back in days of yore when we were Mosen At Large, in episode 222, this podcast was one of the first (if not the first) to give you a comprehensive demonstration of the Be My Eyes Virtual Volunteer, as it was called then. Now, it was a little slow. It was a bit rough and ready. As we are now well aware, large language models can hallucinate from time to time.
But notwithstanding all those things, it was still quite astounding. The technology and the user interface have come a long way since we did that initial demo in episode 222.
And earlier this week, OpenAI announced that they were adding a couple of features to ChatGPT. One is the ability to communicate with it by voice. There have been third-party apps filling this niche. Now, ChatGPT is doing it itself.
But most significantly in the context of what we are talking about, they have publicly made available the ability to upload pictures and ask questions about them. And that has allowed Be My Eyes to make Be My AI available to everyone running the iOS version of the Be My Eyes app, at this stage.
So it’s a significant development, and congratulations to everyone at Be My Eyes who has been working on this.
There is no better example of the value that people perceive Be My AI has than when many in the blind community were very frustrated by the fact that they got this glimpse of what’s possible, and then the utility of the product declined significantly because ChatGPT is trying to navigate this minefield relating to what to do about people and faces.
Some in the blind community kind of turned on Be My Eyes over this. And at one level, I understand that because our relationship is with Be My Eyes. In the end, Be My Eyes is supplying the product.
But it was sad for me because I know that behind the scenes, there was a lot of advocacy going on, a lot of deep discussions with OpenAI about the right of blind people to access visual information in an accessible format.
And there are all sorts of complexities relating to laws and various jurisdictions, and it’s not an easy minefield to navigate.
You can be sure there are going to be a lot more of these very important, deep philosophical discussions, and the perspective of a range of disabled people needs to be around the table. I certainly hope that the organised blind consumer movement will be seated at that table when these discussions are being had because there are all sorts of important potential benefits of this technology.
But I think it’s important that we turn our advocacy to where it’s needed, and Be My Eyes are on our side over this. So it’s important that we acknowledge that they are actually doing some pretty important work behind the scenes.
And we, as blind people, have to make a distinction when we’re advocating about this between having something that says, “This person is Jonathan Mosen in the picture.” and just “This person is, …”, then defining the visual characteristics. The former probably requires some sort of consent and it’s very complex, but the latter is simply conveying the same information that a sighted person can glean by looking at someone.
So I’m heartened that Be My Eyes continues to have the ear of OpenAI, and congratulations to Be My Eyes for getting to the point where this is now available to anyone who wants to use it on iOS.
If you haven’t used Be My AI yet and you would like to, you can just download the Be My Eyes app from the App Store and register.
And remember, be inventive. It is remarkable when you take a picture. Not only is the description more detailed than anything I have seen by a long way, but then you can drill down. You can ask questions as if you were asking a sighted person to explain that picture to you. And should you need it, a sighted volunteer is just a double tap away.
I would be intrigued to find out how you are using Be My AI, how it’s helping you, where you feel the technology might do with some improvement.
To start us off on this, a really cool story from Jeanie Willis here in New Zealand. She says:
“I have been testing out Be My Eyes for a couple of weeks, but realized I’ve really only been giving it the sort of things I’d often think to scan or try and get my phone to do with other apps and getting better results, but hadn’t really thought to try out things I would only usually ask a real person.
After a friend told us on our cooking list she had had it read a handwritten recipe correctly (and recipes are usually problematic, even if typed), I wondered whether it could read music. After all, music has a very set, limited number of specific graphics and rules to interpret, so why not?
I took a picture of a page of one of the kids’ music books today. It is a very cluttered page, with pictures, music, diagrams, lots of little tips on the side, etc.
It told me it was a picture of a page of a music book, then the title of the piece, told me it was composed by X (so it knew where that is written on the music to identify it from other text), described the picture beautifully, and read all the text. It told me there was music in the middle of the page.
So I thought I’d test it further and ask a follow-up question of whether it could read me any of the music. It did! It told me how many lines of music there were, described the clefs at the beginning, noting it was probably a piano piece with 2 staves, one for each hand, and then very correctly read out the notes for each hand in the first bar, and then told me there were a lot more notes and it was happy to read it if I liked, and to ask another question for more details. What an amazing way to get a quick snapshot of a piece of music!
Seeing AI and the other usual OCR apps muck about, getting muddled with various other text on the left before finding the title in the middle, and sometimes not right at the top. But it gave me that first … Oh, the page number was given before that, and announced as a page number, not just a random number stated.”
Thanks, Jeanie. That’s a really great example of a use case for Be My AI that other apps just aren’t doing well.
Sometimes, you do get hallucinations. We had one just a week or so ago, when we looked at an OptiGrill which we haven’t used for a while. We dusted off the old OptiGrill, and I took a picture with Be My AI so I could be reminded of the order of the buttons.
It told me very confidently that it was an OptiGrill, and then completely misread the order of the buttons. It read it with such certainty that I was sure that it got it right. But actually, it did not, and common sense intervened because it put the power button in a place that I thought was highly unlikely. And in the end, I called Aira and I got the actual information. But there have been other examples where it just does a phenomenal job.
So it’s about using the right tool in the toolbox. If I want to read a document, say it’s a piece of mail and I just want the document read, I will use Seeing AI. I find Seeing AI to be the most accurate of all the tools that I have had access to. For me, Seeing AI is noticeably better than Envision in most cases when I take a picture. I don’t know whether other people have different experiences of that, but consistently with a wide range of things, I found Seeing AI gives me the best results, and it will just read the document without any kind of embellishment, or seeking to paraphrase or summarize.
On the other hand, if I’m going to a restaurant and I want to read a menu, I will now read that with Be My AI because I can query it. I can say, “What steaks are on the menu?”, and immediately get that answer. I can ask what the cheapest thing is on the menu, and get that answer. So you get a lot more interactivity with Be My AI.
And obviously, with things like music, you’re tapping into that large language model. So it’s such an exciting time to see this technology emerge, and I’m sure there’ll be more stories like yours, Jeanie, where people are doing some very interesting things with Be My AI and being delighted by the results.
So I look forward to hearing more from listeners as we all experiment with this technology.
Advertisement: It’s important to me that Living Blindfully is fully accessible, and that’s why every episode is transcribed.
Accessibility is in the very DNA of Pneuma Solutions, and it’s thanks to their sponsorship that transcriptions are possible.
As we discuss on this show regularly, sadly, the world is not as accessible as we would like. And it’s frustrating to find that you’re on a Zoom meeting or a Teams meeting, somebody’s running a PowerPoint presentation, and not only is it not accessible, but they haven’t given any thought to accessibility before.
When the time is right, take that person to one side, or maybe talk to the IT manager in your organization and tell them there’s a fix for this. And it’s called Scribe for Meetings.
It seems like such a simple solution, but there’s a lot of incredible magic going on under the hood.
All someone has to do is upload their slides, as little as 5 minutes before the presentation is due to run. Pneuma Solutions will do the magic behind the scenes and provide a fully accessible version that you can follow along with.
There’s no need for you to be excluded from these presentations any longer. Scribe for Meetings provides the answer.
To learn more, head on over to PneumaSolutions.com. That’s P-N-E-U-M-A Solutions.com.
Voice message: Hi, Jonathan and fellow Living Blindfully listeners. John Lipsey here.
I wanted to take a moment and provide some comments on episode 248 of the podcast, which was really well done.
I will not be getting any of the new iThings this year. I have an iPhone 14 which I’m perfectly happy with, and I’m not gonna transition to a USB-C iPhone until all of my things can transition to USB-C.
I have 3rd gen AirPods. When I can get new AirPods, I assume by then all of the AirPods, Pro and non-Pro, will have USB-C, so I’ll get those.
My Apple TV remote, I could just buy a USB-C one. That’s easy.
The harder one is gonna be the Magic Keyboard. I have a Mac Mini because I can use it without a monitor. I like my Magic Keyboard. I don’t really want a third-party keyboard necessarily. Once those transition to USB-C, then I would be perfectly content to transition my iPhone to USB-C. So then, I don’t have to deal with multiple cables, adapters, this, that, and the other. So that’s that.
Secondly, somebody mentioned in the podcast that with the iPhone 15 having the newer chip, it will support the startup and shutdown chimes that the 14 Pro and Pro Max had. Well, the 14 also has those. My 14 has a shutdown chime and a startup chime. They were just enabled by default when I turned VoiceOver on the first time I set up my phone. So the 15 will have those, yes, but the 14 also does have them.
The third thing I wanted to mention was regarding the dock in the new watchOS. I had to do some digging around a little bit to figure this out.
I haven’t updated. I don’t do beta software on my devices.
But if you double click the digital crown, it will, apparently, take you to your recently used apps. I have my dock set to just be the apps that I want to use most regularly. And the article I found said that double clicking the digital crown takes you to the most recently used apps, but I don’t know if it can take you to the apps that you just have as your dock, like if you’ve customized your dock, I’m not certain.
But the dock isn’t completely gone in watchOS 10. It’s just different.
That’s all I have. Hope all are well. I will come back with more thoughts again soon.
Jonathan: Yeah, I’ll come back now you’re here.
Thank you very much, John, for the correction regarding the startup sound on the iPhone 14 without the Pro bit. I guess no one on our little group had an iPhone 14 that wasn’t Pro. [laughs]
So it sounds like whatever’s responsible for that sound lives in the bootloader of the phone rather than the operating system, but it’s not chip-specific. But it’s good to know that this is going to progressively roll out because man, that startup sound makes all the difference.
Regarding what I would describe as the app-switcher-like function, (because in the old days when iPhones had home buttons, the SE still does, doesn’t it? But for most of us, we don’t have a home button anymore.) you double click the home button, and you get into the app switcher.
What I find when I do that in watchOS 10 is more like an app switcher than the dock. Because I used to have my most frequently used apps in the dock. That was an option. You could configure it kind of like an app switcher, where your recently used apps appeared.
But you could also have it static, which is what I preferred to do. I have not yet found a way to bring that functionality back with the double click of the digital crown, but you have inspired me to keep fossicking around and searching.
So appreciate the contribution, John. Good to hear from you.
It is amazing to me, and I guess slightly flattering, that there are particularly younger people who are really into their technological history in our community. It’s great because we have to preserve it. And they know little things about when I did some radio thing 20 years ago that I had long forgotten.
If you’re one of those people that collects these things, I will point out that the first ever Living Blindfully contribution to be recorded with a Braille display connected to the iPhone 15 Pro Max belongs to Jackie Brown. [laughs]
“Firstly, a query to your listenership about a possible iOS 17 bug.
After installing it on my iPhone 12 Pro earlier this week, I discovered that double tapping on buttons within apps became extremely tricky. By that, I mean that double tapping, as I’ve done for years, just made VO jerky and put me to a random position on the screen.
I had to get to the button I want again and very gingerly double tap it. Or else, the same thing happened. It was like having a fly on my iPhone that I couldn’t swat.”
Ow! Get off, you dirty little blighter.
“I have heard”, says Jackie, “others are struggling with this, so I can take comfort from the fact that my technique hasn’t suddenly disappeared.
I hoped that once my iPhone 15 Pro arrived, it might not behave as I just described.
Alas, however, it does.
I have read that it hasn’t happened for everyone so thought I would share, in the hope there might be a solution others have tried successfully.
Other than that, I like the iPhone 15 Pro, and think it has a deeper tone.”
Jackie, the answer is, listen to Living Blindfully, because we covered this during the beta period.
For me, this problem is absolutely solvable by just increasing the double tap timeout in VoiceOver. You don’t have to do it by much, but just increment the double tap timeout a couple of notches. And for me, the problem has completely gone away.
It was very frustrating when I discovered it during the beta cycle, and we did talk about it on the podcast when it happened, when another listener wrote in about the same thing.
We’ll just take a brief departure from Apple things for the moment because Jackie continues in this email.
“Secondly, as I’m sure you are aware, Sonos released its Move 2 speaker this week. I wanted to use it connected to my PC, but there was so much lag on both Bluetooth and the line-in method that I’ve had to forget about going down that road.
I purchased the Sonos USB-C to 3.5 millimeter line-in adapter. But as soon as JAWS stopped speaking, the Move cut off, and I waited for several seconds to hear JAWS again, despite several key presses. Bluetooth was exactly the same.
So for anyone else thinking of using the Move 2 as a good quality PC speaker, doot! This is a real pity, as I have no such issues with my Play 5s using the line-in method.
If anyone has tips for either issue, please do share.”
Jackie, I wonder if you’ve turned the Avoid Speech Cutoff setting on in JAWS. It won’t get rid of the latency, but it might get rid of the cutoff, I guess, so that’s worth a try.
But yeah, unfortunately, the Sonos gear is not really optimal. Even the 5s are a little bit laggy, even before you try and get Bluetooth in the mix.
It’s not just Bluetooth at play here. Sonos is processing the audio, and they do so in a way that audio can be synced across multiple speakers over Wi-Fi in multiple rooms. And that’s going to add latency as well.
Let’s get back to the matter of iPhone and iOS. Christopher Wright is writing in again. He says:
“I love the term cynometer. Is that how you spell it?”
Yes, that’s how I would spell it – C-Y-N-O-M-E-T-E-R. Actually, I would spell it R-E, because we would use the British spelling in New Zealand. But yeah, you got the point. [laughs]
“Yes, I’m very cynical, but it’s justified”, says Christopher, “particularly in Apple’s case. They’ve done many things in the past and continue to do things that make me stick to this, and the iPhone 15 is the latest example.
I’m willing to bet you the reason they removed the headphone jack”, (the trauma, the trauma!) “isn’t because of supposed courage.”
No, I agree with you. That’s absolute bullsoup, complete and utter bullsoup.
Rather,” he says, “they were trying to push their overpriced Bluetooth headphones, as well as increase the market for MFi Lightning accessories.
Of course, it’s your personal decision to purchase new things from Apple, and I wasn’t trying to tell people otherwise. I’m simply pointing out the philosophy that seems to be at play here.
The iPhone and other Apple products appear to primarily be status symbols in many circles, and it’s truly sad Apple’s marketing seems to heavily cater to this.”
Okay. In the interests of full transparency and full disclosure, Christopher, I have to say, I think you do have a point, even for me. I mean, I’m using my iPhone because I like the Braille support, I’ve invested a lot in the apps, it works. Despite all the bugs that we talk about here on the show, I still think it’s the best option for my particular needs. And I’m comfortable enough in my own skin to like what I like and use what I use.
There’s this concept in music about guilty pleasures. There are certain songs that apparently, you should feel guilty about liking and that you don’t want to tell anyone that you listen to or whatever, you know?
I’m thinking, “Nonsense! Nonsense! It’s you! It’s music! If it moves you, if it makes you happy, if it invokes emotions, then who gives a flying flamingo what anybody else thinks?” If you enjoy it, love it.
Now, all that said, I must confess, I do know a bit about what you’re talking about now. [laughs]
’Cause Bonnie and I went into a local store for our carrier (ShoutOut to One NZ!), which actually gives us a pretty good experience most of the time.
I said to Bonnie, “Bonnie, I spent a wee bit of time ejecting the SIM from my phone when I changed to the 15 Pro Max the other day.” And why am I really doing this? ’Cause the world is going eSIM. Eventually, Apple will sell the eSIM-only version of the iPhone (which is currently being sold in the US) worldwide, so we’re going to have to make the move to eSIM sooner or later.
And had I had an eSIM last week, I would have been able to just remove the eSIM from my phone, re-scan the QR code, and we’d be up and running. At least, this is the theory.
Some carriers do it even better than that. They have a transfer system that is officially endorsed by Apple so when you move phones, the eSIM goes with you.
None of our carriers do that yet. And I was on eSIM with a previous carrier before we migrated. Those of you who’ve been listening for a while will remember why we migrated from Spark at the end of 2021.
So we went into the One NZ store 24 hours after my iPhone 15 Pro Max had landed, and we just had some time. And I said, “Let’s get this done.”
So we went in there, and I said to the nice One NZ person, “I would like an eSIM for my wife and me, please.” And they got it all set up, and we scanned the QR code.
And I said, “Do you mind if I just make sure that this is working in the store before I leave the store?”, because I’ve worked with enough technology to be a bit sceptical, you know?
And she said, “That’s fine. Of course.”
So I found that I couldn’t make voice calls. It was coming up with the 4Gs.
We only had four of the Gs. We didn’t have five of the Gs where we were.
I’m thinking, you would think that a carrier would make sure, in an area where there are pockets of 5G, that they would put a 5G cell in there so they could showcase their speeds and everything.
But whatever. I am not Jason Paris. I do not run One NZ.
So I could use the web, but I couldn’t make any calls. When I made a call, it was just silent.
So while I was trying to do my own troubleshooting, they moved on to set up Bonnie’s eSIM. And the way that they do this (and I think it’s a little bit primitive compared to other implementations in some other countries) is that they email you a QR code, you activate that QR code, and it should work.
I know that there are some implementations where you can do it from your carrier’s app and the eSIM just activates, not even a QR code needed. But One NZ don’t have that, and I don’t think any New Zealand carrier has that at this point.
Well, the first problem they had was that the email for Bonnie’s eSIM wasn’t coming through. We were sitting there, waiting and refreshing and waiting, and there was no email.
And the One NZ person said, “Okay, I’ll just start the process again.” And in the fullness of time (we had to wait a while), both emails showed up at the same time. So there was obviously some sort of network lag going on with One NZ’s carrier provisioning team, or whatever.
So we used the second eSIM QR code, thinking the most recent one would work. Bonnie had no signal whatsoever, no service.
We tried the first one, and she had no service, either. And we were trying all the troubleshooting steps.
Meanwhile, I could not get voice calls working.
And in the end, Bonnie got a bit annoyed. She said, “Why are you bringing us in here? We had a perfectly good-working phone. Why couldn’t we have left well alone?” [laughs]
So after about 45 minutes, I said to the woman, “Do you think we could just get new physical SIMs and we’ll do this another day?” This is not our day for eSIM.
So she created a new SIM, we put that in, and we left the store with the status quo. Nothing of consequence had been achieved.
And when Bonnie’s 15 Pro Max arrives shortly, I will be the one that has to go in with that little SIM ejector tool thing, eject the SIM tray, and put her SIM into the new 15 Pro Max.
The reason why I paused your email, Christopher, [laughs] and told you this long-winded story is that I must confess something (and this doesn’t happen to me very often). [laughs]
I was conscious of the fact that here I am, in a telco store, the day after this very rare (at the moment) iPhone 15 Pro Max has been released.
’Cause they’ve got a shortage of supply, and people are coming in and they’re asking do you have any? And they say no. We do have a 15, but we don’t have the Pro or the Pro Max in stock. You have to order online, and we hope to have stock soon, and all this kind of stuff.
Meanwhile, I’m sitting there rocking. Obviously, rocking my iPhone 15 Pro Max. And I’m thinking, whoa! [laughs] I’m glad I ordered from Apple.
It’s like having a guide dog sometimes. When people come and talk to you about the dog, and they ask what the dog’s name is and all that kind of stuff. And sometimes, that’s wanted interaction. And sometimes, it isn’t, ’cause you just want to get on with your life.
But people are coming up and talking to you about your iPhone. “Where did you get that from?”
I said, “I ordered it from Apple directly, mate.”
They said, “Oh. Maybe, we should have done that.” [laughs] Interesting.
Anyway, I’ll get on with Christopher’s email now.
“These devices have been great tools and toys since I started using them, and they’re not cheap. Hence why I’m a little upset.
I have to give the marketing team at Apple credit, though. They know exactly what they’re doing. Make no mistake.
USB-C is a great step in the right direction. And I’m glad the EU made them do it, even if Tim was kicking and screaming all the way.
“Bye bye, MFi program, and good riddance to you.” [laughs]
“USB-C has many benefits, including the ability to provide more power to accessories.
Let’s get USB Braille display and hearing aid support.”
Well, we do have USB Braille display support, at least in part, because my Mantis is working beautifully when it’s connected to USB.
How would USB hearing aid support work? That might require some work from the manufacturers as well.
He says: “I don’t know too much about hearing aids, but surely, they make models with USB-C connectors – meaning no need for mandatory MFi or other less than reliable Bluetooth connections.”
Yeah. I mean, you wouldn’t want that all the time because you wouldn’t necessarily want to walk around with a cable dangling from you, which is what I did for years to use my iPhone effectively. And there’s no doubt that when latency is good and things are reliable, MFi is a very nice standard.
But yes, why not have a USB-C port in the door of a hearing aid, for example? Interesting.
“Connect your hearing aids directly to the phone”, says Christopher, “for a more reliable wired experience.
Now that I think about it, I’ll leave Lightning behind when the time comes. It’s an inferior proprietary connector that doesn’t deliver a lot of power.

From what I’m reading, we should now be able to do things like directly connect USB hard drives to the phone and power them, without the need to provide external power. We were afraid Apple would enforce their ridiculous MFi policy on USB-C, just like they did with Lightning. But this doesn’t seem to be the case.”
No, it doesn’t, Christopher. I can confirm all sorts of lovely things work with the USB-C port, as was indicated in episode 250 when I was connecting some audio gear.
I also know of people who are using wired ethernet adapters, which is just brilliant because I do take a USB-C to ethernet adapter for my ThinkPad.
And in my view, the ThinkPad should still have an ethernet port because it’s generally marketed at business professionals who may well need it. But anyway, that’s fine. I’ve got the dongle. I got with the program. What choice did I have?
Sometimes, when you go to hotels, the Wi-Fi is just too saturated. And you get much better performance if you just use hardwired ethernet.
So now, you can plug that into the phone, and it works.
Christopher continues, “No more paying Tim a royalty to ensure an accessory works properly with the phone you spend your hard-earned money on.
Come on, EU,” says Chris.
Sounds like a song by Dexys Midnight Runners. “Come on, EU.”
“Let’s force Tim to stop selling products with sealed batteries.
Speaking of that, I find it very strange Apple is happy about right to repair.”
Me, too. That’s an extraordinary turnaround.
“Here goes my cynometer again,” says Christopher, “but it seems very suspicious they’re happy. If they were as angry as they’ve been about USB-C, we would have won. So maybe there’s something sneaky going on we won’t realize, until it’s too late.”
Yeah. I really want to understand Apple’s change of heart here, Christopher. It’s extraordinary, because they have spent quite a lot of money and lobbying muscle trying to defeat these right to repair laws.
And now, it’s all peace and love in California. Strange.
“My first thought,” he says, “is that they’ll stick it to the regulation by charging outrageous prices for replacement parts, including batteries, when this legislation starts in 2027.”
Yeah. Okay. So if the cynometer goes up to 100, I’d give that about a 75. I think you’re somewhat justified there, Christopher, but good on you.
Caller: Hello, Jonathan! This is Reggie in Yakima, Washington, with a bit of a cold so please forgive me for that.
A couple of things, I think, didn’t get adequately covered. But they weren’t really covered in the event either, so I can’t fault you for that.
But one is, yes, the form factors between the 14 and the 15 are very similar. But will the cases for the 14 still fit the 15 phones adequately? And the battery life, if there are any differences at all.
Obviously, they didn’t talk about it during the event. But it really annoys me how Apple manages to spin their battery life specs and only say the things that they want you to hear, without actually giving you the data that you need to make a decision. [laughs]
I really noticed that with the AirPods, … And they keep giving you different numbers, it’s like statistics, playing games.
But I would be curious if the 15 and 15 Pro are any better on battery life than the 14 or 14 Pro. I’m gonna go to the Apple Store app and look at it myself here in a minute. [laughs]
But I think what you said about USB-C is, … I mean, so what? We’ve got a different port. They’ve restricted it to the same capabilities as the Lightning port for now. And even the one with the 3.0 controller in it, if they’re not using that or they haven’t implemented that in the hardware, …
I think, part of the difference is you should have a higher power output. And I don’t think that’s the case, and they haven’t implemented any faster charging or anything to make the USB-C port worthwhile, other than the fact that it’s one less cable, and I was a little disappointed in that.
I was gonna get the 15 Pro this time.
I wouldn’t miss the home button on my SE2020, but I would certainly miss the fingerprint (touch ID), and I really hope they bring that back in some form.
The ultra-wideband chip in the 15, if it’s last year’s chip, it must be the first generation ultra-wideband, not the second. So I feel that I’ve either gotta get the 15 Pro, or the new SE, depending on how much better they can make the battery life than the current one.
Battery life’s a big factor for me, too, but I would never go to the Max.
I did notice in iOS 17 beta, that one little bug that I need to report. And it being this close to release, I bet it won’t get fixed.
If I’m on a phone call and I bring up the call screen (I’m using FaceTime, or I think even on a regular call), where the mute button should be, and the share content button, it’ll say mute on when it’s off, and mute off when the phone is muted. So it’s kind of reversed. But I think that might only be during FaceTime calls.
Jonathan: Reg, if you don’t mind, I’m just going to back off the microphone a bit and move away, ’cause I don’t want that cold. [laughs] I mean, you’re a great guy and everything, but I don’t want that cold.
I hope you’re feeling much better by now.
Reg, for those who don’t know, goes way back to my first ever online audio project, which came out in, I think, 1996, and it was called “The Voice Behind the Keyboard,” and Reg was on that.
Alright, let’s go through some of your things.
The iPhone 15, even without the Pro, does have the second generation ultra-wideband chip – the Precision Finding feature.
So if the better camera, the LiDAR, some of those benefits of the 15 Pro don’t excite you, you can get that feature just by buying an iPhone 15.
There were all sorts of rumors about the USB port and what Apple might do, and we’ve covered that in episode 250. And also in this episode, with Christopher’s reference to the very open approach that Apple has taken.
It’s turned out this USB-C port is a big deal. I gotta say, with all the peripherals that I can connect without effort to this phone, the USB-C arrival has made this the best upgrade in many many years. I’m absolutely delighted with what I can do with the phone, thanks to the USB-C port. So it’s a big one.
It’s what President Biden in earlier times might have called a BFD.
Next, the iPhone 14 cases will not fit the iPhone 15. People have tried, apparently.
On the subject of battery life, well, now that the phones are out, (and this message was recorded a wee while ago because we’ve been a bit backlogged over the last couple of weeks), they’ve done a teardown and they found that the batteries in the new iPhones have a larger capacity, but Apple is giving the same specs. So it may be that the newer phones just require a bit more power for something that they’re doing. Battery life estimates are the same as before.
Caller: Hi, Jonathan and Living Blindfully community. This is Raz in USA, in Colorado.
You had a great show this week.
I am just thinking this week about a problem that has bothered me off and on for years regarding Braille Screen Input on the iPhone. This is, apparently, not a problem on Android, so I don’t see why it’s an issue on iPhone.
But it’s the fact that when you turn on Braille Screen Input, whether you have screen away mode or you have it in tabletop mode, there are gigantic letters and words in between the braille dots that come up as you’re typing.
And it occurred to me while I was on the bus the other day, (and somebody was staring at me because I have a guide dog and people do that), that they could potentially be seeing everything I’m typing. And that was very distressing to me because I’m extremely low vision, so I don’t have screen curtain turned on on my phone.
I do use my vision as sort of a navigational aid, not really to read anything on my phone. But it is useful to help find things.
And it bothers me a lot that we, as blind people, are losing this privacy.
And when I bring it up with other blind friends, some of them are like, “Wait. What?”, like they didn’t even know this was a thing. [laughs]
And some of them are like, “Well, why is that a problem to turn screen curtain on?”
And I find that screen curtain is not really an acceptable solution because that adds a lot more gestures on my end. It makes using Braille Screen Input much more complicated and difficult than it needs to be. Whereas previously, I could just flick the rotor and turn on Braille Screen Input. Easy peasy.
Except for that part where everybody could see what you’re typing.
And I know folks have brought up, “Well, can’t sighted people look over the shoulder of other sighted people and see what they’re typing?”
No, they really can’t ’cause I’ve talked to people. You can tell when somebody’s looking over your shoulder. Even I can kinda tell ’cause you can sort of feel them hovering back there, like trying to stare at what you’re typing. [laughs]
But with Braille Screen Input, you have your phone facing away from you, kind of held like maybe in the middle of your stomach or by your chest. And it’s this big, moving, flashing sign of letters and words that anybody who just looks at you can see.
I even mentioned offhand to a sighted friend. She was like, “Yeah, I didn’t realize you didn’t know. But yeah, I’m always looking at what you’re typing.”
That was very unsettling to me.
So it occurs to me like it would be nice if Apple would allow an option to turn off that text in the middle of the words. Or at least, to automate a keyboard function in shortcuts so that when you make that rotor gesture to bring up Braille Screen Input, that the screen curtain turns on automatically.
Jonathan: This is something I have never considered because obviously, as a totally blind person, I just leave screen curtain on. So no matter how I get data into my device, no one can see what I’m typing.
I do understand what you’re saying, and I agree. If Apple’s going to add something here, it should be an option.
I don’t think it should be the default because there are sighted professionals who use Braille Screen Input. For example, they might be teaching a student to use Braille Screen Input on an iPhone, and they do need to be able to see the text that’s coming up there.
Also, there may be some people with low vision who actually do want the confirmation of what they’re Brailling into the device, because it can be a very good teaching tool. If you’re teaching yourself Braille, and you’re entering text using Braille Screen Input, getting that immediate confirmation on screen that you’re entering data correctly is beneficial.
But I understand the privacy issue, and I understand why it bothers you.
So maybe in the VoiceOver settings for Braille Screen Input, there could indeed be a toggle – “Automatically toggle screen curtain on during Braille Screen Input.”, or something like that.
I’d be interested in knowing what others think about this. And if you use Braille and have some usable vision, whether it bothers you too.
Voice message: Hi everybody, Jonathan and Living Blindfully listeners. This is Brandt from a warm and windy Johannesburg. I hope you’re doing well.
This is in response to Bryant in episode 249, regarding the HIMS Braille display that he has, a BrailleSense 6, not working with the latest iOS 17.
As of the 18th of September, I can report that my Braille Edge 40 is working perfectly well with iOS 17. In fact, I have zero problems as of this particular recording.
Now for those of you on BrailleSense devices, I do not know what to tell you, except that at the moment, it seems that those of you who have already updated to iOS 17 might be slightly up the creek without the assistance of a paddle. There’s nothing I can do for you, or tell you to do, except if you have not updated to iOS 17 as of yet, don’t. Stay on iOS 16, and let the poor guys that have already done it keep you up to date on what is happening in iOS 17 with their BrailleSense devices.
Now Jonathan, just a quick language lesson for you, sir.
Last week, when you read my letter, my email that I sent you, you grievously mispronounced die Bokke. I do not know what happened there.
But D-I-E in English is “die”. Not true in Afrikaans. It’s pronounced “dee”, the same as in German.
Now, die (D-I-E) Bokke (B-O-K-K-E). The reason why the K is doubled is because if you use one K, the O would lengthen into a long vowel. So die Bokke is what we in Afrikaans would call the Springboks.
Jonathan: I always come away from a Living Blindfully episode learning something.
I understand that there are other Braille displays experiencing issues, and I’m very disappointed to hear this.
Obviously, some of us made a bit of noise because the refactoring that Apple was doing of the HID Braille displays really did break for a while during the cycle.
So the reason why the Brailliant is working well for you, assuming it’s a newer Brailliant, is that it’s a HID display, and a lot of work has been done to sort that out. Not so much with some of the other Braille displays that are experiencing some issues.
I should also say, while we’re talking about Braille and connectivity problems, someone on Mastodon said, “We know you still have a Focus 40 Blue. Could you please check if it works on USB in the same way that your Mantis does?”
And the answer is no, it does not. It gets charged from the phone, so you can charge your Focus 40 Blue through the USB cable, but it doesn’t show up as a Braille display in the way that my Mantis does.
It’s possible that that is related to the connectivity problems that some Braille displays, including the Focus, have been experiencing with iOS 17. So I’m not prepared to say at this stage that the Focus 40 Blue doesn’t work on USB-C because it could be something else going on.
Let’s hope so because it is kind of nice from time to time. First, to get some charge for your display, if you find yourself running flat. And second, to use it over USB.
Caller: Hello, Jonathan! This is Claudia from Tampa, Florida, calling you again.
I was just wondering your opinion on this, and I’m very interested in learning your opinion and the listeners’ opinion as well.
It seems that Apple is really banking on this Vision Pro being a pretty massive success. At least to me, it seems this way because they’re adding the facial video to the 15 Pro and 15 Pro Max.
Very curious to see what you guys’ opinion is on this.
I was very interested to see this.
Jonathan: Nice to hear from you again, Claudia.
Even Apple has finite resources, and you can see the tide changing.
As far as Apple is concerned, Vision Pro and its derivatives are the future. iPhone and iOS are now very mature products.
So you had a grab bag this year of pretty nice features, in my view, but they weren’t outstandingly massive features that would have taken considerable development time. And the reason for that is that a lot of resource, a lot of effort, a lot of thinking is going into Vision Pro.
Along with the car which we still haven’t seen yet, this has been Apple’s big thing for some time now. They believe this is the future. They’re working very hard on it. And indeed, there’s been a bit of cross pollination, in my view.
When you look at some of these LiDAR features that some of us are benefiting from on the Pro models, they are really going to shine on the Vision Pro. And because these iPhones are Pro devices, yes, they want to provide a way for creators to make content that will make the Vision Pro truly come alive. Even if you don’t have one at the time, you can capture a moment, and then play it back in spatial video using Vision Pro.
I think they’re pretty relaxed about the first version of this not being a hot seller. They know the price point is very high. They know it’s going to appeal to a certain type of early adopter with lots of dosh. But it’s going to get cheaper, and the product is going to become more capable over time.
Now, there’s a cornucopia of subjects here (That’s a great word, isn’t it?) from Mike May. But since he starts off on Apple things, we’ll include it here. He said:
“I got my iPhone 15 Pro.
The double tap for VoiceOver is not the same.”
Well, yes. We’re back on this one again. It’s obviously bugging people.
He says: “I have lengthened the delay from 0.25 to 0.35, but it still seems different.”
Yes, it will be. I’ve gone up to about 0.3 seconds, and that seems to be the sweet spot for me. It will vary for people depending on the speed that they double tap. But for me, it’s now completely back to the way it used to be.
He says: “I have been telling people in my talks for 25 years that blind people need a nearby people-finding technology. We are at a social and professional disadvantage in a meeting or social situation to connect with others in the room or bar.
Sendero built an app around 2012 using Bluetooth. If others were running the app, you could get alerts that they were nearby. And then, you could ping each other.
I recently decided to revive that app, so it may be coming back. I have a test version ready to go.
I was thrilled that the new iPhone has the ability to find other users as though they were AirTags for the same purpose.”
I agree with you, Mike. This is a big deal for blind people. And that’s why Bonnie and I have both upgraded this year, so that we can find each other.
It’s going to take a while because there are plenty of people who won’t have this new chip on their phones for some time to come.
But as this rolls out through the iPhone ecosystem and people upgrade over the next few years, this is going to make a big difference. And this is why I’m delighted with the iPhone 15 range this year. With the USB-C and all the peripherals you can work with and this new chip, I think it’s one of the most significant upgrades we’ve had in a while.
Mike continues: “Sendero also built an app called Easy Listening, along the lines of HeardThat. The latency was way too long to be practical.
I started testing HeardThat 4 years ago, when I saw it at CES. I think everyone could use help in noisy social settings.
I sent my complaint to YouTube about the Lyft driver.
I highly recommend the Zetronix” (now, that is spelt Z-E-T-R-O-N-I-X) “Kestrel Pro camera glasses.
I record every interaction with ride share, in case they reject my dog. I have found having a personal body cam is great for other video recordings as well, including documenting aggressive dog behavior and just recording family events.
I am thrilled with the new Aira ability to use screen sharing to see my iPhone screen. I use this to get help labeling buttons. I hope Apple will allow those custom label profiles to be shared from one user to another at some point. I have made about 30 labels on the Bird Buddy app, for example. More about that later.
I prefer the Amazon Fire OS to the Android Voice on the TCL TV. I had both.
We had an Uber driver in Richmond near London with the first name of Mohsen.”
This really is a stream of consciousness. [laughs]
“It’s spelled M-O-H-S-E-N.”
Yes, I’ve seen that.
“Love our soup drinker microwave. Will send a report comparing the OpenComm version 1 and version 2.
Phew! I am caught up.” says Mike.
“Thanks as always for an amazing job.”
Really appreciate that, Mike. Always good to hear from you.
Karen McDonald writes:
“Hello, Jonathan and all,
Since the latest update of iOS 17, I’ve noticed that while using a Braille display, the gesture to show or hide the on-screen keyboard doesn’t work.
Has this option been taken away, or am I missing something?
Also, Apple took away a lot of the classic sounds one can choose for text tones, although they are still available for ringtones. One of those sounds was one I’ve used ever since I’ve had an iPhone, and I haven’t found one yet that I like. Less than happy am I.”, concludes Karen. But she still wishes us peace.
I presume she also wishes Apple peace, but I am not sure.
Thank you, Karen.
I thought that they were all still there under classic, but I haven’t investigated too much because I use my own text tones that I’ve created. So if you can tell us which one you’re missing, we can go ferreting around. Although I’m sure, you’ve done that already.
But under text tones, you should be able to double tap classic, and I’m told they’re all supposed to be there still.
The one thing you can’t do is change the notification back to the tri-tone, which is such a familiar, iconic iPhone sound. I noticed that some apps are still using it, but the apps themselves have to make that decision. We can’t change the default sound.
Carolyn Peat also has comments on the sounds and iOS 17 before we go on to other topics.
“3 topics interested me from a recent episode.”
Well, that’s good. [laughs]
“iOS 17”, she begins.
“I recently updated to the new iOS, and have to say I am disappointed that once again, the old chestnut of notifications has arisen.
The sound that some apps used to make before VoiceOver reads the notification is now very quiet. And if you have a hearing impairment, you would struggle to hear the notification for your Uber, or any of the many food delivery services.
I also don’t like the loss of the sound when pressing the side button to activate Siri. There is now no sound at all.”
And, on to another topic – guide dogs.
“Imogen, my first guide dog, retired back in May, and is now living the good life in New Plymouth.
I wanted to let George know that the best advice I ever got was from other guide dog handlers. They told me it would take a year for things to settle and feel as if they were clicked into place, and this was very true. Also, your dog will test your patience to see what they can get away with, just like having a toddler in the house.
I loved my first experience of having a guide dog, and can’t wait to meet my next one when it is allocated to me.
I agree that it is best to declare your blindness at the interview stage. I call it addressing the elephant in the room. This also makes communication easier if you get the job, when and if you come across an accessibility hurdle you need to overcome.
I did hear a story today where some companies are looking at having candidates interviewed by artificial intelligence (AI) to help them shortlist candidates. I wonder if this will be a positive advancement, or create more accessibility problems for us as a community.”
This area of artificial intelligence in the recruitment process is quite concerning because it can reinforce biases, and you don’t have a chance to challenge them. So certainly, I can tell you from first-hand experience, it is something that recruitment agencies dealing specifically in the disability space are talking to HR professionals about. We know that so many employers out there have misconceptions that we have to constantly work against – that blind people are a health and safety risk, or a productivity risk, or any number of things. At least if you’re dealing with a human being, you can try to tackle those things head on.
If the artificial intelligence is just making those assumptions and you’re never sure why, it really could be a problem. So it’s something that the disability community in general has to be quite aware of, and there have been some articles about this in the last couple of years.
Returning to Apple and iOS things, Shawn is writing in and says:
“On episode 250, you mentioned still using an encrypted iTunes backup for your old phone to restore the new 15 Pro Max.
I was wondering if you could please outline this process. I obviously don’t expect you to erase and start over with the old phone, unless you were already doing that anyway. I’m a little confused as to what to do if I decide on iTunes.”
And Shawn said he’s getting the new iPhone 15 Pro between October the 10th and the 17th.
Thank you for writing in, Shawn.
Even if I wanted to do that laborious task again, I couldn’t because I can tell you that when I get a new iPhone, there is a queue of Mosen Vulture adult children who are quite happy to swoop in and take the old iPhone away.
Actually, for the record, this time, it has gone to Henry, the wonder son-in-law, who’s got an iPhone XS at the moment through this pipeline that we have going here in Mosen land. [laughs]
And that iPhone XS is starting to get a little bit rough around the edges for various reasons. Its Bluetooth is acting up in certain circumstances. So I felt like Henry was the one who would get it this year.
It’s hard for me to keep up, and I really need to put a spreadsheet together, or something, so I know who got what when. Because if you are a parent, you will know this whole, “It’s not fair, dad.” thing that goes on with things like this. And one tries to be scrupulously fair.
But that’s not what you wrote in about, Shawn, is it? So let me just try and talk you through this.
What you would do is connect your phone via a Lightning cable, since you’re coming in on the old iPhone. So, a cable from Lightning to a USB port on your PC. That might be Lightning to USB-A, but you could potentially use the Lightning to USB-C cable that Apple ships if you’ve got a USB-C port in your computer.
The iPhone will ask you, when you cable up to iTunes, if you want to trust this computer. You may need to double tap trust, and also enter your passcode at various points in this process.
When you run the latest version of iTunes, you can tab through until you find the iPhone button, and press the spacebar. And you should then see all the information about your iPhone – the phone number, how much storage is available on that phone.
If you keep tabbing around, there’ll be a backup section. You don’t have to change the radio button’s default, which is to back up to iCloud because it’s convenient. It does it in the background when your phone’s plugged in and charging.
I make regular iPhone backups, just in case.
Even though you’ve got that as your default, you can still keep tabbing through and choose to back up to this computer.
Now, there is a checkbox in there that asks if you want to make an encrypted backup. You want to check that. And then, you’ll be prompted for a password. Once you’ve done this once, then it will keep using the password until you change it. I set that up years and years ago. I’ve used the same password every year when I’ve done these backups, and it just works, so this is a one-off setup process.
Whatever you do, do not forget this password. Nobody can help you. It doesn’t matter how much you beg and plead with Apple. They can’t help you because they don’t know it. It’s securely encrypted on your PC, and there’s nothing anybody can do for you if you forget this password. And it’s case sensitive.
So if you use a good password manager, (1Password is the one I use, and I highly recommend it.), you might want to create a secure note in there that has your iTunes backup password for your device.
And then once you’ve done that, you just hit the Backup button, and it will go ahead and back up. It may take quite some time.
When it’s time to restore to your new iPhone, you will be given the opportunity to restore from your computer.
As part of the setup process, you need to cable up again, this time with a USB-C cable at the iPhone end, of course, ’cause it’s an iPhone 15. You will be asked for the password for your encrypted backup so you can restore it. It will just take a little bit of time, but it will all get done, and you will find that most of your stuff just works without having to enter too many passwords as you do from an iCloud backup.
So I hope that helps. Obviously, I’m not in a position to go through the process now, but that’s a broad overview of how it works. It’s pretty straightforward.
And also, because this is a mainstream kind of issue, there are plenty of articles online about creating an iTunes encrypted backup that you can also reference.
Voice message: Hello, Jonathan and everyone listening to this particular Living Blindfully episode. My name is Kelly Stanfield, and I am a long-time listener, first-time contributor.
I’ve been playing with iOS 17 for going on 48 hours, I guess. I don’t remember now exactly when I downloaded it.
But I’ve run into one thing that is fascinating within the phone app now.
It used to be that you could be in your list of recent calls, double tap on, say, a phone call to Peggy, and it would just dial that same number to call her again.
But now, it just does not do that, even though it ought to.
I have to swipe up or down, either one, to the More Info button, double tap on that, and then swipe through the information I’m given about the previous call until I get to the option to call back whichever number had previously been called, whether it was the iPhone, the phone, whatever. It’ll say something like “call iPhone”. And then, you’re stuck in that contact card info. You have to back out of that later, when you get off the phone.
And I noticed when I would be in the mail app and would want to just mark a message as read, (Okay, I see that it’s here. I’ll come back and read the full of it later.), I’d swipe down to mark as read, and I can’t quite really understand what Alex is telling me he’s done. But it is no longer unread, [laughs] so I could tell it’s actually done what I wanted it to do.
Jonathan: Well, thank you for calling in for the first time, Kelly. Because if nobody called in, where would we be, eh? Where would we be? We’d be listening to me for 2 hours waffling along. That’d be terrible.
Now when I go into the recents list of the phone app, my behavior is the same as it’s always been, which is that whether it’s a call that I’ve made or a call that I have missed that’s been received, if I double tap, it does call back. So I wonder what the difference is between your configuration and mine. This is obviously with an iPhone 15 Pro Max.
Also, when I swipe down to mark as read and I double tap, I just have the message marked as read. I don’t get any feedback. But it could be some setting that I have set differently.
So maybe others can comment on whether the recent calls list is working the way it always has, or whether they’re seeing what you are.
Advertisement: When I finish the show, I’ll be rewarding myself by spending more time playing my favorite game of the moment. It’s called Timecrest.
If you’re old enough to remember the classic text adventures, it’s kind of like that, but way better – with an epic soundtrack and many twists and turns.
In Timecrest, you’re the main character. A young mage named Ash contacts you frantically through a pocket watch, asking for your help. Ash’s world, Alincia, is about to be destroyed by falling meteors. But you demonstrate the ability to save Ash’s world by turning back time.
Chatting with Ash is kind of like texting a friend who needs help. And I must confess, I’ve come to look forward to the text conversations and what’s happening in Ash’s life.
Play Timecrest on your iPhone, iPad, or Apple Watch. Play it for hours at a time. And believe me, I have. Or you can dip in and out when you have a few minutes.
This is such a cool game.
Let’s return to HeardThat – this remarkable app that we’ve been talking about on the show in recent weeks.
Harry Bell is writing in. He says:
“I have been trying out HeardThat.
But to use it with my AirPods Pro 2, I need to be able to mute the microphone on my AirPods Pro, in order that the app can use the microphone at the top of the iPhone. I have researched this, but I cannot find any way to mute the AirPods Pro microphone.
The result is that I get an echo with lag, as the AirPods Pro microphone and the iPhone microphone both process sound and feed it to me.
Do you have any suggestions for this?”
It sounds like something you’d want to talk to HeardThat about directly, Harry. But I’m wondering whether your diagnosis is quite correct. There certainly is an echo because when you talk, it can take a while for the round trip to take place to any Bluetooth device. It’s just enough that it can be kind of disconcerting.
So regardless of the microphone being used, I think what you may be getting is that you’re able still to hear the person talking, or yourself talking. And then, you’re getting HeardThat relaying what it’s getting. And there’s just enough of an echo for it to be a little bit disconcerting.
There’s not much that HeardThat can do about that because it’s just in the nature of Bluetooth. The latency would go right down if you used something wired, but you probably don’t want a wire stretching across a restaurant table, for example.
I believe that even when AirPods are connected, an app can expressly use one of the iPhone’s microphones. And I believe that’s what’s happening in HeardThat’s case – that it is using the correct microphone. It’s doing what it should, but you’re getting that disconcerting echo because you’re wearing AirPods and still hearing in your real ears what’s going on. I may be wrong, but I think that’s what’s happening.
But definitely check in with HeardThat. I’d be interested to know how you get on with the app.
Let’s go to India and hear from Anil. He says:
“I would like to point out that the Next Big Thing contest has a catch. Just below the submit button, it says the following:
This contest is open to all individuals 18 years or older who are residents of the United States, the United Kingdom, Ireland, and Australia.”
No! I won’t be getting my Amazon gift card for my really good idea after all. I should have read the fine print, Anil. I appreciate you letting us know.
That does seem very very disappointing to those Freedom Scientific customers around the world who may have great ideas. We know, for example, that there’s a large number of IT professionals in India brimming with good ideas, and some of those will be blind people using JAWS. So that is a shame.
I wish I had read a bit further down, and I would have pointed this out myself. So thank you so much for letting me know that.
We are mentally transporting ourselves to Australia where Lachlan Thomas has been typing, and he has typed this:
“I’ve listened to at least some of your discussion about the Lyft driver who made YouTube videos about refusing passengers with service animals. I agree that this driver is in the wrong, and he is spreading misinformation about dog guides.
Not being a dog guide user myself, this is something I’ve obviously not experienced. But I have friends who have had these problems.
I was particularly disturbed by Bonnie’s comment about drivers refusing cane users. I’m a cane user. I love my cane. I wouldn’t go anywhere without it because I depend on it. If I was refused a taxi because I use a cane, I would be livid with rage.
I don’t have a lot to contribute to this discussion. But one point I don’t think has been raised in your discussions is the driver’s nationality. I realize this is a very controversial topic, and I’m not too sure if I should bring this up or not. The man who created the video appears to be of African-American or Hispanic-American descent.
It is very well known that African-American and Hispanic-American people have faced extreme discrimination through history. Imagine if this man was a ride share service client and was refused a ride because of his race. Imagine how outraged he’d be. He would be complaining, and there would be a massive uproar.
And yet, he is condemning discrimination of those who use service dogs. Do you find this ironic? I do.
I don’t want to bring race or nationality into the discussion if I don’t have to, but I thought it worth mentioning this.”
Thanks very much, Lachlan.
And yes, in episode 249, we did have a listener cover this, and your email came through before we published that episode, hence the overlap. But it’s a point worth making.
And I guess I make the same comment that I did back in 249, which is that it is so sad that those of us who experience discrimination are not necessarily immune from dishing it out to others who are discriminated against for different reasons.
Lena continues with this topic, and she says:
“If Google gets enough complaints about the content in the 2 videos you mentioned, Google will intervene. I have complained.
In iOS, go to his channel, find the offensive video, but do not activate it. Swipe up to options, double tap. Swipe right to report, double tap. Double tap on the choice that applies. There are 6. Double tap on report.
Guide dog users are not the only visibly disabled people who are encountering refusals from ride share drivers.
I still enjoy the show.”, concludes Lena.
Phew! That’s a relief.
Thank you very much for writing in again, Lena.
I have actually complained using their legal form on the basis that he is advocating the violation of multiple laws in multiple countries, and really haven’t got anywhere with that process. So if people do try the more conventional reporting technique, it’ll be interesting to see whether anything happens.
You might remember that a while ago, we spoke on the show with Dan Clark from Focusrite.
We talked about the Focusrite VoCaster 2, which we’ve demoed on the show. That is a great little audio interface for podcasters. Very accessible, and it’s got some powerful features on there.
And we also talked about the accessibility issues that were plaguing the Scarlett range.
Dan has been in touch to let me know (and I have been following this already) that the 4th generation Scarlett Solo, 2i2, and 4i4 have been released.
Now, there are none of the bigger models in the Scarlett range available in 4th generation at this point. But you’d have to say that if they’re releasing these 4th generation Scarletts, they will get there eventually. No confirmation of that, though.
These new devices come with a raft of improvements around screen reader support, and Dan wanted to make sure that I knew fully what’s going on so I could tell you, and I’m glad to do that.
“The new Scarletts are supported by the brand new Focusrite Control 2. This software was built to be completely accessible from the ground up. As preamp controls including gain are digitally controlled on the 2i2 and the 4i4, these, too, can be controlled from the software.
Every element of the software should work with your screen reader.”, says Dan. “But if we find we are falling short in any area, then we’ll be looking to make improvements.
One improvement we’re about to release is having the software react to interactions with the physical hardware, so that it switches focus and announces the current state.
Aside from the new 4th generation, we’ve also made a series of updates to Focusrite Control 1, which is used with 3rd generation Scarletts. Although not fully accessible because there are simply some things which are impossible to fix, we have made a series of improvements, such as the line/inst switch, pad, and air for preamps; pan, level, mute, and solo for mixer channels; and mute, dim, stereo switch, input source, and level for output controls.
Unfortunately, the following items still have some accessibility challenges. Editing mixer channels input/output name labels is not possible. Adding a channel to a mix is possible, but has user experience challenges around focus order and navigation. Selection of target output mix requires prior knowledge to know to navigate to the target, and click to select.
We’re looking at creating a support article for this.”
Dan continues, “Although Focusrite Control 1 for 3rd generation Scarlett will never fully get to an ideal situation, I’m delighted with where we are with the 4th generation and Focusrite Control 2, and I’m extremely hopeful for the future.”
Thank you very much, Dan, for keeping us updated. It’s great to see Focusrite responding in this way.
When I got my Focusrite Scarlett 8i6 3rd generation device, it was completely inaccessible in terms of the software, and there were challenges around that.
I have subsequently migrated to another product which is fully accessible at the moment, but that was mainly because of the accessibility challenges that showed no sign at that stage of being resolved.
The Focusrite brand is great. The customer service is exceptional. They’re good people.
And now that we have these accessible options emerging with the 4th generation, and even some improvements now with the 3rd, they certainly should be in contention when anybody is considering an audio interface.
Voiceover: Stay informed about Living Blindfully by joining our announcements email list. You’ll receive a maximum of a couple of emails a week, and you’ll be the first to learn about upcoming shows and how to have your say. You can opt out anytime you want.
Join today by sending a blank email to announcements-subscribe@LivingBlindfully.com.
Why not join now? That’s announcements-subscribe@LivingBlindfully.com, and be in the know.
Xavier Fucci (hope I pronounced that correctly) writes:
“Hi all Living Blindfully listeners,
Just a note of concern.
Humanware has been especially lazy with updating their various notetaker devices.
Just recently, an update came out. If you still haven’t updated, DO NOT DO IT.” (written in block capitals)
“It’ll brick your productivity. It messes with an encryption part of the OS and makes Google Docs unusable. Some apps that rely on the Android keystore (the part that manages encryption) will sign you out upon restart.
Also, due to Humanware’s laziness, they will never fix it. So PSA for you all, don’t do it.
I shouldn’t have to do this, but the update blatantly lied, saying there were improvements, and Humanware hasn’t put this through QA.
Clearly, something needs to change with them before the BrailleNote becomes a FailNote. Yes, that’s my official name for the thing.”
In case that isn’t clear to all listeners, the email is referring to the BrailleNote Touch. Clearly one unhappy customer there. I don’t have a BrailleNote Touch to try.
So how’s it working out for you? Have you installed the update recently? Is it working as you hoped? Or are there widespread problems that others are seeing as well?
You’re always welcome to get in touch. opinion@LivingBlindfully.com. And you can call the listener line in the US: 864-60-Mosen. That’s 864-606-6736.
We’re returning to the important subject of accessible household appliances.
Jane Kerona says:
“Hi, Jonathan and Living Blindfully listeners,
It’s Jane Kerona from Silver Spring, Maryland in the US.
Someone on episode 246 was asking about accessible refrigerators and being able to access water as opposed to ice.
Although many features of my Whirlpool refrigerator aren’t accessible and it drives me crazy, this model does have different spigots on the door for ice and water.
I have mine connected to a reverse osmosis filtration system, so that my ice and water are made with purified water, and I love this feature.
The thing about this refrigerator that infuriates me is the touch screen on the door that allows you to set the temperature of the refrigerator and freezer, as well as choosing between large cubes, small cubes, and crushed ice. I think there’s also a light that you can control with that panel. If you get your fingers anywhere near any of these controls, they change without any notification – no beeps, nothing. Every so often, I have to get an Aira agent to help me reset my refrigerator.
I got so frustrated, that I duct taped a board over the touch panel so I’d be less likely to change the settings inadvertently.
Things are better now, but I still resent my refrigerator.
Unfortunately, I didn’t write down the model number, so I can’t share that. Maybe that’ll be my first task with the Be My AI, which I got approved for last week.”
Well, hope you enjoy that, Jane. Do let us know how you get on with it.
This is a really important and frustrating topic, isn’t it? – This whole question of accessible appliances. And it makes shopping for them so convoluted, because there are many things that a blind person has to take into consideration when making that purchase.
And here in New Zealand, Jeanie Willis writes:
“In response to the person who asked about an accessible fridge with an ice dispenser, I have a Samsung twin-door fridge with an ice and water dispenser on the front of the freezer door. Don’t know the model, but it is likely they won’t be the same in the US anyway.
But the important points are that it has an individual button for each of the 3 options – water, ice cubes, and crushed ice, and makes a clear beep when one of these is pressed. So it doesn’t matter that they are a flat touch panel, that you wouldn’t know they are there by touch. There are no other buttons, just those three, and the place they are is clear by the shape of the panel above the dispenser.
So my advice would be to anyone wanting something like this, to ask in store for a model that has individual buttons for each thing, no buttons near it that could be accidentally pressed, and crucially, that the buttons make a sound when activated.”
Alco Canfield says:
“I have a Frigidaire which has 2 spigots in front – one for ice on the left, and one for water on the right. I also have put a mark where you can get either crushed or cubed ice.
I hope this helps.
As for Braille, I love the new and improved Kindle. However, when I read, suddenly I am thrown back a few lines and have to scroll down to get my place. This is annoying, and I hope will be fixed in 17.
I will probably make another call to Amazon. I wonder if they will pay me to quit bothering them.”
Well, they certainly have the profit to do that, Alco, because mate, my Amazon shares are beautiful right now. I’m very happy with my Amazon shares. [laughs]
“Thank you so much for everything you do.”, says Alco.
Well, thank you for writing in and making it worthwhile.
“Hi, Jonathan,” writes Mohammed.
“I hope this email finds you and the Living Blindfully community well.
I’m still listening to your podcast and have in fact, been a plus member for a couple of months, since I think you’re doing a fantastic job informing the community of what’s going on in terms of tech, society, and our place within it. Keep up the great work.”
I’m extremely grateful. Thank you so much for your subscription, Mohammed.
“Now, I’m writing about a couple of subjects, namely the All Terrain Cane and Braille.
Let’s start with the All Terrain Cane. I was intrigued by this product, since I like to go out and spend a little time in nature.
My parents are from Morocco, and that is a mountainous country where a normal cane just wouldn’t serve as well. At least, outside the cities. The cities present another problem, but we can perhaps talk about that another time.
Ever since I lost my sight in 2018, I’ve been wondering how to navigate places like that on my own. Truth be told, even before 2018, that was challenging, as I didn’t see very well and was likely to step into holes or the embrace of cacti and their prickly brethren. I was, however, able to navigate on my own on the beach and in other sandier, flatter areas.
That, however, became much harder without my sight. The turning mechanism of my cane would very quickly become choked with sand, which was very hard to remove. I gave up at a certain point, just relying on people I was with at the time.
So I heard about the All Terrain Cane and decided to buy it.
After a healthy amount of procrastination, I went to the website and ordered it. It arrived just a couple of days ago.
I am about to get married and go on a honeymoon soon, where I hope to try out the cane in earnest, so I’m really glad I got the thing in time, even though it would have been entirely my own fault if I hadn’t.
The cane costs $120. And together with shipping fees and taxes (both taxes that I paid when buying the cane, and import dues when the cane arrived), it set me back about €200 in total.
It arrived in a relatively small package with the cane folded up and secured using some sort of stiff, strong wire. It was relatively easy to remove using scissors.
I decided to take it out for a spin. I’ve recently bought a new home in an area that’s still being developed, so there are plenty of sandy paths and dirt roads to give the cane a quick try.
As I said, I hope to try it out more thoroughly in Tanzania, where I’ll be going. I have no idea if I’ll have the opportunity yet, but we’ll see.
The first thing you notice as you lift the cane up is its weight. This thing is built like a tank.
As I walked down the street on my way to a dirt path, I noticed that I had to use my wrist a lot more and apply a lot more force to move the cane around than I would normally do. You get used to it quite quickly, but it can feel a little jarring in the beginning.
The second thing you notice is the sound it makes on pavements. The All Terrain Cane is very loud, and I mean very loud. I mean, people confusedly trying to figure out why a freight train would wander off its tracks and wind its way through a residential area loud.
So after thundering and clattering my way down the street and finding the dirt path, the noise ceased, and the cane really came into its own.
I’ve not felt this secure walking on anything but pavement since I lost my sight. The cane kept moving smoothly. I could feel all the bumps and depressions in the path, and was able to anticipate them. I steadily got more enthusiastic as time passed.
Using the All Terrain Cane felt like wandering down a particularly bumpy street. The mechanism kept working. The cane almost never got stuck. It’s amazing! I don’t use terms like that lightly, but it really is very good at what it does.
I have yet to try it in really challenging terrain, but I’m confident and hopeful it’ll hold its own there too, as I’ve heard others say that it works very well in places like Colorado and Arizona which have terrain that’s a lot rougher than what’s available in my direct neighborhood.
I wouldn’t use the cane on a regular street. It just makes way too much noise.
I live in the Netherlands, where many people travel and commute using bicycles. You would not hear them if you used this cane, and that can lead to dangerous circumstances.
I will be taking it with me on any trip I make though, so that if I find a good place to hike a little, or go to the beach, or any of the other things you can do in places that don’t have paved roads, I’ll have the option to navigate confidently and independently.
I do wish there were a carrying bag of some sort for it, so I can put it somewhere when I’m not using it, and clip it to a backpack, or my belt or something.
I’m very happy with my purchase, and will let you know how it performs in rougher areas.
Now on to Braille. At the beginning of this summer, I decided to finally stop messing around and start learning it in earnest.
I work at Vispero as part of the JAWS development team. And at that time, we started seeing the first hints of Split Braille, which was mentioned in FSCast episode 234 that came out in August. By the time you hear this on Living Blindfully, the betas of JAWS and Fusion should have come out, and the audience will have an opportunity to play with both Split Braille and Face in View, which I think is another fantastic addition to the product.
Split Braille was nowhere near ready for primetime at that point. It needed a lot more work before we could finally release it to the public.
But I immediately saw the potential, especially since I was used to the concept of having multiple windows in view from my sighted days. I couldn’t see well, but I used ZoomText and utilized the ability to have multiple windows side by side regularly.
I was already considering putting a bit more time into learning Braille. I was learning at that point, but slowly. It had no priority because I was able to do my work with speech just fine.
Split Braille pushed me over the edge. I was heavily involved with the team working on it, although I didn’t write any code for it myself. I immediately grasped the potential of this feature, and I took action within the week.
I found myself a Braille teacher, and got to work. I’m now passable at reading Grade 1 Literary Braille and Computer Braille. Goes to show what a bit of motivation can do.
As I was learning, I realized that reading Braille felt like, well, reading. It felt like reading in a way that listening to speech never does, and never will.
I’m still very slow at reading, and speech is still my primary way of consuming written content. But I’ve only been at it fully for 3 months, so speed will come, I’m sure.
It’ll be good to be able to read something without having to pop my AirPods into my ears and visibly shut myself out of my surroundings.
I even sometimes take a moment during meetings to read my notes in Braille while still listening to the meeting. I used to do that all the time when I was sighted. It’s great to have that back.
Anyway, I don’t know if I’ll ever be as good at it as some of my colleagues, who will happily remind me that they’ve been reading Braille for much, much longer than I’ve been alive. But it’s already becoming a useful tool. Add Split Braille to the mix, and it gets harder and harder to argue against putting serious effort into Braille skills. I, for one, am convinced.
Though I am an employee of Vispero, this story is entirely my own. I’m not speaking on behalf of Vispero.
That’s also why I didn’t get into the details of the Split Braille feature. There’ll be a write-up on the JAWS or Fusion beta page, where you can read all the ins and outs.
I’m looking forward to what the future brings, be it stuff we’re working on at Vispero, or products like the All Terrain Cane that are a great addition to blindness life.”
First and foremost, congratulations! I always used to say to people, marriage is a wonderful institution. But don’t forget, blind people struggled to get out of institutions for years. [laughs] I hope you enjoy married life, and congratulations on taking that plunge, and also for the work that you’re doing on Braille.
I’ve been working with Split Braille for a wee while with the private builds. It’s good to see it out there in public now. It is a really great step forward for Braille.
It’s tempting me enormously to think about getting an 80-cell Braille display in the studio. That’s how game-changing it is.
Advertisement: Thanks to Aira for sponsoring Living Blindfully.
Aira may be available on more devices than you realize. Trained agents are available on the devices that you use every day. For example, you can use the Explorer app on your Android or Apple smartphone.
The apps always supported the rear camera, of course. But the latest version of the apps also support the front-facing camera. And that means you can even get help to take your next great selfie for uploading to social media.
And regular Living Blindfully listeners will be very familiar with the Envision smart glasses. If you own a pair of those, you can use Aira hands-free to assist with a variety of tasks, including guiding you through unfamiliar locations. If you’re doing a bit of travel again, it’s a great alternative to waiting for meet and assist at an airport.
Aira is also on the BlindShell Classic, so you’ve got access to your phone at the touch of a physical button. And it’s on your PC and Mac as well, which makes it always within reach when you come across an inaccessible website, or you just want to speed up the process when you’re on a busy website and time is of the essence. Aira is there on your device, on your terms.
To learn more, you can visit Aira at their website at Aira.io. That’s A-I-R-A.I-O.
Dennis Long says:
“I have a small corner desk.
Is there any small mixer and microphone that will work for my situation? I mainly want the phone audio to be clearly understood along with myself.”
Dennis, I’m not sure that a mixer is the right tool for this. I guess it’s not clear to me whether you want to record on the phone as well, or whether you’re going to record on a computer.
It’d be much simpler if it’s the latter, if you’re going to record on a PC or a Mac. Because in that situation, what I would suggest you do is you get the Focusrite VoCaster 2. That has the ability via Bluetooth or a cable to connect your iPhone to it.
Now, I would recommend using the cable because latency is better. And actually, audio quality is better. It’s a pretty simple cable that you can pick up from Amazon that will connect from your VoCaster 2 to your phone. There are even cables that have a Lightning connector at the other end.
Although if you’re upgrading to a new iPhone 15, you’ll want a USB-C cable instead of a Lightning cable. Or you can buy a standard cable and put a good old dongle on the other end.
So you plug that into your phone, then there is an XLR input in the audio interface, so you plug in a reasonable microphone. Now, there are actually some very good quality mics for a good price these days, since podcasting is so popular.
Models that come to mind include the Samson Q9U. That’s a really nice mic, and it does double-duty because it also does USB as well as XLR. XLR is the more professional standard that microphones use for plugging into things.
There’s also the Rode PodMic at a pretty good price.
There are quite a few good quality microphones at a reasonable price point these days.
Once you’ve got that all set up and you’re connected to a PC or a Mac, you’ll be able to record you in one track. So that’s your audio in one track, and then the phone audio in another.
So when you’re using a tool like Reaper, you’ll be able to make sure that levels are even between you and the phone. You can do that after you record.
So if you play it back and you find that, “Oh man, my phone’s a bit louder than me.” or whatever, you can fix that up in post-production. That’s one of the really cool things about doing this on a Mac or PC and recording your phone that way.
But in this day and age, a mixer would be overkill for that. And it actually might not give you as good a result as this because you can fix things up, you can equalize, you can adjust after you’ve done the recording.
Paul Hopewell writes:
“Many thanks for your excellent Living Blindfully Plus podcast, which I have subscribed to from its start.”
I really appreciate that, Paul. Thank you for helping to keep it viable.
“I recall that a while back, you recommended a trouble-free accessible iOS app, which I think blocked advertisements on Safari and also let you specify your cookie preferences once, which the app then applied to all websites. If it does all this, it is an awesome app, so I hope that I remembered correctly.
Can you please remind me what the app is called?
Many thanks, and keep up the great work.”
The one I’m using, Paul, is called WIPR. It’s spelled W-I-P-R.
I think there are a couple of tiers of this. And with the big one, when you buy the whole package, WIPR is, I believe, supposed to let you specify your cookie preferences. This is a big deal in countries where European data privacy regulations are in effect, because you go to a website and you get this “We use cookies” prompt, and on and on it goes.
There’s another one called Super Agent for Safari, and my understanding is that that app is specifically designed to deal with that very issue.
So you specify your cookie preferences there. It applies them without bugging you.
WIPR is a very good ad blocker, and it can hide certain social media widgets that are of no interest to you but just create clutter, which is a big deal for a screen reader user. But if it doesn’t do what you want with cookie prompts, you might want to check out Super Agent for Safari and see if that does the job for you as well.
It’s Eden. We haven’t heard from Eden for a while.
And she says:
“I have a problem, and I know you have the answer, being an audio guy.”
Oh, man. No pressure. No pressure, Eden.
She says, “I’m getting close to exit speeds with court reporting school. I plan to work with Zoom.
One of the things I need to do is use an app to record the audio from Zoom. Having Zoom on the computer seems to make that impossible.
I can sort of do it with my Yamaha AG03 mixer. But sometimes, it does not work. And it only works when I use this app that is miserable accessibility-wise called Voicemeeter.
If someone sets it up and nothing accidentally gets bumped or you don’t want to change anything, it’s fine. If it gets messed up, too bad. So sad for you.
What mixer or situation would you suggest?
Here’s what I need to do.
I switched from stenography to voice writing. So basically, I’m listening to a live court or deposition. And then at the same time, I’m re-speaking it.
This works better than AI trying to handle a multitude of voices, as Dragon just has to be trained on my voice.
Anyway, my CAT software (Computer Aided Transcription) records my Dragon track in one file from my headset mic. The Zoom audio needs to go on the other track.
The reason for this is that they can sync, i.e. my voice writing with the Zoom audio. That way, if I have a real muck up with Dragon, then I can hear the audio.
In live settings, it would be easy. My voice writing would then be done with a soundproofing mask that has a mic in it. Then, I would just have mics set up.
Just FYI (for your information), just like stenography, we have to get to 225 words per minute. But I think it’s just as accurate as writing steno if you have someone who really learns their craft.
I will be testing for my CVR (Certified Voice Writer) in November, after 4 months of studying voice.
After a year of writing steno, my hands were hurting, and I was barely halfway there.
Anyway, I thought I’d ask you and the listeners what apps I should use to record Zoom from another device, and have it be able to be recorded by a piece of software. It can be a hardware or software solution.
Sorry for the lengthy information about what I’m doing, but I thought some people might be interested. Good money can be made in the career, and especially if you train in voice. It doesn’t take long.
The only sad thing is, I was all eager to use J-Say. However, because of the way my program interacts with Dragon, it refused to work with the script.
Thankfully, my information all comes up in Braille.
So anyway, whatever help you can provide is great.
Also, if anyone is interested in court or deposition work, or even learning to edit said reports, I’d be happy to talk to anyone interested in either voice or steno.
Keep up the good work with the show.
Glad iOS 17 is pretty stable. I’ve been running the beta. I have not had any issues with Braille. I have a BrailleSense, though. I am relieved because I use Braille, I’d say, 95% of the time.
I also just tried an ActiveBraille. I wanted to like it. I really did.
The ATC is not conducive to my reading style. I did not realize until I tried it how I read. It was not working, and I realized my hands barely move. My left hand reads the left part, and my right hand finishes up. No hand ever leaves the display.
It was suggested the ATC would work better if I just took my hand off for a second, but I unfortunately don’t think I can change a reading style I’ve used since I was 3 years old. Plus, I know some people like the keyboard a lot, but my hands just don’t fall on it.
It seems to be a quality device, but I don’t know what my next Braille device will be. I was interested in the Orbit Slate, but never heard more about it.
I am interested in the Monarch. Maybe my new career will allow me to afford it.
Just FYI, in some states, the base salary for reporters can go up to over $100,000. Now, that’s not every state and every reporter. But yeah, maybe then I can afford my serious addiction to tech, especially Braille devices.”
Thanks very much for writing in, Eden.
This problem is solvable with a bit of hardware – an audio interface that supports loopback. What that will mean is that you’d be able to use a tool like Reaper to record your audio on one track.
You would set that to record from the microphone input of the audio interface. And then on the second track, you would have loopback selected as your input, which would record from the output of your sound card that’s got Zoom going to it. It’s essentially doing in hardware what you’ve been trying to do with inaccessible software.
There are a few fairly low cost options for doing this. The VoCaster 2 from Focusrite immediately comes to mind.
And in the Living Blindfully archives and also over on the Blind PodMaker feed, you will find a review and demonstration of the VoCaster 2. It’ll be perfect for a situation like this.
And you can even add a little bit of audio compression into your microphone to make it sound very nice, if you really wanted to do that.
It’s a great little device, not too expensive, and ideal for what you want to do.
Also, Audient do some great products with loopback, including the Evo 4, and that would work as well.
Audient has some fantastic accessibility things going on with their hardware, so it would be worth checking that out, too.
But the VoCaster 2 or the Evo 4, set that up correctly and you will absolutely be fine with what you want to do. Zoom will be completely isolated on one track. Your audio will be completely isolated on the other. You’ll be good to go.
Voiceover: If you’re a member of Living Blindfully plus, thanks for helping to keep the podcast viable.
If you haven’t yet subscribed, why not do it today?
Get access to episodes 3 full days ahead of their release to the public, you’ll get advance notice of some of our interviews so you can have a say in what we ask, and you’ll help keep the podcast viable by helping to fund the team to do their work.
Our guarantee to you is that everyone who works on the podcast is blind or low vision, so we’re keeping it in the community.
Find out more. Visit LivingBlindfully.com/plus. That’s LivingBlindfully.com/P-L-U-S.
Pay what you can. It all helps. Thanks for your support of Living Blindfully Plus.
It is remarkable to me that the Beatles, despite these changing times, have a song for every occasion.
For instance, when I received this email today, I immediately thought of that Beatles classic. You know the one.
[Jonathan sings with piano music]
Hey dude, this phone’s so bad.
Take a Samsung and make it better.
Ashutosh is not happy. Not happy at all. And writes in to say:
“I would like to know your detailed opinion about Samsung inaccessibility.
I’m using a Samsung Galaxy A33 5G phone.
It has many accessibility issues in Samsung’s main apps, such as the Samsung Members app, My Files, Galaxy app, and many others.
Even Samsung TalkBack has many bugs. For example, it randomly speaks only 2 or 3 lines of a paragraph, and the rest of the lines are not spoken until I restart TalkBack. Samsung TalkBack cannot increase or decrease the volume of WhatsApp’s audio messages, but Google TalkBack can. It speaks the wrong sender name in WhatsApp groups, and the wrong follower name in the Twitter app.
In my country, India, there is no Samsung Accessibility Team support.
However, I found the email address of the Samsung USA Accessibility Team under the contact details for Samsung TalkBack in the Samsung Galaxy store.
I’ve emailed them many times, but I’ve not received a response.
Could you please let me know if there’s a Samsung Accessibility Team social media group, or an email list, or Google group, like NVDA and Google have? I’m asking because I’ve not received a response from Samsung India or the Samsung USA subsidiary.”
I’m sorry to hear you’re having that issue.
It’s not something I know a lot about. I’m not really in the Samsung ecosystem. I own a Galaxy S21, but I haven’t used it a lot, I must say.
I just don’t find the experience as fluid as working with my iPhone. And obviously, the lack of hearing aid support for my particular hearing aids, and the Braille issues that are ongoing mean that it’s not really a starter for me.
So if anyone has any experience with the trick to getting good quality support from Samsung, or can verify the bugs that are being reported here, do let us know. I’m sure people will appreciate that.
Voice message: Hello, Jonathan and Living Blindfully listeners.
The podcast is fantastic as usual.
Jonathan, a couple of years ago, you kindly helped me with getting my iCloud drive onto my laptop, which uses Windows.
I’m currently using Windows 11. And about 3 weeks ago, for some reason, without any notification, it stopped syncing.
Just wondering if you would have any idea why that would be.
Whenever I try to log in, it says, “We encountered an error. We encountered an error.”
Having spoken to both Apple, who had really no interest or idea in helping, and also Microsoft, who oppositely (I’m not sure if that’s a word), but oppositely, tried to help as much as they could. Unfortunately, though, they were unable to.
Just wondering if you would have any thoughts or advice on how I can start syncing again.
Regards to all, and a fantastic show as always.
Jonathan: Thank you, Michael. That’s Michael Pantelidis with that one from Australia. See, I never forget a face, Michael. I never forget a face.
If Microsoft’s trying to help you out, … And they are very helpful on that disability answer desk, aren’t they? They really are super. If they haven’t been able to help, I don’t know whether I can suggest anything that you haven’t tried.
But what I would do in a situation like this is completely uninstall iCloud for Windows, go into various file folders, and check that all remnants of iCloud for Windows are gone from the file folders. If it’s possible to find out what’s in the registry, delete that too, if you can. That’s very tricky. Don’t do that without some kind of expert assistance. [laughs]
Sometimes, when a program leaves a lot behind in the registry, you can find online a tool that will remove the right remnants from the registry. So that may or may not exist.
And this is the sort of thing really that Apple should be helping you with. I mean, I know they get a bit funny about Windows. But this is an Apple product that is broken, so they should be helping a bit better than it sounds like they are.
Then I would do a complete reinstall.
One option that might be worth a try is if you have system restore points being saved on your computer, you could try and see if there’s a system restore point that takes you back to when the syncing was working. I think it’s unlikely that will fix it, but it’s worth a try if you really have to.
Failing that, I mean, if this is not some sort of general problem and it really bothers you, you could do a complete new install of Windows. That’s an incredibly drastic thing to do.
If anyone else has had this experience – when your iCloud drive is working on a Windows computer and suddenly it isn’t – perhaps they can provide you with some tips on what solved it.
Very best of luck.
Mike Bullis is back.
This time, he says:
“I recently purchased a Samsung S95C series television, and coupled it with a Sonos Arc. I’m having a couple of issues, and wonder if any of your listeners know what might be going on.
First, on the Samsung audio menu, the audio description button is disabled. There is some helpful verbiage that says audio description is only available from certain material. The button can’t be enabled.”
So Mike, I’ll just stop there and say that in New Zealand, we’ve got audio description on our free to air TV. And the audio description certainly works at that point, but it doesn’t work anywhere else.
So if you want to use audio description on things like Netflix or third party services, then they need to be activated within their respective apps.
I’m not sure if that’s helpful or not.
“I am trying to use the apps installed with the Samsung, rather than plugging in my Apple TV and accessing them that way. I may change my mind about that because the Netflix app provided by Samsung doesn’t talk, and that’s where a large amount of audio description is available. The Hulu app, on the other hand, does seem to work better through the Samsung than it does with Apple TV.
So as with so many other things in life, I will piece together a solution through trial and error, and advice. Much of blindness”, says Mike, “has always been like that for me – ask the blind community, experiment on my own, and cobble together something that works.”
“What?” exclaims Mike. “You want a life where you go to the store, buy a new TV, bring it home, and set it up and voila, it works right out of the box? That world, regrettably, simply doesn’t exist yet for us.
While many companies strive to provide accessibility, it is still a long and frustrating struggle to use their products.
When I got my first iPhone back in 2009, I simply couldn’t figure it out. All I wanted to do was make a simple phone call. But no! I needed to learn gestures, and taps, and swipes. Darn it! I just wanted to make a phone call.
Now, I love my iPhone and even keep an older spare around to use as a dedicated book reader. But the beginning of this relationship was ugly.
So back to the TV. Any advice anyone has is welcome. Maybe I will soon love my TV.
I must say, this TV is 55 inches and so thin, it weighs almost nothing. One could pitch it through a window without much strain at all.”
Don’t do it, Mike. Don’t do it.
“But not yet,” he says.
I’m pleased to hear that.
Good luck, Mike.
And you are so right. You make a very astute observation. We need so many tools in the toolbox. And I guess, this is where a podcast like Living Blindfully can be so helpful.
But regarding your audio description thing, I think you’ll find, if you go into Hulu for example, that under the language options when you’re watching a movie or a TV show, there’s the ability to switch audio description on, and that’s where you do it.
The audio description feature on your TV itself will be applicable only to whatever service that you watch through on your TV.
And I know things are a bit different in the US, but I presume you have digital free-to-air over there, and that’s probably what it applies to.
I wonder if the speech on your TV is any better than the speech on ours. I really wish we could get Eloquence, or Vocalizer or something on that TV. That’s my one biggest criticism of it.
Hope you enjoy the Arc, though. It is really nice.
We’ve got the Arc, we’ve got the Sub, and the Era 300s as Dolby surrounds. When you get immersed in a good movie in Atmos, or a good bit of music in Atmos, it’s quite glorious.
Advertisement: Transcripts of Living Blindfully are brought to you by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at PneumaSolutions.com. That’s P-N-E-U-M-A solutions dot com.
Here’s a good question from Daniel Crone, from whom we haven’t heard for a very long time. He says:
“Where may one read show notes for a particular podcast?
And how may one write to someone who talked about the Atari emulator software last week?”
Alright, I’ll answer the second part first.
That was Joe Norton, Daniel. Joe, if you’re out there and you don’t mind me passing on your email to Daniel, let me know and I’ll try and connect you guys.
I try not to do this too often. Otherwise, I’ll spend all my time, my copious amounts of free time [laughs], playing a matchmaking service.
So if you did have a question about the Atari emulator, Daniel, you’re also welcome to send it in here ’cause it may be a question that other listeners would like the answer to as well.
Now, let’s talk about show notes because from time to time, in most podcasts, the podcast host, …
What? You mean there are other podcasts?
Apparently, there are other podcasts. What?
Anyway, the podcast host will make reference to the show notes. How you access those show notes will depend on several things.
If you are listening using a podcast app, be that on your PC or your smartphone, or you’re using the Victor Reader Stream or the SensePlayer, something like that, normally, there will be a way to get to the show notes.
I don’t own either a SensePlayer or a Victor Reader Stream, so perhaps somebody can chime in and let us know how you get to the show notes on either device.
There is no browser built into either device. So sometimes, show notes will have hyperlinks to things that we’ve talked about, or places that you can go for more information. And if you’re not using a tool that offers a browser, you won’t be able to take advantage of that. But you should be able to just read the show notes, which will tell you (in the case of Living Blindfully) what’s on, and the time in the file that you can hear that thing.
If you’re using a podcast app on your smartphone, or your PC, or your Mac, then the show notes are in there somewhere.
For example, in Overcast which is the one that I’m using at the moment on my iPhone, when you’re going through your episodes, some of the show notes are spoken automatically. If you’ve got hints enabled in Overcast, then you’ll hear some of the show notes.
But to read the full show notes, you can flick down on the actions rotor to info, double tap, and the info will be there, and the hyperlinks exist, and you can do all sorts of magic with the show notes.
When all else fails, or just as another option, you can also go to LivingBlindfully.com. And when the episodes are published there, you’ll see the full show notes.
There is a very cool feature of the Living Blindfully website that we integrated when we built the Living Blindfully website early this year.
As I say, we do put in the show notes what is on, and the time in the file that it occurs. Now, if you get your screen reader’s cursor on the web (be it browse mode or virtual cursor mode) on the time and you press enter, the player built into the Living Blindfully website will start playing from that time point that you pressed enter on.
And of course, because it’s a website, everything’s hyperlinked that I choose to hyperlink in the show notes.
There’s a very easy way to get to the show notes on the Living Blindfully website. You go to LivingBlindfully.com, type a slash, and then the episode number.
So for example, if you wanted to go back and hear the iPhone recap, you just go to LivingBlindfully.com/248, and voila! That’s actual French. It will come right up.
So I hope that’s a little bit of help, Daniel, on how to access the show notes.
It will vary from client to client, so I can’t give specific instructions for every client. Any good podcast client worth its salt will display the show notes to you.
Here’s one that’s never too far away from the podcast – more talk about ableist language.
And Ioana is writing in. She says:
“After hearing the interview with the writer of Look at it This Way,” who was of course, Sammy Sweet Spirit, “I decided to send my reflections about ableist language.
Thanks to you and your podcast, I have become much more aware about use of negative connotations of blindness-related words. But they also made me reflect on something I have not heard mentioned much in this conversation.
If we, rightly, object to people equating blindness with ignorance, being clueless, etc., we should at least examine the positive bias of equating sight with understanding, which we all casually use with no problem. When we substitute seeing with understanding, being insightful, etc., it invites, by pure logic, the opposite metaphors.
Does it mean that one should avoid the positive ones as well?
I don’t think so. I rather like the richness of using such metaphors, and avoiding them would feel clunky and limiting.
But one could make the point that they nurture their opposite negative associations on some level.
I feel that, as long as we accept and use the positive connotations of sight, it makes me more tolerant of negative uses of blindness, within reason. By that, I mean that in certain contexts, it’s important to call out and condemn them. And at other times, we can simply use both them and their positive counterparts as a starting point for raising awareness about the whole bias issue.
A good parallel might be how we now would not tolerate negative statements about women such as being weak like a girl, etc., but we would also frown upon positive association with male attributes, as I sometimes heard my music teachers say things like “play with conviction in a manly way”.
Not sure if this parallel helps.”
Thank you very much, Ioana.
I agree with you. And just speaking personally, at the risk of starting a hornet’s nest, I try and avoid the sight words as well. So I will say “I understand what you’re saying” instead of “I see what you’re saying”, because just speaking personally, I think it is important to be consistent.
And if we’re going to be concerned, (and some of us are and some of us aren’t, that’s well-established on this podcast), but if we are going to be concerned about equating blindness with ignorance, we should also endeavour not to use the word “see” in a way that confuses it or equates it with understanding.
You’re right, it hasn’t really come up much on the show, but it’s something that I have just done for quite a long time now, as my awareness of ableist language has increased. I don’t think people notice, because the terms are quite transferable.
Saying “I understand what you mean” or “I see what you mean”, (some people use one, some people use the other), and I’ve just chosen to try and eliminate the “seeing” equation with “understanding” from my vocabulary.
I also have some discomfort with the term “vision statement”, but I haven’t found another name for that. I need to search.
If anybody has any thoughts on what you would use instead of “vision statement”, I would be interested in that.
Caller: Hi, Jonathan. This is Sharon Strzalkowski in Massachusetts, in the United States.
I’ve been meaning to call in and to say that I really appreciated the demo that you had squeezed in by a listener on Weatherology.
I really like the app. I actually find it a little more intuitive than the actual Apple Weather app, and I was very glad that you were able to put it in. I know it’s tough to do something of that length.
And the other thing I wanted to say was I just purchased a Brailliant 40 from Humanware. I’m just starting to get used to it, and am very excited about it. So I hope that people might write in or call in about that at some point, and just give their thoughts as well.
Have a great day.
Jonathan: Thanks, Sharon.
Chad did a great job of the Weatherology demo, and it is a pretty cool app. I love that narration that they’ve got going there.
And good luck with the Brailliant. It’s a good device.
For our New Zealand audience, David Harvey is writing in about a ride share app called Zoomy, which is a uniquely New Zealand thing.
And David reports:
“Zoomy has been rebranded to Your Ride. It was founded in 2014, a year before Uber, and 3 before Ola.
Your Ride lets you order and register taxis via the app.
However, after updating, there are no improvements to the app, nor is there any Total Mobility integration.”
For those outside New Zealand, Total Mobility is a taxi discount scheme, and it now offers up to 75% off standard taxi prices. So it’s not integrated there.
I remember talking to Zoomy in the very early days. And you’re right, it goes a long way back now, David, trying to encourage them to make the app accessible. They have shown no interest whatsoever.
And it looks like it may have a new name, but it’s still just as disappointing and dodgy to use.
Imke sends an email. And you know what the coolest thing about this email is? It’s that it comes with a pronunciation guide, because I don’t like mispronouncing people’s names and I wasn’t quite sure how to pronounce it. So it’s pronounced Imke.
And it says:
“Following up on the discussion of our favourite white canes, I have been using canes from Revolution Enterprises for more than 2 decades.
What I like about them is that they are lightweight, without being wobbly.
I use the 5-piece folding cane, and my preferred tip has become the rolling tip. For me, this tip lessens the strain on my arm and wrist during the cane motion, does not get stuck in cracks as easily as the regular tip, and still communicates enough information about the surface over which it slides.
I tried a soft marshmallow tip before, and found it did not communicate enough information to me.
You can find these at Independent Living Aids.
Regarding the touch technique versus the slide technique, I learned both during my orientation and mobility instruction during my school years, and was advised to use whichever seemed more appropriate, and that is what I have done ever since. My default seems to be to slide the cane since that is easier on my wrist, and communicates more information about what is in front of me.
However, when it is particularly important to use echolocation based on the sound of my cane, when I need to be able to find objects that are not touching the ground but come down low above the ground, or when the ground is particularly uneven, I use the touch technique.”
Nice to tap into your wisdom there, Imke. [laughs] See what I did there?
And on that note, I am out of here.
Thank you so much for all your contributions. It’s been a very busy week, and I appreciate that.
And as you come up to speed with iOS 17 and potentially a new iPhone, do let us know how you’re getting on.
For now though, remember that when you’re out there with your guide dog, you’ve harnessed success. And with your cane, you’re able.
Voiceover: If you’ve enjoyed this episode of Living Blindfully, please tell your friends and give us a 5 star review. That helps a lot.
If you’d like to submit a comment for possible inclusion in future episodes, be in touch via email. Write it down, or send an audio attachment: email@example.com. Or phone us. The number in the United States is 864-60-Mosen. That’s 864-606-6736.