Transcripts of Mosen at Large are made possible by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at http://PneumaSolutions.com.

You can read the full transcript below, download the transcript in Microsoft Word format, or download the transcript as an accessible PDF file.

[music]

Jonathan Mosen: I’m Jonathan Mosen, and this is Mosen At Large, the show that’s got the blind community talking. On the show this week, plenty of iPhone and iOS feedback. Listener comments on the Envision smart glasses. Braille screen input gets a nifty enhancement, and the Beatles get another remix.

[music: Mosen At Large Podcast]

Hello Connecticut

Welcome to episode 203. 203 in the United States area code system belongs to Connecticut, a nice part of the world. I think the area code covers the bit that’s quite close to New York. To those of you in Connecticut, a special welcome to episode 203. Quite a bit to tell you as ever this week.

My iPhone 14 Pro Max is here

Let me start off by telling you about my iPhone 14 adventure. I’m very pleased to report that my iPhone 14 Pro Max arrived. My window for delivery was between the 6th and the 13th of Rocktober. This was after the debacle of an experience I had when I set the alarm, woke up at midnight, and tried to order my iPhone for release day like I always do.

As with many people, this time it did not go well, but that’s okay. As the old cliche says, “Good things come to those who wait.” I woke up on the Monday morning, the 10th of Rocktober, to find that my American Express card had a notification to say that it had been charged the full amount for the phone. I knew that good things were imminent. Then later that day, I got the official shipping notification from Apple with the tracking number. It said that the expected delivery date was Wednesday the 13th.

This was super convenient for me because on the Tuesday morning, my son Richard, who you will have heard on last week’s episode driving us around to help with the Envision Glasses demo, was graduating with his second degree. I’m rather proud of my boy, Richard, because his first degree is in broadcasting, and his second is in audio engineering. What a chip off the old block, eh? We have a carrier here in New Zealand called 2degrees. I said to him that since he’s the first Mosen banana to get 2 degrees, we should switch him to that provider.

He didn’t seem to think very much of that for a graduation present. Bonnie and I went along to Richard’s graduation on the Tuesday morning, and I had my phone in silent mode, but pushing notifications to my hearing aids. It said that the iPhone had cleared customs in Auckland. I thought, that’s nice, welcome to the country, iPhone. Then it said it was on a plane to Wellington where we are, the capital of New Zealand. I thought, man, this is quick. I wonder if it’s going to get here today. Then it said it had arrived at the facility in Wellington.

Then the moment came when we were in the middle of this graduation ceremony and it said, it’s out for delivery. I thought, oh no, they’re not going to leave this there. They’re going to want me to sign for it and I might not be there. Things were on my side. The ceremony concluded, no delivery had been attempted. I went home, got some lunch, and started the process of backing up my iPhone 12 Pro Max, which I’ve been using for the last couple of years, to iTunes, because I always restore from an encrypted iTunes backup. While that backup was taking place, the phone arrived.

Brilliant timing. I was able to get it out of the box and set it up. Of course, I did have to set it up as a new phone to some extent because the phone was shipping with an older version of iOS than the one on my 12 Pro Max; I’ve been running the 16.1 beta on the 12 Pro Max. If you try and restore that backup to an older version, it will not let you do that. I’d already thought of this, and I had the little profile in my iCloud Drive that you have to install in order to get the betas. I installed that using the Files app on the iPhone 14 Pro Max while the backup was still doing its thing on the old phone, and we were ready to go.

I was able to check for updates and find the latest version of the beta there. While it was set up as a new phone, I did do some experiments with this problem I have had relating to VoiceOver being very quiet on a call. I’ll talk about that a bit later when we have some more listener feedback about VoiceOver being quiet on a call. I thought I’m on a roll here. I’m really efficient, I had the profile, I’m doing the update, this is brilliant. It was all to come to a screeching halt, I tell you, because when I had installed the iOS 16.1 beta and the phone rebooted, I could hear that VoiceOver was on, but speech was gone.

There was no speech whatsoever. I could flick around the screen and I could hear the VoiceOver sounds. I could tell it was running, but could I get any response from the phone that was helpful? Not a sausage. Not a sausage. I couldn’t use Braille either because I hadn’t done the pairings yet for the Braille display. The Mantis wasn’t working. I tried all the things. I tried toggling VoiceOver off and back on again. I tried rebooting the phone, which you can do by pressing volume up, then volume down, and then holding the side button down, and that forces a warm reset of the phone.

I did that several times. At least the iPhone did play its new magical shutdown and startup sound, so that was good, but I was stuck. I knew that the only way to get unstuck in this situation was to have sighted assistance in some form, whether it be through a camera or otherwise. It so happens that I’m quite blessed to be surrounded by vultures. There’s this very elaborate system where the children, where the bananas, the Mosen bananas– Why do I call them the bananas? Because there’s four of them. One banana, two banana, three banana, four.

Anyway, they have this elaborate system figuring out who gets dad’s iPhone, and for that matter Bonnie’s iPhone, this year. Believe me, they keep score. It was determined that it’s David who’s due for the iPhone 12 Pro Max. It just so happened that David was in Wellington for his brother’s graduation, and he was only too keen to take the old phone off my hands. I gave him a call again, using Bonnie’s iPhone this time, because I had put the SIM for my account into the new phone that I couldn’t set up. I said to him, “If you want your phone, you better come over and help me out of this bind.”

He was over like a flash, he was, and he was able to get me through the update process that was remaining after the restore from the backup. Since then, it hasn’t been a problem. That was a really difficult experience. I kept thinking if I didn’t have all of that family support, if I didn’t have somebody on hand, it would’ve been really difficult. You may have been able to use a service like Aira to try and enter the PIN, but that would be quite difficult because the numbers on that keypad are actually quite small. It was not a good experience, and we’ll hear more about a similar experience later in this episode.

The next thing I want to comment on is the fact that it’s been two years since I had a new phone, and that’s a long time for me. The last time I went without upgrading to the latest model was when I skipped the 5 and went from the 4S to the 5S. This time I’ve gone from the 12 Pro Max to the 14. The thing you really notice is how many apps that you’re still happily using have been withdrawn from the store. This is actually a disincentive to get a new iPhone, because if there’s an app that still works for you and you’re still using it, it’s very difficult to keep that app around when you change your phone.

The apps come down from the cloud when you install them; you can’t store the apps in iTunes anymore. Then when the restore is complete, you look through your apps and you see one that says “In iCloud”, and that’s the time that your heart should sink, because you’ll double-tap and it will say, “This is no longer available in the App Store. Do you want to delete the app?” Really, in most circumstances, you’ve got no alternative but to do just that, to delete the app. It’s pretty dodgy really because we’ve paid for these apps. I’m fully accepting of the fact that sometimes developers move on.

They take the app out of the store, they go onto other gigs. I get that, but I’ve still paid for the app. If it’s working on my phone, if there’s not some change that’s occurred to the operating system and I can still use the app, I should still be allowed to use the app. Apple has taken away the ability to do that by removing the ability to back up apps to iTunes. It turns out that there is a way around this, and it’s an app called iMazing. It’s like the word “amazing” but with an I at the beginning. You can find it at imazing.com. This app, last I looked at it, and it’s been quite some time since I looked at it, had some accessibility challenges.

I’m not saying that we can’t use it, but I am saying that it’s a bit of a fiddly app. It does allow you to keep backup copies of apps and do all sorts of other quite cool, deep things with your iPhone. If you’ve used iMazing lately, I’d be interested to know how you find it, how you’re getting on with it. Have you tried to talk to the iMazing people about just maybe improving accessibility a little bit? Apple really should bring this back. It shouldn’t be this hard.

The other thing that the ability to have copies of apps on your PC, or your Mac for that matter, is useful for in an accessibility context is this: sometimes you get a scary announcement when you look through app updates. It says something like, “We’ve completely overhauled the app with a new look and feel. The app has been rewritten from the ground up,” and you think, “Oh no.” Installing the app becomes a leap of faith. Are you going to lose access to something that you really care about, something that’s important to you?

In the old days, you could just back up the old version of the app to iTunes and restore the old version of the app if you ran into trouble. You can’t do that with iTunes anymore, but you can do it with this iMazing app. I still have my updates set to manual, and I go through and look at what’s new from the apps that I have on my phone. When I first got the iPhone, it was like Christmas every day because you’d be seeing all these major new features, and they would describe what they’d changed or what they’d fixed. These days, so often it’s just the same blurb, bug fixes and performance improvements, but they don’t tell you what’s been fixed, and they don’t tell you what’s improved.

That’s really lazy and it annoys me. Rant, rant. We really should have a way of being able to keep the apps that we’ve paid for. I fully accept that when they’re out of the store, we’re living on borrowed time. We can’t expect Apple to keep iOS in a time warp. One day the app might not work, but as long as it does work, we should be able to install it on our new devices. Now, the one thing that is really cool about the iPhone 14 Pro Max is the battery life. Seriously, on the first day that I was using it normally, after all the restores and things (that’s not a very good barometer of battery life when the phone’s busy becoming your phone), I had a reasonably busy day on the phone.

Most of the time I’m on Teams, a little bit of Zoom. On this particular day, it worked out that I was probably on the phone, the good old-fashioned phone, for about an hour and 45 minutes to perhaps two hours. I got up at my usual time of [5:00] AM. I was playing my workout playlist on the treadmill. I did a little bit of stuff with camera-related apps during the day to identify things. I heard a couple of podcasts while I was eating lunch. I was really fairly busy with the phone, and of course notifications were coming in. The way that I use my phone is I don’t have the screen lock.

I put it on my desk, I have it unlocked. Brightness is set to zero, screen curtain is on, but that way I just hear the notifications coming through in my mixer. By the time I had been using it for about 12 and a bit hours, fairly heavily, the battery was still on 65%. I’m now actually finding that I don’t charge the phone every night anymore because I want to make sure this battery lasts, and we know hopefully by now that the best way to keep your battery conditioned is to keep it in that sweet spot between 20% and 80%. Sometimes I just put it on my wireless charger down here when it needs a charge rather than charging it overnight.

Absolutely phenomenal battery life with this thing. Apple’s done a really good job. It’s too early to say too much about the camera, but I will make a couple of observations about the camera a little bit later in the context of when we talk about your feedback on the Envision Glasses demo. I haven’t noticed a significant speed increase. I was really hoping that the problems I’ve been having with Overcast where it is just so sluggish to use VoiceOver whenever you’re playing something on Overcast would’ve been addressed with a faster phone. Unfortunately, it has not been. VoiceOver for me is just incredibly sluggish whenever Overcast is playing.

I did have a go for a short time at setting the phone up as new, just to see what that would be like, and again, to rule that out as a possible way around the problem of intermittently low VoiceOver volume on calls. Sometimes it works great, at other times it does not. I was reminded just how difficult it is to get your own content onto the phone in Apple’s apps. This is one of the best advertisements for Android that you could possibly want. Why can we not have a situation where certain folders are exposed to File Explorer on Windows so that you can just copy your own music and your own ringtones, and all those things, directly onto the phone?

It is really difficult to just do something simple like get your own ringtones into the phone. iTunes is an absolute bear. It is a horrible app, and it’s overkill when you just want to copy your own media. Fortunately, there is a third party app for this as well. We’ve talked about this on the show before. It is an app that installs for both Mac and Windows. You don’t put it on your phone, you put it on the computer that has the music you want to copy. It’s called WALTR PRO, and that does get around it. It’s a very nice app, and it’s not too bad to use from a screen reader user’s point of view.

I remember when I offered my iPhone something-or-other from some years ago to Richard. The moment he worked out that he couldn’t just copy his music onto the phone the way that he can with Android devices, he wasn’t interested. I completely understand that. When I had the iPhone set up as new and I was checking out this low VoiceOver volume on a call bug, I did have a chance to play with the dynamic island, which I had to locate by touch on the status bar. It’s in the center of the status bar. When you’re playing music or doing certain things, the dynamic island is there.

You can use the rotor on it, you can double-tap and hold on it, or triple-tap on it and do things. Then I installed the iOS 16.1 beta, and I was voted off the island. I was voted off the dynamic island. It became completely inaccessible, but as the Little River Band likes to sing, “Hang on, help is on its way,” because the iOS 16.1 release candidate is out, and it is going to be released very shortly. The dynamic island is now fully accessible, not just by touch again, but also you can flick through the status bar and find the dynamic island right there.

This is very exciting because with iOS 16.1 comes access to live activities. We talked about live activities on our various previews of iOS 16. The idea is that certain apps that you might want to update all the time with information can appear in the dynamic island so that you can look at content that these apps are pushing, and not have to go into the specific app. For example, I haven’t seen it yet at the time that I’m recording this, but I’m hoping that Uber Eats and DoorDash, and Uber itself in fact will all be pushing content to the dynamic island so that you’ll be able to tell where your delivery is or where your vehicle is.

Now the dynamic island is exclusive to the iPhone 14 Pro and 14 Pro Max right now. If you don’t have the dynamic island, these live activities will update on the lock screen, so everybody will be able to make use of live activities. It was good to get the 16.1 release candidate installed and to get voted back on the island. It’s a great device. For me, the battery life alone is worth the upgrade.

You do get some battery degradation over time. My battery was at 86% of its full capacity after two years of use. The battery on the iPhone 14 Pro Max is better anyway, so the combination of those two things, I’m really noticing the difference, and it’s a very pleasant thing. I’ll continue to play with the camera and use that in various conditions, but so far so good with the iPhone 14 Pro Max after a bit of a dodgy start.

Announcer: Be the first to know what’s coming in the next episode of Mosen At Large. Opt into the Mosen media list and receive a brief email on what’s coming so you can get your contribution in ahead of the show. You can stop receiving emails anytime. To join, send a blank email to media-subscribe@mosen.org. That’s media-subscribe@M-O-S-E-N.org. Stay in the know with Mosen At Large.

Jonathan: We bring you transcripts of every episode of Mosen At Large, and that’s possible thanks to sponsorship from Pneuma Solutions. One of the cool things about the internet is that it connects us with the wider world. Another cool thing about the internet is that it can create places just for us. Mosen At Large is one such place. Another one is Sero. Sero, spelt S-E-R-O, is a social network designed by us for us. Sero is available everywhere. It’s on your smartphone, your Apple TV, your Amazon Echo, and of course on a fully accessible website.

If you download the Sero mobile app from wherever you get your apps for your mobile device, you’ll be able to sample some of the content free. That includes this podcast and Mushroom FM. Paying a subscription to Sero gives you access to a treasure trove of information including newspapers, forums where blind and low vision people can discuss a wide range of issues, a handy accessible email client, and so much more. You have to check out all the features. You’ll be amazed at how much is there. Go to pneumasolutions.com. That’s P-N-E-U-M-A solutions.com. Access the products link and then choose Sero for more information.

[music: Mosen At Large Podcast]

The Beatles Revolver album is being remixed. Don’t miss the Mushroom FM special

If you’ve listened to my radio shows over the years, you will know that I’m a major Beatles buff, a serious Beatles collector. In the United Kingdom, they would call me a Beatles anorak, which is an extraordinary expression, but anyway. I am very excited about the fact that the Beatles Revolver album is the latest album to get the Giles Martin remix treatment. This started really in earnest with the 1967 Sgt. Pepper’s Lonely Hearts Club Band being remixed for its 50th anniversary in 2017. There have been some other Beatles remixes earlier, including another version of the 1 album.

There was a really interesting mix of the Yellow Submarine album quite a long time ago; I think that came out in the early 2000s. This album project started with Sgt. Pepper. They skipped Magical Mystery Tour for some reason. They’ve done the White Album. They’ve done Abbey Road and Let It Be. Now they’ve gone back to Revolver. What’s interesting about the fact that they’ve gone back to Revolver is that Revolver was recorded on a single four-track machine. The stereo mixes of Revolver are pretty primitive. My preferred mix of Revolver up until now has actually been the mono version.

There have been some challenges around remixing this album because you’ve got so many instruments on the one track, because they only had four tracks to play with. It’s amazing to me to think that I’m using more tracks to put this podcast together than the Beatles used to make the whole Revolver album. Extraordinary. During the Get Back project, which New Zealand’s Sir Peter Jackson, not too far down the road from me actually, put together, his studios developed some incredible technology. I’ve actually got recordings of those Let It Be tapes.

When I listened to them, it was often difficult to hear what the Beatles were going on about because they knew they were being recorded. They knew that the cameras were constantly rolling, filming them. If they wanted to have a conversation amongst themselves that they didn’t want captured for history, they would just play nonsense on their instruments to mask their voices. Some of us who are into audio are familiar with some of this technology like the dialogue isolate feature that they have on iZotope, and various things like that.

Peter Jackson’s film company developed some really slick technology, where not only could they separate the vocals from the instruments, but they could actually separate individual instruments. When you can do that and it doesn’t sound warbly and horrible, then you have the potential to do some very seriously cool remixing. You heard that happening in earnest in the Get Back documentary. It’s wonderful. As a result of that, Giles Martin, with the help of this technology, went back to the Revolver album and has done a remix. Based on the Taxman track that I’ve heard, which they’ve released as a teaser, it’s not only a beautiful stereo remix, it’s also a Dolby Atmos mix.

If you’ve got a system capable of Dolby Atmos, you are in for a sonic treat. Now, here’s the thing. Here’s the deal, as President Biden likes to say. Friday, the 28th of Rocktober, is when Revolver is going to be released. We are very fortunate in New Zealand that Friday comes here before it comes to pretty much anywhere else. That means that in the early hours of Friday morning, when it’s still Thursday for most people, I will have the Revolver album. As I have done with all of these releases, what I’m going to do is jump on Mushroom FM, the home of the fun guys, and let you hear the shiny, pristine new version of The Beatles Revolver album.

I have the deluxe version on order, which means there will be plenty of outtakes. It’s interesting that in Beatles bootleg land, there haven’t been a lot of outtakes from Revolver that have made it out, so we are in for some real treats there. You’ll also hear all of the actual remixed tracks on the album. More than that, I’ll explain the history behind some of the songs. What inspired the Beatles to write these tracks? What musical influences were at play? If you’re interested in the backstory and the new mixes before hardly anyone else has heard them, then you’ll want to join us for this Revolver special. It happens on Friday morning at [5:00] AM New Zealand time.

Now, that equates to Thursday the 27th of Rocktober at midday US Eastern Time, [9:00] AM Pacific, and that will be [5:00] PM in the UK. If I haven’t covered your particular timezone, panic not, because if you go to the Mushroom FM schedule page, the schedule is displayed in your own time zone. To do that, head on over to mushroomfm.com/schedule and you’ll see it there. Mushroomfm.com/schedule. I hope you’ll join me for a really special experience, The Beatles Revolver remixed in stereo. Of course, we won’t be playing the Atmos mix because we don’t have the technology to send you an Atmos mix, but that’s something that you can also look forward to.

[music: Mosen At Large Podcast]

Comments on the Envision Smart Glasses demo

Some comments on our review of the Envision smart glasses from Episode 202. We begin in Edmonton in Canada with Wayne, who says, “Thank you so very much for your complete demo of the Envision smart glasses. All other demos do not have it set up for anyone to hear what is going on but the demonstrator. The other mistakes are running speech rate at 100% and volume level at 10%. Admittedly, I do have a hearing impairment, and I am unfortunately slowly losing more hearing. I wish Envision glasses had things like LiDAR, autofocus, the ability to take pictures and video, door detection, describe what’s around you on the fly, etc.

What can your iPhone do that Envision glasses cannot? Does the LiDAR make a difference?” I’ll come back to some other questions from Wayne in just a second. To some extent, I feel like the LiDAR is very much a proof of concept in terms of the blindness use cases that Apple is putting it to. I think they’re looking to the long term when there will be some sort of wearable form factor that a blind person can use. It will be exciting to see where Apple goes in this space. That said, there are some real cool practical benefits of the LiDAR’s people detection feature.

If you are on a bus, for example, and you get good at using it, and you don’t mind holding your phone out in front of you when you board the bus, so there are a few ifs there, their people detector is quite good for finding a spare seat, for example. The door detection feature is nice. Sometimes it gives you useful information, but there are times when I feel like it hasn’t told me all that I really need to know. It can be a little bit hit-and-miss. That might be where you’re holding the phone in terms of the level at which you’re holding it. Maybe I would get a bit better with practice.

Obviously, you’ve got some very seriously cool camera technology in the new iPhones. If you’re talking iPhones with LiDAR, then we’re talking about the 12 Pro upwards. My amateur suspicion and opinion is that the iPhone is likely to work better in a wider range of light conditions, including substandard light conditions. In fact, I have had some pretty impressive results with Seeing AI and the iPhone 14 Pro Max pretty much in the pitch-black dark. I’ve used it in the middle of the night just to see how it works. I have actually been able to get some good results, whereas with some other apps, I have not.

It could be that Microsoft’s a bit ahead of the curve in terms of taking advantage of the low light features that are available in the new iPhone in particular, and perhaps in iPhones generally. As I mentioned in the review last week, for fine detail stuff, it does seem that the iPhone has an advantage. For example, you remember that when we were using the Call an Ally feature and I was talking with Heidi, she could see that I had Reaper on the screen, but she wouldn’t have been able to tell me specifically what was on the screen to give me guidance about what it was doing if, for example, JAWS wasn’t talking.

I guess that is the critical question, isn’t it? If you’ve got a smartphone, be it iPhone or Android, what value are these going to add for you? As I said in my conclusion way at the end of that episode, for me, it’s that it’s hands-free. That’s the key thing. There are so many things that are difficult for a blind person to do when you’ve got a white cane in one hand and you’re using that other precious hand to hold another bag or something else, trying to juggle all of that with opening doors. The hands-free thing is a huge bonus.

In terms of one specific thing that the Envision Glasses can do well that the iPhone struggles to do, I would have to say it would come down to that bit where I was driving around with Richard and reading the signage out the window. That to me is pretty impressive. I’ve not had that luck with the iPhone. Wayne also says, “What hearing aids do you use, and how do you connect to Bluetooth devices?” I’ve got the Oticon Opn S 1 hearing aids. They’re a bit old now. I think Oticon has much better options out there. I’ve had these for around about three years.

I really like them. I liked the Oticon philosophy which is that rather than trying to filter out noise, noise gives you environmental clues. They do this thing where they accentuate the right frequencies so you can hear what you need to hear. It’s a different philosophy from some of the other manufacturers. Hearing is a very personal thing, not only because people’s hearing impairments differ, but also because people’s preferences just differ as well. The Oticon work well for me for now. The hearing aid industry is changing and improving all the time.

“I really admire people like you and Bonnie,” says Wayne, “Thank you for everything you do for the blind community.” That is really kind of you. Thank you, Wayne. Just quietly, I hope she doesn’t hear this. I rather admire Bonnie myself. We’re going closer to home, at least for me, for this contribution because we’re going up to the Waikato in New Zealand for this comment from Dean Charlton. He says, “What a fantastic demonstration of the Envision Glasses you gave, Jonathan. Why do you need an account for the connecting of the glasses when it should be as simple as opening the tab on the iPhone to connect the glasses?”

There are a couple of reasons for this, Dean. One is that the account comes associated with a library of documents that you can store. Let’s say that you are reading some mail or a book, or something like that and you want to keep that as a reference. You can save it in your Envision library. That means that any device that you log into your Envision account will have access to all those documents. The glasses could be just one of several devices that you have logged into your Envision account. You could have, I guess in an extreme situation, a personal iPhone and an iPhone that your employer has given you for work.

You could have an iPad logged in there and they can all access those documents that your glasses have scanned and saved to the Envision library. Another reason too, and we’ll see more of this I’m sure as the product develops, is that your Envision account can connect to other accounts. For example, Aira. You connect the two accounts together. Dean says, “I was amazed with the sign reading when you were being driven around by your Richard. I would’ve thought it wouldn’t be able to read the whole sign as it would get a fairly short glimpse of it due to the driving speed.

I think object recognition is getting better, but still a long way to go. Same goes with color recognition. The currency was very snappy and gave good clear information. Sadly this product is only for people who lead exciting and action-packed lives such as you do, and some people I know. It certainly would be cool though being hands-free with the glasses, prowling a mall, and have it read shop signage. In my case I will have the iPhone 14 Pro to try this with image detection mode on.”

Thanks, Dean. In terms of getting the maximum value from your phone, remember that you can read signage with other apps as well. Envision has its own app that is free, and you can download that from the app store, but there are also others in this space including Microsoft Seeing AI. Another one that I don’t hear a lot about is called Supersense AI. They’re doing some very interesting stuff, and I’ve had some quite good results with Supersense. They don’t seem to have done an update in a wee while though so I’m not sure about the status of that project.

I hope that it’s still alive and active, and that we’ll continue to see developments from them. It might be interesting to do a comparison at some point. I would have to think very carefully about the methodology so we can try and be fair. There are a number of these apps out there, and it would be cool to put them through a consistent test; that would make a fun episode. It is an interesting topic of conversation. If people want to comment on these apps, Seeing AI, Envision, Supersense, and some of the other ones that are around, let me know which ones you find work best and why.

Of course, all those ones have something in common: they’re Swiss Army knives on the iPhone. They do a little bit of everything: instant text, scanning stuff, maybe color detection, et cetera. There are also scanning-specific apps. One that I find is very good actually is the Voice Dream Scanner app that’s part of the Voice Dream Suite. I do have a small correction to make from last week’s review. I mentioned in that review that the glasses have a light detector, and we demonstrated the two modes of the light detector. Envision AI’s app does not include a light detector, and I said that it did.

I got it confused with Seeing AI which does have a light detector channel. In the case of Envision for now, they’ve kept the light detector for the glasses. My apologies for that. Christian Bertling writes, “I have an efficiency hack for the Envision glasses and personal hotspot on the iPhone. You can create an automation in Siri Shortcuts. You can create an automation that turns it on when you leave the house. Another automation that turns it off when you get to the office. Another automation that turns it back on again when you leave the office. Finally, a fourth automation that turns it off when you get back home.”

Genius. Good idea, Christian. Siri Shortcuts are amazing and you can really do some super duper things with them. Kay says, “That was a great demo of the Envision glasses. I’d be afraid of getting stuck like the people with note takers, barcode readers and Horizon Glasses that are no longer supported. I think of those who loved their Optacons, they were life changing. I know those who prefer them to OCR that is currently available.”

Thanks, Kay. It’s a lot of dosh, and I understand your reticence because it’s a proprietary system. The glasses are the Google Glass Enterprise edition 2, so they’re glasses that are out there. In terms of the way they are talking, the operating system, the software, that’s all done by Envision. I understand what you’re saying. We are taking a leap of faith in the sense that we hope that Envision is well-funded, that they’re doing okay, that the investors are not going to pull the plug, that this is technology that will be around for a while.

Roy Nash: Hello once again, Jonathan. This is Roy Nash from Little Rock, Arkansas. I want to tell you how much I appreciated the demo of the Envision Glasses that you did on your last podcast. You did your usual thorough outstanding job. The documentation on these glasses is absolutely superb, as you pointed out. I got my glasses about two months ago. I got them set up just simply using their documentation. I downloaded the Ally app on my wife’s iPhone and installed her as my Ally. The day after I got this done, she had an accident and fell and broke her hip, and had surgery.

For the immediate time she was not able to go downstairs to do the washing. My wife and I have always had this routine where she would carry the washing downstairs and put it in the machine, and then I would take it out of the machine, put it in the dryer, carry them upstairs, and she would sort the clothes. All of a sudden, it occurred to me that I didn’t know how to use the washing machine. I called my Ally on my Envision Glasses and with her help, I was able to learn how to use the washer. I found that an immediate use for the Envision glasses.

I agree with you that it would be an improvement if they could combine some of the functions, particularly the instant text function with the explorer function. I have enjoyed driving down the road as you did, and reading the instant text, I found it very informative. I wore my glasses from the very beginning just as you said you did. I put them on in the morning and leave them on. I’m surprised at how much I’ve been able to do that I couldn’t do, how much information that I’m getting from the glasses. I’m looking forward to future updates. Once again, thank you for your demo of these glasses.

Jonathan: Thank you very much, Roy. I’m glad you enjoyed it. I’m glad you’re enjoying your glasses. It’s always good when your wife is your Ally. I hope that she is fully recovered soon and doing all right.

Matthew Chao: Hi, Jonathan. My name is Matthew Chao. I’ve been listening to your podcast off and on for a while now. Just wanted to say, great presentation on the Envision Glasses. Wish I could afford them. I’m retired, so [chuckles] I really can’t. A comment or two on those. I would like to see them obviously come down in price, and I’d like to see more attention given to the exploring part, where I agree with you that it should be able to identify both things and signage. Also, I’d like to see something done in terms of making them more rugged in terms of standing up to inclement weather and that sort of thing.

The other comment I have is I bought an Apple Watch Ultra a few weeks ago. My initial observations are that I really like the sound quality. It’s much clearer than the smaller Apple Watch series, and it’s a nice watch. However, when I’ve gone swimming with it, it takes a long time to get the water out of the watch, possibly because of its size. It sounds muffled when you first come out of the water even after ejecting the water from the watch.

Jonathan: Thanks for that report, Matthew. I have heard good things from several people now about the Apple Watch Ultra. I continue to convince myself I do not need one. They do sound really nice though.

Scott Davert: Hey, Jonathan. Scott Davert checking in. I just had a few comments about the Envision AI glasses. The first one of course is that you did a fantastic evaluation/demonstration of the various features and functions, and how they work in your life. I found it very, very intriguing. Thank you very much for doing that. I really liked that you were able to take the Bluetooth transmitter connected to your mixer and get a really nice, crisp, clean recording of the TTS on the glasses. I’d be curious, if you don’t mind sharing, what Bluetooth transmitter you were using to make it be stable.

I have done this with the Apple Watch in the past for various demonstration purposes but the one I was using was on some kind of a voice activation mode or something. Every time you activated the watch and made VoiceOver speak, it would miss the first syllable if you had waited more than 10 or 15 seconds to interact with it. If you have something that is better and wouldn’t mind sharing the information, I’d appreciate it. The other comment I had is related to access. You probably already know where I’m going with this, and probably a lot of the listeners do as well.

We need to develop Braille access for this. The technology is at that point where people can really benefit from the different services this offers. As someone who only uses Braille on my iPhone unless I have absolutely no choice, I’m cut off from it. In the marketing they say, “You can use it with a Braille display. You can take a picture of a document and send it to your phone whether it’s Android or iOS, and then you can read it that way.” Which is true, but if I can’t take the picture in the first place, how am I going to get it over to my phone? Okay, that’s partial Braille access, but I can’t actually use the applications that are on the glasses in the way they were intended to be used, as a Braille user.

I’m really hoping that they will develop some sort of Braille interface for this so that you can interact with it and read the information in Braille, which by the way, as a deaf-blind user, would eliminate one of the pieces of the puzzle. If you think about it, when you’re trying to take pictures and things like that, when you can’t hear the speech and you can’t hear the audio feedback, you’re already using two hands. You have your Braille display and you have your phone, or if you choose to go the route of say, for example, manipulating the document instead of the phone, either way, both your hands are totally in use.

If you had access to these head-worn devices like the Envision AI glasses, then you would still have a hand free to do whatever you needed to do because you would have the glasses on your head, and you could read with one hand on the Braille display what you’re getting in terms of information. I really hope that they will look very seriously at adding Braille support very soon. I think that this technology is really at a point where it’s not yet affordable for many, but that it’s becoming something that people can take great benefit from, and deaf-blind people in particular need access to all the information they can get.

I’m hoping that the deaf-blind won’t be left behind in terms of this technology. Apple, for their part with door detection, it works with Braille. Again though, you have to either find a way to secure your phone so that it’s not in an inappropriate place to take pictures, or you have to carry both devices. It’s something that would be great in the form of glasses. In fact, I think that was part of my comment before. One other thing, Aira has text calling.

Hopefully, the Ally app will also get that feature at some point, again, so that deaf-blind people can also take advantage of this technology. I think we’re really at a spot where we’re moving forward. Now, we’re not quite there yet, I think it’ll take another few years, but I would say that those glasses are just one of the many examples I can think of where technology is moving forward, and doing so in such a way that it can really benefit people. Hopefully, those are things the developers will look at.

Jonathan: An excellent contribution with critical suggestions there. Thank you, Scott. I hope that Envision will take that on board. Braille support in these glasses, particularly for deaf-blind people, would be quite a game changer.

[music]

Announcer: What’s on your mind? Send an email with a recording of your voice or just write it down. Jonathan@mushroomfm.com. That’s J-O-N-A-T-H-A-N@mushroomfm.com or phone our listener line. The number in the United States is 864-60-Mosen. That’s 864-606-6736.

[music]

I completely lost speech on my iPhone as well

Jonathan: Let’s talk more matters iOS 16. Brian Gaff in the UK starts us off. “Regarding the weird bug reported of either no sound on power up, or stuck in headphone audio, yes, I have had both of these recently, the latter one in the later versions of iOS 15, and the first one in the first two iterations of iOS 16. Indeed, when 16.0.2 came out a short time ago, it specifically mentioned the lack of VoiceOver after the startup. I encountered this the night before your show, and had to wait until Monday to get somebody to put my PIN in, which is what it wanted, but I could not tell that.

Seems a bit daft that you have to trust somebody else to put that in because the operating system locks you out. Irony,” says Brian. “I did ring the disability helpline and we went through all possible ways to reboot, except, of course, the complete reset to factory defaults. He then advised me that I would have to get sighted assistance but did not mention that the new update fixed it. Looking around the web, I’m led to believe from what I read that this might be due to the code used to make the sound on the later versions of the iPhone not handling the old system very well, and the sound things never get activated for VoiceOver till after the PIN prompt.

It makes me wonder if it’s the audio channel used for VoiceOver that the new chip uses for the startup sound. Anyway, apart from having to turn up the ringtone and notification volume, and turn the screen curtain back on, it seems to be behaving. I have just downloaded 16.0.3 as I’ve been having weirdness with email. Time will tell if that is sorted too. At the moment, it does seem to be a bit more power-hungry. My battery condition on my XR is still 91%, so not too bad, but it seems to get low a bit faster than before.”

Thanks, Brian. It’s a bit of a worry, this business with VoiceOver not speaking. I can now talk about that from personal experience. Yes, often when you do the upgrade to a major new version of iOS, there does seem to be a temporary battery life hit. Then for many, it comes right, but repeatedly year on year, and I’ve been doing this long enough now to see the patterns, some people do seem to experience quite significant battery hits when they upgrade. Sometimes they settle down with subsequent little double dot updates, but at other times, the only way around it is to do a complete restore to get your battery life back.

iOS bugs are making me contemplate Android

Dennis Long: Hey, Jonathan, it’s Dennis Long. One very frustrating thing, and I noticed it back in iOS 15, and it’s reared its ugly head and actually gotten worse in 16, is VoiceOver seems to have a focus issue. One very prominent place for that focus issue is if you open the Phone app and you go into recent calls, and you double-tap on an entry, either with a Bluetooth keyboard or using the touchscreen. Let’s just hypothetically say you called me and I go to call you back, but it calls John Smith 10 entries down or 5 entries down in the call log. That’s a problem.

Android’s done a really nice job of starting to catch up. They’ve made some missteps along the way, not introducing HID, I’ll agree with those people, but they’ve done a nice job. They’ve built the Braille in. That’s a good step. They put the Braille keyboard in last year or the year before. They also made it where you can check and uncheck what’s in the menu, the accessibility menu. They’re taking some steps in the right direction. If they can get it together and keep making strides and keep improving, which I hope they do, they could be a really viable alternative, and should not be ruled out if Apple can’t get its accessibility issues straight.

You cannot say we at Apple care about accessibility and then let major bugs go where you can’t do core things with the phone. You can’t make a phone call without using Siri. You can’t use Apple Music, because people have told me, I’ve had friends complain, that it skips on Apple Music as well. This skipping issue needs to stop. They need to dedicate whatever resources they need to fix this issue. Now, it doesn’t happen every time but I don’t care. When it’s still there from iOS 15, you haven’t dedicated enough resources. They need to get it together and prove that they care about accessibility.

Jonathan: Thanks, Dennis. It is that time of year again, isn’t it? I’m not dismissive of what you’re saying at all. It’s just that time of year again where we’re right after a major update to iOS and we go through these cycles where there is some frustration. You are right of course. I’ve talked on this podcast and elsewhere about the concept of equivalency before. If a sighted person was being afflicted with this problem, would it be fixed in a heartbeat? Clearly, if you tap on a contact as a sighted person and it calls the wrong person, that can be fundamentally embarrassing, and it would be one of those showstopping bugs that would be fixed.

Now that said, I can’t duplicate that one, Dennis. This is one of the things that can make these things difficult to reproduce for the accessibility team. For example, that magic tap one that I cited a couple of episodes ago in the third, was it, beta of iOS 16.1, although they did track that one down nice and quickly, and I’m very grateful for that. There’s this whole business of low audio for some people on a call that doesn’t affect others. These things are complex. That’s not an excuse because Apple is a trillion-dollar company, and they could put more resources into accessibility if they wanted.

The old saying about the other person’s grass always being greener comes to mind. Google has its bugs too. Yes, they are coming along nicely. The trouble we’ve got is that every year, Apple comes up with something really substantive in the accessibility area that just locks in their lead. Google’s doing stuff that for the most part Apple has done some time ago, and then Apple comes up with something new. Now, you may well argue that’s all fine and dandy. If we’ve got all these new bells and whistles, what’s the point if we can’t perform the basic functionality properly?

I agree with that. This thing about the phone call volume has been a frustration for me for a long time: if I’m on hold with someone, it’s really difficult at times to do other things with the phone while you’re on hold. Now of course, if you’ve got a Google Pixel, if you’re in one of the few countries where Google’s actually selling the Google Pixel, you could do that thing where they go on hold for you, and they call you back or alert you when the call’s actually been picked up. That would be nice.

Then of course we have the issue where we get all excited about the Braille being finally built into TalkBack, and that’s a wonderful initiative, only to find that if you have a HID Braille display, you are out of luck, and you can’t use your Braille display with a Google device running Android. I’m not excusing Apple, but what I am saying is there are problems with either mainstream company. We’ve just got to hopefully constructively put the pressure on. I hope that when we discuss the bugs that people are experiencing, we are comparing notes so we can hopefully file quality bugs with a view to them being sorted out.

You are right, and we’ve talked about this before. All the bug filing in the world is not going to help if there’s a bottleneck because accessibility at Apple isn’t being appropriately resourced to remedy those bugs. For now, my personal view, particularly since I have a HID Braille display, is that I’m hanging in there with the Apple ecosystem. I think it’s slick. I think it continues to evolve in exciting ways. I hope that we can constructively provide feedback that will make it better. I’m not personally at the point yet where I think there’s something significant on the other side, as it were, that would cause me to forego my considerable investment in apps over the years.

Where we all want to be, I hope, is that we have the same degree of choice as sighted people do. Because some people choose Android, they just prefer it for various reasons. Some people prefer the openness, the way you can geek out with Android a lot more. Some people prefer the iOS experience. It is a shame when various accessibility considerations taint the picture or the choice. Hopefully we will get there.

Alyssa: Hi, Jonathan. I just started watching, sorry, not watching, but listening to your podcast two or three days ago. I’ve heard a lot about iOS 16. I’m wondering, do you think you can do a review or a comparison between the new iPhone 14 Pro and the Android Galaxy Z Flip? I’m thinking about switching because of all the accessibility issues that you were discussing this week, and just because there are so many issues coming up with Apple’s accessibility, and I feel like Apple’s just focusing more on the camera. Is there any way you can try to do a comparison between Android’s TalkBack and, of course, the Apple iPhone 14 Pro? Thanks a lot. You’re the best. Great podcast.

Jonathan: That is Alyssa with that contribution. Thank you, Alyssa. Good to hear from you, and it’s nice that you’ve discovered the podcast. If you go back in the archives, so if you’ve got a podcast client that you’re listening through, you can go through the episodes, they’re all there, you will find quite a bit of Android a little bit earlier on, because I did get a Samsung Galaxy S21, and I was playing with that for a while. We talked about that. We had Ed Green come on from the Blind Android Users podcast, and we talked about Android. Also, Nick Zammarelli did a review of the Flip.

We do have quite a bit in the archives, not so much a comparison, because it’s quite difficult to do that really, but we do talk about these devices earlier on in Mosen At Large. Fossick, as we say, fossick through the archives and you will find quite a bit of Android-related material earlier on in the show.

Digging deep into the problem of VoiceOver being quiet on calls

Balwant, I hope I haven’t mispronounced your name too badly. Thank you for writing in. He says, “Hi, Jonathan. I have been listening to your podcast for a number of years now and find it informative, and I enjoy listening.”

Thank you so much. I appreciate that. “I am emailing you for the first time though regarding the volume ducking as I was experiencing the same problem on my iPhone SE 2020. Below is the fix which has worked for me, supplied by my computer and phone expert, Brian Negus, who advises our local group of visually impaired people. Step one: go to your home screen. Step two: turn the rotor to volume. If volume isn’t present on your rotor, open Settings and go to Settings/Accessibility/VoiceOver/Rotor, and swipe through the options until you reach volume, and double-tap to select it.

Now, return to your home screen and turn the rotor to volume. Step three: swipe up with one finger repeatedly until the volume is set to 100%. Step four: finish by turning the rotor away from volume. If the volume was already at 100%, then this fix won’t have worked for you, but if the volume started below 100%, you should notice an improvement next time you use a keypad during a phone call or try to find buttons during a Zoom meeting. You might now want to remove volume from your rotor. Most people don’t need it, and it can result in unintentional VoiceOver volume changes that can sometimes be difficult to resolve.

Here’s the explanation for those who want the details. The volume setting on the rotor determines the percentage of the media volume to which VoiceOver is set. If it is set to 50, VoiceOver speech will be half as loud as any music or audio book. Some people like to have VoiceOver quieter than media, but there is one snag. There is a setting called audio ducking which comes to your rescue when you try to use VoiceOver while audiobooks or music is playing. With audio ducking turned on, the volume of the audiobook or music temporarily lowers to allow you to hear VoiceOver.

However, so far as I can determine, audio ducking doesn’t operate during phone calls or Zoom conferences, and probably other conferencing systems too. During a phone call or Zoom conference session, the iPhone gives priority to the call or conference, and does not reduce its volume to let you hear VoiceOver speaking more easily. If you have VoiceOver’s volume set to less than 100% of the main media volume, then there is a likelihood that the media signal, for example, the phone call speech, will be too loud for you to hear VoiceOver speaking.

If you set VoiceOver’s volume to be 100% of the media volume, then the call and VoiceOver are both talking simultaneously at the same volume. It won’t be easy to hear them both, but at least one won’t drown out the other. So far as I’m aware, there is no way of setting VoiceOver to be louder than the media volume. You might be wondering why I recommend putting volume on the rotor if it isn’t there already. People may begin their VoiceOver journey with volume on the rotor, and use that rotor position deliberately or unintentionally to change the volume setting. They may then decide to remove volume from the rotor, but without setting VoiceOver volume back to 100%. The only way to restore VoiceOver volume to 100% is by returning volume to the rotor and using it to reset the volume. Of course, if you want to keep volume on the rotor, you can adjust VoiceOver volume whenever you want, and you could always adjust it to 100% before making a phone call or joining a Zoom meeting, but for most of us, it’s simplest just not to have the volume setting on the rotor. I hope this information will be of use to your listeners that were experiencing the same problem.”

I would make one addendum to this excellent set of instructions that may help some people who are not experiencing this bug but are experiencing something different, because VoiceOver volume, by default, is set, I believe, at 80%. The addendum is this.

You might want to check how you’ve got a setting set in Accessibility, VoiceOver, Audio, and then I believe it’s under Sounds and Haptics. There’s an option there that determines whether the rotor volume controls the entire volume or just VoiceOver volume, and you want to make sure that it’s only controlling VoiceOver volume before you attempt these steps. I also want to thank Mike Thomas, who got in touch with the same advice. Thank you very much, Mike. It’s good to hear from you, and I’m not playing the contribution because it’s a duplicate, essentially, of what I just read out, but I hope you’ll stay in touch.

Now, I have plenty to say about this issue of the low VoiceOver volume on calls. I want to thank Jana Schroeder for raising this a few weeks ago because it really got me motivated to think about this and do what I can to help, because it’s something that I’ve been experiencing for some time. I know that not everybody listens to every episode. You’ve got a life, right? So let me just recap this issue for those who haven’t been following it. The issue is that some callers report that when they make a phone call, VoiceOver is so quiet that it’s pretty much impossible to hear VoiceOver when somebody is saying something or when you’re on hold and music is playing away on hold.

Now, as I said when Jana raised this, the way that I used to get around it was that there was an audio destination rotor option that you could rotate to, as long as you could hear VoiceOver clearly enough, and you could flick up or down to change the audio destination, and VoiceOver would come back at an acceptable volume. For some time, when you’re on a call, that audio destination rotor item hasn’t been appearing for me, so it hasn’t been possible to change it, even though it is still selected on the rotor.

Now, some people have chimed in and said that they have fixed the problem using the method that was just so very clearly described, so I’ve left that in, but I don’t think this is the same bug. I’m pretty confident that the bug that we are talking about, which some people may not have heard and so don’t appreciate how faint VoiceOver can get, is specific to made-for-iPhone hearing aids.

Now, I’ve spent a lot of time on this in the last couple of weeks. When I got my iPhone 14, I took one for the team and decided to set it up as new for a while. The reason why I did that was I wanted to see if the problem persisted when I hadn’t changed any defaults, and I can tell you that the problem does persist. If you set up the phone as new and you don’t change anything, it is still really bad. This was happening for me with my made-for-iPhone hearing aids, and it was also happening when I would switch on my new Sony WH-1000XM5 headphones, which are a great product, but that has to be one of the worst product names in history. When I used the headphones and made a phone call, I still got the problem of VoiceOver being exceptionally quiet on a call, and no way to change it.

Now I’m pretty confident I understand what’s happening, because I went back and looked at the contributions, some of which I’ve played, and I have been having some email conversations with others who didn’t want their contribution included. Of those contributions where I was absolutely certain they were experiencing this same issue that I’m talking about, and that Jana’s undoubtedly talking about, the common denominator appeared to be made-for-iPhone hearing aids. To test this out, I unpaired my made-for-iPhone hearing aids from my iPhone 14 Pro Max and restarted the phone, then I enabled my Sony WH-1000XM5 headphones, and when I made a phone call with VoiceOver set to 100%, it was very clear. In fact, it was almost too loud with VoiceOver at 100%, but everything was working as I expected. There is no way that I would have a problem hearing VoiceOver if I’m on hold on a call. When I pair my made-for-iPhone hearing aids again, the problem returns.

Now, I’m not clear that this is the case for every manufacturer of made-for-iPhone hearing aids, but I’m pretty confident that not everybody who has reported this issue is using the same brand as me. If it’s not a universal problem with made-for-iPhone hearing aids, I think what we can say is that it is a widespread problem with made-for-iPhone hearing aids. Here is what you can do. If you pair your made-for-iPhone hearing aids, you make a call, and you find that VoiceOver is super quiet even when VoiceOver is set to 100% on the rotor, find something that can plug into the Lightning port, an external audio device, even the old Apple EarPods. In my case, I connect a cable that goes to my mixer. Connect that, toggle VoiceOver off and back on again, and then disconnect it so that your made-for-iPhone hearing aids are connected to the iPhone again.

What I have found is that every time, without fail, that I do this trick, the problem goes away until I restart VoiceOver. That could be either by toggling VoiceOver off with a triple tap of the side button and then toggling it on again, or a restart of the phone, but as long as you keep VoiceOver on after doing this trick, you’re going to have good volume. At least, I do. This is consistent: toggle VoiceOver off and on again, and it will go back to the horrible volume. I have now submitted a detailed report to Apple Accessibility. It was really detailed, and I talked about all the various things that I’ve tried, including, obviously, setting up the phone as new, and I’m very confident about these steps. Apple Accessibility were fantastic. Within a very short period, they had written back to me thanking me for the detailed bug report, and they sent me a profile to install on my iPhone, with a limited shelf life, which did some pretty low-level Bluetooth logging.

What I was able to do for them was do this trick I talked about, with an external audio device plugged into the Lightning port and then disconnected again, and I made a call where the problem was not there. I then turned VoiceOver off and back on, and the problem was back, so I made another call. Then I was able to send my logs to Apple and say, “Okay, have a look at what’s going on with the audio subsystem when I made the call with the bug not present, and then again when I made the call with the bug present.” They have this data now, and they were grateful for it. They said they had passed it on to development. Now, if they can reproduce this and fix it, who knows what priority it will get? I understand that all these bugs compete for limited development resources, and I’m biased because this one affects me, and it affects me quite badly, but I do think that you’re dealing with quite a vulnerable population here: made-for-iPhone hearing aid wearers who also use VoiceOver.

I should say that when it goes into this mode, standard notification sounds are not affected, so if you get a ping from your email client or some other push notification when you’re on a call, the sound of the notification is very clear. Thankfully, VoiceOver doesn’t speak notifications on a call, and that’s deliberate because it would be distracting. It’s only VoiceOver that seems to be affected.

Now, here’s another made-for-iPhone hearing aid wearer experiencing this. Imke says, “I find that this happens to me a lot, regardless of whether I am streaming to the hearing aids or the speakerphone or using the receiver as normal. In addition, VoiceOver’s response to touch gestures during a phone call seems to be somewhat erratic and sluggish, which makes matters even more difficult. For me, having the audio streamed to the hearing aids is still the best option because it gives me the best chance at deciphering anything that VoiceOver is saying. However, I have learned that, whenever I need to make a phone call during which I need to press additional numbers on the dial pad, which is often the case these days when calling businesses, I have a braille (with a lowercase b) display on hand for efficient access. I don’t seem to have the problem in Zoom as much, but it is difficult to hear VoiceOver when someone in the meeting is talking, and the speech from VoiceOver can also distract from listening to the meeting.

Therefore, I also prefer using a Braille display when I am in an online meeting. The Braille display is a great workaround when one is available, but it would be great if we as a community could figure out a solution to these issues. They’d also help during audio-only use. I am currently using an iPhone SE 2nd generation (2020) that is running iOS 15.7, but I have had this issue on earlier iPhones as well.” Yes, I think this is something we need to convey. It is so bad when it happens for made-for-iPhone hearing aid wearers that you cannot hear the keypad. If some IVR system is waffling away, either with a long series of menu options or just a lot of spiel, and you know the option that you want, you cannot hear the option that you know you need to press until the voice on the other end of the phone stops speaking. That is how faint it is. It is a really horrible bug, and you’re right, a Braille display will do it. So will a Bluetooth keyboard, because you can also type the numbers on the number row.

An Apple tech support rep was rude and unhelpful

Now, I had a brilliant experience with Apple Accessibility over this issue. We’ll just wait and see whether it gets resolved and how long that might take. However, another listener has had quite a different experience with Apple and is wondering how people find Apple Support in general, perhaps outside of the accessibility space. Pam MacNeil wrote to Apple Accessibility to complain.

I’ll read you the email that she sent them because she sent it to me. She says, “I am writing as a VoiceOver user who relies on good sound quality, but if you feel the below is not your area, could you please pass along the following to whoever at Apple deals with such matters? I did just spend about half an hour on the phone trying to get a resolution. I purchased my iPhone SE 2nd generation on the 18th of July, 2020. I had heard this model had excellent sound quality, and as a blind person who listens to talking books, I felt this was the right phone for me.

About five months ago, the speaker near the top front of the phone, which also houses the camera, started distorting. I googled the issue and proceeded to take the actions recommended, such as checking the phone was not on Do Not Disturb, sliding the ringer up and down, and even resetting the phone. However, nothing has worked. A friend advised me to pursue the matter with Apple, and so I called just now. While waiting to be put through to the correct department, the AI asked me to input my phone number. However, when I tried to do this, I didn’t have enough time to input the numbers. I did eventually get put through to someone to deal with my issue.

The person I spoke to was difficult to understand and, likewise, seemed to have trouble understanding me. I am unsure why, as I have a very clear voice. I also got the impression he was deliberately misunderstanding me, as he kept bringing odd issues into the conversation, such as raising the idea of attaching the phone to a speaker, which is irrelevant since I need the built-in phone speakers to work correctly. Although I explained that I am blind when he asked me to read things out to him, he seemed to have great trouble appreciating that this means I can’t see, as he kept telling me to look for things.

Anyway, long story short, we eventually concluded the phone would have to be serviced by a technician. However, despite me giving my postcode, he couldn’t seem to find a shop nearby for me to take the phone to. In the end, he mentioned a place in Auckland, and I said, ‘No, I live in Upper Hutt.’” I mean, that’s a long way. For those people who aren’t aware, that’s like a 400-kilometer distance. “And he tried to refer me to a shop in Robina. I said, ‘You’re joking. That’s in Australia.’ He went silent, so I ended the call as I felt I was wasting my time. I have been an Apple user for some time, and I’m so disappointed by the phone’s performance and the lack of service I just received that I’m wondering if I will remain with the brand. Can you help, please?”

Apple Accessibility wrote back with this: “Thank you for your email. We apologize for your recent support experience with AppleCare that left your iPhone’s repair needs unresolved. We can submit feedback on your behalf if you provide us with the case number from that interaction. With regard to getting your iPhone serviced, we are unable to set up repairs or other service calls via this address. We recommend either contacting AppleCare Support again to connect with someone who can set up a repair, or viewing your service options at getsupport.apple.com.”

Pam has sent an addendum to all of this, and she says, “Hi, Jonathan, since I know you discuss Apple issues on your podcast, I thought I would send my complaint to you. While it is apparent that the person who replied does not believe this issue falls within their area of work, I found the advice to send it on with the case number of the issue unhelpful, to say the least. If the initial note sent by me had been read properly, it would have come as no surprise to realize that I simply didn’t get a case number. I didn’t go into detail in my notes to Apple Accessibility, but the man I spoke to at Apple was downright rude.

He kept sniggering and even started calling me “Spam” after we had established what my name is. I found this guy completely obnoxious and thought Apple should know about the caliber of some of its employees, but I have to conclude they don’t care. Putting all this aside, I am wondering if any of your listeners have experienced the issues I have had with the camera speaker on their iPhone SE 2020.” This is an interesting one, isn’t it? Because a few months ago I had a spate of listener concerns being expressed that Apple Accessibility, when you called their accessibility hotline number, were actively rejecting calls that they perceived were not accessibility related.

Now, we don’t actually have a toll-free number for Apple Accessibility in New Zealand, so if you want to call it, you have to call the Australian number, and many mobile plans these days do have free calls to Australia built in. That’s not a big deal, but I wonder whether, had Pam called that number, she would’ve been rejected, because they would’ve said the iPhone speaker isn’t accessibility related, and yet it’s particularly important, as a blind person, to her. She went through the regular channel. They clearly don’t understand what being a blind person means. They gave her the runaround, and then what surprises me is that nobody at Apple Accessibility wanted to just take ownership of the problem and make this right for the customer, because that’s the thing that Apple used to be known for. What I would hope would happen is that even if it’s not possible for somebody at Apple Accessibility to make the appropriate appointments and do the research, surely there must be a means of looking at the Apple ID. Assuming that Pam is writing from her email address, there’s a good chance that that email is the Apple ID.

They could look that up, find out whether a case number was assigned to this debacle, contact somebody who could take ownership of the problem, and say, “Look, we’ve got an unhappy customer here. We need to make this right.” For good measure, given that Pam says that she was being called Spam by the rep and the whole thing was just a very negative experience, chuck in a $50 iTunes gift card, for goodness’ sake. This is not what people pay top dollar to Apple for.

The App Switcher in iOS 16 is not working correctly for me

Let’s go back to an issue that was raised a few episodes ago. Daniel says, “Hi, Jonathan. I’m having what seems to be a focus issue with VoiceOver in the App Switcher. I’m running the latest version of iOS on an iPhone 12. I’ve tried the usual troubleshooting steps with no success. When I bring up the App Switcher, there is an app in focus, but if I try swiping left or right, I hear the sound that you hear if it is the only app in the App Switcher, even though it isn’t. I can double-tap and bring them all up. I’ve figured out this only happens if I bring up the App Switcher while in an app. If I bring up the App Switcher from the home screen, it works like it should. When I am having the problem, I can touch the screen and it will work like it should. Have you heard of anyone else having this problem? It only started happening with iOS 16. Any help would be appreciated.”

Thanks, Daniel. Good to hear from you. Dean Charlton mentioned this in his contribution, and I had heard of it. I’ve not experienced it myself, but I had been told that one magic bullet was that if you have the Invert Colors option on, switching it off should fix it. Now, Dean tells me it has not fixed it for him, but I’m also told that in iOS 16.1, which is about to drop in the next couple of days at the time this podcast is published, that issue is fixed. While we’re on the subject of things fixed, the ability to add widgets to the lock screen and customize your lock screen seems much better than it was.

For people who found that a bit confusing before, I’m not sure that it’s super intuitive now, but it’s way better. You might like to have another play with that. I’ve got my lock screen rocking. I’ve got my next appointment there at the top. I’ve got a Just Press Record widget. If anything ever happens, like I’m being refused service or something just needs to be recorded, I can just double-tap a button right from my lock screen and Just Press Record starts to record. I love that. I have that on my Apple Watch as well, and I’ve also got the weather conditions, so I’ve got my lock screen customized in a really cool way.

Announcer: What’s on your mind? Send an email with a recording of your voice or just write it down. Jonathan@mushroomfm.com. That’s J-O-N-A-T-H-A-N@mushroomfm.com or phone our listener line. The number in the United States is 864-60-Mosen. That’s 864-606-6736.

Thoughts on jingles, Optacons and iOS

Steve: Hey Jonathan, it’s Steve Bauer out of Wichita, Kansas in the United States, and I’ve got several quick points to make based on things that have been talked about over the last few editions of Mosen at Large. Well, it’s at the top of my bucket list, and that is to go to JAM Creative Productions down in Dallas to sit in on a session of them recording jingles and, someday, I hope that maybe I’ll be able to actually purchase my own jingles for my radio shows.

I got my first Optacon back in 1977 and used it extensively throughout my work career. I still have, actually, three Optacons. Two are safely stored away, and the one sitting here on my desk gets used almost every day. It’s one piece of technology that I could not live without. Well, I’m still fighting with my iPhone and the iOS system. Usually, once or twice a day, when just moving between apps or messages and email or text messages, VoiceOver just stops, and I have to press the button on the right side of the phone, and then I press it again, and VoiceOver is back on, and I have to log back in and get going again.

Notifications are still not working correctly, and that is so frustrating. Whether it’s a text message, a weather alert, breaking news, earthquake notice, you name it. For example, I can reach over and pick up the phone, and then, all of a sudden, it’s like an avalanche of notifications. All of the messages are coming in, other breaking news, everything is coming in, just crashing in all at once, and, sometimes, it could be up to an hour or more later after they came in that I actually get them because the notifications are being held up and delayed.

The third thing that’s happening with the phone is, sometimes, when I answer a phone call, VoiceOver is just speaking all kinds of stuff on the screen. I haven’t figured out a way to stop it. It basically doesn’t even allow me to hear the voice of the person that is calling, and it’s kind of obnoxious. I’m not sure how to best deal with that mess, but, iOS, you still have some problems to solve. Now, the one thing that has annoyed me a lot lately is the problem with the sleep timer in the BARD Mobile app. It is broken. If I have the sleep timer on, say, for 15 minutes, the book will play along like it’s supposed to and fade out just like it’s supposed to.

Automated Reader: He knew just by looking in her eyes that they had a future. She didn’t care about the past.

Steve: Right now, there, it faded out just like it is supposed to, but let’s see what the timer says now. Under the previous way it worked, when it worked correctly, it would tell you to double tap twice to cycle through the different timer options, 15 minutes, 30 minutes, 45 minutes, and so on, but now, here’s what it says.

Automated: Sleep timer button, less than one minute remaining.

Steve: “Less than one minute remaining.” The way it used to work, I could double-tap on the sleep timer and continue listening, but now.

Automated: Sleep timer, 15 minutes.

Steve: If I go down to the stop and play button.

Automated: Stop button, double tap to stop.

Steve: We’re already stopped. I’ll hit double tap on stop.

Automated: Stop, play button.

Steve: Now the play button is there. Sometimes, this procedure has to be repeated more than once. Now that the play button is showing, I can go back to the sleep timer, double-tap on it and the book will resume playing. It’s broken. It doesn’t work like it should. It’s rather annoying because sometimes you have to go back and press stop and play multiple times before it will start playing, so I hope NLS will deal with this on their next release.

Jonathan: Oh, I know that the BARD app is a very precious app to many blind Americans. I think I heard somewhere that they just did a significant redesign, so if you are having problems with the app, let us know how you’re getting on.

Should Aira retain its free calls?

Umberto: Hi Jonathan, my name is Umberto. I’ve been listening to your podcast. It’s such a great podcast, and I just wanted to comment on the question you posed concerning whether Aira should prioritize paid customers or take away their free program. I’ve been an Aira user for a couple of years. I started using Aira when they started, and I really enjoy their services now.

Now, I do understand, though, that Aira is a for-profit company. It is a business, and they need to feed themselves somehow, somewhere. A large number of blind people, especially in the United States, like 70%, are unemployed. Yes, Aira does have a job-seeking program with 30 minutes of tasks, so they can help you with your resume, or, if you’re unfamiliar with interview sites in person, they can help you map it out and navigate, but, other than that, they just have that.

A lot of folks are underemployed. I happen to be in a position where I’m employed, but I cannot afford Aira. I have a lot of other things to do. I have family to care for, I have my own expenses, I have my rent. Rents are going up, just everywhere, in Seattle, where I live, but just everywhere, the inflation costs. I don’t know, I think Aira should be, in my opinion, free. I don’t want to come across as if I have an entitlement mentality, but more in the sense that I went to this training center to get my blind skills, and Aira, when they were first introduced, when they came out, they had their five-minute free call that you could make just any time. You didn’t have to wait 24 hours like you do now. You could make five-minute free calls just any time, any call, any time during the day, just limited to five minutes, and you could call back and get the same agent or whatever for five minutes.

They really helped me a whole lot. For my grocery shopping, if I needed to chase down an Uber delivery person or an Uber driver, they really helped me out. If I am going grocery shopping and I need to see what a label says or the expiration date, I can direct the trained agents, who can really make it easy and very efficient for me with just the iPhone camera. They’re really good at doing their jobs, and I don’t think they should prioritize paid customers over free customers.

I recently moved, and I had to use BeMyEyes. I can use BeMyEyes, by the way, and they’re helpful, but, again, they’re volunteers. They may not know something, or they may not have the accessibility training or the training that an Aira agent requires to help. Aira agents are called visual interpreters; they just give the visual information in an objective manner. With BeMyEyes, I was trying to get my TV set up, and, literally, I got a guy who said, “Oh, sorry, my friend. You should call someone else. You should call someone who can help you troubleshoot, who is more knowledgeable than I do. I’m sorry.” He literally said that straight up. All I needed to know is what setting does what, what program does what, just reading information. That’s all I needed to know. An Aira agent probably would’ve helped me more with that inquiry.

Jonathan: Thanks so much for taking the time to give us that alternative perspective, the perspective of somebody who really would feel the loss if the Aira free program were to be taken away. Over on Twitter, Tanya Harrison said that one compromise might be to offer a prepaid program for Aira with minutes that don’t expire. Without having a monthly plan, you could buy minutes and they would never expire; you’d have them until you use them. That would generate more revenue for Aira than simply giving the minutes away, and it would also perhaps lessen the burden on the system.

I suppose the counterargument would be that there would still be people who may not be able to afford even that. Rebecca says, “I’ve had long waits with Aira, and most of my calls are business related. However, is it possible that the beta programs are contributing to the long waiting times? I’m referring to the new browser-based app. Aira may be going through some growing pains, and this may be a no-win situation. Please remember that the community complained that Aira is too expensive and not accessible to those who can’t afford it. We can’t have it both ways. As Aira continues to grow, could paying and non-paying customers expect different types of services? Could some Aira Explorers make better use of other tools? Could Aira warn customers about peak calling times? Maybe customers who do not wish to pay should use BeMyEyes for tasks that are not sensitive. Aira has partnerships with many businesses, and some of the non-paying customers may be using a service provided by another business, for example, Aira Access locations. I have learned to be patient and will try to call back later. Aira can’t be the only tool at our disposal; we should always have a fallback plan.

Aira recently sent out an email informing Explorers about rate changes that will be implemented next year unless the customer is on a plan already. If you think you will use Aira, try signing up for the $29 plan, if you can, to lock in the current rate.”

Announcer: Transcripts of Mosen at Large are brought to you by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at pneumasolutions.com. That’s P-N-E-U-M-A solutions.com.

Editing text without leaving Braille Screen Input for iOS

Matthew Horspool: Hello, my name is Matthew Horspool from Coventry here in the UK. If you are an iPhone, iPad, or iPod touch user and you also use Braille, you’ve no doubt already come across Braille screen input. It’s a very handy way of being able to input text using Braille on the touch screen of your i-device. One of the things that has historically not been so convenient in Braille screen input, though, is making a mistake because having made a mistake, you’ve had to rotor out of Braille screen input to characters, or words, or what have you, find the mistake, use the delete button to delete it and then go back into Braille screen input to Braille what it is that you actually wanted to Braille.

In many cases, I’ve personally resorted to the onscreen keyboard at that point because it’s just been easier. Well, thanks to a post on applevis.com, I’ve recently discovered a much easier way to edit within Braille screen input, and I’d just like to quickly demonstrate this to you now. I’m using an iPhone 14 Pro running iOS 16. It definitely works in iOS 16. I don’t know whether it works in iOS 15 or earlier because, as I said, I only discovered this on applevis.com about two or three days ago. I’ll rotor around to Braille screen input.

Automated: Letters, misspelled word, edit, Braille screen input, orientation locked, landscape, tabletop mode, contracted.

Matthew: Okay, and I will just type, I’m in a message at the moment.

Automated: Hi, hope you are having a good day today?

Matthew: Okay. As it happens, I didn’t make any mistakes in that message, but there you go, I might have done. What I’m now going to do is use my left hand. You could use your right hand if you wanted to, I just find it easier to use the left hand, and I’m going to hold down the letter A, but, again, I could hold down dot two or dot three or dot four or dot five, or dot six. The key is to hold down a dot. I’m going to hold down one finger on the left hand.

Automated: In exploring mode two.

Matthew: Now I have to keep my finger held down. This is very important. As soon as you release your finger, you will come out of exploring mode. Having held down my finger, I’m going to swipe down with two fingers on the opposite hand. In my case, on the right hand, I’m going to swipe down with two fingers.

Automated: We’re in exploring mode. Two lines, characters, words, lines, characters, words, lines.

Matthew: You can see it’s moving between characters, words, and lines, a bit like what happens on the rotor.

Automated: Characters, words.

Matthew: I’ll go to words. If I swipe up with two fingers, by the way, it goes the other way around.

Automated: Characters, lines, words, character, lines, words.

Matthew: Words is what I want. I’m now going to swipe left with two fingers.

Automated: Today day good a having are you hope?

Matthew: Okay. You can hear that it is moving the cursor as though I was rotoring to words and moving around. I’ll go back the other way.

Automated: Hope you are having a good day.

Matthew: Okay. Hope you’re having a good day. Maybe I want to change that. Maybe I want to say “Hope you are having a great day,” the first thing I need to do in order to make this happen is move the cursor from where it is now at the end of the word “day” to the start of the word “day.” I’ll swipe left with two fingers.

Automated: Day.

Matthew: Now the cursor is at the left end of “day.” Now, if I release my finger on the left hand, I can now swipe left with two fingers-

Automated: Good.

Matthew: -and you’ll see that it’s deleted the word “good” so that the swipe-left-with-two-fingers gesture has now returned to what it would normally do, and I’m now going to type great, G-R-T, great, and space. Now I’ll go back into exploring mode by holding down with one finger.

Automated: In exploring mode. One, two.

Matthew: Fantastic. Why it’s saying, “One, two,” I’m not sure. I think that’s the dot that I’m holding down, but I’m now going to try and read by lines. I’ll swipe down with two fingers.

Automated: Now in exploring mode.

Matthew: Okay. It’s got a bit of extra verbiage there, but I can now swipe left.

Automated: Hi, hope you’re having a great.

Matthew: Okay. It’s read up to where the cursor was. If I swipe right.

Automated: Hi, hope you are having a great day today.

Matthew: Okay. You can see that the message now says, “Hi, hope you are having a great day today.” There are one or two other gestures. I think if I swipe, I’ll go to characters, actually, swipe down with two fingers. If I swipe left with three fingers–

Automated: Space, full stop, selected, slide, selected, A, selected, D selected.

Matthew: You can see that does text selection if I swipe right with three fingers.

Automated: D, unselected. A, Y, full stop, space, unselected.

Matthew: That unselects. Swiping up and down with three fingers doesn’t seem to do anything. As I say, I only discovered this a couple of days ago. This is not a comprehensive demo. I’m sure there are other gestures in there that I haven’t discovered yet. I’m sure there are probably other things you can do in this mode, but even with just those few gestures, using Braille screen input just got a whole lot more useful. Hopefully, other people will find it useful too. In the meantime, thanks very much for listening.

Jonathan: Brilliant demo, Matthew. Thank you very much for putting it together. I wonder how long this has been around for. I don’t think it’s been that long. I like to think that if I spent the time now that I used to back in the old days of writing the iOS books, I would have found this, and I would’ve been able to write it up whenever it was added. Part of me was pretty pleased that Apple really didn’t go out of its way to document new features in VoiceOver. They didn’t do videos about what’s new or anything like that, because it helped me make a pretty good living. We sold many, many, many thousands of those iOS books. Thank you for buying them.

Now that I don’t have time to write them, I think, “Why doesn’t Apple actually do this stuff themselves? Why do they leave us to our own devices to discover a lot of these features?” Interesting, isn’t it? Anyway, it’s good that people are fossicking around the operating system and finding these things out (I get to use the word fossicking twice in the same podcast), because this is a big deal. I should say that I did go to Apple’s website and searched quite comprehensively for what documentation Apple has on Braille screen input. Braille screen input is documented and there are commands documented, but this feature is not.

The Graphiti Braille display

I believe we are going to Sweden for this email. I say this with some confidence because there’s a +46 number in the signature, and having recently come back from Stockholm in Sweden, I know that’s the country code for Sweden. Good detective work, eh? It’s Carl Otto writing in. He says, “Hi Jonathan, I found your podcast through the Program-L mailing list, where I saw a tip about Remote Incident Manager and the presentation the authors did.

Anyway, I heard the episode where you discussed the Dot Pad, and I wondered if you’d tried the Orbit Research Graphiti. That seems like a very similar product. I confess that I haven’t scrolled through all the episodes of Mosen at Large to check.” I think we’ll forgive you, Carl Otto, because there are an awful lot of them these days. He says, “I’m curious about these products in the context of being a software developer. Would they be useful when designing user interfaces in Visual Studio? Would it be useful to mirror an Android or iOS simulator during development to get a real feel for the layout of the user interface I’ve designed? The Graphiti sports a price of $14,995, so it’s quite heavy for me as a sole trader company, just for testing if it’s useful. Therefore, I’m interested in hands-on reviews and similar, and thought of you.”

Carl Otto, I have not had my hands on one of these, and it reminds me that we really should extend an invitation to Orbit Research to have someone come on and talk about the range of products that they have. If anybody from Orbit Research happens to be listening, they’re very welcome. It would be great to do that. He continues, “When I first started the episode with RIM, I thought that your voice was familiar, and then it struck me. I’ve been a beta tester of Fusion since its first appearance, and you had their podcast back in the day.

Your review of the iOS 16 beta was interesting. When it was released, I tried door detection on my work phone, an iPhone 12 Pro. Needs more practice, I think. Have you used the handwriting on the screen that you can use when VoiceOver is running? I think it’s somewhat faster to use than the virtual onscreen keyboard. I found and reported a bug in iOS 16 for the handwriting. You can’t write a lowercase “i”; it always becomes a lowercase “l”. Apple has confirmed this, and I hope they’ll fix it in the upcoming dot release so I can update my primary phone, an iPhone 12 mini.”

Thanks, Carl Otto. See, it’s really cool that there are so many ways to get data into an iPhone these days. I can remember when I got my first iPhone, all you could do was double tap each letter, and maybe split tap as well, I think, but that was pretty much it. Over the years, there have been more and more input methods added, including dictation and Braille screen input, the various ways of using the onscreen keyboard, and handwriting.

Of course, you can use external keyboards and Braille displays. That wasn’t possible when VoiceOver was first introduced to the iPhone either. For me, my handwriting is just useless, and it’s not an input method that I would ever consider using, but I am fortunate that Braille screen input comes as second nature to me, and I am able to get info into the phone, using the virtual keyboard with Braille screen input, at quite a decent clip. That’s my input method of choice most of the time. It’s good that handwriting is there, though, and it’s important that that handwriting bug be fixed. Perhaps by the time we play this, it already will have been. Thank you very much for writing in. I’m glad you’ve discovered the podcast, and I look forward to hearing from you more in the future. Sweden’s a lovely place, by the way. The air felt so fresh and clean when we were out there.

Now, here’s Jenine Stanley, and she says, “Hi Jonathan, thank you for the interview with Eric of Dot. I got my hands on one of the prototypes at the NFB convention, and the thought of being able to view graphical information in real time was so exciting. I love tactile graphics. I am also fascinated by reports in the Salesforce app, Looker. Unfortunately, these reports, also called views, are, for the most part, inaccessible. I can’t wait to run my fingers over a Looker view as presented by a Dot Pad. Sadly, the cost of such a device would be out of the range of many, if not most, individuals, but it would certainly be great to know that such a thing could be done, and those of us who do work with various business productivity tools, like Salesforce Looker, PowerPoint, spreadsheets, and more, could have a meaningful experience with that data.

As it stands, we rely on another human to interpret that data. This is fine but doesn’t really give the entire experience of viewing a graph and interpolating information from it. If I could view the data myself, I might make a very different conclusion than the person telling me about it, because I’d know exactly what parts I was interested in and could then add to those parts, by viewing all of the data, not just what’s convenient to describe in a meeting.

I do think that, eventually, tools like this will open career opportunities for blind people, and I applaud Dot for moving the technology forward. Speaking of multi-line Braille displays, I’m sure you’ll be covering the Orbit Research displays due to be available in October. I really appreciated the three-line display, though it was only a static model. I saw it at NFB this summer. The five-line one was a bit too small per line for me. Who would’ve thought we’d be talking about this much variety in displays and research into displays?” Thanks very much, Jenine. It is exciting, isn’t it? The Braille renaissance is a happening thing, and I love that.

[music]

Automated: Jonathan Mosen, Mosen At Large Podcast.

Recording voice messages and melatonin concerns

Jonathan: Robin Christopherson is writing in and says, “Great show as always. I’m nearly caught up, so apologies if this has been covered already, but in the Messages app, you can also record a voice message, by simply raising the phone to your ear. I’m pretty sure this has been around for a while, but with the new layout in iOS 16, this might be a useful tip for many. It still works if you have AirPods in, but I can’t be sure about connected hearing aids, et cetera. Of course, it will probably also work if you raise it to a cheek or palm, say.”

It doesn’t work so well when I’m using made-for-iPhone hearing aids, Robin, which is unfortunate. It’s a really important point, because most people don’t use made-for-iPhone hearing aids, and it does work very well in other circumstances, just raising the phone to your ear. “Also,” says Robin, “and I’m pretty sure I’m in insulting-your-intelligence territory here, but your mention of taking melatonin to aid sleep prompted me to take a look. The below page gave me pause regarding taking it for prolonged periods, and also made me hope that your dosage isn’t an unsafe one. The website is nhs.uk/medicines/melatonin.” Thanks very much, Robin, and I’m certainly not going to be getting into providing-medical-advice territory; people should consult their doctor when putting anything into their body. What I can tell you is that I’ve been taking melatonin for 25 years. What I can also tell you is that melatonin is widely available over the counter in many countries, whereas, in the United Kingdom, it is a prescription-only medicine.

I think it is still a prescription-only medicine here as well, actually, but in countries like the United States, which tends to be quite a conservative place in this regard, it is a nutritional supplement, and you can just walk into any number of places and buy melatonin, including supermarkets like Walmart. It’s treated like a vitamin. The final thing I will say on this is that I have had two conversations, during my career as a broadcaster in the blindness space, with Professor Steven Lockley. When I last caught up with him, he was still at Harvard. I’m not sure if that’s still the case. I first talked to him well over 20 years ago now, on Blind Line on ACB Radio. If you go into the archives for The Blind Side, which is still available online, you will hear the most recent talk with Steven Lockley, where we talked about blind people with a specific condition called non-24.

This is where people who don’t have any light perception have their circadian rhythms going into free fall, essentially. For many of us with non-24, melatonin has changed our lives. I wouldn’t be able to do my job without taking melatonin. I’ve not become aware of any side effects over the last 25 years or so, but people should research these things carefully and take advice. What I would suggest, though, is that those interested in this go and listen to Steven Lockley’s interview in The Blind Side archives, and maybe I should resurrect it and put it in the Mosen At Large archives, because it is such an important subject.

He made an offer then, and that interview was recorded maybe five or six years ago, so I don’t know whether Steven is still practicing or whether the offer still holds, but he said that if blind people came across GPs or medical professionals who were not aware of non-24 and the very positive benefits that melatonin can have for blind people, he would be very happy to engage with those medical professionals and clue them in, because the literature is overwhelming. In fact, there is a drug that’s available in the United States, which is a slightly more professional thing than nutritional-supplement-level melatonin, which has had an even greater impact, but it’s still melatonin-based, and it’s made by a company called Vanda Pharmaceuticals, I believe.

I was quite surprised at the tone of that page on the NHS website, but it probably explains why melatonin is so difficult to come by in the UK. I’d be interested to find out from blind people in the UK with non-24: have you had much trouble getting through that barrier? When you talk to your GP about the life-changing benefits of melatonin for blind people with non-24, do you face much resistance, particularly given the very downbeat tone of that page on the NHS website?

Various tech topics

Following up on my reading of Sabahatan’s email and my responses to it, Sabahatan’s back again, and he says, “Hello, again. Of course, by now, that is, by then, iOS 16 was released. I expect it took a little while to catch up on the mailbag.”

Absolutely, we’ve got a lot of email coming in. I’m not complaining about it, but it does mean there can be a bit of a lag between when people send an email and when I get to read it on an episode. He says, “The punctuation controls and app privacy reports were actually features of later iOS 15 builds.” Yes, indeed. It just seemed like a good time to catch up and acquaint people with those features. “I’d still hold off,” he says, “for iOS 16.1, personally, if you haven’t taken the plunge already.”

“As you said, the widgets are just not very nice from a VoiceOver perspective. In fact, widgets seem to be a problem wherever they are, in my experience, on the Watch as well, and in the macOS Notification Center. They just don’t seem to have received any accessibility attention at all, somehow. I did look at Pocket Casts, but the desktop apps were basically these weird web things, not all that different from the web player. I expect you know what I mean.” Yes, it sounds like they could be Electron apps. “Mobile rules everything now, sadly,” he says. “The iOS app wasn’t that bad actually, but there’s no particular reason to prefer it. I’ve heard very nice things about the Ubiquiti ecosystem from its fans, but it’s a bit much if you just need the wireless access points, because you need to run a controller device or software.” The thing is, Sabahatan, that they’ve taken care of that now. There’s a device you can buy, and there are two flavors of it: the UniFi Dream Machine and the UniFi Dream Machine Pro. You can always take a look and decide which one might suit your needs, but it is basically an all-in-one router, modem, and, most significantly, controller.

It’s got the access point in there as well, and it’s a breeze to set up. Once you get UniFi going and you adopt new devices into your UniFi ecosystem, it is just such a joy to use. If you’ve not familiarized yourself with the UniFi Dream Machine range, I’d encourage you to take a look, and I’d be interested to hear what you think. He says, “At the risk of over-geeking, I actually perform my routing on an old 2012 Mac mini with Linux on it, and I’m looking for something with a similar degree of flexibility. MikroTik might be where I end up, but they are absolutely, definitely not for the fainthearted. Their switches are already serving me well though, and my network is all 10 gigabits from the lounge to the bedroom; everything that can be wired is wired, which is great stuff. Right now, the internet is over cable, which is the only black spot. It’s a monopoly high-speed option and has 1.1 gigabits downstream, but a mere 50 megabits upstream.” Whoa. See, our top speed that we have access to is 8 gigs down and 8 gigs up, not to rub it in or anything.

He says, “I would rant about market fundamentalism and the illusion of technological neutrality at this point, but I’ll spare you that for now. The important thing is that as soon as proper fiber connectivity gets here, I’ll take it up. I agree with you that Boot Camp would be very nice on the M-series Macs. Apple has not rolled it out. Apparently, it’s down to a licensing deal that Microsoft has with Qualcomm. Technically speaking, the Macs are nothing like other ARM machines, so there would be quite a bit of work between Microsoft and Apple to make it a reality, although Asahi Linux has done it all by reverse engineering and now works, and parts of that could theoretically be employed or replicated to support Windows. It’ll be interesting to see if someone else, motivated by the same desire, gets it going first. It’s undeniable that virtualization adds latency, though it’s certainly practical for many things now. Still, if you need Windows, it does make sense to keep an Intel or AMD-powered machine around anyway, just to fully support all the software you have. Enjoyed the podcast as ever.”

Vispero supporting the Access Technology Affordability Act

Hello, once again, to Aaron Espinoza, who says, “Hello, Jonathan. I don’t know if you already talked about it, but I would love to hear your opinion regarding Freedom Scientific supporting the Access Technology Affordability Act, ATAA, in the United States. They have a video on their YouTube account telling people to contact their representatives to support the bill. It didn’t pass this year, but it’s going to be introduced again, and I’m sure they’re going to support it again. Sorry to be cynical,” says Aaron, “but they would love it if the ATAA were to pass and become law, because blind and visually impaired people would have $2,000 every three years to buy new hardware and software from Freedom Scientific. It would be a giant cash cow for them. I would also worry about the ATAA hurting innovation. What do you think?” If you haven’t heard it, this is the YouTube clip in question.

YouTube: Freedom Scientific, a Vispero brand. Have you heard of the Access Technology Affordability Act? The ATAA is a bill that makes access to technology more affordable for Americans who are blind and visually impaired, by creating a $2,000 refundable tax credit for items purchased over a three-year period. This includes items such as Braille embossers, refreshable Braille displays, screen reading software, and more. Acquiring access technology can empower users who are blind and visually impaired with the necessary tools to create and edit documents, send and receive emails, and access a wealth of information on the internet.

The ATAA enables the purchase of equipment needed to achieve success by making it more affordable. Sounds great, right? To get this bill to pass, we need your help. Contact Speaker Nancy Pelosi and Majority Leader Chuck Schumer, and urge them to include the ATAA in the Build Back Better Act. You can also reach out to other House and Senate members, including co-sponsors and those who represent your district. Contact us for more info on how to get involved.

Jonathan: It seems perfectly reasonable to me, Aaron. First of all, on your comment about how it might stifle innovation, I think the opposite is the case. One of the problems that we have in our community is that the need is real, but the socioeconomic stats of many blind people are dire. We’ve talked about this: the incredibly high unemployment rate, the fact that people really do have a genuine need for this kind of technology but can’t afford to get their hands on it, and that creates this awful poverty trap where people could have a greater chance of gaining employment if they could use this technology to upskill themselves, but it is often beyond their reach.

If you suddenly had this tax credit floating around, meaning that more blind people had the purchasing power to spend on this technology, it may well encourage new entrants into the market. People get grumpy in protest when people like me make these points, but the reality is that to spread the cost of manufacture across a small number of consumers, as you have to do with things like Braille displays, and even screen readers, the price is going to be high. If you’ve got quality products that are well supported by constant software development, by technical support specialists, by quality control engineers, those people have to be paid. If you can have a situation where more people can get their hands on this technology because of a tax credit like this, then sure, it probably would be a win for Freedom Scientific and other Vispero brands, but it could also encourage new entrants into the market. They’re taking the chance that they will stand by their products, that they believe they’ve got technology that blind people respect and need, and that they will be okay if any other entrant comes into the market. Who knows, it may well generate a bit of excitement in the screen reader industry, which I don’t think there has been for some years now.

[music]

Announcer: I’d love to hear from you, so if you have any comments you want to contribute to the show, drop me an email, written down or with an audio attachment, to Jonathan, J-O-N-A-T-H-A-N@mushroomfm.com. If you’d rather call in, use the listener line number in the United States, 864-606-6736.

[music]

[01:57:41] [END OF AUDIO]
