Jonathan Mosen: I’m Jonathan Mosen and this is Mosen At Large, the show that’s got the blind community talking. On the show today, Apple will soon announce the next big updates to their operating systems. What are you hoping for?
We have a new gadget to play with at Mosen Towers, as our old washing machine gave up the ghost, and more on HIMS products.
Apple’s Worldwide Developers Conference is coming up on the 7th of June. That’s when the Keynote is going to be held, at 10:00 AM Pacific Time, which is 1:00 PM Eastern Time in the United States. As ever, right after the Keynote, we will be here to try and make sense of it all from a blindness perspective.
On the panel this year will be Heidi Taylor, giving us good descriptions of what’s going on and getting into the weeds on the visuals. We’ll have Michael Feir and we’ll also have Judy Dickson and me here to talk about it.
We’ll also bring you in on Clubhouse as well, so you can go and find that event in the Mushroom FM club on Clubhouse. You’ll be able to listen to the initial discussion that we will do and then we will open it up.
I wanted to start talking about the WWDC anticipation this week so that everybody had ample time to get their wishlist in before the event on this and next week’s show.
As you may recall, when I was running Mosen Consulting, I used to compile my top-10 wishlist for the coming version of iOS. Many of these wishes were VoiceOver-related, which is not surprising given that I’m blind, and VoiceOver is the tool that makes using the operating system possible.
Some of my wishes were general in nature, while others reflect the fact that I’m also a hearing-impaired person. I wear made-for-iPhone hearing aids, so that side of the operating system is important to me as well.
I’ve been delighted over the years to see more and more of my wishes granted. It’d be hard for anyone to argue that VoiceOver on iOS, and for that matter iPadOS, isn’t a powerful and mature product. Not only have many of the obvious VoiceOver gaps been filled over the years, but there are some truly innovative features in VoiceOver, some of which have demonstrated the benefits of the hardware being so closely linked to the software. The operating system itself is now very mature as well, and all this means that it is getting harder to compile a top-10.
But here is my list for this year ahead of iOS 15. This list isn’t necessarily in priority order. They are items that have occurred to me this year and in past years that still remain. I’ve got at least one new one this year.
First, I want contracted Braille input to work as well on the iPhone as JAWS BrailleIn does in Windows. I’ve tried putting this several ways over the years, and I may as well just be blatant about it: that’s what I want. JAWS is the gold standard for Braille input, and if Apple truly understands how a blind person uses Braille input for editing and text entry, it’s not only capable of meeting that standard, but who knows, it may find a way to surpass it.
I know that Apple has worked really hard on improving Braille back-translation over the years, and I appreciate that effort very much and the improvements that have been made, but what I really want is for it to feel like I’m actually Brailling into a contracted Braille file when I’m entering, editing and correcting text. That’s how robust the JAWS BrailleIn support is in Windows.
This is an absolutely critical issue as more kids are given iPads in their classrooms. Our blind kids must have robust Braille support during their formative years when their Braille skills are being developed. We cannot and must not hold back on an issue as vital as literacy for the next generation of blind people.
Apple has to do better at getting contracted Braille input and editing right. At one time, we theorized that this couldn’t happen on an iPhone because PCs were just more powerful, but we have long passed the point where that is a viable excuse. This is a UX problem, and that also explains why Braille hasn’t kept up on the Mac either.
Although this is typically not the way Apple does things, I really just wish they would lock a bunch of truly competent Braille-using blind people in a room, put them under a draconian NDA if they have to, but truly understand how those of us who rely on Braille use it, and then fix it.
After many years of trying to get along with Braille input on iOS, I’ve taken the drastic workaround of using an APH Mantis, which has a QWERTY keyboard and therefore liberates me from the foibles of Braille input on iOS once and for all.
This brings me on to my second wish for iOS 15. I’m a supporter of the Human Interface Device (HID) standard for Braille. Being able to plug a Braille display into something and have it just work, in the same way you can a printer or a thumb drive, is compelling. It’s just common sense.
But it seems that Apple’s implementation of the HID for Braille may not have taken devices with QWERTY keyboards into account. You can only assign Braille keys to Braille-related functions, and that severely impedes your choice on a device like the Mantis.
A Mantis user must make decisions about what functions are most important for the tiny number of assignable keys available. The irony is that with a QWERTY keyboard, complete with modifier keys, Braille users who use QWERTY devices should be at an advantage, not a disadvantage.
Third, an API for text-to-speech. I think it was last year that we saw brief signs of this; then it went away, removed during the beta cycle. It’d be wonderful if the API made it to release this year, and I’m quietly confident that it will. This is important because it’ll save you storage space and give you much more choice when it comes to how VoiceOver speaks.
This is particularly important for those of us with hearing impairments. While a bit of variety is nice for all of us to have, the more voices a hearing-impaired person has access to, the more likely it is that they’ll find something that works for them.
In terms of the storage space savings, Voice Dream Reader, which has become one of my favorite iOS apps, has an excellent range of voices available. In some cases, those voices may be on your device two, three or more times, taking up valuable space. It’s not a good user experience. Then we could lobby for Eloquence to come to the iPhone, and therefore to VoiceOver, which would be utterly epic.
Fourth, as I mentioned on the show recently, as a blind person who wears hearing aids, I would truly appreciate Live Listen offering a stereo option. For those not familiar, Live Listen is where you can use your iPhone as a remote microphone for AirPods or made-for-iPhone hearing aids. If you switch off the local mics on your hearing aid and you put your iPhone in the middle of a table, as a blind person, you have no idea where the speaker’s voice is coming from.
You don’t know where to turn to face the speaker who is talking, because Live Listen is broadcasting a mono signal to your hearing aids. There are times when you do want this and there will be times when you don’t, so it would need to be a toggle.
Fifth in my top-10 for iOS 15, I want the ability to stop VoiceOver from interrupting what I’m reading with notifications while still playing the notification sounds. As a former product manager myself, I have always been fascinated by why people use what they use and why they use it in the way that they do.
I have been perplexed about why so many blind people still use additional blindness-specific devices, like the Victor Reader Stream in an era where the iPhone has battery life to burn and can play all the same file types that the Stream can.
Why are so many people bothering with another device that you have to charge, load content onto, keep track of, and potentially lose? Of course, it’s more money that many blind people struggle to find.
One of the reasons people cite for this is that some people prefer buttons, and there’s obviously nothing that you can do about that in an iOS update. Another reason is that if you’re reading certain types of content, like an Apple or Kindle book, a notification will interrupt the reading of your book, and it often doesn’t resume when VoiceOver’s finished speaking the notification. Often you have to start reading from the top of the page, unless your dexterity is particularly good.
My iPhone is highly customized. I have unique text alerts for all the important people in my life, and many of my other apps make unique sounds as well. When reading a book, I just want the option to hear the sounds of my notifications but not have VoiceOver actually stop to read them. If I hear a sound that indicates someone important is texting me, I have the option to stop and check the notification, but if it’s a notification I know isn’t important, I can just keep right on reading. This would be a great middle ground between putting the phone on Do Not Disturb and being interrupted all the time.
On to my sixth item: make the phone vibrate at power-up. This is probably a lower-level issue than iOS itself, but it confounds me why Apple has gone this long without the iPhone, or for that matter the Apple Watch, providing haptic feedback to say that the device is powering up. It is a 101 accessibility feature.
I’ve trained many blind people on the use of the iPhone over the years, and I’ve seen so many of them get frustrated and disillusioned when they can’t even tell if the thing’s booting up or not. People are often apprehensive about the touch screen, so when you can’t even tell whether the device is powering on, it’s not a very good introduction to a new device you might already be nervous about.
How can Apple get so many things right and ignore the need for something so rudimentary?
My seventh? Continuous reading improvements. There are several enhancements to the iOS continuous reading function that you currently invoke with a two-finger flick down that would further improve it for me.
I like to read my Twitter timeline in proper chronological order; reading from the most recent tweet backwards doesn’t give you the sense of a story unfolding. This is why, when I read my Twitter timeline, I need to do it manually, flicking through each tweet, because there’s no way of reading continuously up the screen.
This feature would make a real difference in several apps that put the most recent item at the top of the screen. Castro’s Inbox, which is my podcast app of choice, is another example of where this feature would be a major improvement.
I’d also like to be able to flick up, down, left, and right to navigate by elements such as sentence and paragraph without interrupting the reading. You could still use a two-finger tap, or even perhaps the Magic Tap, to stop reading.
Eighth on my list, external audio description. We have come such a long way with audio description in recent years, it is just amazing that as blind people we now have access to so much audio-described content.
When Bonnie and I are watching TV on our own, since we’re both blind, we’re grateful for all the advocacy and implementation work that’s been done to give us so much choice of content to watch.
My kids are all grown up now, but every so often they will come over, and we’ll do something bonding like watching a movie or a TV show together.
When they listen to the audio description with us, they’ll all watch the movie with description on and they won’t complain. I’ve heard them on occasion say that sometimes the audio description helps them take note of details they might otherwise have missed. But I can’t help noticing that when they’re watching TV on their own, even in our house, they’re quick to turn off the audio description, and that got me thinking.
I would love to see a feature in iOS that allows a user to specify an external device to which the audio description could be sent. For example, let’s say the family’s watching a movie on Apple TV using AirPlay 2, which can output to multiple devices. I’d like to be able to tell the Apple TV to send the audio description part of the soundtrack to Bonnie’s and my iPhones.
I understand that this is not as simple a request as it first may appear, but I also believe that Apple is capable of handling it. It might also mean that you’d be able to keep the full Atmos soundtrack working without the audio description on capable devices.
It is really disappointing to find that when you switch audio description on, sometimes the soundtrack reverts to surround sound or even just boring old stereo. I think blind-sighted couples would really benefit from this feature as well.
You might well say that eight is enough. That’s an obscure reference to an old TV show, but eight is not enough. I’m going on to number nine. That is, I think CallKit needs to work with more peripherals. Now I say “I think” because what we know is that apps like Clubhouse don’t work on iOS with devices that emulate a USB peripheral through the lightning port, like a USB microphone or an audio interface.
That’s why sales of products like the iRig 2 have gone through the roof as Clubhouse has gained popularity. Perhaps this is within Clubhouse’s power to fix but one would think that if this were true, given how much of an emphasis they’ve placed on creators recently, they would have fixed it. If CallKit is the culprit, it would be great to see this work with more gadgets.
Finally, I’m still getting this one in. I am going to keep harping on about defect equivalency. Many of us are profoundly grateful for the way that Apple has altered the landscape of accessible computing, but we can be grateful while also demanding that it works just as well for us as it does for everybody else.
After all, we pay just as much as everyone else. Let’s be clear, all software has bugs, and you’ve got to stop developing some time. There comes a time when you have to make a call that says, “We need to release this now. We will absolutely keep working on the outstanding bugs, but we’ve got to get a product out there.”
The critical question then becomes what software bugs are unacceptable in a public release that doesn’t have a beta designation?
I believe that in an accessibility context, Apple is failing to address this question appropriately, or even humanely. I don’t believe for one moment that the problem with the accessibility bugs that have filled Apple’s initial releases of every major version of iOS for many years is that no one told them about the bugs; that’s demonstrably not the case.
We’ve all reported these bugs. Someone somewhere is making the call that it’s okay to release iOS with the extremely serious, for some users show-stopping, VoiceOver bugs that we’ve been seeing for many years now. We can and should complain about that.
But I also believe we should complain constructively. To that end, again, I offer a simple guideline that, in my view, should assist Apple in determining whether a bug is tolerable until it’s time for another major release.
The key to this is to translate the impact of a bug into an equivalent bug for the sighted. If that happened, I would like to think more of these serious issues would not be deemed acceptable for release. When a mainstream manufacturer includes a screen reader in a mainstream product, it becomes, by virtue of that screen reader being there, a screen reader company, and it must be held to the same standard as any other screen reader company.
There are many benefits to the manufacturer of the operating system, and in this case the hardware too, developing the screen reader, not least of which is that the cost of development can be spread across a very large number of units and the screen reader made free to the end user. But unless we insist on parity of quality, the downside is that we’re the last cab off the rank, an afterthought.
Those are my top-10 for iOS 15, the things that I am hoping for. What about you? I would love to get your thoughts on my list, as well as additions from your own. You can drop me an email with an audio attachment, or write it down, and send it in to Jonathan, that’s J-O-N-A-T-H-A-N, at mushroomfm.com. The listener line number is 864-60-Mosen; that is, in the United States, 864-606-6736.
Let’s get into the comments that we have already received. Tracy Duffy writes, “Most of the time, I’m reasonably happy with VoiceOver, though I’m sure I may hear things I’ll agree with on your upcoming show. What I’d really like Apple to do, though, is to improve Siri.
“It used to be that you could ask a question and actually get an answer from Siri. Now, more often than not, what you get is ‘I found this on the internet,’ or ‘on the web,’ and then it reads you something from a web page. If I wanted that, I’d say ‘search for’ or ‘look up’ something.
“If I ask, ‘How old is Jonathan Mosen?’ I expect an answer, not a web page excerpt. When iPhones were relatively new, we had a lot of fun just asking Siri questions, and we were amazed at the things we could get answered. Now, it just seems like a lazy search engine.”
Thanks, Tracy. I have to agree and we use the Soup Drinker for this a lot. I was quite surprised actually, that I still find the Soup Drinker the best option out of all of them. A while ago, we were kindly given a Google Home. I wouldn’t have gone out of my way to buy a Google Home, but we were given one.
I set it up and I thought this is going to be the gold standard because you know when you type things into Google, you get amazing results back. Surprisingly though, I still find myself asking the Soup Drinker more questions than any other assistant because it tends to come through with the answer more.
Carrie Francis is in touch from Canada with her wishes, and they are: an accessible heart rate monitor for the iPhone; sounds to let you know when an app is downloading, and has finished downloading, onto your iPhone; and auto-routing always switched on, for those of us who have trouble tapping on the Accept Call button for FaceTime in iOS 14.6.
Can’t you just do a two-finger double tap? Maybe I’m misunderstanding what you’re saying there, but a two-finger double tap, the Magic Tap, answers the call on FaceTime. Anyway, she continues, “Oh, that’s all I can think of for right now.”
Thanks, Carrie. I appreciate those thoughts.
Agent Summerson: Hi, Jonathan. Hello to all your listeners. Agent Summerson here from Boreham in the UK, Big A from the Accessible Computer podcast. On the Mac, I think I’d like to see some improvements to VoiceOver Utility: a clearer interface, reorganized in a more intuitive way.
Another thing which I’d like to see sorted out on the Mac is the option to open multiple instances of the same program. It is not possible at the moment, and I’d like to see that option available on the Mac.
Also, in terms of software, Siri, Apple’s assistant, must, but really, really must, be improved. Siri is not at all a voice assistant of 2021. It’s not. I think Siri is dragging back a lot of Apple products. I think 8 times out of 10, I receive from Siri, “Hold on, I’m working. I’m on it.” I’m really fed up with that. I really am.
Dictation is also important to improve, not only in English, but in other languages too. It is funny to say that in the Romanian language, I cannot make dictation insert the word “podcast” properly. For some reason, it writes it as two words: pod, P-O-D, space, cast. P-O-D, by the way, means “I can”. [chuckles] Funny, isn’t it?
Every time I say the word “podcast” in dictation in the Romanian language, I have to double-check whether the word is correct, and it never is. Not once since I’ve been using this facility has it written the word “podcast” correctly. Dictation, Siri.
Also, on the Apple Watch, I’d like to see the option to listen to voice messages: voice messages coming via WhatsApp, via Messenger, via iMessage. Short audio messages, I mean.
Scott Davert: Hey, Jonathan and listeners. This is Scott Davert. I wanted to comment on iOS 15. Before I do that though, I would like to tell you a little story. Back in 1997, I got on the internet for the first time using Windows 95 and JAWS. I think it was JAWS 3.2. It might’ve been JAWS 3.0. I can’t remember now, but you know what I did?
I went into Eudora with my Braille Lite and I typed an email, and it didn’t have any mistakes in it. I sent it, and it was amazing. But I can’t do that in 2021, not on the iPhone anyway, or the iPad. I think there are also issues on the Mac, but I haven’t run the Mac in a while.
Anyway, [chuckles] that’s my top request: fix what’s broken. I don’t even necessarily care if there are no new features, although we already know there will be, which is nice. But if Apple said, “You know what? VoiceOver is a mess right now, particularly in the area of Braille input. Let’s focus on that,” I would be perfectly happy with nothing other than those fixes.
Now that said, I’m all for innovation, I’m all for the development of new features even if they don’t necessarily benefit me like new voices, for example. That doesn’t help a Braille user, but I’m glad when they come up with these things because they do help a lot of people and they make the experience more enjoyable.
That’s my main thing. Fix it. You broke it, fix it. I could write an email in 1997 without a problem, why can’t I write one in 2021?
As for adding new stuff, the one I would like to see actually doesn’t have anything to do with VoiceOver or Braille or any of that directly. It has to do with MFi support, MFi of course meaning made-for-iPhone. There are a lot of devices that do this: a lot of hearing aid manufacturers, cochlear implant processor manufacturers, speakers, and the AirPods, of course.
Wouldn’t it be cool if you could specify that experience by device? For example, let’s say I have my hearing aids connected; this could apply to AirPods too, I imagine, although I’ve never been able to use them. It would be nice to be able to control the amount of silence that occurs before the connection times out. If you want it to be five seconds, fine. If you want it to be five minutes, that should be fine, too.
Yes, of course, if you have an idle Bluetooth connection that’s still streaming, it will consume the battery, but give the customer the choice. Apple went ahead and created this MFi standard, which is separate from Bluetooth, which is great, and I understand to some extent why that was done, so take advantage of it.
You’re no longer bound by the Bluetooth standard and its profiles, so don’t be. Take off with it, run with it, make it a unique experience for each individual. Those are the things I’d like to see in iOS 15. I’ll be very curious to find out what other people will be talking about, and I look forward to reading it in the transcript.
Jonathan: Well, that all gets a big thumbs up from me, Scott. I’m very fortunate that the MFi hearing aids I have are quite well-behaved. They wake up pretty quickly from hibernation, and the timeout isn’t overly aggressive. There’s a slight delay, but I can live with it.
When I tried some other MFi hearing aids, the experience was nowhere near as good, so it does vary a bit from manufacturer to manufacturer. There was one set of hearing aids I was trying where it was so bad that I actually created an MP3 file that was just an hour of silence. I would keep that in my iTunes library, my Apple Music library, whatever it’s being called this year, and play it just to keep the hearing aids awake.
I think the problem that we have is that so few people recognize deaf-blindness as a distinct impairment. You’ve got people working on made-for-iPhone hearing aids and you’ve got people working on VoiceOver. Do they understand the impact of what they’re doing on VoiceOver users who also use made-for-iPhone hearing aids, and how the two interact?
For example, I’ve got an iPad here gathering dust. I would probably have upgraded the iPad by now if it weren’t for the fact that I can’t easily hand over from one device to the other as a VoiceOver user. If I switch my iPhone off and wait for, say, five to ten seconds, which I think is a reasonable time, and then I tap the power button to take the iPad out of standby, I want VoiceOver to come through my MFi hearing aids.
It should be seamless. It should just work, the way Apple things are supposed to just work. I can’t make that happen, which means that my iPad has become a very nice slab of glass that I haven’t really used since I got MFi hearing aids two years ago.
I do also hope that MFi hearing aid support makes it to the Apple Watch. I think it was briefly there last year during the watchOS 7 cycle, and then they took it out, so it’s obviously something they’re cognizant of. It would be really good to get that working in a reliable way, but again, will it be okay for VoiceOver users? Will the handoff issues that plague the iPhone and iPad also plague the iPhone and the Apple Watch?
“Hi, Jonathan and fellow Mosen-at-Largers,” writes Diane Simms. “While I do love my iPhone and iPad, I do have two wishes for future major OS updates.
One, I’d love for the TTS voices to pronounce words properly and be able to figure out the context of what is being spoken. For example, ‘originals’ has an L in it, but my TTS voice says ‘originas’. I’m not sure if this is true for all of the English TTS voices, but the fix should be made for all of them.
This wasn’t a problem two years ago, and I feel we have put up with it long enough. It’s bad enough having to correct our PC screen readers’ pronunciations. I know that some English words do have dual pronunciations, ‘live’, ‘wind’, et cetera, but humans have a hell of a time with the English language these days, so I won’t get my hopes up too high.
Two, Siri is great but can be much better. I’d love to see it with skills like the Soup Drinker or Google. They are not perfect either but neither are the humans that program these wonderful gadgets. If Amazon and Google can do it, then I challenge Apple to step up to the plate and hit it out of the park.”
There you go. A baseball analogy or two to end that contribution from Diane. Thank you very much also for your positive comments about The Mosen Explosion and Mushroom Escape. Wonderful to have you listening.
I don’t know what has happened to pronunciation in iOS of late; it has got very weird, certainly with my TTS of choice, which is Compact Daniel. It’s doing some strange things, and I hear other people commenting on this as well.
I wonder what problem they were attempting to fix, because I didn’t hear anybody complaining about the pronunciation before it went weird.
Lielle Ben Simon is in Israel and says, “Hi, Jonathan, my wish list for iOS is better Braille–” with a lowercase b, “support, mostly in Hebrew, solve bugs Braille-related,” oh, he’s got an uppercase B this time so he’s redeemed himself, “and an easier way to move between Braille tables.” He says he would also like new Vocalizer voices, and he’d like to be able to use Acapela voices as well.
I’m heartened by how many people are bringing up Braille as a really important thing for Apple to fix. I hope that they get there this year in iOS 15.
Ad: Be the first to know what’s coming in the next episode of Mosen at Large. Opt in to the Mosen Media list and receive a brief email on what’s coming so you can get your contribution in ahead of the show.
You can stop receiving emails anytime. To join, send a blank email to email@example.com. That’s firstname.lastname@example.org. Stay in the know with Mosen at Large.
Shirley: Jonathan, this is Shirley. I’d like to make a comment regarding the individual who said that the U2 no longer works with Bard. That is not a correct statement, because I’m using it with Bard on a regular basis. You can press F3 and O, and that will get you into the NLS app that is on the device.
I will say that probably sometime about a year ago, I had to contact HIMS. They sent me some kind of file that I put on my unit to make the NLS app work again; there was some problem that occurred, and I could not use it for a while and couldn’t figure out why.
I was able to contact them, as I said, and install the file they sent me, which made it work again. I don’t know if that’s the problem this individual commented about, but I use it on a regular basis on the U2 Mini. I also have a U2, and they basically run the same program. I hope this individual can get that part of it figured out.
I do realize obviously that there are problems in terms of the internet browser and some other things of that nature.
Steve Bauer: Jonathan, this is Steve Bauer. This time you have the one from Culver City, California. Thank you, thank you for everything you do for us in this podcast every week. Always lots of great little tidbits, some of them not so little. Case in point: this Samson microphone that I’m talking into, because of your recommendation. It’s made my Zoom activity a whole lot more effective. I appreciate that.
I’ve just binged on three episodes and I’ve got one more to go, but I wanted to comment on a couple of things that were mentioned a couple of weeks ago about HIMS products by Rick and Dan. The BrailleSense U2 and U2 Mini did stop working with Bard, and hoping for a software upgrade, I decided to write to tech support.
It’s been a while since we’ve seen a BrailleSense software update. They’re still selling the device on their website, which amazes me, because I bought mine almost seven years ago, and it had been on sale for a couple of years before that. But tech support wrote back, and for the NLS Bard, no software update was required. They sent me a text file, which they told me to put into the database folder of the U2.
The file was called MLSserver and it was just one line of text.
I put that in the database folder, and voila, Bard started working immediately, no reset required.
Dan mentioned several things about the QBraille, which I can totally relate to. I’ve had mine for about 10 months. I saw the QBraille at a couple of different conventions, kind of fell in love with the concept, and finally bought one last summer.
The things I don’t like about it, I really only could have found out through using it, I guess. The short message display time is not a good thing at all, and the Alt-Tab behavior really bothers me, only being able to see two, sometimes three, things in your Alt-Tab list.
Perhaps the most outrageous one for me is the inability to reliably switch between contracted and computer Braille. Sometimes it takes one keystroke, sometimes two; it’s totally unreliable. That means I can’t use first-letter navigation, or even press something like H for headings.
Also, the Bluetooth performance with my Windows 10 machine is just abysmal, lots of lag. Frankly, my Apex works better with my Windows 10 machine than the QBraille does. Having said that, it works great with USB, and I do enjoy using Bluetooth with my iOS devices, I have not found any advantage to using a hybrid mode, I frankly just use the standard Braille mode.
I find it a little problematic switching between devices sometimes. Often it will do it in a second or two; sometimes it takes 20 or 30 seconds to switch. Sometimes it’ll connect to whichever of my iDevices happens to be closest to me.
Again, the two keystrokes to switch are somewhat unreliable. I find it a lot more reliable to press the pairing key with L to go into the pairing list, then arrow down to whichever one I want and press Enter. That seems to be a more reliable way to switch between devices.
One of the features I do like about the QBraille with iOS, I don’t believe is documented in the user guide. I only know about it because it was a feature of the U2, and because our awesome dealer here in LA, Sweetnam Systems, provides a Braille getting-started guide for the Braille products they sell. The terminal clipboard is mentioned there.
You can press I with space with Enter and then type up to 1,000 characters. You can also copy text from other files. Then when you press Enter, all that text gets dumped into whatever edit field you happen to be in on your iOS device. Given how squirrely VoiceOver with Braille input can be in certain situations, especially the Mail program, I find that feature very useful.
But QBraille has some weird keys that do strange things on iOS, and I can’t believe they’re by design, I don’t know if they’re by accident. One that I do find kind of useful is the page down key turns the volume up, and the page up key turns the volume down.
Also, the control key will cycle between the three Braille output modes, computer Braille, uncontracted, and contracted. The Windows key takes you to the spotlight, the Alt key will turn speech on and off. The insert key will take you up through pages one to two to three. There’s no way to go back down other than with the usual method. The Home key takes you to help, and the N key takes you to notifications. Very strange.
I keep hoping that much like the Polaris with the upgrade to the BrailleSense 6 that maybe HIMS might someday come out with a refresh for the QBraille, which would have maybe a new motherboard with a later version of Bluetooth. Probably not but I can dream.
Jonathan: Eden has been in touch regarding the QBraille and I just want to correct an error of fact in this. She says at the beginning that I’ve been impressed with the QBraille. I think she may be confusing the fact that I have read an email from a listener, who is impressed with the QBraille with my own thoughts.
My own thoughts which I have stated here before for what they’re worth is that I wouldn’t buy the QBraille because of where the space bar is placed. I’ve expressed that view for what it’s worth again, to HIMS salespeople. I think that to have the space bar above the Braille display so you have to kind of tuck your thumb under, is an ergonomic mistake. I would not buy the QBraille purely on that basis.
What I’m using is the Mantis. It’s one of those rare technologies where I like it so much, I’m already thinking with some fear about what happens if it ever dies, or if they stop manufacturing it, because I wouldn’t want to go back now to a Braille keyboard with a Braille display. I’m just so happy with this device.
Anyway, that said, here is Eden’s experience. I think she may have been reacting to Dan’s fairly positive thoughts on the QBraille, but she says, “Hi, Jonathan, I, like you, have been impressed with the QBraille. I’ve always loved HIMS products. The QBraille is no exception.
I had my QBraille for two and a half years with no issues until one dot would not go down. I sent it in for a repair thinking, wow, I will have one, maybe two cells to replace. I was correct. Only one cell was bad, but instead of being quoted the price for one cell, I was told for quality control, they no longer provided single-cell replacements.
Let’s just say I had a friend who could help me and I yelled and screamed and got my unit repaired, but I just wanted to warn your listeners this was a thing now.
My friend who interceded for me, who works at HIMS, said he hoped to work to change this nonsense policy since it’s obvious they can replace just a single cell. It is a big difference between $2,195 and the $200 I actually paid.
If this policy continues,” says Eden, “I will no longer purchase HIMS products.”
Thank you, Eden. But wait, there’s more. As the good infomercials say, the bad infomercials say it too, by the way, but Eden is back in touch on the email, and she says, “Since the email I last sent you, I’m happy to say HIMS has rescinded the nonsensical policy of having you pay for an entire Braille–” with a lowercase B, “display to replace one cell. Thankfully, someone else said HIMS saw that this was not good practice, so I’m happy to report my experience should not be anyone else’s. Sometimes,” she says, “I worry I am too vocal, but when I’m able to speak out about something that is unfair, I’m hoping I’m helping those who may not know how to advocate as well.”
Thank you, Eden. I’m glad to hear this. In a market such as ours, it’s important to be able to do things as cost-effectively as possible. I do understand the logic, having been involved in hardware manufacture. When you have a Braille cell that starts to go bad, it could potentially indicate a wider problem, but for this market, having to front up for the cost of an entire Braille display replacement is just beyond many people.
Jingle: Mosen At Large Podcast.
Jonathan: “Hi, Jonathan,” says this email, “as promised, your man-in-the-know, Daniel Semro, is back with a COVID-19 vaccine update. Today, May the 10th, I got my second dose. The doctor that administered the shot said that side effects were more likely than they were with the first dose. Other than pain at the injection site, nothing has really happened.
“As before, the doctor and staff were very polite and really made me feel like it was going to be okay. Again, I encourage everyone to please do your bit and get vaccinated. You’ll be glad you did. You’ll protect yourself as well as others.” Well said, Daniel. Well said. Get that jab today if you don’t have it already.
Douglas says, “Hi, Jonathan. It’s Douglas from Ontario, Canada. Love your podcasts. They’re awesome.” Thank you so much. “I was wondering, what’s a good mixer to purchase for my computer? I’m thinking of recording some podcasts, plus doing some recordings with music over my voice, voice-over dubbing. I also DJ some friends’ parties. I was wondering what type of mixer you would recommend.”
Douglas, thanks for your email. I’m not sure that you need a mixer for either of those situations. Certainly, there are many podcasters these days who just don’t need a mixer. It’s going to add complexity.
If you get familiar with Reaper, which is a wonderful accessible tool for which there are many blind people out there to give you advice and support, you can do all these things without a mixer. At the very basic level, you could start with a USB microphone that plugs into your computer, you can have a multiple-track environment so that music’s on one track and your voice is on the other, and you can apply equalization and compression and other effects all within Reaper. No mixer required.
If you want to go a little bit further, then you can buy an audio interface that has an XLR input for a professional quality microphone. Those microphones range a great deal in price. If you’re going to be doing remote interviews, hopefully over a good quality podcast recording tool like Cleanfeed or Remotely.fm, then you’ll want to get an audio interface that has loopback, particularly if you’re recording in Windows.
Mac users are a lot more spoiled for choice, but in Windows, an audio interface with loopback really is a good idea. Take a look at the MOTU, that’s M-O-T-U M4, you can also get the Focusrite line which are very good. Although you may need some assistance configuring them at the outset because their control panel is not accessible. You could use Aira to help you do that or Focusrite themselves are really good at helping you to get them configured.
For most people, once you’ve done it once, you can set it and forget it. All this is possible without any mixer being involved at all. Similarly, if you’re deejaying, there are lots of good DJ programs out there. If you’re in Windows and you want to do a really good job of deejaying, then I would recommend getting StationPlaylist Studio.
Now, if you’re using a screen reader with speech, then you will need one sound card for your speech to come through, and that could be the built-in speech that’s on your computer, and then you’ll need another audio interface or sound card of some kind for your music to come through because when people are dancing to your grooves, they do not want to hear your speech coming over alongside the music.
Again, a good audio interface that suits your use case is going to be far less complicated and far more useful than a mixer. Even if you’re going to have multiple guests in person on a podcast, you can get audio interfaces with multiple inputs. You can record each input to a separate track and then take care of the panning and the equalization in Reaper later.
I would recommend two things. First, get on the jolly old blind podcast creators email group, where there are lots of helpful people to give you advice. To get on that, you can send a blank email to email@example.com. That address again, firstname.lastname@example.org, and also follow the blind podmaker club on Clubhouse and ask any questions that you have in that blind podmaker group on Clubhouse that meets on a Sunday, North American Time.
Matt Miller writes, “Hello, Jonathan. I am listening to the May 8th show. Although I agree with your overall premise that disabilities are a socially engineered issue, I believe your analogy is comparing apples and oranges.
“I see being blind as a person with four senses working through a world set up for people who use five senses. In your story, you seem to be suggesting a one-eyed man, a person with five senses, would be at a disadvantage in a world constructed around people using four senses.
“I understand your speech was based off the fable, so you wanted to make a point about people’s limiting beliefs due to their lack of experience, but I just feel the one-eyed man would have needed to be blind and have lost another primary sense such as touch or hearing to be a valid comparison.
“I also believe that your work to get a positive message out there about blindness is 1,000 times more important than my previous point, but I also know you appreciated debating details and just wanted to throw in my two cents.”
Thank you for writing in, Matt. Obviously, as the person who wrote that, I disagree funnily enough, because again, we’re coming back to something we’ve discussed in a previous episode, which is that so many people perceive disability as always being a deficit.
Disability doesn’t have to be a deficit. It can just be that society is constructed in a way that disadvantages you. In this model, sure, this person has half a sense that the majority of the population does not have, but that creates a disability because the world isn’t constructed for that person.
The fact that they have one working eyeball means that they’re being bombarded with a lot of information through the visual cortex that is distracting and that the society that they are a part of really isn’t geared to deal with.
If you are being bombarded with a whole lot of visual stimuli it may, for example, affect your ability to learn Braille with an uppercase B, and so you may well have some literacy issues and on and on it goes. Disability in my model and in the social model of disability, which an increasing number of cultures and disabled people are moving to is not necessarily a deficit. It’s a social construct.
Charlie: Hey, Jonathan, hope that you’re good today. I was listening to your podcast recently about Twitter spaces and everything. Now, I don’t have an abundance of people who I chat to or have on Twitter. Actually, to tell the truth, I would love it if someone could actually teach me how to use Twitter.
I’m a bit unschooled in that department of actually using Twitter. It’s a widely-used thing, but for me, my Twitter, especially the home tab of my Twitter, has been cluttered with unnecessary people or accounts that I have followed in the past, thinking that maybe I’d gain something out of it.
Now, first of all, I want to unfollow those because really I don’t get anything from them. I really want to get their messages out of my home tab completely.
Then second of all, spaces. Let’s say you started a space. Can I see the notification if I follow you and have clicked on notifications on your profile, which I did? Can I join from that notification if I don’t have 600 or more followers?
Jonathan: Thanks for being in touch, Charlie. Yes, if you are following me and you’ve turned on notifications for my tweets, then when I start a space, you will be able to see that I have, there’ll be a link in the notification. You just double-tap that link and you will be in the space.
I haven’t done a lot with Twitter spaces. That’s mainly because of the pressure of work. Really, any spare time I have is taken up with putting this podcast together.
Regarding your second point about your Twitter timeline being cluttered up with tweets from people that you don’t want to see, if you have, as you say, unfollowed those people, then as you get new tweets from people that you do want to see, those old tweets from people that you’ve unfollowed will just scroll off and be replaced by more relevant content.
You don’t really delete old tweets from other people that you are no longer interested in seeing. They just scroll off over time.
Matthew: Hi, Jonathan, and listeners at home. This is Matthew Bullis in Phoenix. No relation to Mike Bullis in Maryland, by the way. A lot of people ask me that anyways, even though my brother is called Michael.
Anyhow, two subjects. One, what language do I use when I communicate with app developers to ask them about the actions rotor that we use? How do I tell them that that’s what I need them to put things in such as if you flip to the right, it gives other options when it really should be contained in the swipe down motion. What language do I use there? I don’t know how to explain that to an app developer so that they understand what I’m talking about.
Second, the Uber set-a-ride-in-advance scheduler seems to be broken. I book a ride in advance to go from home to work. When I do that, the normal ride is about $10 to get about four miles or so, but the scheduler says $30. I know that with surge pricing and ride times, it shouldn’t be that much different. Well, I decided to experiment and I booked one of those scheduled rides for Monday morning. $30 it was when the ride was over.
After numerous emails to support with form letter responses, it didn’t get resolved and I’m stuck with a $30 charge. I even tested this. If you do the same ride, say you want to leave right now, do the same ride and it comes up with the price that it should be, $9 or $10, or maybe $11 or $12 depending on where you’re going.
I decided to test it. I attempted to schedule a ride for next week, which was about a mile away from my dentist’s office to home. Again, $30.
I understand that they don’t want to raise the price. You’re guaranteed that price, but it should lower and adjust. It just seems broken and they don’t seem to want to engage in a proper conversation, just getting form responses. Does anybody else have this issue? Try it out. You don’t have to book the ride and you can cancel 60 minutes in advance. You can look at it before it happens, but it just seems the schedule is broken.
Jonathan: Thank you, Matthew. Regarding your first question about the iOS actions rotor. I would refer developers to the Apple documentation, where there is a section on Voiceover and they talk there about the rotor.
There are two options that are available to developers. One is in reasonably common usage. That is what you are talking about, which I would describe as using the actions rotor to add contextual elements when you are focused on an item.
That’s how I’d describe it to developers, but there’s another thing that they can also do. That is that they can add a completely new rotor element to the rotor. Rather than have these items appear on the actions rotor, you can actually rotate around and get a custom set of rotor options. That’s something I don’t see as often, but it’s all there for developers to access in the free Apple documentation on the Apple developer website. I would refer people to that.
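Since Matthew’s question was how to describe this to app developers, it may also help to point them at the concrete UIKit APIs involved. Here is a minimal sketch of both approaches; the class names, action names, and rotor name are illustrative examples, not taken from any real app:

```swift
import UIKit

// Approach 1: contextual items on the actions rotor.
// VoiceOver users flick down on the focused element to choose one.
// "MessageCell", "Reply", and "Delete" are hypothetical examples.
class MessageCell: UITableViewCell {
    func configureAccessibility() {
        accessibilityCustomActions = [
            UIAccessibilityCustomAction(name: "Reply") { _ in
                // Perform the reply here; return true on success.
                return true
            },
            UIAccessibilityCustomAction(name: "Delete") { _ in
                return true
            }
        ]
    }
}

// Approach 2: a completely new element on the rotor itself.
// Turning the rotor reveals a custom category, alongside built-ins
// like Headings or Links.
class ConversationViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let unreadRotor = UIAccessibilityCustomRotor(name: "Unread messages") { predicate in
            // Inspect predicate.searchDirection and return the next or
            // previous matching element, or nil when there are no more.
            return nil
        }
        accessibilityCustomRotors = [unreadRotor]
    }
}
```

Apple’s accessibility documentation on the developer site covers both `accessibilityCustomActions` and `UIAccessibilityCustomRotor` in more detail.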
Regarding your Uber question, this is interesting, isn’t it? When I schedule an Uber, and I actually scheduled two Ubers this week, what happens for me is that I get a price range, and it’s quite a wide range. I was taking a fairly lengthy trip from Mosen Towers to the airport. When I scheduled it, it said that the fare would be between $33 and, I think, $45.
Then when I got my Uber, the fare was within that range. It’s interesting that it sounds like you are getting a specific price quoted, and that no matter what the level of surge there might be, you’re being charged that price.
I don’t know why it’s different for you, but if you want to experience frustration, one way to do it is to contact Uber support. Wow, it is really difficult sometimes to have any sort of meaningful dialogue with them. I am really fortunate in that I use Uber so much with the combination of Uber Eats and the Uber Rideshare service that I have Uber diamond reward status. When you get diamond, I think it’s actually available for platinum users as well, you get a phone number. Imagine it.
You can call a real human being and talk to somebody from Uber. I find that very helpful. If I give those people a call, then I do get a resolution quite easily and regularly, but it’s not available to everyone.
I agree with Matthew. Share your experiences with us if you like. Give us a call, 86460Mosen in the United States, or drop me an email with an audio attachment like Matthew did, or just write something down. Send it in to email@example.com.
Ad: For all things, Mosen At Large, check out the website where you can listen to episodes online, subscribe using your favorite podcast app and contact the show. Just point your browser to podcast.mosen.org. That’s podcast.M-O-S-E-N.org.
Jonathan: That familiar music indicates that we are in fact back in the studio. It’s time once again for another superlative Bonnie Bulletin with Bonnie Mosen.
Bonnie: Hey guys.
Jonathan: I’ve learned the hard way that if I don’t introduce you, or for that matter, anybody else, then the very nice people who transcribe this podcast don’t know who you are.
Bonnie: That’s true.
Jonathan: They’re going to call you a generic name, like Speaker 1. What speaker number would you like?
Jonathan: One [laughs].
Bonnie: I’d like number one.
Jonathan: We have had such an interesting week and we’ve got some listener contributions for you. As we like to say, we will crack on.
Bonnie: Yes, absolutely.
Jonathan: The first thing is the great washing machine crisis of 2021.
Bonnie: I was so much in fear last year. Our washing machine is quite old.
Jonathan: I don’t know why you’re fearful of it. You just get another one.
Bonnie: No, I’m telling the backstory with this washer. The washing machine’s been around a while. It did have some issues a couple of years ago. We had a repairman come out and look at it. He said, well, the next time it goes, it has an old motor in it, you might as well replace it, because it would cost as much to get it repaired. I was so scared during the whole COVID lockdown that the washer was [crosstalk].
Jonathan: COVID. You’re talking like your speech [crosstalk].
Bonnie: I know, COVID lockdown. That reminds me of a story someone was telling today, where they were ordering a vegan taco salad, and there was some protein that they could put in it that wasn’t tofu. It’s something called seitan, S-E-I-T-A-N, but VoiceOver said Satan.
Jonathan: I presume they asked for it. They asked for Satan in there.
Bonnie: Yes, they had vegan Satan tacos. Anyway, I was worried that that was going to happen. Well, Thursday, I was actually home and decided to run a load of laundry, and it wouldn’t start. We had Heidi and Henry come over to look. It turned out that the motor was dead, or the belt was not spinning the tub thing around, the agitator I think it’s called. We got online and looked for a new washing machine.
Jonathan: I’m a good agitator though.
Bonnie: Yes, you are. So we bought one that–
Jonathan: But hang on, because then there’s the whole Luddite versus progressive discussion, because I always think as blind people, we should not be saddled with inaccessible stuff. These washing machine appliances and so many other things, not only are they touchscreen-based, which creates some real input challenges, but they’ve got these menus of choices, most of which blind people can’t use, because you either commit them to memory, or you have a cheat sheet on a Braille device or something like that, or you don’t use them at all. If you’ve got an app or something that controls the device by Soup Drinker or something, then you’re good, but you’re a bit of a Luddite in that department sometimes.
Bonnie: Because I’m the one that mainly does the washing.
Bonnie: Although that may have changed, because I’m not sure yet. You said you’ve never seen anyone so excited about a washing machine.
Jonathan: You’re frightened of the unknown. Is that a fair description?
Bonnie: Yes, I guess so. I just wanted it to work. I just want to get my washing done without too much fuss.
Jonathan: It’s not going to work for us if we’ve got these inaccessible screens- [crosstalk]
Bonnie: More and more you’re seeing whiteware that’s not accessible. There are these convection stovetops that are completely inaccessible.
Bonnie: Gone are the days of buttons and knobs?
Jonathan: Yes, long gone. We’ve ended up with the Samsung washing machine. What’s the model number of the Samsung washing machine?
Bonnie: I don’t know, I emailed that to you, but it’s called a quick drive.
Jonathan: I will look it up. Oh, so it’s a QuickDrive? All right. [crosstalk] Smart drive, you told me- [crosstalk]
Bonnie: No, quick drive.
Jonathan: Quick drive. Okay. I will look up the number in just a moment. While I do that, would you like to talk about what having this washing machine in your life means to you?
Bonnie: We were going into the unknown, because Heidi learned from you. You did a good job passing down the tech gene to Heidi, because she did do her research: she went on the New Zealand consumer site and looked at YouTube reviews, so we knew what we were getting. I was desperate because I was nervous and we had laundry to do. We ordered it. Thankfully, Henry and Heidi were good enough to go out and take our old 4,000-pound machine to the Trash Palace, which is a place where electronic things go to die, but people buy them because they like vintage things, and they like redoing them and that sort of thing.
They were able to take our old washing machine out there, pick up the new one, bring it home and set it up. It was pretty simple to set up. The hardest thing was that the hole where the hose went wasn’t big enough, so Henry had to run back to his house to get a drill, which was really good he could do, because if we had had it delivered, it would have been, “Oh, we’ll just have to get a plumber in.” We got it set up; it was very easy. Then, I was in the other room while they were setting it up, and Heidi started screaming.
Bonnie: It was like, “Did it attack her or something?” “It has Braille,” and I’m like, “What?” I went in there, and it does have a P for Power and an S for Start. They do have little buttons you push that go into the menus, but you can feel them all. We set it up, we paired it with the Wi-Fi and got the app. It can be a bit quirky, but once you get to know it, it is very accessible; you can watch what the laundry is doing. I’ve done two loads today. We did one load last night and two loads today. You get notified when it’s done. The washing machine sings jingles and-
Jonathan: [chuckles] Like it’s the wonderful Mosen Explosion!
Bonnie: Yes, pretty much.
Bonnie: The thing I don’t like about it, but I think I’ll get used to it, it’s very sensitive. You can get out of it, but if you touch it, it’s like, “Oh, no, what did I hit?”
Jonathan: Yes. I was about to try and find a way to make that comment that while it’s really great that Samsung have done this and put tactile markings on the touch elements and Braille on the Power and the Start, we’ve also put our own label on the Smart Control button. You’ve got to be so careful because it’s attached into the panel basically, so I’m not sure how much of use that actually is because if you just run your hand over the– [crosstalk]
Bonnie: Well, you have to hit the Okay button right now, because it’s so smart that it realizes it doesn’t have fabric softener in it. It has two tanks that you can fill with detergent and fabric softener, and it knows how much to use depending on the load weight. Right now we don’t have any fabric softener, so it keeps saying, “Fabric softener low,” and it won’t do anything until you acknowledge that the fabric softener is low. Hopefully, that will stop tomorrow.
I haven’t seen it actually say that in the app, which is a bit concerning, so I may just add the detergent and fabric softener myself when I do a load, just so it’s not saying that my clothes aren’t getting clean because there’s no detergent in the thing. It’s really cool. Cotton is its default because that’s the most washed material. There are also some energy-saving things, and it’ll sanitize your towels. It still has that new technology smell, so when I take the clothes out, they do smell like new technology.
Bonnie: Yes, it’s really a cool machine and it’s so quiet. You go in there and you can’t tell it’s actually doing anything. We didn’t have it level last night for a bit, and we were sitting in the lounge and we heard a knocking. Henry goes to the door, no one is out there- [crosstalk]
Jonathan: -Yes, Henry actually opened the door.
Bonnie: No one’s there. It was the washing machine moving around. When we set it up, it calibrates itself, which is cool, so it goes through all these weird noises and jiggles around, and it’s cool.
Jonathan: Yes. The way the user interface works with this is that you have two modes, essentially. One is the Smart Control mode. That allows you to control the washing machine completely from the app and remotely. What you have to do there, because they obviously don’t want the washing machine running without anything in it, is power it on. You can then load the clothes, although it makes sense I think to load the machine- [crosstalk]
Bonnie: Yes. That’s what I do, yes.
Jonathan: Then, you can press the Smart Control button, and you can go into the app and then you go into the Smart Control mode in the app. The app, by the way, because this is a Samsung device is the Samsung SmartThings app. I wouldn’t describe this as a perfect user experience in iOS, but it’s doable. It’s-
Bonnie: It’s very doable.
Jonathan: -quirky. It’s better with the washing machine actually than it is with a TV. We also own a Samsung Smart TV, but we mainly control the Smart TV with our Soup Drinker devices.
Bonnie: Yes, which you can do with the washing machine. It’s not perfect, but it will communicate without it.
Jonathan: You can start it and you can find out what’s going on-
Bonnie: What’s going on. [crosstalk]
Jonathan: -on the cycle- [crosstalk]
Bonnie: Although it doesn’t tell you what’s going on in the cycle, because I waited till I knew that it was rinsing, and I asked it and it just said, “Washing.”
Jonathan: Okay. It’s pretty basic.
Bonnie: Pretty basic, yes. I started it from the app, which is really cool because it goes ready, or it says ready on the screen and then you hit- [crosstalk]
Jonathan: When you start the Smart Control mode, you can then go ahead and change all the settings. It checks-
Bonnie: It locks the door, and you can hear it lock the door.
Jonathan: You double-tap the button that tells you the kind of load that you’re washing, and it’s defaulting to cotton. When you do that, further down the screen, and you’ve got to be a bit intuitive about this because it doesn’t immediately change the screen, when you double-tap, say, cotton, which is our default, at least at the moment, further down you’ve got a really long list of presets. This is what Heidi was scrolling through initially before we got to the Smart option. You can look at all these different types of loads of clothing that you can choose from; you double-tap the one you want, you press OK, and then you can press Start.
Now, the other way to use it is not to enable the Smart mode. As Bonnie says, when you enable the Smart mode, it locks the door, and that’s a safety thing, because they figure you could be anywhere controlling your washing machine, and they don’t want some random person, like a child, for example, who might be– Although, gosh, I hope they’re not at home unsupervised when they’re washing this. I guess this is a safety thing.
Bonnie: They get up in the middle of the night or something and start messing with it.
Jonathan: Yes. They lock the door, and then you have to turn the Smart Control off before the door will unlock. The other mode is that you can make all the same choices. When you’re ready to go, you choose Send to washer. At that point, it sends a series of commands to the washing machine, and then you can press Start on the washing machine itself. When you do it in that mode, you still get status information, don’t you? You’re not really losing too much by doing that.
Bonnie: It tells you how long it’s going to be.
Jonathan: Yes. What it’s doing like washing, rinsing, spinning.
Bonnie: It has an airlock too, because you always sometimes– You can’t open it because it’s a front loader. If you did, all the water would come pouring out, like it can on a top loader. If you need to put that extra sock in while it’s filling, you can open this airlock and toss it in.
Jonathan: A moment ago I did look it up here on the e-receipt, and I’ve got it in front of me. It just says, “8.5 kg Quick Drive Smart Front Load Washer.” That’s a Samsung device. We might see if it’s written down, or we can get Aira to tell us the model number, but that will give you an idea of what’s possible. I know that there are others playing in this space too with accessible apps for washing machines.
Bonnie: Unfortunately, right now with the supply chain backed up because of shipping, we don’t have a lot of them actually in the country. We were really lucky to have found that one in Wellington that we could get.
Jonathan: One thing too is that it’s whetted my appetite for what else we can get from Samsung that would be accessible in this–
Bonnie: No dryers in the country, no microwaves either.
Jonathan: We would be interested to hear from other people. Obviously, there are some products available in some markets that are not available in ours. Samsung’s ubiquitous, but I think there are some other washers that also have accessible options now, in the US market at least. You’re welcome to share your experiences. On the subject of the horses, which we talked about last week, quite a bit of reaction. We’ll go through some of it.
Jonathan: Here is Peter, the man himself from Hungary. He says, “Thanks for the comprehensive coverage of the topic raised in my question. Most importantly, it feels better now. I heard some information from Bonnie and her friend about the horse racing industry, which I wasn’t aware of. I’m glad to hear that the use of the whip is limited, but ultimately, I’m not convinced that horse racing is a reasonable treatment of animals. Only 10 strikes is certainly better than say 100, but still, for me, beating an animal is the very, very, very last resort to turn to. It is only acceptable for me when there is absolutely no other option and it is inevitably needed. Racing is leisure for humans, an industry, an entertainment business.
Human and animal life is perfectly maintainable without whipping horses to run this or that way on a track. If somebody states that whipping doesn’t bring intolerable pain to the animal, I wonder what that person would say if I offered him or her 10 strikes a day. I’m also doubtful about the statement that horses like running and racing. They certainly like running when they are free at a time when they wish to and without a Homosapien on their back. We humans usually cheat ourselves into believing that animals have human-like preferences.
We are ready to think that they like something, that they are happy about a thing or an activity. It feels good for us, but who can see into the brain of an animal? Finally, yes, you guessed it right, I have no problem with guide dogs. They provide us a service that no other machine or creature can, and these dogs are loved, teased, and cared about most of the time by their owners. The other thing: a couple of months ago, I had a question concerning RSS for Windows. Some of your listeners gave a few suggestions, but I didn’t find these programs too likable. In the end, I realized that the solution was right in front of my eyes all along. Email clients are perfect for handling RSS feeds.
I have used Windows Live Mail for ages and curiously had never thought about exploiting its capabilities to read RSS feeds. In reality, it is an amazingly easy-to-use method of doing so. If somebody shivers at using such ancient software, Microsoft Outlook is also a perfect application for the same purpose. I tested it and experienced smooth working.” Thanks, Peter. I should have mentioned that Microsoft email clients do RSS. It’s not the way I would like to do RSS, but then I’m a very heavy user on my smartphone. There you go. Any further comments, Bonnie, on Peter’s comment?
Bonnie: Again, they’re not beating the horses. Yes, if they were taking a big whip and beating the horses that would be different, but they’re actually not beating them. It’s just a tap. They like running. They love running. They’re bred to run, it’s like dogs, certain dogs that are bred to hunt. It’s a job for them and they really enjoy it. I have had off-track thoroughbreds and I remember being on a show one time where we all had to line up to go into the ring and the horse I was on, he was 17 years old.
He had been a racehorse. He remembered that, or I guess he had some memory that this was going to the post, and he was just dancing all over the place, so excited. I’m like, “No, buddy, we’re not running in a race.”
Keith Wigglesworth: Hello, Jonathan. This is Keith Wigglesworth in Baltimore, Maryland, and the cicadas are singing like crazy out here. We have more cicadas per inch than anybody on the planet this week, I think. Long story. Anyway, it’s busy, busy out here. I just wanted to give a buzz about an article I read after hearing Bonnie and her friend’s recollections about horse racing and all that, which I didn’t fully listen to, but it was interesting, what I got. I found a very, very good, thorough article about horse racing, the current state of it anyway, in the latest New Yorker Magazine on audio, and this is dated May 24th.
It’s about the state of horse racing in the United States. It’s called Blood on the Tracks. It’s a very thorough, very interesting piece; it takes about an hour to read if you listen at regular speed, but it is good and very thorough about what’s going on with all that. That might add to some of their thoughts on– So I thought you’d like to let Bonnie know that that article, called Blood on the Tracks, is available in the current New Yorker Magazine, on audio, on BARD.
Jonathan: Well, you have to read that one Bonnie. It doesn’t sound like it could be very favorable by the title.
Bonnie: No, I’ll have to check it out. I can get the New Yorker from BARD, but I can also get it on NFB-NEWSLINE. I’ll have to check it out.
Jonathan: Here is Yvonne Peters. She says, “Hello, Jonathan and Bonnie. Wow. Two of my favorite topics discussed in one podcast, horse racing and everything blind. How cool. Many thanks to Bonnie and Lisa for an excellent description of the sport of horse racing, and how the horse athletes are valued and cared for. After I retired as a human rights lawyer, I joined a group which purchases low-cost racehorses to race at Assini–
Jonathan: Okay, then. Was that in–
Bonnie: It’s in Manitoba.
Jonathan: Okay. Fair enough. That makes sense.
Bonnie: Assiniboia Downs.
Jonathan: Downs, yes, in Winnipeg, Manitoba. What’s the name of that again? Assiniboia. Okay. There’s one for the transcribers to look at. [chuckles]
Bonnie: It’s an indigenous group.
Jonathan: “I have met many fine new people and learned a lot about the sport. I too was very concerned about the treatment of the horses, and just how willing they really were to participate in this human-made sport. I have to say that the horses we’ve purchased get very excited when they know they are going to the track. They do love to race. Regarding whips, here in MB, that’s Manitoba, there are strict rules about how they are constructed. They are hollow and are not meant to inflict pain, just a reminder to focus. In a way, it is like the leash correction that most guide dog handlers are taught to use when the dog becomes distracted.” I take it you would agree with all of that though?
Bonnie: Oh, absolutely. Yes, it’s more than noise.
Jonathan: Right. Now, here’s one that I suspect is going to get people talking. On a completely different topic, says Yvonne, “You may recall back in 2017, Canada was contemplating national standards for guide dog and service dog handlers that would impose additional requirements on guide dog teams. Guide dog handlers mounted a vigorous protest against such standards because we already have comprehensive and effective standards established by the International Guide Dog Federation. Moreover, there wasn’t and still isn’t any concrete evidence that guide dog teams required additional standards. Happily, these standards were eventually defeated and withdrawn.”
“You kindly gave this some attention in your podcast back in 2017. I thought you might like to know that regrettably, this dreadful issue has reemerged in 2021. The Canadian Foundation for Animal-Assisted Support Services, a nonprofit community group, has served notice with the Standards Council of Canada that it intends to develop standards for all animals that perform a task or activity for humans, including guide dogs and their handlers. What makes this issue even more reprehensible is that at no time did the CFAS consult or make any attempt to reach out to guide dog handlers about their intentions.”
“If additional standards are ever adopted, it would mean that guide dog handlers would have to undergo additional assessments to ensure compliance. We do not need additional standards, and we certainly don’t need unqualified people telling us what we need, especially without consulting us. Once again, the national coalition of people who use guide dogs and service dogs is fighting back. We have defeated such standards in the past and we will do so again. If people would like more information, they can send an email to firstname.lastname@example.org, that’s email@example.com. Incidentally,” says Yvonne, “the H-O-O-H stands for Hands Off Our Harnesses.”
Thank you, Yvonne and that certainly brings back fond memories of the late great Tom Dekker, who was so instrumental in that campaign in 2017, and we did cover that extensively. Any thoughts on this, Bonnie?
Bonnie: I had heard that it had reared its ugly head again, but I wasn’t really sure what the backstory was behind it, so thank you, Yvonne, for giving more information on it. I think someone posted on the Guide Dog Handlers Facebook page that they were going to have some meeting about it, or a Zoom call.
Jonathan: Does it bother you?
Bonnie: Yes. We have the International Guide Dog Federation. We don’t need people who, as Yvonne said, that are not qualified to assess a guide dog. They’re very, very different from other types of service animals, and to not even consult with the users is very, very patronizing.
Jonathan: There are variations, aren’t there? In New Zealand, the law says that you can only call a guide dog a guide dog if it’s been certified by the Blind and Low Vision NZ Organization. You can train your own guide dog, but it has to cut the mustard. It has to meet a certain standard. Of course, that organization is affiliated with the International Guide Dog Federation. There are some people who have a problem with that.
I personally don’t because I do get quite concerned about anybody calling anything else a service animal. We saw how it just got ridiculous in the United States and eventually, the pendulum swung the other way. To me, that’s sufficient. If you’ve got a guide dog school accrediting the quality of these dogs, and they are affiliated with the International Federation, that’s a happy medium, isn’t it?
Bonnie: I think so. There are people who are very much against coming into the country with a dog that’s not New Zealand-trained. I choose to get my dogs from The Seeing Eye, and I will always choose to get my dogs from The Seeing Eye. Not because I have anything against the guide dog school here, but it’s the school that I’ve been with for many, many years and I’m comfortable with it.
Every time I get a new dog, I have to be certified by having an instructor come out and work with us, but on the flip side of that, there is an agreement amongst the schools that if I need follow-up services, if I need orientation to something, the school here would help me because obviously, The Seeing Eye, I’m sure the trainers would love to come to New Zealand, but they’re not going to be able to provide any kind of followup.
It is a reciprocal agreement and I personally have no problem with it. Some people think it’s kind of Big Brother breathing down your neck, but the way I look at it, going back to the horses, you have to have a license to train racehorses, you have to have a license to be a jockey, you have to have a license to be an owner. Just because I pick up Fluffy down at the shelter doesn’t mean I know what I’m doing when it comes to training a guide dog. Yes, there are people who have successfully trained their own dog and been safe, but there are other people who aren’t.
Jonathan: I agree with you. I’m not sure that this was the right solution, but I do think there has been a problem. Blind people advocated very hard for these rights, and it is now a right, as it should be, to take guide dogs into places where dogs usually can’t go, and to make sure that it’s actually illegal to refuse. We’ve talked about this extensively, of course, on this show, but with rights come responsibilities.
I think what’s happened is that many others who are not in the guide dog area but have other requirements that may well be legitimately met by service animals have short-circuited this. They have done us all an extreme disservice by putting all kinds of crazy animals in situations where crazy animals should not be, and we’re all suffering for it.
Bonnie: We are suffering.
Jonathan: I understand where this might be coming from, but the fact is the ecosystem of guide dog schools has already taken care of it. This is, I think, an example of where imposing a pan-disability solution on everybody is not always the correct answer. I wish them well with this. If people think, “Well, this is just Canada. They’ll do what they will,” I tell you what happens: you do get people who watch what happens in other jurisdictions. If one jurisdiction gets away with it, it’s coming for you.
Bonnie: Absolutely. We’ve seen this in the States, and the main problem is policing. With rights and responsibilities, a business has the right to kick an animal out if it’s misbehaving, but they don’t do it. Now in New Zealand, we’re even seeing some people who are passing animals off as emotional support animals and things like that. These are very ill-behaved dogs, or ill-behaved animals, because they haven’t been socialized.
Guide dogs and other legitimate service animals have been raised and trained, and the handlers have been taught how to continue that training. That’s where the problem has come in. That’s why the airlines have really cracked down on it. Yes, it is a pain, but I’m glad to see it.
Jonathan: We’ll be interested in other people’s opinions, if you’re in Canada, if you’re working on this. Of course, Yvonne, we were delighted to devote some time to this in 2017 when The Blind Side was the podcast, and we’ll happily do it again. Please keep us informed. Thank you, sweetie, for a very-
Bonnie: Thank you.
Jonathan: -good Bonnie bulletin. I know that this is a brave new world, and we had these big debates when Sonos came into our lives. Eventually you kind of come around, right?
Bonnie: You do. Yes.
Jonathan: [laughs] No, I’m saying you personally.
Bonnie: I do. Yes.
Jonathan: You personally come around eventually.
Bonnie: I do. Yes.
Jonathan: Now you like the app.
Bonnie: I do.
Jonathan: You like the Sonos, even the one in the bathroom!
Jonathan: [chuckles] Okay.
Bonnie: All right. Bye.
Speaker 1: Mosen At Large Podcast.
Jonathan: Another email from Sydney. It’s Dawn Davis, who says, “Hi, Jonathan. On the subject of public education, I feel strongly about the lack of education of doctors in hospitals. I often find that even though doctors are well-intentioned, they can be thoughtless and inappropriate, probably through no fault of their own. For example, I found myself in hospital emergency one day after being given the wrong dose of a medication by a chemist. Luckily, I had a friend with me, as I really was in no state to speak for myself. I was quite disorientated and feeling very vulnerable.
When I was seen by the doctor, the first thing he did was to ask me a list of standard questions, such as ‘touch your nose’, ‘touch your finger’, ‘look at the light’, ‘walk in a straight line’. By this time, my friend and I were becoming frustrated, to the point where I said, ‘Don’t you know what blind means?’ My friend said, ‘This lady is blind, as you know. She’s in a place she does not know, and she is ill.’ At this, the doctor became very apologetic and didn’t seem to know what to say. I was kept in overnight for observation. I’ve had a number of similar situations.
I find, however, that the nurses in hospitals are almost always understanding and have treated me very appropriately. I feel that there should be much more education in the medical profession as it relates to people who are at their most vulnerable. I do not mean this to sound like a generalization. I know there are many doctors and other people in the medical profession who treat patients who have disabilities, no matter what they are, extremely well.”
Thank you, Dawn. I agree. I’m pleased to say I haven’t been in a hospital for about 20 years now. I hope that run continues for a long time. The last time I went in was when I had some ear issues. I was lying in bed and they put these great big rails around my hospital bed. It was really humiliating and quite crazy. I have had various situations over the years, even when I’ve gone to see a GP or something like that, where I’ve come to them with an issue that doesn’t relate to blindness in any way, but they’re obsessed with the blindness thing and they want to know what causes it and all sorts of medical things that are totally irrelevant, and people can be very patronizing.
I agree. It is a concern, because when you go and visit these people you are often very vulnerable, and you shouldn’t have to be an advocate for yourself in those situations just about disability issues. There is, I think, a moral obligation, a requirement, for the medical profession to be much more disability-responsive than it is. It would be good to hear from other listeners on the subject and the experiences you’ve had in the medical system pertaining to feeling like you’re being patronized, not taken seriously, treated with less dignity because you’re a blind person.
If you want to comment on this, drop me an email at jonathan@mushroomfm.com. Write it down or attach audio. You can also call 86460 Mosen in the United States, 864-606-6736. On the 12th of May, Debbie Armstrong wrote in to say, “This week, APH is holding a coding symposium: day-long seminars from Tuesday through Friday. It’s aimed at students and teachers of blind and visually impaired students who want to consider coding as a career. APH will be releasing resources on their website. Every speaker is being recorded, and the entire symposium will be available in a few weeks on their YouTube channel.
The speakers are really fantastic, talking about how to get trained for this career and how to break into the job market. The themes that keep coming up repeatedly are learning how to problem-solve and think out of the box, not giving up, because one inaccessible disappointment often opens other doors, and the idea that having an internship under your belt, or a personal portfolio project you can showcase, will make you stand out from sighted job seekers.” Thank you, Debbie, and I hope that people check that out on the APH YouTube channel. Perhaps it’s already there by now. Have a look at the symposium information if coding interests you as a career.
Anil is writing in and says, “Today, I found a club managed by our opposition party.” Anil is in India, by the way, if that gives us some context. “The chief guest, who was a former chief minister, is answering people’s questions with wrong information to spread propaganda against the government. Hence, I oppose political parties or their people being in Clubhouse. What do you and the Mosen At Large audience think about this?” Thanks for getting in touch, Anil, and I hope that you and your family are doing well in India, where I realize that COVID-19 has really been rampant of late. We’re thinking of all of our Mosen At Large listeners in India.
My view on this is that Clubhouse is a great way for politicians to be held to account. If your view is that they are spreading misinformation, then hopefully there’ll be a chance for question and answer on Clubhouse, because Clubhouse is quite a democratizing platform. I, for one, would warmly welcome it if some of our politicians here in New Zealand got on Clubhouse and opened it up for Q&A, and we could all ask questions. I think that is a wonderful way to be in touch with the people. Rather than oppose it, I’m totally enthusiastic about the idea.
In sunny and no doubt lovely Virginia Beach, Beth is writing to draw my attention to an article that I also tweeted, actually, because I saw this a few days before Beth sent it in, saying that Bose is getting into the hearing aid business. Their first hearing aids are suitable for people with mild to moderate hearing loss. They have been FDA approved and they cost $850. Apparently, they’re pretty basic. It doesn’t look like they have Made for iPhone certification or anything like that, but they are pretty low-cost.
They are designed to be fitted without the help of an audiologist. I think it’s probably a trend that you will see. It wouldn’t surprise me if eventually all the knowledge that Apple has acquired may see them entering the hearing aid space. I think, given the attention that Apple pays to sound and to Made for iPhone, Apple hearing aids could be really disruptive in a good way. If you have mild to moderate hearing loss and you try these Bose hearing aids, let us know how they work out, or don’t work out, for you.
Daniel Ozuna has been thinking about this as well and said, “I recently read an article on Bose’s OTC hearing aids. They sound a lot like the Bose Hearphones. The Hearphones are the sound amplification device Bose has been selling for a few years now. They used to cost around $500 at Best Buy. I have not seen the price on them recently.”
Jonathan: Some useful info here from Don Barrett who says, “Hi all. Just a note to let everyone know how easy it is to use an automated response phone system with the iPhone using a Bluetooth keyboard. I don’t hear this talked about much at all. In fact, when I called Apple accessibility to ask them if this could be done, they looked through their material and told me that they doubted it and they had nothing written about it in their support materials.
This all came up because I had a customer who had an emergency situation that required her to check voicemails on her home system and also pay bills using only her just-purchased iPhone, without any knowledge of the touch screen. My initial response was to sing the praises of Voice Control and how that could most likely help her in completing her needed tasks. Boy, was I wrong! When I tried it myself on my credit union’s automated system, I was told that Voice Control was no longer listening as soon as I made the call.
I understand that Apple would disable Voice Control so that the microphone could be used for the phone conversation, but you would think that there would be either a manual or an automatic setting to tell the phone to leave Voice Control on since I am making an automated call, thus allowing me to say, tap 1, tap 5, tap #, et cetera, thus using Voice Control in this kind of situation.
That was not to be. That’s when I called Apple about using a keyboard, and their blasé, non-committal response was of no help either. ‘Okay,’ I thought, ‘Time to buy a Magic Keyboard and see if it is really magical.’ I did, and it was. I am thrilled to pieces, as I now have a very happy, very productive customer. Here are the steps I use, for anyone who is interested in this process.
One, make sure your keyboard is paired with your phone, and begin by turning off speech with VO. You want to do this because VoiceOver will keep reading the whole string of numbers as you keep entering new numeric entries. That is distracting, annoying, and unnecessary, especially when you are trying to hear the automated responses with which you are interacting.
Two, from the home screen or phone app, dial the company with whom you want to interact and begin listening. Three, as numeric responses are requested, simply enter them on the top row of alphanumeric keys, using Shift-3 for #, the number sign, and Shift-8 for the * key; no need to hit ‘enter’ at all, since numbers are entered automatically as you type them. Four, when you are done with the call, simply end the call using the VO-key combination, and the phone will hang up, as that keystroke emulates the two-finger double-tap. Five, don’t forget to turn the speech on again with VO.
An advantage of using this method is that with proficiency, you can easily enter numeric responses without timing out, which has unfortunately happened to me when using the phone’s touch screen to enter numbers into these systems. These types of systems often deploy a timing mechanism, which makes one provide entries in a relatively hurried fashion, which can be tough for many folks using just the phone’s touch screen.
Of course, no discussion of this type would be complete without a discussion of how this might work using a keyboard such as is found on the Orbit Writer, Mantis, Focus, et cetera.” He says, “(Notice how deftly and unobtrusively I have avoided the capitalization controversy.) I noticed two weird things about using the Orbit Writer keyboard to enter numbers to be dialed. First, it took a couple of seconds after typing each number to hear the tone of the dialed number being echoed from the phone’s speaker. Thus, if you typed the numbers quickly, you never heard the tones being generated, whereas I think they would probably always be generated using a QWERTY keyboard, due to the speediness of its data entry.
Second, after entering the UEB number sign, dot three, four, five, and six, and typing the numbers 153 using the alphabetic letters, A, E, and C as the UEB method, I heard the letters A, E, and C spoken instead of the numbers, even though the numbers were correctly entered into the response system as they resulted in the appropriate spoken prompts. This will, of course, require additional experimentation using similar devices. I hope many find this useful, and I also hope Apple allows us to use voice control on one of these calls, which would be a really cool way to do it. Thanks for listening.”
Well, thank you for writing that up, Don. I do remember when this was introduced, and we wrote it up in iOS Something Without the i, and I was really excited about it myself, because sometimes you just want to enter a bunch of numbers, and it’s easy to do that with a Bluetooth keyboard.
Obviously, this is an advantage of the Mantis, because it’s just a standard QWERTY keyboard and it will work the same way as the Magic Keyboard. If you are using a Braille display with Perkins keyboard entry, then you may like to switch to computer Braille. I wonder if that resolves the issues. It’s just one step at the beginning of the process to switch to computer Braille input, and you can switch it back again after the call, but it may well simplify things, because I suspect what’s going on is that you have to wait for the automatic translation feature to time out before you hear the tone.
Ross: Hi, Jonathan, and all the Mosenites across the world. I have a question out there for folks, but I want it to start with a quick anecdote. I’m using my iPhone 11 right now. It’s hooked up to iTunes, syncing all the tunes so I can’t use it tonight. I like to listen to music when I fall asleep. Then I went, “Ah, I’ll grab my old iPhone 8.” I plugged it in and warmed it up, and it seems so fast. I really missed that finger sensor. I find the iPhone 11 to be so clunky. I sure hope they bring back that finger sensor or thumb sensor. I loved it. Anyway, progress in life goes on.
My question is about SiriusXM. At the time, I think there were two companies, but I got an XM radio when it first came out over 20 years ago. It was accessible enough because the remote control only had five stations that you could preset, and I was satisfied with those. However, my understanding from talking to people is that it’s gotten very complex. I’m curious, I don’t know if they have it in New Zealand or not, and if not, maybe some of your other callers know: is SiriusXM accessible in any format?
I know I used to get it on my computer, but I don’t have a computer in my living room where the stereo is. I think that the iPhone app is not very accessible. I’m curious to know if anybody knows about an accessible way of getting XM radio because I’d really like to get it again. This is Ross Wanesky from New Mexico, in the United States. Hello to everybody. Thank you.
Jonathan: Thank you, Ross, for getting in touch with the show. Yes, there are several ways that you can access SiriusXM which are accessible. The app I would say is doable. It’s not the most pleasant experience, but if you persist with it and learn a few tips and tricks, it is okay. If you want to use it on iOS though, and just use it for the basic listening to channels, a really good alternative app is called StarPlayrX.
That is all one word, and ‘player’ is spelled without the E: S-T-A-R-P-L-A-Y-R-X, all joined together. It’s, I think, a free app available in the App Store. You log in with your SiriusXM account. It is extremely screen reader-friendly. In fact, I think it may have been designed because the developer heard from a blind person that the official SiriusXM app isn’t very easy to use. StarPlayrX is an amazing experience and I can’t recommend it highly enough.
You can also listen to SiriusXM on Sonos if you have a Sonos device; it’s really accessible on that platform. Finally, there is a Soup Drinker skill for SiriusXM. We say, “Soup Drinker, play CNN from SiriusXM,” and off it goes and plays it. So a few options for you there. Good luck.
Speaker 2: What’s on your mind? Send an email with a recording of your voice, or just write it down, to jonathan@mushroomfm.com. That’s J-O-N-A-T-H-A-N@mushroomfm.com. Or phone our listener line. The number in the United States is 86460 Mosen. That’s 864-606-6736.
Jonathan: Andy Rebscher writes, “Hi, Jonathan. For the person who wrote to you about playing a cassette deck into a computer, here’s a thought. Most desktop machines still have two 3.5-millimeter audio jacks, one out and one in. In Windows 10, the input jack can be set to operate as mic or line-in. You can use a cable with a TRS 3.5-millimeter plug on one end and two RCA plugs on the other. A direct connection between the cassette deck and the sound card is most likely all that would be required.
A dialogue should pop up when you plug something into one of the audio jacks; it will ask you what you connected. Select line input. Hope this helps our tape preservationist friends.” Thank you, Andy. I do take that point. I suggested an audio interface, and this may be all that’s required. I have had quite bad luck with this dialogue, but that could be because I’ve never seen the default Windows one.
I tend to get something from Realtek, which is proprietary and horrible and often quite difficult to access. If Windows 10 using generic drivers pops up a smooth dialogue that lets you choose a line-in source for that input jack, you are right, that could be a very elegant solution. All Brian Gaff wanted to do was to try to get into a little bit of home automation and get a switch set up; it turned into an accessibility debacle, all thanks to TP-Link.
He reports, “Hi again. The device has now gone back to Amazon for a refund. I let a sighted person attempt to set it up, and even they failed. The website it uses has a self-voicing feature using some oriental lady whose strong accent made it incomprehensible. It sounds like, from what you read, that they seriously need someone to handle their PR and look at how they interact with their users, as they have lost my business, and also potentially that of the sighted person I lent it to as well, and word of mouth will spread that message, I am sure.
Really, they have made a cheap product, but it’s no good if setting it up needs a degree in computer science and blind people can’t do it either. As for inaccessible apps, amazingly, I had one from the company who delivers the milk to the doorstep in the UK, called Milk & More, who are owned by the big food company Müller, who do the yogurt, et cetera, and it too is not very usable with VoiceOver.
The status or type of controls is not voiced, but they made a good job of the graphic design. Pity they did not spend more time testing it with folk who are not mind readers. Words fail me, particularly since they are forcing all customers to have online accounts and not allowing interaction with, or paying, the milkman in person. Luckily for me, the website is usable, but the little old lady down the road who was used to leaving notes out is being left behind if she has no way to go online. The brave new world is only for some people.”
Thanks, Brian. I would like to pick up on one point that you raised because I think it’s very important, and I did raise this in the presentation I gave for Global Accessibility Awareness Day this year. Consumers are becoming a lot more discerning, a lot more critical in a positive way about where they buy from. People increasingly care about environmental sustainability, they care about fair trade. I think people will increasingly care about accessibility. I hope so.
I think most people want everybody to have a fair go in life. If they’re deprived of having it because people just don’t give a damn about accessibility, then I hope these companies increasingly reap the consequences of their actions. I also take the points about the digital divide, it troubles me as well. There are people who are being left out and that’s most unfortunate.
Speaker 3: Mosen At Large Podcast.
James Odell: Hi, Jonathan. This is James Odell here from Cardiff in Wales, in the United Kingdom. I just want to let anyone listening to the podcast know about an exciting change to the BBC iPlayer app. This is our catch-up TV service that we have in the UK for the BBC. It’s now possible to use the app to view any content with audio description that’s available on the service.
Previously through the app, all you could do was access the latest episode of any particular program with audio description, and as soon as the next episode was broadcast, you’d only be able to access that next episode with audio description through the app. You could access all of the audio-described content by opening up a web browser, but that was quite a bit less convenient than being able to just play things through the app like anyone else; you’d have to do a lot of navigating through the webpage, finding the playback controls, and sometimes there were issues with the playback controls on the website. Now all of that has gone away.
Although, I’m guessing using the web browser may well still work for the time being at least. The way to do it is to find a program that has audio description. That could be, for example, EastEnders or a drama; most of them have audio description. You find an episode of the program and, importantly, you need to start that episode playing. Then, once you’ve done that and the player window opens, you’ll find there’s a button called Settings and Subtitles, or I think it might be Subtitles and Settings.
It certainly doesn’t mention audio description. You double-tap on that button, and then you’ll find in that setting screen that opens, there are toggle switches for sign language, audio description, and subtitles. It all seems to work very well. Once you’ve made that change in the setting screen, you close that screen down and start the program playing again. If you’ve turned on audio description in that, then you should have audio description when you’re playing the program.
I’m not 100% sure whether this setting actually sticks between programs. I think it may well do, but I need to check that. It’s certainly a lot easier than having to use the website all the time. They’ve added 10-second backward and forward buttons as well. It really is a pleasure to use and it’s going from strength to strength, and it’ll make my life a lot easier, and hopefully, other people’s as well. Thank you for the great podcast. I really enjoy listening to you every week.
Gary: Hello, Jonathan, just listening to your fascinating conversation with Brian about using the BBC B Micro and the Apple II computer. Brian and I were actually at the same school. I remember the setup he’s talking about, and the manufacturer of that Braille link device was Clark & Smith. It was a fantastic device because it had a QWERTY keyboard and a Braille display; the Braille display was above the QWERTY keyboard. As Brian said, there were two little tape decks on it as well for recording your work digitally, as it were. I did a computer science O-Level at the time and an A-Level, and I wrote my projects on that Clark & Smith. We used BASIC on the BBC B Micro for the O-Level projects.
We were taught in a structured language called Pascal for our A-Level projects. We did it all in Braille on this Braille display, and you would connect it up. It had one of those little skirts over the keyboard to try and keep all the muck from these teenagers’ fingers out of the dots, and a rough, satiny-type cover over the actual Braille display itself. It was in a suitcase. You opened the lid and it folded back. It was quite a beast, but it was amazingly reliable, and it was made by Clark & Smith. Really fond memories.
Haroon: Greetings again. I thought I would fill you in on a little bit of info if it hasn’t been filled in by some other good person before I’ve got to it. I just started listening to the history of Mr. Brian Hartgen, and he mentioned something that was quite familiar to me, and that was back in the ’80s at the advent of the Braille link. I can fill in a little bit of information. It was built in the United Kingdom.
It was a 20-cell Braille terminal with a QWERTY keyboard and a microcassette in/out device on the left and the right. They held about 200 pages of Braille per microcassette. You could extend that because you could get 15-minute and 30-minute cassettes. The I/O was certainly not like the VersaBraille, where you could wind a C-60 from end to end in under 20 seconds.
It was manufactured by a UK company called Clark & Smith, who I believe went on to make canes and all other VI devices. I actually still have the one that I used to train other people. Unfortunately, it’s a piece of history that doesn’t Braille link anymore because the system board is just a memory. Very heavy. It was like a suitcase that opened, and it did TTY. In my case, when I was training people, I had it connected to a Tandy TRS-80 Model III, I believe it was, that had two five-and-a-quarter-inch 160K floppy disks and 48K of RAM. Wow. It was really quite something back in those days. We had it connected to that with a CP/M program called DT Host, and I drove it through a serial port at 9,600 baud when it worked. That was quite an experience. It wasn’t better or worse than the VersaBraille was. It’s a totally different tool altogether.
Jonathan: Wow. I have not heard of this device before, and given how much stuff we used to get from England in the 1980s, I’m interested that we got the VersaBraille but not this Clark & Smith Braille link, because we were using Clark & Smith Talking Book machines. I remember convincing the powers that be that I should be allowed to use the VersaBraille in my exams.
I was the first person in New Zealand that they allowed to do that, because I was able to convince them that if I was showing everybody else how to use it, surely I knew how to use it, and we had to start somewhere. It was great to be able to use the VersaBraille to write my exams.
To contribute to Mosen At Large, you can email Jonathan, that’s J-O-N-A-T-H-A-N, @mushroomfm.com by writing something down or attaching an audio file, or you can call our listener line. It’s a US number, 864-60-Mosen. That’s 864-606-6736.
Speaker 1: Mosen At Large Podcast.
[01:54:29] [END OF AUDIO]