Transcripts are made possible by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at http://PneumaSolutions.com.

You can read the full transcript below, download the transcript in Microsoft Word format, or download the transcript as an accessible PDF file.


[music]

Jonathan Mosen: I’m Jonathan Mosen and this is Mosen At Large, the show that’s got the blind community talking. This week, join me for a webinar where I’ll announce this podcast’s future. Be My Eyes is building a Virtual Volunteer powered by ChatGPT, Sonos has new features and speakers, and reaction to our Optima interview.

Automated: Mosen at Large Podcast.

Can we claim this area code?

Jonathan: Welcome to another episode which happens to be 221, and 221 is the United States area code for nothing. Nothing. There is no area code 221 at this point in the United States. What I want to know is, can I apply for it? Wouldn’t it be great if, as a perk for listening to this show, you could have a number on area code 221 that identifies you as a Mosen At Large listener? We could be the first podcast in the world to have our own area code. Wow.

Join me for an announcement about this podcast’s future

We are certainly not going to be the first podcast in the world to have our own webinar on Zoom, but we are going to be doing one, because I’ve got something to tell you.

I put the word out last week on social media and also on our media list, so if you’re following us on Twitter or Mastodon, or liking us on Facebook, or you are subscribed to the media list, which you can do by sending a blank email to media-subscribe@mosen.org, you will already have had notice of this. A lot of people just listen to the podcast without taking part in any of that, so let me give you the background that I’ve already provided to social media and the email channel. Since August 2019, the Mosen At Large podcast has discussed a wide range of topics from a blindness perspective.

Of course we’ve covered all the cool tech and had some brilliant guests, but one of the things that has made it special is the remarkable level of engagement from the community. We’ve shared a diversity of opinions and experiences. Now the time has come for me to make an announcement about the future of the podcast. Making it via the podcast just seems too disconnected, given the community that we’ve all built up together over these nearly four years. If you’ve enjoyed listening to and participating in the podcast, I’d like to invite you to a live Zoom webinar.

It’s coming up. It’ll be held on Saturday, the 8th of April, and it takes place at 4:00 PM US Eastern Time. That equates to 20:00 UTC if you’re into the good old Coordinated Universal Time, 9:00 PM in the UK. In my part of the world, in the Southern Hemisphere, it’s taking place at 8:00 AM on Sunday morning, the 9th of April in New Zealand, 6:00 AM in eastern Australia. I will be making the announcement, and after that I’ll certainly throw it open for any comments or questions. It will be necessary to register to attend this webinar, and a lot of people already have.

Registrations are open now, and slots are limited. If you would like to attend the webinar, I will put the link in the show notes. You can also go to the blog post where I made this announcement about the webinar, and the link is in that blog post. I’ve put it on a nice, easy URL so you can just remember it. It’s mosen.org/future. That’s M-O-S-E-N.org/future. Go there. You’ll find the link to the registration form for the Zoom webinar. I chose Zoom because pretty much all of us used it during the pandemic and it’s familiar, so hopefully it won’t pose barriers to entry for people who want to attend.

If you can’t attend because you’ve got other commitments or because of the time, because you can never pick a time where it isn’t the middle of the night somewhere, we will be publishing highlights of the event on this podcast feed. I will try to do that quickly, and eventually, there will be a transcript, because this is a live event and I want to get the live events published in archival form as quickly as possible. There may be a bit of a delay between when we publish the audio and when we get the transcript, but the transcript will be done.

However, I do hope that you will be able to join us in person for this announcement. To ensure that you can, head on over to mosen.org/future for the registration link. I do intend to produce a regular episode of Mosen At Large. What I intend to do is publish that as episode 222 the day before we would normally publish the podcast, so that will be Saturday my time, Friday in the Northern Hemisphere.

Then you will have the announcement the day after, that is Saturday the 8th of April in the Northern Hemisphere, Sunday the 9th of April in the Southern Hemisphere.

ChatGPT turns its hand to punditry about the webinar’s purpose

Not a week has gone by lately where we haven’t talked in some form or other about ChatGPT, and ChatGPT apparently is a pundit as well, because after receiving this invitation to the webinar, Kevin wrote in. He says, “Hi, Jonathan, very excited for the Zoom webinar.”

Meanwhile, these are some predictions from ChatGPT. It is predicting five possible scenarios. Number one, I am thrilled to announce that the Mosen At Large podcast is partnering with a major media network to bring even more diverse perspectives and high-quality content to our listeners. Starting next month, we will be launching a brand new series that focuses on current events and their impact on the blind community. All right then. Next one, number two. After much consideration, I have decided to step down as the host of the Mosen at Large podcast.

Speaker 2: I shall resign the presidency effective at noon tomorrow.

Jonathan: It has been an incredible journey, and I am grateful for all the support and engagement from our listeners. However, I feel that it is time for someone else to take the reins and bring fresh ideas to the table. The Mosen At Large podcast hosted by, I don’t know, Jim Smith. I wonder if that would work. Next one, number three we’re up to now. As many of you know, I am passionate about accessibility and technology. That’s why I’m excited to announce that I will be launching a new venture focused on developing cutting-edge assistive technology for the blind and visually impaired.

Yes, like an accessible knife or something. No, that’s not ChatGPT saying that. [chuckles] I hope to continue to collaborate with members of the Mosen At Large community on this exciting new project. Fourth speculation from ChatGPT about what the Zoom webinar is about. I have been reflecting on the conversations we’ve had over the past few years and have realized that there is a need for a more focused platform for blind and visually impaired people to share their stories and experiences. That’s why I am excited to announce the launch of a new podcast network dedicated to elevating the voices of blind creators.

Number five, the last one. In the spirit of community and collaboration, I am proud to announce a new initiative that will connect blind and visually impaired people from all over the world. The Mosen At Large community will be partnering with a number of organizations to launch a virtual exchange program that will allow participants to share their culture, language, and experiences. Those are the five guesses from ChatGPT. If it wants, ChatGPT, like you, can go to mosen.org/future and register to get the actual facts at 4:00 PM US Eastern Time on the 8th of April.

Automated: Mosen at Large Podcast.

Hans Wiberg and Mike Buckley from Be My Eyes talk about the service in general and its new Virtual Volunteer in beta

Jonathan: Be My Eyes is one of the most valued blindness-specific apps in our community. When we’re bombarded by a constant stream of bad news in our media, Be My Eyes is a pleasant reminder of the good in the world as millions of sighted people are willing to help blind people perform tasks where sight is useful. Recently though, Be My Eyes has been making quite a splash with talk of its new Virtual Volunteer technology powered by another technology making quite a splash right now, ChatGPT. This is now in early testing. To discuss this and catch up with Be My Eyes in general, I’m joined by the service’s founder, Hans Wiberg, and Mike Buckley, who’s the CEO of Be My Eyes. Guys, welcome. It’s good to have you both here.

Hans Wiberg: Thank you.

Mike Buckley: Thank you so much.

Jonathan: Hans, I’d like to start with you because you and I go way back. I remember being on a forum where you were touting this idea: what if we could have a service where volunteers provided sighted assistance? I was really surprised by the initially lukewarm response from some blind people. The lesson that taught me was you’ve got to trust your gut, you’ve got to follow that dream, never give up, because you persisted. I remember contacting you very early and saying, you’ve got something here, this is needed. And you did it. You must be absolutely amazed at how fast it’s grown over these years.

Hans: I am. I also got to realize that people in the blind community are very, very different from each other. What fits somebody doesn’t fit all, and Be My Eyes is a tool that some people love and other people, not so much. That’s just the way it is when you’re dealing with human beings. For me personally, it has been a tremendous journey. I never dreamed that we would be helping so many people and have so many volunteers.

We can help in 180 different languages, which is super, super important, because you can get help in your own language from somebody who knows the country you live in, and the food ingredients, and all that. That is just what amazes me on a daily basis: that we here in Denmark have this little app, and we can help people get connected with other people in Japan, for instance.

Jonathan: What’s nice about it is that it lessens that reliance on family members whom we may lean on for a quick answer to a visual question. It’s something that we can do, I don’t know, a lot more anonymously, a lot more independently, because we don’t always have to tap into those family networks.

Hans: That’s exactly right. Also, you know that your friends or family would love to help, but you don’t really want to be asking all the time. A lot of those things you can get out of the way with another person who is willing to help you, exactly when you need that help. We have many, many calls that are less than a minute, 20 seconds. “Have I set my oven to–?” “Yes, you have.” “Thank you so much.” That’s it. Then you can get on with whatever you’re doing, and you don’t have to wait until someone comes around or your sister gets home from work, or whatever. The fact that you can get these small pieces of help exactly when you need them is such an important opportunity.

Jonathan: Mike, it’s good to meet you. You’re relatively new to Be My Eyes. What brought you there?

Mike: I was an investor and a board member in the company starting in 2018, but I am definitely new to the CEO role. I started in December of last year. I was really intrigued by what Hans did here. You alluded to it in your introduction, Jonathan. The way I think about it is that Hans came up with a brilliant yet simple idea that merged technology and human kindness to solve accessibility needs. I just thought, gosh, that’s this remarkable, amazing thing that he brought to the planet.

That was combined with the fact that the company started getting more traction in working with its corporate customers like Microsoft and Google, and Sony, and Barilla, and Procter & Gamble on better accessibility in customer support for people who are blind and low vision. With that powerful combination of the beauty of what Hans built and the business opportunity, it’s hard to imagine something more fun to work on.

Jonathan: You mentioned being an investor, and I’ve always been a bit curious about the business model. How does Be My Eyes stay afloat? How does it make its money to keep things in operation?

Mike: It’s a good question, because Hans’ philosophy, and our philosophy as an entity, is that we want to provide tools and services to the community for free, including our new services, by the way. We make money in a couple of ways. One is with a specialized help product, which is an add-on to an existing customer service center. Microsoft has its Disability Answer Desk, but within our app there’s a Microsoft button that can connect our community to Microsoft seamlessly and quickly on a video call, which is great for service.

Microsoft and these other customers pay us a monthly fee for that. It’s a growing customer base and it’s a nice source of business. The second source of business that we’re really launching a bit more now is to help employees at companies who are blind and low vision better navigate their work environments. We’re partnering with a number of companies to talk about how to make these employees not only comfortable, but give them more power in the workplace, more productivity in the workplace.

That’s a really interesting line of business. Then the third is a corporate volunteering product where, say your team of 50 really wants to volunteer and learn about accessibility, we can make sure that on a particular day, every one of them gets a call to give assistance. It’s a lovely lesson in accessibility, and it’s often a very moving experience for teams that go through it.

Jonathan: That second revenue stream, assisting blind people in the workplace, does that mean then that you’re getting into the business of offering paid agents in certain situations?

Mike: We are not the source of the paid agents. What we do is, there are support networks that often exist within companies, kind of the equivalent of employee support agents within the companies themselves. We not only help connect those calls through very simple technology at one touch of a button, but we make sure that they are also video calls. I think what you’ll see, though, is that with the advent and the introduction of the Virtual Volunteer that we have now, that will become a very powerful tool in the workplace to empower independence and self-sufficiency for employees who are blind and low vision.

Jonathan: One thing I truly do appreciate: it’s not so much that I want agents to see me when calling Microsoft’s Disability Answer Desk, for example, or Google accessibility, or whatever, but as a hearing-impaired person, what I really appreciate is the audio quality, which is far superior to making a phone call to those services, where they’re often using cheap voice-over-IP technology and the audio can be quite garbled. For that use case, it’s very useful to have.

Mike: That’s really interesting to hear.

Hans: I’m with you on that.

Jonathan: You are quite transparent as an organization about your user numbers, which is one of the best ways that I know of measuring the size of the smartphone-using market in the blind community. I watch this with a lot of interest. I’d be interested in breaking it down, though. Do you happen to know how many blind people are using Be My Eyes on iPhone, and how many are on Android?

Hans: I think we do. I cannot give you the numbers right now, but I can tell you it’s very, very different from New Zealand to India. In Denmark, I think something like 95% of blind people are using iPhone. If you go to other countries, it might be almost the other way around. Overall, I think it’s pretty much 50-50 in the blind community, but these numbers are from very, very different countries of course, so I don’t know how much that information gives you.

Jonathan: It’ll be useful to know in the sense that sometimes, as an iPhone user, I contact an app developer and I try to tell them that their app’s not accessible. I cannot say, “If you made your app accessible, this is how many people you could potentially reach.” Apple is, as Apple likes to be, incredibly secretive. It will know how many people are running VoiceOver. It will have that data about how many people have VoiceOver enabled, but it doesn’t disclose it. It would be quite interesting for accessibility advocates like me to have even some inkling of what those numbers look like.

Mike: I don’t have a problem sharing that, Hans. I think if we look at the data, Jonathan, we’d be happy to share it. It’s just information that could be helpful to the broader community. The point you’re raising that’s really interesting to me is that if you believe the statistics from the World Health Organization, there are 253 million people globally who are blind or low vision.

The implication there is we’re only serving a fraction of that population, and we have to get better. That’s the reason Be My Eyes exists. The numbers of our community steadily go up with no marketing spend and no advertising. We’re pleased about that, but one of the initiatives that we’re engaged in right now is to figure out how to dramatically increase the growth of that community on our platform so we can better serve them.

Jonathan: Obviously Be My Eyes operates a high-trust model. How often do you find yourself having to intervene because volunteers have either not provided quality service, or even behaved inappropriately?

Mike: I’ll let Hans answer the second part, but on the first part, over 90% of our volunteer calls are successful when you strip out technological problems or telecommunications problems. The service works very well, and the incidence of issues or abuse is also very low. I will tell you, I am very thankful that our number one security officer and policeman is Hans himself.

Hans: [chuckles] I can tell you that it is very, very rare that we have to suspend a volunteer. It has happened maybe 10 times; it’s extremely low. I don’t even recall why, but some of them were totally hopeless, giving no useful information. Of course, anyone can sign up. We do have some very young school kids signing up, pretending to be blind, and making good old prank calls. Of course, they do get reported, and we suspend them right away. Apart from that, we don’t really have serious issues with that. Of course, you as a blind user will sometimes need to explain to a brand-new volunteer how they can help you.

“Please tell me how to hold this item. Should it be closer or further away? Should I turn it around?”, or something like that. That is of course different from Aira, where you have trained agents who are capable of doing this and have had month-long training for it. Most of the questions are straightforward for the volunteers to answer, and our users know that it is a volunteer, so they cannot expect them to have made 100 calls before and all that.

Jonathan: The algorithm appears to have become a lot more sophisticated over time in the sense that when I use the app, I’m almost always now sent to somebody local to me. Sometimes local knowledge is very useful, so that’s a nice touch.

Hans: We always try to find someone in the same country because, for instance, the food ingredients are very different from country to country, and it is nice to have someone within the same culture to answer these questions. If you make a call in the middle of the night (we do not call anyone in New Zealand in the middle of the night), then you will get someone in another time zone, but speaking the same language, of course.

Jonathan: It’s really nice. Sometimes when I make a call on Be My Eyes, I get somebody at the other end who’s absolutely astonished that their app has actually gone off with a call because they say they’ve been waiting for one for yonks and they finally got one, and they’re so happy.

Mike: That is the number one complaint from volunteers: they want more calls.

Jonathan: One area where Be My Eyes, to the best of my knowledge, is not able to help right now or at least not officially, is when you would like someone to provide remote computer assistance, such as when you’re dealing with an inaccessible website or software or something like that. Do you think there might ever be a way that Be My Eyes could conquer that challenge, say with a special pool of volunteers who verify that they’re at a computer ready to offer remote assistance, or are there just too many security risks in doing this?

Mike: Absolutely, yes is the answer. There are two ways, I think, we can do this. One is something Hans has talked about for a long time, and I think we’re going to start to do experiments and make investments here: having pools of our volunteer base that have special skill sets, whether it’s knowledge of a Microsoft product or knowledge of technology. We really would love to develop a sub-community of Lego enthusiasts, for example, for a less high-tech use case.

That’s one way we can address it, by segmenting some of our app volunteer population based on the advanced skills that they have. The second thing is, with the advent of the Virtual Volunteer AI product, you can imagine a scenario in the not-so-distant future where the AI solves this problem. Because when you think about the AI, imagine that it’s ingested every user technical how-to manual in humanity. It ought to be able to feed back information on how to correct and how to solve problems. I think that’s where this goes as well.

Jonathan: I want to spend lots of time on Virtual Volunteer in a sec, but I just got to get this question in or my audience will not be happy with me. Any word on Be My Eyes on the Envision Smart Glasses?

Mike: Stay tuned. It’s coming. I don’t know if we’re breaking news here. Look, we really like the Envision guys a lot. I spent time with Karthik, the CEO, at CSUN. I think you’re going to see some interesting things from us in the very near future, so stay tuned.

Jonathan: Great. Now, for those who have not heard of it yet, can one of you give me a description, the elevator pitch, of what exactly the Virtual Volunteer is?

Mike: Sure. I’ll start, and then maybe Hans can give some specific examples from his usage. When you go into the Be My Eyes app, beta testers (and in the future, everyone) will have the option to either call a volunteer as they do today, or give a voice command and just say “Virtual Volunteer”, or press the button on the phone. What that will do is connect you with an AI-powered service from OpenAI, which includes an image-to-text generator. What does that mean?

That means you can take a picture of anything and very quickly, in a matter of seconds, get a highly powerful and accurate description of that image that’s vastly superior to the image recognition technology that’s currently on the market from other people. Number two, there will be an analytical layer available on this image recognition that goes above and beyond just a description of what’s in it.

Think about taking a picture of the contents of your refrigerator and not only getting a list of the contents; the tool can tell you what you can make for dinner based on what’s in there. That’s the second thing this provides. The third thing it provides is the ability to converse back and forth using voice technology. You can say, “Can you suggest a recipe?” You can ask specific prompts and specific questions based on the image that you’ve uploaded, to provide any type of context that you want, and it’s quite a remarkable thing.

The final feature, before, Hans, I turn it over to you to give some examples, is that when the tool is incapable of giving an answer, or it’s not sure, it’s going to have an automatic option for the user to roll over to a volunteer call. It’ll say, “I’m not really sure. Would you like to be connected with a sighted volunteer?” We thought that was an important backup safety mechanism. Hans, I know you have some interesting use cases that you’ve done.

Hans: I think all of us have a remote control. I took a picture of my remote control and simply typed in a question mark, and then it gave me a description of all the buttons, and there are quite a few. I do have some pretty smart friends, and I know that none of them could explain all the buttons to me, but this was able to simply give me an overview of all the buttons on my remote control. I think that’s pretty powerful.

Also, I took a picture out of the door into my garden, and it described that there was a porch, there were some plants in pots, and there was a big rock in the middle of the garden; behind the rock, there were some bushes, and so on. That’s a super precise description of my garden. I could have asked further questions, like what kind of plants are in those pots, and so on. That’s what makes it so powerful: you can not only get to know whether the picture you have taken actually shows what you think; once you know you have a picture of it, you can ask further questions about details in the picture. That’s just mind-blowing.

Mike: The other one that blew me away, Jonathan, was that we took a picture of the Indian railway system map, and then we asked something like, “How do I get from Bangalore to Delhi?” It gave the directions, literally which lines you had to take and how to get there. Then I said, “Can you tell me in Hindi?” It did. It’s remarkable. It’s remarkable power.

Jonathan: That remote control example is pretty compelling, because many of us who travel find ourselves, say, in a hotel room with a TV we’ve never used before, and we try to work out how to operate that remote, or even a remote control to adjust the air temperature, for example. When you interrogate a picture that you’ve taken, is ChatGPT aware that it’s talking to a blind person? Do you see what I’m saying? When it gives that description, is it optimized to describe something in the way that a sighted person might describe it to a blind person?

Mike: I don’t really know that it was optimized specifically for a person who’s blind or low vision, but I do know that the quality and breadth of the descriptions serve the need that you’re talking about. I can’t speak to whether or not OpenAI trained it specifically that way, but I do know that the context and analysis it provides is remarkably useful for our community.

Hans: I can absolutely say that they have been thinking about blind people, because we are the only company that does this. Whatever happens in the image-to-text field right now is all blind people, because it is only Be My Eyes that is using this technology. I agree with Mike that they are not doing this just for Be My Eyes. They are of course doing this long-term and so on. Maybe it was not solely based on thinking about blind people, but they have absolutely had blind people in mind, at least in the test that we are running now, because we are the only ones doing that.

Jonathan: How did this happen? Did you approach OpenAI, or did they approach you when they were working on their GPT-4 model?

Mike: Yes. I called them in January and really wanted to explore ways to partner with them, but I had no idea that there was a GPT-4 or that there was any image recognition coming. We started talking a little bit, but then they called me in early February and they said, “Hey Mike, can you keep a secret?” I said, “I think so.” They told me about the model and what it was capable of, and they asked if we wanted to be their exclusive launch partner.

I said, “Look, we’re very interested, but can we play with the technology?” Obviously, we had concerns about utility, concerns about safety, concerns about accuracy. We started testing it and we were blown away. We got back to them very quickly after the tests and said, “We want to do this, but there’s something you need to know about our philosophy and about how we operate. That is that we provide our tools and services for free to our community. Are you okay with that?” They took about three seconds and said, “Yes.” We were off to the races, and we built the product together.

Jonathan: Obviously I realized something like that had gone on, because as soon as word started to get out about GPT-4, there you were with this announcement, and that was really significant. Can I talk about privacy and whether my usage is contributing to GPT’s data model at all? If I use the service, am I effectively making the pool of data bigger and making ChatGPT smarter, and how anonymous is my usage?

Mike: As of now, you are. We have a beta test agreement with the small group of people that are testing this out for us that says, look, for now, we’re going to share this data to make sure that the tool is working properly and to make it work better, make it have more utility. I think we’re going to think about that question of data sharing, whether or not to do that, down the road. I’m very encouraged though, Jonathan, because if you look at the other announcements that OpenAI has made recently on this issue, they have now made it opt-in only for developers to share their data.

My assumption is we will have a choice as to whether or not to share this data with them on some level. To your point, there are two issues to confront here. One is privacy, which is vitally important, and the second is utility and making the service more useful. My hope is that we get to an end state where this is purely opt-in at the user’s choice, but no matter what, we’re 100% committed to transparency regardless of how the model ends up working.

Jonathan: Are you still learning about where this technology is useful? I’m thinking, for example, particularly of navigation where there’s actually some degree of risk in overcooking this and in talking up its capability too much. If I’m, say, at an unfamiliar airport and I want to get to my gate, should I be able to use this technology to help me do that?

Hans: You have to remember, as it is right now, it is a picture. You cannot be guided through an airport just by having somebody look at a picture. That’s not possible. Soon, within, I don’t know, a year or two, it will also work on video, and then we are in a new situation, especially about guiding people out there in real life. You cannot do it right now, simply because it is a picture.

Mike: Let’s be really firm about this. We do not want anyone using this technology to replace a white cane. We don’t want it used to replace a guide dog, in terms of where it is now. We want to be slow, we want to be thoughtful, we want to be cautious. We’ve even put that in the agreement upfront with the beta testers, that it should not be used for these purposes. Down the road, as video comes in and as the AI gets more sophisticated and intelligent, I think there are absolutely going to be navigational and other use cases, but we’re just not there yet.

Jonathan: Do you think it will ever be possible to have an exclusive pool of data for each user? For example, I think it’d be amazing if a ChatGPT-type thing could pick out a family member or friend of mine in the crowd, but obviously, they would not want to be identifiable in any kind of public model.

Mike: It’s a really interesting question, but it’s also incredibly thorny from a regulatory and legal perspective. There are laws in the United States right now regarding biometric information, which is largely about faces. The answer to that question is, I’m sure it’s technologically possible, but whether or not it will be possible or advisable from a legal, regulatory, or privacy perspective is a whole additional question. We just don’t know yet.

Jonathan: How fast is this? When I interrogate ChatGPT using Bing for example, it is improving, but there’s still quite a delay even when I give it a text question before it spits out an answer. What lag can we expect between taking the picture and being able to find out about that picture and ask it questions?

Mike: Right now it feels like it’s about four to seven seconds. If the image is blurry or incomplete, it can be a little bit longer. Hans, I don’t know, but even in the last two weeks it seems to have gotten faster for me. Would you agree with that?

Hans: Absolutely. In the beginning, we were waiting and waiting and waiting. When you’re waiting, 20 seconds is a really long time. [chuckles] I think we are now below 10 seconds at least. Very often it’s five or something like that. It’s really fast.

Jonathan: There’s enormous interest in this, and people can register their interest to be a tester when the public beta rolls out, I guess, by going into the app and pushing the button that’s prominently on the screen there. When do you think it will be available in wider public beta at this point?

Mike: We hope to expand the beta in a few weeks, but again, it’s going to be slow. In terms of general availability, we’re really going to be responsive to and directed by the community. Again, we have to prioritize safety, and we have to prioritize utility and efficacy. It’s going to be a lot about the feedback that we get from our beta testers. I think that the potential for this technology is profound.

When you talk to the beta testers, they use phrases like life-changing. They say things like, “I have my independence back.” On the one hand, gosh, we really want to get this into the hands of more people, but on the other, we have to make sure that we’re doing it responsibly and that we’re doing it safely, and that’s going to be directed and impacted by our beta testers and our community.

Jonathan: For those people who have registered their interest, it sounds like what you will be doing is rolling it out gently. It won’t be a switch that you flick where everybody who’s registered their interest gets it at one time. Is that correct?

Mike: That’s my instinct. I think we’ll roll it out gently initially, but if we get up to thousands of beta testers and we’re seeing great utility, great adoption, and a high degree of safety, there may be a scenario in a couple of months where we flip the switch. Again, it’s all going to be based on the experience of the people using it.

Jonathan: When you do trickle it out, is that going to be on a first come, first served basis, essentially? Or how will you determine who gets it first?

Mike: Yes, first come, first served. Although certain podcasters might be bumped up the list just because you have a megaphone, Jonathan, and we want to make sure that people who have a broader audience within our community get to play with this and really give your unvarnished opinion about it. Obviously, we’re trying to bring in a number of blind organizations early as well, and make sure that several of their employees get early access. Other than those two groups, it’s basically first come, first served.

Hans: We really want blind users to be our testers. We have a bunch of sighted people who also want to drive this, but the feedback we really want is from our blind users, because that is who we are here for, and that will bring value to the community; it lets us build this technology so it serves our community. That’s why we are so focused on having VoiceOver users testing this new feature. Absolutely.

Jonathan: I’m really excited about this. I know many in the community are, so we look forward to staying in touch. Thank you both for giving us some time, and we look forward to finding out how this rolls out in the next weeks and months.

Mike: Jonathan, thank you for the time and for doing what you do for the community as well.

Jonathan: We can make transcripts of Mosen At Large available thanks to the generous sponsorship of Pneuma Solutions. Pneuma Solutions, among other things, are the RIM people. If you haven’t used Remote Incident Manager yet, you really want to give it a try. It is a fully accessible, screen reader-agnostic way to either get or provide remote assistance. These days, not a day goes by that I’m not using RIM. One of the ways I use it is to either receive or provide technical support with family members.

I’m the tech support guy in our family, so I quite often get questions from family members that they want me to solve. It’s not realistic to expect them to install a specific screen reader, even the demo. Before RIM came along, I found myself having to try and talk them through what they needed to do. Now, I can tell them to go to getrim.app. That’s G-E-T-R-I-M.app, install a simple application on their Windows PC, and just by exchanging a code word, I can have a look at what’s going on. I can either run Narrator on their system, or if you’re using NVDA, you don’t even have to do that. It’s an amazing tool, so do check it out. RIM from Pneuma Solutions at getrim.app.

Automated: Mosen at Large Podcast.

I have the Sonos Era 300

Jonathan: Back in 2016, so I was a fairly late adopter actually, I got my first Sonos device and have never looked back. The sound is pretty impressive. The way that Sonos synchronizes across your house is just super, and, while it hasn’t always been the case, I understand, in recent years the Sonos app has been accessible. If accessibility breaks for some reason, I just drop the CEO of Sonos a quick message if necessary, and typically he responds, and we get it sorted out. Sonos is a great company, and I enjoy using their products very much.

We got into the world of Dolby Atmos with the Sonos Arc. When the Arc came out, we swapped our Sonos Playbar for that. We already had a surround sound system. In the living room, we have a sub, and then we have two rear surrounds. They’re actually original Sonos Play:1s, and that’s about to change. If you are fortunate enough to be able to get audio description along with Dolby Atmos, and we’ve talked about this on Mosen At Large over the years, it’s a really tremendous experience. Sonos isn’t perfect. There are things I’d like to be able to do, like play multi-channel audio files from a local network-attached storage drive and have that multi-channel audio played.

You cannot do that. I hope that will change at some point. I don’t know why you can’t do it, because they do have all the digital audio converters in there. You’ve got the speakers, so I’m not sure why they can’t just allow you to play those files and hear them the way they were intended to sound. We are making progress though. If you wanted to listen to Apple Music’s Dolby Atmos implementation, and they do have a few thousand tracks in Dolby Atmos, and I think it’s growing all the time, what I’ve had to do to enjoy it in the past is to turn the TV on, then select the Apple TV input on the television, which is connected to the Sonos Arc.

Then from the Apple TV, you can run Apple Music, and you can enjoy this Dolby Atmos stuff. I do sometimes, and particularly when there’s, say, an amazing new mix that Giles Martin has worked on of The Beatles stuff, you bet I’m listening in Dolby Atmos. In fact, Giles Martin, who has become the producer of this new generation of Beatles mixes, works for Sonos. He’s in charge of sound there. Sometimes they do have Sonos speakers at prestigious studios like Abbey Road to audition the mixes, but the barrier to getting all this going is annoying.

You haven’t been able to just use the Sonos app, which in my view is very elegant, very accessible, to search for the content you want and then play it. You have to do all these things to get the surround sound. I mean, first-world problems and all that; you can do it, and it’s worth doing, because it sounds amazing in some cases. Sometimes the stereo mixes, in my view, are better, depending on the mix. I draw the analogy between where we were with early stereo and where we are now with this new Dolby Atmos spatial audio stuff. You may have heard some very early stereo mixes.

Actually, The Beatles stereo mixes are really quite primitive, especially in the early days. When you go back and listen to, say, a stereo mix of, Please Please Me, the album, or With the Beatles, it’s pretty harsh because they pan the vocals all the way over to one channel in many cases. I think that some of the Dolby Atmos mixes are like that, but some of them are absolutely superb. The Beatles Dolby Atmos mixes are great. The Sergeant Pepper remix didn’t have a Dolby Atmos mix available commercially at first, and then one became available on the streaming services that support Atmos.

Then, I think, Giles Martin may have withdrawn it and replaced it because he didn’t like the way that the mix sounded on home equipment. I was particularly surprised by how good the Michael Jackson Thriller album is in Atmos. That is a really good mix. Another good mix is the Saturday Night Fever soundtrack. What I’ve learned playing with this Dolby Atmos stuff is that it doesn’t necessarily relate to how new the album is; it’s more about who mixed it and what they were trying to do with the Dolby Atmos mix. The latest one that is quite mind-blowing indeed is the 50th anniversary Atmos mix of Pink Floyd’s The Dark Side of the Moon.

If you’re a fan of that epic classic album and you have the ability to hear it with Dolby Atmos on good speakers, then knock yourself out. It is a really good mix of that album. Some of them just disappoint. Hopefully as engineers become familiar with how to mix effectively in this new space, in this new sound stage, we will get better and better mixes more consistently. It would not surprise me if some of the yuck mixes get replaced. Anyway, now, if you have the equipment, it is much easier to hear these mixes on Apple Music in Sonos.

If you’ve got the Sonos Arc or the second-generation Beam and any other speakers attached to that system, like rear surrounds and/or the Sub, then you can now listen to Dolby Atmos mixes directly from Sonos. If you try to AirPlay something in Dolby Atmos from your phone to Sonos, then you will not get the Atmos. If you ask one of the voice assistants, like Amazon’s one or Google’s, that also appears to produce a stereo result. If you use Sonos’ own voice assistant, which in my experience can be a bit flaky, then that does work, if you can get it to work.

When you ask it to play something and it plays the right mix, if it’s Dolby Atmos, it will stream Dolby Atmos. How do you know? Hopefully, if it’s a good Dolby Atmos mix, your ears will not lie. It’s quite amazing when you hear a good Dolby Atmos mix. Sorry to rave on about this. [laughs] If you listen, for example, to a live performance, sometimes it really can sound like you’re immersed in the crowd. They can be very effective. But if you think your ears are playing tricks on you, you can verify this, because when a song that is in Atmos is playing from Apple Music and you open the Now Playing screen in the Sonos app, there’s a little badge.

You see this when you’re watching TV as well. It will say “Playing on Dolby Atmos”; it will pop up and be absolutely unambiguous. What you cannot do with this new Sonos implementation, and what I would like to see, is have a little badge as you’re scrolling through your search results. Let’s say that you search for a particular song and you choose Apple Music as the provider you’re searching with. You’ll get a list of search results, and there’s no way of telling which one has the Atmos. Sometimes you can see multiple mixes of the same song.

You see this with Beatles albums in particular, but there are lots of instances, compilations, for example, where there are older mixes, and some of those will not be in Atmos. At the moment, the only way you can tell is to play it and see if it starts playing in Atmos. I thought this might be an accessibility thing, but it isn’t. I’m told that visually there’s no indication, when you’re scrolling through your search results, of which mixes are in Atmos, so it’s a fiddly process to get it right. Apple Music does have a playlist or two where they’re showcasing Atmos stuff.

If you were to buy something that plays Atmos and you really want to give it a workout, then you can fire up Apple Music, find a relevant playlist that’s made for spatial audio, and scroll through and hear tracks from musicians that you like. This is an exciting development, because Sonos’ Atmos implementation is the first third-party device that has been given access to Apple Music in this way. The release of the Sonos app supporting this has coincided with two new speakers. We talked about this when we did a tech briefing a few weeks ago, but I’ll recap briefly.

These speakers are a new range for Sonos called Era, as in E-R-A. I have to say, if I was product managing this, and I’m not, you’ll be thankful to hear, I wouldn’t have called it Era, because I immediately think that when it goes bad for some reason, when someone has a problem, they’re going to call it Error on social media, aren’t they? That’s inevitable. Anyway, [chuckles] it’s called the Era. They have the Era 100, which is a replacement for the Sonos One. Then they have the Era 300, which is a kind of hybrid between the Sonos Play:3, which was discontinued some time ago, and the Sonos Five or Play:5.

Reviewers seem to regard these speakers fairly well. Both of them have Bluetooth now. They have AirPlay; Sonos has supported AirPlay 2 for some time. They also have a USB-C port that can do all sorts of things. It can provide wired Ethernet, though normally that’s not necessary in a Sonos environment. They can also do line-in, if you wanted, for example, to connect, I don’t know, dare I say, a turntable. Slau Halatyn and I had a long talk about this when I had him on the podcast. I just cannot get my head around this vinyl malarkey, but anyway, [laughs] it’s your money, spend it how you want.

If you want to connect a turntable, you can do that. If you wanted to connect one of the blindness players, like a Victor Reader Stream or a SensePlayer, or something similar, you can do that as well. Once you’ve connected it to one of these speakers, you have the advantage of Sonos’ multi-room audio technology, so you could connect your Stream to the line-in of one of these little Sonos speakers and then beam it around the house to all the other Sonos speakers that you have. These speakers do have capacitive controls on the top.

Sonos has gone down this track for some time now. They’re basically flat; you can’t feel them. I have never tried to put little marker dots on them; it would concern me that doing so might press the buttons. If anyone’s successfully done that, let me know. I’m not overly worried about those controls, because you can control volume and other things by voice or through the Sonos app. There is actually a physical switch on the back, of the 300 at least, and I think it’s on the 100 as well, that allows you to switch the microphones on and off.

It’s very easy to tell physically whether they’re on or not. It’s quite similar to the side switch on an iPhone that toggles whether sounds are on or off. The key thing to note is that the Era 100 does do stereo now, whereas the Play:1s and the Sonos Ones were mono. I believe they’ve got much-improved speaker technology as well, but they don’t do spatial audio. If you want spatial audio in a single speaker, then the only Sonos product that does that for now is the Sonos Era 300. Now, I was going to buy one of these and replace the Play:5 that we have in the master bedroom with the Sonos Era 300.

Then I thought, ultimately my long-term plan would be to upgrade the rear surround speakers in our living room to Era 300s as well, because you actually get additional channels of Dolby Atmos when you do that, and I thought that would be really fantastic to do sometime. Actually, I’ve got them now. I got three Sonos Era 300s. How I managed that is a bit of a non sequitur, but it’s a fun story, I suppose, so I will try and tell it briefly. Back in 2020, I got an American Express Platinum Edge card. Now, the Platinum Edge card is a credit card product.

In other words, if you want to, if you have to, you can accrue some debt on there, and you can pay off a little portion every month. I got this card because when I was buying an iPhone, I had all sorts of problems with Visa because it was the middle of the night and they thought, what’s this person doing plunking down such a significant amount of money in the middle of the night? They actually blocked the transaction pending further investigation. I was a bit grumpy about that. Luckily, Bonnie came to the rescue and we managed to buy it on her credit card, and I got my iPhone delivered on delivery day.

Oh, it was a close-run thing, I tell you. That’s what encouraged me to get the AmEx card. We’d saved up and decided we would go to Europe last year; we’d do ABBA Voyage, and we would go to the ABBA Museum. You will have heard about that if you’ve been listening to this podcast. When we did that, I thought I’d apply for the Platinum Charge card. You have to pay that one off in full every month, but it does have quite a few benefits in terms of access to lounges and hotel memberships, and that kind of thing. I got the Platinum Charge card, and I kept the Edge card for a little while because I’d booked some of our travel on it.

If there were to be an issue, then I’d need the card, for things like insurance and that kind of thing. I just left it alone, really. As the fee was coming up due, I finally realized I needed to cancel this card; I wasn’t using it. I called up the AmEx people and I said, “I’d like to cancel my Platinum Edge card, please, because I got the Platinum Charge.” They said, “But did you know that it’s a credit card, so you don’t have to pay it off in full every month?” I said, “I did know that, thank you. That’s okay. Right now,” and it could change in the blink of an eye, as we all know, “I can cope with this.

I can pay it off in full. Just have a look at my payment history and you’ll see that. Could you please cancel the card?” She said, “But did you know you get a free domestic flight with this card?” I said, “I did. At least in New Zealand, every time I call and ask for a flight, they tell me that there are no available slots that AmEx has on that particular day.” She said, “Yes, I must confess, in New Zealand the slots are pretty limited.” She said, “But did you know that when you use this card to purchase things at a supermarket or a petrol station,” what they call a gas station in the United States, “you get triple points, whereas you only get double points at a supermarket with the Platinum Charge?”

I said, “Yes, I did actually know that as well, but it’s just a hassle to remember which card to use. Plus, I’m blind and I don’t buy a lot of gas unless I’m topping up my kids for some reason, who are helping me out.” She said, “Oh, I’m sorry.” I said, “Don’t be. We’re living the dream. Now, can you just cancel the card?” Finally, she said, “How about we offer you 300,000 points to keep it?” My reaction was, holy soup. I just said, “Okay, I’ll keep it.” Now I am kicking myself because I wonder how high she would have gone. I wish I’d had the presence of mind to say, I’ll do it for 500,000 or something, and just see how high I could have talked her up.

Anyway, I took the 300,000 points, and that paid for two Sonos Era 300 speakers. I was originally going to buy just the one for the master bedroom, but we’ve ended up with the three I wanted long-term, and they arrived with a thud at our front door the other day. I have not yet tackled the surround sound in the living room; those two Sonos Era 300s, along with their stands, are still in their boxes. I have set up the Sonos Era 300 for the master bedroom, and I can report on the experience. I was going to record some unboxing and things, but actually, it’s really hard to show you in an audio demo like this how speakers sound.

Then you’ve got spatial audio in the mix, and the best that we can do here is stereo. There’s not a lot of consequence that I can show you, I decided, and we’ve actually got quite a lot of material right now, so I’ll just describe the experience. It was very straightforward. Unboxing it was the usual frustrating thing that you get with some of these big tech products; they try to make it a major event. I set it on the dresser in the master bedroom and connected it to the wall, and it came up. The Sonos app instantly detected that there was a new Sonos Era 300 and asked me if I wanted to set it up.

I said that I did. Then it prompted me for the password to my Sonos account. It appears that this is a new thing that is in some of these new Sonos products. I entered the password, and then I really wasn’t prompted to do much else. You may know that if you’ve set up previous Sonos speakers, there tends to be a button on the back, and they’ve never really used that button for anything other than making the initial connection. They seem to have abandoned that button in favour of you authenticating your Sonos account, and that makes a lot of sense.

If this is your first Sonos device, then you are invited to sign up for a new Sonos account. I think it is quite a straightforward process. I am hoping that this will find its way to other Sonos products where setting up has been quite difficult. Last year, my Sonos Port died the same day the Queen did. I reported on this here on Mosen At Large. I had to replace the Sonos Port, and there is this little code, which we’ve been talking about on Mastodon actually, that is like a captcha. It has a series of letters and numbers. You have to type them in, and that’s the way that you authenticate.

It’s very difficult to do that as a blind person. I had an Aira agent who was great, and we worked on it, and we finally got it sorted out, but it is really not a very friendly experience. Hopefully, those awful inaccessible codes, which I have given feedback to Sonos about, and I know other blind people have as well, will be a thing of the past. Then I had the opportunity to set up Trueplay. There’s good news for Android users here. Trueplay is a technology where, in the past, you walk around your room with your iPhone, and unfortunately it has only been an iPhone.

Using the way their weird sci-fi sounds are bouncing, and what the microphone is detecting, they customize the speaker’s performance: EQ and various other things like that. You can still use that method. If you’re an iPhone user, I’d strongly encourage you to do so. If you don’t want to, or you can’t because you’re not an iPhone user, there is now a simpler Trueplay option you can check, where Sonos uses the mics in its own speaker to calibrate the sound. It’s good news that they’re no longer leaving Android users out in the cold.

I wonder what it is about Android devices that means the full Trueplay is not possible. A hint for hearing aid wearers: when you have your Made for iPhone hearing aids connected, Trueplay is not going to work. Before you embark on the Trueplay process, you’ll need to find a way to temporarily disconnect the Made for iPhone hearing aids, so that nothing else is connected and VoiceOver is talking through the phone’s speaker. When I first attempted to do Trueplay, it kept giving me errors. I was quite concerned about that, but I was on a bit of a time crunch when I was setting up the speaker.

I just needed to get it done, and I thought I’d revisit the problems later, so I skipped the unsuccessful setup of Trueplay, got everything else done, and added the Amazon voice assistant, the one that we affectionately call the Soup Drinker. They have taken Google Assistant away. The spin about taking it away is that it’s something to do with compatibility between Google and Sonos devices, that Google is taking a different approach or something like that. There has also been a lot of rancor between Google and Sonos over patent issues. So far, Sonos has won those legal cases.

I prefer the Amazon assistant anyway, so I got that set up, and it was working fine. Now, the interesting thing was that when I got all that done and returned to the main screen, the Sonos app said something quite cool. It could tell that the speaker’s orientation was wrong. It said, “You’ve got the speaker the wrong way up. Be sure to have it so that the touch controls are on the top of the unit.” It’s a funny shape. I’m not even going to begin to try and describe it; you can read about that in reviews. I thought, this is a funky-looking speaker, so I wasn’t absolutely sure which way was up.

That turned out to be the reason why Trueplay didn’t work. It would not work until the speaker was orientated properly. I think that’s pretty nice. After that, Trueplay performed swimmingly. It made its weird noises, I walked around the room, and it made some adjustments. How does it sound? For spatial audio, it’s out of this world. It’s quite a trick of the ear really, because if I stand in the center of the room, with the speaker on the dresser so it’s in a reasonably high-up location, it really sounds like there are things coming from the wall behind me with a good Atmos mix.

It is very impressive. It’s crisp. The woofers are great. The tweeters, it’s hard for me to tell because of my hearing impairments, so I don’t have a lot of highs, but it seems pretty nice, and I love the Dolby Atmos stuff. The Verge is saying that in their opinion, the Play:5 still does a better job with stereo mixes. What they’re actually saying is if you get a pair of Play:5s, that’s probably the best stereo Sonos experience that you can have right now. I do wonder if there’s going to be a Sonos Era 500 that comes out at some point. I found myself thinking, I don’t think I’d need to add a sub to this setup, especially when you’re just listening to music in a bedroom environment.

It is very bassy. You can crank the bass way up, if that’s your thing, to the point that you can feel it through the floor. You can adjust the height element as well by going into the Sonos app. If you really want to accentuate the Atmos part of this so you hear what’s going on, you can do that and then just adjust accordingly. In the short time that I’ve had the speaker, and it’s only a little under 48 hours at the time that I put this recording together, so very early days, I’ve had one situation where the voice assistants just stopped working. When I set up the Amazon one again, they both started working.

I’ve also got a problem where, despite expressly selecting Apple Music as the default music service for the Sonos voice assistant, Sonos seems to think I haven’t got a music service set up, and that is frustrating because you’re going to be reliant on that Sonos voice assistant if you want to initiate Dolby Atmos from Apple Music with your voice. Working with the app works fine, and it was trippy just standing there in the middle of the room listening to Money by Pink Floyd.

It’s great. I don’t regret this purchase at all, and I will report back once we’ve got the Era 300s set up as rear surrounds. I’m looking forward to watching something pretty impressive on Apple TV+, where you are guaranteed to get Dolby Atmos along with the audio description. It should be a great experience. If you have got a Sonos Era 300, or you are experiencing Apple Music for the first time in Dolby Atmos because of the integration that Sonos now has, let us know how it’s working out for you.

[music]

Optima and Braille display thoughts

John: Hi, Jonathan, this is John Gasman. Just finished listening to the Optima discussion and you were talking about thumb keys. I think you were the one who first introduced a lot of us to thumb keys back in the days of the Apex when you were working for HumanWare, and that continued with Audi and the Braille, and I can’t tell you how much I enjoy and love that feature. Not only for recording work that I do, but also for presentations, to be able to move from one line to the next, continuously talking and conversing, and your hands never leave the unit, and it just sounds so natural and seamless. To not have thumb keys on the Optima would be really disappointing.

Now I don’t know if there are other ways to make that happen in a different way, and I guess I would be open to that as long as I could still read and not have to take my thumbs off of the unit. Hopefully, between now and March, Orbit Research will figure that out. They’ve made a lot of us happy already with what we’ve seen at CSUN and heard on various podcasts. We certainly hope that creating and utilizing thumb keys as a part of this unit will be something that they’ll figure out before March or so of next year.

Jonathan: Good to hear from you, John, and I have heard from Addie, who said that they are seriously considering the thumb keys issue because Addie quite likes thumb keys too. That’s great because I think it would be very disappointing for a product like this one not to have thumb keys. Because I know that there’s a whole new generation of younger people who are fascinated by older assistive technology, I will just correct the record and say that the Apex was not mine.

I was not with HumanWare by the time the Apex came out, but I was involved. I was the blindness product manager at HumanWare when we did the BrailleNote mPower and also the BrailleNote PK. Even preceding those BrailleNotes, the original BrailleNote had thumb keys. Thumb keys have been a HumanWare feature for a long while, and once used, you don’t want to let them go because, as you say, they are so efficient. The BrailleNote Apex, I think Greg Tillson will have been there by then doing the product management while I was over at Freedom Scientific looking after PAC Mate things.

Yes, I do see a lot of this going on Mastodon now. I think there’s an email group and a forum now all looking at older blindness technology, and it is something that we’d like to talk about from time to time on this podcast. It does make me feel a bit ancient because there’s so much fascination with old Main Menu episodes and various things. It makes me realise, well, a lot of that stuff that I’m doing is history now even though here we are still producing new material. That’s what happens when you get older.

Abby: Hi, everybody, this is Abby Taylor in Sheridan, Wyoming. I have been enjoying episode number 220, which I will nickname the Ohio episode since it has the same number as Ohio’s area code. I definitely learned some interesting facts about Ohio, but that’s not the point of my recording this time around. I would like to talk about Braille navigation. Well, for me it depends on the situation. When I use my Brailliant BI 20 with my PC, I have the QWERTY keyboard in front of me. The PC is a Windows all-in-one, it has the monitor and everything, and the QWERTY keyboard’s in front of me with the monitor behind it. Then to the right of me is the Braille display.

In that situation, the thumb keys work best so that if I’m proofreading something, I can use my right thumb to press that innermost thumb key to pan the display down. However, one of the few things I liked about the HIMS Braille Sense products when I used them was the panning and up and down keys on the left and right-hand side of the unit, because when I used those units standalone without the PC, I had them in front of me, so I could read the Braille with my right index finger, and that is Braille with a capital B, Jonathan, just so you know.

Then when I would get to the end of the line, I could use my left index finger to press the button to pan the display down and then just bring my right hand back over to the left and start reading. It made for a smoother reading experience. I have since rectified this with the Brailliant BI and also the BrailleNote Touch Plus, which also has the thumb keys, by reconfiguring the innermost thumb keys for panning the display so that the one that goes down is on the left. That way, when I get to the end of a line, I can use my left thumb to press that left innermost key to go down a line. That also makes for a smoother reading experience.

I enjoyed the information about the Optima Braille computer, and I will definitely be looking at that, maybe getting one myself when it comes out, but I am disappointed not to know what the price will be. I guess they can’t tell us, or maybe they won’t, who knows? Anyway, what I would suggest, and I will definitely maybe send them an email, is that they offer monthly payment plans so that those of us who can’t really afford to pay $2,000 or $3,000 upfront could maybe pay in monthly installments until it’s paid for. If people only had the option to have that money taken out of their checking account or PayPal or what have you, they’re not about to all of a sudden bail and say, “Oh, I’m not going to pay for this anymore.”

To dispute it with PayPal or the bank would be just a big hassle. I don’t think there’d be a problem with people deciding not to pay for it. I think Optima should consider doing that. I wish more companies did that because I love these Braille devices, but I really hate paying so much money upfront for them. It’s not always affordable that way.

Jonathan: That is always the problem, isn’t it, that this technology has so much promise, and because there are so few units being manufactured for a small group of people, the price is inevitably higher. Orbit’s done a great job of trying to disrupt that market, but the economic realities are still real. Very good point, Abby. I agree with you. I love the reprogrammable nature of these thumb keys. Re-programmability, FTW. That means for the win, by the way.

I’ve always reprogrammed my thumb keys. The way that different people read Braille fascinates me. I’m a two-handed Braille reader. When I had my Braille Lite, so this is going way back to the 1990s, Paul Edwards gave me a good tip. “Jonathan,” he said, “if you want to speed up your Braille reading even more, reverse the advance bar so that the left side of the advance bar moves you forward and the right side of the advance bar moves you back.”

I said, “That seems weird.” He said, “Try it, you’ll never go back.” I did try it and I didn’t go back. One of the first things I do when I get anything with a thumb key is make the far left thumb key advance the display. What I find is that as I’m getting to the end of the line, I’ve processed that line, I press the left thumb key to advance, and that works well for me. I reprogrammed that on the Mantis itself, and I also reprogram it in JAWS to do the same thing, and on my iPhone as well.

Way back when, we did a feature showing you how to reprogram the thumb keys on the Mantis in iOS to get that better experience, if you do consider it better. While we are talking about the Optima, I will also mention that just after I recorded the interview with Addie and Venkatesh, and we got the transcript done and the podcast was essentially locked down, there was a press event at which Framework, which makes these modular computers where the parts are upgradeable and replaceable, announced their involvement with the Optima project. It is now confirmed, everybody’s talking about it. This is based on Framework technology, and it sounds like a great partnership.

People have commented to me on just how attractive physically these Framework computers are and that they’re kind of MacBook-like. It sounds like it’s a good quality partner to help get this Optima out the door, hopefully next year. Jeanie Willis says, “Hi, Jonathan, I really enjoyed the interview with the developers of the Optima. I’m quite excited to see how this looks when it comes out and what the price is. As you know, Braille is for me a new skill that I’ve only learned by touch this last year at the age of 50, and I only got my first Braille display a few months ago.

It is the Brailliant BI 40X, and of what is currently on the market, it is I’m sure the best option for me, and I love the way the Braille feels, how quickly it refreshes, and how quiet it is, BUT,” and that but is all in caps, in case you missed it. “I am really surprised and quite taken aback as a person who has used Windows for all my adult life, at how little this device can do, and how badly it integrates with everything else.

To me, it seems like I have been put back 35 years to a device like the word-processing typewriters I had in the ’80s. I seem to be the only one that is astounded that these devices have absolutely no formatting in their Braille editor. I can’t even put in a page break, and most of the formatting I do put in manually, after counting down 25 lines to make sure it will line up correctly when embossed, has a nasty habit of vanishing when you exit and reopen the file. There is no spellcheck, no hyperlinks, no functional contents or indexing.

I can’t believe I have huge reference books that have to be navigated around manually with various page-finding and go-to edit fields. Most of the page numbers listed in the contents or index don’t even match up with the display page numbers, as it doesn’t account for the pages at the beginning that have been labeled with Roman numerals. Then there are the storage systems that don’t sync with any cloud storage.

I find myself back a few decades in a world where it is necessary to try and keep track of which version on which device is the most recently edited, and which attachment from an email was or wasn’t put across onto the display storage. The thing I’m finding most crazy is that I seem to be the only one who thinks this is archaic. I guess coming at it new from using Windows, everyone seems to have more reasons why it isn’t realistic to expect a Braille device to do any of these things than they do expectations of improvements.

My Brailliant pretty much lives permanently in terminal mode, and this has by no means solved all the issues, as there is so much that NVDA just won’t do, and in computer Braille, which I use for music, everything but Duxbury seems to have compatibility issues. I’ve also found that in terminal mode, having to keep my left hand on the display and right hand on the PC keyboard arrow keys, as the display thumb keys don’t move it around, has no ergonomic solution, wherever I put the two keyboards.

As a teacher, I really hope that they were wrong about students being taught on these displays, and having little experience of the Windows world or Apple with VoiceOver, et cetera. I am quickly becoming a huge advocate for Braille, but not at the expense of other skills, because they have been locked in on a display with such limited capabilities. I just love the philosophy behind the Optima and hope that in time it can live up to the dream. PS. Looking forward to the day when I can read Braille fluently enough to read aloud like you do on your podcast. I am still reading like a junior school reader at this stage, taking a second or two for each word.”

Oh, and there’s a disclaimer at the bottom of this, “All uses of the word Braille in this document should be considered to be written with a capital B. If any aren’t, it is a typo.” Well, I’m glad we cleared that up, Jeanie. Oh, and before I comment, there’s another email which says, “I forgot to mention in my previous email, I’m not really into thumb keys. Maybe it’s a pianist thing, but my wrists sit up higher when I’m reading, and I find to use the thumb keys, I have to drop them a bit and crunch inwards. I have reconfigured mine so that I have the pair on the right for back and forward, so my left hand can stay on the Braille while the right hand does the thumb thing.

However, they don’t work in terminal mode while I’m in a Braille editor like Perky Duck. So I want arrow keys on my Braille display, and a nice little panel with the four arrows in their usual configuration, maybe with a button above the left and right arrows to go to the end or the beginning of the line, would be perfect, just like any other keyboard, and it would of course work in terminal mode. I agree, though, that the ones at the end of the rows are not as good, as you shouldn’t have to take your hands off what you were reading.”

Thank you very much for sharing your experiences, Jeanie, and hang in there regarding the speed. I think the more you read, the faster you will get, and I just think it’s awesome that you’re being so persistent with it. I feel a bit sad though, because I feel you got the wrong product. What you have is a Braille display, and it sounds like what you want is a note taker, and they are quite different.

In earlier times, there would be no confusion like this, because there was a time when what we call Braille displays were completely unable to do anything of use unless they were connected to another device. There wasn’t even a need for a terminal mode because that’s all they did, you just plug them in via USB, or maybe Bluetooth, you use a screen reader on your computer, or latterly your smartphone, and that’s all you did with a Braille display.

Eventually, manufacturers started to try and differentiate themselves from other Braille displays, and they said, “Look, we can add some value here. We will add a basic editor, like a scratch pad type thing, to your Braille display.” When you get one of the HumanWare lines, you’ve actually got quite a menu of choices there. You’ve got the book reader, you’ve got the editor, various other functions, and to be honest, that’s pretty bare bones. That’s not pejorative or anything, because HumanWare will tell you that these are very bare bones, basic editors. They’re designed for taking quick notes; they are not designed for writing complex documents with complex formatting.

If you want to do that in Braille, that is absolutely possible, but you need a different category of device, which for some reason, after all these years, we are still calling note-takers, I have no idea why. I think I’ve tried to change that nomenclature when I was a product manager and just didn’t manage to get traction on it. I think the term note-takers is misleading, and actually undersells what these products do.

If you were using a BrailleNote with KeySoft on it, you would have the ability to do headings, to check your spelling, to insert page breaks, to do all of those things that you’re talking about. I’m sure that the HIMS products in that note-taker space would be exactly the same, that you’d be able to do all those things. Now they are more expensive than a Braille display, but that’s why, because you’ve got an integrated suite of software tools that are full word processors. It’s those products that are being used extensively in the education market, not these Braille displays with very basic onboard functions.

I don’t know whether it’s too late to maybe see if you can evaluate a note-taking device, but from what you’re asking for, it sounds like you would be far happier with one. You may also be happier using JAWS as your screen reader. I am not an NVDA user, but I have seen from people that I trust repeated comments that the Braille support in JAWS is far more advanced. You can certainly, in JAWS, reprogram those thumb keys to do what you want.

I have my inner thumb keys acting as arrow keys, for example. I can move by Braille segments with the outer thumb keys. My inner thumb keys are programmed to emulate up and down arrows so I never have to take my hands off. Although, of course, I’ve now got the Mantis, so I’ve got a QWERTY keyboard and the arrow keys are right there. If it’s possible to change those assignments in NVDA, I’m sure that we will get some instructions on how that is done.

While we’re on the subject of Braille displays, Chris Westbrook says, “First, I really enjoy your podcasts, looking forward to Episode 570 so you can cover my area code. I thought I would give my thoughts on Braille displays for whatever they are worth. I very recently switched from a Focus 40 5th generation to a HumanWare Brailliant BI 40X. One reason I did this was because of the frequency of repairs needing to be performed on the Focus, which is out of warranty.

Also, I have had trouble connecting the Focus with my iPhone. Sometimes it will connect flawlessly. Sometimes I will have to reboot the phone in order to get it to connect. At first, I thought maybe it was because I am using a cochlear implant paired with my iPhone, but this is a problem with or without the implant being connected. The Brailliant always connects flawlessly, so I’d be interested to hear if others have this issue. I love the Brailliant, the thumb keys are great. I like the feel of the Braille, and being able to download books is a nice touch. I also have an Orbit Reader 40 but found it too noisy and slow.”

Automated: Mosen At Large Podcast.

Apple issues and comments

Bryant: Hello, Jonathan, and all the Mosen At Large listeners, this is Bryant. It’s been a while since I have contributed to this podcast, but I assure you, I still listen to you on a weekly basis, and this is still one of my favorite podcasts to listen to. I’m having another strange issue with my iPhone that I’m hoping you or someone else can shed some light on because this is something that I have never experienced in any other iPhone. I have the iPhone 14 Pro Max that I received as a Christmas present last Christmas. Ever since I’ve had this iPhone I’ve been experiencing a rather bizarre issue where the iPhone is not talking for a little while and then you do something on it.

The first little bit of the sound is cut off. You might remember in Windows a lot of users were experiencing similar issues where the sound would cut off when the sound card wakes up. This is happening to me on the iPhone with both my hearing aids and with the internal speaker, and it doesn’t seem to be an issue that happens all the time. It’s happening, for example, where VoiceOver will say “messages”, and the M might be cut off because the iPhone has gone to sleep. Or well, not gone to sleep, but it will not have talked for a little while. It’s almost like the sound card has gone to sleep and is waking up again.

Jonathan: That does not sound good, Bryant. I’ve not seen this. I’ve got an iPhone 14 Pro Max and I use it extensively and I’ve not seen this, but I wonder whether it’s text-to-speech engine specific, so it would be good to know what TTS engine you have. Stephen says, “Hi, JM.” Yes, that’s what they call me at work a lot as well. “My friend Lynn Malif loves your show and now has me listening to you here in the Rotten Apple.” Oh, dear. New York City. “I have RP and am mostly blind, but more on that later. My 2020 SE iPhone has a bug that I am told others have experienced.

It’s when I dictate a message on my iPhone, and if I fumble a bit or hit the screen wrong, I have yet to figure out why this happens, VoiceOver turns off or can’t be heard. I get a lot of clicking as I touch the screen. It’s very frustrating because Siri insists VoiceOver is on. Usually, I power off to fix it, but I have discovered a workaround. If you tell Siri to set media volume to 65%, any percent above 50 works, it will fix the bug and VoiceOver will again start talking to you. If that doesn’t work, tell Siri to turn VoiceOver off, then say set media volume to 79%, then turn VoiceOver back on and it will be fixed.

Apparently, something makes the media volume go to 6%, and that’s pretty inaudible. “Great show,” says Stephen, “keep on talking. PS. I was almost completely blind but started to take curcumin for my COVID infection because I am a health nut. My wife thinks I’m just a nut, and after seven weeks on a 250-milligram curcumin capsule, I unintentionally got some eyesight back from my 68-year-long battle with incurable retinitis pigmentosa. I have been on it for over a year now and haven’t gotten much more improvement, but the 5% vision I got back is enough for me to feel gratitude.

I mentioned this just in case somehow someone told you about this and you didn’t believe them.” Well, thank you for writing in, Stephen in the Big Apple, and I hope you continue to enjoy the podcast, and thank you so much for listening. “Hi, Jonathan, this is Ray Williams and I just wanted to take a minute and give you my thoughts on the Apple Watch Ultra. I have one and I absolutely love it. I spend a lot of time outdoors, especially since I live in Sarasota, Florida. We have lots of warm weather here so I can do a lot of exercise outdoors as well as other adventures.

I purchased my Apple Watch Ultra back in October 2022 and also like you, when I put it on my wrist, I noticed how much bigger the watch actually is. I’ve since gotten used to that size and I love it. I love the fact that the speaker is much improved and one of the things I’m going to be doing with it is going diving. The haptic feedback is very much pronounced even more so than in my Series 7 that I have, which is a 44 millimeter. I hope you enjoy wearing your Apple Watch Ultra. I’m sure you’re going to get lots of use from it and have an awesome day and I absolutely love the podcast.”

Thank you so much, Ray. Always good when you take a punt on a purchase and it works out, and you’re happy with what you have. It’s amazing how quickly humans adapt to things. The other day, Bonnie left her Apple Watch charger at work and she said to me, “I’m going to hand you my watch and your mission is to put it on charge on your charger.” Of course, I said, “Okay dear.” She handed me her little Apple Watch and it is tiny. It’s the smaller one of the series. I think she has a Series 6. Man, you just get so used to what you’re working with every day. I thought this thing is tiny compared to my Apple Watch.

I guess for me, because I’m wearing this thing every day, it’s become the new normal and it’s not onerous or anything like that. I don’t feel like I’ve got this big weight on my wrist, and as someone with a hearing impairment, I do like the speaker, particularly since my made-for-iPhone hearing aids can’t pair with the Apple Watch yet. The battery life, I just wow over that still. Now someone who is not wowing over something Apple right now is Cornelius. He says, “I was listening to your podcast during the weekend and you discussed iOS 16 bugs. I’m not sure if what I’m going to share with you is related to iOS.

I’m running an old iPhone, which does not support iOS 16. It is an iPhone 6s Plus to be specific, and the last update for this iPhone is iOS 15.7.3. One thing though, I can’t navigate anything on the create call links screen in WhatsApp. I’m sure that I am running the latest version of WhatsApp, and each time I go into the calls tab and click on create call links, VoiceOver won’t read anything in there. I’m also unable to click the close button to exit from the screen, and the only way is to close WhatsApp from the app switcher. I’ve checked with a friend who is running a newer iPhone on iOS 16.3, and he seems to be able to navigate the call link screen on WhatsApp.

If you have an older device running iOS 15.7.3, would you be able to test out whether WhatsApp’s call link screen is accessible? Thank you.” Well, thank you for writing in Cornelius. It’s good to hear from you and sorry to hear about that frustration. I’m afraid I don’t have an older device to test this with, but perhaps somebody else might. Maybe somebody who’s actually running iOS 15.7 because I know there are some who haven’t upgraded to iOS 16 yet, and they’re quite happy that way. We may have a few iOS 15 listeners who can check this out for you. If they’re using WhatsApp, does that create call link screen work for you?

Then if it doesn’t, I suppose the next question is, well, is this an accessibility bug, or is the create call screen not working for anyone, blind or not, on that iOS? If that’s the case, I guess it’s got a better chance of being fixed.

Jim: Hey, Jonathan, it’s Jim from sunny Florida. I hope you’re well. I just did my update on my iPhone 13 to iOS 16.4. Got to love all these different updates, and just a couple of things I wanted to point out to you. I had some trouble doing the update. I tapped the check box and it just sat there; I had to tap it three times before it proceeded with the update when I was on 16.3. It finally went. Then I wanted to share that when I did the update, it started speaking other languages after it finished, like when you get a new iPhone. I had to swipe up from the bottom, which for many of us is a challenge with flat screens and such.

Anyway, I was able to successfully get through that. Then I noticed, though, that the update setting automatically defaults to update automatically, which I don’t like to do because I’ve had some nightmare stories, which I’ll tell another time, about what happens with updates. I’m sure many of us have had those, but I also saw a secondary setting where you can check on or off to update for security and a few other things. What I’ve done for me, but I’d like to see if others could expound on this a little bit, is I turned the first one, automatic updates, off, but the second one, about the security updates, I left on.

My big concern obviously is accessibility. Being blunt, with the updates I’ve lost some access; things haven’t worked right after the updates. I always like to research before I do an update and wait a few days, but I wondered what your thoughts were as far as the secondary update section. To me, it’s new. At least I’ve not seen that before, where you can turn on or off the security updates. Obviously, in this day and age, we want to turn that on whenever we can because we want to be as secure as we can.

Jonathan: We do indeed, Jim, and I think it is advisable to keep those security updates enabled in that setting. No matter what one thinks about iOS updates, those security updates tend to be small and they tend to be quite important. I must be one of the lucky ones because I don’t think I have ever, and I am knocking very firmly on the wood here, I don’t think that I have ever had an issue with iOS updates since automatic updates were introduced.

I realise that some people may want to hold off on updates because of bugs in iOS, and there are still some people who are running iOS 15 as a result, even though we are nearing the unveiling of the first beta of iOS 17, which will happen in June. I get that. For me, the update process works flawlessly. I do wish that there was some feedback when the phone was updating. Now, in Windows, you can run the [unintelligible 01:31:22] and you can get some pretty good feedback most of the time when you’re doing a big Windows update.

It’s been a while since I did a Mac update, but I’m pretty sure you can turn VoiceOver on when macOS is updating and get a little bit of basic feedback about what’s going on. To a blind person, when an iPhone’s updating, it’s gone. You’ve got no way of finding out how the update is progressing, and that’s something I would love to see change on iOS. iOS 16.4 really doesn’t seem that exciting to me. I’m disappointed that the hints bug is still not fixed. Maybe I’m the only person who doesn’t leave the hints on all the time, but I have my hints disabled by default.

There are some apps where they actually provide some interesting contextual information and in those apps, you’re now able to enable hints just for where they’re useful. Right across the operating system, I have them disabled, but it doesn’t make a scrap of difference because you still get the double tap to open thing, and all those hints even if hints are off. It’s been like this for several versions of iOS, I am perplexed by why this feature has just stopped working, the ability to disable hints. I don’t know whether I’m the only one that points it out.

While we are on the subject of new Apple things, it is worth mentioning that Apple Music Classical is out, and I’m really delighted by this app. It’s super accessible and it’s available to you at no additional cost if you already have an Apple Music subscription. Why is there a need for a completely different app for classical music? Why? Apple has answered this question and they’ve said the answer really is metadata. When you are searching for classical music, you might want to search by composer.

There may be several names that a work has. For example, the Moonlight Sonata is the common name that we give to that sonata, but you could also give it its proper name, which is the Piano Sonata No. 14 in C-sharp minor. All kinds of ways to search. You might want to search by a particular conductor or by an orchestra, all kinds of stuff. The metadata that they’ve taken the trouble to assemble in Apple Music Classical is super.

Also, if you have the gear to enjoy the spatial audio, there are a good number of works available in spatial audio, and that’s fantastic. At one level it is, and at the other, it’s scary, thinking, how did this orchestra fit in my living room? If you’re a classical music fan, and you haven’t checked out Apple Music Classical yet, and you have an Apple Music subscription, it is well worth checking out. Thanks to Apple for taking good care of the accessibility on this one.

Automated: Mosen at Large Podcast.

Wanting to improve my English

Jonathan: Leal Ben Simon is writing in and says, “Hi, Jonathan, and everyone. I really enjoy your podcast. I have been studying English for several years, and I wanted to ask you how, in your opinion, I can maximise the learning process as a blind person. What tips can you and your listeners give me? I’m talking about all aspects of the language. Thank you for your lovely and fascinating podcast.” Thank you very much, Leal, I really appreciate hearing from you.

Now, I remember in my days of being a shortwave listener, there would be courses that would teach English, and the Voice of America used to have this thing that they called Special English, where they would read the news at a slower pace and perhaps simplify the language a bit. I don’t know whether people considered that patronising, or helpful. If you have learned English as a second language, or perhaps you teach English as a second language, and you have some tips for Leal, then please be in touch, jonathan@mushroomfm.com on the email, attach an audio clip if you prefer, or you can call the listener line in the United States.

864-60Mosen, 864-606-6736.

Keeping TalkBack speech on my Android device

To Hungary we go, where Peter says, “Hi, Jonathan. Have you, or your audience, any idea how to make TalkBack speak separately from all other sounds and voices on an Android phone? For example, I would like to make TalkBack stay on the phone’s own speaker when the device is connected via cable to my JBL loudspeakers through an amplifier. I haven’t found a way to do this and I searched the web, but there may be a trick. If there is one, please reveal it to me. My phone is a Nokia G20 with Android 12. Thanks in advance.” Thanks, Peter, hope all is well in sunny Budapest. Let’s see if any of the Android users out there can help us with this.

I don’t think you can do it with cable, but with iPhone, you can certainly do it with certain types of wireless connections. Perhaps it’s possible on Android as well.

The Braille Doodle, a book on disability justice, and remembering Judith Heumann

Here’s Matthew Bullis. He says, “Hello. I thought I’d alert you, and other listeners, about this project, which I hope will reach its funding goal in the middle of April. It’s called the BrailleDoodle. From what I understand, it’s a frame with a pen where you bring magnetic balls up to the surface, which will help you draw, or write Braille with an included overlay. The design says it’ll be seven lines down and 14 cells across. The Kickstarter price is $70 for one BrailleDoodle, with options to buy more or donate elsewhere.

If listeners Google BrailleDoodle Kickstarter, the link will come up. You have no doubt heard by now of the death of Judith Heumann, who was a disability rights advocate here in the States. Many articles have now been written in tribute, and she was able to complete several episodes of her podcast, The Heumann Perspective.” Yes, indeed. She is a shining example of the fact that the things that many people take for granted today were hard-won. They were hard-won by people taking a stand, taking risks, getting out there, and making change. She was a remarkable woman, a remarkable activist, and many of us around the world have a lot to thank her for.

“The final item,” says Matthew, “I’d like to let you know about is a book I just discovered, which was released in May of 2022. The author is Amy Kenny and the title is My Body Is Not a Prayer Request: Disability Justice in the Church. It is available in several accessible formats. If the listener is not religious, what you will get from this book are ideas and conversation starters which can be used to combat the religious model of disability. If the listener is Christian, you will get the same out of this book, but with scriptural support. The author not only places disability within the realm of church worship and engagement, but deals with disability in the framework of the wider community.

One thing I learned from this book is that the churches were the worst and most hostile adversaries to the ADA. I personally have passed this book on to my church elders, and I will be hosting a book club this summer. The author has appeared on some podcasts, so she may be willing to appear here if you want to reach out.” Woo, boy. Thank you very much, Matthew. That might be an interesting interview. www.mybodyisnotaprayerrequest.com. That’s a nice long URL, but it’s all joined together.

No dashes, or anything like that. www.mybodyisnotaprayerrequest.com. I got to read this. Thank you, Matthew. I appreciate it.

Accessible wireless mesh recommendations

Christopher Right is in touch once again. He says, “I’ve heard a lot of good things about mesh Wi-Fi, and would like to try it, assuming I can get a system that’s not horrendously expensive. Can this be done for, say, less than $200 or $300? What devices would you, or your listeners, recommend that have accessible management interfaces and Wi-Fi 6? We currently use an AT&T BGW210-700, and the Wi-Fi speeds are okay but not great. For Wi-Fi 4, I usually can’t get speeds faster than 50 to 100 megabits per second.

Wi-Fi 5 seems to top out around 150 to 300 megabits per second, but only if I’m very close to the BGW210-700. This is particularly sad because we have a gigabit connection, so I’d expect a little more speed. Would a mesh system increase performance? I also need a unit with enough RJ45 ports to connect a few switches so we can continue using wired connections. I’m most likely not going to go with Ubiquiti unless you can change my mind, as I had a really bad experience while using some of their equipment at World Services for the Blind. Apparently, the only way to manage their stuff is with a dedicated controller device, and you’re screwed if you lose the configuration, resulting in a complete factory reset of all devices to restore management capabilities.”

Thanks for the email, Christopher. Wireless mesh systems are consumer grade and designed to give people reasonably reliable internet. With a gigabit connection, your optimal solution will be achieved by laying Cat 6 cable all over the place and getting network outlets where you need them, and then plugging switches or wireless access points directly into those outlets. That’s what we’ve done here, because the fastest internet connection we have available to us now is 8 gigs down and 8 gigs up. If you really want to maximise very high speed connections like that, then the Wi-Fi is going to be the bottleneck.

Obviously, the internet is only going to be as fast as its weakest link, but it sounds like you want to do this on the cheap. Just keep in mind that when you do it on the cheap, you’re going to get results that reflect that. Wi-Fi mesh systems are pretty good, and they’re certainly way better than their alternatives, the old range extenders and things, which were essentially repeaters that would degrade the signal. They have come a long way. Don’t expect super-fast speeds, but I’m sure that you would be able to see some performance improvement in optimal conditions. How much this costs depends on how many mesh points you need, and that depends on how big your house is.

You will have a base station that usually serves as a router as well, although I’m sure you could put it in bridge mode if you’ve got a router that you’re happy with, and then you need as many mesh points as your house requires. We’ve got an ISP here that decided that it would be far better to just give everybody all the mesh points that they need so that they could hire fewer tech support people, and that seems to be a working strategy. When you sign up in New Zealand with this particular ISP, they scope your house and they send you a number of mesh points to begin with. If you email them and say, “Hey, I’ve still got some dead spots,” they’ll just send you another one.

It’s quite a cool concept and very sensible, but assuming that your ISP is not like that and you have to pay, then you’ll have to do some surveying. Maybe start with a couple of points, find out if there are any dead spots, and add mesh points as required. I’m a huge Ubiquiti fan, and our whole house is kitted out with amazing Ubiquiti UniFi gear. There are actually two categories of Ubiquiti products that are relevant to this discussion, and within those there are sub-categories. You’ve got the more professional, business-grade technology called UniFi from Ubiquiti. For those who’ve not seen that written down, it is spelled U-N-I-F-I.

UniFi does use controllers, but they do now also have an all-in-one product called the Dream Machine. You can also get a Dream Machine Pro. We have the UniFi Dream Machine, and it performs all those functions of a controller and a router and a Wi-Fi access point all in one. Then we have several other Wi-Fi access points, not in mesh, because we’ve got Cat 6 all over the place and network outlets all over the place, and we just plug in the Wi-Fi access points as required. In that scenario, you can back up your configuration for the UniFi Dream Machine and restore it should you need to do that. It’s a very elegant solution.

When you add a new piece of hardware to the UniFi network, you do a process called adoption, and it inherits the SSID and the configurations, and it just happens. It’s pretty reliable on the whole and it’s beautiful. What you may be looking at in the situation that you’re talking about is another class of Ubiquiti products called AmpliFi, and that is spelt in a similar way, A-M-P-L-I-F-I. AmpliFi is Ubiquiti’s consumer-grade Wi-Fi mesh system that is very user friendly. Its iPhone app is super accessible. I actually set it up for a family member and was blown away with how impressive and accessible the AmpliFi technology is, so you could give that a look.

Another one that I’ve heard a lot of good things about just recently, in fact, on Mastodon, people were singing the praises of Eero, which I don’t believe we have in New Zealand, but blind people who I trust say the Eero app is totally accessible. You might want to have a look at that one. There is also Google Wi-Fi. I do not have any direct experience of that to know how accessible it is. Perhaps others using mesh systems can comment on their direct experience. Good luck. We’ll be interested to find out what you ended up with.

Automated: Transcripts of Mosen at Large are brought to you by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at pneumasolutions.com. That’s P-N-E-U-M-A solutions.com.

Sense Player and Victor Reader Stream

Jonathan: We have an email from John about the SensePlayer, and he says, “Do you know if you’ll be able to check the time even when the device is off? I remember messing around with the Blaze ET and noticed that it spoke the date and time, even with the unit powered off, so I wasn’t sure if that would carry over to the new players. I’m planning to purchase one in July when I attend the NFB convention in beautiful Houston, Texas. I always look forward to the podcast every Saturday, Jonathan, you do a very good job of producing the show. Take care.” Thank you, John, that’s a good question.

Not owning a SensePlayer, I can’t answer it, but we do have people who may well be listening to this right now on their SensePlayers. Hopefully, they’ll be able to give you the answer, John. Anne Byrne is writing in about the Stream and says, “I am concerned that the battery in the Stream 3 is not user-replaceable. The battery in my Stream 2 got hot while charging and swelled up to the extent that, before I discovered the problem, it had bent the connector to the Stream. The replacement battery would not snap into place. I had to hold it in position with tape. The vendor who replaced my battery without charge said that HumanWare had had a series of defective batteries, but that hopefully, the problem was in the past.

I had to replace my Stream after a couple of years because the board failed. Then I had to replace the Stream with the battery taped in place. Although I use the device daily, I feel that HumanWare’s quality control is not as effective as it should be.” You’ve had a bad run there, Anne. I’m sorry to hear about that. I do wonder whether the lack of a replaceable battery in the Stream 3 is in a way a response to these issues, that perhaps this technology is less prone to that sort of behaviour. I guess only time will tell. While we are talking about the Stream, some interesting information has come to light in recent times in quite unorthodox circumstances.

I understand we do now know that the Stream will be moving at some point to TuneIn for its radio support. That will be a good thing, I think, although I wonder how that will affect listeners in the UK, where TuneIn is not functioning well because of some bizarre legal action. HumanWare may well comment on the UK experience.

Vizling is an app that makes comics accessible to blind people

“Hello Jonathan,” says Lena. “Here is a fun app with good accessibility. Vizling is available for iOS and Android, and it’s an app that makes comics accessible to the blind. When you open the app there are three choices: tutorial, books, and about.

I recommend listening to the short tutorial, but I do not recommend turning off your screen reader. About has information about the app and its developer. Books has short cartoons. The cartoons I have listened to portray the experiences which we blind people have but wish we didn’t. For instance, the blind lady walks into a china shop. She uses her white cane and walks carefully. A large sign says, ‘You break it, you buy it.’ Behind her, a man follows closely with a broom in hand. In the final panel, she is a bull but still holds a white cane. There are only a few funnies now, but more are planned. This app is for everyone who enjoys comics and is being enjoyed by my sighted friends too.

Thanks for an interesting and high quality podcast,” says Lena.

Reading poetry in Braille

My day was not particularly sunny, until I got this email from Mani, who sent me this interesting mail, to discuss reading poetry in Braille. Well, it’s not that good, but I made it up myself, and what are you going to do? Mani says, “Dear Jonathan, I am kicking myself so hard, ouch, for not having discovered your podcast much earlier. Since the last few podcasts, I am in love with you and Bonnie and your podcast. A little background on me: I was a software engineer for 30 years before I lost my eyesight, and my hearing loss became profound. I wear Phonak hearing aids.

Since then, I did an MFA in poetry and published several of my poems and a chapbook in 2019. I also took the time to learn Braille. I am still not proficient in Braille and I’m probably reading at 10 miles per hour. My problem is with reading poems at poetry readings. The few I was invited to read at, I ended up memorising about eight to 10 poems. Of course, I became nervous and probably lost a lot of words and even lines. I could have someone read for me, but my poetry colleagues insist on hearing them in my own voice. No, I can’t use ElevenLabs because it would totally fail with my thick Indian accent. Jonathan, I hear you reading listeners’ e-mails on the podcast without a pause.

I wish I could read my poems like that. I suspect you use Braille or memorise them. Can you please share with us your secret? I would also invite other listeners to give me ideas on how I can achieve a smooth reading without all those nerves. Thank you, Jonathan, for all you do for the blind and deaf-blind community.” First of all, Mani, you’ve been on quite a journey. There have been a lot of changes in your life, and I just want to congratulate you for the approach that you’ve taken, the response that you’ve made to those changes. You are determined to live a full life. You have learned Braille at a later stage in that life, and I think you should be congratulated for that.

In terms of the way I do this podcast, I have my Mantis Braille display and I am reading the e-mails. My memory is not good enough to memorise some of these long e-mails that we receive. I do pre-read them first, and if I mess up as I will sometimes do, I can go back and edit myself. What you hear is not me reading completely live. I want to be transparent about that, but I do count myself as a very fast Braille reader. Keep in mind that I’ve been reading Braille since I was five, and that was a very long time ago now. I do have that advantage of always having done it.

I think what putting a podcast like this together illustrates is just how important Braille is. Many of the things in my life I would not have been able to do as well or at all if I wasn’t a proficient Braille reader. I am not a Braille Instructor. I have not taught anybody Braille, and I’m sure we will have people among our listening audience who have some tips for you.

When I hear this topic coming up, what I keep hearing is people saying, practice is everything. Read as much as you can as often as you can. You will eventually pick up some speed. If you have access to good quality Braille instruction, you may well have someone look at your technique and just see whether you are processing information in the most efficient manner.

I think it’s possible that you may be at the stage where you are still decoding the dots. It’s not so much the speed at which you’re running your fingers across the page or the display. It’s more that you’ve not had the opportunity yet to develop that muscle memory where you are just automatically decoding those dots into letters. I think that takes practice and time. I want to be encouraging and I feel like there will be others who are listening who perhaps learned Braille at a much later stage than I did. Due to circumstances similar to yours who can give you a lot more practical advice.

I do hope that some will come forward and offer you that advice. In the meantime, hopefully your colleagues will be gentle on you. It’s great that they want to hear your voice and that they’re stretching you a bit. We all need to be stretched a bit and get out of our comfort zone, but there’s nothing to be gained from embarrassing you either. If there is something particularly important that you need someone else to read, then maybe for now you need to do that, while continuing to improve your Braille reading speed.

Braille is the way forward. I’m sure of that for you. One thing that some people also do is they have an earbud or something with a text-to-speech engine which they have control over. They may be using the arrow key to get a line at a time, or they may be flicking through on their iPhone, so that they’re able to make sure that the iPhone or the laptop is speaking at the right speed, and they are parroting that back. I had a stunning display of the effectiveness of this when I interviewed Nas Campanella, a successful journalist in Australia who’s totally blind, working for the ABC.

She has a condition, which means that the sensitivity in her fingers has been affected, and Braille wasn’t an option for her. She perfected this technique and she’s recited news bulletins on national radio over there, very impressive. She’s gone on to do other things as well since I did that interview. In terms of your Braille reading speed, let’s see if others who have either direct experience of what you are going through learning Braille later in life, or some instructors who’ve perhaps taught people in a similar position to you might have some practical things to offer.

What I would offer you though, is encouragement. Hang in there. You’re on the right track. Keep reading, read for pleasure, read whatever you need to read to just get that muscle memory working and that speed increasing.

Blindness and literature

“Hello,” says this e-mail. “This is Anexis from New Jersey. Congratulations on the arrival of your granddaughter. I’m glad everything went well.” Thank you so much. “I found your podcast a few months ago, but this is the first time I’ve chosen to write. I’m a 23-year-old blind college student, and I’ve really enjoyed the different conversations you’ve had in your podcast. I especially enjoy the community aspect.

I’ve really been enjoying the conversation about reading, and the different ways blind people choose to read. I read books in Braille with an uppercase B, and listen to audiobooks. I don’t have a Victor Reader Stream, but I used to have one in high school. As tech savvy as I am, I enjoyed the simplicity of the device, but it no longer holds my interest, especially now. I don’t like that with the new generation of Victor Reader Streams, customers have to send their devices to HumanWare to replace the battery.

I have no doubt that HumanWare will charge them an unreasonable amount just to replace the battery. To listen to audio books I use my iPad mini. I put my device on ‘Do not disturb’ if I don’t want to be interrupted. I used to have a setting that would allow calls to go through if someone called twice, but I’ve recently turned that off. I’m very happy you’ll be interviewing a blind author. I’m predicting a very interesting conversation about blindness representation based on what you said. I’m very excited and curious to see what this author says about this.

As an avid reader and published author myself, I think blindness representation is important, but not just showing them doing everyday things. I think it needs to be extended to different genres. I read a lot of fantasy and romance for the most part, and I haven’t really seen a blind main character. This is something I want to see more of eventually. In my stories, I strive to write well-rounded blind characters. Blind people need to see themselves in the books they read, no matter the genre. I write based on my experience, of course, but I hope at least some blind people see themselves. I’m very curious to hear your thoughts on this.

I enjoy writing blind characters, so I would love to know how others in your community think as well.” Thanks, Anexis. It’s great to hear from you. Thank you for a thought-provoking e-mail. I agree with you. I think disability generally needs to be more widely integrated into literature by people who know what they are talking about. It’s disappointing when you can be reading a book where somebody hasn’t bothered to do any kind of research whatsoever on disability, and they depict a particular impairment in a pretty inaccurate way, and it can totally spoil the book.

That also means not only that we need to see blind characters and others in a variety of genres, but we also need to promote disabled writers. One of the best and most accurate depictions by a non-blind person of a blind person was Robert J. Sawyer’s depiction of Caitlin in the trilogy called WWW, and the books are Wake, Watch and Wonder. If you’re a sci-fi reader and you haven’t read Robert J. Sawyer’s WWW trilogy, it’s a very good read, actually, particularly at the moment as we grapple with AI and what it means.

Because the whole premise of this trilogy is that the web becomes sentient. It’s a good read. It was published back in about 2008, 2009, and I had been a Robert J. Sawyer reader for a long time before that book came out. It was an absolute pleasure to interview him and get to speak with him. I’d love to hear from you. If you have any comments you want to contribute to the show, drop me an e-mail, written down or with an audio attachment, to jonathan@mushroomfm.com. If you’d rather call in, use the listener line number in the United States, 864-606-6736.

Automated: Mosen at Large Podcast.

[01:59:54] [END OF AUDIO]