Transcripts of Living Blindfully are made possible by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at http://PneumaSolutions.com.
Advertisement: Chris Peltz here with the Blind Grilling Experience – where we talk all things cooking, grilling, and barbecue; tips, tricks, and techniques for the blind and visually impaired, all things accessible technology centered around food.
If you like brisket and breads, you like pizzas and pies, folks, we will leave you hungry and wanting more.
Check out the Blind Grilling Experience on your favorite podcast app, or visit our website at BlindGrilling.com.
Voiceover: From Wellington, New Zealand, to the world, it’s the Living Blindfully podcast – living your best life with blindness or low vision. Here is your host, Jonathan Mosen.
Hello! This week: Audible support on the Victor Reader Stream 3 is not happening soon, we want to hear from you on old apps you’d like to use, TV and white cane recommendations and anything on your mind, and we get an update from the NFB convention on the Monarch Braille device, which many are calling the Holy Braille.
We’ve already made it to episode 241. Doesn’t time fly when you’re having fun?
Now, area code 241 in the United States has not been assigned. It is allocated apparently for general purpose use, so one day area code 241 may be a happening thing. But in the interim, I see warnings on the web that if you get a call from area code 241, it is likely to be a dodgy nebulous thing and you probably shouldn’t pick up because the area code doesn’t exist.
On the other hand, the country code 241 most certainly does exist, as does the country. It belongs to Gabon, where there are about 2.5 million people according to the 2023 census. Its official name is the Gabonese Republic. It’s in Central Africa, on the continent’s west coast.
And that’s about all I have to say about that because I am using the beta of iOS 17 at the moment, finally on my primary device. And when I said to it, “Tell me about Gabon”, it told me that much. And then it said, “Shall I keep going?”, and I said yes. And it said, “Sorry, I don’t understand.” So I’m not going to be able to tell you any more about Gabon or the Gabonese Republic. But if you’re listening from there, a warm welcome to you.
Speaking of iOS 17, developer beta 4 is out, and I have installed that on my primary phone.
It’s in relatively good shape, actually, but I’ve been holding back on installing it on my primary iPhone because there’s been this glitch affecting Mantis Braille displays, at least. I don’t know whether it was affecting others that were using HID Braille display drivers. But when you went into any edit field previously, if you tried to type, you couldn’t. So if you’re trying to send a text message, or an email, or a toot on Mastodon, [toot sound effect] (Oh, there it goes now.), you couldn’t because you weren’t able to type anything in. So that was a bit of a show-stopper that prevented me from putting it on my iPhone 14 Pro Max.
But that one’s just been fixed in beta 4, so I am able to install it. And we might have a look at some of the new features in iOS 17 in subsequent episodes, now that I’ve got it up and running.
It is a beta, and that means that you can expect bugs at this stage of the cycle. That is perfectly appropriate and perfectly expected.
One of the things I am finding is that even though I can now type in these edit fields, sometimes the Mantis that I use is seen as a keyboard just fine, but Braille often doesn’t come up, and I have to go to a lot of effort to make it appear. So it is a little bit idiosyncratic, a bit fiddly, and a bit frustrating at this stage of the cycle. But hopefully, that will improve as we get closer to release.
It is curious to me (and it’s a beta, so it’s just an observation, not a complaint) that notifications break in almost every year’s iOS beta cycle. I don’t know what it is about notifications that makes them so problematic. But what you typically find with early betas of iOS in any year is that if you get a lot of notifications and you’re scrolling through in the morning, you can’t get to them all very easily because something to do with the infinite scrolling breaks.
For example, I woke up this morning with about 33 new notifications, which is actually quite a low number for me. It can be more, depending on how much news is breaking in the world. I scrolled a little bit, and then found I couldn’t scroll any further without a lot of flicking left and right and things to try and move the thing forward. So that’s another issue. But it is often fixed before release. So hopefully, it will be in this case as well.
How are you finding iOS 17 as the cycle progresses and we get towards what’s expected to be a September launch? More and more people are jumping on the bandwagon. So if you have any findings, what are you liking? Have you tried Personal Voice? Have you tried Point and Speak? How are some of those big new features that were talked about at the WWDC conference? Are they working out for you okay?
And as I say, we will delve into some of these in coming weeks here on the podcast.
I want to give a shoutout to Pneuma Solutions, who make it possible for us to provide transcripts of every Living Blindfully episode.
Advertisement: Remote Incident Manager is taking the blind community by storm, and for good reason.
But don’t just take my word for it. Look at some of the organizations who have deployed this easy, fully accessible way to give and get remote assistance on your computer. RIM is used by Perkins School for the Blind, the national office of the National Federation of the Blind, Northern Arizona University, the Wisconsin Council of the Blind and Visually Impaired, and a growing number of other organizations around the world.
Getting two computers together doesn’t need to be difficult. PC and Mac can even talk to each other fully and accessibly.
Join those of us using RIM today by visiting GetRIM.app. That’s get-r-i-m.app.
While we’re talking about iPhone things, we had a really cool discussion a few weeks ago on Mastodon. And if you are on Mastodon, then you can follow this podcast at podcast@LivingBlindfully.social. That’s podcast@LivingBlindfully.social. I share some interesting tech news and other links there. And sometimes, we have some discussions.
A few weeks ago, we had a discussion about old iPhone apps that no longer exist that we remember with fondness. And it has been my intention to bring this discussion to the main podcast for those who don’t participate in these Mastodon discussions because it was a great discussion, and the iPhone’s been around long enough now that we do have fond memories, hopefully, of apps that were once in existence that aren’t in existence anymore. And I thought this would be a fun topic.
Blind people have been using iPhones since VoiceOver came to the 3GS in 2009, and the numbers have only increased ever since. So there’ll be apps that many of us remember using that don’t exist anymore.
Let me do some starting off here, and reference the Mastodon thread where we had some great discussions, and see if anybody has memories of any of these apps or can think of others.
One of the things that I first remembered when I started thinking about this topic was Boxcar. You had to be a pretty early adopter of iPhone to have used Boxcar, but this was an app that pushed all sorts of things. You could connect it to Twitter before it got weird. [laughs] You could connect it to all sorts of other things, and it would push notifications about all sorts of things to you.
There was another Twitter app that was really popular in the blind community, and I understand it was made by a husband and wife team. I don’t know what became of them. It was called TweetList. That was around for a long time, the old TweetList, and even when it was no longer being developed and Twitter moved on and added features, a lot of people stuck with TweetList because they really liked the user interface.
One of the downsides for me about TweetList was I don’t think they ever got around to introducing push notifications, which for me was a pretty essential feature as somebody who got a lot of notifications.
But it was a great app, and they were very responsive when they were doing their thing. Good old TweetList.
In terms of the apps that we now enjoy like Be My Eyes, Aira, and other things in that space, there was an app called VizWiz, and I believe that VizWiz may have been the first app of this kind where it sought to identify things for blind people. And my memory’s a little hazy, so perhaps others have more detailed memories of this. But I think it was some sort of university project, and the idea was that you would send a picture, you would ask a question, and you’d get a variety of answers back from volunteers who got your question. And if they had the time to take a look at the picture, then they would do that and give you your answers. You could also share your picture on social media and get answers back that way. So I think that’s what VizWiz used to do, and it was pretty exciting when that came along.
Of course, the KNFB reader is still around. But I remember how significant it was when the KNFB reader arrived on iPhone because I had a Nokia N82, which I really liked.
I think the Nokia N82 for me was the sweet spot of the Symbian era. I had the N86 later. Didn’t like the N86 as much as I liked the N82. There were various things happening in Symbian land by the time we got to the N86 that meant it wasn’t as good an experience.
The N82 with the KNFB reader, it was a little bit slow. But it was pretty cool to have that kind of technology in your pocket available to you at any time. And nothing on iPhone matched the kind of quality you got with KNFB reader.
I think there was an app called Text Detective that was attempting to do this way back in the early days of iPhone. And it sort of worked, but it didn’t work nearly as well as the technology that we have these days.
And when KNFB reader came along, of course, that was a significant game changer. It was pricey, but we were willing to pay because in those days, that’s how you got access to really decent OCR on your iPhone.
Here in New Zealand, we just use standard Daisy books, which is great. There’s no DRM or anything associated with them. No special audio format that’s particularly unique.
So there was an app called NDaisy that I used to use quite regularly when I first got an iPhone to put Daisy books on my phone and use it as a talking book player. NDaisy was a pretty powerful Daisy player, and I’m not sure that there’s anything that has quite emulated that since. And I was disappointed when that one disappeared. Anybody else remember NDaisy?
When I raised this on Mastodon, others brought up all the games that they’ve played over the years on their iPhones. And somebody mentioned a game called Orify that was available way back when in the early days of iPhone. That was very popular.
That reminded me that I had a game called Zany Touch. I think Zany Touch might still be around. It was kind of like a Bop It type game.
And there was an official Bop It app for a while. That was pretty cool. I think it had some VoiceOver issues, or maybe you had to disable VoiceOver. Perhaps that was before the days of direct touch in VoiceOver, or it hadn’t been implemented. But the official Bop It app was pretty good. Not sure that that’s around these days.
People also talked about how easy it was in the early days of iPhone to share content with Facebook and Twitter (the artist formerly known as Twitter, anyway. It’s now called X.) [laughs] because it was all integrated with the phone. It was all part of the sharing facility on the phone.
And somebody also commented on the very accessible, simple YouTube app that Apple developed that was built into early iPhones.
So I won’t go through this whole Mastodon thread because I’d like to hear from you. Do any of these apps bring back memories for you? And are there other apps, games that you used to play that you no longer have access to because the developer has moved on and pulled those games? And of course, that can happen in a thriving community of independent developers.
But it’s a fun topic to reminisce, isn’t it? People like tech reminiscing. So if you want to talk about early iPhone apps and what it was like to start to use those on your mobile device, this sounds like a cool thing to get into.
You can also call the listener line and leave a voice message there. I just would encourage you to do it on a phone that’s got some decent audio and doesn’t sound really horrible because that all helps us to hear, especially those of us with hearing impairments. 864-60-Mosen is the number in the United States if you want to use that method. 864-606-6736.
I’d be pleased to hear from you on this, and any other topic you want to raise.
Voiceover: Stay informed about Living Blindfully by joining our announcements email list. You’ll receive a maximum of a couple of emails a week, and you’ll be the first to learn about upcoming shows and how to have your say. You can opt out anytime you want.
Join today by sending a blank email to announcements-subscribe@LivingBlindfully.com.
Why not join now? That’s announcements-subscribe@LivingBlindfully.com, and be in the know.
Alco Canfield is writing in on a couple of matters.
“I just started using a titanium cane which does not telescope. It is as light as the carbon fibre ones and has a metal tip. I feel much more secure with it.
In the past, I used a carbon fibre one. And when it shattered, it really shattered.
The man who makes this cane is Don Fidone,” (that’s F-I-D-O-N-E), “and his email address is DCanes2019@gmail.com.” (That’s D for delta, canes, 2019, all joined together, @gmail.com.)
“The cane costs $31.”, Alco continues.
“I use a Brailliant with my iPhone. And finally, with the new 16.6 software update, it doesn’t lock up when I get to the next page when reading a Kindle book.
This has been a source of great frustration for me, and the last 2 updates did not fix the problem. I am glad it is finally working now.”
Me too, Alco, me too. It has broken before. Let’s hope it doesn’t break again [laughs] for the foreseeable future.
Voice message: Jonathan, this is a cloned Roy Nash voice.
I use an iPhone, and I’d like to post on Mona. There are a number of videos that I would like to post.
However, I don’t know how to put descriptions of the videos on Mona so as to make them accessible to everyone.
Can you help me? I suppose the better question is, would you help me? I know you can.
Jonathan: [laughs] No pressure there, Roy.
First of all, Roy, that has got to be one of the biggest ElevenLabs [laughs] fails I have heard to date. Because if anybody has heard Roy contribute to this podcast before, you’ll know that he’s got that wonderful southern accent going on. I hope I haven’t just planted you somewhere you’re not, Roy. But I believe you’re from Arkansas, and the accent confirms that you’re from that part of the world. [laughs] You do not sound like you’re from Arkansas with that clone from ElevenLabs.
Anyway, adding a description is pretty straightforward. Just bring up the new Toot composition screen and attach a photo or video in the normal way. So you will double tap that attach a photo or video button, scroll around for the video that you want.
There are various ways to get to this. For example, you can choose the albums button. And then, you can narrow it down to video and hopefully, that will assist you to locate the video that you want. You double tap, and voila! (That’s the actual French.) Your video is attached to the Toot, ready to send without a description.
So to add the description, you then locate the video and double tap it. You’ll then find some controls.
You can play the video to be sure that what you’ve attached is actually what you think you have. For a blind person, that’s particularly helpful. So you can play it.
And then, another button in that screen (once you’ve double tapped the attached video) is add description. And it’s there that you can, as my friends in New York…
I’ve never heard someone from Arkansas say knock yourself out, but maybe I just don’t know enough people from Arkansas. Maybe they say it all the time.
But knock yourself out. You can attach your description there. You can write a lengthy description of the video so that someone with a hearing impairment, or somebody who perhaps doesn’t want to sit through the whole video can get a quick synopsis of what the video contains.
Hope that helps, Roy. And it’s certainly good to see you on Mastodon sharing some interesting content. And who knows, maybe next time we’ll hear the real deal and your lovely Southern accent coming through on the show.
Elijah Massey says:
“I heard that there is now a version of JAWS for Android, and I was wondering if you have any information about this.
Apparently, there is a company called TPG that is part of Vispero (the parent company of Freedom Scientific) that makes a product called JAWS for kiosks. And they have created a version for Android.
However, there is no information about this on the Freedom Scientific website. And the TPG website has a link to contact them about licensing their kiosk product, but no documentation or trial downloads.
The page on their website mentions the ability to use the touchscreen to control the kiosk and to label unlabeled buttons in apps, which leads me to believe that there is support for standard Android apps and not just ones designed for JAWS for kiosks.
If Freedom Scientific released a version of JAWS for regular Android devices and made it significantly less expensive than JAWS for Windows, it might have some powerful features from JAWS for Windows that TalkBack lacks. It would probably be competitive with Commentary Screen Reader and if it had the excellent Braille” (with an uppercase B) “support and keyboard commands from JAWS for Windows, I think it would become the best screen reader on Android. Not to mention features like PictureSmart, OCR, JAWS scripting, the Virtual Viewer, Flexible Web, the amount of configurability JAWS has, etc. JAWS could even make Android instantly surpass iOS in terms of accessibility.
Freedom Scientific should definitely release this, if it’s as good as I’m thinking. Even if they make it cheap, they’d make so much money from this.
I would expect them to announce this everywhere they can and have tons of press releases. Yet, they are only advertising this for kiosks on a webpage that is hard to find and makes it seem like no big deal.
Is Vispero just testing this on kiosks first to fix bugs and adding more features until they’re ready to release? I really hope they end up releasing this to the public.”
Elijah, I fully agree with the principle of what you’re saying. I would love to see a JAWS for Android, and I would like to see some quite close synchronization so a lot of your settings and other things would sync in a kind of a JAWS cloud environment between your Windows screen reader and your Android screen reader. It would be fantastic!
License Eloquence. Make sure that is working properly because obviously, there are some ongoing issues.
iOS has Eloquence built in now. Android does not, and getting it there is becoming increasingly difficult.
I think, though, that there’s a vast difference between what can be achieved on Windows and what Android permits. There’s a series of APIs (Application Programming Interfaces), and some of them are quite limited.
We know, for example, that the problems supporting HID Braille displays, which is something we’ve spent a lot of time talking about on this podcast, are at the Bluetooth stack level. So this is something that the TalkBack developers, as I understand it, can’t move on until there’s some systemic change at the Bluetooth stack level, in a different division of Android altogether.
So this isn’t a case of Vispero just allocating resources and making a business decision to develop a JAWS for Android. Kiosks are a different case in point. A lot of these kiosks run Android under the hood, but they’re very much a closed environment that does very specific things. That’s a much simpler use case to code for than an open platform that has to work in a wide range of scenarios with a wide range of peripherals.
Coincidentally, I did run into Matt Ater, or Matt Ater ran into me. Blind people do this sometimes, you know, they run into each other [laughs] at the NFB convention. And he did express an interest in coming on the podcast and having a chat about the kiosk product.
So we will follow up on this. But I don’t think you should infer that JAWS has any plans to release an Android product for the general smartphone market at this point. I mean, they may, but there’s no indication that I’ve heard that it’s even in the works.
Karl-Otto writes in and says:
“I heard your interview with Shelly, and you mentioned iBeacon as something that had potential. But then, it faded out.
Back in 2016, I heard about iBeacons and started experimenting with them. And in October 2017, the app Quenda was shown to the public for the first time. Quenda is an app that uses iBeacons to present information about where you are and what’s in different directions. It can give you directions from one place to another with instructions very similar to those you get from navigation apps that use GPS.
In 2019, Malmö City Library did a project with it to make their venue more accessible for low vision and blind people. But when we had the final meeting, as the project ended, it was February 2020, and we all know what happened then. There was no interest in an indoor navigation system when no one should be out and about. So I kind of declared it dead, at least for the near future.
In January 2023, I met with an accessibility coordinator from a municipality nearby, and she showed interest in trying it out. During the summer, there were 20 beacons placed outdoors in Höganäs City to help low vision and blind people navigate their way around.
After your mention of iBeacons, I sat down and translated the pages for Quenda to English, finally, so you and others can have a look.
It’s quite slow going to convince public and private companies that it’s a good idea to make their venues easier to navigate for low vision and blind people, but I’ll never give up. Thankfully, I’ve managed to keep expenses to a minimum and use income from other parts of my business to finance the project.”
Thanks very much for writing in, Karl-Otto.
Back in about 2016 or 2017, thereabouts, Wellington, the capital of New Zealand where I live, was kitted out in the central business district with a lot of iBeacons in conjunction with the BlindSquare app, and it was quite good.
And if you go back into the BlindSide podcast which is still online, you can hear the demo that I did of walking around with the iBeacons. It was useful, but nowhere near as useful for actual navigation from one place to another as the new GoodMaps Explore app is. That is just giving you a phenomenal level of accuracy.
I will include the link that you provided in the show notes.
I’d be interested to know what the cost of uptake is for those who want to put iBeacons into their businesses. Where do you get them from? What does it cost? How easy is it to do it?
Matthew Whitaker has an interesting question. He says:
“I am reaching out to seek insights and recommendations regarding the accessibility of televisions for individuals who rely on screen readers.
While I currently use an Apple TV and a Fire TV stick, I am eager to explore other options that offer accessibility features and are user-friendly.
I would be grateful for any suggestions or experiences you may be willing to share. Specifically, I am interested in learning about alternative TV streaming devices apart from the Apple TV and Fire TV stick. Are there any other television streaming devices that are particularly accessible and easy to navigate with a screen reader? I would love to hear about your experiences with such devices and their unique accessibility features.
Choosing the right TV size. I am also looking for guidance on selecting the appropriate TV size for various rooms. What factors should I consider when determining the optimal screen size for a given space? Are there any recommended guidelines or tips that you can share?
If you use a gaming console or external speaker system, which TVs do you recommend for connecting those?
Thanks for all the help.”
Matthew, it’s been a few years since we looked at this. But back in 2020, there were episodes that we ran on the Samsung TV that we currently still have, as well as a Sony one that we returned, because when you enabled eARC (the Enhanced Audio Return Channel), it disabled its screen reader.
We’d had a Sony TV since about 2016 or 2017. We originally upgraded to the newer Sony TV because we bought the Sonos Arc, which is a great sound bar that requires the Enhanced Audio Return Channel. And as I say, we ran into that problem where if you enable that, no screen reader, which is a bit of a show-stopper.
We’re very happy with our Samsung TV, and I think the way that they work has largely not changed. I’m sure there’ll be much newer, better models, but I think their screen reader is fundamentally the same as was in the demo in 2020.
I understand LG do some pretty nice accessible TVs as well, but I’ve not had any direct experience of that.
So I hope this will be a really informative topic. If you have experiences with streaming TV devices and actual TVs themselves, opinion@LivingBlindfully.com, 864-60-Mosen in the US, if you want to be in touch on the phone. 864-606-6736.
Voiceover: If you’re a member of Living Blindfully plus, thanks for helping to keep the podcast viable.
If you haven’t yet subscribed, why not do it today?
Get access to episodes 3 full days ahead of their release to the public, you’ll get advance notice of some of our interviews so you can have a say in what we ask, and you’ll help keep the podcast viable by helping to fund the team to do their work.
Our guarantee to you is that everyone who works on the podcast is blind or low vision, so we’re keeping it in the community.
Find out more. Visit LivingBlindfully.com/plus. That’s LivingBlindfully.com/P-L-U-S.
Pay what you can. It all helps. Thanks for your support of Living Blindfully Plus.
Anexis Martos writes:
“I hope you’re doing well. Great speech for the NFB convention.”
Thank you so much!
“I wasn’t able to attend, but I’m glad I got to hear your speech.
I, for the most part, am well, but I’m having some accessibility issues right now and would love to see if you have any solutions or ideas.
I’m currently a college student and will probably graduate with a Bachelor’s degree in 2 or 3 years. In the meantime, I’m trying to build a career as a freelancer.
“Currently, I have my services up on Ko-fi.” (I think that’s how you pronounce it. It’s K-O – F-I.) “and LinkedIn.” (I know how to pronounce that. Although sometimes, my speech synthesizer says Linkedin. Linkedin.)
“Both were very easy to set up. Ko-fi was definitely the easiest, allowing you to set up individual services. LinkedIn allows you to set up up to 10 services, but it has a character limit for descriptions, and there can only be one description for all your services, from what I can see.
Recently, I tried to offer services on Fiverr. I was able to get through the personal information, but I can’t seem to get through the professional information area.
I had a difficult time adding skills, to the point that when I wrote poetry, it typed Hindi poetry. I was able to fix it, but it took me a few hours.
When you type in skills, it gives you a suggested result, which you have to click on. From what I understand, based on what VoiceOver read out, it’s highlighted on the screen. Finally, for some reason, as of writing, my skills are showing up as pending.
In this section, they also ask for relevant education and certificates, but they don’t seem to be required. However, it doesn’t allow me to continue to the next page. I attempted to add my current education. But even then, it wouldn’t allow me to continue.
I’ve attempted to complete my seller profile on Fiverr with 3 different devices. I tried my BrailleNote Touch, my iPhone, and my MacBook Air. On my Mac, I tried on Safari and Google Chrome. I had similar results with everything.
I tried offering services on Fiverr a few years ago and had similar issues. I was hoping the accessibility issues were addressed, but it seems that’s not the case.
Has anyone attempted offering services on Fiverr? Have you found similar issues?
Thank you so much. I look forward to hearing from you.”
Thanks, Anexis. Let’s see if we get anything back on this.
I have used Fiverr as a consumer, and I must say it’s not the best. I’ve kind of managed my way through it, but not particularly pleasantly. So I can well imagine that perhaps, selling on Fiverr may not be as accessible as it should. Let’s see what comes back.
But if you’re experiencing issues, I’d encourage you to reach out to Fiverr and make them aware of the issues. Let us know how they respond if you do that.
Sometimes, you get an email and there’s nothing left to do or say but take a deep breath and sigh, and say “Thank you.” [laughs]
This one came through just before episode 240 was published from Catherine Getchell, and she says:
“I have been following with interest and dismay your dialogue with Apple about the inaccessibility of their date picker.
Their response to you that they could not replicate your issue using Safari and VoiceOver was absolutely shocking to me, as it represents the poorest of POUR in digital accessibility testing, and it violates a key digital accessibility principle.
As you and some of your listeners may know, digital accessibility has 4 key principles, and they follow the P-O-U-R acronym. A website or other digital material must be perceivable, operable, understandable, and robust, and the robustness principle is the one the “testers” of the date picker violated most notably here. I should mention that the date picker is neither perceivable nor operable by screen reader users, but it was the robustness principle that the testers violated in their testing. These POUR principles are one of the fundamental things that digital accessibility professionals learn before they learn anything about accessibility design or testing.
Digital material is robust if it is accessible using a broad range of assistive technologies, and it fails the robustness test if it is accessible using only one setup such as Safari and VoiceOver.
The latest WebAIM survey on screen reader usage, conducted in June of 2021, finds that 53.7% of screen reader users use JAWS, 30.7% use NVDA, and only 6.5% use VoiceOver.
When it comes to browser usage, 53.6% of respondents use Chrome as their primary browser, 18.4% use Edge, 16.5% use Firefox, and coming in at a distant 4th place again, Safari is only used by 5.1% of users as their primary browser.
So for Apple’s digital accessibility team to rely only on the third most popular screen reader and the fourth most popular browser for their testing is ridiculous. Yes, Safari and VoiceOver should absolutely be one of the configurations they test with, but they need to also test with other common setups such as Edge with JAWS and Chrome with NVDA. This is the only way to make sure their digital properties are robust.
Thank goodness you are savvy with your tech and have access to a lot of screen reader and browser configurations for testing. You explained to them how their date picker is not accessible using a large number of setups. You also explained how even with VoiceOver, the date picker isn’t accessible unless you happen to know that pressing return on the supposed edit field will open the date picker somewhere else on the page.
But it is not your responsibility to do Apple’s digital accessibility testing for them. It is Apple’s developers and digital accessibility team who are responsible, and they have demonstrated a fundamental lack of understanding of, or adherence to, a basic digital accessibility principle, not just in their web development, but in their testing as well.
I hesitate to say I look forward to the next installment in your saga. But I will be listening attentively for the next update, and I certainly hope that this is a learning experience for Apple’s digital accessibility team and that they or their superiors realize that their “testing” process has some major flaws that need to be addressed.
Thanks for letting me blow off some steam, and I appreciate the great show.”
Well, Catherine, I appreciate the great email. Thank you so much for your support.
The arrogance, the ignorance, the contempt with which I’ve been treated over this process is absolutely shocking to me, and I would like to think it should shock many blind people. If there is anybody in Apple with a modicum of decency and adherence to good testing practices, it should shock them as well. It should shock them into action so that this sort of thing never happens again.
Voiceover: On Living Blindfully, we hear the opinions of blind people from all over the world.
So why not share yours?
Or if the phone is more your thing, phone our listener line in the United States: 864-60-Mosen. That’s 864-606-6736.
This email comes from Charlene Ota who says:
“You’ve been talking about mainstream companies not always being as concerned with accessibility as they should be. Well, I have a situation you can add to your list.
I am currently working from home. My employer has been using a VPN called ExpressVPN to connect us with their system.
Due to some other issues, they had to create a workaround for me when we first went to work at home.
Recently, they thought they’d figured things out so I could use the same system for logging in as the rest of my co-workers. So someone came by my house and we started getting the new system set up.
One thing I needed to do was install and log into ExpressVPN. That went fine.
However, when I logged into the work system, that system needed to verify my identity through that ExpressVPN app. The verify button came up, and both Kyle and I kept tapping it and tapping it, and nothing happened. He could turn VoiceOver off and then tap the button, and it connected me. So I’m back to my old way of logging into the work system.
Kyle went back to the tech guys at work and told them what happened. Initially, they contacted the developer of this app and were told it had been tested with VoiceOver, and it worked fine. So I told them it does not, and told them I have an iPhone 14 Pro and to make sure to tell the developer what kind of phone I have. The developer told the tech guys that they had only tested their app with what they say are mainstream iPhones and thought they didn’t need to bother with the iPhone Pro models.
So my employer has had to go find another way to deal with my situation. Apparently, my employer isn’t interested in pushing them to fix their app. I find it interesting that they have made judgments about us VoiceOver users.
Anyway, just thought I’d share this with you. Thanks for Living Blindfully and all you do.”
Lovely to hear from you, Charlene.
I’d be really surprised if the hardware made a difference. Normally, iOS is iOS. So if it works with iOS on one device, I’m not sure why a pro device would make a difference. Perhaps they haven’t really tested as thoroughly as they think they have.
Has anybody used ExpressVPN? I know it’s a pretty popular tool. If so, how well is it working out for you in various situations? opinion@LivingBlindfully.com, and the number – 864-60-Mosen in the United States. That’s 864-606-6736.
Earlier in the year, we were talking about the Victor Reader Stream 3. There was a lot of interest in this. We had Dominic and Mathieu from Humanware on the show to talk about this. People have been reviewing it, and it’s a very popular device in our community.
So when a new generation of this thing comes out, it is big news, and we’re going to cover it on this podcast.
We learned as part of that discussion that there’d been a little bit of a disconnect between Humanware and Audible. And in the email I’m about to read, Humanware recaps the steps they took very early in the product development cycle of this device to reach out and try to engage with Audible, to ensure that there would be Audible support in the Victor Reader Stream 3. Earlier in the week, Mathieu Paquette released this email to the Victor Reader Stream email group, and I’m going to read it to you and then offer a bit of comment.
“An update on Audible on the Victor Reader Stream 3.
“Hello, everyone,” begins Mathieu.
“I wanted to give you an update on the latest about what’s going on with the Audible integration on the Victor Reader Stream 3.
I’m going to pull the curtain back a little and let you in on the inner workings of such an endeavor.
For context, we initially contacted them in July 2021, 2 years ago, to let them know we had a new product coming, and to prepare. We contacted them again in July 2022.
We were told to just use the previous implementation – the one we have on the Stream 2 and Victor Reader Trek.
In November 2022, it became clear that the fact we’re now using MTP (This is Media Transfer Protocol.) wouldn’t work in Audible Sync. So we re-initiated contact, let them know, and sent them a device at their request for testing. This is all before anyone knew the Stream 3 was coming out.
As I’ve mentioned on some podcasts and here before, it was then radio silence for a couple of months. I would email them once or twice a week, and get no answer in December and January.
That is until Jonathan Mosen, in late January, initiated a very successful open letter/petition that caught their attention. Then, they started communicating back with us. Since then, we’ve had some meetings, in theory, bi-weekly, but a few ended up being cancelled with little to no notice.
We were told in February we’d be sent a test account. It took about 2 months to receive it; it arrived in April. We then kept being told, from meeting to meeting, that it’s going to work in the next few weeks.
Around mid-June, we finally were contacted by one of their engineers, who started looking at the issue. From the questions they asked, it seemed clear that this was the first time they had even plugged the Stream 3 into a computer.
I’m going to open a parenthesis here,” continues Mathieu.
“Now, what you need to understand is, there is about a month’s worth (4 to 5 weeks) of work for one person to do on our side to create the interface on the Stream. We need to build an app, build the user interface, etc. We’re a small company, and we cannot afford to put that much work into something that ends up not working because Audible Sync won’t support MTP.
So we are waiting until we have confirmation that the stream is seen before we start the work. That means that in order to make the 1.2 update, we’d have to get the Audible Sync thing sorted out before the end of July, then start the work, which brings us to early September, when we can then do quality assurance and beta testing. End of parenthesis.
2 weeks ago, we had a call with some higher-ups at Audible who told us we would get an update by July the 21st, which was last Friday. As we had not heard anything this morning, we asked for it.
They responded quickly and the update is, we’ll give you an update by July the 31st, as we’re evaluating whether or not we can do MTP. So basically, same as 2 weeks ago.
We at Humanware are really disappointed in this turn of events. We gave them a heads up 2 years ago, and provided them with a test unit over 8 months ago, and to be quite blunt, not much has been done since at all.
I’m the kind of person that calls it like it is. When we mess up at Humanware, and we do sometimes, I don’t shy away from it. I voice it. It doesn’t always make me popular, but I tell it like I see it. And I’ve even said here sometimes that we’ve made a mistake.
In this instance, for Audible, I can say that we did everything right. We answered them in a very timely manner every time they asked us a question, provided them with all that we could, attended every single meeting with them, etc. We really did everything we could humanly do.
As it stands, it seems very unlikely we’ll see Audible support in 2023. Mid-2024 would be my bet. And that’s if they don’t decide that MTP is too complex to add.
We are looking at a plan B right now, if that’s what they decide, to see if there’s something we can do. We think there may be a way, but without going into details, it’s a bit ugly and we’re really hesitant. It would also require a lot of work.
So there you have it. I’ve always been transparent with you all. And my goal here really isn’t to throw Audible under the bus or anything, but just relay things as they are, factually.
We’ve expressed our disappointment with them, but told them that despite it, we’ll still cooperate and do all that we can to make it happen sooner rather than later.
So yeah, thank you for taking the time to read this long update. I’m sure that we’ll eventually get there. It’s just a bit of a long, winding road. Like Sheryl Crow sang, I get a little bit closer to feeling fine.”
So that’s the email from Mathieu. It is really disappointing.
I’m proud of the role that we were all able to play with the open letter in January, respectfully pointing out just how significant Audible is, and how many people love their Victor Reader Streams.
They seemed to be making all the right noises. The communication resumed. But it seems like it’s all been empty so far, and that progress has not been made.
It must be very frustrating for those who have invested in a Victor Reader Stream, believing Amazon when they said that they were on to it, that they were going to prioritize this. And yet, Humanware has nothing of consequence to report after all this time.
I imagine that Humanware might be looking at some sort of backward compatibility, essentially reintroducing the Audible Sync support that’s in the Victor Reader Stream 2. That does sound complicated, and it could be quite fiddly.
So if this matters to you, what shall we do? I think the least that we can do, if you are affected by this, is to contact Audible support. Tell them that you are a Victor Reader Stream user (even if you’re not a Stream 3 user at the moment, but you might want to be). Let them know that this is your preferred method of consuming Audible content, and that in any material definition, Audible has done a whole bunch of nothing about making progress on this, even though people bought their devices in good faith after Amazon’s assurances.
So if you feel strongly, if this affects you, do contact Audible support. Let’s turn the heat up. Let’s see if we can help out to get some progress, just like we did in January.
Advertisement: Living Blindfully is brought to you in part by Aira, and I thank them for their sponsorship of the podcast.
You know we’ve become used to some businesses offering free Wi-Fi. It’s a nice touch, and it makes us feel valued whenever we come across it.
And I know similarly that when I learn about a business that has purchased Aira Access, it’s like putting out a massive “Blind people are welcome here.” sign. I know that if I need it, I’ve got a trained professional agent available to me to provide assistance, and that means that the business cares enough to pay for that. I appreciate that.
From airports, to Starbucks, to Target, and more, Aira Access can assist you to navigate, shop, browse and not be reliant on friends, family or others who may not understand our needs as well. And don’t forget that as well as the offerings in physical locations, there are other businesses providing Aira Access that can be used from home.
So you should check out the app to find out what’s available. To do that, just open the Aira Access section in the Aira Explorer app on your favorite device. You can also visit Aira’s website to find out more at Aira.io. That’s A-I-R-A.I-O.
We’re in the Humanware suite at the NFB convention, taking a look at something that a lot of people are talking about here. This is the Monarch. It’s often referred to as the Holy Braille, and we’ve talked about this on the podcast before.
We have got a star-studded cast assembled for this.
Handing over to Greg Stilson. Are you like the support act, or are you the main act?
Greg: [laughs] I don’t even know how to answer that question.
Jonathan: [laughs] There you go. Do you want to introduce who you are with us?
Greg: Sure. So I am Greg Stilson. I run the global technology innovation team at the American Printing House for the Blind.
And with me is Andrew Flatres of Humanware. Andrew, do you want to introduce yourself?
Andrew: Yes. So my name is Andrew Flatres. I’m the product manager at Humanware. I’ve been with Humanware for 20 years, and I’m certainly happy for Greg to be the main actor here.
Jonathan: And you’re a familiar voice on the podcast too, Andrew.
Andrew: Yes, I’ve had the pleasure of joining you a couple of months ago, I think it was, yes.
Jonathan: So good to have you here.
Andrew: Very cool. And to my right is William Freeman.
William: Hello, William Freeman, tactile technology product manager at APH. Happy to be here.
Jonathan: Another star of the podcast.
William: Wonderful. Thanks.
Jonathan: Yeah, yeah.
William: And then, we got Peter.
Peter: I am Peter Tucic, the director of strategic partnerships at Humanware, former podcast guest as well.
Jonathan: You are all veterans of the podcast.
Greg: We reconvene in person.
Bonnie: Podcast communion.
Jonathan: Yeah, yeah.
Peter: We are in person, which is fantastic.
Jonathan: So The Monarch. We talked about this some time ago when it was a concept. And we heard that you were working with Dot Inc., who make the Dot Pad. And I had the privilege of seeing one of these when Mark Riccobono brought one over to show us in New Zealand when he was there. How is it going? How would you describe its readiness at this point?
Greg: It’s not ready. I’m going to just say that. But it’s on its way. We are excited that we have about, what is it, 180 units now here in the United States, and about 50 of them are going to be allocated to our field testing, where we’re actually going to put these in the hands of students and teachers in the classroom.
Part of our mandate at APH is that before we can put a product on the federal quota schedule, it has to go through in-depth field testing with students and teachers in the classroom. And so that’s coming up here at the end of this year.
We have a real working beta device in front of me. And I will say actually, just before the convention, we got another software update from our friends at Humanware, and we’ve been testing that out.
The improvements are significant, and it’s really coming together. We’re to the point now where we can actually look at a properly formatted book on this device and go through reading sessions and things like that.
But we’re also really excited because for the first time ever in our Tactile Viewer app on this device, we now have a connection to our Tactile Graphics image library directly on this device through Wi-Fi. So in the same fashion that you would, say, log into Bookshare and search for a book, you can actually log into our Tactile Graphics image library of almost 18,000 graphics and just search for anything.
And just before you arrived, Jonathan, I searched for the word New Zealand and found a graphic of Australia and New Zealand in that area. So just by doing a quick search, I didn’t have to have that graphic embossed or prepared. It was already there in the Tactile Graphics image library. So things are really coming along.
Jonathan: Let me ask you this, and it might be that some of the other blind people in the room want to comment on this as well. Obviously, tactile graphics are the big drawcard of this, and that’s going to be huge for people working in STEM subjects, a range of things like that.
I’m one of those blind people who has never really been able to interpret tactile graphics particularly well. I’ve been in situations where somebody can have a piece of paper in front of them and feel that piece of paper, and it’s got a diagram on it, and they’ll say, “That’s an elephant.” And I say, “How on earth did you get that from that?”
Jonathan: So I guess my question, and perhaps it is very much an APH kind of question because of the education focus that APH has, can that be taught? I mean, even as an adult, if tactile graphics has never kind of made sense to me, is it possible to learn how to interpret them? Or is there some sort of spatial thing going on that means that some blind people just aren’t good at it?
Greg: It is a skill. Tactile interpretation is a skill, and you have to have a good spatial understanding.
But I think, it’s a chicken or the egg thing, right? Like myself, as a blind student, I was not taught a lot of tactile graphics, right? And so if you weren’t taught a lot of tactile graphics, you don’t develop these spatial understandings or skills.
So what I’m excited about is that for the first time, blind students are going to have access to just an exponentially larger number of tactile graphics than they’ve ever had before, because it’s a device that you don’t have to run it through an embosser, and preparation, and all that kind of stuff. You can literally just download it to your device.
The biggest thing that we’re doing that’s so different than I think those who came before us trying to do this Holy Braille kind of device is that we’re really focused on tactile graphics that are created by tactile graphics artists. I’m not trying to go on Google Images and take an image of the Mona Lisa and throw it on this device, because you as a blind person would never know what that looks like. There’s way too much noise in a visual image.
We’re taking graphics that are prepared by a tactile graphics artist, like those that are in textbooks. And that’s really the primary goal of this device is to render a textbook, and part of textbooks are tactile graphics. Those tactile graphics come with Braille labels most often, or keys, or things like that that provide context.
So we’re not just throwing a graphic at you and saying, “Oh, figure that out.” This graphic has context, and it has labeling and things like that.
I’m going to pass it over to William.
William: Thanks. And that’s exactly right.
One of the things we’ve been doing with this device is looking at the research that already exists. And that research says like, whether you’ve looked at 1000 tactile graphics, or just one tactile graphic in your life, you’re more likely to understand it if you’re already primed to know what that’s a tactile graphic of.
William: So one of the things we’re working on is adding alt text and extended descriptions to the tactile graphics. So you don’t have to touch it and try to figure out it’s an elephant. You know it’s an elephant. And now, you can actually explore it with that additional context.
Jonathan: And that’s the beauty of having actual Braille and the graphic in the same display, isn’t it? – that you can mix and match, you can add labels to the element that feels like Braille.
Greg: Yeah, we’ve been running user testing sessions here at NFB. And just the remarks that I hear, because one of the samples that we show is a US map, and the US map has labels where the states are. And a lot of blind people who, like me, are geographically challenged were like, “Oh, Mississippi is right next to Alabama.” As you’re looking at this map, you’re able to understand what you’re looking at because there are those abbreviations for the states, right? And you’re understanding spatially where these things are actually laid out.
Jonathan: I know there’ll be a minority of people in this position, but there’ll be some who’ve seen the Dot Pad, or even earlier iterations of this device. And when it comes up in conversation with some of the geeks that I talked to, they have said, “Look, this is all very interesting. But the trouble with this technology is that if you keep your hand on the display, when it’s refreshing, then the part of the display that you’re touching doesn’t refresh.” You’re working around that, I understand.
Greg: We are, and we’ve seen significant improvements over the last few months. And you were reading a book on this just a few minutes ago, and you had your hands all over that display.
Greg: I don’t think it had any issues.
Greg: Which is really exciting. And it’s probably the part that I think we’re most proud of right now is that was a huge technical challenge.
So one of the things that we’re able to do is we know where your fingers are on this device. And so the device is tracking where your fingers are. And if you do touch pins that are blocked, we automatically fix that on the fly. And so you as a blind person may see something strange for a second, but it fixes itself at that point, and you just go on reading.
We also have a manual refresh button in the event that it doesn’t fix itself. But we’ve seen those situations more rarely now. So it’s been going extremely well.
Jonathan: Given that it is touch sensitive, does that mean that you can drill down if you’ve got a map of the United States, and you touch a particular state, and you want to expand that state? Is that possible?
Greg: There’s a couple ways that we’re doing graphics right now.
We have 2 types of graphics, I would say – tactile graphics that are created by a tactile graphics artist. Right now, the vast majority of those are in PDF form. And when you’re dealing with those graphics right now, we have 2 zoom levels in those graphics.
So we have what we call the overview mode. If you take that example of the US map, right? You have the overview mode. The overview mode is we take the whole US map that was originally made for an 11 and a half by 11 sheet of paper, and we basically smush the whole thing into our 10-line by 32-cell tactile display, okay, so you don’t see much in detail.
But what it does for you as a blind person is it gives you the outline. It gives you the spatial understanding of what this graphic is going to look like.
And then what you can do is you can zoom into what we call the original resolution, which in that map case is seeing the outlines of the states and seeing the labeling, and Braille labeling, and all that kind of stuff. And once you’ve zoomed into that level, then you can pan around the graphic using our arrow keys on the side of the display so that you can reveal more of the state.
But our dream, and I’ll have William talk a little bit about this, is really what we can unlock with SVG graphics, being able to drill down, as you say. We all have this dream of, you know, a Google Maps for the blind kind of thing where you can zoom in and see more details about cities, and maps, and all that kind of stuff.
William: Thanks, Greg. And yeah, that’s part of the eBraille standard: eBraille supports SVG, JPEG, PNG, and PDF.
But SVG is what I think everybody’s seeing as the future of tactile graphics. There’s so much you can do with them, like you can embed the text. And I love the idea of linking graphics.
So take that United States map. Instead of just zooming in, like imagine you could actually take the United States and then zoom in on Texas, or zoom in on California. And now, you get a map of Texas. And then, say you could go to one of the major cities and get a map of that major city. And think of how that could increase your spatial understanding.
And taking really complicated graphics, like the most popular graphic in the TGIL is an animal cell, which has like 12 different parts. Imagine being able to break that graphic down into simpler forms and introduce it piecemeal, and then show you the whole thing so that you’re not overwhelmed trying to figure out what each piece of this is. So there’s a lot of stuff you can do with dynamic graphics you could never do before.
Jonathan: And it’s because of the BRF that we’re reading Alice in Wonderland on the display here, and it feels like you’re reading a hard copy page because all the text is centered, that the chapter name is centered at the top of the page. You can see the way the document is formatted. I mean, it’s quite extraordinary. I’ve not seen anything like this before, where it feels like you’re actually reading a page of Braille rather than a line of Braille.
Greg: Yeah, there’s some characters in there that I didn’t even know. Because it’s using properly formatted Braille, you’re seeing emphasis, you know, bold and italic signs, and things like that that you typically wouldn’t see if you were connected to a device with a Braille display or something like that.
Jonathan: Right. And NFB has a resolution about this very topic.
Greg: I just saw that. Yeah.
Jonathan: Yeah. [laughs]
The Braille feel is interesting. It does not feel like your standard piezoelectric cell, and it doesn’t feel like an Orbit cell, but it feels like Braille. I mean, the spacing is correct. It’s kind of pleasant in a way because it’s not particularly sharp, but it’s perfectly discernible. Very, very different kind of tactile feel, isn’t it?
Greg: Well, Humanware did a lot of work with Dot to really optimize that cell. I don’t know, Andy, if you want to talk about that.
Andrew: Yeah. So we’ve been working with Dot for almost over 3 years now, actually, before we actually had the project underway with APH. We were really testing the Braille cells, how they were performing, and where we are today. It’s incredible, the amount of work and effort that the team has done.
Now, what’s different about these tactile displays is that we are using equidistant pins. So there is a slight difference in spacing between normal standard Braille and what we can produce.
But we’ve been trialing it around the world, we’ve been testing it, we’ve been getting a lot of people’s hands on this device to make sure that they can read the Braille. And we’ve had some positive results.
And over the top of the Braille display is a membrane. And so we’re not talking about 40 cells here. We’re talking about 480 cells. So it’s really important that we protect those cells.
And one of the things that I did to present the technology to APH was spilling a can of Coke over the top of the display. [laughs]
Jonathan: [laughs] Oh my God!
Andrew: So they were quite amazed.
And what it does, it forms this little puddle. And I was able then to just dry it up and carry on using the device.
So that membrane serves the purpose of making sure that those cells are well-protected. Of course, it could be replaceable as well.
Jonathan: So are there 10 lines of, is it 32 cells?
Greg: It is, it’s 10 lines of 32.
Jonathan: Why not 40, and standard Braille spacing?
Greg: But you picked up on the math there. I like that. That’s good.
Greg: Yeah. So we use 48 cells in a line to make 32 Braille characters.
Because we’re not using standard Braille cell spacing, we adjust the spacing a little bit to compensate, essentially, so that we can make it feel like correct Braille spacing. But in order to do that, to make 32 Braille characters, we actually need 48 equidistantly spaced cells. Does that make sense?
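For anyone puzzling over that arithmetic: assuming each equidistant cell contributes 2 pin columns (my reading of Greg’s description, not a published spec), the numbers work out as follows:

```latex
\underbrace{48 \text{ cells} \times 2}_{96 \text{ pin columns}}
  = \underbrace{32 \times 2}_{64 \text{ dot columns}}
  + \underbrace{32 \times 1}_{32 \text{ gap columns}},
\qquad
\frac{48 \text{ cells}}{32 \text{ characters}} = \frac{3}{2}.
```

On that reading, every 3 physical cells render 2 Braille characters, with one spare pin column serving as the inter-character gap.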
Jonathan: Yes, it makes sense.
I wonder. Is that going to create some trouble when people load what I would call traditional BRF files that have been formatted, say, by NLS here in the United States, which are formatted for 40 cells? What’s that going to do?
William: This is William.
And that’s the whole point of the eBRF, the eBraille standard: to have dynamic Braille that can fit any page size.
William: And one of the things we’re working on is a converter that you can just take your BRF, and convert your BRF into an eBRF. And so now, you’ve got a reflowable page, you’ve got the addition of better navigation and markup, the inclusion of any tactile graphics, all from just a traditional BRF run through this converter. And our plan is to make that available for free when the Monarch launches, so anybody could use this converter to update their old files.
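The core reflow idea William describes can be sketched in a few lines. This is my own simplified illustration, not APH’s converter (a real eBRF conversion also preserves markup, navigation, and tactile graphics): take the hard-wrapped lines of a fixed-width BRF, join them into paragraphs, and re-wrap to whatever cell width the display has, such as the Monarch’s 32 cells.

```python
import textwrap

def reflow_brf(brf_text: str, width: int = 32) -> list[str]:
    """Reflow hard-wrapped, fixed-width braille ASCII to a new cell width.

    Blank lines separate paragraphs; form feeds (BRF page breaks) are dropped.
    """
    paragraphs, current = [], []
    for line in brf_text.splitlines():
        line = line.replace("\f", "").strip()
        if line:
            current.append(line)
        elif current:
            paragraphs.append(" ".join(current))
            current = []
    if current:
        paragraphs.append(" ".join(current))

    out = []
    for para in paragraphs:
        out.extend(textwrap.wrap(para, width=width))  # re-wrap to display width
        out.append("")  # keep a blank line between paragraphs
    return out[:-1] if out else out
```

Run over a 40-cell NLS-style file, every output line fits in 32 cells and no words are lost. What this sketch cannot do, and what the real converter and eBraille markup add, is centering, headings, page structure, and graphics.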
Jonathan: What software is on the device? Is it Keysoft, essentially?
Greg: It is. Keysoft is really the foundation, I would say. It’s sort of an updated Keysoft for a multi-line experience, right?
So the nice thing is for those folks who are familiar, and especially in education, I know Humanware has done a great job in education with the BrailleNote. And so teachers and students who do know BrailleNote commands and things like that, we’re not reinventing the wheel. All those commands and things like that will transfer right over to the Monarch.
Jonathan: Hmm. So does that mean you will have all the traditional Keysoft functions like calendar, calculator, all of those things?
Greg: We’re starting with what we would call sort of the foundation apps. We’re not bringing everything over right away, because the work to optimize those for multi-line is pretty significant.
So the apps that we’re going to have at launch right now, or the ones that we’re going to have, is a BRF editor or a Braille editor. I sort of refer to this as like the electronic Perkins device, right? An electronic brailler is really what it is. What you Braille is what you get, right? So it’s a BRF editor.
We’re going to have a full word processor. So you can do your docx creation, or review, or things like that.
We’re also doing a lot of work with math in that as well. So within the word processor, you’ll be able to actually create math content, either Nemeth, or UEB math, or whatever your preferred math code is, and you’ll be able to type in equations and things like that. Those equations will then be formatted into print that looks correct in regards to math.
Jonathan: In terms of connecting this to other devices, I believe that Dot had already got some compatibility with iOS, correct? So you can make it sing and dance on your iPhone and an iPad?
Greg: This does not yet. That is an app that we’re working on.
The difference, I think, between what Apple integrated with the Dot Pad and what we’re working on right now is theirs was strictly graphical. So if you wanted to see the phone icon tactilely, if you have it connected to your iPhone, you can see the phone icon, what that icon looks like.
What we’re currently in discussions with all the screen reader providers is how do we optimize the information that’s coming from a screen reader for a multi-line Braille experience as well as a tactile experience, right?
So Dot looked at providing a tactile graphic on the fly like that.
What we’re trying to do is say, “Okay. The screen readers, JAWS, NVDA, all of those, you’re sending us information, but I don’t just want to get one long 320-cell line of Braille. I want to get properly formatted information. How can I correctly show the start menu? How can we do a number of these things? And also, how can we utilize our touch sensors on this device to make it even more independent, so that you’re not always having to go back and forth from the computer? Could you do a lot more with touch, tapping, double tapping on items, using just the Monarch display to control your PC?”
Jonathan: Yeah. I understand that JAWS is going to, in 2024, have a multi-app system where you can divide the Braille display and show 2 apps on the display. Obviously, with a device like this, you could potentially divide it in half and see 2 apps on this device.
Greg: Yeah, you got it.
I still have the dream of being able to show an Excel spreadsheet and be able to look at rows and columns on a device like this because it’s perfectly optimized for something like that.
Jonathan: It’s exciting stuff.
What’s the timeframe, do you think, for delivery?
Greg: So what we’re doing is over the next 18 months or so, we are going to, …
So I mentioned we start field testing in the fall, and that goes for about 6 to 8 weeks. Starting in the spring of 2024 here in the US, …
We did receive some additional funding for the 2024 year, and we’re using that funding. We’re sort of changing things up a little bit.
Traditionally, what happens is you go through field testing. If the product passes field testing, it gets put right on federal quota, and students and teachers can buy it.
What we’re doing here, because this is such a new device and something that we’ve never seen utilized, or taught, or anything like that, we’re kind of taking a step back after field testing, and we’re then going through about 6 months of what we’re calling teacher training. And so we are working with the APH outreach team and the Center for Assistive Technology Training. And in all the regions of the US, what we’re going to be doing is putting on teacher training seminars where teachers will be invited to come to a 2-day seminar and learn how to utilize and teach this device.
And upon completion of those seminars, they’re going to have a Monarch of their own to practice on and learn from, so that when their students are able to get these devices, the teachers are already feeling comfortable. They have curriculums created, they have content available, so that they aren’t afraid to teach this device. Because we all know that (and TVIs are wonderful) if a TVI doesn’t have enough time to learn how to teach a device, or if they’re getting it at the same time as their student, sometimes they’re afraid to get it into the hands of their kids. And we don’t want that to happen in this case, because it is a very new concept, right? [laughs] We’re teaching blind students how to pan, and zoom, and do things that we, as blind people, have just never been taught.
And so our goal then is in the fall of 2024, to make these available on the federal quota system.
Jonathan: One of the discussions that’s being had increasingly is whether, in fact, we’re putting our blind kids at a disadvantage by locking them into proprietary notetaker technology so that when it comes time to think about vocational options, too many of them aren’t actually computer savvy enough to be able to go to a potential employer, or even go into study outside of schools, or college study, that kind of thing, with appropriate skills and tools like Microsoft Word and PowerPoint, that sort of thing. I wonder whether this will be quite disruptive for the notetaker market because it lends itself to being connected potentially to a PC and a lot more of that kind of study being done that way.
William: This is William, and I think you’re absolutely right about locking kids into the notetaker ecosystem, and locking them out of the traditional operating systems and things like that. And the Monarch can definitely be disruptive. And I think, Greg can talk about that.
But the other thing we’re thinking about here is how can you, within your ecosystem, make that ecosystem mimic the traditional Windows environment, mimic the traditional Office environment so that even if they are using your version of those apps, they’re still learning the same basic hotkeys, the same basic skills, so that when they transfer over to Office and other traditional Windows apps, they’re not having to completely relearn everything. It’s transferable skills.
Jonathan: Pricing. It’s not cheap, is it? It’s not cheap.
Greg: No, no, no, it’s not. You know, we’re looking anywhere between $15,000 and $20,000, and that’s really where our real efforts have been.
And this is why I love being at APH: we have a government affairs team led by our awesome VP of government affairs, Paul Schrader, and we’ve introduced language into the FY24 budget to add an additional $10 to $15 million dedicated specifically to putting a Monarch in the hands of every blind student who would be eligible to get a Braille textbook. And it’s really the federal government’s responsibility to fund textbooks here in the United States.
And we believe that this is number 1, a much lighter way to carry textbooks because you’re not carrying 40 volumes of Braille. But number 2, you’re able to put an entire semester’s textbooks on this one device, right?
And from a costing perspective, the transcribers, the Braille producers, they are still as needed as ever because we’re not saying that I’m going to get a book from the publisher and it’s going to show up in perfect Braille, right? The transcribers are more necessary than ever, especially when it comes to STEM subjects, you know, math and science, all that kind of stuff, making sure that the formatting is correct, and the spacing is lined up, and all that still happens.
But what we’re saving is the amount of time that it takes to bind, print, package, and ship all these textbooks around the country. And that’s not insignificant, in both time and dollars. So we believe that it’s a much more affordable way for the federal government to spend their money. And it’s also an obligation that they have to fund textbooks.
We have already seen positive traction in this. We were afforded $1.5 million additional dollars in FY23. And we’re hoping to see that and more in FY24.
Jonathan: So the status of that advocacy is still to be determined then?
Greg: It is. Like I said, it’s trending in the right direction. They could have not given us anything, and we got an additional $1.5 million for testing of this. So that is a great sign.
Our plan is then to go back with testing results and follow up with our ask for an additional $10 to $15 million in FY24. With the debt ceiling and all that drama that happened here in the US, we may see a little bit slower uptick, but we’re already excited just to have seen the additional $1.5 million, because that generally lends itself to excitement on the federal government side, so that’s definitely a good thing.
Jonathan: Commercial book providers, in particular Amazon, are you talking to them? Because wouldn’t it be phenomenal if you could read your Kindle books on this thing?
Greg: We have had conversations, but I urge each and every one of your listeners to get in touch with Amazon; you’re going to see them here at this convention, and every engagement helps. My dream is to be able to put Kindle or the major book providers on this device, so that a student isn’t just utilizing the traditional AT book providers, your NLS BARD, your Bookshare, that kind of thing. Those are great resources. But wouldn’t it be great if you could literally have millions and millions of books right at your fingertips at that point?
Jonathan: It certainly would because presumably, a partnership like that could potentially cascade to other products like the Mantis, and the Brailliant, and other things. If you could ink the contract for this, you could ink it for everything.
Greg: Exactly. That would be phenomenal.
So yeah, we’ve had some conversations. But I urge everyone, if that is something that is exciting to you, make sure that the folks at Amazon and all the other providers know that.
Jonathan: How would you characterize those conversations? Is it a kind of “well, that’s nice”, or is there any real engagement at this point?
Greg: I think there is. The tech is to a point where we’re literally doing something nobody’s ever done before. And so I think the innovation side is very exciting for mainstream companies in that regard. So I think it’s definitely positive. But obviously, reinforcing that with user stories and our listeners asking that as well is definitely super helpful.
And then, the last call-out I’ll do is going back to the advocacy thing. If you are engaged with any of your local or regional congressmen, that type of thing, get in touch with them and tell them your story about how a device like this could have helped you in education. Or if you’re a parent, how this device could help your child in education, because it’s those stories that politicians listen to. Politicians speak to politicians all the time, but hearing from real-life people in their neck of the woods carries weight, so I want to make sure that everybody is telling their story.
Jonathan: Obviously, we have a lot of listeners in the US, but many listeners outside it. Is there any consideration being given to how advocacy might take place in other markets to get similar funding?
Greg: So Humanware is going to be the global distributor of this product outside of the US. And I know that Peter and Andrew have been working tirelessly.
The problem is that we’re very lucky in the US to have the APH federal quota system that we have, but that is not something that’s duplicated in other countries. And so I know Peter is specifically navigating some of those waters and trying to figure out, how is Canada different? How is the UK different? How are Australia and New Zealand different? That type of stuff, right?
Peter: It would be wonderful to emulate what the American Printing House is able to do. But even within a country like Canada, you have 9 or 10 different provincial governments. There is no federal education side.
So you’re dealing not only with these individual countries, but oftentimes, with individual entities within the individual countries that all serve different purposes.
So as we try to find these different revenue streams and sources for allocating funds, we run into many, many different unique landscapes that will all have to work together, sometimes in a very disjointed fashion, to try and get us where we want to go, to help bring down, subsidize, or streamline the cost of these devices.
And I’m sure Andrew can speak to this in Europe as well, and in the Middle East and other places, where we just see a mosaic of countries, and funding mechanisms, and sources. What it takes to even get into those spaces differs drastically, sometimes within the same country.
So it’s been quite the learning experience for myself. But we are putting that effort in, and we’re going to get there; it’s just going to take a lot of creative effort to do so. So it’s been interesting.
Jonathan: I know we’re almost out of time. Is there anybody who wants to make any final comments before we have to wrap?
Peter: I do want to say the one thing that I’m most excited for, in addition to everything Greg has said and what we’ve covered today: what we’re also looking to do is teach concepts like zooming and panning. And for somebody who’s totally blind like myself, blind from birth, that concept is not something that is understood by our brains. We can’t wrap our heads around that.
And so in addition to, yes, putting out this device and having all this infrastructure, we’re also going to be looking to teach concepts that have never been attainable to someone who’s blind.
And so when we look at our curriculum, when we look at training those teachers or training individuals on using the device, there is also this major component of we are going to be kind of the stewards of teaching these concepts. And how do we do that? And I don’t think we know perfectly how that will be done.
Take the concept of a scroll bar, for instance. I did not know what that was until about a year ago. And I was like, what is this? You know, wrapping our brains around some of that conceptualizing is going to be very interesting.
Our hope is that when we introduce these concepts to a 6, 7, 8-year-old, that blind child can grow up with these visual skills that have never been present. So by the time they’re 18, 20, 30, they’re just naturally understanding zoom, scroll, all of that, thanks to a device like this and really being able to bring those in.
So that, to me, is another side of this that is kind of overlooked. And I don’t think it’s fully fleshed out yet, but it is something that will be a big part of this.
Jonathan: It’s similar in a way to the challenges some of us have had when we never used to be able to use a camera. Cameras weren’t things of interest to most blind people.
And then, they became accessible and useful. And those of us who didn’t grow up with that access have had to come to terms with how far away you hold a camera to photograph certain types of things.
And we’ve got used to it now. But people who’ve grown up with this, it’s just absolutely second nature.
Greg: Yeah. I’m going to finish with one story if I can, just to kind of take it home. And this was kind of an emotional moment for me.
So my wife and I are both blind. We have a 6 and a 3-year-old.
My 6-year-old is getting into her summer reading stuff. And we went to the library and we picked up a bunch of books. And she’s sighted.
And none of these books are in Braille, obviously. And so I was like, “Okay. Well, I’ll use Seeing AI or whatever and try to just take pictures of the pages and see if I can validate what she’s reading.”, right?
And that didn’t work. It failed miserably. [laughs]
So what I ended up doing was I go on Bookshare, I find the books that we bought, and I download them onto the Monarch, right?
And what’s incredible is I downloaded the DAISY text version of these books. And yes, I’m not seeing the exact Braille formatting. I’m not going to say it felt like reading paper Braille, but I’m reading on a 10-line device, right? And I was able to follow along as she was reading. And what was really cool is that as she turned the page, I saw that we moved to page 2. So I was able to actually read alongside my daughter in Braille while she was reading print, and to validate that exactly what she was reading was correct, right?
And that was one of those dad moments where I was like, “Whoa, this is a bigger deal than I thought.” Like this is a pretty big one.
Jonathan: It’s very exciting. We’ve been talking for years and years and years about a device like this, and it’s clearly coming to fruition. There’s no doubt about that.
You guys are incredibly busy, and I appreciate you all assembling to have a chat with the Living Blindfully audience. And we’ll keep in touch. Thank you so much for doing this.
Greg: Thanks for having us.
Advertisement: Transcripts of Living Blindfully are brought to you by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at PneumaSolutions.com. That’s P-N-E-U-M-A solutions dot com.
Voice message: Hello, at largers. It is Chad from Fort Wayne, Indiana.
I wanted to share an app with you that I’m a big fan of. And if you are a weather enthusiast or even just like checking out the forecast now and then, this might interest you.
The difference with this app is not only can you see it on the screen as in with your favorite screen reader, what have you. You can also hear the forecast voiced by real meteorologists. Now I’m not talking, you know, a regional video that you click on and it includes, you know, your region of the country with multiple cities involved. No, you literally hear your pinpoint forecast voiced by real meteorologists.
It’s a company called Weatherology. They’re out of the Minneapolis, St. Paul, Minnesota area. And they also do weather forecasts for radio stations around the country. They used to be known as Weather Eye Incorporated.
And in recent years, they have come out with an app for the public, for phones, and also for smart devices like Lady A and those kinds of things.
For the longest time for radio stations, they would just record a forecast every so often, like I’m talking to you right now, the traditional way to record a forecast. Or in some cases, even do live forecasts, such as for morning shows and that kind of thing. But in recent years, they also developed vocabulary-based computer software that uses concatenation with the meteorologists’ voices to do the forecast that way.
So let me start by playing you an actual forecast as of right now, as of about [5:30] in the afternoon my time.
VoiceOver: Home button.
Chad: There’s the home button on the right side of the screen which allows you to change your home location, if you want. And right below that is the play button for the audio forecast.
Now, I’m going to turn speech off.
VoiceOver: Speech off.
Chad: I have an iPhone 14, by the way.
The reason I did that is so that it doesn’t start chattering as soon as the play icon comes up because it refreshes the screen. So here’s the forecast right now.
Female meteorologist: Here’s a look at Your Weather Eye forecast, powered by Weatherology.com.
Cloudy skies tonight, with lows around 28. Winds out of the east, 5 to 10 miles per hour.
40 tomorrow. A mixture of precipitation likely tomorrow night. 33, cloudy skies expected.
Low 40s, Wednesday, with mixed precipitation possible at times.
Turning warmer Thursday, with highs in the upper 40s and showers possible.
A dusting of snow tomorrow night.
That’s a look at your forecast. I’m meteorologist Jennifer Wojcicki.
Currently, it’s 36 degrees.
Chad: And there you go. How cool is that? Not bad for, you know, a vocabulary-based forecast. And it sounds pretty good on the radio, too.
And you’ll notice I have no music. I usually don’t use music, but there are music options. Basically, they’re stylings of one particular music bed or theme that they use in the app. So I’ll show you some of that here in a bit, and I’ll let you hear the other voices as well.
We’re going to start at the top left.
VoiceOver: Button settings.
Chad: I’ll walk you through a little bit of that here in just a few minutes.
VoiceOver: Button send.
Chad: Send. That tells it to send a forecast for your current location. Basically, it reloads or refreshes the forecast.
VoiceOver: Fort Wayne, In, button.
Chad: Fort Wayne, Indiana. If we were to push that, it opens your search icon.
VoiceOver: City, state/country. Search field.
Chad: I scrubbed out of that.
VoiceOver: Fort Wayne, In button.
Chad: And just to the right of that …
VoiceOver: Button search.
Chad: is the search icon, which opens that same thing.
Chad: This button is not marked, but I just discovered it. It puts the forecast in a chart format: the expected high and low temperatures for each of the 6 or 7 days of the forecast, the average high and low for that time of year based on the date, the expected precipitation for each day, and the average wind speed expected each day.
Chad: There’s an image there, I think, for that same button.
VoiceOver: Button home.
Chad: There’s the home button that allows you to change your home location, which is what you’ll get your notifications for. And also, depending on how you have it set, it can also be the forecast that shows up on the screen when you first open the app, which we’ll talk about that here in a bit.
VoiceOver: Now, 36 degrees.
Chad: Here’s now, 36 degrees.
I should point out, the screen updates more frequently than the audio does. So as far as current conditions go, you may hear variances in temperature and things like that between the audio and what VoiceOver reads to you on the screen, because the screen updates more frequently. But the audio does update, I think, 2 or maybe 3 times an hour.
And I’ll show you something with that, too, through other demonstrations of things when we get to that here in a second.
VoiceOver: 39 degrees.
Chad: 39 is the expected high, I think, for today.
VoiceOver: 28 degrees.
Chad: 28 tonight.
VoiceOver: Button play.
Chad: There’s the forecast play icon button we showed you earlier.
VoiceOver: Conditions: overcast. Wind: E6 miles per hour. Temperature: 36 degrees Fahrenheit. Feels like: 31 degrees Fahrenheit. Dew point: 28 degrees Fahrenheit. Pressure: 30.32. Humidity: 75%. Visibility: 10.0 miles. [7:56] AM. [5:12] PM. Last quarter.
Chad: That’s last quarter moon.
VoiceOver: Average high: 36 degrees Fahrenheit. Average low: 23 degrees Fahrenheit. Mon. 6 PM.
Chad: Monday, 6 PM. Now, we’re going into the hourly forecast. And in your settings, you can control how much is displayed on the screen initially. Simple or detailed with hourly forecasts, weekly forecasts, and current conditions. And I’ll show you where all that is in the settings menu here in a bit.
So let’s scroll down. Let’s get past all the hourly stuff.
VoiceOver: This week. Image. Tuesday. 39 degrees.
Chad: That’s tomorrow.
VoiceOver: Drop. Image. 0%. Cloudy with highs around 39. Winds out of the east 8 to 15 miles per hour.
Chad: Swiping to the right, by the way.
VoiceOver: Image. Tuesday night. 33 degrees. 75%.
Chad: There’s Tuesday night.
VoiceOver: A mixture of precipitation, likely late. Otherwise, overcast skies. Lows level off around 33. Winds out of the east 10 to 20 miles per hour. A dusting of snow possible.
Image. Wed. 42 degrees. Drop. 90%. Cloudy skies expected with mixed precipitation, likely changing to all rain. Highs around 42. Easterly winds 10 to 20 miles per hour. Precipitation amounts up to 0.23 inches.
Chad: And it goes all the way through, I think, Sunday. I think it’s like 6 or 7 days. That completes the home screen.
Now let’s take a brief tour around the settings.
VoiceOver: Button close.
Chad: Here’s the close button there on the top left.
VoiceOver: Settings heading. General button.
Chad: General. I’ll go to that here in a minute.
VoiceOver: Audio button.
VoiceOver: Favorites button. Maps button.
Chad: Just swiping to the right here.
VoiceOver: About button. Help Button.
Chad: Help. And that’s it. So let’s take a brief tour of the general menu here.
VoiceOver: Units button.
Chad: Units. That’s your Fahrenheit and Celsius, wind, precipitation, and barometric pressure settings for American versus the rest of the world.
And it may mess with you a little bit as far as where the icon is that tells you what’s been selected. But if you simply double tap on the words, the actual text of the individual settings, it will work. Sometimes, you’ll see the checkmark before it. Sometimes, you’ll see it after it. So don’t panic. It is there, and it’s very user-friendly.
VoiceOver: Theme heading. Automatic, switch button, on. Automatic. The theme setting will automatically choose to display the light or dark theme based on the time of day.
Chad: How cool is that?
VoiceOver: Current weather appearance heading.
Chad: That’s how much is on the screen for these appearances, which is what I was telling you about earlier. Simple is probably just basically the current temperature, and that’s it. Detailed is going to be your pressure, humidity, all that.
VoiceOver: Selected. Selected image.
Chad: And I have detailed selected right now.
VoiceOver: Simple. Module requires a tap to open. Detailed. Module is open by default.
Chad: The module, the tap to open is going to be to open the more detailed information.
VoiceOver: Weekly forecast appearance.
Chad: Same thing with the weekly.
VoiceOver: Simple. Partially detailed. Fully detailed. Selected. Selected image.
Chad: I have fully detailed checked.
VoiceOver: Simple. All modules require a tap to open. Partially detailed. The first 3 modules are open by default. Fully detailed. All modules are open by default. Default location heading. Home. Fort Wayne, In. Current location. Selected. Selected.
Chad: Current location is what I have selected.
VoiceOver: This is the location shown by default on the forecast tab.
Chad: So even though you have your home location set, which is what your notifications, all that are going to be for, you can tell it when you open the app to display your current location, which can be different from your home.
VoiceOver: Default tab. This is the location. Default forecast. Selected. Selected image Check. Maps.
Chad: I currently have forecast selected.
VoiceOver: This is the tab shown by default when the app initially loads.
Chad: Because maps don’t do me any good, because they’re visual.
VoiceOver: Default chart heading. Weekly. Selected. Selected image. Hourly. This is the chart shown by default on the forecast tab.
Chad: That’s going to be your basic high-low version of a forecast. Just the basic stuff.
VoiceOver: Tooltips. Switch button. Show or hide contextual tooltips that display within the app.
Chad: Sorry, I have it turned on.
VoiceOver: Show or hide contextual tooltips that display within the app.
Chad: That’s going to be things like, do you want the words like cloudy and that kind of thing in the current conditions.
VoiceOver: Automatic data refresh switch button On. Controls if the app automatically refreshes all weather data every 5 minutes.
Chad: There you go, which I have mine turned on. But apparently, again, it does not update the audio that frequently.
VoiceOver: Change app icon.
Chad: Change the app icon. That has to do with your display. I tried it a little bit ago. Every time you change it, even once, it’s going to give you an alert, and you have to hit the OK button. It’s a picker item, if you will, but you can’t just swipe up. It’s kind of a pain in the tail, so I’m not even going to mess with that.
Okay, that’s the general menu. Let’s back out of here. Back into settings.
VoiceOver: Notifications button.
Chad: There’s notifications. I think that’s pretty good. Well, you know, let’s walk through here real quick and see what that is.
VoiceOver: Notifications heading. Fort Wayne, Indiana.
Chad: Comes up with Fort Wayne, and a button there.
VoiceOver: Huntington, Indiana button. Indianapolis, Indiana.
Chad: That is going to display your favorites, and whether or not you get notifications for them, how many of those cities you want notifications for such as severe weather, and that kind of a thing.
VoiceOver: Settings. Audio button.
Chad: And you can probably actually click on each one and determine what you want. There’s the audio.
VoiceOver: Preferred voice heading button image. Paul Trombley.
Chad: Now, you can click on just the name itself. It’ll select it that way, too. You don’t have to look for the image, or anything like that.
VoiceOver: Button image.
Chad: And there’s a button and image for each one here.
VoiceOver: Derek Hite.

Chad: By the way, Paul Trombley is their chief meteorologist. There’s Derek Hite.
VoiceOver: Button image. Jennifer Wojcicki.
Chad: Jennifer Wojcicki, who actually, in reality, is now Jennifer Marlowe. She got married.
Chad: But she’s still using her maiden name on the air.
VoiceOver: Image. Megan Mulford.
Chad: Megan Mulford.
VoiceOver: Button image. Laura Lockwood.
Chad: And Laura Lockwood.
Those are your 5 meteorologists you can use. And I’m going to actually let you hear what each one sounds like before we’re done here.
VoiceOver: Music style heading button. Default button.
Chad: Default style.
VoiceOver: Rock button.
VoiceOver: Light button. No music.
Chad: And no music, which is what I currently have.
VoiceOver: Selected image. Checkmark. Loading sound effect switch button off.
Chad: I have that turned off right now.
VoiceOver: Controls the presence of a background sound effect when loading forecasts within the app. Auto play forecast on open switch button off.
Chad: Right now, I have that turned off because I was doing some experimenting, which I’ll explain here in a minute. You know what? I’m going to turn that on right now.
VoiceOver: On. If enabled, this will automatically play the forecast audio when the app initially opens.
Chad: Which I, the vast majority of the time, have used. The reason I turned it off is because I was trying to get Siri to work with playing the forecast, which it used to be able to do. It’s not doing it right now, for whatever reason. And you’ll hear that here in a second. There are Siri commands. And I turned off the automatic play of the forecast to see if that would fix it, and it did not.
VoiceOver: Siri command list.
Chad: Here we go.
VoiceOver: Hey Siri, open Weatherology. Hey Siri, play my forecast on Weatherology. Hey Siri, play my current weather alerts on Weatherology. Hey Siri, play the latest trending article on Weatherology. Hey Siri, play Today in Weather History on Weatherology.
Chad: Today in Weather History, that’s really cool.
VoiceOver: Hey Siri, play Weather Word of the Day on Weatherology.
Chad: Those two, Today in Weather History and Weather Word of the Day, can only be accessed by Siri. Everything else can be done in the app, selecting menus and things.
I’m going to put the default music icon on here. We’ll select that for now.
VoiceOver: Selected, default.
Chad: Okay, it’s selected. And I’m going to back out of here.
Now we’re still using Jennifer, but this will show you something else.
Every time the forecast audio is updated, whether you make a manual change or it updates automatically, the wording is a little different. What I think happens is the server just loads multiple versions of each forecast off the bat, one with each music style, and when you switch, it already has the audio forecast with it.
Chad: And here’s the forecast now.
VoiceOver: Speech off.
Chad: This is the default version of the music. They just have multiple versions of the same theme, if you will.
Jennifer: Here’s a look at your forecast, powered by Weatherology.com.
Cloudy skies tonight, with a low of 28.
Easterly winds: 5 to 10 miles per hour.
A mixture of precipitation likely tomorrow night, 33.
Low 40s Wednesday, with a chance for mixed precipitation.
Warmer Thursday, with highs in the upper 40s and scattered showers.
A dusting of snow tomorrow night.
That’s a look at your forecast. I’m meteorologist Jennifer Wojcicki.
Currently, it’s 36 degrees.
Chad: Alright. What I’m going to do now is turn the music off again. And then, back to back, I’m going to let you hear each of the other 4 voices, which again will demonstrate how each forecast can sound different, whether it’s through manual changes or through automatic updates.
We’ll start with their chief meteorologist, Paul Trombley.
Paul: Here’s a look at your weather forecast, powered by Weatherology.com.
Lows dip down to about 28 tonight under cloudy skies.
Winds out of the east, 5 to 10 miles per hour.
Cloudy tomorrow night.
A mixture of precipitation, likely. Lows dip down to about 33. Low 40s Wednesday with mixed precipitation possible at times.
Turning warmer Thursday with highs in the upper 40s and scattered showers.
A dusting of snow tomorrow night.
And that’s your forecast. I’m meteorologist Paul Trombley.
Currently, it’s 36 degrees.
Derek: Here’s a look at your weather forecast, powered by Weatherology.com.
28 tonight under cloudy skies.
East winds 5 to 10 miles an hour.
Overcast skies again tomorrow. Highs around 40.
Overcast skies again tomorrow night. A mixture of precipitation likely. Lows around 33.
Low 40s Wednesday, with a chance for mixed precipitation.
Turning warmer Thursday, with highs in the upper 40s and scattered showers.
A dusting of snow tomorrow night.
That’s a look at your forecast. I’m meteorologist Derek Hite.
Currently, it’s 36 degrees.
Megan: Here’s a look at your weather forecast, powered by Weatherology.com.
Cloudy skies tonight, with a low of 28.
Easterly winds, 5 to 10 miles an hour.
40 tomorrow. Lows around 33 tomorrow night. Chance for mixed precipitation.
Low 40s Wednesday, with a chance for mixed precipitation.
Warmer Thursday, with highs in the upper 40s and showers.
A dusting of snow tomorrow night.
That’s a look at your forecast. I’m meteorologist Megan Mulford.
Currently, it’s 36 degrees.
Laura: Here’s a look at your weather forecast, powered by Weatherology.com.
Lows of around 28 tonight, under cloudy skies.
Easterly winds, 5 to 10 miles per hour.
Overcast again tomorrow. High temperatures reach up to 40.
Cloudy tomorrow night. Mixed precipitation likely. Lows of around 33.
Low 40s Wednesday, with a chance for mixed precipitation.
Warmer Thursday with highs in the upper 40s, and scattered showers possible.
A dusting of snow tomorrow night.
That’s a look at your forecast. I’m meteorologist Laura Lockwood.
Currently, it’s 36 degrees.
Chad: So there you go. Now you know what each of the voices sounds like for the forecast. And they all do pretty well. They really do. I think Paul and Jennifer are the 2 that I like the most. Laura’s right up there, too.
Now, there are some things that are kind of set in stone, as far as what voice you’ll get with certain things.
For example, with any severe weather alert type things, if there is audio available, it’ll be Derek Hite, because he’s the one who did the forecast and/or the alert vocabulary. And in fact, for severe thunderstorm and tornado warnings, it even includes every county in the United States. Very, very good, as far as the way they did the vocabulary for that.
And then, there are other things. The articles, if you will. Those you can’t really control either because those are basically just audio files. And whoever wrote them also reads them.
Let’s go back to the settings menu here.
VoiceOver: Settings button.
Chad: We did audio.
VoiceOver: Favorites button.
VoiceOver: Change home location button.
Chad: You can change your home location right there.
VoiceOver: Your home location is currently set to Fort Wayne, Indiana. Remove favorites. Huntington, Indiana. Indianapolis, Indiana.
Chad: So you can remove any one of those.
VoiceOver: Settings. Back button.
Chad: You will be asked whether or not you’re sure about removing a specific favorite.
Chad: There you go. I want to show you that at the bottom of the screen there. Making sure that was the last one.
Okay. Back out of that. Let’s go back to the home screen here.
VoiceOver: Button settings.
Chad: And on the very bottom, from left to right on the very bottom of the screen, there’s like a toolbar thing there. And you have …
VoiceOver: Tab bar. Selected. Forecast tab. 1 of 4.
Chad: forecast. That’s your home screen, if you will.
VoiceOver: Maps tab.
Chad: To the right of that is maps.
VoiceOver: Favorites tab. 3 of 4.
Chad: Favorites. You can choose a favorite. In fact, I’ll even show you this real quick here. The way they’ve got this laid out here, you’ve got your couple of buttons.
VoiceOver: Button settings.
Chad: On the side there, your settings and your send button that we talked about earlier.
VoiceOver: Fort Wayne, In. Button.
Chad: There’s Fort Wayne, Indiana. That’s my home location.
VoiceOver: Image, image.
Chad: If you swipe from left to right, you’ll hear image, image. And then, you’ll hear…
VoiceOver: Fort Wayne, Huntington. 36 degrees. 36 degrees. Image, image. Indianapolis.
Chad: So you’re hearing image, image, city, city, temp, temp.
What you do is, depending on what city you want, you tap on either the first or second image of each block of 2. You’re going to have a total of 10 favorites.
VoiceOver: Image, image. More. South Hill.
Chad: That’s Moore, Oklahoma. And South Hill, Virginia.
VoiceOver: 53 degrees, 43 degrees.
Chad: 53 and 43 respectively. Two more.
VoiceOver: Enterprise, Alabama. Janesville, Wisconsin. 63 degrees, 37 degrees.
Chad: 63 and 37.
VoiceOver: Image, image. St. Paul.
Chad: St. Paul, Minnesota. And Firestone, Colorado.
VoiceOver: 33 degrees, 42 degrees.
Chad: 33 and 42.
There’s your search icon, if you want it. And then, back to the tab bar there at the bottom. So you want to click on either of the 2 images you see with each block of 2 cities, 2 by 2 by 2 by 2 by 2, and it’ll bring it up. Now, for a while anyway (I haven’t messed with it in a while), you only had so long on that particular city, several minutes, before it would automatically put you back to the home screen. I’m not sure if it still does that or not, but just be aware of that.
VoiceOver: Button settings. Button close. Button settings.
Chad: Okay, that’s your top of your screen there.
Let’s go back to forecast tab, maps, favorites, …
VoiceOver: Content tab.
Chad: And the content. Those are your articles I was telling you about. Let’s open that up here.
VoiceOver: Button Fort Wayne. Search image. Search articles. Text field. 8 days ago. Ice, a Slippery Subject. Michael Caro.
Chad: Now, Michael Caro has been the one writing the most articles lately. He does a lot of things with scientific stuff and some of the environmental and those kind of things, too.
But for a while, one of the others, Megan, was doing what she called Wildlife Wednesday: how weather relates to animals and those kinds of things.
And then, what I thought was a really cool one: their chief meteorologist, Paul Trombley, had a feature for a long time called Professor Paul Thursday. Those articles explain various aspects of meteorology, elaborating on the science, the definitions, and what causes different weather phenomena and those kinds of things.
VoiceOver: Page 3 of 141.
Chad: Now, if you scroll, scroll, scroll, …
VoiceOver: 2 months ago. Paul Trombley.
Chad: See, here’s a Paul Trombley article right here.
VoiceOver: Professor Paul Thursday. Why do leaves change color?
Chad: Now, let’s just click on that.
VoiceOver: Today’s topic: the science behind why leaves change color in the fall.
Chad: There you go. Now, you can either read it, or you can hear it.
Paul: It’s Professor Paul Thursday. And today’s topic is the science behind why leaves change color in the fall.
Chad: Let’s just pause that for now, and let’s just go back to the home screen, the forecast screen.
But that’s the way you can go through the articles. Every once in a while, one of the other meteorologists will do a weekend weather spotlight, which tells what the weather’s going to be like for an upcoming weekend. Those articles disappear after the Monday of the following week, for obvious reasons: their time value. But those are pretty cool to hear, too.
Now, I’m going to search for a city that I believe has a winter weather alert, and let you hear what things look like and sound like when there is some kind of a severe weather bulletin you should know about. Stand by for that.
Okay, I just loaded St. Paul, Minnesota. And by the way, I kind of mistakenly (or I thought it was mistakenly) pressed the actual text of St. Paul, and it loaded up. So you can do it that way, too. You don’t have to search out the image. So I’m glad to know that.
So let me look here. And the reason I know this is because St. Paul is in my favorites, and I’ve also chosen to get notifications from it whenever they have some kind of a severe weather event, because I have several friends who live up there, and I used to have family up there, too.
VoiceOver: Button. Search button image. Button. Image. Winter weather advisory.
Chad: There you go.
VoiceOver: Starts Tuesday, December 13th, 1 PM. Image. Now.
Chad: Swipe back to that. And I’m going to double tap.
VoiceOver: Winter weather advisory.
Chad: And I’m going to swipe to the right.
VoiceOver: Starts Tuesday, December 13th, 1 PM. Button play.
Chad: Now, I’ll show you the play button here in a second. But if you swipe to the right here. …
VoiceOver: Winter weather advisory in effect from noon Tuesday to noon CST Wednesday.
Issued: 2:51 PM CST, Mon, Dec. 12, 2022.
What? Mixed precipitation expected. Total snow accumulations of up to three inches, and ice accumulations of a light glaze. Winds gusting as high as 50 miles per hour.
Where? Portions of central, East central, and south central Minnesota, and west central Wisconsin.
When? From noon Tuesday, to noon CST Wednesday.
Impacts. Plan on slippery road conditions. The hazardous conditions could impact the evening commute. Gusty winds could bring down tree branches.
Precautionary/preparedness actions. Slow down and use caution while traveling.
Chad: That’s MN and WI, respectively.
VoiceOver: Image. Now. Image.
Chad: Now you’re on the now screen we talked about earlier.
Now, if I go back here, here is the play button. They keep this pretty basic, but it is a safety message, and the audio will include the expected start and finish times. So here’s what it does.
Derek: A winter weather advisory begins at noon tomorrow, and continues until noon Wednesday.
A winter weather advisory means we can expect accumulating snow, or even blowing snow at times. A mix of sleet and light freezing rain sometimes accompany these winter weather systems as well.
Be prepared for slippery road conditions, and give yourself extra time when traveling. As with all winter weather events, if you’re traveling, always keep a winter survival kit with you.
A winter weather advisory begins at noon tomorrow and continues until noon Wednesday.
Chad: And you can hear another version of the theme they have there, too.
Now, it does not play that automatically. It just gives you a regular text notification. Then, you open the app.
Let me also play you the forecast for St. Paul, Minnesota. And you will hear not only the advisory mentioned in it, but also expected snow totals. Listen to this.
Laura: Here’s a look at your weather forecast, powered by Weatherology.com.
A winter weather advisory begins at noon tomorrow, and continues until noon Wednesday.
Overcast tonight with a low of 32. Winds out of the southeast, 10 to 15 miles per hour.
Mixed precipitation likely tomorrow. Highs around 36. Overcast.
Scattered rain showers likely tomorrow night. Lows around 34. Cloudy.
Upper 30s Wednesday with a chance for scattered rain showers.
Turning cooler Thursday with highs in the mid-30s and mixed precipitation.
A dusting of snow tomorrow with another 1 to 2, Wednesday night through Thursday.
That’s a look at your forecast. I’m meteorologist Laura Lockwood.
Currently, it’s 35 degrees.
Chad: Now, here is what a severe weather bulletin can really sound like when it’s really detailed. In this case, I believe it’s a tornado warning.
I recorded this several months ago, and I didn’t have it plugged into the interface. I just held my microphone up to the phone when I recorded it because I wanted to catch it right away and I wanted to show it to people. So here’s what it sounds like. This is for the Pittsburgh, Pennsylvania area.
Derek: A severe thunderstorm warning continues until 5:45 PM this evening for Hancock, Brook, Washington, Beaver, Allegheny, and Jefferson counties.
At roughly 5:20 PM, the strongest part of the storm was centered near Fairhaven, pushing southeast at around 25 miles an hour.
Both large hail and straight line winds are possible with this storm. Pea-sized hail and winds up to 60 miles an hour are likely.
For your safety, stay inside and away from windows until the storm passes. Large hail and strong winds can easily shatter glass, causing injury.
If you’re caught outside, take shelter in a sturdy building or a hardtop car. Take shelter now.
Again, a severe thunderstorm warning continues until 5:45 PM this evening for Hancock, Brook, Washington, Beaver, Allegheny, and Jefferson counties.
Bear in mind, this storm has the potential for pea-sized hail and wind gusts up to 60 miles an hour.
Chad: Let’s have one last little bit of fun, shall we? We’re going to do the search, because you can search outside of the US, and it will work quite well.
So we are going to not only search for somewhere, but we’re going to change the settings too, to show you that that can be done. And as soon as you know where we’re going, you’ll understand why I’m doing it.
VoiceOver: Fort Wayne, In, button.
Chad: There’s the icon, Fort Wayne. Let’s go ahead and hit that.
VoiceOver: City, state/country. Search field. Is editing.
Chad: And here we go.
VoiceOver: W-E-L-L-I-N-G-T-O-N. Wellington, New Zealand.
Chad: [laughs] I wonder who lives there.
VoiceOver: Settings button. Wellington, New Zealand button.
Chad: There you go.
Now, let’s go into settings, and we’re going to change one thing.
VoiceOver: General button.
Chad: Go to general.
VoiceOver: Theme heading. Units button.
VoiceOver: Selected, Celsius.
Chad: I’m on Fahrenheit right now. Let’s switch it to Celsius.
VoiceOver: Selected, Fahrenheit. Temperature unit, Celsius.
Chad: It doesn’t change the location of the icon, but we’ll know if it worked right here in a minute.
VoiceOver: Kilometers per hour.
Chad: Let’s do kilometers per hour. And we’ll do rainfall.
I’m not going to worry about snowfall for now.
VoiceOver: Selected, millibars.
Chad: And we’ll do millibars for barometric pressure.
VoiceOver: Visibility units heading. Selected, kilometers.
Chad: And we’ll do kilometers for the heck of it, because I’ll show you the layout of the screen there with the current conditions as well.
So back out of there. Go back to here.
VoiceOver: Notifications button. Audio button. Favorites button.
Chad: Oops, I’m still in the menu there. There we go.
VoiceOver: Image. Now, 26 degrees.
Chad: 26 Celsius.
VoiceOver: 19 degrees.
Chad: Low of 19, expected.
VoiceOver: 16 degrees button.
Chad: Or apparently, maybe it was a high of 19, low of 16. I’ll look here in a minute.
VoiceOver: Clear skies.
VoiceOver: Wind: NE 2 kilometers/h.
Chad: Northeast 2 kilometers.
VoiceOver: Temperature: 26 degrees Celsius.
Chad: Temperature is 26 Celsius.
VoiceOver: Feels like: 26 degrees Celsius.
Chad: Feels like 26.
VoiceOver: Last quarter: Tuesday, 1 PM.
Chad: Okay. Let’s go past all that, and let’s do this.
VoiceOver: 16 degrees. A chance of 16 degrees. Tuesday night. Image.
Chad: Aha! That’s…
VoiceOver: Clear skies expected. 5%. Draw. 19 degrees. Tuesday image.
This week image. Tuesday.
Chad: Tuesday. What time would it be there? Yeah, it would be Tuesday morning, so…
VoiceOver: 19 degrees. Draw. 5%. Clear skies expected. High temperatures around 19. NW winds at 8 kilometers/H.
Chad: Boy, you blew past that already by about 7 degrees Celsius, or about 12 or so Fahrenheit. 26 would be 79, I think, Fahrenheit. 19’s like 66, so there you go.
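Chad’s mental math here checks out. The standard Celsius-to-Fahrenheit conversion can be sketched in a couple of lines of Python; the temperatures are just the two readings from the demo:

```python
def c_to_f(celsius: float) -> float:
    """Convert a temperature from Celsius to Fahrenheit."""
    return celsius * 9 / 5 + 32

# The two readings from the Wellington demo.
print(round(c_to_f(26), 1))  # 78.8, close to the estimate of "79"
print(round(c_to_f(19), 1))  # 66.2, close to "like 66"
```

The gap between the two is about 12.6 degrees Fahrenheit, matching the "about 12 or so" figure.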
VoiceOver: Image. Tuesday night. 16 degrees. Draw. 0%. A chance of light rain. Otherwise, mostly cloudy skies. Low temperatures around 16. N winds at 21 kilometers/H. Precipitation amounts around 2 millimeters possible.
Chad: Now, that’s interesting. It said 0%. But then, it says a chance of rain.
VoiceOver: Image. Wed. 18 degrees. Draw. 0%. A chance of light rain. Otherwise, partly sunny skies. High temperatures around 18. N winds at 19 kilometers/H.
Chad: Not sure why that is, but…
VoiceOver: Image. Wed night.
Chad: It could be the sources that they use may not be…
VoiceOver: 17. Draw. 0%. Overcast skies expected. Low temperatures around 17. N winds at 10 kilometers/H.
Chad: Anyway, you get the idea. So let’s go to the top.
VoiceOver: Button heart. Button play.
Chad: The heart there is the favorite button.
Chad: It replaces the home location, and it says heart because it’s going to basically ask me whether I want to make it a favorite, which right now, I do not.
VoiceOver: Button play.
Chad: Just for demonstration here. And you know what? Let me do this.
VoiceOver: Button settings.
Chad: Settings, Audio, …
VoiceOver: Paul Trombley.
Chad: And let’s do Paul. Paul Trombley. There we go.
Chad: Back out of that.
VoiceOver: Button, button, button play. Speech off.
Chad: Let’s hear the forecast.
Paul: Here’s a look at your weather forecast, powered by Weatherology.com.
High of 19 this afternoon. Under sunny skies. Northwest winds, 10 to 15 kilometers per hour.
16 tonight. Chance for scattered showers.
Showers again tomorrow. 18.
Near 20, Thursday.
Turning cooler on Friday, with highs near 18.
And that’s your forecast. I’m meteorologist Paul Trombley.
Currently it’s 23.
Chad: Okay. Remember I told you that the audio doesn’t update as fast as the on-screen information? So apparently, it has warmed up a bit since that last audio release. But even so, it’s 23 degrees Celsius according to the audio version of the forecast.
So there you go! Thought you’d like to hear that, Mr. Mosen!
And one more thing I want to show you. At the time I’m recording this, it is December 12th of 2022, and it’s almost December 13th. It’s almost 10 till midnight. So I’m actually gonna do this at this time. And then, I’m gonna wait a few minutes and we’ll do it again for both features. And that is Siri commands.
Now the one that’s not really working well at the moment is to play the current forecast. And I haven’t tried the articles one yet.
But the 2 I want to show you, I mentioned there are a couple that are only accessible via Siri. Those are Weather Word of the Day and Today in Weather History. And it turns out, Today in Weather History has always been tied to the day itself, meaning it should change at midnight. Weather Word of the Day didn’t use to be. It used to be that you could say “Play Weather Word of the Day” multiple times and get a different word each time, back to back to back.
Now apparently, that is literally a day-to-day feature as well, which is a good thing since they call it Weather Word of the Day. But either way, I want to show you these 2 cool features. Here we go.
Play Today in Weather History on Weatherology.
Siri: Today in Weather History, now playing on Weatherology.
Paul: On this day in 1987, a winter storm packing heavy snows and high winds hit much of the Southern Rockies as well as the Southern High Plains.
Snowfall totals in New Mexico reached around 2 and a half feet at Cedar Crest, with around 3 feet reported in the higher elevations.
Winds reached around 75 miles per hour, with gusts that nearly reached an incredible 125 miles per hour just northeast of Albuquerque, New Mexico.
Chad: By the way, as far as I know, they’re not all USA stuff. So if something somewhere else in the world was dramatic enough to make history, it gets mentioned. And I’m pretty sure that Paul updates them pretty regularly, too. So you won’t get the same one each day of each year, I wouldn’t think, anyway.
Let’s try the other one – Weather Word of the Day. Here we go.
Play Weather Word of the Day on Weatherology.
Siri: Now playing Weather Word of the Day on Weatherology.
Paul: Seiches: standing waves that oscillate back and forth over an open body of water. They are most common in large bodies of water, such as bays or lakes.
The impetus for a seiche to occur is persistent strong winds that push water from one end of the body of water to the other.
When the winds let up, the water that has been pushed to one side of the bay or lake starts to oscillate to the opposite shore, and then back again.
Anyone who sloshes water from one side of a bathtub to the other side has witnessed a small-scale seiche.
Chad: Okay. That’s not the same one I got before. So Weatherology is making a liar out of me.
But I’ll tell you what I’ll do. I’m going to wait a few minutes till after midnight, and then we’ll see if we get a new one of each of those features. Standby.
Okay, it’s after midnight. I did a little experimenting. Weather Word of the Day is not dependent on the day. Today in Weather History is. So here we go for now with both.
Play weather word of the day on Weatherology.
Siri: Here’s weather word of the day on Weatherology.
Paul: Cyclolysis: the process that occurs when a cyclone is weakening and decaying. At the end of cyclolysis, the cyclone has become extinct.
Chad: And, …
Play today in weather history on Weatherology.
Siri: Today in weather history, now playing on Weatherology.
Paul: On this day in 1975, unusual December severe thunderstorms raked south central Kansas with 60 to 100 mile per hour winds. Extensive damage occurred in Wichita and Hutchinson. However, details of the damage were not available.
Chad: And there you go. That is December 13th.
Thanks for listening to this rather long tutorial. Hope you enjoy it.
It’s a really cool app overall. Some things, especially notifications of alerts and things like that, may work better in the US than they do in other countries. But it’s a fun app to play around with. And look at it this way: you’ve got a team of meteorologists that you can hear from right in your pocket.
Thanks for listening. And Jonathan, thanks for letting me have this demonstration as a part of your podcast.
Jonathan: Well, thank you for producing it, Chad. And Chad produced that quite some time ago. It’s been a bit of a difficult one to squeeze in due to its length. But this seems like a great time, right? Here we are towards the end of July, and everybody’s got weather of significance.
People are complaining about the weather because it’s hot in the northern hemisphere. And actually, there are some really concerning weather elements in the northern hemisphere.
Here in the southern hemisphere, it’s cold. So this seems like an optimal time to talk about the weather. Although we do do a lot of talking about the weather all the time, don’t we? [laughs]
Thank you very much for listening. We look forward to your contributions. They are critical to keeping the show lively.
So if you hear anything that you want to comment on or something new is on your mind that you want to raise, feel free to do all of that using our normal contact channels. We love to hear from you.
We’ll see you back next week for episode 242.
In the meantime, remember that when you’re out there with your guide dog, you’ve harnessed success. And with your cane, you’re able.
Voiceover: If you’ve enjoyed this episode of Living Blindfully, please tell your friends and give us a 5 star review. That helps a lot.
If you’d like to submit a comment for possible inclusion in future episodes, be in touch via email. Write it down, or send an audio attachment: email@example.com. Or phone us. The number in the United States is 864-60-Mosen. That’s 864-606-6736.