This transcript is made possible thanks to funding from InternetNZ. You can read the full transcript below, download the transcript in Microsoft Word format, or download the transcript as an accessible PDF file.
Jonathan Mosen: I’m Jonathan Mosen. This is Mosen At Large, the show that’s got the blind community talking. This week, Elon Musk is buying Twitter. Have you ever served on a jury? Greg Stilson and William Freeman from APH talk about the new eBRF format, multiline displays, the Mantis, the Chameleon, and more.
Voice over: Mosen At Large podcast.
Jonathan: Welcome to episode 176. I hope you have had a good week, and it’s nice to be back with you again. Well, you know there’s a big tech story happening when it transcends the technology press and becomes mainstream news. Elon Musk’s acquisition of Twitter has certainly got people talking. He’s handed over a cool $44 billion to be Twitter’s exclusive owner, assuming shareholders and regulators sanction the deal which seems likely. What does it all mean for Twitter? What does it mean for those of us who use Twitter? When I reviewed the Spring app for Twitter in last week’s episode, I made some comments about why Twitter is my favorite social network from an accessibility point of view.
Its content is still fundamentally text-based. I prefer it, because there are several excellent third party Twitter clients that allow me to bypass the algorithms and read tweets in chronological order. One thing I didn’t comment on last week is the democratizing nature of Twitter. This is particularly the case in a small country like New Zealand where we have less of a hierarchical culture. Twitter has allowed me to communicate with politicians, journalists, and business leaders in a way that has made a difference. Twitter has facilitated problem resolution with various businesses whose products I own.
It’s a tool that I have made use of extensively as a content creator through the MushroomFM hashtag and the Twitter account for this podcast. Just as easily, it’s a way to shoot the breeze with friends and share information among the blind community. That’s why what happens to Twitter matters to me. My view may change based on what happens to Twitter in the coming months, but at this point I’m just not interested in alternatives like Mastodon, because they’re too fragmented. That fragmentation means that you’ll never get the same diverse set of experiences from one source. Elon Musk has described himself as a free speech absolutist.
Although in recent days he’s been seeking to clarify that comment saying that Twitter must act within the law of each country within which it operates. What does free speech absolutism mean? Well, we know what it means, because those of us who’ve been on Twitter for a long time have lived that nightmare. When you run a commercial entity as Twitter is, your goal is to increase revenue. Key to increasing revenue is to have as many people consuming the product or service as possible. Twitter still has serious problems, but at least in recent times they haven’t washed their hands of their problems, and they used to do that. The fact is that Twitter was an unsafe space for many people.
Cyber bullying was rampant with Twitter taking this hands-off approach saying they’re not the police. The result ruined lives. Social media mobs are an ugly experience to watch. They are even worse to be at the receiving end of. Britain in particular seems to have developed a reputation for the police intervening when people make defamatory statements about someone online, and I applaud them for it, because when you post a tweet you have published something. In New Zealand, we now have a Harmful Digital Communications Act which at least in theory makes it a criminal offense to cyber bully someone. Something happens to some people when they get behind a keyboard. It short circuits their empathy.
I’m talking about much more than robust debate here. I’m talking about seeking to destroy an individual’s character and reputation often by simply making stuff up, or making a calculated decision to be as confrontational as possible. You know that old saying ‘sticks and stones may break my bones, but words can never hurt me’? It’s nonsense. I’m aware of people who have attempted suicide because of cyber bullying. I know of people who have succeeded. If you’ve never been at the receiving end of a social media barrage of untruth and hate directed at you, it’s easy to be smug. It’s easy to say, “Just block them.” It’s easy to say, “Who cares what a bunch of randoms think about you?”
It’s easy to say, “Take them on. Give as good as you get.” There are ramifications of doing any of these things. Random, stupid abuse is one thing. A calculated tissue of lies designed to destroy someone’s reputation is criminal, and shouldn’t be tolerated. If you’re interested in reading more about the ugly side of social media, I recommend checking out a book called So You’ve Been Publicly Shamed. It’s by journalist Jon Ronson. It was published back in 2015, before there was a Twitter troll and spreader of disinformation in the White House. It’s gotten a lot worse since then, but it’s still an enlightening read.
There’s a story worth reminding you of. You might remember that back in June, 2018 the world collectively held its breath as a dozen Thai boys who were part of a soccer team as well as their coach were trapped in a flooded cave in Northern Thailand. A carefully planned rescue saw them all brought out to safety. Elon Musk suggested some of his technology people could design a small submarine to get them out, and I believe eventually that that submarine did actually turn up although it was never used.
Vernon Unsworth was at the time 64 and an experienced cave explorer from Britain who helped to recruit divers to perform that rescue, that amazing rescue that got the boys out, and Vernon Unsworth was later awarded an MBE, Member of the Order of the British Empire, by the Queen for his part in the rescue. Mr Unsworth suggested that Elon Musk’s proposal was no more than a publicity stunt, and that he should stick his submarine where it hurts. Not exactly a classy comment, but Elon Musk took things to a completely new and classless level by referring to Vernon Unsworth as, and I quote him, “a pedoguy.”
Elon Musk later apologized and deleted the tweet, but when you’re talking about a situation involving 12 young boys, and you make that comment merely because someone experienced in the field is skeptical about a plan, the intention is very clear. That is a despicable thing to say about anyone. It is one of the worst things that you could say about anyone. A Buzzfeed reporter later contacted Elon Musk for comment about the “pedoguy” tweet. Elon Musk replied, “Stop defending child rapists.” This all happened because Elon Musk took offense at something a hero said. He had no basis, no evidence for making the comment and has never offered any.
This went to court eventually in the United States in a defamation trial. At that court case, Vernon Unsworth said this, “It feels very raw. I feel humiliated, ashamed, dirtied. Effectively from day one I was given a life sentence without parole. It hurts to talk about it. I find it disgusting. I find it very hard to even read the word, never mind talk about it.” In an example of how discourse in the United States is fundamentally broken, a jury found Elon Musk not guilty of defamation. The jury agreed with Elon Musk’s argument that calling a hero a pedoguy was just a trivial taunt on a social media platform that everyone views as a world of unfiltered opinion which is protected as free speech rather than statements of fact.
Mr Musk’s court papers cast his comments as part of the rough and tumble world of Twitter which rewards and encourages emotional outbursts, and sucks in readers worldwide, but that no one takes seriously. When the court cleared Elon Musk of defamation, his response was to say that his faith in humanity had been restored. He got away with it, and it’s sick. When the sole owner of Twitter is himself a cyber bully what does it mean for Twitter’s anti cyber-bullying efforts? Will they be scaled back? Will they be abandoned entirely?
I think it’ll be difficult for Twitter to do that, because while Elon Musk may have set a dangerous precedent by successfully challenging the defamation suit brought by Vernon Unsworth, other countries are taking a more sensible hands-on approach, and, if anything, scrutiny is increasing. Social media is tearing at the fabric of society, at the boundaries between truth and lies, and it’s dangerous. There’s no doubt that people have died, because they went down rabbit warrens of disinformation around COVID-19.
Again, in the United States, faith in democracy is being tested as people seriously believe, despite no credible evidence, that a loser in a presidential election was actually the winner. What’s happening in this area is both a symptom and a cause. Inequality in Western society is itself a pandemic. When you have a large number of people who feel no hope, who feel that the odds are stacked against them and that there’s no way to get ahead, of course it’s fertile ground for the peddlers of disinformation. There are some complex ethical questions to think about here. Who’s the arbiter of what is truth? Who appoints them? How can we be assured that they themselves can’t be bought off, can’t be manipulated?
One significant source of disinformation is bots operated by foreign governments who have an interest in sowing discord in democratic nations. Their impact is immense, and there’s no better way to demonstrate that than to look at the dramatic decrease in nonsense being posted about COVID-19 vaccines following Russia’s disconnection from Twitter. These bots are damaging and they are evil. Then there are the algorithms to consider, those little bits of computer code that reward the controversial and the contentious, fueling anger, which encourages engagement, which is what makes these social media platforms money. Elon Musk has commented negatively about the bots and the damage they are doing.
I am encouraged by that. He’s talked of authenticating all humans. I wait to see precisely what’s involved in doing that. After all, Facebook has a real name policy and is in theory, a bit more rigorous about authentication than Twitter and it is in many ways far worse than Twitter is. We’ll also need to have some pretty robust assurances about what is done with that authentication data, how it’s safeguarded. Elon Musk’s behavior has caused me to be wary of him. I’m not sure I’m comfortable with him owning what could be one of the most accurate databases of names and addresses in the world. Still, if new measures can get the nefarious bots under control and out of our lives, that will be a very good thing.
Twitter will need to be profitable under Elon Musk’s ownership. He’ll want a return on his considerable investment. There is hope that the way that the profit is obtained might be different, might be better. As I mentioned last week, Twitter’s relationship with third-party developers has been erratic, and that’s caused harm to the Twitter ecosystem. If we’re moving to a platform where third-party innovation is encouraged, where third-party developers can tap into every Twitter feature, including Spaces, I would welcome that. It’ll be a positive thing for accessibility. If there’s one wish I have for Twitter, it’s the removal of the 280-character limit. It’s no longer necessary.
Twitter’s original limit was 140 characters, and there was a technical reason for that. 140 characters allowed a 20-character margin for sender information when a tweet was sent via SMS to your phone. The maximum length of an SMS is 160 characters. That used to be a popular feature back in the day. When Twitter increased its limit to 280 characters, it was a signal that technology, as it does, had moved on. I think 280 characters still aren’t enough for good quality civil discourse. Would having no character limit suddenly mean the end of trolling, and calling decent, innocent people pedoguys? Well, clearly not. But 280 characters simply aren’t enough to express complex, nuanced ideas.
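As a purely illustrative sketch of the arithmetic just described (the constant names below are my own, not anything from Twitter or the SMS standard):

```python
# Why the original tweet limit was 140 characters: a single SMS
# carries at most 160 characters, and Twitter reserved a 20-character
# margin for the sender's information. Names here are illustrative.
SMS_MAX_CHARS = 160   # maximum length of one SMS message
SENDER_MARGIN = 20    # room reserved for the sender's identity

tweet_limit = SMS_MAX_CHARS - SENDER_MARGIN
print(tweet_limit)  # 140
```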
Having one individual be they a sinner or a saint own such an influential platform is an incredible risk. Last week, Barack Obama made a thoughtful speech at Stanford University about measures that might be taken to repair and then safeguard public discourse that has become so fractured because of social media. It’s fair to point out that his own administration was asleep at the wheel on this point. Still, he had some thought-provoking things to say, and the speech is worth a read or a listen. The Twitter acquisition has still not formally taken effect. There are more questions than answers and the jury is still out on the degree to which Elon Musk really will be hands-on with Twitter.
Right now, it’s his latest toy. He has other companies to run as well. One would think that conquering the infinity of space is a much more interesting challenge than social media. Reality will bite when he realizes the regulatory complexity of operating a social media platform that must be mindful of the laws of each country within which it is available. In terms of whether I stay or go, I’m prepared to wait and see. Nervously, for sure, but I’ll wait and see. Rebecca Skipper says, “I’m not thrilled with the idea of Elon Musk buying Twitter, but I could be overreacting. After all, I wanted to leave Facebook when all the scandals broke out but never did.”
Robert Kinget tells me that he is putting together a guide to Mastodon which is a Twitter alternative. If you are interested in reading that guide, which is actively under development, you can go to www.starshipschangeling.net/mastodon. That is spelled M-A-S-T-O-D-O-N. I’ll put a link to that in the show notes. He says, “Here are some free Mastodon apps with lots of accessibility features on iOS and Android.” On iOS, you could try Mercury for Mastodon or you could also try one called Toot. There’s also a Mastodon client called Metatext. Over on the Android platform, you might like to try one called Tusky.
Voice over: What’s on your mind? Send an email with a recording of your voice or just write it down. jonathan@mushroomfm.com. That’s J-O-N-A-T-H-A-N@mushroomfm.com or phone our listener line. The number in the United States is 864-60Mosen. That’s 864-606-6736.
Jonathan: Many people have been giving the Spring app for Twitter a try after our review of it on Mosen at Large episode 175, and some listener comments on it. First from Derry Lawler, who says, “Hi, Jonathan, I really enjoy your podcast on a weekly basis. I just heard about the Spring Twitter client from you and I thought I would give it a go. Wow, I do like the interface, especially when using it with a Braille display. I like to see the person’s name on one line and the rest of the tweet which follows, giving me a heads up if it is a retweet or a mention.” Isn’t that interesting, that that was one of the criticisms that I have of it and that’s a feature that you like? Just goes to show, different strokes and all that kind of thing.
Derry continues, “I am not a fast reader of Braille. Actually, kind of slow but would like to read faster in Braille. Any ideas how to learn how to pick up my speed? I can read a lot of books with my trusty Focus 40 Blue, but a bit on the slow side.” I get asked, Derry, a lot of questions about my Braille reading, and people say, “How do you read at the speed that you read at?” It’s hard to dissect something that is just muscle memory. I’ve been reading Braille for well over 45 years now. What I’m conscious of doing if I try and slow it down and take it apart, is that I’m actually reading two sections of a Braille line at a time and my brain is processing them separately.
My left hand is reading half the line and my right hand is reading the other half of the line, and then I scroll the Braille display. It’s weird when you say it out loud like that. I think practice is the key thing. The more you read, the faster you get. There may be others who have really delved into techniques, maybe speed reading techniques or whatever, for Braille reading. I wonder if it’s something that Hadley covers in any of its courses on Braille. If you have any hints for Derry in terms of improving your Braille reading speed, is this something that you have consciously tried to do, and what worked for you? By all means, be in touch. That would be such a cool discussion to have.
Let’s get back to Spring and hear from Stephanie Mitchell. She says, “Hello, Jonathan. Firstly, I would like to thank you for a brilliant podcast. It’s one of the highlights of my week.” Well, thank you. “I want to comment on the new Spring Twitter client. It’s very similar to Twitterrific, but with a few excellent features. The search capabilities are more robust with Spring. I also love the keyboard shortcuts. I also appreciate the lightning-fast speed with which the app refreshes the timelines.” Good on you, Stephanie. Thank you very much for writing in. Now, I do have a couple of additions to my Spring review from last week. One came through on social media and I’m really grateful for this one.
That is that there are two additional gestures that you can make use of for actions. We covered extensively in the review how you can flick up and down to get all the actions you want and you can order those in a way that suits you. You can also, at the bottom of that screen in settings where you configure actions, double-tap a button that lets you configure swipes left and right. When you’re using VoiceOver, you get to those by swiping left and right with three fingers. By default, these options aren’t enabled at all. Since learning this, I have configured my three-finger swipe to the left to activate links and my three-finger swipe to the right for replying.
It’s really great to have taken those off the actions rotor which declutters it a little bit. I’ve now got very easy access to replying to tweets and activating links. The flexibility of this app is just amazing and there are little nuggets that you discover all the time. The other thing I would also add is that if you do want to be on the cutting edge of the Spring app, there is a public invitation available for you to beta test.
The Spring app is actively developed through TestFlight, which, if you’ve not used it before, is a tool that you can download from the App Store to beta test various apps. Because of the active development of the app, it means that some of the criticisms that I offered last week have already been addressed. For example, the duplication of text on the share sheet has now been fixed. This is great. If you want to make sure you have the latest and greatest with Spring, you can opt into the TestFlight program.
The way to do that is to go to the about part of settings, double-tap that about button, and right inside there you will find a TestFlight button that will invite you to be a part of the beta test program for Spring. You will have to have TestFlight installed. It is an Apple app available free from the App Store. Are you running any Windows Insider builds? It’s a good way to stay on the cutting edge, to provide feedback to Microsoft, and generally live the dream of being first with stuff. Marissa has a couple of questions about the Windows 11 Insider build. She says, “Have you noticed that the natural voices for Narrator sound fine, but then eventually sound very garbled?” Yes, I think that might be fixed now, Marissa.
Not only have I noticed it, but so have Microsoft. They actually did put in the release notes that this was an issue, that they knew about it, and that it would be fixed soon. I don’t use Narrator a lot, so I haven’t used those voices a lot, but by now, it may well be fixed. They certainly know it’s an issue. The fun of beta testing. “Secondly,” says Marissa, “I have installed a preview build of Windows 11 and I’ve checked the box that says ‘Unenroll this device when a new version of Windows is pushed out to the public.’ Is this what I need to do in order to receive Windows 11 without the builds?” Yes, it is. What you are on now, Marissa, is this track that you can’t get off. It’s like the Hotel California of operating systems.
You can check out anytime you like, but you can never leave. You can, however, leave when the next public build of Windows 11 is out. If you’ve checked that box, it means that the moment the next public build is released, you will get that build, and then you will stop getting Insider builds unless and until you re-enroll in the Insider program. I’ve actually done this on my studio machine that I’m producing this with. I decided that I would put my ThinkPad on the Insider builds. I have no intention of putting this machine on the Insider builds, except that it got me out of a bit of a bind. I talked about this some weeks ago when it happened.
A workaround was just to get on the Insider builds and get up and running again, otherwise it would have taken much longer to fix the problem I was having. However, this is not my play machine. This is my work machine. This is the “it must be dependable” machine. This is the “I’ve got to get stuff done on this reliably” machine. The moment that the Windows public build is out, I will be receiving that and not getting Insider builds on this anymore. It’s a pretty cool system Microsoft have got. Tiffany Jessen is emailing in.
Inspired by the discussion that Bonnie and I had on a recent Bonnie bulletin where I was talking about really ramping up the communication that we have with dogs, or more specifically, trying to find ways for dogs to talk to us by some sort of electronic means.
This inspired Tiffany to share with me something that she had written a wee while ago, and it goes like this, “Next month will mark the 21st anniversary of receiving my first dog. Since then, I have often thought about how we communicate with our dogs. As handlers of well-trained service animals, we, of course, know how easily our dogs reply to verbal commands and/or hand gestures. For other services performed at much greater distances, like search and rescue or animal herding, the humans often use other tools like whistles. I recall watching a demonstration in Ireland where the dogs not only herded the sheep but knew the difference between ‘pluck two off the group’ and ‘divide the group into two groups’, all while herding each of the groups together and clearly defining the lines between them, and so on. Many thought it was impressive, particularly since the whistles were commanded from way down the field where they could not otherwise communicate easily.”
I will pause your narrative here, Tiffany, to say I have seen this myself because two of my three sisters are married to brothers who live in a little place called Whangamōmona in Taranaki here in New Zealand. When I was a boy I spent many a happy time helping out with the mustering where they use the sheepdogs and give them all those commands and all the magic whistles and things and I loved doing that. There is a sport here in New Zealand, probably other rural countries do this as well, sheepdog trialing. You actually take your dog, and you enter the dog in the sheepdog trials, and they put these on TV here. [chuckles] You can watch the sheepdog trials. My brothers-in-law have been in many of these.
These days, they are judges as well, because they’ve been doing this for so long. Tiffany continues, “The big limitation in communicating with our animals is communicating in the opposite direction. With my previous dog, I could ask her if she needed to park,” that means go to the toilet for those who are unfamiliar with seeing eye lingo, “and it was very clear through her body language when sometimes she would immediately leap up and streak to the door, followed by repetitive hops up and down, up and down, while ringing the bell hanging from the doorknob. She definitely needed to go out.
Though I was very successful in teaching her other skills on demand, or even asking if she needed to go out by my ringing the bell and using no other words, I was never able to teach her to initiate asking to go out by ringing the bell herself. My current dog, however, is a very different type of thinker. She may not be able to talk, but she’s very clear about making her thoughts known. Unfortunately, to her, the bell on the door is not simply an indication of needing to go out, but rather wanting to go out. We can go outside and do all her business, but as soon as we come inside, she turns around and rings the bell again. She just loves to be outside and sunbathe.
Thankfully, I work on a laptop instead of a desktop, so as long as we are working from home, I’m able to humor her by working from my backyard. Anyway, back to the point. As someone who is interested in technology while studying special education, I often thought about the possibility of using switches and other alternative communication devices with dogs. Never getting beyond initiating park with the bell, clearly my prior dog would not have done well with the idea. I have often thought different dogs may be more receptive, if I had the foggiest clue of how to do it. Recently, I listened to a segment of a radio show which discussed a Californian university which is studying just that thing.
I’m not sure if they all are using the same device. The person only mentioned a floor mat which the dog would step on, and it played a recording of the owner dictating a word. These dogs are not just correlating simple concepts like requests for treats, but a lot more. Some of them even identify concepts like now versus later, or physical pain in their foot where a thorn was found. One dog knows 68 different words. The person described some of the dogs stringing together multiple pads in a row, like a fragmented sentence. It was fascinating.” Wow, I had no idea about this, Tiffany. I just saw it and thought this up myself, but there you go.
She continues, “I haven’t looked up the study yet. I think they said it was being done in San Diego. Other than describing the videos on YouTube, the radio show did mention another site where a lot is being done to compile and discuss the topic. I haven’t gotten deep into it yet, but I thought I would share my fascination.” I haven’t checked this out myself yet, but the URL that Tiffany has included is www.theycantalk.org. That is www.theycantalk.org. Isn’t this a fascinating thing?
Voice over: Jonathan Mosen. Mosen at Large Podcast.
Jonathan: Responding to a comment that I think was made back in Episode 172, if I’m remembering rightly, it’s Aaron Linson. He says, “I might be wrong on the statement I’m about to make, however, here it goes. One of your listeners stated that we as blind people don’t have many advantages in life, then started to talk about radio voices. I believe that we have as many opportunities as we get ourselves into and network with people in what fields we want to go in. I believe that you have to set goals for yourself to achieve. You are as goal-worthy and can achieve whatever you want, regardless of your disability. I look at Stephen Hawking, for instance. The man couldn’t walk and had to use a speech synthesizer to speak. In comparison to him, blindness isn’t an issue; for your entire body to be shut down during your lifetime seems terrifying to me. Blindness, on the other hand, isn’t that bad when compared. I don’t want younger blind people thinking that they can’t do and be what they want. You are the only one in charge of your future. You make the goals and achievements for yourself. Go after what you want because only you can see where you want to go.”
Thank you, Aaron. That comment that you refer to did give me pause for thought as well. It’s not the way that I personally feel about my own blindness or choose to view the world, but what I have learned over the years is that blindness is a very different thing to different people. That starts right from the beginning if you’ve been blind since birth. I remember seeing some kids who were just mollycoddled by their parents. They weren’t treated just as any other kid who happens not to be able to see. They were wrapped in cotton wool.
What can also happen is that some parents make their blind kids feel that every achievement, no matter how small, is somehow miraculous because of their blindness. I know that it’s good to be proud of your children and to applaud what they do and support them. If you associate that with blindness and say, “Because you’re doing well at school or whatever and you’re blind, you are just a miracle child,” that’s not the signal that we want to send either. I have seen so many occasions where parental influence had such a significant bearing on the way that blind kids have turned out.
Now, for people who go blind later in life, there can be all sorts of things at play there in terms of how do you typically respond to adversity, how do you deal with change, how do you deal with a crisis? There are many variables that determine how you respond to blindness, how you perceive it. Broadly speaking, though, I agree with you. Sometimes it’s much harder. Sometimes we confront people who want to put barriers in our way. I do think the biggest problem that blind people face is other people’s attitudes. Sometimes we might have to apply for many, many more job interviews until someone gives us a break. On and on it goes. It is frustrating.
I was struck by the very negative tone of that comment as well, but many people would agree with it. Many people perceive that to be the reality. I don’t think it’s ever too late to decide that you’re going to take control of your destiny, that you’re going to start to think differently. When you really believe you’ve got some control over your life, it’s incredibly empowering because you’re not waiting for someone to make something happen for you. You realize you’ve got the power to do that yourself. That’s incredibly liberating. Good on you for sharing that perspective, Aaron. We’ve been talking a bit about the telephone of late.
What a great combination, the radio, which many blind people are into, and the phone, which many blind people are also into. If you want to read a really interesting book, and I may have mentioned this on the show before; I know I certainly have on the Mosen Explosion, check out a book called Exploding the Phone, which tells the history of phone phreaking in the United States, and there are so many blind people in it. In fact, on The Blind Side, my previous podcast, we interviewed Jim Fetgather, and I had no idea that he was a famous phone phreaker until I read this book, and there were so many names, not all of them still living, that I knew from the blind community in that book. It was hilarious.
There I was half a world away, also playing with the phone, although I don’t think our phone system was quite as hackable. If it was, I didn’t know about it. Corey Cook, he’s in touch. He says, “My ‘massive family member got a big phone bill’ story is not as memorable as yours. One Friday night when I was young, I spent the night with my grandmother and probably called about 30 different states here in the United States because I was playing with area codes.” You’re a naughty boy, Corey. I hope you got pinged for that. He says, “Keep up the great work on the podcast and Mushroom FM. That station sounds amazing.” Thank you, Corey. We do spend a lot of time on the sound. It’s nice to have it appreciated.
If we do have any phone phreakers lurking about who want to share some stories about this, 864-60-MOSEN. You can phone me at 864-606-6736. You can drop me an email, jonathan@mushroomfm.com, with an audio attachment, or you can just write it down. We had so much fun with the phones back in the day.
Charlie: Hi, Jonathan, I hope you’re good. Last night I was listening to your podcast and I actually said, “Why don’t I let Jonathan hear what we had here in South Africa, or what we still have here in South Africa?” Our number here in South Africa is 1026. That is a number that you could call in order to hear the time. We used to do it when we were young lads at boarding school. Today people don’t use it anymore because they have iPhones and Androids with TalkBack and VoiceOver that can deliver the time for them promptly, with no need for them to go calling a speaking clock operator or anything. Now, our operator was 1023. That’s where you could actually get the directory for all of the numbers. Later you’re going to hear what that also sounds like. For now, let me let you hear what the speaking clock sounds like. Call 1026.
Automated voice: Calling 1026. When you hear the signal, it will be 11 hours, 34 minutes, and 40 seconds. [beep] [foreign language]
Jonathan: Hi, Jonathan, says Emo Johann from the Philippines here. I was wondering if you found a way to switch the sounds from Windows 11 back to the Windows 10 sounds. I must admit, Johann, with everything else that has been going on I haven’t spent too much time on this, but I did get a really great email from Curtis Chong who’s developed a cool sound scheme for Windows that he just takes with him because there was a time when this used to be the big thing. You could download Windows sound packs and just insert them and switch to them. Windows sound schemes were a big thing for a while.
I imagine that if I Google enough, or should I say Bing enough, because we are talking Microsoft, I would probably find a zip file with all the Windows 10 default sounds and I could plunk them in or create a new scheme. I’ve become used to them, though. It was a bit jarring for me for a while, and I thought I’m not hearing these as well, but over time I’ve become used to it. Perhaps I’ve just set my Windows volume a bit higher, or just as I say, become used to it. I’m not as bothered by them as I was when I first installed Windows 11. They really seemed almost too unobtrusive.
You get used to things over time, don’t you? Something that you thought might be a big deal, you shrug your shoulders and you live with it. There is probably somewhere, if you do a good internet search, you’ll probably be able to download the sounds or you could just copy them across from a Windows 10 machine somewhere and keep them safe before you do the upgrade.
This email comes from Jean Menzies who says, “Hi, Jonathan, this is my first time contributing.” Welcome to you Jean. “Like others, I want to say how much I enjoy and value the Mosen at Large podcast. The work and passion you put into each and every episode is amazing.” Thank you. “I love the variety of topics, the expertise shared, the viewpoints expressed, and well done to you and everyone who contributes. On episode 172, Louis was asking for handsfree playback solutions for working with files for learning piano music. I recommend Express Scribe transcription software from NCH software with a USB foot pedal controller.
I use this all the time when I need to type out recorded material, either for print or Braille transcription, and I’ve also used it for learning piano music. There is both a free and a paid version of the Express Scribe software. The free version will support most common file types, including MP3. While the software may seem like overkill, the transcription features can be ignored and it can simply be used for playback. I know that the Windows version is accessible with JAWS and NVDA, but I can’t vouch for the accessibility on the Mac. Using a USB foot pedal, the Auto Rewind Backstep increments can be set for however much automatic rewind is wanted for review.
Note that the AltoEdge foot pedal is the only USB pedal that works with the free version of the software. About the foot pedal: these high-quality foot pedals connect to your Windows PC or macOS computer to control dictation player software such as Express Scribe transcription software. The pedals are plug-and-play, which makes them easy to install and use. There are three controls, which are typically used for the rewind, play/pause, and fast forward functions. If you want to find out more about this you can go to nchsoftware.com. That’s nchsoftware.com. NCH Software have a very good reputation for accessibility, at least on Windows.
I’ve used several of their packages over the years, including a Voice over IP solution and their Switch sound file utility, which is a mainstay for many people who dabble in audio for converting from one file format to another. If there’s one criticism I have of them, it’s that they can be a bit aggressive about invalidating your existing version whenever they do an upgrade. Their software is accessible and their support is very good. nchsoftware.com, and you can look there for Express Scribe.”
Thank you very much, Jean. That might be the trick that Louis is looking for. Bev Power writes in and says, “Once again, I raise the issue of accessibility of blind diabetics using lifesaving blood testing equipment. I had messaged you previously regarding the above issue and you introduced on an earlier podcast a man by the name of Steven. I listened closely to this monologue and got the impression it was an infomercial on behalf of the FreeStyle Libre testing system. He mentioned he used his iPhone to access his test results. That is all well and good, but not everyone has an iPhone, particularly seniors. What happens if the user is beyond Wi-Fi range, or cannot afford data, or maybe the user cannot afford to upgrade to the latest version? I am sure you’ve heard that diabetes is the world’s leading cause of blindness and has been for many years. Where were the developers’/engineers’ heads when creating blood sugar and insulin delivery systems? Maybe they were blind to reality. As you’ve already proven many times advocacy is the best method of creating change and the more the better. The manufacturers must be pressured into making changes so we can live an independent life.”
Thank you so much for your email, Bev. I certainly hear the frustration in it, and it must be frustrating to feel like you could benefit so much from some technology that just hasn’t been made accessible. I do think though that when we are deploying our advocacy guns on an issue it’s important that we don’t inadvertently engage in friendly fire, which I think is what’s happening here. I’ll come back to the really important points that you’re making.
Steve Bower is a listener to the show and therefore a member of what I think of as the Mosen At Large community, and I thank him for taking the time to put the demonstration together. Because you’re right: while not everybody has an iPhone, many people do, and if they don’t have one, they can often get access to one. The information that Steve passed on was helpful. Rather than being an infomercial, he actually did point out a couple of areas where the product might be improved.
The last thing we should be doing when we have a problem like this is turning against each other, particularly somebody who was just trying to be helpful and provide a tool that some people would benefit from. If somebody finds a product that works well for them, or perhaps that they see some problems with, and they want to take the time to record a good quality review like Steve did, I would hate for a message like yours to put somebody else off doing that. As I say, I read the email because the fundamental point is absolutely right. I understand why frustration is so high. I believe it’ll be next week, we will be talking to Clark Rachfal from the American Council of the Blind.
One of the things I’ll talk to Clark about is legislation which has been introduced to Congress that would require these devices to be accessible. I think you may be in Canada. My apologies if I’ve just relocated you, Bev. Even if that is the case, while the US Congress obviously doesn’t directly affect you, you can be sure that if there is legislation in the United States requiring products to be accessible if they are to be available in that market, it’ll surely trickle down to other markets, and most certainly to Canada, where North America is often perceived as a single market. The current situation is not acceptable, and I hope that there will be some progress on this in very short order.
Speaking of the conversation with Clark that is coming up, as I say, probably next week. I want to foreshadow one of the things that we will be talking about because I really am interested in your experiences if you have any relevant ones to share. One of the things that I advocated on strenuously here in New Zealand in the 1990s was clarification around disabled people serving on juries. There is legislation that we’ll be talking with Clark about that has just been introduced to Congress that also seeks to clarify in the United States the status of disabled people serving on juries. It’s an important topic. When we get to talking about it I’ll tell you why I think it’s so important.
No matter where you are, if you have served on a jury, I would like to hear your stories. Obviously, we have to be careful not to identify cases. We don’t want to go there. What I’m really interested in is how your fellow jury members reacted to you. Did you have any difficulty getting your assistive technology accepted? Were people concerned about what the assistive technology might be able to do? Did you turn up willing to do jury duty, but were challenged and do you think that may have been because of your impairment?
Please do share your experiences because we will feature your experiences before we have a talk about this legislation that has been introduced to the US Congress. You can get in touch by email with an audio attachment if you want. If you want to use your smartphone or something like that to recount your experiences in person, it’s good to hear these recollections in your own voice. If you want to do that, firstname.lastname@example.org. You can also write your experiences down or phone them in, and just bear in mind that there is a five-minute cutoff point on our voicemail system. 864-60-MOSEN is the number. Let’s talk jury service. 864-606-6736.
Voice over: Be the first to know what’s coming in the next episode of Mosen At Large. Opt into the Mosen media list and receive a brief email on what’s coming so you can get your contribution in ahead of the show. You can stop receiving emails anytime. To join, send a blank email to email@example.com. That’s firstname.lastname@example.org. Stay in the know with Mosen At Large.
Jonathan: Here’s Kushel from the Gold Coast of Australia. I hope I haven’t mispronounced your name too badly there. In response to the question about the BrailleSense 6, there is an update in beta testing at the moment, and it’s going to have some pretty exciting features like OverDrive, and Dropbox integration. Not sure when the update will be pushed out, but definitely rest assured that there is an update in the works.
Scott: Hi, Jonathan, it’s Scott from Sydney, Australia here. I wanted to pass on a very serious bug in the latest WhatsApp Beta that I actually have been testing. I actually test WhatsApp, and this is a nasty bug, which I hope doesn’t make the App Store WhatsApp release. The bug is, basically, when you are listening to a message in WhatsApp and you lock your phone, the message will stop playing. When you unlock your device, the app crashes and you are presented with the usual app crash dialogue.
The frustration that I have as we’ve discussed previously in the podcast is when you actually send your message and appropriate information to support, they either tell you to go and update to the latest version, which is why you reported the bug in the first place, or they ask you for a video. You can’t send them a video because you can only have WhatsApp on a single device, which I’ve pointed out to them. I also pointed out to them that last time I sent them a video per their request, my video was not forwarded to the appropriate team for investigation. I just wasted my time and it looks like I’m going around in the same endless circle again.
I hope somebody listening to the podcast may be able to point Jonathan in the direction of the engineers for WhatsApp, so we can try and get some dialogue going with WhatsApp, because support don’t seem to know what a beta is or how to file bugs with the appropriate team. There’s no point having a WhatsApp beta if, whenever you contact support, you get the same responses over and over again and your bugs go nowhere. Having a beta is pointless, in my opinion. I hope this message helps others with WhatsApp. I surely hope that this bug does not make it to the final release in the App Store, because if it does, it’ll be a terrible experience for all who use WhatsApp.
Jonathan: This is a common experience, Scott. We find that companies are willing to be quite inclusive about who gets into these processes now, but where we get stuck is what they do with the feedback. You go to a lot of trouble. You report good quality feedback. You produce step-by-step instructions just like they tell you to and then you feel like you’re just not getting anywhere with it. You’re not making any progress. One suggestion I do have and you may have tried this already, WhatsApp is now a Facebook product and Facebook does have an accessibility team.
You might like to hit them up on Twitter or Facebook itself and let them know about this issue and the degree to which blind people use WhatsApp for a lot of voice-related things and how this will be a bit of a show stopper if this gets all the way through to a production release. It might be that the Facebook accessibility people, if you can engage with them and they will engage with you, just know the right hoops to go through to try and get this bug addressed. We’re saying hi to Tim, who says, “Hi, Jonathan. Several live streaming systems that I use have moved from streaming MP3 to streaming on YouTube. This has created two problems.
One, I can no longer use Tapin Radio to record the streams so I need a similar accessible program to schedule recordings of the YouTube streams. Two, I will need a way of sending the recording and live YouTube streams from my PC to Wi-Fi speakers via AirPlay. Do you or any of your listeners have suggestions for these two new needs?” Thanks for writing. Tim, let’s see if we can take this apart a little bit. In every case that I can think of when I’ve seen a live stream go on YouTube, it’s always archived. It gets saved to YouTube, so you can watch it on-demand later. In that sense, it’s a bit different from many audio streams where on-demand archiving isn’t so common.
If that’s the situation, and you can get the streams afterwards, it may not be as important to find a way to archive the live YouTube streams because there are numerous ways of getting YouTube clips into MP3. Some of them have slowed down considerably of late and I guess there must have been some sort of API change or something that is trying to disincentivize people from doing this because technically, it may not be in compliance with YouTube’s terms of service. Let’s face it, if you’re a blind person and you want to be able to just take this stuff and put it on a more blindness-friendly device, then I can understand why people want to do that. Now Castro does this, by the way.
You can go to YouTube and you can find the stream that you’re interested in. This works obviously with on-demand streams, and you choose ‘share’, and you can choose sideload to Castro. Castro, for those who aren’t long-term listeners to the podcast, is my podcast app of choice. I think it’s a wonderful podcast app. It has so many features and playing from YouTube is one of those. It remembers your place, you can speed it up and slow it down, you can add some dynamic compression. It’s all in the podcast app that you might be using anyway. That’s one way to get around that. Of course, you’ve got AirPlay built right in then.
There are many websites and standalone utilities that let you copy a URL to the clipboard in Windows or Mac and then paste that URL into the utility or website and it will download. Now, as I say, some of those are taking a bit longer than they used to. I don’t know whether anybody has a recommendation for one that’s really fast. There was one that was very popular in the blind community for a long time, called either Pontes or Pontes Media Downloader, spelled P-O-N-T-E-S Media Downloader. Towards the end of last year, that thing really started to slow down. I don’t know whether people have found some sort of YouTube on-demand to MP3 process that continues to be as fast as that one used to be.
If by chance, there are some live streams that are not archived in this way, you can use VLC media player to record YouTube live streams. You can Google on how to do this. The downside is that that’s not going to completely emulate the Tapin Radio functionality that you talk about. For those who aren’t familiar with Tapin Radio, it is a Windows-based utility that allows you to listen to internet radio, and you can record it and stop recording at predetermined times. It’ll do the whole thing in the background as well. You don’t have to hear it playing in the background to get things recorded. It’s a very handy utility, the old Tapin Radio.
VLC media player is not quite equivalent, as far as I’m aware, because I don’t think it has a scheduling facility in it. Another option might be to check out a company called Applian Technologies, A-P-P-L-I-A-N Technologies. Many years ago now, I used a utility of theirs called Replay AV. It would record all sorts of things, very similar to Tapin Radio in some ways, because it would run in the background, you could run it on a schedule, and it would do audio and video. I haven’t checked them out for many years, but I did just quickly confirm that they are still around. You might want to check Applian Technologies to see if there’s some sort of tool that will do the job for you.
If anyone has some specific recommendations, because, of course, we’ve got two hurdles to deal with. First is finding a tool that does this in the first place. Second is finding a tool that does this in the first place that’s accessible. If anyone can help Tim short circuit this process, that would be good. I think ideally what we would be looking for is a way to take the YouTube stream and convert it into MP3 as we go or some sort of audio format and then he’ll be up and running in the way that he wants. Let’s open it up to the wise Mosen At Large community. 864-60-MOSEN is my number. In the United States if you have some recommendations, 864-606-6736. Email something written or with an audio attachment to email@example.com.
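An editorial aside for listeners who like to tinker: the Tapin Radio-style timed recording Tim is after can be approximated with Python’s standard library. This is only a sketch under assumptions: the start and stop callables here are placeholders standing in for launching and terminating a real recorder process (for example VLC or a downloader), which isn’t shown.

```python
import sched
import time

def schedule_recording(start_in, duration, start_fn, stop_fn):
    """Fire start_fn after start_in seconds, then stop_fn once a
    further `duration` seconds have passed. Blocks until both run."""
    timer = sched.scheduler(time.time, time.sleep)
    timer.enter(start_in, 1, start_fn)
    timer.enter(start_in + duration, 1, stop_fn)
    timer.run()

if __name__ == "__main__":
    # Placeholder callables: a real version might wrap
    # subprocess.Popen([...recorder command...]) and process.terminate().
    events = []
    schedule_recording(0.1, 0.1,
                       lambda: events.append("start"),
                       lambda: events.append("stop"))
    print(events)  # the two events fire in order
```

The same shape works for a “record from 9:00 to 10:00” schedule by computing `start_in` from the current clock time.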
Voice over: Mosen At Large Podcast.
Jonathan: Dean Charlton is writing in and says, “On two separate occasions now you mentioned that you bought the DVD set of Sons and Daughters.” I did and I think those two separate occasions may have been on the Mosen Explosion on Mushroom FM. For those who listen to Mosen At Large, but don’t listen to the Mosen Explosion on Mushroom FM, I hope you will. A bit of background, Sons and Daughters was a soap opera that was on TV in the 1980s and it came from Australia. We used to get it here. I think it was on at [6:00] or [6:30] at night on TV 2. I used to watch it and I used to record it as well for some of the kids at the school for the blind who were otherwise engaged at that time and wanted the series recorded.
I watched this for some years. I think at some point, I just lost interest in it, it got a bit silly or life took over or whatever, and I was thinking about this last year. I thought, “I wonder if it’s been officially released on DVD or not.” I did a bit of a Google and I found that my timing was perfect, because they had started to release it progressively, a couple of seasons a year. I think there were seven seasons in total. I started getting Sons and Daughters on DVD. I believe not only was it screened in Australia and New Zealand, but it also had a bit of a following in the UK, and I think it may have been rerun a few times. If you’ve seen the series, man, there are just so many unpleasant people in it.
There’s just so much manipulation of people and subterfuge. Whoever wrote it must have had an incredible cynical view of the world. I’m hooked though. They’ve got four seasons out so far. I ordered Season 4 quite early on before it was available and I got this email from the company doing these DVD releases. They said, “Since you’ve ordered so early, we are upgrading you at no additional charge so that you’ll get Season 4 signed by one of the key actresses in this thing.” I thought, “That’s really nice. Doesn’t really bother me either way but that’s nice. I suppose it could be worth something.” I have now finished Season 4 of Sons and Daughters. It’s got really weird at this point.
I eagerly await Season 5, although it’s not quite as easy to get through Season 4, as it was the first three because by the time Season 4 has come to an end, they’ve really lost the plot a little bit. They really have, and a lot of the main characters have moved on. Anyway, I have got these. Dean said, “What did you do to get the audio description added to the DVDs and could it be possible for me to get them added to my favorite TV series?” No, they don’t come audio described because they were done in the ’80s, long before audio description was common, certainly in this part of the world. For the most part, I find that the older the TV show, the less audio description is necessary.
That’s a bit of a broad statement and there are some exceptions to every rule. I think TV has evolved as a medium, and movies have evolved as well. Although there are occasions with Sons and Daughters where I think, “What’s gone on there?” There’s some dramatic music or something, but you eventually get the context. I guess if I wanted to, I could sit there with an Aira agent and have them watch it with me and describe it. In the old days of the Aira unlimited plans, maybe that might have been possible. You could have sat there with an agent and had them watch it. That’s the only way that I can think of, Dean. Otherwise, I think a lot of those old series, most of them, are pretty understandable.
In terms of extracting the audio, though, it is timely for me to mention my favorite app for this purpose, called DVD Audio Extractor. Actually, I learned from this show that DVD Audio Extractor is still a thing, because I had an old website address for them that no longer worked, and I thought, “Well, the program’s gone bye-bye.” The one I had was still working for me, but it is right up to date. The developer keeps developing it; dvdae.com is the website.
It’s an ideal piece of software for a blind person, because you can take DVDs and Blu-rays and extract the audio from them to a wide range of formats, including MP3, Ogg Vorbis, and I think FLAC. You can choose the language track that you want to work with. If you do have movies that are audio described, it’s a really easy way to extract the audio. Obviously, if you’re a blind person and you only want the audio, that really frees you up. It frees a lot of space, for one thing, because you’re not taking valuable space with video that you’re not interested in, and it’s a much more portable format in a whole bunch of respects. You can put it on your Victor Reader Stream or other blindness-specific audio player if you have one, for instance. DVD Audio Extractor is a really cool piece of software.
You can also just use it as a player if you want. If you put the DVD or the Blu-ray disc in the optical drive and you bring up DVD Audio Extractor, you can see all the tracks there, the chapters, and you can play the ones that you want. It does serve that useful purpose as well, because sometimes there can be accessibility challenges playing those sorts of discs. Now, the trick is, you do have to have an optical drive, and a lot of even desktop computers don’t bother now. I always thought there may come a time in the future when I do get DVDs or CDs or something and want to extract audio.
When Henry, the wonder son-in-law, and I built this computer, I specifically made sure that we had a really good quality optical drive. It goes like a rocket. I can extract data from discs pretty quickly. In fact, I think I did the whole of Season 4 of Sons and Daughters in an afternoon. There were 24 DVDs with seven episodes per disc. I was able to get that done in an afternoon, which is pretty good going.
Jonathan: If you download books from the Library of Congress in the United States, or you may choose to do this from Bookshare or your local library for the blind, you may be familiar with the BRF format. It’s ubiquitous, and it’s relatively simple in its structure. It’s basically a text file with translated Braille data inside. Is it fit for purpose in 2022? APH says, “No, it isn’t.” There’s more that could be done with Braille formatting, and they are tackling this, along with the rest of the industry. To talk about this and some other things that are going on at APH, I’m joined by Greg Stilson and William Freeman. Welcome to you both.
Greg Stilson: Hey, thanks so much, Jonathan, for having us.
William Freeman: Yes. Thank you so much.
Jonathan: Greg, can I start with you? Tell me a little bit about what you’re doing at APH overall and perhaps segue us into this discussion about the eBRF file format, and how that came to be. Why is this necessary?
Greg: I run the Global Technology Innovation team at the American Printing House for the Blind. On our team we have a combination of technical product managers, quality assurance analysts, and software engineers. I work really closely with other product managers, such as William, to basically build the technology of tomorrow. Keep in mind, when it comes to technology, we don’t really build everything internally at APH; we do some projects and we do a lot of software work. When it comes to hardware, I will say one of the things that we really pride ourselves on is creating global partnerships around the world.
Products that you’ve seen, and two of which William is managing, the Mantis and Chameleon, are really good examples of a great partnership that we have with HumanWare, where we work with HumanWare to build the hardware and some of the software. We provide the specifications and things like that, and these products then are born.
One of the initiatives that my team took on in 2020 is this concept of creating what many people regard as the “Holy Braille”: this idea of having a tablet-type device that’s capable of showing multiple lines of Braille and tactile graphics on the same surface.
What we did in 2020 was put out a request for information, to basically put a call out to all mainstream and assistive technology companies to say, “Look, show us behind the curtain, everything’s under NDA, and let us know what really amazing technologies you are working on that could possibly be capable of doing this.” We had a ton of responses. We met with so many organizations and, keep in mind, this was during the height of the pandemic. [chuckles] We were doing things with Zoom that I don’t think we ever anticipated doing.
We put out this RFI and we basically entered into an agreement where we partnered with HumanWare, and their technology partner Dot Incorporated, to work on this endeavor to create what we’re calling the dynamic tactile device. For HumanWare and APH, it’s a very different partnership than we usually work with. We’re both equal partners, both financially and in our stakes in this project, with the goal of creating a device that is capable of delivering multiple lines of Braille and tactile graphics on one tactile surface.
The end goal here, APH being the largest producer of physical Braille textbooks in the United States, is to create a method where we can bring Braille textbooks, and Braille books in general, to a digital form where we don’t lose all of the formatting, and basically replicate that physical textbook situation on a single device. That’s really where our work with the eBRF came into play, because as we were looking at solving this problem, what we learned was that the BRF, Braille-ready format, that exists today just isn’t equipped to handle a digital experience like that.
You’ve got to remember that today, when we talk about what this device is going to be able to do, imagine a situation where your book is being produced. You get the book from the publisher, and a Braille transcriber is now taking that book and making sure that any STEM content and any graphics are put together, and all the alignment is great. You get that book, you’ll get a notification or an email that says, “Hey, your book’s ready, go ahead and download it.” You’ll be able to download this book directly to the device. You’re not going to have to wait for it to be shipped, packaged, bound, any of that kind of stuff that takes months.
We looked at the cost of one algebra 2 book from 2020 at the time, and it cost around $30,000 and took 13 months to produce that entire algebra 2 book. By the time the kid was receiving the different volumes, the class was most likely already onto something else. Outside of reducing the cost, one of our biggest goals is what we’ve called “time to fingertips”: really reducing the time it takes for the student to be able to get access to these books, and really this Braille content. Like I said, that’s what really produced this need for what we’re calling the eBRF, or Electronic Braille Ready Format.
I’m a blind person myself, but I am by no means a Braille rules expert. I immediately said I’ve got to work with somebody who knows this stuff like the back of his hand. That’s where I brought in William, who is really the author of the eBRF specification that we are now circulating amongst partners all across the world.
Jonathan: Let me ask you about Dot Incorporated, and clarify that partnership. I am reading about a device called the Dot Pad, is that made by them or is that a rival product?
Greg: No, that is a great question. The Dot Pad, I think they call it the 320, is a device that they produce themselves, and it’s really designed to work with Apple devices. I want to say it was in iOS 15.2 or something like that that Apple produced an API that allows developers to connect tactile displays to an iPhone and produce very limited sets of graphics, but it gives developers a lot of tools to be able to do some pretty cool things with these tactile displays.
The way that Dot is getting developers interested in this is by creating a developer kit device. That’s what the Dot Pad 320 is: a tool that developers can use to create what they’re calling tactile experiences. But in the relationship that we have with Dot, they are a Braille cell, or Braille technology, manufacturer. They provide the Braille cells for the tactile array that is going to be used in this device. This other device is separate; it’s something that they’re doing with the development community aside from that. They’re basically providing the Braille cell technology in the same fashion that many AT companies rely on similar Braille cell manufacturers today to provide the cells in their devices.
Jonathan: William, why would we just not use DAISY for this? Doesn’t DAISY do what you’re aiming to do with the eBRF format?
William: Yes. DAISY is a good place to start, but DAISY is a print file with markup, and it is missing a lot of information that we might care about in Braille, especially in education. Probably the main thing that’s going to be missing is the integration of tactile graphics. With the eBRF, you’re getting a bundled, zipped folder, and inside it you’re getting all your volumes, and everything’s made with Braille in mind. It’s all thinking about Braille. It’s thinking about the rules of Braille and the different considerations that are unique to Braille, like Braille code, especially when you have multiple Braille codes within the same file or foreign language codes, and then the tactile graphics.
I think that’s the main benefit you’re getting: the rules of Braille and the formatting of Braille. I’m a Braille transcriber. That’s my background. I’ve been a Braille transcriber for 10 years. People don’t think about the importance of Braille formatting until it’s gone. It’s like if you got a magazine in the print world and it was all just laid out as a single column; it would look very boring and uninteresting and be hard to interpret. The same is true of Braille. Braille without formatting is harder to read, and not as interesting, and not as well done.
Greg: Think about it today: we have access to things like DAISY text files. When you get a DAISY text file in Braille, there’s no formatting at that point; it’s essentially a wall of text that is marked up for some navigation. You have the ability to jump chapter to chapter, heading to heading. Imagine trying to do spatial math with a DAISY text file, or do a matrix, or read a symbol.
William: Or take a test.
Greg: Yes. Those types of things are, unfortunately, all lost. The value with eBRF is that you get the power of DAISY or EPUB, the navigation, the linking of graphics, but you’re also keeping all of the rules of Braille in mind. Those rules, I think, are really what define this whole eBRF experience.
Jonathan: It really does sound like newer devices that are going to be multi-line and then incorporate tactile graphics are the catalyst for this. If I’m using my vanilla run-of-the-mill Braille display with a single line, will I benefit from this format in any way?
William: Yes, you’ll get the enhanced navigation. You’ll be able to navigate by page numbers, by headings, and by links. Part of the standard is you include the tactile graphics. You also include alt text and additional descriptions. When you get the file, even if your device doesn’t support graphics, whether it’s a single line display or an embosser, it would be able to display that alt text and you’d still at least get something there rather than a blank space, or the device completely failing to try to replicate what that image is supposed to look like.
Greg: Correct me if I’m wrong, William. You’ll also get a benefit if the transcribers have indented certain lines for easy navigation. Imagine you’re using your space-dot command to navigate line by line by line, going down to look for, say, another section. If the transcriber has elected to create an indentation or things like that, those also would show up on your single-line display for fast navigation.
William: It would depend on the software that you were using to read the file, but yes, you could definitely maintain those. The other cool thing is, I don’t know if you’ve noticed this, but when you’re reading a BRF on a Braille display that’s a different size than the page size of that original BRF, you end up with these awkwardly short lines. In a BRF, everything’s hardcoded. If you get one word left on that line, that’s what you end up getting on your single-line Braille display, even if it’s the middle of a sentence. You just have that one word, and then you have to go to the next line, and now you’re back to having a full line.
With the eBRF, everything’s formatted, and so you’re going to largely do away with those awkwardly short lines that you have to deal with so often when using a BRF. The cool thing about the eBRF is the possibilities it offers the software developers of the tech that folks in our field use. A single-line Braille display, for example, can’t really display a spatial table, but because you have all the information about that table, you know where all the cells are, you could still have some kind of advanced navigation allowing the user to move within the table the same way they might using their screen reader on a PC.
Being able to move up and down and left and right within the table is something you could replicate to some extent on a single-line Braille display using a formatted document.
Jonathan: Is back translation important to you? If, for example, somebody needs this textbook and it’s only available in eBRF, and they want to be able to take it into, say, Word or something like that, are you giving thought to how this translates in that environment?
William: Yes, we talked a little bit before the show started about how you don’t really mess with editing BRFs. That is the correct stance: don’t edit BRFs. Once you start trying to edit a BRF, you’ve made a mistake, because you’ll throw off all the formatting, everything will get messed up, and it gets messy very quickly. With the eBRF, there’s going to be enough information in the file. Because you’ve got the Braille, because you’ve got the markup for the styles and the inline markup, like the emphasis and things like that, you’ll be able to back translate that file and maintain the formatting. You can go back; you can get it back to print.
There’s enough information that if you wanted to edit an eBRF, you’ve got enough information there to take it back to print and start editing or you could even edit in Braille and add text and it doesn’t mess up the entire rest of the file. You’re just editing within that one style or changing the style or whatever it is, and then everything reflows from there.
Jonathan: There are real challenges with adoption of a new format, aren’t there? I’m thinking of the old ubiquitous MP3; even this podcast, after all these years, is an MP3, because despite the fact that MP4 has been around a long time and is superior in every respect, you’ve got maximum backward compatibility. You’ve obviously thought a lot about the buy-in of the whole industry, making sure that every Braille device possible is going to be able to understand this format.
William: Yes, we had an eBRF summit at CSUN, and it was very successful, and we were really just grateful for all the different organizations that showed up. It’s huge. This affects everyone. It affects Braille readers, it affects Braille manufacturers, Braille display and embosser manufacturers, it affects the libraries. It affects printing houses, it affects Braille transcription programs, and Braille transcribers. Every single step of Braille and Braille technology is affected by this. There are so many different needs that you have to think about.
The primary need is the Braille reader. If we don’t satisfy the Braille reader then why did we bother to do this at all?
There’s a lot of different folks. Greg and I were going back and forth trying to think for older hardware and software, how do we get back to a BRF? Well, we could write a program and then you could put the eBRF in, and then you could spit out a BRF. Then that way, folks that have older hardware or software aren’t left behind, and they can continue to use Braille. Then we started getting feedback from the field and somebody suggested, “Why don’t you just bundle a BRF inside the eBRF?” It’s so obvious. Once it was said, it was like, “Oh, wow, of course, why didn’t we think of that?”
While you’re there, while you’re creating your eBRF, go ahead and also create a BRF, and then you’re covered and it’s much easier to have backwards compatibility. You also have that BRF as almost a master file that you can compare against, and make sure you’re maintaining everything properly.
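The "bundle a BRF inside the eBRF" idea might look something like this sketch in Python. Again, the zip layout, the file names, and the `manifest.xml` entry are assumptions for illustration, since the specification is still a draft:

```python
import zipfile

# Hypothetical backward-compatibility path: newer software reads the
# rich markup, while older hardware/software just pulls the bundled
# BRF out of the zipped eBRF package. File names are illustrative.
def read_for_device(path, supports_ebrf):
    with zipfile.ZipFile(path) as z:
        names = z.namelist()
        if supports_ebrf and "manifest.xml" in names:
            return z.read("manifest.xml").decode()  # full eBRF experience
        # Legacy fallback: the first bundled .brf in the package.
        brf = next(n for n in names if n.endswith(".brf"))
        return z.read(brf).decode()

# Build a tiny demo package so the function has something to open.
with zipfile.ZipFile("demo.ebrf", "w") as z:
    z.writestr("manifest.xml", "<manifest/>")
    z.writestr("legacy/book.brf", ",hello ,brl")

print(read_for_device("demo.ebrf", supports_ebrf=False))
# → ,hello ,brl
```

Because the fallback is carried inside the same file, no separate conversion tool is needed, which is exactly the appeal of the suggestion from the field.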
Greg: Mass adoption was really at the forefront when we started this project. We looked at the primary benefit as keeping the integrity of Braille while giving advanced navigation. You’re right, Jonathan, that these multi-line displays are going to be the biggest beneficiaries of this format. Having said that, it doesn’t matter if nobody uses it. The first thing that we did was start really cultivating these partnerships with organizations around the world, the CNIBs, RNIBs, and Vision Australias, all these organizations that we reached out to and said, “Hey, we’re thinking of doing this, would you like to be involved?”
We received almost the same response from all of them, which was, “Hey, we started down this road as well, but then it got really big really fast.” We realized as time progressed that everybody was going to start going in their own direction and replicating each other’s work. That’s really where William and I just said, “All right, we’re going to start this and offer up the opportunity to really partner, and say we want to make this a community-wide effort.” This is something that APH by no means wants to own. We don’t want to own the eBRF format specification. In fact, I deliberately don’t want to own it.
This is something we want the community to own. We want it to be held by a standards organization and maintained by a standards organization in the community. This was a situation where we started with those conversations with BANA, with National Library Service, with ICEB, and said, “Listen, William and I are not specifications writers. I’m a Braille reader. He’s a Braille transcriber. Ultimately, we want to start with the use cases that this is going to benefit. Ultimately, we need support and help in writing specifications.”
Once we got the feedback from the 20 or 30 partners that we’d started communicating with, who said, “We want to do this; this is desperately needed,” we said, “Okay, we’re on to something. This is something the field is really going to need.”
Jonathan: What’s the status of the spec right now, then? Is the spec complete, or is it still under development?
William: It’s still under development, we’ve called it the 2.0 draft, it’s probably the sixth draft that we’ve done of the standard. We’re about to do one more draft. Then we’re going to enter into probably the most exciting time. Because right now, we’ve been working with our partners, but it’s all been going back to APH. Folks are giving us feedback and then we’re incorporating that feedback into the draft. We’re going to do one more draft based on all the feedback we got from the summit and through email and so on. Then we’re going to partner with a standards organization.
Put the draft in a Git repository, schedule regular meetings, get a mailing list going, and really just open it up so that other folks can help us in finalizing this and incorporating everybody’s feedback. As a part of that too, we’ll be making the first examples of an eBRF. We’ve got some really rough examples that I’ve made but I haven’t really shown them to anyone because they’re not worth looking at just yet. We’ll have good proper examples made by folks that specialize in making new file standards like this.
Jonathan: When do you think it will be that listeners will be opening their first eBRF files on their device?
William: It’s really hard to say, but I think it’ll be sooner than you might imagine. I do think it’ll be sooner than you would expect for how big of a change this will ultimately be.
Greg: Our goal is to have the first field test units of the dynamic tactile device ready by the end of 2023, and with that will come a subset of eBRF file types. What kind of wide adoption will be there at the beginning? I don’t know, but the nice thing is that at APH, this is one of those situations where we’re making the hardware, we’re making the software, and we provide the books. We’re going to at least start with our most popular textbooks and books that we provide, and convert those to eBRFs at the beginning, and hopefully it catches on.
We’ve got a great relationship with the National Library Service and with Bookshare, and such, so they’re all participating in the creation of this, so with their interest, it’s our hope that they will begin to adapt their libraries to eBRF as well.
Jonathan: People don’t buy a Braille device every day, so we are going to have to hope that manufacturers will go back and do firmware updates for a range of Braille devices. Even those of us who have single-line devices, so that we can navigate more effectively, jump to pages, all those benefits that you mentioned.
Greg: Absolutely, and we’ve already had those discussions with several of the manufacturers. We’ve spoken to Duxbury and ViewPlus, the providers of the actual transcription software, as well. That, I think, is the other piece: if the Braille transcription software doesn’t adopt the eBRF spec, then it probably doesn’t succeed. The good news is that from the conversations that we’ve had, they’re extremely interested in this. I think they see the writing on the wall as well, that this is the way the world’s going.
Jonathan: Can you tell me a bit more about the physical layout or the description of this multi-line device? I presume you’ve seen prototypes or at least you have a very clear spec in your mind. What’s the device going to be like physically?
Greg: I can give you a rough idea. We’re finalizing patent work and things like that, but what it will basically be is a tablet-style device with Braille keyboard input. Behind the Perkins-style keyboard will be a tactile array of– I’m not going to say the total number of cells yet, but it’s significantly more Braille than you’ve ever seen on a device in your entire life. It’s something that we’re really excited about. You’ll have the ability on this tactile array to, as I say, navigate multiple lines of Braille. There will be panning controls near the display. You’ll also be able to zoom in and out on tactile graphics. If you want to pan around a tactile graphic, you’ll be able to do that.
If you want to zoom into a tactile graphic, some of the things that we’ve been looking at, we’re going to for the first time have the ability to create multi-layered graphics. Graphics that have a certain level of information when you’re zoomed out, but then as you zoom in, you’re going to unlock more information. One example I always give is, imagine the map of the world and you see countries or continents at the beginning, and then you zoom in and now you see countries and then you zoom in again and you see states and things like that. It’s something that a blind person has really never had dynamic access to on a device, such as this, and being able to unlock that information at your fingertips.
Having said that, we’re going to have to create experiences for this, and so one of the bigger excitements is that we will be looking for software partners. Part of the design of this product is going to be creating an SDK, or software development kit, for software developers to create experiences on this device as well, creating apps for this device. The device will come with some basic-level efficiency tools: a book reader, we’re also looking at a graphing calculator, and hopefully a web browser. That’s one thing we’re also looking at on this device, because there’s so much that is online today, especially in education.
There are many learning management systems in use. We’re trying to create an experience where, just using this device, you’d be able to participate at some level with learning management system access.
Jonathan: Do you anticipate this being standalone or do you also anticipate that it will have some sort of terminal functionality? Because if it’s going to have that, it’s going to require Vispero in particular to have a really good hard think about the way that JAWS engages with a multiline Braille display, isn’t it?
Greg: Yes. The answer to that question is both. It’s going to have functionality for standalone use in the classroom. Primarily, the one thing I keep saying to the team is that we have to get the book reading use case correct, because the book reading use case is what allows us to gain federal funding for this. Being able to optimize Braille literacy in the classroom means creating a really effective digital textbook experience. We have to make the best textbook reading application possible, because if we don’t get that, we don’t get funding to continue on with this, to reduce the cost, to do all that kind of stuff.
The textbook use case is our number one priority, but the device will have the ability to connect to other devices for two purposes. The first purpose is what I’m calling the tactile monitor use case, and this is very similar, for those of you who followed the Graphiti that we worked on back in the 2015-ish timeframe, to being able to plug it into a visual display, a monitor, or a computer and replicate the visual display in some ways on a tactile array like this. That’s the first use case: being able to say, “Okay, can I plug it in, or Bluetooth it over to an iPad, and see the layout of my home screen?” Maybe I’m not seeing Braille labeling, but maybe I’m seeing symbols that represent icons or things like that.
One example, and this is one that we actually saw when we were working on the Graphiti: imagine a scenario where you connect it up to a microscope and you’re able to zoom in, in detail, on some cell that you’re looking at. All those things will be possible when you connect it up to other devices. We’re going to have to refine a lot of the tactile filtering mechanism, and that’s all going to be done in software, because, Jonathan, if you sent me a Google image of somebody’s face and I threw it on the tactile display, it’s not going to look like a face.
You’re going to have to do filtering to simplify that structure so that a blind person can understand, “Okay, the eyes are here, the nose is here, the mouth is here.” That’s the first part, is the replication of a visual screen, but the second part is exactly what you’re saying, the creation of multi-line Braille terminal, and that is something we’re already starting those conversations with the screen reader manufacturers or providers. Because they may be thinking of it in smaller scale, one or two lines and things like that, but this is upwards, it’s going to be more than eight lines of Braille at a time.
You’re going to have a lot of real-estate and a lot of ways to quickly navigate to things you want to be able to access.
Jonathan: Braille cells tend to weigh quite a bit, so I take it that these Braille cells are very different technology. Otherwise, with that much Braille, it’s going to be huge. It’s going to be heavy.
Greg: Yes. Well, that was the biggest criticism of the Graphiti when we worked on it originally: how bulky it was. No, the Dot cells are just a brilliant technological design. They’re incredibly thin and incredibly light. The entire device, with all the electronics, in the first non-functional prototype we produced, weighs less than five pounds.
Jonathan: What about responsiveness?
Greg: The entire tactile display, and this is without us optimizing pin refreshing, takes less than three seconds to refresh every single pin. Remember that you’re reading multiple lines of Braille, and it refreshes from top to bottom. If I’m refreshing every single line, think about the use case of reading Braille in a textbook: as you reach the last line, where does your hand go first? It’s going to go to the first line on the display, and that’s already going to have been refreshed, because that’s the first line that refreshes.
Jonathan: You know, if I had a chocolate fish for every time somebody told me that they had cracked the holy grail of multiline Braille, and you called it the holy grail and it absolutely is, I would be morbidly obese. How confident are you that this is actually the real deal, and that by the end of 2023, people will have these devices?
Greg: I’m going to say it [unintelligible 01:28:49]. I am right there with you. My message to the field is, I’m telling you what our vision is, and I’m telling you where we are today. We have a non-functional prototype, a physical thing that exists. We know that the cell technology works. We’ve had prototypes of the cell technology, which we’ve actually shown to a number of folks, that can replicate multiple lines of Braille in standard Braille spacing, along with tactile graphics. We know that the technology at its core works. We have a physical device that is a non-functional device, but we know that all of the parts and pieces fit together, and that non-functional prototype is a form factor that people want to use. Having said that, we’re doing so many things that have never been done before. Even the art of combining multi-line Braille and tactile graphics in one use case in a digital device like this has never been done before. What I’m saying is that I am optimistic. I feel very confident in the technology side; I am less confident in the supply chains of today. When you ask if they’ll be available by 2023, that’s our goal, but I also didn’t expect a pandemic in 2020, and I didn’t expect all the supply chain issues that we’re seeing in 2022. Timeline-wise, I can’t guarantee that we’re going to see it by the end of 2023.
That’s our goal, but I am very confident in what the technology can do, which is not something I can say about every other attempt that I’ve seen thus far. You’re right, there have been a lot of attempts and a lot of promises, so I totally relate to the audience who’s hearing this and saying, “Yes, yes, yes,” going from one year to the next.
Jonathan: Well and you and I have both been in positions where we’ve had these product ideas pitched to us, and they appear promising for a while, and then when you start to dig, not so much, but it had to come right sooner or later, and maybe this is the one.
Greg: Well, I’ve always said, if not APH, who’s going to do this? We’re a not-for-profit. We’re looking to almost take the Apple approach to this, which is control the hardware, control the software, control the content. If we can control all of those pieces of the puzzle, that’s more control in our hands, and we’re not relying on others as heavily. In this case, I think the pins are all stacked up for us to knock down. Now it’s just a matter of whether we see any unexpected variables, but I can tell you, when I saw this technology– and I think it’s been that reality check with almost everybody who touches it.
This is so different than anything that we’ve touched before, where you feel a tactile graphic and you feel Braille labels next to the tactile graphic that feel like real Braille. We always felt we were making some very significant trade-offs with technology in the past, and I don’t feel like I see those significant trade-offs as much with this.
Jonathan: Is it too early to talk, even in broad terms about price?
Greg: I would say yes, because it’s still very high. You’re still talking about more than $10,000 for a device, and that’s something that we’re really hard at work on. We’re working with our VP of public policy, Paul Schroeder, who you and I know quite well. He is working very closely with Congress and our folks there to put in additional line items for supplementary funding for this project, and if we can get supplementary funding for this project, that means we can then reduce the price to the end user. That’s really our goal.
In addition, we’re working with blindness organizations like the National Federation of the Blind and the American Council of the Blind, with their membership and their public policy teams, to help reduce the cost to the end user as well. APH is primarily education-centric, but this is not just an education thing. This is something where we’re looking at the entire field and saying, “Okay, federal quota dollars will only get us so far with this project.”
Jonathan: I’d be very interested to see where this goes. While we have you both there. William, can I talk to you about Mantis and Chameleon? I did a review of the Mantis on this podcast. I own one, I bought one. I’m a huge, huge fan. Congratulations for all the work that you’ve done with Mantis. One thing I did want to ask you before we talk a bit about the future and see what I can tease out of you in that regard.
Do you regret jumping on board with the HID protocol, given that we’ve still got some problems getting that working on Android? After all this time, it seems to have been a bit of a slow grind, and then I think it’s fair to say Apple’s implementation of it, despite their familiarity with it internally, has been a little bit patchy from time to time.
William: I understand where you’re coming from for sure. I don’t regret it. I think it’s the right move and we’re moving in the right direction. I feel we were going to have these growing pains, no matter what we did, other than just staying and stagnating and not moving and not trying to have some standardized way of connecting. No, I don’t regret it. We definitely have growing pains. We’re getting there. I think we’re moving in the right direction.
Jonathan: Do you think some relief for our friends on Android who currently can’t use a Mantis unless they connect with a USB? Is there help on the way do you think?
William: I don’t like to point the finger, but it’s all basically in Google’s hands. People need to report the problem through Google’s accessibility channels and do everything they can to help us put pressure on Google to support the standard.
Greg: Just to comment on that, I don’t remember how many years ago it was, it might have been 2018 or 2019. One of the things that APH did is we brought all of the organizations together and we said, “Okay, Apple, Google, Microsoft, all of the screen reader providers, this is the direction we want to go.” We basically had an agreement that said, “Yes, this is what we want to do.” As William mentioned, we’ve seen that some of these folks, Google being one of them, just have not adopted it going forward.
As William mentioned, it’s about getting the word out to the Google accessibility team that you want to see this, both on Android and on Chromebook. We can shout until we’re hoarse, but, unfortunately, it’s the customer’s dollar that ultimately speaks.
Jonathan: Yes, and you guys have to be careful, but this podcast isn’t called Mosen At Large for nothing. I don’t have to be careful anymore, and I can say that one of the interesting things is that even with Apple, which has embraced the standard, sometimes there are just odd things happening with Apple and Braille. One of them is that when you want to assign a Braille-related function in VoiceOver, the user interface makes an assumption that you want to assign a Braille command to a Braille function, even though you could push any number of keys in conjunction with another, like the function key and a modifier, or even just Command and something.
You can’t assign any of those key combinations with their implementation, because they won’t let you. Really, all you have on the qwerty device is the thumb keys, and that’s the extent of your customization. That is eminently solvable. Obviously, Android’s got some bigger problems, and I think as Android has matured in terms of accessibility with speech, it’s just a shame that TalkBack still doesn’t have Braille built in. There’s a lot to do, and I think that’s one of the challenges of us being so dependent on mainstream third parties for our access: that we have to take our place in the queue.
I don’t necessarily expect you to respond to any of that for diplomacy reasons, but I make that comment and I hope that individual users will keep putting the pressure on them. What can we see coming out, William, in terms of some features that you and your partner HumanWare are working on for the Mantis and the Chameleon?
William: The big thing right now is the text-to-speech update. We’re testing that now and hoping to have something we can report to our beta team of users. I think you’re a member of our beta team. Thank you for that.
Jonathan: That’s Chameleon only though because the Mantis doesn’t have that capability, is that correct?
William: Right. Text to speech will be Chameleon only, but this next update will come to both displays, and so there’s going to be features available for both displays, including a Braille editor, support for external keyboards, which will be a nice feature, especially for folks in the deaf-blind community, and a number of smaller things that aren’t as exciting and fun to talk about, but that’ll still be of great benefit to folks.
Jonathan: I get how the Braille editor would work on Chameleon, which has a Braille input keyboard. I take it on the Mantis you would use that Braille emulation mode where you use SDF and JKL to type Braille and the Braille editor, is that right?
William: Exactly. We talked about “don’t edit BRFs,” but there are people, especially people that are either in school or work in Braille, that really want the ability to make small edits to their Braille files and keep them as Braille files. This will allow you to open a Braille file and read it as Braille; no back– It’s not going to be back-translated into print. It’s going to stay as Braille, which is also a huge benefit if you want to read music Braille. If you want to read music Braille, you don’t want it back-translated as though it is just UEB or something, because it’ll be a mess. It’ll give you a good way to read and edit your Braille files and then save them and keep them as Braille files.
Jonathan: One of the things that made me chuckle when I got the Mantis to evaluate, and then I declined to send it back. I said, “Just take my credit card number. I have to have this in my life.” One of the things that I thought when I looked at this Mantis as a former product manager was, “OMG, this is just ripe for scope creep because there’s so many things that people want this thing to do.” How do you draw the line? Where is the boundary? I suspect that’s quite a difficult question between what you would expect to be functionality built into a more expensive fully-fledged notetaker and what you would put into a device like the Mantis.
Greg: Yes. William, how do you draw that?
William: That’s a really good question.
Jonathan: See, I’m glad I can ask the questions these days and not have to answer.
William: Yes, it’s a really good question. Part of it is based on the fact that I feel we have a great relationship with our users, and I love how active the mailing list has been, getting that direct feedback from users, and folks being comfortable telling us, “Hey, I don’t like this.” I want to hear it when there’s something they don’t like and when there’s something that folks want. We’ve had a number of very successful surveys. I don’t know how aware your listeners will be of this, but nobody fills out surveys in our field.
People have a really hard time, and I’ve been super successful at getting survey responses because people are so passionate about these products, so much so that the other product managers have been coming to me, like, “How are you getting people to fill out surveys?” It’s easy. When folks care this much about the product, they will fill out the survey and tell you what they want, tell you what they think. The roadmap that we’re working from is based on what we’re learning from the mailing list and what we’ve learned from the surveys, really focusing on doing the features and the apps and things that folks really want. I can’t talk about the next update to come.
I’ve given you a little bit of what we’re hoping to put out here within the next month, but it’s exciting. I think it’s stuff that folks are going to be really happy with.
Jonathan: What will the text to speech on Chameleon do on this first go-round? What should people expect?
William: Well, right when you first install the update, there is a limitation, which is that you’ll have two voices. It comes with two voices for English and two voices for Spanish. We’re going to have an update in the future, the near future, that’ll allow you to switch out those voices and pick your own, but at the very beginning, you’ll be stuck with one of those two voices. For English, it’s Sharona and Will; we’re using Acapela. My understanding is that Sharon has been changed to Sharona in the latest release. I don’t know why they would make a change like that, but that is the change that’s been made.
It’ll have Sharona and Will, and you’ll be able to get any voice once we put out that update that lets you download and add your own voices.
Jonathan: Well, that’s only fitting that you include the Will voice, right?
William: We did a lot of “Which voice do you like best?” polling, and Will won. I did feel a little weird about it, since that’s my name.
Jonathan: Even now people ask me if the M in BrailleNote mPower stood for Mosen. That never even occurred to me.
It stands for mobile, just on the record there. There's a lot of good stuff coming up with the Mantis. It's just one of those products that for people who want this, who want qwerty with a Braille display, they really feel passionate about it. It sounds like you were quite surprised by the uptake.
William: Yes, Larry Skutchan. It was his idea a long time ago. When he introduced the idea to us, we all were like, "You're crazy. Nobody wants that. It's not going to work. What are you talking about?" And he was absolutely right. People really did want it. When we were shopping it around, we had people tell us, "You're crazy, you're going to lose money, no one's ever going to buy that, don't even bother." He stuck to his guns and the field testing backed him up. The early feedback we were getting backed him up. Then the history over the last two years has backed him up. Kudos to Larry for seeing his vision through there.
Greg: Just to chime in on that, I think it's also a matter of timing. I remember folks telling me that Larry had been pitching this product for years and years before it got accepted. When you look at where we are today, blind people are working on these electronic devices, and now it's not strictly a standalone note-taker that people are using.
These are connected devices to mainstream tools and maybe they are using a note-taker, but it also connects to these other devices. The idea of using a qwerty keyboard on these other tools is not such a foreign idea anymore because you’re using a qwerty keyboard most likely on your laptop or with your iPad or something like that anyway.
I think the timing factor is also a big piece because now we’re so connected with all these other mainstream devices.
Jonathan: It serves two purposes, really. I used to use one of those Logitech wireless keyboards because I could easily switch from one device to another. I could just sit in front of one keyboard and do all these things. Well, the Mantis fills that function while also giving me Braille. It actually means there are fewer devices to carry around.
William: That's a good point. We have folks that work at APH and that's what they do. They use the Mantis full-time. We had one guy, he's retired now, but he would sit in a reclining chair and use his Mac, and then he'd switch to his iPad, with just the Mantis in front of him.
Jonathan: Well, it’s been great to catch up with both of you and people can go to aph.org just as we close. How do you intend to keep people up to date with the progress of the eBRF format? Because I think there’ll be a lot of interest in this. Is there a way that people can keep tabs on how it’s evolving?
Greg: As William mentioned, we're going to be creating a mailing list for the eBRF, and with the dynamic tactile device we are going to be putting out update blogs as time progresses. We're also going to be at all of the summer conventions giving updates there. We do have an active email address. If you have any questions or suggestions, or just want to be interested receivers of information, you can email DTD@aph.org. That's DTD, for dynamic tactile device, @aph.org. Those emails go to me, or I'll see them, and we can go from there.
Jonathan: Brilliant. Thank you both for your time and your generous contributions to the show today. I really appreciate having you on.
Greg: Hey, thanks so much, Jonathan.
Speaker 4: On Twitter, follow mosenatlarge for information about the podcast, the latest tech news, and links to things we talk about on the podcast. That's mosenatlarge, all one word, on Twitter.
Jonathan: Thank you for your email, Christian Burtling, which says, "Hey Jonathan, do you know if Spotify supports navigating podcast chapters? I want to switch all my music and podcasts over to Spotify because it's cross-platform and it has both music and podcasts in the one app, but I love the chapter support in your podcast so much that I will refuse to switch to Spotify if it doesn't support podcast chapters." Thanks, Christian. I don't use Spotify for podcasts myself. In fact, I don't really use Spotify at all, but I did some research on this, and it seems that podcast chapters are a frequently requested feature. At the moment, though, it looks like there's no podcast chapter support on Spotify.
You're going to have to refuse to switch unless you can persuade Spotify to implement the feature. It is such a handy feature, isn't it, especially with a long podcast like this where there are distinct sections. There may be one section that doesn't interest you and another one that does, and being able to skip around the podcast that way is so cool. Believe me, on the rare occasions where I push a wrong button and we don't generate the podcast chapters within minutes of me publishing the podcast, I start hearing about it from listeners. It's obvious that the podcast chapter feature is super popular.
Here's a good question to throw open to the Mosen At Large community, and it comes from Edwin Khoo. Edwin says, "Hi Jonathan, a quick question. Do you know of any macro recorder that allows one to perform repetitive tasks in Windows?" Edwin, it's been a while since I looked for something like this, but when I needed one I was using something called Macro Scheduler. It was really accessible and, as some Americans like to say, hella powerful. It is very powerful, this thing. You can go to mjtnet.com to get it. Macro Scheduler is the name of it, but I have no idea whether it's as accessible now as it was when I used it.
One of the main things I put it to was creating a script, back in the days before The Archers, this BBC radio series that has been running since 1951, was available online as a podcast.
The only way you could hear it online was to catch it live on BBC Radio 4. This was in the days when they were streaming with RealMedia, so we are talking a long time ago. With Macro Scheduler, I was able to have it run Sound Forge, hit record, go to RealPlayer and open BBC Radio 4 in RealPlayer, keep recording for a predetermined amount of time, normally about 20 minutes just to be safe, save the file in Sound Forge with a file name based on variables, and then shut it all down. Macro Scheduler did all that for me.
It was a super program. As I say, I haven't had a need for something like this for a wee while. There may be better things on the market, and it could be that something has happened to Macro Scheduler and it's not accessible anymore, but if anyone has any hints on a good macro recording type app for Windows, please be in touch and let us know. Jonathan@mushroomfm.com is where you can send an audio clip or just a regular email message. You can also phone into the voicemail line at 864-60Mosen, 864-606-6736.
Speaker 5: Aloha Jonathan. This is Keao from Hawaii. I was wondering if you or your listeners had any experience with DigiSign. Now, this is a site that has a lot of forms for different people to sign. I just started a new job with a company called Access to Independence, and my duty as an independent living service coordinator is to send out release of information forms to consumers. I know a lot of the forms that are available aren't really accessible for us, but is there a way to manipulate a field to put in a signature or something along those lines, and to use DigiSign to send the form to a person?
Jonathan: First of all, congratulations on the new job. That is great news. I have heard of DocuSign before, but I've not heard of DigiSign. That's one that has not made it here, at least to the best of my knowledge. I can't comment at all on this, but we'll open it up and see if anybody has any experience of this technology, its accessibility, and how we can engage with it. Let's hope we get some comments back that will help you out on this.
Hello to Dave Carson who says, "Jonathan, if this has not been brought up, I would like to know if anyone else is irritated with the presence of the vertical scroll bar in the Photos app in iOS. I'm not able to swipe through my photos without being caught by the VSB immediately. It is very frustrating. Does anyone know of a fix for this? I like the VSB in the Mail app and in Settings, as it does not get in the way of simple swiping. I'm running the latest iOS 15.4.1 on an iPhone 12 Pro if that makes any difference. Thanks for your excellent podcast," says Dave. Well, thank you for writing in, Dave. I can't duplicate this, but I am running the latest beta of iOS 15.5, so it's possible that it's been resolved.
I have the same phone as you. I have the 12 Pro Max actually, but I’m not seeing this. I have found though, scrolling through the photos app pretty convoluted in general. It just doesn’t seem to reliably keep focus. I have a lot of trouble locating the photo that I’m looking for, particularly if I’ve just taken one, it seems to be really difficult to get to it. I’m not seeing the vertical scroll bar thing, but I just think the photos experience really isn’t that great compared with many of Apple’s stock apps, which of course are exemplary in terms of accessibility. Perhaps somebody can comment on this vertical scroll bar. It’s a funny control that one because sometimes it just doesn’t behave like I expect it to behave.
I have very long threads of messages. Obviously Bonnie and I have a very, very long message thread, and I keep my messages, I don't delete them. I thought it would be fun to just scroll through those old messages, just to see what we were talking about 5, 6, 7 years ago, because I've got messages that go back that far. What I found was there was a vertical scroll bar at the bottom, which I think was supposed to let you scroll through this long thread of messages, but it didn't appear to take me to the beginning no matter what I did with it, and I ended up having to perform a three-finger flick down to scroll through it a page at a time. It took me ages to get back to the beginning, a long time.
I don't know, that vertical scroll bar is a curious creature. Maybe somebody else can comment on this, Dave. Hello, Rick Roderick. It's always good to hear from you, Rick, and he says, "I have an old computer. It's a Dell that I got in 2014. It still works great for most things. I am running Windows 10 and the latest version of JAWS. For several months, Outlook has been getting stuck on certain messages. JAWS literally stops working, and when I start it again, I find myself in the list of messages, not in the messages themselves. This happens most frequently with emails from American Public Media. I will send you a sample," and Rick indeed did send me a sample.
It's a very rich, busy message, and it looks to me like one of those mass mailings that may have been sent out through a service like Mailchimp. I have also experienced some issues, not as drastic as yours, Rick, but where you get some of these Mailchimp-generated messages that are full of decorative tables and images and that kind of unnecessary clutter, and sometimes those messages can bog JAWS down. What I suggest you do is see if you can get on a Tandem session with one of the techs at Freedom Scientific, or send them a sample of the message and explain exactly what the symptoms are, particularly if you can reproduce this on demand, if you can keep a message where every time you open it, it does this.
I suspect that somebody from Vispero's escalation team may want to have a look at that with you, Tandem into your computer and see it in action so that they can try and narrow that down. I know you've got a lot of good tech-savvy skills there, Rick, and you'd be doing everybody a favor, because if you are having this problem, I think the chances are quite high that someone else is. If it is unique to you, maybe Freedom Scientific knows of some sort of fix that you can apply. Very best of luck with getting that resolved. If you do get it resolved, do let us know what the magic trick was.
David: Hope you had a lovely birthday. Anyway, I'd like to come in here to pay tribute to Bruce Russell. Now, for those outside New Zealand who might not know who Bruce Russell is, he worked on radio for many years, [unintelligible [01:54:41] about 50 years or so, but I got to know Bruce from King Country Radio, 1512 King Country Radio [unintelligible [01:54:50]. When I would go home for the school holidays, Bruce was doing the morning show, reading the news, or he would do the normal morning show. I have a cassette recording of Bruce wishing me a happy birthday, happy 17th birthday, in August 1995.
Unfortunately, I didn't win the [unintelligible [01:55:16] that day, but I remember quite a few people also celebrated their birthday that day, and I still have the recording. Sadly, I don't often have anything to play it on. If you listened to Newstalk ZB, you would hear Bruce overnight from midnight to six, or midnight to five, I think it was. He'd be in the newsroom six hours later doing another six-hour news shift, and it just happened, appropriately, that he died at the Newstalk ZB news desk preparing to go on air.
Jonathan: I love to hear from you, so if you have any comments you want to contribute to the show, drop me an email, written down or with an audio attachment, to Jonathan@mushroomfm.com. If you'd rather call in, use the listener line number in the United States, 864-606-6736.
Voice over: Mosen at Large Podcast.
[01:56:23] [END OF AUDIO]