Transcripts of Living Blindfully are made possible by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at http://PneumaSolutions.com.
Advertisement: Chris Peltz here with the Blind Grilling Experience – where we talk all things cooking, grilling, and barbecue; tips, tricks, and techniques for the blind and visually impaired, all things accessible technology centered around food.
If you like brisket and breads, you like pizzas and pies, folks, we will leave you hungry and wanting more.
Check out the Blind Grilling Experience on your favorite podcast app, or visit our website at BlindGrilling.com.
Voiceover: From Wellington, New Zealand, to the world, it’s the Living Blindfully podcast – living your best life with blindness or low vision. Here is your host, Jonathan Mosen.
Hello! As iOS 17’s release looms, critical Braille bugs remain; from the NFB Convention, Orbit Research and Access Mind bring us up to date with Optima, OrbitSpeak, and more; and we learn about an app that could be a game changer for people who struggle to hear in noisy environments.
Brrrr! Have you ever tried typing B-R-R-R-R-R, with as many R’s as you like, into a text editor and then having your text-to-speech engine of choice read it back? Sometimes, you get some interesting results.
It is nice to be doing episode 243 of Living Blindfully on this very cold winter’s day in Wellington. Nice to be all warm and snug in the studio, talking to you. Thank you very much for listening.
Now, US area code 243, yet again, doesn’t exist. Pretty slim pickings around the 240 mark, isn’t it? So I guess, there’s some room for growth in the North American numbering plan. So no 243 in North America to tell you about.
But meanwhile, if you were listening attentively to last week’s episode, which I have no doubt at all that you were, you will recall that country code 242 was for the Republic of the Congo. Country code 243 is for the Democratic Republic of the Congo, where there are over 111 million people hanging out, according to the 2023 census, taken this very year. So if you are one of those 111 million people, a warm welcome to you. It is amazing where this podcast reaches.
We’re not going to do a Bonnie Bulletin this week because, you know, she charges such a large appearance fee. I can’t afford to bring her on the show every week. [laughs]
No. Actually, there’s a lot going on for Bonnie. She’s doing a bit of travel, and that kind of thing.
But I did want to report on the fact that we did manage to secure the Sir Paul McCartney tickets. We opted to go to Melbourne. And so we will be there on the 21st of Roctober. Really looking forward to that.
It’s most likely the last time that I get to see him, I suspect. I don’t want to be maudlin about it, but that’s probably the truth. So it’s worth going all the way to Melbourne even though he’s not coming here, which we’re very upset about here in New Zealand. My son, Richard, is going with us, and it’s going to be a very special occasion going to that Sir Paul McCartney concert.
We got it through one of those pre-sales. So I had to get a special code. And then, you go to the website and they put you in this waiting room thing. This is with Ticketmaster. And eventually, you get assigned a random number in the queue. And my heart sank because when we got assigned our random number in the queue, it said there were over 10,000 people ahead of us. And I was sitting in the waiting room for some time before the queue opened. But apparently, they just assign you a random number.
So we finally got in there. Man, concert tickets are getting expensive. And if it wasn’t Sir Paul, I would probably have pressed Alt F4 when I found the price. [laughs]
But I’m a major Beatles fan, and this is going to be special. I appreciate that his voice is starting to get quite rough around the edges, and all that sort of thing. But how much more often are we going to get to see a real Beatle performing live? So it’s going to be a very special occasion for us.
Advertisement: I’m grateful to Pneuma Solutions for sponsoring transcripts of Living Blindfully so the podcast can be as inclusive as possible. Inclusivity is a good word to describe all that Pneuma Solutions is about.
And if you’re employed, you’ll know that RIM and Scribe for Meetings can make a significant difference to accessibility and productivity in the workplace. We know it. But how do we convince the boss or the person holding the purse strings in our workplace of that? Pneuma Solutions takes some of the pain of this process away through their self-advocacy kits for Scribe for Meetings and for RIM.
Simply visit the Pneuma Solutions website, find the self-advocacy template that works for you for either product, fill it in, and send it to the person who makes the decision. It really is that simple to get the conversation started at your workplace. And if the person you’re talking with has questions, Pneuma Solutions are on standby to help.
So if you know you could benefit from Scribe for Meetings or RIM in your workplace, don’t hesitate. Use the advocacy templates and start a conversation.
To find out more, head over to PneumaSolutions.com. That’s P-N-E-U-M-A-Solutions.com.
There’s good news for you if you’ve been waiting patiently (or impatiently) for the Be My Eyes virtual volunteer product (at least, that’s what it was called originally) to roll out a little bit wider.
You’ll recall that we did a comprehensive demonstration of this back in episode 223. It has come a long way since then in terms of speed and the quality of what it’s producing. It’s become even better. And I know there’s a lot of interest in this product, which is now called Be My AI.
Mike Buckley, who’s Chief Executive at Be My Eyes, put out a message a few days ago as we put this show together to say that imminently, they are expanding the availability of Be My AI, on iOS initially. So sorry about that, Android users. And they say they’re doing it equitably on a first-come, first-served basis. So if you were quick and you registered your interest in becoming a Be My AI tester early, you are likely to get access quite soon.
So at last, it’s a public beta moment. It’s a chance for a much wider cross-section of the blind community to kick the virtual tires, to put this through its paces. And I am very interested to hear what you think of Be My AI if you get access to it in the next few days.
So if you play with it, what are you using it for? Has it met or exceeded your expectations, or failed to meet them?
Let me know what you think. Really interested, and I know others will be as well. opinion@LivingBlindfully.com is how you can get in touch. And you can also call 864-60-Mosen. That’s 864-606-6736.
Also in notable items of news, the 2.0 firmware for SensePlayer has been released by HIMS, Inc. And there are many enhancements there. But I think the one that everybody was waiting for was the ability to sideload Android apps.
Now, having looked at people’s comments on this on Mastodon (because I am not a SensePlayer user), it looks like there are some limitations. Google services are not on the device. And from my understanding, when you’re using these third-party apps, you’re going to be using eSpeak.
Now, text-to-speech engines are a very subjective matter. And I know that eSpeak is really responsive, but I cannot understand it. I have given up on audio tutorials and things in the past because they’ve used eSpeak. And I simply cannot make out what it’s saying. So I have seen some disappointment expressed on Mastodon about this eSpeak thing.
And people are experimenting at the moment. They are sideloading apps (I understand you do need to find the APK files to do this) and seeing what works and what doesn’t work.
So if you own a SensePlayer and you’ve got the 2.0 firmware installed and you’re using this new screen reader functionality to expand the capability of your SensePlayer, how is it working out for you? What are you using well? What’s not working for you? And how do you feel about the lack of Google services and the jolly old eSpeak?
Oh my word! I’m not really in the SensePlayer’s target market, I don’t think. But if I were, that would be an absolute show-stopper for me because I wouldn’t be able to use the apps in question. [laughs] But let me know what you think of how it’s all rolling out.
I understand that there is a webinar coming up from HIMS which will tell us more, and that is happening a little after we publish this episode.
As I’m putting this episode together, there has not been confirmation from Apple at this stage about when the iPhone 15 event is going to be happening. But smart money would be on about the 12th or the 13th of September. That means that iOS 17 will go public the following week, and those who are beta testing will get the golden master the week of the Apple event. So what I’m telling you is that it’s quite close.
The iPhone 15 sounds like a very exciting device in some ways.
Apple will be hoping to change the narrative because there’s a bit of a smartphone slump, and that has been reflected in their recent quarterly earnings call. And that’s got investors a little bit nervous about Apple’s performance, even though they showed strong growth in the service area.
I mean, it’s an old product category now, smartphones. It’s a mature product category. So you would expect to see growth in services and not necessarily in smartphones, particularly in the current cost of living conditions where people may keep their phones for a bit longer because they’re doing all that they need them to do. Thank you very much. But investors seem to have been a little bit spooked by that Apple quarterly earnings call.
So we’ll see what’s in the iPhone 15. We know there’s going to be USB-C, and it looks like that’s going to be in all models – not just the pro models, but in all iPhone 15 models. It looks like, certainly for the pro models, there may be prolonged battery life due to some new technology, a better camera, and who knows what else? We will see.
But one thing I can assure you of is that we will have a Living Blindfully special right after the Apple event, and that will go live for plus subscribers just a couple of hours after the Apple event. It’ll be public 3 days later.
So if you’re not a Living Blindfully plus subscriber yet, feel free to sign up anytime for as little as $1 New Zealand a month. That’s like 60 American cents at the moment. So it’s a bit of a bargain. You’re welcome to pay a bit more if you think the podcast is worth more. LivingBlindfully.com/plus.
But the reason why I raise this now is that time is running out for Apple to fix critical bugs, and I think we need to talk as a blind community about what is going on with Braille in iOS 17. I’ve chosen to raise it now here on the podcast because the beta cycle is nearly over, and we know Apple’s track record with Braille. And let’s be frank about it. It is not good. Apple leaves serious Braille bugs festering for a long time after official release. So no one can blame Braille users for not trusting Apple anymore when it comes to Braille because they’ve betrayed our trust repeatedly over many iOS cycles.
So let me tell you what I’ve found as a Mantis user. There have been issues with the Mantis right throughout this beta cycle. The first few builds wouldn’t take any keyboard input in edit fields, but Braille worked fine. What that meant in essence is that if you wanted to write anything, if you wanted to create any kind of content at all, whether it be an email, a text message, a document, or a toot on Mastodon, anything at all, you could not do it with the Mantis in the early builds of iOS 17.
Now, that issue was replaced in developer beta 4 with the problem pretty much in reverse. At that point, I contacted Humanware who duplicated my findings, and hopefully, they have some magical connections with Apple to get this sorted.
But let me explain what goes on. When you install a new build of either iOS or iPadOS (and I’ve personally verified this on 2 different iPhone models), everything works beautifully initially. So when I installed beta 5, I said to Bonnie, “Thank goodness, this is fixed. This would have been incredibly debilitating.”
But my optimism was misplaced because everything starts to go bad when you restart the Braille display, in my case, the Mantis. At that point, the Mantis connects to the iPhone, but it only works as a keyboard and not as a Braille display. You don’t actually hear the sound that you hear when a Braille device is connected. So the device is acting as a keyboard, but there’s no Braille.
The usual trick of choosing the reconnect option from the menu on the Mantis doesn’t resolve the problem, nor does rebooting the iPhone or the Mantis. I’ve seen a few occasions where I can bring the full functionality back to life from the power off screen of iOS. I have no idea why this works sometimes, but it doesn’t work every time. Sometimes, if you completely unpair the iPhone and the Mantis, and then you pair them again, you’ll get lucky for a while. You’ll get your Braille back. But then the problem recurs, and even that doesn’t work most of the time. It is what Americans like to call a crap shoot. The upshot of this is that mostly, the Mantis will only work as a keyboard and not as a Braille display.
In talking about this on Mastodon, I have heard from people who are using the QBraille XL from HIMS who are having very similar issues. I asked one user if he knows whether the QBraille XL uses the Braille HID protocol. He tells me it does not. So my initial thinking was this has got to be Braille HID-related, and I was getting a bit grumpy with Braille HID because to be honest with you, it seems to be a lot more trouble than it’s worth. But other people who are not using HID devices seem to be experiencing some issues with Braille as well. I’ve also heard that similar problems are occurring with the NLS readers and other devices.
So this is a significant issue, and let’s be clear about the consequences of this if it were to be released this way. I want to put the most important people that this would affect right up front here, and that is deaf-blind people. Deaf-blind people don’t have a choice about whether they will use Braille or not. So let’s say that this issue was not fixed by the time iOS 17 is released, and a deaf-blind person updates their device and they have flaky or basically non-existent Braille, that goes well beyond inconvenience. It literally can be a safety issue if your means of communicating has been taken away from you. Obviously, if you use your iPhone for productivity (on the job) and you use Braille, then this is a show-stopper as well.
And assuming that this is a software issue, which it must be, it would certainly prevent me from buying an iPhone 15 because I couldn’t downgrade iOS on an iPhone 15. You have to have iOS 17 on that, and I wouldn’t be able to use my Mantis properly until this is fixed. It would mean I couldn’t do this podcast in the way that I do it because I read your emails that you’re kind enough to send me using the Mantis and the iOS Mail app.
I am very fortunate to have a backup display. I have a Freedom Scientific Focus 40 Blue. And to be fair, I haven’t tested to see whether it is performing okay with iOS 17.
But what we know is that as we get quite close to release, this serious Braille bug that seems to be manifesting itself on multiple types of Braille display still exists, and it’s getting a little bit too close for comfort for me. I’d rather raise the alarm now and see if we can advocate for this to be addressed before release, than see many people inconvenienced by upgrading because maybe, they didn’t get the news about don’t upgrade because of this Braille bug. It is very hard to downgrade again, and Apple quickly stops signing previous builds of iOS. So if you upgrade and you find that you don’t have Braille, you’ve got a very limited window to downgrade again, and it isn’t easy. It requires quite a degree of tech savviness to even do it.
If you are running iOS 17, then please do file a report on this. If you know anybody that might be able to make a difference, please let them know. I know that Humanware is onto it.
Can you imagine the tech support nightmare that will occur for all Braille display manufacturers impacted if iOS 17 gets released this way? They will be overwhelmed because of a problem not of their making.
And to draw on the equivalency argument which I made at the National Federation of the Blind convention: if sighted people had installed developer beta 4 and found that their screen was very flaky, preventing them from working effectively, you can be sure it would have been fixed in developer beta 5. No question about that. That would be considered a critical show-stopping bug.
That’s the equivalent for a blind person who uses Braille. This is a high-impact bug. Let’s hope it all gets sorted by iOS 17 time.
Voiceover: Has something on the show got you thinking?
Share those thoughts with the rest of the Living Blindfully community. Send us an email. You can include an audio attachment recorded on your computer or smartphone so we can hear your voice, or you can write it down. The address is opinion@LivingBlindfully.com. That’s opinion@LivingBlindfully.com.
Or phone our listener line in the USA – 864-60-Mosen. That’s 864-606-6736.
Let your voice be heard.
We are perfectly entitled to call out discrimination when it occurs. But I think it’s good also to celebrate when we’ve had a good experience, and this is what this extraordinary email is all about.
This is a listener from The Netherlands.
“Thank you so much for such an amazingly informative podcast. Ever since I discovered the Mosen at Large podcast, it’s the one I look forward to hearing most each week.
Congratulations on your transition to Living Blindfully, by the way.
I would like to share a story. As blind individuals, we often encounter situations where sighted people we interact with decide what we can and cannot do or achieve. While this can be demotivating at times, life can surprise you unexpectedly, upping your joy meter, so to speak, when you encounter the opposite.
In 1986, I was diagnosed with terminal kidney failure, the result of a progressive genetic disease I have had since birth, and I have been dependent on dialysis ever since. I did receive a donor kidney 3 times, but all of them had to be removed shortly after transplantation for various reasons. I have been on dialysis continuously until today.
In 2008, I met my lovely wife on a dating site, and we’ve been married since 2012.
In 2009, while hearing a conversation between two nurses while I was on dialysis in the hospital, one of them mentioned home dialysis. That sparked my interest, so I asked her if she would mind telling me about it as much as her time would permit, and she did.
When discussing it with my wife, she expressed complete willingness to give it a try and see if it would lessen the burden of being a kidney patient. And gee, what a difference it made.
Dialysis is the process of removing excess waste products from the human bloodstream. Normally, the kidneys perform this function 24/7 for all of us, in addition to removing fluid overload from the blood, which together form urine.
When your kidneys stop functioning, though, as happened to me, one option is to rely on a machine to survive, requiring 3 treatments a week, each lasting 4 hours. Basically, 2 needles are inserted into your body. One withdraws blood saturated with waste products from what you eat and drink, and the other needle returns the cleaned and fluid-depleted blood back to you through the artificial kidney in the dialysis machine. Undergoing this treatment has a significant impact on your energy, spendable time, and overall life. In almost all countries where dialysis is a viable option for people with kidney failure, it is done in a hospital.
Having my own machine, no longer needing to go to the hospital, and being able to administer and plan the treatment at home was nothing short of a dream. In the Netherlands where I reside, a doctor contemplated the possibility and numerous benefits of patients doing dialysis at home. No medical personnel, flexible treatment times during the week, and the option to treat oneself while sleeping, regaining freedom during the day. In short, it is now a reality, but only for those fully-sighted patients willing to take full responsibility for their own treatment, with or without the assistance of their partner.
In July 2009, my wife and I decided to see if we would be eligible to undergo the extensive education required to be certified to perform dialysis at home without professional oversight. I underwent an assessment and obtained permission. That, in itself, was a miracle.
Despite having experienced many occasions where others seemed to know what I can and cannot do, I still felt somewhat hesitant about undergoing this particular education as a blind person due to its medical and sterile nature. I feared that almost every action would be delegated to my wife.
However, during the initial conversation with the doctor who proposed starting home dialysis, I was taken aback. He said, “Paul, I have no idea what you as a blind person are able to do and what obstacles you face. We had extensive discussions with your future instructors, and here’s our plan.
We’ll assume that you are capable of performing all the required actions. You will learn to prepare your own machine, which involves a lot of precise but manageable tasks. Most importantly, you will learn how to insert the needles necessary for your treatment into your own veins.
Although this may be challenging at first, with education and patience, I believe you will be able to do it. Only if something seems impossible or potentially harmful to your health, both you and we will decide if your wife should step in.”
Wow! I remember feeling like an energetic dog eager to get started, but confined in her crate, and my joy meter sprung through the roof because I was given such a unique opportunity.
However, the thought of inserting a sharp, long, thick, sterile needle correctly into my left upper leg shunt was still frightening. The needle is 0.1 inches in diameter, an inch long, and I had to navigate through my skin tissues and the vein wall without anesthesia to access the high-pressure bloodstream.
In the end, all the fear turned out to be unnecessary. The team kept their promises. They took all extra time needed to create the entire protocol, since home dialysis with a blind patient and a partner had never been done before.
After 6 months of theory and extensive practice, my home dialysis machine was finally installed. Woohoo! Now that my wife and I have been doing it for a long time, it is no longer something to dread, but has become a part of our lives 5 nights a week.
As a result, we now have complete control over when to plan the treatment over the week. And I can independently dismantle and reassemble the machine with all the necessary parts before each treatment, which previously seemed too complex to handle.
If an alarm goes off during the night, my wife reads the machine’s display to me, but I myself need to know what needs to be done. So the responsibility lies with me, and it feels absolutely empowering.
There are still tasks that I, as a blind person, cannot do on my own. While preparing for each treatment and during the treatment itself, my wife needs to be with me. I could have been a solo patient, but operating the non-talking touchscreen on the machine and setting blood levels, among other tasks, require sight. Thus, my wife dedicates 5 evenings a week, and she has been doing so for the past 14 years, to help me, which is truly invaluable.
The nephrologist who introduced the idea of home dialysis and made it a reality in the Netherlands occasionally mentions me in his keynote speeches across Europe. I seem to be the only blind person in Europe currently administering the needle myself and managing most aspects of home dialysis, and that fills me with a sense of pride. When people tell him that a blind person cannot insert their own needle due to the need for good eyesight, he now can prove them wrong.
To conclude, in the hospital 14 years ago, I used to receive 12 hours of treatment per week, leaving me with little energy and no life. Now, at home, I receive 50 hours of treatment each week, 10 hours per night, which has given me a renewed sense of energy, a fulfilling, blindful life, and regained mobility.
Even a guide dog became a reality. As I write this, she is sleeping soundly beside my chair. I’ve been wanting to work with a guide dog all my life, and now that she’s finally with me, she has turned out to be more than just good help out and about. She is an immense enrichment and a dear friend. Had I not gained the trust of the medical team 14 years ago, my life would not be as blindfully fulfilling as it is right now.
Good luck, and a big thank you for all your invaluable contributions to our community. Please keep up the good work to help us all stay in the know.”
Thank you, Paul! What a wonderful email to get.
Isn’t it just so refreshing when somebody actually says some magic words like how can we help? What is it that we can do to make this work and make it a good experience for you?
I’m so pleased it worked out for you the way that it did. That’s great stuff, and thank you for taking the time to put that detailed account together. That was very special.
This next contribution comes from Steve Bauer.
Steve: Recently, there’s been a lot of discussion about blind people and hearing, and especially being in noisy places. And I know that that can really be a problem because I’ve said for many years that it’s amazing how much you hear with your eyes. I’ll give you a couple of examples in a moment.
Whenever I go out to eat, when possible, I try to go a little earlier than most people. The main rush crowd arrives at, say, 6 PM. Well, I’ll try to go at 5 PM to avoid the time when the restaurant is exceptionally noisy, because it just gets on your nerves.
A very good friend of mine who is blind passed away about 3 years ago. When he was working and he’d go out to lunch with one of his buddies and be in a noisy place because it was the lunch hour, he had a hard time hearing the waitstaff or ordering at a counter.
So they did an experiment. He had his friend close his eyes, and he said a short sentence. His friend got part of it, but not the whole sentence.
So he had his friend open his eyes and look at him, and he said the same sentence. 100% comprehension. So it is amazing how much you hear with your eyes.
With me is my dear friend Donna, and we have been able to work on some interesting communication skills. And I’ve got to give her credit because she’s very innovative, and it’s really pretty cool. Because eye contact for a blind person is impossible. It’s just not going to happen.
But Donna, you have come up with a very unique and very nice way of using touch to help communicate in different situations.
Donna: Are you referring to the toes? [laughs]
Steve: Oh yeah, you know, or just a touch of the hand, whatever has to happen to communicate without eye contact.
Why don’t you talk just a little bit about how you kind of came up with the idea?
Donna: Well, I’m a dancer, and I’m used to being led, or dancing with, as a couple. And some of the moves are so subtle and it can be a touch, it can be a turn. It’s feather light and I can be led easily. I’ve been told that it’s easy to lead me in a dance, especially in West Coast Swing.
And so I do the same thing with you. It’s like a little brush, a little touch. And you’re easy to steer in most cases. [laughs]
Steve: [laughs] But it’s very subtle and just very light. It’s nothing heavy or very intense. It’s very light and very sensitive. And I find it extremely refreshing.
Donna: It’s not like I’m trying to lead you like a parent might direct a child and turn their head, or jerk their hands to get them to focus on something else other than what they’re focusing on.
As a blind person, I would suspect that you are very responsive to any touch that you might feel, whether it’s a brush against your sleeve, well, the toe thing. [laughs]
Steve: [laughs] Okay.
Donna: We’ll have to explain that.
Steve: Yeah, do that just a little bit so they understand what that’s all about.
Donna: One of our first dates was at a restaurant. We sat across from each other and he had no idea about communication, or anything. And he was sitting there, feet under him like normal. And I took off my shoe and touched the top of his shoe. And so it was my bare foot, and it just seemed to make the whole experience more relaxed because you weren’t touching me or anything, but I was.
Steve: But we were connected. Through touch.
Donna: With the toe. [laughs]
Steve: Right, exactly, and it’s just really pretty cool. So that is kind of how we are working to enhance communication without eye contact. And it has just been really great. And I appreciate you coming up with the idea on how to do that and how it works. And it’s just cool.
Donna: Another thing that I like when we’re walking and I need us to be a smaller package passing between chairs –
Steve: Or down a crowded aisle in the store, or whatever.
Donna: Yes, is to pull your hand to the middle of my back. And it doesn’t take much, but I can feel him back there, and I just feel like I got a little train and we’re going through the aisles. [laughs]
Steve: So communication is so important, and being able to hear what’s going on is important. And it all comes together, when people work at it just a little bit and it can really improve your life.
Donna: Another thing. When we get out of the vehicle, I tell you which end of the vehicle to go to.
Steve: Right. The front or the back, which is helpful because I won’t always make the right guess. [laughs]
Donna: [laughs] Yes, we’ve done circles around the track. [laughs]
Steve: [laughs] But that’s okay.
Donna: And when we walk up to the vehicle, I don’t try to lead him to the door. I let him be as independent as possible. And I tell him you’re at the tailgate or the back bumper, and he can find his way, and he finds the door handle.
Steve: Yup, it works out great.
So anyway, we just wanted to share our little story with you here on the podcast, as it is our way of improving communication.
Jonathan: Randy Reed says:
“I am writing in response to Amanda from episode 240 and her message about trouble hearing in loud environments without a hearing impairment.
I am 35, and also have good hearing. I also have had similar experiences to the ones she described while trying to navigate restaurants and events.
I recently found myself in a position where I needed to replace the headphones I use on a daily basis. I’d heard about the transparency mode on AirPods Pro, and decided to give it a try. I found it worked to my satisfaction. While doing the trial, I discovered, quite by accident, conversation boost mode and the ability to filter out loud sounds.
The weekend I got them, I was attending a diversity event that focused on, among other things, disability. It was going to be crowded, feature live music, and was an unfamiliar environment. So I decided to try it out.
What I found was that after configuring my personal hearing profile, I was able to hear conversations around me with relatively little difficulty. I didn’t feel like I had to shout to be heard because the mics picked me up just fine, and other people didn’t shout because they normally don’t in places like that, since they can read each other’s lips. I enjoyed the event more than I normally would enjoy something like that. And I didn’t feel completely drained after leaving and returning home.
Until I heard Amanda’s contribution, I figured it was just something I was dealing with on my own. Since I’m clearly not, I wonder if with the advancement in audio technology, devices that can offer a set of features like I’ve described here might be one day considered mobility aids? To my way of thinking, this is no different than a set of glasses a sighted person might wear in a particular lighting environment. They also have cheaters, small reading glasses that magnify for people that don’t qualify as visually impaired, but still need assistance in certain circumstances. I hope headphones like AirPods Pro do get considered to be mobility aids. They’re expensive, and not everyone who would benefit from them can afford them.
In any case, I wanted to write in because I feel the conversation about loud environments and their impact on blind people’s enjoyment of them is one worth having.”
I read a lot of tech news, and I read an article about this very subject just the other day, essentially saying that earbuds and headphones are increasingly being used to cope with noisy environments. It’s kind of sad that it’s necessary, but it is obviously true that sighted people are doing a lot more lip reading than they may even realise.
Voice message: Hi, everybody. It’s Robin Christopherson from the UK here.
I’m just providing a couple of comments on, I think, it was Amanda’s input about finding it challenging to hear conversations particularly in noisy environments and then your comments, Jonathan, also to that effect, particularly at the NFB convention.
I think we’ve all been there. We’ve all experienced how challenging it is to hear conversations when there’s loads of other noise going on and particularly, other conversations, too.
Judith, my wife, who can see and hear fine, still needs to be able to lip read to hear what people are saying. So she says, “Let me put my glasses on so I can hear you.” So yeah, she pops her glasses on and then she can hear people much better, because everyone, I think, is subconsciously getting reinforcement by seeing people’s lips move.
So yeah, I feel doubly disabled when I’m in a noisy environment, and I can’t hear what people are saying and I also can’t see anything either.
I did hear once (and I’ve never had an opportunity to put this into practice) that in a noisy environment even where the acoustics are bad, if there are lots of people talking then if everybody just spoke very softly and quietly, everybody would be able to hear everybody else perfectly well. But as I say, I’ve never had a chance to put that into practice. It would require somebody telling the whole room to try this, and we would be able to see if it worked. But certainly at a vision loss convention, for example, that might be the perfect opportunity to put that to the test and see whether that actually did work.
The thing about noisy conversations in a noisy environment where everybody’s talking is that the volume levels tend to go up and up, as people need to talk louder and louder to even be able to hear what their neighbours are saying because everybody’s talking loudly. So if anything, it tends to go the other way. So it would take a lot of self-control on the part of all people present to see if we could put that to the test and see if it actually works.
The final thing I want to say is around active noise cancellation. Now obviously, if you’ve got AirPods Pro, for example, or the Sony Link Buds, or the Jabra Evolve Buds or something like that that have in-ear squidginess which blocks out the outside world and also active noise cancellation, then that can really help. But often it doesn’t give you the directionality that you need.
And I just want to quickly mention the HeardThat app. HeardThat, all one word. I’m not sure if it’s been mentioned on the show before. It’s a free app. And what it does is it gives you the active noise cancellation that you wouldn’t otherwise have in your earbuds of choice.
Now, this only works if you’ve got in-ear earbuds with the squidgy ear canal tips, a bit like the AirPods Pro, though those already have active noise cancellation. You can get other earbuds that block out the outside world that may or may not have good active noise cancellation of their own.
Now, the good thing about the HeardThat app is if you’ve got these kinds of earbuds but they’re not really doing it for you in terms of sophisticated noise cancellation, maybe giving preferential pass through to the vocal range so that you can hear people, then this HeardThat app has really sophisticated algorithms that just concentrate on the vocal range.
There is an all voices mode and there is a directionality mode. And if you put it in that one and you have one person in particular that you want to listen to in a noisy environment, then you can point the phone’s microphone at that person, trying to get it as close as possible, I guess. And the microphone will be directional and the noise cancellation is incredibly sophisticated. There is a slider on the main screen between zero and a hundred. And if you slide it all the way up to a hundred, you will literally just hear that person and all extraneous noises and hopefully, many of the other conversations as well will be suppressed almost to silence in that directional mode.
In the all voices mode, you’ll get all the voices, all the conversations, but you won’t get a lot of the, in the case of a banquet, cutlery and crockery, clattering noises, etc. All of those will be suppressed.
So yeah, give the HeardThat app a go, and it will give all in-ear earbuds sophisticated active noise cancellation capability.
I will say, though, you only get 30 minutes a week free of charge. Otherwise, there are 2 tiers of subscription: 9.99 a month, or 99 pounds or dollars a year. But it’s well worth it if it’s going to really help you and you find yourself in those environments on a regular basis. But yeah, give the freebie version a try when you’re next in a really noisy environment.
Jonathan: Robin, this is gold. Gold, I tell you. I have not heard of this app before, HeardThat all one word.
So I downloaded it to check it out, and I’ve learned a bit more about it that may be interesting to some people.
First, it is available for iOS and Android, and you can go to the website at HeardThat.ai for information, including a very cool interactive demo.
You adjust the slider just like you adjust the slider in the actual app, and it filters out the noise. The HeardThat app is really designed for anybody who wants to hear better in noisy environments.
If you have Lightning EarPods, you can use those. If you have Made for iPhone hearing aids, you can also use those. The Made for iPhone hearing aids introduce a wee bit of latency. It’s similar to the latency you may have experienced if you use the Live Listen feature, which is available to AirPods users as well as Made for iPhone hearing aid users. I just find it enough to be a wee bit disconcerting, but I wouldn’t call it a show-stopper.
If you’re listening through certain types of other Bluetooth devices that are not MFi, then you may find the latency even worse. It could be really disconcerting, but I guess the only way to find out how bad it is is to give it a try. Even if you don’t have any of the models of earbuds that Robin is talking about, you can use this thing. You can even use it with headphones, just basically anything that plugs into the Lightning port or pairs with your iPhone via Bluetooth.
With my Made for iPhone hearing aids, for example, I can, if I want to, completely mute the microphones and just get audio from the iPhone, which would make it ideal. Although as Robin says, if you do that, you lose the directionality. Again, I don’t think that’s necessarily a show-stopper if you know the people around the table, or if you’re just talking to one person. You can find out where they’re talking from before you point the iPhone in their general direction.
There are two modes – directional and omnidirectional. At least on iPhone, when you’re in directional mode, it is the top front microphone that is being used. So if you lay the iPhone on its back, it is the top of the iPhone that you need to point at the person speaking, which makes sense.
When all voices are switched on, that’s the omnidirectional feature, the algorithm still works. You’ve still got that slider. It is accessible.
The premise of this app is that with the power of today’s smartphones, you can do a lot with machine learning, and these smartphones can be harnessed as very sophisticated hearing aid filtering type devices that can separate irrelevant noise from relevant speech. And the developers claim that they can do a better job of that than many hearing aids can, because of the power of the processors on these devices.
How well does it work in practice? Well, Robin seems to think very highly of it. I can’t tell you my own experiences yet because Bonnie and I just haven’t had an opportunity to go out to a noisy environment yet. I’m really excited about trying this app, and I can’t thank you enough, Robin, for sharing information about it with us because I’ve not heard of it before.
I must say, I’m cautiously optimistic that this could be a real game changer, but I am used to many hearing aid manufacturers and hearing products over-hyping and under-delivering. This sounds like it might be a really good app, though.
And the other good thing is that Robin is correct. You only get 30 minutes a week in the free mode. However, when you install it on your device for the first time, you get 30 days of unlimited usage. That should give us ample time to try this thing in a range of environments.
What I also found in my very quiet environment is that if you set this to all voices mode, it is listening very carefully for speech. So you may find that this is also a very useful tool in a meeting. If you put your iPhone or your Android device in the center of a table and put it into all voices mode, and you have trouble hearing people far away in a meeting environment because you’re hearing impaired, I have the feeling this may even do a better job than some devices like the Roger On.
But we’ll see. I’ll let you know how I get on when I take this thing places.
But I’d like to know how you get on, too. If you have difficulty hearing in these noisy environments and you give this app a try, HeardThat, just search for that in your App Store or Google Play and get lots of information on their website at HeardThat.ai. Let’s compare notes, find out how much of a game changer this really is.
Voiceover: Like the show?
Then why not like us on Facebook?
Learn about upcoming episodes, and see when a new episode is published.
Recently, I read a fascinating article in Ars Technica which is a very credible technology publication, and they do a lot of in-depth stuff. And this was extremely in-depth about the way that large language models work. Large language models are the kind of technology that ChatGPT and Google’s new Bard technology are using.
What really stood out for me about the article was a statement that humans no longer fully understand why these large language models are making some of the decisions they are making, which is quite awe-inspiring and perhaps a wee bit frightening, depending on how you view it.
I raise all this because Haya Simkin is writing in from Israel about ChatGPT. He says:
“I’m at the end of a university assignment that is supposed to be 4 pages long, where the font is supposed to be David and the size 13.
I also work as a translator and spend a lot of time reading briefly from the source file, going to the translation file, and typing. Wash, rinse, repeat. It’s all very slow.
And so I got curious about how long these things take most average blind people. I tried to look this up years ago when, during the worst job interview of my life, I was asked to take a typing test. I managed to type at 30 words per minute when they wanted 38. Never mind that typing tests are inaccessible, since sighted people read with their eyes and type simultaneously with their hands, while blind people read with their hands and type with their hands. At any rate, after I didn’t get the job, I was curious about the average typing speed for blind typists, and couldn’t find anything.
Being frustrated with my seemingly slow progress on my short paper, I asked ChatGPT how many words could fit on 4 sheets of A4 paper using the David font at size 13, as per the instructions of my assignment. And it said approximately 2000.
I then asked how fast a blind typist could type and edit this with a Braille display and a screen reader. And what follows is its answer. I would like to know if this rings true for you or other listeners and whether or not it matters if the speech is turned off, since I generally turn it off.
What follows is the answer. Notice how ChatGPT handles the word Braille.”
Okay, I will check that out.
“Typing with a screen reader and Braille” (with an uppercase B). Oh, look at that, go ChatGPT! [applause sound effect]
“Typing speed with a screen reader and a Braille display can vary depending on your familiarity with the system and the complexity of the text you are typing. Generally,” says ChatGPT, “typing with a Braille display might be slightly slower than conventional typing, especially if you are not very experienced with the Braille system. However, some individuals who are proficient in Braille can achieve relatively high typing speeds once they get used to it.
Assuming a moderate typing speed of around 25 to 30 words per minute with a Braille display, it might take approximately 66 to 80 minutes to type 2000 words.
Editing text with a screen reader and Braille display involves navigating through the content, identifying errors, and making corrections. The editing speed using a Braille display can also vary depending on your familiarity with the Braille system and the complexity of the editing task.
As with typing, the editing speed with a Braille display might be slightly slower compared to visual editing with a keyboard and monitor. For an average editing speed of around 800 to 1200 words per hour, it might take approximately 1.67 to 2.5 hours to edit 2000 words.
It’s important to note that these estimates are based on general assumptions, and individual performance may vary. With practice and experience, you can improve your typing and editing speeds using screen reader and Braille display, just as one can with conventional methods.”
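If you want to sanity-check ChatGPT’s arithmetic above, it holds up: the estimates are just words divided by rate. A quick sketch (the function names here are purely illustrative, not from any app):

```python
# Sanity-check ChatGPT's time estimates for a 2,000-word document.

def typing_minutes(words, wpm):
    """Minutes to type `words` at `wpm` words per minute."""
    return words / wpm

def editing_hours(words, words_per_hour):
    """Hours to edit `words` at `words_per_hour` words per hour."""
    return words / words_per_hour

words = 2000

# Typing at 25 to 30 wpm: roughly 67 to 80 minutes.
print(round(typing_minutes(words, 30)), "to", round(typing_minutes(words, 25)), "minutes")

# Editing at 800 to 1,200 words per hour: roughly 1.67 to 2.5 hours.
print(round(editing_hours(words, 1200), 2), "to", round(editing_hours(words, 800), 2), "hours")
```

So the numbers are internally consistent; the real question, as discussed below, is whether the assumed rates are realistic.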
Well, thanks for that, Haya. I mean, that seems extraordinarily slow to me on both counts, but I don’t actually know what’s average.
What is the average Braille reading speed? I don’t actually have that data. So maybe it’s right.
It’s certainly not reflective of the way that I personally work, but I acknowledge I’m a very fast Braille reader. So I guess we’ll open it up and see what people think of ChatGPT’s little missive on this subject. But it does get the kudos for spelling Braille with an uppercase B. Congratulations!
Thanks to Michael Babcock for sharing this with us.
Michael: Today, I’m gonna walk you through the process of configuring VoiceOver portable preferences to be used on a portable drive.
Now, the portable device that I’m using is kind of extreme. I think it’s 4 terabyte. Yeah, it’s a 4 terabyte external hard drive, but this is just for demonstration purposes. You definitely don’t need something that big. [laughs] You actually can put your portable preferences on a 2 gigabyte thumb drive if you’d like, and take them with you. That’s the advantage to this.
When I turn VoiceOver on, it doesn’t do anything different. It just says VoiceOver on. But if I press VO F8, …
VoiceOver: Opening VoiceOver utility. VoiceOver utility window. Utility categories table. General, selected. Has keyboard focus.
Michael: and select the general category, you’ll see an option in this section.
Now, why might you want to do this? Let’s say like myself, you’ve gone through and spent the time to configure your numpad commander, your trackpad commander, and your keyboard commander, and you’ve configured the verbosity and made all these configurations to your VoiceOver.
And then, you use a spouse’s Mac, or you use someone else’s Mac, and you’re like, why is it doing this this way? Well, portable preferences is the answer. With portable preferences, when you turn VoiceOver on, it will ask, “Do you want to use the portable preferences on”, and then insert the name of the drive that’s connected. VoiceOver is smart enough to realize that there are portable preferences in the root directory of the drive you’ve connected.
So for example, if I jump over to Finder, …
VoiceOver: Finder. External SSD window.
Michael: And I’m on my external SSD, and I tap the letter V.
VoiceOver: external. Parallels folder. Slack, application.
Michael: It actually takes me to Slack, which doesn’t make any sense, but that is showing you that there’s nothing in here. So if I command tab back over to …
VoiceOver: VoiceOver utility. VoiceOver utility window. Utility categories table, general.
Michael: And then I VO right arrow or in my case, use numpad 6, we’re going to locate …
VoiceOver: Portable preferences off, image.
Michael: And that says portable preferences off.
Let’s press VO right arrow, …
VoiceOver: set up button. Set up portable preferences.
Michael: And we’re going to choose the set up portable preferences option. Press VO space.
VoiceOver: In dialogue. Volumes table, select an external drive.
Michael: So, now this is telling you select an external drive, so I’m going to tap the letter X.
VoiceOver: External SSD.
Michael: And then, press VO end.
VoiceOver: Okay, default button.
Michael: And we’ll press VO space.
VoiceOver: Stop button. Closing dialogue. Stop button.
Michael: Now it says stop. Set up portable preferences. And it says set up portable preferences. If we press VO right arrow, …
VoiceOver: Keeping your voiceover preferences up to date on the drive. External SSD.
Michael: And we’re going to now go back over to VO left.
VoiceOver: Stop button. Portable preferences, on.
Michael: Portable preferences is on. Now, anytime you make a change to your VoiceOver preferences, those changes will be reflected on the portable preferences file that’s now been stored on the root directory of your external drive.
Let’s take a look at that. Command Q.
VoiceOver: Finder. External SSD window. Slack, application.
Michael: And then if I press V now, …
VoiceOver: Voiceover folder.
Michael: There’s now a VoiceOver folder. We can go in there.
VoiceOver: VoiceOver 4.portable group. 1 of 1.
Michael: And there’s VoiceOver 4.portable.
Now you might be saying Michael, this is good, but I just need a backup of my VoiceOver preferences so when I get a new Mac, or if I reset my Mac, then all I have to do is import it. And that’s great. That works. Let me show you how to do that with VO F8.
VoiceOver: VoiceOver utility. VoiceOver utility window. Utility categories table.
Michael: And then, we’ll go up to the file menu.
VoiceOver: Menu bar. Apple. File.
Michael: And move it down.
VoiceOver: File menu. Import preferences. Export preferences. Command shift E.
Michael: So you can use command shift E to export your preferences. You can use command shift I to import your preferences, so VoiceOver always works the way you expect it to.
Voiceover: Living Blindfully is the podcast with big dreams, and no marketing budget. Oh, dear, dear, dear.
This is where you come in.
If Living Blindfully is something you value, we really would appreciate you taking a few minutes to write a positive review of the podcast. Positive testimonials help spread the word and encourage new listeners to give us a try.
If you have a moment to give us a 5-star review, we really would appreciate that. Thank you for supporting Living Blindfully.
Josh Kennedy writes in. He says:
“Are you familiar with guide horses, and using small, 24-to-32-inch miniature horses or equines as guide animals? And do you know Anne Eadie and her guide horse, Panda,” which, I suppose, is better than having a guide panda called Horse, “and Mona Ramoney and her guide horse, Callie?
I myself am very interested in possibly getting a guide horse someday.
Do you know of any trainers willing to train one for me, and to provide financial assistance for finding the right horse, buying it, training it, and perhaps assistance with veterinary, farrier, and dental care?
If not, could you at least devote one of your podcasts to guide horses? I really want to spread the word about guide horses. And I think, since lots of blind people listen to your show, that the Living Blindfully podcast may be a great way to do that.”
Well, I’m not sure about an entire episode, Josh, but we’ll certainly devote this segment to it as best we can.
Bonnie Mosen is not only very attuned to guide matters, but she’s also a horse fan. A horse fan! So I thought I’d get Bonnie back in the studio to talk about this.
I remember, Bonnie, during the Blind Line days, so we’re talking almost 25 years ago now, guide horses were a real thing. A lot of people were talking about them for a while.
Jonathan: And I kind of wondered how they worked. Are they traditional horses? Like if you were to run into one at an NFB convention, would you think, “Oh, there’s a horse in here.”?
Bonnie: Yeah. So let me just start out a bit and talk a little bit about my background. I’m a lifelong equestrian, so I grew up around race and show horses. I’ve ridden hunter-jumper. I’ve also done dressage, and I’ve owned racehorses. So I have quite a bit of knowledge around horses. I also have a minor in equine science.
So we’re not talking big horses like Marcus, who’s over 16 hands. We’re not talking about the size of Marcus.
Jonathan: This is Marcus the racehorse, for those who haven’t kept up.
Bonnie: We’re talking about miniature horses that are about 24, 30 inches tall, maybe the size of a large dog.
There was a school, I believe it was in North Carolina, called Guide Horses that was training them.
There are a few people, he mentioned Anne Eadie, I don’t know what she’s doing now. She had a horse called Panda.
Bonnie: There was also a woman down in Florida who had a horse called Confetti. There’s also a Muslim lady that has a guide horse. I haven’t heard much about that lately.
Jonathan: And that’s an interesting point. So that would get around some of the concerns that people of that faith have with dogs.
Bonnie: Yeah. And I believe there’s a BBC journalist in Britain that has one.
There’s a lot to unpack here. Horses are not natural animals to live in houses, so you have to look at a lot of things.
A guide dog, they’re totally different kinds of animals.
The advantage that some people believe to having a guide horse is that they live a lot longer.
Bonnie: A horse can live to be 30 years old. However, your circumstances might change in the time you have one.
And you’ve got to look at the whole public perception, too. Getting a horse on an airplane could be difficult. They can’t lie down. Where are you actually going to keep the horse?
The woman that I’ve heard about in Florida actually kept the horse inside her house. It stayed in a closet. But that’s not really a natural environment for horses. Horses need to go out and graze.
Jonathan: But couldn’t we say the same thing about dogs? That a house is not really a natural environment for a dog?
Bonnie: Well, I think it’s different with a horse because they’re just totally different kinds of animals.
Jonathan: How able are they to obey commands? I mean clearly, they must be able to be trained to respond to commands and navigate.
Bonnie: They are smart. Horses are smart. None of mine I would trust to guide me, though. [laughs]
Jonathan: Right. But if you trained one and gave them the training, is their training readily available anywhere?
Bonnie: Not really. I mean, you could probably find some trainers that would do it. But in terms of a school that you could go to, there wouldn’t be, to my knowledge.
Jonathan: Do they have horseshoes on, so they’re sort of clopping along?
Bonnie: Some of them wear sneakers.
Jonathan: What? Wow.
Bonnie: Or a special kind of sneaker.
Now, I do know that one of the guide horses was at an NFB convention, and the horse was absolutely terrified and was kicking out at some of the guide dogs. It was a miniature horse.
Because horses are flight animals, and so they can be a bit spooky. I mean, I think if you lived in Montana or lived on a big farm, it might be OK. But I can’t imagine working a horse in Boston, or New York, or even Wellington, New Zealand.
Jonathan: How do they guide you? Do they have a harness?
Bonnie: They have a harness, yeah.
I’ve never worked with one, but from what I understand from things I’ve read about them, they do have a harness.
Jonathan: Could you ride one as well?
Bonnie: Not a miniature, no.
Bonnie: Not if you were an adult size. You couldn’t. Now, there was some woman that did supposedly have a horse that she rode around.
But also, I think what some people don’t realize is horses are very expensive to keep.
Jonathan: Well, if you did the numbers though, and you are say, 20 years old and you get a guide horse that takes you through to when you’re 50, then presumably, the numbers are in their favor.
Bonnie: It could be. But then, you know, it’s the same thing with a guide dog. Horses interact with people very differently than dogs do. So a lot of it would depend on what you’re wanting it for. I mean yes, you could have a horse as a companion, but it would be very different from a dog. It’s crazy enough when you take a guide dog in somewhere. But if you can imagine bringing a horse somewhere, you know, into a restaurant, it’s just not a normal environment for a horse.
Jonathan: Walk into the restaurant and say hey.
Bonnie: I mean, the person that I heard about in Florida that had a horse called Confetti, she would, you know, rock up to the restaurant and tie it outside like you were at the old hitching post.
Jonathan: Right, right.
Bonnie: Yeah, and I don’t know.
Jonathan: And so would you necessarily need to keep your horse inside?
Bonnie: Well, I mean, I don’t think Panda stayed inside. I’m not sure.
But again, you’d have to have the land to keep the horse on. A lot of it is the type of surface that the horse is on. And my friend Lisa could speak more to this. But when you look at a barn and the stalls, there are different kinds of straw and footing. So living in a house with a certain kind of flooring, plus the damage the horse could do to the house, you know.
They put diapers on them. And some of the guide horses, I think they had a diaper on them when they were out walking in public.
Jonathan: So can they not be trained to indicate like a guide dog that they need to go out?
Bonnie: I don’t think so.
Jonathan: Maybe others will know a bit about guide horses, too. I do, as I say, remember it coming up. And there was a lot of buzz about this about 25 years ago, I remember, when I first started doing Blind Line.
Bonnie: Yeah, yeah.
Jonathan: So it’s funny how these things come around.
Bonnie: It was a guy that, … There was another user that lived in Maine that had a horse called Cuddles, I believe was his name.
Jonathan: [laughs] That’s nice.
Bonnie: But again, when I was living in Boston, people had, you know, they’d kind of, …
I think it was Cuddles. It was a guide horse. Again, the horse was absolutely terrified.
I think if you lived in Montana or somewhere like that, … But I mean, being a lifelong equestrian, I just, I wouldn’t do it.
Bonnie: I mean, that’s my, that’s my thoughts on it.
Jonathan: Well, if anyone else has thoughts on it, opinion@LivingBlindfully.com, if you would like to contribute on this whole guide horse discussion.
And yeah, I thank you for sharing your perspective on that.
Advertisement: Living Blindfully is brought to you in part by Aira, and I thank them for their sponsorship of the podcast.
You know we’ve become used to some businesses offering free Wi-Fi. It’s a nice touch, and it makes us feel valued whenever we come across it.
And I know similarly that when I learn about a business that has purchased Aira Access, it’s like putting out a massive “Blind people are welcome here.” sign. I know that if I need it, I’ve got a trained professional agent available to me to provide assistance, and that means that the business cares enough to pay for that. I appreciate that.
From airports, to Starbucks, to Target, and more, Aira Access can assist you to navigate, shop, browse and not be reliant on friends, family or others who may not understand our needs as well. And don’t forget that as well as the offerings in physical locations, there are other businesses providing Aira Access that can be used from home.
So you should check out the app to find out what’s available. To do that, just open the Aira Access section in the Aira Explorer app on your favorite device. You can also visit Aira’s website to find out more at Aira.io. That’s A-I-R-A.I-O.
Jonathan: We are in a candy store, a toy shop at the moment because we’re surrounded by Orbit equipment. And also, of course, Access Mind too, because we’ve got Venkatesh and Adi from Orbit Research and Access Mind. Welcome to you both.
Venkatesh: Thank you.
Jonathan: Yeah, yeah. Good to have you with us in person, no less.
And last time we were talking, we were discussing the Optima and various other products. Who shall I start with?
Adi, how’s the Optima going? I see you’ve got the thumb keys. That’s fantastic.
Adi: Yeah, we’ve added thumb keys, thanks to this guy. I mean, I’m just the small person in the combination. Venkatesh’s team is doing the majority of the work. [laughs] But we did add thumb keys, and we’re moving forward with the software part, too. And of course, the hardware part you already saw.
In terms of the software, you saw our sort of notetaker experience that we are starting to work on, in addition to Windows. And yeah. For now, we’ll be showing the thumb keys as an improvement. Things are coming along pretty well.
There are some challenges, but we hope that we will be on track in the best way we can. But that’s to show you that feedback is critically important. A lot of people ask for thumb keys, you have them in less than 3 months after.
Jonathan: Well yes, I’m heartened by that. And look, it’s an amazing piece of hardware. Just actually using one and seeing Windows on it, you’ve got the notetaker environment there for those who want that, and it’s really easy to switch out to JAWS. So it’s coming together. And the idea that you can just have this device and it’s full Windows, no compromises, no proprietary stuff unless you want that, no kind of shell to worry about. Man, it’s exciting! And with pretty impressive battery life.
Adi: To hear this from you, Jonathan, is quite an achievement.
Jonathan: Yeah. Well look, I’m actually excited about this product.
Adi: To hear from you that you like the hardware means a lot to me personally, at least.
Jonathan: [laughs] Well, I wonder. It’s a big step, isn’t it, for a professional to replace a trusted brand of laptop with a device like this. It’s a huge ask, isn’t it? It requires a lot of trust.
Venkatesh: It does. It absolutely does. And we expect people will want to think hard about that and weigh the pros and cons. And we feel that I think, many people will agree that the unique things that it offers, such as the form factor, the battery life, the built-in Braille, and having a singular device instead of having multiple ones will convince them that it is a device that they should at least try.
Adi: And the future-proofing.
My main mission with this project, … I’ve been wanting to do this for years. This is not a secret. I think that, and we discussed this with you, Jonathan, we were talking here a lot for a while.
I think that unfortunately, we don’t expect enough from our assistive technology that we have today in the market. Why do we have to buy a $6,000 piece of equipment that from the moment you purchase it, has stuff that is 3, 4, sometimes even 6 or 7 years old on board? These things don’t make sense to me.
And while, you know, many arguments can be made that it is difficult to learn to use a computer with a screen reader, you know what? In the end, a blind person who wants to get employed will have to do it anyways. Regardless of the notetaker or other solution that they start from, they’ll have to do it anyways. So why start from a platform which you know will be obsolete for you?
For example, even with the basic stuff, take one of the notetakers I have (I’m still using one).
Recently, Microsoft changed their email protocol access due to new login policies on Exchange servers, and I lost access in the notetaker’s email client because the manufacturer chose not to update it. And while I can go to the Play Store on that particular device and download Outlook, that defeats the whole purpose.
Adi: I mean, if I paid for a proprietary experience, I expect it to keep working. I don’t care whether or not the manufacturer has the resources to do so.
If you send us to use mainstream things anyways, why not start from them? And why not be based on them from day 0 and then these things won’t happen?
So the ultimate goal, the reason we started this company together and what we’re doing first with the Optima (and it will continue, I hope, to other areas), is to make these things run on modern mainstream hardware. And more importantly, to redefine what we as consumers expect from these things. Because personally I think, and Venkatesh will agree with me, and I’m sure you, Jonathan, might as well, we should expect more for the money we pay. I’m sorry.
Jonathan: Oh, for sure.
And the other thing. I’ve just come from the Employment Committee meeting here at NFB. And we know that for all the advancements that we’ve had in technology, there does not appear to have been a commensurate improvement in employment, even though there are so many more paperless offices. We’ve made so much progress.
Adi: Yup, yup.
Jonathan: And it is incumbent upon us, I think, mindful that attitudinal barriers are still significant, to reduce our own barriers as much as possible. And the reality is, the majority of the world’s workplaces use Windows. So if we can, …
Adi: Go ahead and try to integrate one of these notetakers into a workplace. Good luck to you doing that.
Jonathan: Right. So we’ve got to sort of take charge of what’s within our control, right? And if we are proficient with our Windows technology, we’ve got good Microsoft Office skills, it’s not a slam dunk because there might be some proprietary stuff that’s being used in some of those workplaces that will cause problems and might require scripting.
Adi: Yeah. Correct, correct.
Jonathan: But you’re a lot further down the track if you’re using a device that’s mainstream, so it is an exciting product in that regard.
Adi: And also, we want to leverage it down to other areas. For example, it’s not going to be available from day 1, but we learn from the big guys. I mean, I have big software hopes for this. My idea is, and we of course share it with Venkatesh. Our idea is to create an ecosystem of products that will, you know, talk to each other, seamlessly sync with each other, and connect to the mainstream altogether.
So for example, if I create a Braille file, even I come to my PC, I want the same thing to be available on my PC and later on, mobile platforms, too. So our goal is really to make this behave, look, and be based on mainstream concepts through and through. This is what we believe the future is.
Jonathan: When I go into that little notetaker kind of environment, I guess, would you call it a notetaker environment? What would you call it? It’s kind of like a shell on the Optima.
Adi: It is essentially a set of Windows apps that are fully accessible and simple to use, just like what you have on these other notetaker devices. We call this Braille UI.
Jonathan: Yeah, okay.
Adi: This is our name for the software experience. It contains several parts. It contains our apps, which you can use either in full Braille UI mode, which means that, for example, instead of your desktop and the Start menu, you have this simple apps list that you saw, like on any other notetaker – a simple menu with File Manager, Word Processor, this kind of stuff, and they will be customizable. Or the apps themselves will be able to be used individually without this mode, so you will be able to choose to operate your device as a laptop, but open our apps individually.
For example, if you want to use the Braille editor to create a Braille file. We’re going to have a podcast player, we’re going to have a Braille calculator, and an app that accesses the blindness libraries to read books from. So we are going to offer all of the Braille-specific functionality that is missing, but still heavily Windows-based. That’s the point.
Jonathan: And when I go in there, it is running JAWS, so it’s not self-voicing and self-Brailling, which is quite good. And I presume you can also just choose to have NVDA there as well.
Adi: We have installed NVDA by default, yes. Now this is running JAWS, too. JAWS is the one I use the most myself. [laughs] But yeah, we’ll have both NVDA and JAWS working, with tweaks to each of them. So with the software we’ll also provide, the support between the screen readers will be as consistent as possible.
Jonathan: Okay, that’s good.
And I was arrowing through and seeing all the applications that looked very standard, very easy to use. Just arrow through, press Enter.
You’ve got cursor routing keys above the display, and in the latest version, the thumb keys.
So what will the word processor type product be like on this?
Adi: Similar to your KeyWord – KeyWord and the BrailleSense word processor that you have today on these, too. So essentially, BRF creation and reading, of course, of BRF files, converting standard docx and doc files to BRF and vice versa, you know, the standard.
Jonathan: When I save, what format is it saving in?
Adi: It will save to various formats. You will be also able to save to Word.
But this is not a Microsoft Word replacement, okay? We don’t target or advertise it as a Microsoft Word replacement. It will have all the same formatting capabilities that the other notetakers have in their word processors. But if you are really heavily relying on Microsoft Word, you should continue using Microsoft Word.
That’s our approach. The word processor that we’re making in the Braille shell will be primarily for those who need a dedicated Braille environment.
Jonathan: What I really liked was that you could bring up a list of all the applications on your device in the shell, and it’s a nice, easily navigable list. They’re in alphabetical order, I think, aren’t they?
Adi: Yes, they are in alphabetical order. The idea is that when your device is configured to operate in Braille UI mode, there’s no desktop and there’s no Start menu. So you are really interacting only with that list. If you come to the Optima from another notetaker, it’ll be a piece of cake for you to get used to.
Jonathan: The little list of applications there reminded me a bit of the Leasey start menu concept.
Jonathan: I’m a huge Leasey fan.
Adi: I use Leasey, too. [laughs]
Jonathan: Yeah. I’ve been looking forward to having Leasey on it. Can you imagine the immense power? If you are a JAWS licensee, so you put your JAWS that you already have on this device, and you’ve got Leasey on there as well, on this sort of form factor where you can just carry it around, I can imagine that Executive Products would want to manufacture a case for a device like this. So you have it over your shoulder. You could potentially use it while it’s over your shoulder, open up the case, and you’ve got the power of Leasey, of JAWS, of NVDA, whatever your preference is, without any kind of compromise at all. It’s actually pretty mind-blowing.
Adi: That’s right. And we are going to make some things global. For example, we are going to have a global module for displaying system tray icons, which will be screen reader agnostic. Even if you’re on Narrator, this will work. Same thing for the window list, for the taskbar, and some other tweaks.
One thing that is very important is context-sensitive help. Everyone knows that. So we’ll have a global key for context-sensitive help for various apps, which again will be screen reader agnostic.
So in addition to the power that you’ll have based on the screen reader you choose to use, our stuff will be completely the same in terms of operation, regardless of which one you pick to use.
Jonathan: And Venkatesh, the machine that you’ve been tantalizing me with, which I know you’re going to take away when you leave, so I’m keeping you here as much as possible and I might hold you hostage. It is using the Orbit cells. And I made the comment that it’s been a while since I’ve been up close and personal with Orbit cells. They’ve definitely become a little quieter over time. And I don’t know whether everybody realizes that.
Venkatesh: That is absolutely true. We’ve been working continually on our Braille technology. We manufacture each and every part of that Braille display. And so that gives us the ability to tweak things, to improve things like power consumption, the sound that it makes, reliability, etc.
So over the past 7 years that the Orbit Braille products have been on the market, we have made steady improvements, and we continue to do so.
Adi: 7 years? Wow! Time passes fast.
Venkatesh: Time passes. Yes, indeed.
Adi: Yeah.
Jonathan: Yeah. So that’s pretty impressive.
You’ve also got the OrbitSpeak here, and I’ve had so many listeners expressing interest in this.
I’ve got to say, the first thing that happened to me when you put it in my hand was, gosh! This is cute.
Jonathan: This is a really cute device. Very reminiscent of the Braille N Speak days.
Adi: You really want to keep it?
Jonathan: Yeah, yeah. [laughs]
Where are you at with this product? Because I think, last we talked, you were hoping to actually have it for the conventions. But it’s not out quite yet, right?
Venkatesh: That is correct, yeah. We were hoping to have it for the conventions. But as you know, development, especially hardware and of course software as well, is a complex process, and has been made much more difficult by the pandemic, supply chain issues, etc. So all of those have caused us to have some additional delays beyond what we originally anticipated.
So we’re currently projecting having these available for shipping sometime in the September-October time frame.
Jonathan: So I powered this thing up and arrowed through it. It’s pretty responsive. It’s running Vocalizer voices, and it’s got quite a good little suite of products.
What does it do? You’ve got calendar, notetaker-type thing, media player. What else have I missed? Lots of other stuff.
Venkatesh: Yes. So we’ll be offering this in two different models.
The standard model will offer a note taking editor, a book reader which will read text and BRF files, a calendar, an address book, clock and alarm.
The elite model will add to that a media player that can play common media files – MP3s, AACs, etc. It’ll also have the ability to read more advanced file formats, such as PDF, DocX, EPUB, DAISY, etc, and library access, so libraries like Bookshare, NLS Bard.
Jonathan: Got NFB Newsline on this, too?
Venkatesh: Yes, absolutely.
And it has internet radio, a podcast player as well. And the platform is designed to be very very powerful. And we expect it to be very capable for the next several years to come. And we’ll continue to add more capabilities to it in software.
It’ll also be possible to upgrade from the standard model to the elite model, purely by software.
Jonathan: What operating system is it running under the hood?
Venkatesh: Under the hood, it’s running Android.
Jonathan: Oh right, it is. Okay. Are you keeping that Android current, or does it matter?
Venkatesh: It doesn’t really matter. That’s a great question, because we are putting a lot of effort into making the apps so that the user has a really, really good user experience without using TalkBack.
Jonathan: And you don’t envisage it ever being able to shell out to TalkBack?
Venkatesh: That’s a great question. I would not say that we don’t envisage it ever doing that. It is an option that is always available to us. It’ll depend on how the market reacts to a possibility like that.
Jonathan: Are you hedging your bets a bit? Because it strikes me this is the absolute antithesis of everything the Optima stands for.
Venkatesh: [laughs] Yes, indeed. I’m hedging it because though we currently do not intend to make it a completely open Android platform, there may be situations where there are applications which we do not wish to create ourselves, but where an available app might be very nice to have and for which TalkBack support might be necessary. So the platform fully allows it. How we deploy it, how we position it is something that remains to be seen.
Adi: And these are also slightly different markets for these devices.
Jonathan: Very. Yeah. Because on the one hand, you’re saying mainstream devices are where it’s at. We should embrace the mainstream. And then you’ve got this quite proprietary market.
Adi: First of all, it’s also based on mainstream techniques. This device also contains Android under the hood, at the end of the day. These are Android apps and so on. And the hardware (I know, because I know the insides of this) is even better than what’s in your typical, more expensive Braille display that you buy for more money. So in terms of the hardware, Orbit did an amazing job keeping it mainstream and modern.
But these are two different markets. You must not forget that this is a device that is designed to be used primarily as a simple notetaker on the go: to grab a quick note, to read a quick file, and then to close it, to shut it down. It’s not your serious workhorse like these notetakers or the Optima are intended to be.
This is why I don’t like the name notetakers, because this is a notetaker. The OrbitSpeak is a notetaker. [laughs]
Jonathan: This is a notetaker, yeah. This reminds me of the old Braille N Speaks where the idea is you just want to fire this thing up and jot something down with as minimal fuss as possible.
Venkatesh: Just like Braille N Speak, it instantly turns on. You can capture a note right away, shut it off, and put it back in your pocket.
Adi: This is why I really think that the term notetaker is not appropriate for the high-end devices that the Optima aims to replace, because they became Android tablets. All the other notetakers around are essentially Android tablets, right? And this is exactly, Jonathan, what you’ve been talking about. But they’re Android tablets that lag behind the mainstream very significantly, so they should not be called notetakers. This is a notetaker. The Orbit Reader 40 is a notetaker. Your other displays like the, I don’t know, Brailliant or the, what else? The Mantis.
Jonathan: The Mantis, yeah.
Adi: These are notetakers, not what is marketed as a notetaker.
And this is a part of what we are trying to do with Access Mind, and also Orbit from their side. We are really trying to change the way that people think about these things, because people should think about them as they are, not as they are marketed for.
Jonathan: This OrbitSpeak is just, … I can’t get over the form factor. You can imagine it being put in a pocket, in people’s purses, that sort of thing.
How much bigger is it actually than the Orbit Writer?
Venkatesh: It’s a little thicker, roughly twice as thick, and just slightly bigger in the other dimensions, literally by about a quarter inch. It’s very, very compact. Yeah.
Jonathan: So who’s it for? In this day and age when people have so many options – iPhones, and goodness knows what, who do you hope will buy this?
Venkatesh: So we know that there are a lot of people who love the Braille N Speak, so that’s one of the primary user communities that we expect will like this device. The reaction from the community has been very very positive.
Jonathan: So others are like me when they get it in their hands? Do they exclaim and say wow? [laughs]
Venkatesh: Yes. Precisely, precisely, and when they hear it described.
We also expect that people who really want to have a dedicated device that they can instantly use without having to mess around with a screen reader or a touchscreen experience would like to use this.
Jonathan: How do I get stuff off the device if I’ve written a document? Is it going to be easy to get it onto a computer, and what format will that be in?
Venkatesh: So it’ll let you store files in the various formats that it supports, you know, text and BRF files. If you have doc files that you’ve somehow downloaded to it, those will also be transferable back to a computer.
You can do it either over USB (so when you plug it in over USB, it appears as a mass storage device and you can drop the file over), or through the SD card as well.
Jonathan: And you’ve got Bluetooth. And that means that you could potentially pair this with speakers, with a QWERTY keyboard as well, I guess?
Venkatesh: Absolutely, absolutely. It has a fully functional Bluetooth interface, so not only can you pair it with anything that you would typically pair your phone with, but you can also connect it to other Braille devices such as our Braille displays. So you can actually have speech output and Braille at the same time, if you choose to.
Adi: There you go.
Jonathan: We are hoping to get a review unit of this, and we’ll definitely put it through its paces extensively because I think this is going to be very well received by a lot of people. It’s really exciting.
It is like being in a candy store because we’ve also been having a look at the Graphiti tablets, and we didn’t discuss the Graphiti much at all when we last talked. Can you talk to me a bit about Graphiti, who it’s for, what it does?
Venkatesh: Sure. So the Graphiti is an interactive tactile graphic display. It’s got an array of 2,400 pins arranged as 60 columns and 40 rows, and it is intended for anybody who needs access to graphics. It could be students learning STEM, it could be professionals who want to use it for visualizing floor plans and architectural drawings, it could be someone who programs and wants to know what the user interface they’ve created looks like, what that layout is.
It has a touch interface. So the entire graphic area is actually touch-sensitive. So you can actually draw on it, you can use it to move the mouse and your cursor on your screen.
It has several interfaces that make it a lot more powerful than what the device itself is in stand-alone mode. In stand-alone mode, it can display any kind of popular picture format – JPEG files, bitmap files, etc.
But when you connect it to a computer or a phone through a standard HDMI cable, it becomes a tactile monitor. Now, the possibilities are limitless. Whatever appears on your computer screen or your phone screen, you can instantly get a tactile rendition of that.
And most importantly, the Graphiti and the Graphiti Plus are the only devices that offer our tactuator technology, which allows each pin to be set at multiple heights. Our pins are unique in that not only can they have a projection of up to two millimeters, they can be set independently to different heights, to different levels, which can then be used to depict color, or texture, or importance, or any kind of topographical feature. You can use heights and levels to depict any kind of abstract or real characteristic of the picture.
It also has a touchscreen to allow you to draw on it. So you can actually draw with your fingers, save the files, edit them, annotate files. So the possibilities are truly limitless. People have used it for everything from visualizing protein molecules, to drawing floor plans, to playing games like Tetris. So you can connect it to a computer and have a program running, a Python script someone created to play animated Tetris in real time.
Jonathan: We were having a really interesting discussion. Bonnie’s here as well, and we were all talking about how some people just get these tactile diagrams and thrive with them. And I’m so delighted that this technology is available for people who do.
I don’t know whether it’s because I’m congenitally blind, whether it’s some spatial thing. I’ve always struggled with understanding what a tactile diagram signifies. And Adi, who’s a very bright guy, was sympathizing with this position. And even Bonnie, who’s got a visual memory because she went blind when she was 8, was saying a similar thing. You’ve been thinking about that too, and you’ve got a device that seeks to bridge that gap. And I guess hopefully, over time, when you get more exposure to these sorts of tactile images, you might start to better understand what they’re signifying.
Venkatesh: Yes, absolutely. What you’re describing is definitely true. We have come across it through the years that we have had the Graphiti, both in field trials and as a product.
What we find is that a little bit of a hint, or a description of the overall image, goes a long way towards allowing someone to then interpret tactilely whatever is being displayed. And we have several ideas on how to leverage things that are now becoming very commonplace, such as AI for image description, to enable precisely that.
So we pop up a picture of a scene, let’s say a landscape with mountains and trees in the background, and a pond. And if you have the description, then exploring the picture makes it a lot easier to interpret which part is where.
Also, in both the Graphiti and the Graphiti plus, we have included image processing filters, which make the images a lot easier to understand in tactile form. Not every image is tactile-friendly. But with the filters, you can make it a lot more tactile-friendly.
Jonathan: It was fun drawing on this thing.
I know there’s a lot of talk about this sort of field at the moment, particularly because of the publicity surrounding the Monarch initiative, which is a collaboration of several entities including DotPad, Humanware, and APH. What do you think? And given that your product’s been around for a couple of years, the Graphiti, what are they trying to add, do you think? Is there anything of significance that you feel Graphiti isn’t already doing?
Venkatesh: Well, the Graphiti offers many unique things. It offers pins that are very very strong, so you can have your hand on the display as the pins are changing state. And that is really really powerful because for dynamic graphics, real-time dynamic graphics, you need to be able to feel what is changing. If you were to lift your hand away, and then the picture changes, and you put your hand back, it’s much much harder to understand what has changed. So with that capability and the multi-level capability, we feel that the Graphiti and the technology used in the Graphiti basically provides everything that one needs for graphics.
Now, one thing that the current form of the Graphiti cannot do, in its standard form, is display Braille, because the pins are equidistant and they are a little bit wider apart than traditional Braille. So to allow people to experience both Braille and graphics simultaneously, we created the Graphiti Plus, which adds a line of Braille.
Technologies such as those used on the Monarch have pins that are closer to each other, at a pitch of 2.5 millimeters, which does allow Braille characters to be shown.
However, any technology that is designed for graphics which requires equidistant pitch cannot faithfully represent Braille because the spacing for Braille is different within pins of one character and between characters.
Jonathan: You were saying that you believe there’s the potential in the right economic climate for the cost of Graphiti to be quite competitive.
Venkatesh: Yes, absolutely. Like most other products, the economies of scale are a big driving factor in achieving an affordable price point. We have worked very hard, as you know, on the Orbit Reader products to keep them very very affordable. And one of the ways we were able to do that was by having relatively good volumes so that procurement costs for parts and materials would be reasonable. We would like to do the same for the Graphiti.
Right now, they are being made in very small quantities, and that’s why the cost is pretty high. And with larger quantities, even moderately large quantities such as a few hundred units, the cost can be brought down substantially.
Jonathan: It’s an interesting product segment, and I’ll be fascinated to see where it goes. But particularly for STEM subjects, that kind of thing, it’s got a lot of potential, hasn’t it?
Venkatesh: Absolutely. The majority of our customers use it for STEM. A majority of them are in college.
But it resonates very well with professionals as well, especially software developers and people who are doing animation. We have some artists, actually, who are interested and have tried it out with amazing results, where they’re able to use the drawing feature to draw on the Graphiti.
Jonathan: We also have been playing, or I’ve been playing with the Orbit Writer, which is a product that I remember seeing when it came out. For those who get on well with a Perkins-style keyboard, this is a very cute little device.
And it just does one thing, really, doesn’t it? It does it well. And that is to allow you to input in Braille into any number of devices. What devices does it work with?
Venkatesh: It connects to pretty much every device that is on the market today – every mainstream device.
So for the Orbit Reader, over the years, we have developed a very comprehensive ecosystem of screen reader support. So the Orbit Reader family works with TalkBack on Android, VoiceOver on iOS and macOS, on Windows with an editor, JAWS, NVDA, and Dolphin Supernova, and even with BrailleTTY on Linux, VoiceView on FireOS, ChromeVox on Chrome. So pretty much every platform is supported for the Orbit family, and the Orbit Writer leverages this ecosystem to work with pretty much every hardware platform.
Jonathan: So does that mean I can also use it with the Fire TV Stick with VoiceView on there? That’s pretty cool! So you can kind of use it as a remote control when you’re on the couch? [laughs]
Venkatesh: That’s right.
Jonathan: How many devices can you have it paired with at one time? Can you switch between devices?
Venkatesh: Yes, you absolutely can. It can pair with 5 devices over Bluetooth, and 1 over USB.
Jonathan: And the price point. How much is that?
Venkatesh: It is $99 US.
Jonathan: This is a pretty good deal. And you just put it on a table, it’s a Perkins-style keyboard.
Can you use that over your shoulder? Does it have a case or anything like that?
Venkatesh: Yes, it has an attachment point to which you can attach a lanyard, and you can hang it from your neck and use it in that position. There are also cases available for it from us, as well as from others, which allow you to use it while it’s suspended from your neck or shoulder.
Jonathan: That’s pretty cool!
And we’ve been looking at the Orbit Reader as well, the 40-cell version. Is there anything new with that? Do software updates come out from time to time that add functionality?
Venkatesh: Yes, absolutely. In fact, we continue to support not only the Orbit Reader 40, but even the Orbit Reader 20. It remains in active production. We have no plans of discontinuing any of the Braille display products.
And yes, absolutely, we continue to enhance the software, you know, fix issues and enhance functionality from time to time.
Jonathan: That’s fantastic. I see there’s a lot going on.
When can the public have the Optima? When? When?
Adi: That’s a question I want to answer, too.
Venkatesh: So we projected at announcement time that we would be launching it in the first quarter of next year. And that is still the plan.
Jonathan: You’re so confident you can make that?
Venkatesh: We are fairly confident.
Adi: I never want to risk it and say, you know, yes, for sure. I’m saying this is our target. I’ve learned from, you know, past experiences. Things can slip sometimes. One thing’s for sure: we’ll release it completely ready for prime time.
Venkatesh: And we will keep the community posted about, you know, what’s going on and where we are, you know, if there are any changes in the schedule.
Adi: And the community will also be involved, like we have been involving the community with the hardware. See, for example, the thumb keys, right? We’ve implemented that.
We also want the community to be involved in software. So very soon, we’ll be starting user meetings to demo the software stuff that we have been working on and that we’ll continue to be working on and to, again, gather feedback and keep everyone happy and informed until the release.
Because it is a product that is developed for the community and by the community, in every sense. Most of the people who write our software, at least the Windows parts of it, are completely blind. And I am managing the development of the whole product itself. And of course, Venkatesh is managing both our company and the production of all of this, with Orbit’s side doing the hardware for us. And that is really commendable.
So we really hope that this will change not only the way and the things that we expect from our assistive technology, but also we are hoping that this will be a product that will really encourage the community to be involved because nothing will be changed if we will not push for that change to happen. And this is a very important point for me to emphasize, and I thank you that we have your platform to do that.
If we, as consumers, do not challenge the vendors that produce the products for us, or the mainstream vendors that have decided to enter the accessibility space, things will stay in the condition they are in right now. If we don’t make a very substantial change and have higher requirements, the notetakers, the $6,000 devices, will stay limited in the functionality that they have because it goes well, it sells well. Companies, you know, produce something that works for the user, but I don’t think that they necessarily care about being the latest and the greatest and offering the level we expect because, you know, they have some business models that they have to fit. And we as customers need to show everybody that hey, the times have changed. The requirements have changed. We should not be left behind. And there is really no reason for us to buy a device that costs so much and gives so little in return.
And this is why I decided to start a new assistive technology company, because I feel that what is out today is not the assistive technology of the future, it is reinventing again what was available in the past to a certain degree.
Take a look at the Mantis. Take a look at the other Braille displays that have these notetaker capabilities inside of them. It’s essentially a modern version of the Braille Lite, for instance, right? Or whatever was around back then. And people market this as, oh, this is revolutionary. This is smart, intelligent. I’m sorry, but this doesn’t fit 2023 and beyond. And we’re here to change that.
And I could not be more proud of the fact that as soon as I talked to Venkatesh about this, he immediately agreed to work with me on these projects, because they really care, Venkatesh and his team. And this is why I chose to work with him. Primarily, he really cares about the users. He does things for the users. Orbit Research does things for the users.
And I believe that this is the way to go. If we do not disrupt the market with something that will change the game, things will not happen. And I’m really thankful to Venkatesh for allowing me to do this, because if not for his motivation and their production capabilities, I would not have been able to.
I’m a good product idea person. I know how I want things to work. I know what needs to be done, but I cannot, of course, produce on my own.
But what’s really important is that it needs to come from the field. If we do not speak up, things will not happen. Things will stay the same, and we’ll be complaining.
People complain to the vendors, you know. Oh, this doesn’t do this, and this doesn’t do that. Yes, you’re right. But you don’t challenge the vendors to make it happen.
Jonathan: You made a couple of references to price there. When will you be more specific about your own pricing for this product? How much is it going to cost us?
Adi: [laughs] I’m afraid we will only be able to give definite answers closer to production.
Venkatesh: Yes. The target is very clear. We want it to be more affordable than the notetakers.
Adi: And we insist on that.
Venkatesh: We will absolutely make sure it happens.
Exact pricing has got a lot of variables. As you know, there’s a lot of configurability, and we’re still several months, a few quarters, away from production.
Adi: We still have a lot of adjustments to make, both in terms of components we use and external stuff, you know, that needs to be…
Venkatesh: So there’s a lot of stuff that needs to be worked out.
Jonathan: Yeah. I mean, that’s going to be absolutely critical, isn’t it? And there’ll be lots of interest.
Adi: We will announce the pricing as soon as we can. That’s for sure.
Jonathan: Right. Preferably exclusively on Living Blindfully first.
Adi: [laughs] We really don’t want to keep anyone waiting. This is, again, like I said, we know that this is needed. We know that there is a lot of anticipation for it. I started this because I need it for myself. That’s the main point. So we’ll keep everybody involved.
Jonathan: I love all of this innovation. And it’s been a very pleasant … We spent a good couple of hours just geeking out over some of the stuff before we turned the recorder on. So it’s been a pleasure, an absolute pleasure to catch up with you both. I’m very much looking forward to keeping in touch and finding out what happens next.
And I hope you enjoy the rest of the convention.
Venkatesh: Thank you. The pleasure has been ours. And we really appreciate all that you do, Jonathan.
Adi: Thank you for giving us this platform. What you’re doing for the community is really commendable.
Jonathan: Thank you.
Advertisement: Transcripts of Living Blindfully are brought to you by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at PneumaSolutions.com. That’s P-N-E-U-M-A solutions dot com.
Let’s do some reminiscing about old iOS apps, as they used to say, gone from the charts, but not from our hearts.
Rich Yamamoto starts us off.
“This discussion on old iOS apps of the past reminded me of a few that I used to have on my old iPod Touch 4th generation. I don’t think any of them are around anymore, unfortunately.
I used to use Fleksy VO as a third-party keyboard. I loved being able to type in the general vicinity of the keys and having it predict the words I wanted to use.
FlickType succeeded it, and it used to be available as a system-wide keyboard for iOS until late 2021, I believe. I don’t remember all of the specifics as to why it became unsupported, but I just know that it was quite ugly indeed.”
Yes, indeed. Fleksy started off as an app for the blind community. And then, they really branched out and they created this universal Fleksy keyboard, which was not particularly accessible.
And then, around about I think it was 2013 or so, maybe 2014, they introduced this Fleksy VO thing, and there was mixed response to this from the blind community. A lot of people said, “We don’t want some sort of segregated app. You had an app before Fleksy that was fully accessible. Now you’ve got this blindness-specific app that kind of keeps us in this box. What’s up with that?” And people felt a bit grumpy that they had essentially been beta testers.
And then, the blind community became too small for them. For Fleksy, when universal keyboards became a thing in iOS (and they haven’t always been a thing), that was the big game changer, and that’s when they tried to branch out.
And then, one of the developers of Fleksy went on to do FlickType. And there has been, and there may well still be a dispute between Apple and the FlickType developer, which we have covered in a previous episode of this podcast. It’s all very unfortunate.
Rich says, “I also used the LookTel app.”
That’s right. I remember LookTel.
“The VoiceOver tutorial and the money reader that they had available.
I never got around to using Recognizer, but I heard that it was quite handy for its time.
The VoiceOver tutorial was, for all intents and purposes, my first audio game that I played as well. It had little adventure exercises that were very fun to mess around with.”
I don’t think I ever used the VoiceOver tutorial from LookTel. But I did use their main LookTel app, and I remember us labelling all sorts of items and using that with LookTel, and it was very effective.
“Of course,” says Rich, “I played the games available from Something Else. These included Papa Sangre 1 and 2, Audio Defence, and The Nightjar. I’m really mad that I never got around to beating Papa Sangre 2 before they all disappeared from the App Store. Very unfortunate.
Sixth Sense was a fun zombie shooter as well. And I distinctly remember sitting in an airport with my iPod and freaking out strangers around me because I wouldn’t have headphones in, and I’d just be playing along without a care in the world.
I used VIA from the Braille Institute to access a resource of accessible iOS apps. I don’t know why, but for some reason, that thing was quite the resource hog. VIA was how I discovered the games Spellstack and Stem Stumper. Those were fun, though they didn’t run very well on my iPod. It was a shame to see them go.
I think that’s all of the apps I can remember using at the moment. If I think of others, I’ll write back.
I will say that I got Audio Defence and The Nightjar back on my iPod 6th generation via Alt Store, and old IPA files from the Wayback Machine. Sort of hacky, but I suppose it works.
I’m interested to hear what others used. I’m sure I left plenty out of my listings.
Those apps were great for their time, and the one I miss the most is FlickType’s system-wide keyboard. If you know of another accessible third party keyboard, I’d love to hear about it.”
Thanks very much, Rich!
I was a big user of Fleksy until Braille Screen Input came along. And then, that kind of did it for me. Braille Screen Input was just there, and when it works, it works very well, so I haven’t really investigated anything else since.
Some great memories there, indeed.
Voice message: Hello, Jonathan and Living Blindfully listeners. This is Bryant from Idaho.
I wanted to, first of all, thank you, Jonathan, for the rebranding of this podcast. I really like the name Living Blindfully, and I think it speaks to a lot of what is covered on this podcast and the important issues we discuss every week. So thank you for the rebranding. And thank you also for your continued commitment and the time that you spend on this podcast.
I wanted to continue a discussion that was started on last week’s episode about apps that we fondly remember using in the early years of iOS.
When I first got iOS, I was using an iPad in 2012. And that was, of course, back when the iPad and the iPhone ran the same operating system. So you had iPad and iPhone, both running iOS.
There was an app that I used to use on the iPad a little bit. I believe it was called SayText. And if I remember correctly, SayText did something similar to what KNFB Reader did. I believe it was a scanning app.
It was probably available on iPhone, too. But back then, I didn’t have an iPhone.
So you would hold your iPad over something, and I believe it would speak the text of the document.
Another app I used to use was a game called Smack Me. Now, Smack Me is similar to what ZanyTouch is. It is, or was, a Bop It-style game. And you had commands such as Smack Me, where you double tap the screen; Shake Me, where you shake your iPhone; Freeze Me, where you do nothing; and Pinch Me, where you pinch the screen.
And Smack Me became unavailable when iOS stopped supporting 32-bit apps, because Smack Me never had a 64-bit version. It was always a 32-bit app.
Another app I remember is YouPlayer. Now, YouPlayer was a YouTube player.
And so you opened YouPlayer, selected the channel you wanted to play videos from, and then started playing the video.
And it had several options. I believe there was a speed control, and there was a time increment setting so you could fast forward or rewind by percentage. I believe it was every 10 or so percent, but I think you could change that in the settings. So every swipe down would go back 10 percent, and every swipe up would go forward 10 percent.
At the time, I felt like YouPlayer offered a better YouTube experience because it seemed to be more accessible, and it offered more options for YouTube videos.
Jonathan: Thank you so much for those great memories, Bryant, and for the positive feedback about the podcast. I really appreciate that.
I kind of remember the pre-KNFB Reader days on the iPhone as being a bit of a wasteland for OCR apps that worked well enough for blind people to use them consistently and reliably.
And I remember feeling like I’d given up something pretty important when I went from Symbian to an iPhone, because with the KNFB Reader, although it beeped and took a long time to do things and carried on, we still had access to the printed word in our pockets. You could take a picture, and then go away and make several cups of coffee sometimes, depending on the complexity of the document. But often, ultimately, it got there with reasonable results.
I do remember SayText and thinking, yeah, this is probably one of the most promising ones of the bunch. But I still couldn’t get that same reliability.
I remember one called Text Detective as well that was in the space.
And then when KNFB Reader came along, it really changed everything. It was an expensive app, but I don’t really regret paying for it because it just gave such quality access.
I remember the first picture I took with KNFB Reader when it dropped on the App Store, and we were all waiting for it. It was really exciting! And I took this picture, actually, of my daughter’s school report. [laughs] She probably wishes that KNFB Reader hadn’t quite been as good. And it read it flawlessly. And I’m thinking, wow! Okay. We are off to the races with OCR.
So now, there are free apps that do an even better job because cameras, and processing, and technology have moved on. But I think we owe an enormous debt of gratitude to KNFB Reading Technologies for getting us to the point where we realized that at last, we had something that was truly viable.
I do also remember YouPlayer. That was a great app. And no question. At the time when it was out and given the then state of the YouTube app, YouPlayer was a much more accessible, powerful experience that allowed better navigation and a whole bunch of other perks as well.
And I still miss that app. Oh.
And on that note, we will leave you for this week.
Thank you so much for all your contributions. I really enjoy receiving them, so keep them coming in. And we’ll be back with you next week.
Remember that when you’re out there with your guide dog, you’ve harnessed success. And with your cane, you’re able.
Voiceover: If you’ve enjoyed this episode of Living Blindfully, please tell your friends and give us a 5 star review. That helps a lot.
If you’d like to submit a comment for possible inclusion in future episodes, be in touch via email. Write it down, or send an audio attachment: email@example.com. Or phone us. The number in the United States is 864-60-Mosen. That’s 864-606-6736.