Transcripts of Living Blindfully are made possible by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at http://PneumaSolutions.com.

You can read the full transcript below, download the transcript in Microsoft Word format, or download the transcript as an accessible PDF file.


Contents

Welcome to 265

Advocacy Comments in Episode 263

Two Users of Framework Computers Comment on Their Experiences

Inaccessible Date Pickers on Travel Agency Websites

Using Emoji on Mac and Windows

What Are Accessible Options for Those Interested in Crypto Currency?

Samsung Accessibility Uses AudioEye

Questions About the iOS Health App, and the Need for an Apple Watch

Blind People Generating Images With AI

Suno.ai, All the Subscriptions, and Improvements in Mainstream Gaming

Shubh Mittal Tells Us About the Seleste Glasses

Feedback on the Seleste Glasses, and a Minecraft Accessibility Mod

Wanting to Learn More About Multiline Braille Displays

How Can I Advocate to Save Gmail’s Basic HTML?

Volunteering to be in a Be My Eyes Group Gives People More Than They Bargained For

Update on New Guide Dog and Technology for Developing Countries

Problem with My iPhone SE

Sound on My Dell Computer

Blindness Terminology and New iPhone 15

Closing and Contact Info


Welcome to 265

[music]

Voiceover: From Wellington, New Zealand, to the world, it’s Living Blindfully – living your best life with blindness or low vision. Here is your host, Jonathan Mosen.

This week: with only so much time and energy, how do we pick the things we’d advocate on? How good are Framework computers? Using emoji on Mac and Windows. And the Seleste smart glasses offer custom-built hardware for a monthly subscription.

A warm welcome to episode 265.

The North American area code 265 is not assigned. It’s up for grabs, listed as available for geographical assignment, so I look forward to finding out where 265 eventually goes, if that happens in my lifetime. [laughs]

Meantime, the country code 265 belongs to Malawi.

Long ago, I met somebody who worked in Malawi, and they quite enjoyed it.

So I thought we might ask our friend ChatGPT to tell us what it can about Malawi. So let’s see what we got.

Jonathan: Give me some facts about Malawi, please.

ChatGPT: Sure. Malawi is a fascinating country with a rich culture and history. Here are some key facts about it.

Location

Malawi is a landlocked country in southeastern Africa, bordered by Tanzania to the north, Mozambique to the east, south, and west, and Zambia to the west.

Capital

The capital of Malawi is Lilongwe, which is also one of the country’s largest cities along with Blantyre and Mzuzu.

Lake Malawi

One of the country’s most notable geographic features is Lake Malawi, which is among the largest lakes in the world and home to more species of fish than any other lake, including over 1,000 species of cichlids.

Population

Malawi has a population of over 18 million people. The country is known for its friendly people, often referred to as the warm heart of Africa.

Economy

Agriculture is a major part of Malawi’s economy, with tobacco, tea, sugar, and corn being some of the primary crops. However, the country faces challenges such as poverty, health issues, and reliance on agricultural exports, which are susceptible to global market fluctuations.

Biodiversity

Malawi is rich in biodiversity, with several national parks and wildlife reserves. It has a wide variety of wildlife including elephants, hippos, big cats, and hundreds of bird species.

Culture

Malawian culture is diverse, with various ethnic groups each having their own traditions and languages. English and Chichewa are the official languages. Music and dance are important cultural expressions in Malawi, with traditional and modern styles influencing each other.

Jonathan: Thank you, GP!

I call her GP now, you see, because I spend so much time interacting and engaging, and she gives me useful information a lot of the time, not all the time. You’ve still got to be a little bit reticent, you know.

But that was good, and I understand why the person that I talked to all those years ago who did some work in Malawi enjoyed it so much.

If you’re listening from Malawi, a special welcome to you.

Advertisement: Transcripts of Living Blindfully are brought to you by Pneuma Solutions. Among other things, they are the producers of RIM (Remote Incident Manager). And that just keeps evolving. It keeps getting better and better. There are some new features I want to tell you about in RIM this week.

First, if you need to, RIM can now be installed without admin privileges, and that’s a big deal in certain work situations. It works unconditionally if you’re the controller. If you’re the target, there will be a couple of limitations, and you can find out about those on the RIM website. But this is a big step forward.

Second, it is scary to me that I think that Pneuma Solutions are reading my mind or something because just recently, I was assisting Bonnie with a remote task, and I prefer to do that from my own computer. And I thought, “Wouldn’t it be great if we could have voice chat going in an unattended session?”

Because Bonnie’s given me permission to have her ThinkPad on my desktop. I just press Enter on that, and I can help her out with any tech requirements that she has. But I can’t initiate a voice chat that way. Well, at least I couldn’t. But now, I can.

So this is an option that you have to control session by session, obviously, because you don’t want to spy on the person whose machine you’re controlling.

So if you do want to initiate a voice chat after you’ve connected in an unattended session, the owner of the target machine will get a message saying, “Is it Okay to start a voice chat?” And they have to consent to that. So this is a very cool addition to RIM.

For our friends who use Macs, there’s good news for you all around, because controllers can now get a list of running processes and applications on the Mac side. What’s more, Mac users can now easily and completely uninstall RIM via an easy access button in the About window if they run into problems.

And finally, RIM weekends are now starting earlier. I mean most of us, when we’re chilling on Friday night, we consider that the weekend, right? So RIM has got with the program. And from 7 PM on Friday nights, if you’re part of the weekend plan where you have unlimited access all weekend, you now have unlimited access starting at 7 PM in your timezone.

To find out more about RIM, visit GetRIM.app. That’s G-E-T-R-I-M.app.

Advocacy Comments in Episode 263

Voice message: Hey, Jonathan. This is Sy writing in from New York City.

Wanted to address something you said on episode 263, in the middle of the segment on the application and Copilot keys with Microsoft.

It doesn’t actually have to do with those keys. But you said at one point that basically, when we don’t take opportunities to do advocacy for ourselves and something negative happens because we didn’t take that opportunity, that we have nobody but ourselves to blame, and that struck me as harsh. And I wanted to [laughs] write in, or call in, or whatever this is, email a voicemail in and say with all due respect, I disagree.

I think we ought to keep the blame for things that happen to us that are discriminatory in the place where it belongs, which is with the people who did the discrimination in the first place.

I don’t take a lot of the opportunities that I have to do advocacy for myself because if I took all those opportunities and advocated in all the cases where I was being discriminated against in various ways in my software, and hardware, and other ways throughout the day, I wouldn’t frankly have time to do anything else. And I have other things to get done.

And also, it can be exhausting. We didn’t put ourselves in these situations where we have to advocate just to get the same things that everybody else has. Other people put us in that situation. So I think we ought to be extending ourselves all the grace in the world.

And obviously, an application key on a keyboard might be a relatively small thing, but the idea applies much more broadly.

So anyways, I just wanted to say that I so appreciate what you do on the show, and on the BlindPodMaker email list too. I’ve benefited from that enormously, and I thank you very much.

Jonathan: Sy, I appreciate your message very much. It is the archetypal example of a great Living Blindfully contribution. We’re debating something that affects many of us daily, a real dilemma that we confront often as blind people, and we’re disagreeing without being disagreeable.

So thanks very much for taking the time to record the contribution, and for the tone of the contribution as well.

One day, we will actually have Living Blindfully merch. We are working on this. And if we had it now, I would send you something in honour of that contribution.

I don’t completely disagree with what you’re saying, so let me expand at length on my original comments and respond to yours. We may meet in the middle, or we may not, but the conversation will certainly get more people thinking, and that’s a good thing. It’s what this podcast is all about.

I agree with you that we do have to pick our battles, and there may be multiple calculations that we make when we decide to pursue something, or not to pursue something. You’re right. If we take on every act of discrimination, we’d be stressed, we’d be bitter, and we’d be exhausted.

So if you learn that Microsoft might take your application key away, and to you this isn’t a big deal (and that seems to be what you’re saying in this particular contribution), I agree, there’s no point in expending energy over something that doesn’t have a high impact for you.

I view this specific example differently. I think it is a big deal.

There are some applications where the application key does different things from Shift F10. You can also use the application key in conjunction with Shift, Alt, or Control, in some applications, to get different context menus. And in fact, in newer versions of Windows 11, pressing Shift with the application key gets you right into the full context menu, not the abbreviated one that’s there by default now. I think there are a lot of screen reader users who don’t fully appreciate that Shift F10 and the application key are not identical in their function all the time.

So for me, as someone who values efficiency and for those with dexterity issues who may not have the ability to press Shift with F10 because it takes two hands, this change is consequential and detrimental.

But sure. If you’re chill about this, maybe you’ve been functioning without an application key for years and you don’t know what you’re missing, then absolutely fair enough. Why waste your precious energy over something that you don’t care about? So sure. In that case, pick your battle. Sit this one out. Keep your powder dry for something that really matters to you.

But the comments I made in 263 weren’t directed at someone who holds the position that you do on the issue. My comments were directed at those who feel that the issue is of consequence, but just believe, without any evidence or without checking, that the problem will magically sort itself out.

I also agree with you fully that we should not indulge in victim blaming. If someone’s discriminated against, then the perpetrator of that discrimination is absolutely to blame.

But I view an issue like this one as different from, say, a ride share driver who refuses to let you into their vehicle because you’re traveling with a guide dog. That’s a blatant act of discrimination, and it’s a violation of the law in every Western country, at least.

In this case, you have a decision that’s driven purely by marketing, and the efficiency of screen reader users is an unfortunate and totally unintended casualty.

Microsoft is investing a lot in AI. They have various AI products, all called Copilot. (Actually, it reminds me a little bit of the Monty Python Bruces sketch. Anyone who hasn’t heard that, look it up in your streaming music service of choice. It’ll be entertaining for you.) Everything relating to AI at Microsoft is called Copilot these days. So naturally, the marketing people and the bean counters want people to use Copilot.

Now, they can already get to Copilot from the keyboard. If you’re running a new build of Windows, you can press Windows with C. And there are other Copilot commands in specific applications, like Alt I in Microsoft Word, if you’ve enabled Copilot Pro by paying for it.

But the marketing people want a button. They want a button on your keyboard that has Copilot written on it, in the hope that curious people will say, “Ooh! What does this button do?”, and press the key, like the feature, get hooked, and pay for premium. Yay, capitalism!

So in the case of the Copilot key, there’s nothing nefarious going on here. There’s nothing intentionally discriminatory. But I feel confident that the people who dreamed up this concept didn’t contemplate what the disappearance of the application key might mean for some screen reader power users.

Now, I don’t wanna get too specific about who I’ve been talking to regarding this issue at Microsoft because I don’t have their consent to do that, and I contact them every so often when I think the issue truly warrants it, and it’s been an open, respectful, receptive discussion. But now, Microsoft has a better understanding of this issue from the perspective of at least some screen reader users. And I’m not the only screen reader user who has been in touch on this and who has had dialogue with Microsoft on it.

And this brings me to another point, which maybe we’ll agree on. The point I was making last week was that if you care about an issue, but you just have faith based on no evidence at all that things will work out, in my view, that’s a dereliction of duty. That’s the point that I was making. Because if everybody cared about an issue but no one acted on their concerns, we would not make progress.

And I hear you, Sy. I hear you resoundingly about the fatigue that we can all feel about seemingly having to fight these battles.

But we are a minority. Our needs are not well understood. And if we don’t articulate them, then we’re not going to make progress. This is particularly important when a feature’s in the planning stage, and there is time to have our needs accommodated. However, if you’ve verified that an organization or individuals that you trust are in dialogue about an issue you care about, then there may not be value in you spending your own precious time, and energy, and stress pursuing it further.

If you care deeply about an issue though, I would advise checking whether that’s the right strategy in a given situation. I can tell you that certain companies are very sensitive to volume. They want to know how many users feel this way, and fair enough too. I’ve been an advocate for a long time and I know how to put my case. But when you have many people competing for the allocation of precious resources, sitting on the sidelines because you think someone else has got this may not help you get the result you want.

I also understand that some people feel nervous or ill-equipped when it comes to expressing in writing how they feel and what action they’d like. And other than going in there and doing the hard work oneself, I think one of the greatest gifts an advocate can give is self-advocacy skills. For those who have the will but not necessarily the skill, then those of us who are advocates should be willing to teach and encourage and talk about how you get results, how you make your case effectively. This is also where consumer organisations and well-crafted petitions can help in terms of the volume issue.

So that is a much lengthier exposition of the point that I was making in 263. And you still may disagree with me, and that’s what makes the world an interesting place.

But I thank you once again, sincerely, for the opportunity to think further about this and elaborate on what is an important principle of advocacy.

[music]

Two Users of Framework Computers Comment on Their Experiences

Vaughn Bennison says:

“Morning, Jonathan, from not-so-sunny Hobart,”

Ooh! I hope you did get some sun over the break (because this was written in mid-December).

“Responding to your request about Framework computers.

Upon completion of the build of my studio in November last year, I moved away from permanent laptop use to a desktop computer, which has greatly aided in my recording and post-production work with Reaper. In my professional role working from home, it hasn’t made a lot of difference.

But not having a laptop has meant that I have been unable to work easily from other places. So upon consultation with my elder daughter, I decided to try a Framework laptop. I’ve always enjoyed building my own PCs and have done so for many years, so the idea of choosing and assembling my own laptop was extremely interesting to me.

I ordered one of the new Framework 13 machines with the AMD processors. I ordered 32 gigabytes of 5,600 megahertz RAM, an AMD 7840 processor, and originally a 1 terabyte SSD. Notably, the AMD version of the Framework main board uses DDR5 RAM, which significantly increases the memory bandwidth available to each core of the processor.

I decided whilst I was waiting for my pre-order, about 3 months, that given that I could, why not go to a 2 terabyte SSD? I was easily able to purchase one from a typical computer store. It is also worth noting that Framework now reports that they have stock of components, so the 3-month waiting time was only because I pre-ordered.

My machine arrived a couple of weeks ago now, and my daughter and I assembled it immediately. There is an enormous amount of packaging. But as June pointed out, each aspect of the machine is available for individual purchase, so it’s not surprising.

The instructions for assembly are very accessible. They are written on the website, and there are videos which demonstrate each aspect of the assembly process. I don’t know how accessible they are, as I didn’t actually watch them.

When you look at a Framework laptop, interestingly, the main board is attached to the bottom of the machine. This differs from most laptops in my experience, where everything is attached to the plate which holds the keyboard and touch pad. As many options for keyboard exist, Framework has moved their components to the bottom of the machine to make replacement or substitution easier. The only tool you need is the Framework screwdriver, which has a double-ended bit for different purposes. I have not replaced the keyboard, but if they ever come up with a mechanical keyboard, I’ll be in.

As a blind person, I was able to complete all aspects of the assembly unaided, except for the screw which holds the M2 drive in place. I think I would have got it in time but as the tolerances are so tight, I found it difficult to line it up properly. Everything else was simple, though, and it is very obvious where things need to go. The connector to connect the input panel, i.e. keyboard and trackpad, is very straightforward, and lining the input panel up is really simple and guided by magnets. Then you simply screw in the captive screws, and attach your chosen bezel, in my case, black.

Setting up Windows requires some work if you purchase the AMD version of the machine. This is because the drivers for the WiFi module are not available in Windows 11 yet. I used Rufus to create Windows installation media because it allows you to remove the requirement for a Microsoft account, and does not poll the internet for updates. Once the media was created, aside from the initial need for sighted assistance to select the installation drive, I installed using Narrator and the Framework driver bundle, and everything went smoothly.

In terms of usability, I have found the Framework exceptional. It has excellent performance, and the fans are quite quiet most of the time. I’ve read reviews that the Intel version runs the fans rather a lot, but mine certainly doesn’t have that problem.

The only issue I experience is that the machine has an annoying reluctance to sleep. It locks the screen, but the processor keeps working, particularly when plugged into my dock. At the moment, only one of the onboard microphones appears to work, but I’m chasing down that problem.

The microphone does sound pretty good. The camera and microphones have manual switches to turn them off. They are on the top of the bezel. And strangely, the microphone switch is on when switched to the left, while the camera switch is on when switched to the right.

Framework has upgraded the speakers, and these are quite serviceable. Not a patch on the MacBook speakers, but a significant improvement on the Dell XPS 13 speakers.

The upgraded battery is good, and battery life, while not outstanding, is certainly up there. Framework claims around 11 hours.

WiFi and Bluetooth performance is excellent.

The expansion card system is excellent. I purchased 4 USB-C ports, 1 USB-A port, an HDMI port, Ethernet port, and a microSD card reader. It’s really cool to carry extra ports around with you that you can swap out any time. The process is simple, but requires a bit of strength and dexterity.

I am also aware that someone has built a 4G modem which fits into the expansion slot, but I’ve not yet purchased one.

They are also in the process of building a full-size SD card reader.

The fingerprint reader in the power button is excellent, and the machine is Windows Hello compatible.

The keyboard is good, although there is no applications key. SharpKeys will assist here. I hope they will come up with a mechanical keyboard option at some stage. I purchased a small Keychron” (that is spelt K-E-Y-C-H-R-O-N) “Bluetooth mechanical keyboard, which works well.

We have considered disconnecting the screen to see if this saves battery life. But I suspect this will disable the camera and microphones, so as the battery life is pretty good, I’m not in a hurry to do that.

I’m really keen to see how the Optima takes advantage of the Framework ecosystem. Their website hasn’t been updated for a while but according to the dimensions listed, the Framework main board would fit into the case with battery below. I wonder if I could buy the Optima components and swap out my Framework main board. That would be interesting to try, as it appears the battery life on the AMD version is better than the Intel version.

I’m very happy with my purchase.

The shape of the machine is slightly different from many other 13-inch machines because the aspect ratio of the screen is 3 by 2. This means that it’s slightly deeper from front to back than I expected.

It’s very light and feels absolutely solid.

It has ventilation slots on the bottom, and the processor doesn’t seem to get too hot.

The RAM seems very efficient, and resource monitoring indicates it averages around 4 to 6 GB in use, so 32 GB is probably excessive for me. The processor rarely gets above 3% usage.

The speakers point out of the sides, which gives really good stereo separation. But the headphone socket works well, so you might never use them.

I’ve taken to using NVDA, and have had absolutely no issues. Reaper works extremely well, once I remapped that pesky applications key.”

Fantastic review. Thank you very much for doing that, Vaughn.

My mouth’s watering a bit, actually, because I did take a look at Framework over the summer break and found, to my disappointment, that Framework does not yet ship to New Zealand. I’m not sure what that might mean for the Optima, but Framework itself does not sell here. Really disappointing.

But it sounds like an intriguing concept, and the idea that geeks can swap out components easily is really attractive.
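By the way, Vaughn mentioned SharpKeys for bringing back the missing application key. Under the hood, tools like SharpKeys write Windows’ documented “Scancode Map” registry value, and if you’re comfortable with a little scripting, you can set it yourself. Here’s a minimal Python sketch, under the assumption that you’re willing to sacrifice the right Control key for the purpose. It needs to run as administrator, and a reboot is required before the remap takes effect.

import struct
import winreg

# Each mapping entry is two little-endian 16-bit words:
# (scancode the key should produce, scancode of the physical key).
APPLICATION_KEY = 0xE05D  # the application/context menu key
RIGHT_CTRL = 0xE01D       # the physical key being repurposed

entries = [(APPLICATION_KEY, RIGHT_CTRL)]

# Header: 4-byte version (0), 4-byte flags (0), then the entry count
# (mappings plus 1 for the null terminator).
blob = struct.pack("<III", 0, 0, len(entries) + 1)
for new, old in entries:
    blob += struct.pack("<HH", new, old)
blob += struct.pack("<I", 0)  # null terminator

with winreg.OpenKey(
    winreg.HKEY_LOCAL_MACHINE,
    r"SYSTEM\CurrentControlSet\Control\Keyboard Layout",
    0,
    winreg.KEY_SET_VALUE,
) as key:
    winreg.SetValueEx(key, "Scancode Map", 0, winreg.REG_BINARY, blob)

SharpKeys does exactly this through a friendly interface, so the script is strictly for the geeks among us. Deleting the Scancode Map value undoes the remap.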

Our cup runneth over with Framework at the moment, because Derek Lane has some thoughts on this.

[voice message]

Derek: I cannot believe the fact that I’ve had a Framework laptop for 2 years and haven’t told you. I swore I did at one point, but obviously not.

So here’s the thing about the Frameworks. If you can get someone to help you put them together, or if you’re very savvy with computers on your own, you can totally do it. The results are incredible.

You have a setup where the headphone jack and the built-in speakers show up as distinct devices in Windows Playback.

The chipset for basically most things is class compliant. So when you boot the laptop for the first time, of course with no Windows installation present because you naturally have a blank hard drive, you can press Control-Windows-Enter to get Narrator running and independently install the operating system yourself.

In terms of the speed of this machine and its ability to process data, I went to Tallahassee, Florida some time ago and was part of a team commissioned to record a concert from Marcus Roberts and the Modern Jazz Generation. And it was wonderful. I was recording 40 channels of audio through a Dante setup, each track encoded by Reaper in real time as a FLAC file in order to save space. And the fan wasn’t even spinning on the laptop, at least most of the time. And when it was, it was so little that it didn’t make any noise that I could hear.

There are a couple of caveats to the frameworks. The first is that to make these machines able to be so modular, there’s some sacrifice in terms of the battery size.

The second caveat is that they appear so similar to a MacBook Air that on a flight, I almost walked away with someone’s MacBook Air and they walked away with my Framework. And we noticed, fortunately, within seconds of doing this, that we had the wrong laptops.

Finally, there is the issue of the fan. To make room for people to actually get inside of this machine and service it, Framework decided to make the fan quite small. And while it gets the job done when it needs to, it’s kind of loud when it does. You need to do a lot of heavy lifting to get the fan to spin up to the degree that it makes the noise. But once it’s reached that threshold, wow! Oh boy!

And those are the only caveats I can really think of.

Incidentally, this is what the internal microphone sounds like on the Framework. Speaking of the internal microphone, there is a switch which, when flipped, will mute the microphone. So, for the incredibly paranoid or security conscious among us, you have a hardware switch to disable the mic, as you do also with the webcam. I don’t know how well that performs, but it exists. And like every other component of the Framework, should they come up with an upgrade to the microphone or webcam, you can remove the old module and put in the new one.

[music]

VoiceOver: Living Blindfully is the podcast with big dreams and no marketing budget.

Oh dear, dear, dear.

This is where you come in. If Living Blindfully is something you value, we really would appreciate you taking a few minutes to write a positive review of the podcast.

Positive testimonials help spread the word, and encourage new listeners to give us a try.

If you have a moment to give us a 5-star review, we really would appreciate that.

Thank you for supporting Living Blindfully.

Inaccessible Date Pickers on Travel Agency Websites

Jonathan: Michael Chopra writes:

“Hello, Jonathan and all Living Blindfully listeners,

I’d like to raise the topic of inaccessible date pickers on travel agency websites.

Recently, I wanted to check flights to Canada for 3 people with my laptop. But the travel agency I used, Flight Centre, has an inaccessible date picker. And therefore, I wasn’t able to select the dates I wanted.

NVDA saw this as a read-only edit field. Narrator was no better.”

And Michael has included an email that he sent to Flight Centre. It’s a very well-worded email explaining exactly what a screen reader is, and the trouble that he’s experiencing.

Yeah. I can confirm this, Michael. I went to the Flight Centre website. It’s not particularly pretty, and I hope they fix it.

In the meantime, go where your business is more welcome and more accessible. There are a number of other options you might use, including things like Expedia, which is in pretty good shape these days. Travelocity is not too bad as well. So there are various other places that you might be able to go.

You shouldn’t have to, but you can also call them on the phone. That should be a last resort if you absolutely have to use Flight Centre for some reason, because the website should be just as usable for you as it is for everyone else. I hope that you get some progress with this.

I checked the website, and they haven’t fixed it yet… That doesn’t mean that it’s not in their queue of some kind. I hope you have been responded to.

And actually, he has been. Because with the passage of time and the fact that this came in just before Christmas, we have an update to this.

Michael wrote back and says:

“I have received a response from Flight Centre New Zealand’s customer relations department, and I’m rather happy with it. The response is pasted below. I don’t think I could have asked for anything better, all things considered.

‘Dear Michael,’” (begins the response).

“‘My name is’” (name redacted), “’and I am the customer relations advisor for Flight Centre New Zealand.

I would like to unreservedly apologize that you were unhappy with your recent experience with Flight Centre. I appreciate that you chose to book with us for a seamless, hassle-free booking and travel experience. The fact that you feel yours has been anything less is truly regrettable. For this, I apologize.

I would also like to apologize for the delayed response. Due to staff shortages and sickness,’” (COVID), “’our responses are taking longer than first anticipated.

Thank you for reaching out to us in regards to our website and its accessibility for visually impaired customers.

We take pride on being inclusive for all of our customers, and are very interested to hear of your experience with accessing our website, in particular, the date selector function.

We are pleased to say that we have raised this with our website developer based in Australia, as well as our Diversity, Equity and Inclusion Committee. They are going to investigate the date picker and see what quick wins they can do to make it user-friendly. They are also going to look at the site as a whole to see what else they can do to make it more accessible and implement any changes globally.

Thank you for bringing this to our attention. It is only with feedback like this that we can address these issues and continue to strive towards delivering the best customer experience we can across all our platforms.

This is not an instant fix for sure, but puts us in the right direction to making our platforms a better place to support our customers.’”

That’s an excellent response, Michael, as long as they don’t kick the can down the road too much and the fixes are fairly quickly forthcoming. It shouldn’t take too long, I would have thought, to fix those controls that really should be combo boxes. If they do that, then that’s really good.

And I think that was an empathetic response. We’ll just have to see what comes of it.

Do keep us posted, and I’ll do my best to remember to check in on Flight Centre from time to time.

Moral of the story for us all: sometimes, we think just pointing these things out in a polite way won’t make a difference. Maybe that’s true, some of the time. But equally, some of the time, you might get somebody who cares, and they might make some changes that benefit us all.

Using Emoji on Mac and Windows

Alrighty, then. 😊💝 (smiling face with smiling eyes, heart with ribbon)

This contribution comes from Robin Christopherson. 😄 (grinning face with squinting eyes).

[voice message]

Robin: I admit it. I love emojis. I use them all the time.

And so we’re going to look at how you can put emojis everywhere on the Mac and in Windows as well, everywhere you can type, you can whack some emojis in as well.

Now on Windows, it’s built in. On the Mac, I use a third-party app called Rocket. Rocket is free. It has some in-app purchases, if you want some extra bells and whistles. But to just be able to put in some emojis to enhance your texts, or emails, or whatever it might be, then you can just use the free version. So let’s have a look.

Okay. I’m here on my Mac, and I’m going to move to the Messages app, for example.

VoiceOver: Messages. Steven Scott. iMessage, Window. Message, Edit Text. Is editing.

Robin: Okay. I’m in a message to Steven.

Hey, matey! How are you doing?

VoiceOver: Hey, matey! How are you doing?

Robin: Now, let’s add an emoji as well.

Rocket’s default setting is that you just press the colon key. So if I press that, …

VoiceOver: Popup.

Robin: We have a menu that pops up, and the first item is Thumbs Up. And that’s because by default, you get your most frequently used emojis. If I arrow down, …

VoiceOver: 😄😎❤ (grinning face with big eyes, smiling face with sunglasses, red heart)

Robin: Etc.

But you can also start typing. So I want to send Steven a hotdog for whatever reason. So if I start typing H-O-T, …

VoiceOver: 🌭 (Hotdog).

Robin: Hotdog is there straight away. Hit Enter.

VoiceOver: Hey, matey! How are you doing? 🌭 (Hotdog).

Robin: Okay, then. There you go. I’ve added a hotdog emoji.

And yes, I know in the Messages app here, there is an Emojis button right next to where I’m typing. Obviously, that’s only here. In email, in the Notes app, wherever you are, you won’t have that Emojis button available.

So you can easily just type colon, and then start typing what you want.

Let’s give him another one. Let’s give him a 😕 (confused face). And I’ll start typing.

VoiceOver: C, 😕 (confused face).

Robin: Confused face. There it is, straight away. Enter.

And now, we’ve got, …

VoiceOver: Hey, matey! How are you doing? 🌭😕 (hotdog, confused face)

Robin: Okay. There you go.

Now, I just wanted to show you the settings, the preferences of Rocket. So if we switch to that, …

VoiceOver: Rocket. Application. Preferences, Window. Preferences.

Robin: So I have the Preferences pane opened. I just wanted to show you. There are 3 tabs.

VoiceOver: General tab. 1 of 3. Advanced tab. 2 of 3. Accessibility, selected tab. 3 of 3.

Robin: And the third tab is all about accessibility.

VoiceOver: Rocket is proud to be the first emoji picker for blind people.

Configure Rocket’s accessibility settings here.

Robin: There you go. So it is very accessible, even to the extent that you can change some of the accessibility settings to optimize it for how you want it to work.

So yeah, Rocket. Brilliant!

Now let’s have a look in Windows.

Okay then. I’m now in Windows. It didn’t actually take me very long, because I have Windows on this Mac as well in a virtual machine. But it will behave exactly the same way as if I were on a normal Windows PC.

So I’m using JAWS here.

JAWS: Star. Untitled-Notepad.

Robin: Here we are in Notepad, and we’re on a blank line.

JAWS: Blank.

Robin: And if I start typing here something like I love emoji.

JAWS: I love emoji.

Robin: Emoji. [laughs]

Now in Windows, it’s super simple. All you need to do is type Windows key and full stop.

Now, you don’t hear anything. So it’s not as accessible an experience as it is with Rocket on the Mac. But yes, the emoji panel or keyboard is up, and we can just start typing.

Let’s type H-O-T.

JAWS: Emoji category. Search group. Hotdog selected.

Robin: Okay. So I’ve typed H-O-T, and there are 13 possible items starting with that. I’m gonna type an extra letter. I’m gonna type D.

JAWS: Emoji category. Search group. Hotdog selected. Selected contain 1 items.

Robin: Okay. And now, it said hotdog is the only one in this list. Contains 1 items. I can just hit Enter.

JAWS: Star. Untitled-Notepad. Edit.

Robin: And now, if I do say line, …

JAWS: I love emoji. 🌭 (hotdog).

Robin: There we go. We’ve got a hotdog.

And the emoji panel has gone. Now if I want to do another emoji, I will do Windows and full stop again.

JAWS: Emoji panel is closed.

Robin: Don’t worry about when it says that. I was really confused to begin with, but that’s lying.

So we’ve now got the panel up again. And let’s do that confused face. If I start typing C-O-N, …

JAWS: Emoji category. Search group. Salt shaker, selected. Selected contain 13 items.

Robin: F.

JAWS: Emoji category. Search group. Confetti ball, selected.

Robin: U, …

JAWS: Selected emoji category. Search group. Confused face selected. Selected contain 1 items.

Robin: There it is. Enter.

JAWS: Star Untitled-Notepad. Edit.

Robin: So now, we have …

JAWS: I love emoji. 🌭😕 (hotdog, confused face).

Robin: There we go.

Now, if I haven’t got anything in mind, I can just do Windows key and full stop again.

JAWS: Emoji panel is closed.

Robin: Again, it’s lying.

And now if I just start arrowing right, …

JAWS: Hotdog, selected. Selected contain 58 items.

Robin: So there are 58 that it’s offering us here. Keep on arrowing right.

JAWS: Falafel, selected. Selected contain 58 items.

Helicopter selected. Selected contain 58 items.

Robin: These are actually ones that I’ve used recently.

JAWS: Firework sparkler selected. Selected contain 58 items.

Robin: But at any time, you can start typing as well. So let’s say, for example, we were after things to do with hearts. So if I start typing H-E-A-R-T, …

JAWS: category. Search group. Smiling cat face with heart-shaped eyes, selected. Selected contain 13 items.

Robin: There are 13 possibles. Let’s arrow through.

JAWS: Stethoscope, selected. Selected contain 13 items.

Robin: No idea why that’s a hit.

JAWS: Brown heart, selected.

White heart, selected. Selected contain 13 items.

Robin: Maybe because the stethoscope listens to your heart?

JAWS: Love letter, selected. Selected contain 13 items.

Robin: [laughs]

JAWS: Red heart, selected.

Heart decoration, selected.

Heavy heart exclamation mark ornament.

Black heart suit, selected.

Heart with arrow, selected.

Heart with ribbon, selected.

Beating heart, selected. Selected contain 13 items.

Robin: Let’s go for that one. Hit enter.

JAWS: Star. Untitled-Notepad. Edit.

Robin: And if I do a say line, …

JAWS: I love emoji. 🌭😕💓 (hotdog, confused face, beating heart).

Robin: There you go, guys. It’s that easy. Windows key and full stop. And as soon as you’ve hit enter on the one that you want, then it goes away again.

Ooh! Brainwave. Condiment. C-O-N for confused face gave us the salt shaker because it’s a condiment. Maybe.

Okay then, guys. Enjoy peppering all of your tweets, texts, Word documents, whatever it is you want with an abundance of emojis.

Thanks a lot. This is Robin signing off for now, and have a lot of fun playing with emojis.
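Jonathan: A quick footnote from me for the programmers before we move on. Emoji are ordinary Unicode characters, so anywhere you generate text in code, you can emit them too. As a small illustrative sketch in Python, the \N escape accepts official Unicode character names, which are essentially the same names your screen reader announces:

# Build a string containing emoji by their official Unicode names.
hotdog = "\N{HOT DOG}"          # U+1F32D
confused = "\N{CONFUSED FACE}"  # U+1F615
heart = "\N{BEATING HEART}"     # U+1F493
print(f"I love emoji. {hotdog}{confused}{heart}")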

What Are Accessible Options for Those Interested in Crypto Currency?

Jonathan: Peter is writing in once again from Budapest in Hungary, and he says:

“Hi, Jonathan,

I hope you’re spending your summer holiday the best way possible.”

Mate, it was epic! Epic! Seems like a while ago now.

“A topic that could be discussed in 2024 – how to handle your crypto assets on an accessible platform?

I have just begun to play with it a few weeks ago, and I kind of like it. My platform is Revolut.” (That’s spelled R-E-V-O-L-U-T).

“I don’t know if it is available in New Zealand. It is an application and a bank account through which you can buy and sell crypto currencies. I think Revolut is a Lithuanian firm.

The application is more or less usable with TalkBack, but has some challenges. The learning curve is rather steep at the beginning.

The other problem is that the maximum amount of crypto currency you can buy or sell is $1,000. That’s okay for trying out the crypto business a bit.

But I would like to know if there is an accessible platform for those who would like to invest on a little higher scale. Let’s say $10,000.

I vaguely remember that you mentioned a platform, or website, or application that you use to control your finances. The best would be a demonstration for beginners of such a website or mobile app.

If you or one of your listeners took the challenge, I certainly would listen to it most carefully.

I know crypto is a dangerous area of finances. But if you have some spare money, it may be worth it to dive in a little bit.”

Jonathan: Thanks very much, Peter.

I have not touched crypto, and I don’t expect I ever will. But we did have a listener talking about this some time ago, so you may want to do a search on the Living Blindfully website.

But if anyone wants to contribute to this discussion, you’re very welcome. opinion@LivingBlindfully.com, and 864-60-Mosen on the phone. 864-606-6736.

[music]

Voiceover: Stay informed about Living Blindfully by joining our announcements email list. You’ll receive a maximum of a couple of emails a week, and you’ll be the first to learn about upcoming shows and how to have your say. You can opt out anytime you want.

Join today by sending a blank email to announcements-subscribe@LivingBlindfully.com.

Why not join now? That’s announcements-subscribe@LivingBlindfully.com, and be in the know.

Samsung Accessibility Uses AudioEye

Caller: Hi Jonathan, this is Wesley.

I just wanted to let you know that I called Samsung Accessibility to encourage them to remove AudioEye from their website. I told them the benefits were minimal, told them a little bit about AudioEye’s track record, and they were surprised.

They seemed very responsive to my call. So they told me to send an email to them and tell them a little bit more, and I did. We’ll see where it goes from there.

So just thought I’d let you know, and I’ll give you updates.

Questions About the iOS Health App, and the Need for an Apple Watch

We’re going to Israel for this email from Haya Simkin, who says:

“Hi, Jonathan and other blindful listeners,

I decided to exercise more this year.

One reason is that I have taken up tandem biking, and I have trouble pedalling enough on the one hand, and saving my energy on the other, so that I can pedal properly at the most crucial times. I especially don’t want my captain to work too hard, because it isn’t fair to them.

For those who don’t know, captains ride in front and can see, while stokers ride behind and may or may not be able to see.

I have a good friend who is a stoker, and he is the one who got me into tandem biking in the first place. He is really good at being fit in all kinds of ways, and I want him to be my accountability partner.

I read that it helps to make a very specific commitment to a real person if you want to succeed in keeping a resolution. I committed to him that I would walk 4km a day, 4 times a week, and stop off to use the free public exercise equipment in one of the nearby parks.

He and his guide dog have no problem getting in about 6 km a day and 2 visits to the gym a week, and tandem biking for at least 30 km once a week.

He uses an iPhone and an Apple Watch because he’s a fitness geek. I don’t think I have a use case for the Apple Watch, and it’s expensive and so I don’t have one.

I decided to share my walking data with him because I won’t always remember to tell him. I sent him a sharing request through the Health app, which he accepted, but he can’t see my data.

He shares his data with another stoker who also has an Apple Watch. And so he thinks that because I don’t have an Apple Watch, he can’t see my data. I haven’t seen any indication of this online, but I did see that you need to have the iCloud email address that the person you are sharing with uses. I have that now, but he still can’t see my data.

Do you or any of the listeners share data using the Health app with someone who doesn’t have an Apple Watch? You’re a health geek, and I think you like to be accountable, and maybe even brag about particularly good results. And so I thought you might know.”

Ha! Well, I don’t know whether I brag about the health stuff. I just like to get on with it because it makes me feel good. But I am actually sharing my health data with a couple of people.

To the best of my knowledge, this is an Apple Watch feature, and I appreciate that the Apple Watch may be out of reach for some people financially. But if you are committed to a health journey and you can possibly manage it, I’d highly recommend it, because you can set all kinds of goals in terms of how much you move. It will measure how much you exercise, and it will take account of all sorts of different workouts that you might do.

Because cardio is one thing, but you also want to make sure you work on your core strength – developing your muscles, your strength training, that’s really critical as well.

So if you can manage it, it can really help with accountability. Because then, your partners will be able to see whether you’re making your goals, exceeding your goals, all that kind of thing.

Is it essential? If you can’t get your hands on an Apple Watch, should you give up?

Absolutely not. I mean, people have been getting fit without these gadgets for centuries. But it does have many benefits.

And I think particularly for the psychology of it, you get encouragement. You can track your progress over time.

One of the things that I would say is it sounds like you’ve got some pretty ambitious goals. But I’d encourage you to consider breaking those goals down to achievable sub-goals. You’re obviously going to take some time to get fit. So if you can measure progress over time and say, “Okay, I moved a little bit more this week without getting completely tuckered out, and I can call that progress.”, then it’s encouraging. It’s motivating. And that’s where I think the Apple Watch can really assist you.

But as far as I’m aware, you’ve got to have the Apple Watch to be able to share those goals because really, other than counting steps, the iPhone can’t do too much on its own in this regard. The Apple Watch is the fitness gadget.

“My second question,” says Haya, “is about Siri and health.

As of 17.2, you can ask Siri about different kinds of data in the health app such as how much you walked today. You are also supposed to be able to add data to the health app, such as logging your medicine as taken. This would be very useful to me.

When I first installed iOS 17.2, I asked Siri how much I had walked that day after having given Siri permission to access health data and settings. It said, “I’m sorry, I can’t help you with that. Please check the health app.”

After a few hours, it did answer that and similar questions, but I have never managed to add data to the health app, and I can’t find a solution online.

Have you ever managed this? It would be handy to add my bike rides, or to log my medicine as taken with Siri because it would be faster, at least in theory.

I don’t know why it doesn’t work. Maybe I’m not saying the command right, but I can’t figure out why.

Perhaps you could even do a demo for the podcast. I’m sure other people would enjoy that.

Keep up the blindful living. Woo!“, says Haya.

Well good luck with this journey, Haya. I really encourage you to keep it up. You’re gonna feel great, you’re gonna look great, it’s gonna be epic.

Regarding your Siri question, I have not bothered to play with this until I got your email.

I don’t have a compatible Apple Watch because I have the first Apple Watch Ultra still, but I do have a compatible iPhone, and I’m currently running the latest beta of iOS 17.3 as I’m recording this.

What Apple says is that this does not work by default. To make it work, you’ve got to go into the Settings app on your phone, then into Health, then choose the option called Data Access and Devices, and then Siri. And sure enough, when I went there and found the Siri setting, it was set to off.

So I switched it on, and I used some of the commands that it says you should then be able to do, like “What’s my heart rate?” and “What’s my blood oxygen?”. I should be able to get that information, because it’s going into the Health app from my Apple Watch. And Siri still responded that it couldn’t help with that, and to check the Health app.

So then I thought, “Alright, then I’ll restart the phone.”, and I restarted the phone and still, it doesn’t work. I’m at a bit of a loss to know why. I don’t know whether this is region specific. Is this another feature that’s available in some markets but not others? But the setting was there. I was able to switch Siri access on, and then I would have thought it worked, but it just does not work for me.

It may be a beta issue as well, I have no clue, but it’s not something that I’m able to demonstrate.

Based on what you’re saying though, Haya, it sounds like you’re using it the way it’s supposed to work.

Blind People Generating Images With AI

Here’s an email from Shawn Keen, who says:

“Hi, Jonathan,

I hope you’re well.

I have been delving into some fascinating new technology lately, and it’s honestly mind-blowing. Sometimes, it feels like I’m communicating with something from another world.

But I always want to keep it real, as the kids are fond of saying. Gosh! That sounds like I’m 90.”, he says.

“I often wonder, with tech evolving so rapidly, how soon it will be before people start leaning on AI to craft and filter emails. That’s not my style, though. I put a lot of effort into making my emails genuine, with a little inspiration.”

It’s already happening, Shawn, at least in test, and it’s possibly out there in the wild in some form or another.

But I do know of people who are working on models where you feed in a whole lot of your past sent email, and it will triage your email and even respond.

I think it has 2 stages. First, you review what it’s doing so you know how good it is. And then, as you gain confidence, it just goes ahead and responds.

It is a bit sad, I think, in some ways. But I guess it’s efficient.

“Jonathan,” says Shawn, “your work has always impressed me. Your joy for innovating with blindness technology is inspiring, and I notice that you bring the same ingenuity to the realm of AI.”

Thank you, Shawn! I really appreciate your generosity.

“As much as I geek out over ChatGPT, let’s be honest. Not everyone’s going to dive into AI for a simple picture or a description.

Speaking of this, I’m eagerly anticipating the time when fields like video editing and visual content creation become more accessible to the blind community, thanks to AI.

I had a chat with Dr. White from Vanderbilt, a computer science professor, and he was quite impressed with how I’m leveraging AI. He teaches some courses on AI prompting, which I have found educational.

Here’s a fun thing I have been doing with AI – creating pictures. It’s surprisingly therapeutic to turn emotions or life’s ups and downs into virtual art, even if it’s just through simple keywords. I aim to create something extraordinary, like those stunning images you see featured on Bing every week.

Here’s a prompt I created and used recently for an amazing painting, along with a dog picture I crafted. I’m open to feedback if you or anyone spot any flaws, and feel free to share.

So here is the prompt:

Create a stunning oil painting using premium grade oils, canvas, and brushes. The painting should exemplify a harmonious blend of elements, perfect balance in the layout, great texture that can almost be felt, great meticulous brushwork that showcases the artist’s skill. The composition should be strong and engaging, drawing the viewer’s eye across the canvas. Emphasize value with a thoughtful use of light and shadow to give depth and dimension. Showcase technical mastery in every stroke and detail. The subject of the painting is about, …”

And that’s the end of the prompt. So you’ll have to imagine what the subject is.

So this is a really interesting blindness use case for AI, Shawn, and I’m glad that you brought this up because it’s something I’ve been playing with a little bit.

The nervousness I still have is that I like to be able to verify the quality of what I’m producing before I let it out there. This is one of the reasons why, even though I take quite a lot of pictures, I’ve been a bit hesitant in the past to post them on social media, in case they look like a blind person took them, you know. [laughs]

I like to take pride in the quality of my work. It’s why I always put things through a spell-checker when I write, that kind of thing.

But I guess you can use the same AI in reverse. For example, when I take a picture of something, I often ask Be My Eyes now to tell me about the picture. And if it makes sense, if it’s what I thought I took, then I might share it.

And similarly, I guess, if you don’t have a sighted person to hand, if you give it a prompt like that, you can ask the AI to describe the picture to you and see if it describes what you wanted.
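For anyone who wants to script that generate-then-verify loop, here’s a minimal sketch in Python. It assumes OpenAI’s official SDK with an API key in your environment; the model names (dall-e-3 and gpt-4o) and the sample subject are purely illustrative, so substitute whatever you have access to.

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Shawn's full prompt would go here; the subject is made up.
prompt = (
    "Create a stunning oil painting using premium grade oils, canvas, "
    "and brushes. The subject of the painting is a sleeping tabby cat."
)

# Step 1: generate the image.
image = client.images.generate(model="dall-e-3", prompt=prompt, n=1)
url = image.data[0].url

# Step 2: feed the result back and ask for a description, so you can
# check the output matches your intent before sharing it.
check = client.chat.completions.create(
    model="gpt-4o",
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe this image in detail."},
            {"type": "image_url", "image_url": {"url": url}},
        ],
    }],
)
print(check.choices[0].message.content)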

I would be very interested to hear from anybody else who is playing with this image generation stuff, and whether you’re using it just for fun, or maybe it’s even playing a part in the work that you do.

Changing gears a bit”, says Shawn. “I was wondering how your hearing is doing.

I recently emailed with Jim Snowbarger about hearing loss and his experience with implants.

I have some mild hearing loss myself, and my dad at 77 still uses hearing aids occasionally. The joys of aging.

Jonathan, just wanted to say thanks for what you do. Your contributions to our community don’t go unnoticed. You’re a significant part of our lives, and I appreciate that.”

Well thank you, Shawn! That’s a really lovely email to receive.

And thanks for asking about the hearing. My hearing, knocking on all sorts of things, has stayed stable for quite a few years, even though I have a condition that may well result in further deterioration, and even potentially cochlear implants at some point down the road.

But my hearing stabilized when I went to keto. Now, I can’t prove cause and effect, of course. But since I really started looking after my health, and eating properly, and exercising, and meditating, the deterioration has at least been slowed right down to the point that I go to the audiologist and they go, “Well, you still got the same hearing you had last time we saw you.” And that’s interesting with the condition that I have.

So I’m no medical professional. But what I can say is that really taking care of myself has taken care of my hearing. And I kind of feel it intuitively that I would be hearing a lot less, had I not made that dramatic change of lifestyle. So worth it, eh? Because my hearing is pretty important to me.

Suno.ai, All the Subscriptions, and Improvements in Mainstream Gaming

Markus is writing in from Germany, and says:

“I enjoyed the Living Blindfully episode in which you mentioned Suno AI. I have since tried it, and I’m currently on a monthly subscription.

It is a fun thing to use indeed. I can build my own backing tracks to jam to now.

Being a self-publishing musician myself, I noticed that we don’t need to worry just yet that AI might soon replace real music.

My thoughts:

  1. AI is a great inspiration. As a creative playground for short snippets, it gives us the feeling that we can really create music at the press of a button.

However, I have noticed that the more we expect this tool to do, the longer the song and the higher our expectations, the harder it is to get something seriously good out of it. Example below.

  2. In order to use Suno.ai for something like a proper song, we need to have at least some experience and musical understanding to make sure that the results make sense musically.

I aimed to follow your description of the custom mode, and translate your operation of the webpage to macOS Safari.

I pulled out 2 verses that I wrote in December, the day after our beloved cat Tibbs had passed away. They sounded slightly pathetic, and I wasn’t going to record and produce a song myself. But I still had the verses, so I thought I’d use them to make a country song about our cat with Suno.ai.

I was slightly optimistic when the third result sounded like a song I could imagine this becoming. I quickly wrote another verse, a bridge, and a chorus, and had Suno.ai stitch together a song.

Sadly, the intro that I had prompted was missing, and in the outro, the singer sang some unintelligible lyrics.

I tried to remix the original result and start again. No success.

I created another clip after the original result. The voice and music were the same, but the syllable stressing, using the very same input, was totally different.

Recreating a song using the same parameters yielded results with no lyrics at all, or lyrics that had nothing to do with my source material.

Conclusion? After the far side of 40 attempts, I took the original, made a proper fade-in in Logic Pro, and cut out these random lyrics at the end. And you know what? Now this very cheesy country song about our cat, which I would not have recorded myself, sounds quite good after I passed it on to AI to sing it for me.

The method in macOS Safari differs from Windows quite a lot, and I use VoiceOver hotspots a lot to skip to certain buttons. It works pretty well.”

Well, a couple of things before I go on.

First and foremost, Markus, my condolences about the loss of your cat. They are companions, aren’t they? And when they go, you feel it deeply. So I do send my very best.

And second, unfortunately, the user interface radically changed, not just on Safari, but everywhere about a week after I recorded that Suno AI demo. [laughs] The timing couldn’t have been worse.

So although the concepts still apply, the actual how you do what you can do with Suno AI has altered quite a bit because they completely redid the interface. Not for the better, in my opinion, but maybe that’s just because I got used to what they had.

Markus continues:

“A friend of mine paid for the 12-month subscription, which inspired me to think and ask you for your opinion. Nowadays, especially with AI clients in mind, do annual subscriptions make sense anymore?

About 10 or 15 years ago, it was definitely worth it to pay for, let’s say, an MS Office 365 subscription because hardly anything would happen during that period.

But now, thinking only 12 months back, this time last year, Be My Eyes had no AI feature, there was no Suno, ElevenLabs hadn’t taken off yet, Microsoft Soundscape hadn’t been reborn as VoiceVista and Soundscape. What a year for blind people.”

I tell you what, Markus. The subscriptions just add up, don’t they? And it’s something I’ve been really conscious of lately, because I just seem to have so many subscriptions. I’m going to sit down and go through them all, and make some pretty difficult decisions about which ones I really use that often. Because it’s so easy to get captivated by this rapidly moving technology and to subscribe, so I understand the point you’re making.

The counter-argument could be, none of these services have actually been superseded. They’re just additions. So a subscription you took out, say, a year ago, I don’t think it’s necessarily out of date, and that there’s something better that’s come along. It’s just that new things are coming along in new areas.

“Speaking of Microsoft”, says Markus, “I only learned a few days ago that the racing simulation Forza Motorsport on Microsoft’s Xbox is accessible for blind players, and so are numerous other games such as Mortal Kombat, or Street Fighter. All these classics that I played when I was a teenager and able to see.”

Yes. Mastodon has been full of this for quite some time now. And it is exciting, isn’t it? It’s great to see this gaming domain, which has always been a bit out of reach, becoming more accessible. And I know many of us got by; you sort of got used to the noises sometimes.

When my kids were younger, we had a Wii. And we deliberately chose the Wii because there were certain games that you could play. The tennis game on the Wii is just so much fun. There’s a bowling game as well, and a few others that really were quite friendly for blind people to play.

But it’s good to see true accessibility coming into this domain now from mainstream game developers.

[music]

Voiceover: On Living Blindfully, we hear the opinions of blind people from all over the world.

So why not share yours?

Drop us an email. You can write it down, or attach an audio recording: opinion@LivingBlindfully.com. Email us today: opinion@LivingBlindfully.com.

Or if the phone is more your thing, phone our listener line in the United States: 864-60-Mosen. That’s 864-606-6736.

Shubh Mittal Tells Us About the Seleste Glasses

Seleste Innovations is a company promising to revolutionise assistive technology. The Seleste glasses offer a range of hands-free functions including a text reader, scene description, a music player, and even more.

I’m joined by the CEO of Seleste Innovations and one of its founders, Shubh Mittal. It’s great to have you on the podcast. Thank you for coming on.

Shubh: Thank you for having me on. It’s good to be here.

Jonathan: How did you come to the assistive technology industry?

Shubh: I was researching smart glasses for a while in university, and I had a friend who was studying computer science with me, and he was blind. And I was just talking to him about smart glasses, and he was talking about how useful something like that would be for him.

So then, I just started talking to other people with vision loss about smart glasses, and just got great feedback.

I’m like, “Okay, I think there’s a real problem that we could solve here.” I just went from there.

Jonathan: How long has it been going for? You’ve had this product idea for a wee while then, I guess, before it finally came to fruition.

Shubh: Yeah. It’s been about 2 years now. I think the first year was really just talking to people, understanding the problem.

I started volunteering with the Paralympic team in Canada that plays goalball. So that was helpful.

And then now, we finally have a product and are able to give it out to people.

Jonathan: I guess you are asked this question a lot but over in the Netherlands, there was another group of people doing something similar, and that is the Envision smart glasses.

Shubh: Yeah.

Jonathan: And they’ve got to a point now where they’re quite a mature product. They’ve got a bit of brand recognition. Can you tell me how what you offer differs from what they do?

Shubh: Yeah, absolutely. I think not even just Envision but everyone in the assistive tech space. It was really interesting when I first got into the market because I was used to products that are meant for the general public. And the assistive technology space is just so different, where everything in the AT space just seemed to be so expensive, and so bulky, and not as high a quality as I was used to with general products.

And I realized it didn’t have to be that way. There’s no need for this technology to be that bulky and expensive. And so we set out to make smart glasses that had more functionality and that people could actually afford. So I think that was a huge one.

Jonathan: They’re using Google smart glasses, which are under a bit of a cloud at the moment. In fact, I think they’ve been discontinued, but Envision are reassuring people that they’ve got sufficient stock.

So are you making your own hardware, or have you licensed other hardware?

Shubh: Yeah, we have our own hardware for the glasses. And the way we’re able to make it much more affordable is that we do something unique, which is having the processing for the glasses happen on the phone. So the glasses can be lightweight and affordable, and yet have close to 10 times the processing power, because we can utilize the processing power of the phone.

Jonathan: How do the glasses talk to the phone? Is that a Bluetooth connection?

Shubh: Yes, it’s over Bluetooth and WiFi that it talks to the phone.

Jonathan: Does the app that you offer work stand-alone as well, or do you need the glasses to use your app?

Shubh: No. You need the glasses to use our app.

Jonathan: Right.

Shubh: But it’s something else as well that once the glasses connect to our app, you don’t need to have your phone out. You can keep your phone in your pocket and just use the glasses normally.

Jonathan: Hmm. Because I know that Envision, for example, use their phone app as a bit of a loss leader, I guess you’d call it, where they get people into the interface, people understand what it does, and then their pitch is, “Okay. Well now, you can have all of this hands-free so you don’t have to hold your phone”, or take it out, as you say. Have you thought about offering any kind of phone-based service to get people into the ecosystem, as it were?

Shubh: Yeah, that’s interesting. I think we’re genuinely lucky that companies like Envision, and OrCam, and assistive tech companies have come before us because like when we talk about smart glasses, people know what we’re talking about, which is really fascinating.

Like when I go to people not in this space and talk about smart glasses, people don’t really know what I’m talking about. They think about like Google Glass from like 10 years ago, things like this.

But in the visual space, people understand using AI to help you read something, or using AI to help you learn about your world, and using smart glasses for that as well. And so I think people understand that this technology needs to exist, and this technology can be helpful. And hopefully, we’re just going about implementing and executing a better product for that.

Jonathan: Can you describe the appearance and the weight of the glasses a little more? What are they like to wear and use?

Shubh: A big goal for us was to create glasses where, you know, when you’re just walking on the street, like nobody can tell that you’re wearing smart glasses. And so I think we’ve done that, where we have glasses, they’re black frames, they weigh just about 2 ounces, and so they’re fairly light. And they have a camera in the middle of the glasses. And yeah. Other than that, they just look like normal glasses with slightly wider legs.

Jonathan: And an 8 megapixel camera, right? So quite good resolution.

Shubh: Yeah, we have an 8 megapixel autofocus camera, and so it can like read smaller text, and also read like a large billboard across the street.

Jonathan: How long does it take? I mean if you take a snap of a piece of paper, let’s say you’re at a restaurant and you want the menu read, can you give me any kind of indication what’s the lead time between when you take that picture and when the app will start to speak the information back?

Shubh: Well, we have a couple different modes. So for example, if you’re just reading text, you should get a response in like 3 to 5 seconds to just read the text in front of you.

Our most used feature right now is like, you have scan text or instant text as it’s called, where it’ll just keep reading the text around you as you walk around, as you look around. And so every couple seconds, it’ll just read that again.

But if you’re using other things, like we have advanced scene description on the glasses so like in detail, describe your environment. And so that will take longer. That’ll take like closer to 10 seconds, but it’ll provide like a lot more information about the world around you.

And I think what we’ve done that we’re really focusing on is we have a smart assistant on the glasses. And the smart assistant is just a completely different way of using assistive technology, where you can actually ask specific questions, like as an assistant where you can ask it specifically, “Read me the appetizers on this menu.”, or like “Tell me what shirt I am holding.”, and it’ll be able to take a picture, and tell you, and answer that specific question.

Jonathan: Is it your own technology, or are you using GPT AI kind of stuff to leverage that?

Shubh: Yeah. So at the core, it is GPT. But then, we have to do things on top of that.

We also have it so the glasses remember the things that you ask it about, and the things that you look at. And so this means, like, if I take a picture of my fridge, it tells me everything that’s in my fridge. A week later, when I’m at a grocery store, I can ask it to remind me what’s in my fridge, and it’ll list it out for me. And I know now what I need to buy, and what’s missing.

And so like these are all like complex problems. And GPT is great, like it’s an amazing technology, but it’s just the base layer of what’s required.

Jonathan: You can also tap into assistance from sighted volunteers. Are you partnering with Be My Eyes for that, or do you have your own way of connecting with volunteers?

Shubh: No, we’re not using Be My Eyes or Aira right now. We’re actually focusing more on being able to use the assistant to get help for things, instead of having a sighted volunteer. I think that is something that we’re going to focus on and improve our technology to have volunteers in the future.

But yeah. Right now, we realize that through the assistant, you can ask questions that you might think that you need a person for. Like if you’re trying to find something, you can have the assistant take multiple pictures as you look around and walk around to find that object.

Or you can ask questions like, “Okay. Can you read me the store names as I walk around this mall?” And then, it’ll just read you the store names.

And so, AI is getting to a point where it can do these active tasks that you didn’t think were possible. And yeah, we’re hoping to leverage that to give people more independence.

Jonathan: How are you finding feedback in that regard?

One of the challenges is that very scenario that you talk about – walking around a shopping mall, for example. (Many of us who’ve been blind a long time or all our lives have got pretty good at telling what store we’re passing, perhaps by the kind of music it’s playing, or the smell coming out of the store, that kind of stuff.) But actually just getting reliable feedback as you walk around a mall about what stores are around you is a really important use case. How’s that working out for you with the Seleste glasses?

Shubh: It is important. And it’s just a matter of improving the assistant, where we can do these things now. And I think because technology like GPT has gotten so good, we have detailed information about the world around us. We can enable these kinds of technologies where GPT is able to look at and analyze an image almost as well as a person can. And so it’s to the point where these AIs are able to tell you about your surroundings almost as well as having a sighted person there, except you’re able to do that on your own.

The feedback’s been great. We’ve had people ask our assistant lots of questions like, you know, “Can you tell me where the light switch is down the hall?”

Or a couple of weeks ago, actually, I got on a call with someone, and she said that the day she got the glasses was also the first time in her life she was able to play board games with her family, where the glasses read her cards for her. Because before then, she’d always have to partner up with her husband or somebody else to play board games. But she was able to do that by herself through the glasses.

I think there’s a lot of really powerful use cases out there.

Jonathan: It really is quite remarkable. I have been in situations where I have had ChatGPT describe a scene to me. I might just take a picture of my environment and find out what’s going on, particularly if I’m in a restaurant or something, and I want to know what that restaurant is like, what kind of decor do they have, that kind of stuff. And sometimes, people have said to me, sighted people who’ve been with me when I’ve done that, you know, it is amazing what it’s picked up. And had you asked me to describe that to you, I would not have picked up on that particular nuance that OpenAI has given you. The tables really have turned.

Where are you at with the product at the moment? Obviously, it’s in the hands of users. Do you still consider it in beta right now, or is it actually officially released?

Shubh: No, we are kind of transitioning out of beta. I think it’s still a beta product. We have given it out to our first half of pre-orders, and we’re giving it out to the other half of pre-orders just this month.

We’re getting a lot of feedback now and a lot more people using them.

We’re going to be releasing new hardware. Part of our subscription is that we’re constantly releasing new glasses and constantly upgrading. And so we realized that we got a lot better feedback on these glasses than we were expecting, where people are using them a lot more than we thought.

And so we’re going to expand the number of beta users that we’re going to allow. Before, we kept it at 100. Now, we’re going to go more than that. And so it is still in beta.

And I think part of our culture at Seleste is that we’re just updating very often. We update our app about twice a week on average. And that means sometimes, things break, and that sucks. That’s just part of moving quickly and creating a good product. And so, I think that’s gonna be part of our culture for a while. But at the very least, people can expect that they will just always have the latest technology. There’ll be new features coming out all the time.

Jonathan: Obviously, it sounds like you’ve got a lot of innovation going on there.

You mentioned affordability. So can we talk about the pricing model that you’ve chosen to adopt? How does it work?

Shubh: We have a subscription model instead of this giant upfront payment for the glasses. So the way that works is that it’s $100 USD upfront. And then when you get the glasses, it’ll be $50 every month.

And this is a subscription, so you don’t technically own the glasses. It’s just you’re paying every single month.

But in exchange for that, you get all of these software upgrades, like I said, twice a week, and then also the hardware upgrade.

So the other problem that we realized is that in assistive technology, you’ll get a piece of tech, and it’s just this really old, outdated product. And it’s so expensive that it’s hard to even justify upgrading it.

And so because we’re on that subscription, we can constantly keep upgrading the hardware and make sure people are constantly on that newest technology. And so that’s another part of the subscription.

Jonathan: How have customers responded to that? Because I know sometimes, there’s a bit of hostility about these subscription models. And there are some who say, “Look, I would rather fork out a lump sum and be done with it, rather than pay month to month, because by the time I get to, say, my fourth or fifth year, the company’s making a lot on the deal by that point, aren’t they?”

Shubh: Yeah. Well, that’s completely fair because we expect that people have our products for a long time. And we hope that that’s the case.

That being said, I mean, for some of these more expensive assistive tech products, you’d have to be paying our subscription for a bit more than 7 years before you’d have paid the same amount. And within 7 years, hopefully, you will have bought another pair of glasses. And so it gets to the point where, if you’re actually using the technology, you wouldn’t be spending more than you would otherwise. And that’s what we expect from people that have the glasses.

And I think what’s also unfortunate is that a lot of times, people buy assistive tech, use it for a couple of months, and then it just collects dust somewhere. And our company will go under if that’s the case. You know, if people stop using our glasses, they just return them, and we will not be able to sustain ourselves as a company.

And so what we’re really focused on is making sure that every single person has an amazing experience, there is amazing customer support, and they continue using it year after year. I’ve gotten on calls with lots of people, and I feel like it’s very rare that I’ve seen that in assistive tech – people continuing to use the product.

Jonathan: Right. So your argument would be that because you’re dependent on subscription revenue, it’s an incentive for you to keep innovating. Otherwise, the subscription revenue dries up.

Shubh: Yeah.

Jonathan: Yeah.

What about blind couples? So I’m in a household where my wife is also blind, and we might want one pair of glasses. But probably, we wouldn’t take out a subscription twice, right? Because that’s $100 a month. Is it easy for 2 people to share a subscription or share the hardware when they’re in a two blind person household like that?

Shubh: If you want to, you could. There’s no reason you couldn’t. You just individually download an app on the phone, and you can do that. But I think that is more rare.

Actually, just yesterday, I got off a call with a blind couple, and both of them did decide to buy the glasses because the benefit of smart glasses is that you can have it with you throughout the day. And it’s different than a phone, where a phone is very manual. Like, you have to constantly pull it out and tell it exactly what you want it to do when you need it, which is pretty constantly throughout the day.

But with glasses, it’s just always there. And so if you’re cooking, you can just say, “Hey, where’s the oregano on this shelf of spices?”, and it’ll help you. You won’t necessarily know when you need it or how you need it, but it’s always there, ready to help. And so I think that’s the huge benefit of smart glasses and of having your own smart glasses. But I mean, yeah, absolutely. People have the option to share if they’d want to.

Jonathan: Yeah. I mean I love my wife, but there’s no way that I would let her take my phone away from me. [laughs]

But on the other hand, if it’s the glasses, and we’re sitting there with the glasses on the counter, let’s say in the kitchen, and she says, “I’m off to the mall today, so I’m going to take the glasses with me to the mall.”, that’s a use case that I can understand, that it’s more of a communal item than say a smartphone is.

Shubh: Yeah, I see what you mean, for different tasks that you’re doing throughout the day. Yeah, I guess it just depends on what people are doing.

The way that we see assistive technology in general, and the way that we see our product, is that our long-term mission (which we actually don’t talk about that often) is to replace your need to use a phone. [laughs] It’s a little bit crazy.

But um, there are so many problems with the way we use technology right now, where it is active, and we have to carry it around and tell it exactly what we need it to do.

I think the future I see with smart glasses is that when it’s just there with you, it can start to understand you and your environment, and be able to help you with things. Like, for example, internally in the team (we haven’t released this just yet), we have been working with Gmail. And so when you get an email, you’ll just be notified, and we can help respond to that email.

Or you know, we can just help you with other tasks. Like I’ve talked to people about how annoying it is to have to keep a to-do list. Small things like that are so annoying. If you have this assistant, you can just say the words, “Hey, I need tomatoes.”, and then when you go to the grocery store, it’ll remind you. Stuff like that is, I think, more powerful, and I think people are going to slowly start to realize how powerful this technology is.

Like I said, we’re continuing to build this. I don’t think we’re there yet where it can replace a phone. But long-term, that’s what we want to do.

Jonathan: Right. So it’s kind of like part ChatGPT, part Amazon personal assistant, that kind of thing, that you can actually interact with and converse with about the environment. Of course, some of the pro phones like the iPhone 15 Pro and even earlier (I think going all the way back to the 11) have technology like LiDAR, which makes certain functionality available that maybe glasses at this stage cannot. Would that be right? In terms of obstacle detection, the people finder, getting on a bus and finding things, door detection, that kind of thing.

Shubh: Yeah. I mean, like for distances.

Yeah. We don’t have LiDAR on the glasses right now, and so we can’t do exact distances. But you can do a lot with cameras, and we’re going to be upgrading the glasses to have multiple sensors so that you can start doing things like that.

But even right now, people do actually use them for that. Like I said, there was that one example of “Hey, there’s a light switch in this hallway. Help me find it.”, and the glasses were able to tell him.

It’s similar to saying, “Oh well, a person wouldn’t be able to help you find a door.” But when a person’s looking through a video call, they just have the image, you know. They don’t have LiDAR information. They don’t have anything else, and they’re still able to help you. And so that’s the same idea of getting help through the glasses.

Jonathan: Do you see this ultimately becoming an ecosystem so that if companies like Be My Eyes or Aira wanted to come on and offer their services via your glasses, they would be able to do that?

Shubh: Yeah, absolutely. We’re definitely talking a lot with these companies and absolutely, we want to do that.

We’re definitely not trying to rebuild all of this technology. They’ve done amazing work, and so we want to bring them on.

Jonathan: So you’ve got a bit of a demo set up there, right? You can show us the glasses working in a practical way?

Shubh: Yeah, absolutely. So like I mentioned, there’s a couple of features on our glasses. There’s the smart assistant, there’s scene description, scan text, things like that.

And so like for scene description, for example, this is the classic, you know, Be My AI describe all this amazing information around me.

There’s buttons on the glasses. So when I press the button on the glasses, it’ll take a picture. If I double press the button on the glasses, it will change modes. And there’s a mode for smart assistant where once I press it, I can start talking and it’ll do a task for me.

So yeah, if you want to do something like smart assistant, you press the button on the glasses.

Seleste smart glasses: Taking picture. Analyze.

Shubh: And it analyzes and will describe the environment. Scene description is one of our longest because it’s trying to describe everything. Whereas, you know, smart assistant just has to describe that one thing, or scan text is just reading that text.

Jonathan: So it’s analyzing right now? Here we go.

Seleste smart glasses: This image shows a work desk environment with various items.

On the left, there is a green potted plant with variegated leaves placed on the desk.

And beyond the plant, a white office chair can be seen, suggesting an office or similar professional setting.

In the center and right of the frame, a laptop is open and appears to be on a webpage or a user interface with a dark theme, and at least 2 items listed vertically. These items are shaded in green, and are likely to be a part of a menu or inbox as they resemble the style of interface used for email or communication applications. The text on the laptop screen is too small to read clearly. In the bottom left corner of the image, a hand is holding a smartphone with a blue screen displaying some options. The screen shows AI modes at the top and below. 2 options are visible, scan text and text recognition, suggesting the use of an application that utilizes artificial intelligence to process text.

In the foreground, a white charging cable is laid out on the desk, creating a somewhat circular pattern due to its coiled nature. It’s connected to the laptop but not to the phone, implying either the phone has finished charging or it’s not the device currently being charged.

The environment appears to be well lit, possibly with overhead lighting, which suggests an indoor setting that is used for work or study purposes.

Jonathan: That is quite exceptional. [laughs]

What’s the text-to-speech that you have going on there? That’s very nice TTS as well.

Shubh: We changed that just this week, to have it sound a lot more natural. It’s a bit slow. That took a while. But it sounds like a person. I think that’s really cool as well. We’re going to give people options to control speed, and all that.

Jonathan: Are there any other things you want to show us with the glasses?

Shubh: Yeah. So when I took that picture, I was looking kind of in between my laptop, this plant, and my phone. And so if I looked at my laptop, for example, and took the photo, it could read the text on my laptop. I could ask it things.

So there’s a lot of information there. And it didn’t dive deep. It could have dived deeper into specific things.

I could have asked it, what is the species of that plant? What site am I on? Who am I talking to?

Because it was describing my laptop, which is the platform that we’re on right now. Or like, is my laptop charging?

The way that people have been using this technology is just to describe everything around you. And I think that’s cool, and interesting, and fun to play with. But the real benefit and power comes from being able to ask specific questions about all of this information and get specific feedback back to you, and then also being able to have this stuff run in the background, like having it just keep taking photos as you’re moving around, and keep describing what’s happening around you. Kind of like having audio description for a movie, but in, you know, real time in front of you. And so those are the kinds of things that we’re working on building.

Jonathan: I mean, that’s going to be the holy grail, right? If you can get to the point where that sort of ChatGPT style or OpenAI style description is actually available in real time, so that if I’m walking through an airport, for example, and I’ve never been to that particular airport before, I can get guidance in real time about what’s going on, that will be huge. But I’m not sure the computing is quite there yet. Is that a fair assessment?

Shubh: I think we’re really, really close. I mean, we’re going to actually add that feature, exactly like you talked about, you know, as in “When I’m in the airport, give me guidance as I walk around.”, or “Tell me what’s going on as I walk around.” We’re going to add this scan environment feature in the next 2 weeks on the glasses.

Jonathan: [laughs] Okay.

Shubh: Where it is constantly taking photos. I mean, you’re right. What is the latency in this? But when you ask specific questions, you get a much faster latency. What you just saw, the latency for that was the max that it will ever get. You know, like when you ask specific things, it’ll be much faster.

Technology is getting much faster. Like what you just saw, if people have heard of Gemini, which is Google’s version of GPT, that’s already much faster with these image descriptions. So yeah, I think we can get much faster.

And you know, if you have a latency of five seconds, as you’re walking around, that isn’t insane.

This isn’t replacing a cane. If you’re about to hit something right now, in the next second, you obviously use those devices.

But for broader things, like I’m walking around a mall, I’m walking around an airport, it doesn’t need to be every second. If it’s every 5 seconds, you won’t notice that it’s happening. And it works really well. So this technology is here right now.

Jonathan: How do you, or how does the assistive technology industry, I guess, set expectations around this? Because there are times when these LLMs just hallucinate. That’s a fancy way of saying they just make stuff up, and it’s completely wrong. [laughs] And obviously, if you’re walking around an airport or a mall, you know, that could be annoying and potentially even a safety issue.

Shubh: Yeah, absolutely. And hallucination is so much scarier than before, when it’s just like, I just don’t know, you know. Saying I don’t know is so much better than saying the wrong thing very confidently.

And so there’s lots of work that needs to be done for this technology. One base thing that we do is we tell people, and we have this in our terms on our app as well that it will get things wrong. And so if you have things that are really important, like you’re looking at medicine, you’re going to check if this food that you’re holding has something that you’re allergic to, things like that, absolutely get somebody to help you with that. Don’t rely on any AI, really, for things that are that important.

At least don’t use our app to tell you if you should walk down the street or not. I mean, there’s amazing companies like Oko that have been able to do that really well. And hopefully, we can integrate with them.

But um, we try and be really careful. And we try and program that into the assistant as well. So when it is describing something around you, great. But you might try asking it, “Can you read this medicine bottle to me?”, or “Does this have peanut butter? Because I’m allergic.”

We’ve programmed it so that it will say something like, “Hey, I don’t think so. But I’m also not sure. Get a second opinion. This isn’t a question you should be asking me.”, and we’ll try and do as much as possible where we’ll kind of censor it to make sure that it doesn’t give advice that people shouldn’t be taking, the same way that you don’t call Aira to ask them if you should walk across the street. Aira tells you, don’t do that. And it’s the same thing, because even with a person on the other end, things can happen.

Jonathan: You mentioned that the glasses have like a headphone functionality. At least, your website does. So it’s also a music player on top of everything else. Is that correct?

Shubh: Yeah. Honestly, that isn’t too accurate. You can play music through the glasses, but I don’t think you’d want to. It’s mainly for text, and it can play that kind of sound.

But that’s just on the current hardware. We’re releasing new hardware to people on the subscription next year. And so that will have a better speaker, where you can actually use it to listen to music, or hear VoiceOver and things like that.

I mean, you can do that with our glasses. But our glasses right now, just like all the other devices on the market, only have a speaker on one side. It’s not on both sides. And so it’s only in your right ear that you’re gonna be able to hear audio, and that’s not great for hearing music. So we’re upgrading the hardware of the glasses to have a speaker on both sides, and we’re gonna be the first ones to have that once we get those glasses out. But yeah. For right now, it’s great for understanding your environment.

Jonathan: Okay. That could be quite significant because if you’ve got stereo, then you may be able to use the speakers on the glasses to provide some sort of spatial awareness as well about the environment.

Shubh: Yeah, absolutely. We spent a lot of time thinking about that.

I’ve actually been talking to the previous founder of Soundscape, and he was giving us advice about how we can develop this because they did a great job with spatial audio.

So absolutely, we want spatial audio. We wanna be able to tell you objects and make it sound like they’re coming from that direction. And we can do that with 2 speakers that we’ll have in the future.

Jonathan: And do the glasses independently connect to a WiFi network? Where are they getting their internet connection from?

Shubh: When you’re setting up the glasses like when you download the app, you give it the password for the WiFi connection. And then in the future, it’ll just automatically connect to that WiFi, or WiFi that you’ve connected to. So when you go from like your home to the office, it’ll just automatically connect to your office WiFi and back and forth.

And same with hotspot. When you turn on your hotspot on your phone, it’ll connect to that. And so once you kind of set it up, it’ll just automatically connect to a different WiFi.

Jonathan: Right. So you’d be using, say on an iPhone, you’d be using the personal hotspot feature when you’re traveling out and about to get the internet connection?

Shubh: Yeah, yeah, exactly.

Jonathan: iOS and Android at this point?

Shubh: We’re focusing mainly on iOS. We can go to Android soon. We have an old Android app that we haven’t updated in a while. But we’re mainly focused on iOS right now.

Jonathan: And if people want to experience the Seleste Glasses, what’s the current state of play? Can people get on a waiting list, or how does it work?

Shubh: Yeah. As I was mentioning, we previously have had long wait times. We’ve had a lot of pre-orders. But we are expanding that now.

So I think for right now, if people are placing an order, we should be able to get them the glasses within 2 months.

You place an order through our website. The $100 deposit you pay, if you decide you don’t want the glasses, is fully refundable. So you can just email us, and we’ll refund that for you. No questions asked.

And we’re also doing a promotion right now, where the first month when you get the glasses is also free. And so you get the glasses, you can try them out, and see if you like them or not. And if not, you can return them.

Jonathan: Any kind of geo restrictions, or are you shipping worldwide?

Shubh: So on our website, you can only order if you’re in North America. But we’ve been taking orders in the UK and Australia. And so you might be somewhere else, which I suspect a lot of people listening to this podcast may be, because you aren’t in North America.

Jonathan: Yeah. [laughs]

Shubh: So yeah. I think we would ship internationally, but you just have to email us because there’s other things, like it would take longer to ship, if you want to return, that stuff would take longer and so you just have to be aware of that.

We also do free shipping within North America. But internationally, we wouldn’t be able to do that. So yeah, that’s a special thing.

Jonathan: Why Seleste? How did you come up with that name?

Shubh: [laughs] I don’t know. I like the idea of space, and having an angel on your shoulder, things like that, and the word celestial. And so Seleste is just a nice-sounding name that embodies this broader vision that we have for this product, and not just the way people see assistive devices right now, as just a tool. So that was just what I was trying to embody.

Jonathan: It’s really interesting to hear this product being developed, and I look forward to seeing what happens next.

I appreciate you coming on the podcast, and we’ll definitely watch it with interest.

Shubh: Yeah, perfect. Thank you for having me on.

If anyone wants to get more information, you can shoot me an email or go on our website. I don’t know if you can share that some way, but my email is shubh@seleste.co. So that’s S-H-U-B-H at S-E-L-E-S-T-E dot C-O.

Jonathan: And that’s the website that people can go to to find out more information as well. seleste.co. I encourage people to check that out.

So let’s keep in touch, because I’m sure that exciting things will be happening in future for you guys. And all the very best for the future.

Shubh: Yeah, thanks so much. I’ll definitely shoot you an email in 2 weeks, when we have that mode where the system’s able to tell you things as you walk around.

Jonathan: Yeah, brilliant.

Shubh: I’ll keep you updated.

Feedback on the Seleste Glasses, and a Minecraft Accessibility Mod

Hello to Elijah Massey, who says:

“I have never really been very much into gaming. But recently, I found a Mod that makes Minecraft accessible.

For those who are not familiar with it, Minecraft is a game where you can mine and gather materials, craft and build a lot of things, and explore the world. You also have to survive monsters and other hazards.”

Sounds like life in general.

Actually, I do know a lot about Minecraft because my kids love Minecraft. They’ve got their own Minecraft server, and they do all these things.

And I’ve always felt a bit sad that I can’t participate in it with them. I thought that when Microsoft acquired it, things might have moved on a bit further than they have.

Elijah continues:

“The Mod allows you to move the camera or the direction your character is looking using the keyboard, and it will speak which block your character is looking at. It also adds keyboard commands for breaking blocks, placing and interacting with blocks, managing your inventory and crafting, getting other information about your surroundings and some other things.

It also makes sounds for valuable ores, objects you can pick up, or creatures around you. And if you’re wearing headphones, the sound comes from the direction of the object. The sounds of the game itself are also directional, so you can hear where they’re coming from. And it must be surround sound or some kind of spatial audio, because the accuracy is way better than stereo.

I am still learning and practicing. But so far, I have found the game very playable. I’ve even fought a few monsters successfully, although I still have trouble with that sometimes.

The Mod works well on both Windows and Linux, and it is called Minecraft Access or Blind Accessibility on CurseForge.

There are installation instructions on its website. And although it was a little difficult to install, the instructions were very helpful. There are also lists of keyboard commands and other documentation. If you try to install it, make sure you also install the Architectury” (that’s A-R-C-H-I-T-E-C-T-U-R-Y) “Mod, or else it will not work.

It talks through your screen reader, and it works with JAWS, NVDA, and Narrator on Windows, or Speech Dispatcher on Linux.

You can also play on servers just fine without other people having to install the Mod.

I also received my Seleste glasses a few weeks ago. And so far, I am very impressed. I am still figuring some things out. And since they are still in beta, there are some improvements that could be made. But there is a big update to the app coming soon that should make them even better. And after that comes through and I test them more, I will send a review.”

I wish I’d got this email in December, Elijah, because I might have spent some time over my summer break setting up that Minecraft Mod and finding out from my kids what all the fuss is about. It might be a next summer project now, but thank you for that update. It’s great to see gaming becoming more accessible.

[music]

Voiceover: If you’re a member of Living Blindfully plus, thanks for helping to keep the podcast viable.

If you haven’t yet subscribed, why not do it today?

You’ll get access to episodes 3 full days ahead of their release to the public, you’ll get advance notice of some of our interviews so you can have a say in what we ask, and you’ll help keep the podcast viable by helping to fund the team to do their work.

Our guarantee to you is that everyone who works on the podcast is blind or low vision, so we’re keeping it in the community.

Find out more. Visit LivingBlindfully.com/plus. That’s LivingBlindfully.com/P-L-U-S.

Wanting to Learn More About Multiline Braille Displays

“Hi, Jonathan!”, says Christopher Wright.

“I’m curious to learn more about multiline braille displays, particularly how the Orbit Slate interacts with screen readers. I’d be more interested in the Orbit Slate as it’s portable, and can communicate with screen readers.

The Canute is interesting, but it’s not portable and only supports reading properly formatted braille files.

Have you and or other listeners used any of these devices? What’s the reading experience like?

I’m more interested in the Orbit Slate 340 rather than the 520, as you get more cells per line, which seems better for reading. From what I can gather, I have a mental picture of stacking three 40-cell displays on top of each other, though I don’t know if this is completely accurate.

How is the experience with NVDA, VoiceOver and TalkBack? Does the text wrap to the other two lines similar to using a single-line display? How does it handle displaying control information such as dialog box items or webpage elements?

I’ve heard screen readers will need to be updated to take full advantage of this new technology so perhaps, I’m expecting too much.”

Thanks, Christopher!

I’ve not had any direct experience at length with these devices. So if anyone can answer your questions, I’m sure others would appreciate those answers too. opinion@LivingBlindfully.com, or give me a call: 864-60-Mosen, 864-606-6736.

How Can I Advocate to Save Gmail’s basic HTML?

John O’Regan says:

“Hi, Jonathan,

As you know, Google are shutting down Gmail’s basic HTML view soon.

As a veteran of previous campaigns, please advise me on the best way to gain Google’s attention, and tell me how I can persuade them to reconsider.”

I wouldn’t say this is impossible, John, but I would call it a really tall order because Google are notorious for dropping things all the time, and it might be very difficult to get them to take any notice. But there’s no harm in trying.

I believe that there is a blindness-related petition floating around on this subject, so if anyone has more details about that, feel free to share. I’ve done a quick search and can’t locate that petition now, but I do believe I saw something somewhere.

You may be able to get sufficient signatures, and then draw the attention of someone at Google Accessibility.

If you were really campaigning about this, you might also want to reach out to some technology journalists, which can always put some pressure on when issues like this get into the media.

So those are a few tips that I can immediately think of. I don’t know how impactful this is, actually, and I’d be really interested to know.

I don’t use Gmail. If I did use Gmail, I would definitely use it through Microsoft Outlook and my phone rather than on the website, but it should be accessible. It should be an option available to blind people because it’s an option available to sighted people.

I believe the new Gmail is accessible if you go beyond the basic HTML view, and I’m basing this on the fact that I have seen, in the JAWS podcast training feed, a series of modules which describe how a blind person can use the non-basic view of Gmail effectively. I presume that these modules would apply no matter what screen reader you’re using, particularly if you’re in Windows. So you may want, as a precautionary measure just in case any advocacy doesn’t pay off, to check out those training modules and see how you find it.

But as a non-Gmail user, I would say, if you want to comment and educate me on how impactful or otherwise you think the abolition of the basic HTML view will be, then of course, feel free to share. opinion@LivingBlindfully.com, 864-60-Mosen’s the number, 864-606-6736.

Volunteering to be in a Be My Eyes Group Gives People More Than They Bargained For

Pam MacNeill is writing in again and says:

“Hello, Jonathan,

I do hope you had a great time away from all the different work you do for the disability community.”

Oh, it was glorious, mate, glorious! Thank you. Good to be back, though.

“I am wondering”, she says, “if you have had any feedback from others about the latest offering from Be My Eyes, with respect to users being encouraged to build their own groups of volunteers.

3 of my close friends agreed to sign up to Be My Volunteer via the BME app. However, they then started getting calls from random people around the world they don’t know.

I queried this with BME, and was advised that this section of the app is still in beta testing. However, they did not answer the question of whether volunteers in private groups are expected to also volunteer for others globally.

I’ll be interested in the experiences of others, and it would be good to get clarification from BME about the expectations of people joining private groups as volunteers.”

Thank you for the email, Pam.

I reached out to Mike Buckley. He’s CEO of Be My Eyes, and he said,

“Thanks for sharing the question. We’ve heard this feedback from others as well.

As of now, we cannot segment a volunteer and make them only available to their friend or family member.

We will build this eventually. But it’s quite likely that those who have signed up to assist a friend will receive very few calls over the long term.

There are two reasons for that.

The first is that there is such a large volunteer community, that receiving calls is very infrequent.

The second is that when a new volunteer signs up, we bump them up in the call queue, so a new volunteer is more likely to get a call in their first 30 days. We do this to give new volunteers an early experience of assistance. But more likely than not, over time, the likelihood of calls will go down.

I think it’s a better experience to allow volunteers the power to only participate for friends and family if that’s their preference, so we will work on this. It will just take some time.”

Thank you, Mike, for the very comprehensive and helpful reply, as always. I hope that answers your question, Pam.

It would be good to hear from listeners. Are you using this new function of Be My Eyes, and how is it working out for you? Let us know. opinion@LivingBlindfully.com, 864-60-Mosen is the number, 864-606-6736.

Update on New Guide Dog and Technology for Developing Countries

Voice message: Hello everybody, this is George. I hope everyone is doing well. I hope everybody had a fantastic Christmas and New Year.

Some of you may remember me from quite a few episodes ago. I wrote in asking about the possibility of giving easier access to assistive technology to those who live in third world countries. I’ve also been talking about getting a guide dog. So before I commence with the big reason that I decided to send this in, I thought I would give you guys a little update on that.

So the application for my guide dog has been sent in. It has been examined. I did the preliminary interview. And at the time of this recording, I should be told within 2 to 4 days (giving a bit of an educated guess here) whether or not I will be accepted, which is very exciting.

So I just want to say thank you very very much to everybody who has answered all of my questions. Pete from Australia, if I remember correctly, Jonathan and Bonnie, thank you very very much. If I’ve forgotten anybody, please, I do apologize. You know who you are, and your help has very very much been appreciated.

But I do have another question for the guide dog owners who may be listening. I’m going to be doing quite a few things during the week. However, I don’t know whether I’m going to be going out all day, every day. Now, my concern (obviously, maybe it’s a slightly unfounded one) is that I’m going to have to spend 24 hours a day, seven days a week trying to keep a guide dog entertained. Now I know that’s not the case, but I want to ask, what is a good balance between taking a guide dog out and letting it rest at home? Obviously, as I said, I’m going to be quite busy during the week, so I don’t think that is a concern, but I do think it’s a good question to have asked.

So if anyone could answer that question, that would be absolutely phenomenal. Thank you very much.

Now, the major reason that I wanted to send this in is to discuss again the assistive tech issue. In, I think it must have been, the penultimate episode of the year, or perhaps the last one, I’m not sure, Peter from Hungary spoke in his message, first of all, about Marmite, which we also have here. I don’t know whether South African, New Zealand, and British Marmite are quite the same.

But Peter, if you do want something to try, maybe try some biltong. It is, as far as I understand it, dried and cured meat. We love it over here. It is absolutely delicious. You don’t cook it; you just buy it in the packet and eat it. You may have to import it, though. I’m not sure.

Rusks are also quite nice. They’re, I want to say, long, square-shaped biscuits – oblong biscuits – that you traditionally would dip in your coffee or your tea. You shouldn’t eat a rusk dry. I mean, you can if you really want to, but it’s not really something I would recommend. Normally, you dip the rounded end into your tea, hold it there for about 20 seconds, and lift it out, making sure of course that the end doesn’t come off in your tea, because then you’re having soup, not tea. And then, you can take a bite, and it’s lovely when it dissolves in your mouth.

Something else which isn’t necessarily South African is Tim Tams. We love Tim Tams. Anyway, excuse me.

So in terms of the assistive tech issue, Peter was talking about a country license for JAWS. Now, from what I managed to gather from what he said, it is a license that can be provided to anybody in the country. I know Hungary’s got one. I believe Colombia’s got one. I’m not sure.

So I would like to propose to anybody, any assistive tech organizations, but specifically to Freedom Scientific, to create a continental license for the African Union. Now, they could either charge the African Union for it or (and this would of course be the best, although it’s a bit of a pathos argument, so everybody do excuse me) they could fund it themselves.

Now I know what you’re probably thinking. You’re probably thinking, oh but you know, they need to make money. They’re a commercial enterprise. They need to make a profit. And I do understand that. That does make sense.

However, I don’t have data to back this up, but many people in Africa, and in the third world generally, cannot afford to buy themselves a new copy of JAWS. So if a continental license was created to cover the entire AU, just think of all the opportunities that could provide to many, many people.

I mean yes, people have NVDA. Yes, NVDA is fantastic. It is my screen reader of choice.

However, JAWS is good for some things, and NVDA is good for others.

JAWS, from what I can tell, has better support for various Office suites, Microsoft Office, and maybe a few others. And to give access to this product to people who otherwise wouldn’t necessarily be able to buy it, I think would potentially have the chance to lift many people out of poverty.

If Freedom Scientific does not wish to fund it themselves, then would they be willing, and would people be willing to set up a donation portal that anybody could put some money into to attempt to fund a license for the AU?

Now, if they don’t want to cover the entire AU, then just cover the ECOWAS regional block, that’s the Economic Community of West African States to start with. That, I think, would be a good starting point.

Jonathan: Thank you very much, George!

I don’t know what a countrywide license would cost. I don’t actually know what Colombia and Hungary paid for their countrywide license. But it’s a really interesting question that you raise, so I appreciate you throwing that into the mix.

Regarding your guide dog question now. I’ve only ever had one guide dog in my life, and I know many of our listeners have had like 50 million guide dogs. Oh, hyperbole! There are a lot of listeners who’ve had lots of guide dogs, including my amazing darling wife. She’s had, I don’t know, how many has she had? Half a dozen, maybe, over her life?

But I would say this from my experience of being married to a guide dog handler, and being one myself. They kind of just blend in as part of the family. You can’t ignore them, obviously, because they’ve got needs. They’ve got to be taken out to do the park time, or the busy busy, or whatever it’s called, depending on where in the world and what guide dog school you are at.

Dogs like to sleep a lot of the time. I read some amazing statistic about how much of their lives dogs spend asleep, and I thought, well, it’s a nice life.

Some dogs will need to be more active than others. And hopefully, if your guide dog school is doing matching right, they will match you with a dog that has similar requirements in terms of how often they like to get out and about.

Sometimes you might need, if you’ve not been out of the house for a few days, to take them for a walk. In fact, I’d say that would be essential, but that’s good for you, and it’s good for the dog as well. They do blend in.

Dogs like to curl up. I mean Eclipse, Bonnie’s current Seeing Eye dog, is very much bonded to Bonnie. That’s the way it should work. But sometimes when we’re at home, Eclipse will come down to the studio, and she’ll just curl up beside me here, beside the chair, when I’m putting Living Blindfully together. And she’s just happy hearing me yammer away. Glad someone is. And she just sort of sleeps, and dozes, and makes a few noises and stirs from time to time, and she’s perfectly happy.

Problem with My iPhone SE

Hello to Phillip Brown, who says,

“Hi, Jonathan,

I’m not a tech-savvy person, so I hope you can help me.

I have an iPhone SE third generation. I’m currently running iOS 17.3, but the bug started in one of the last updates before 17.3.

When I unlock the phone, VoiceOver stops working normally and speaks items from the rotor when I swipe left and right. None of the normal VoiceOver gestures work. It will not even allow me to answer an incoming call. I have to lock the phone again and unlock it, and maybe VoiceOver will work correctly.

However, using a keyboard like the Hable One or the NLS eReader, VoiceOver works fine. It’s just that the gestures on the screen don’t work. I thought that when I upgraded to 17.3, this would fix the issue, but it didn’t.

I don’t know if this is a bug, or if I may have accidentally done something wrong. Do you have any advice that will help? This is my first time writing, so if it doesn’t make sense, I’m sorry.

I have been listening since mid-2019, and have learned a great deal from you and your listeners.”

Phillip, thank you for listening all this time, and I’m glad you wrote in from Georgia, even though it’s a frustrating reason that’s made you write in.

Based on your description, my best guess is that you’ve got orientation unlocked, so at times you’ve got the phone in portrait mode, and at other times you’ve got it in landscape mode. And that’s why it works sometimes and not other times.

What you want to do is hold the phone upright and see if it comes right, so that it’s working the way you want. Then you want to lock your orientation. To do this, you bring up Control Center. Now, I don’t have an SE, so I don’t know whether they’ve implemented the new Control Center gestures or not. But one thing that is sure to work is this: if you put your finger on the status bar, where it shows the time and how much signal you have from your various network options, and then swipe up with three fingers, you’ll get into Control Center.

Eventually, once you scroll past Wi-Fi, Bluetooth, and some of those settings, you’ll find an item that says Lock Orientation. Make sure the phone is behaving the way you want, so that when you flick right with the phone held in portrait mode, it’s actually doing what you expect. Then double-tap that Lock Orientation option. At that point, it should stay that way, and you won’t have it going into landscape mode when the phone is at a certain angle.

Sound on My Dell Computer

[Jonathan sings] The Byron and the Dell, the Byron and the Dell.

Yes, it’s Byron Sykes. This time, he’s got a problem with his Dell, as you might have gathered. He says:

“Recently, my refurbished Dell from Computers for the Blind has taken the Candid Camera approach when it comes to speech. This program’s theme, paraphrased, said, ‘When it’s least expected, you’re elected. It’s your lucky day. Smile, your computer’s not talking again.’”

I think what he’s saying is his computer stopped speaking intermittently. [laughs]

“Anyway,” he says, “in an effort to fix this, the Microsoft Disability Answer Desk thought we should reinstall the Realtek drivers. That, in turn, brought back the echo effect, making the speech sound like the barker at a fair whose voice bounced off nearby buildings.

What is the secret to fixing this? Somehow, the MS DAD, the Microsoft Disability Answer Desk, wouldn’t fix it this time. They did before. Not sure what the problem was.”

Byron, the fix for this is to press Windows with R to get to the Run dialog, then type mmsys.cpl and press Enter. That’ll take you into the full sound control features of Windows.

If you choose the device that’s causing the problems and go into Properties, you’ll find a checkbox on the Advanced tab to disable all the effects. Check that box and press Enter, and it’ll be back to being a standard sound card without any weird spatial audio stuff going on.
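And if you’d like a way to jump straight to that dialog without going through the Run dialog each time, here’s a minimal sketch, assuming a Windows machine with Python installed. It simply launches the same mmsys.cpl applet we typed above:

# Open the classic Windows Sound control panel,
# the same dialog you reach with Windows+R, then mmsys.cpl.
import subprocess

subprocess.run(["control.exe", "mmsys.cpl"], check=True)

From there, the steps are the same: choose the problem device, go into Properties, and check the option on the Advanced tab that disables the effects.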

Blindness Terminology and New iPhone 15

To Ireland we go for this email from Edvard Navackas. He says:

“Hi, Jonathan,

I have just listened to your latest episode. And firstly, I want to say again that I am not against the word blind.

However, as I may have said in one of my comments, personally, I prefer the term visually impaired, as I do still have light perception, and I count that as something.

The other thing I will mention is that yes, I am aware of NCBI being changed to Vision Ireland. I actually live in Ireland, and I will agree that for some people who can’t see at all, the new name might sound ridiculous, because yeah, why would you include the word Vision in the name of such an organisation?

I also got an iPhone 15 this Christmas, and it is awesome. I don’t really see myself using the Pro features such as the Action Button or door detection, because from what I’ve heard on a podcast and from some research I’ve done, apparently, VoiceOver doesn’t speak which action the button is set to; the only way to find out is to check the setting.

Also, regarding door detection, correct me if I’m wrong, but would I have to take my phone out of my pocket in order to detect a door? Because if I’m walking with my cane, that doesn’t sound safe to do.

Overall, I’m happy with just a regular 15 without those features. I also like the way the low battery alert pops up as a banner instead of a pop-up. To dismiss it, I would swipe up or down, and to turn on Low Power Mode, I would swipe up or down until I hear ‘expand’. I think this sounds cool.”

Good to hear from you, Edvard. Thank you very much for writing in.

Regarding the iPhone 15 Pro Action Button, I think it’s one of the coolest features that Apple have added in a long time. And it is true that you may not get feedback about whether something is on or off, but you can use the Action Button for anything.

What suits my personal use case is that because I listen to so many podcasts, I have the Action Button always playing my podcast app of choice, which at the moment at least is Overcast. I may well get back to Castro if they get their act together.

So it doesn’t matter what app I’m in, where I’m at in my phone. If I just hold down that Action Button, my current podcast starts to play.

I’ve also enhanced this by having a double back tap of the phone take me forward 30 seconds, and a triple back tap take me back 30 seconds. (If you want to set that up, it’s under Settings, Accessibility, Touch, Back Tap.) So essentially, it’s a stand-alone podcast player right in my iPhone, and that is fantastic.

Yes, you are absolutely right. You do need to take your phone out of your pocket, or at least give the camera and the LiDAR sensor an unobscured view, for the phone to tell you when you’re coming within range of a door. I have found it useful in some situations.

I haven’t found it as reliable as I would like, but the people detection feature can be useful as well. For example, when you’re on a bus and you’re looking for a spare seat, it’s kind of cool to be able to walk down the aisle and not have to put your hand on occupied seats to find a spare one. So it does have some uses, but how much value you get from it will vary very much, depending on what you do and how often you’re in situations where it can help.

Enjoy that iPhone. It sounds like you’re happy with the one you have, and that’s the most important thing.

[music]

Advertisement: Transcripts of Living Blindfully are brought to you by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at PneumaSolutions.com. That’s P-N-E-U-M-A solutions dot com.

Closing and Contact Info

I really enjoyed getting a wide range of perspectives from all around the world. Thank you so much! Keep the contributions coming, and we look forward to doing it all again next week.

In the meantime, remember that when you’re out there with your guide dog, you’ve harnessed success. And with your cane, you’re able.

[music]

Voiceover: If you’ve enjoyed this episode of Living Blindfully, please tell your friends and give us a 5 star review. That helps a lot.

If you’d like to submit a comment for possible inclusion in future episodes, be in touch via email. Write it down, or send an audio attachment, to opinion@LivingBlindfully.com. Or phone us. The number in the United States is 864-60-Mosen. That’s 864-606-6736.

[music]