Transcripts of Living Blindfully are made possible by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at http://PneumaSolutions.com.
Voiceover: From Wellington, New Zealand, to the world, it’s the Living Blindfully podcast: living your best life with blindness or low vision. Here is your host, Jonathan Mosen.
As we begin recording this episode, the curtain’s just come down on Apple’s Worldwide Developers Conference keynote. Tim Cook and his team had a lot to say with much anticipated new hardware and updates to Apple’s operating systems. Our panel’s here to make sense of it all from a blindness perspective.
Let me welcome you briefly to episode 233. And there is no area code 233 in the United States, as far as I can find. So there you go. It could be yours one day.
There is a country code 233, and it belongs to Ghana, where there are over 31 million people. If you are one of them, enjoy the moment. Enjoy the recognition from the Living Blindfully podcast.
Our standard panel is here. We have Judy Dixon, Mike Feir, and Heidi Taylor, who’s not in the same studio as me. So welcome to all of you.
Judy: Welcome. Hello, Jonathan!
Jonathan: No one can say that this was a light WWDC, right? I mean, this was probably one of the most juicy WWDC keynotes there’s been in a long, long time.
Judy: It’s amazing!
Heidi: It was cool. Like for most of it, I was like, “Uh, there’s some good nuggets, but it’s not great overall.” And then, it got really good. [laughs]
Jonathan: Yeah. All right. So what we’re going to do is break with tradition because we normally just go methodically through in order.
But I think, the Apple Vision Pro announcement, in my view, is the biggest thing Apple has announced since iPhone itself. Would everyone agree with that, or am I overstating it?
Judy: To some degree, I think that remains to be seen. At least, I mean, for everybody, you’re probably right. But for blind people, we don’t know.
Jonathan: Right. This is a thing.
Heidi: Blind people are not part of everybody?
Judy: Well, they’re part of everybody, but they aren’t everybody.
Jonathan: Let’s talk about this. I don’t know much about some of the other products in this space, so I can’t tell how revolutionary all of this is. So I might hand over to you, Heidi, to perhaps describe Apple Vision Pro visually, and maybe if you’ve seen other products that are in this reality space, whether you think it’s a major step forward?
Heidi: Okay. Well unfortunately, I’m not very familiar with this category.
Heidi: I know most products focus on virtual reality instead of augmented reality, so that’s a big difference.
Jonathan: And for those who aren’t familiar, to be honest, like me,
Jonathan: how do you distinguish between virtual reality and augmented reality?
Heidi: With virtual reality, what you see is completely generated. It’s not where you are. So you’re immersed in a space that’s been generated for you.
Whereas augmented reality includes your actual surroundings, and adds the virtual elements into the physical surroundings.
So like, examples are big windows, like a Safari window that looks like it’s floating over your coffee table, instead of you being inside a Matrix-like space where it’s just all windows and nothing of your actual surroundings.
Jonathan: They have done some of this on iPhone in the past, haven’t they, the augmented reality thing? So you can go and shop, and you can sort of imagine how things are going to look in your living room.
Heidi: Yeah. Yeah, they have. So this is just like the end product of all that work, I think.
Jonathan: Alright. So let’s look at what it actually is. I imagine this thing that straps to your head that’s kind of like a slick piece of glass.
Heidi: Uh, Yep. That’s sort of the vibe. Have you ever seen like goggles people wear when they go snowboarding?
Heidi: Okay, so this is the sort of vibe we’re getting. So it’s a big panel that covers sort of the eyes.
Judy: Is it curved?
Heidi: It’s curved to like contour to the face. It protrudes from the face maybe an inch or so. It’s hard to tell from the pictures. That’s so they can get all the senses and stuff in there. And then, it’s got like a big strap that straps around the back of your head.
Judy: Where do the ear audio pieces go? Are they on the strap?
Heidi: They’re on a strap part that’s actually part of the main interface. They don’t sit on your ears. They sit like on your head near your ears, which is interesting. So I assume it’s like bone vibration sort of vibes, rather than direct in the ear canal.
Jonathan: Oh, so it might be bone conducting. It’s not bone conducting, is it?
Heidi: I don’t know, but they weren’t in the ears.
Judy: They also have air conduction, so it could be that.
Mike: It must include both because I can’t see how you could get the spatial audio they’re talking about. You can’t do that with bone conduction. Like there’d be no way to, unless you had like, I guess, elements all around the ear. I suppose something like that could work.
Judy: Two different times, they mentioned using AirPods.
Heidi: Yes. So you can use AirPods with it, which I assume is how you get the spatial sound. But it’s hard to know because, yeah, you could use AirPods or the built-in speaker bits.
Jonathan: Right. And my immediate question was, is this going to work with made for iPhone hearing aids, of course?
Judy: [laughs] Yes.
Jonathan: Yeah. And there’ll be a lot of people interested in that.
So then, you’ve got these changeable headbands. It kind of reminded me a bit of the Apple Watch where you can personalize the band a bit for your personality. Is that a similar thing?
Heidi: I think it’s less for personality and more for just fit.
Jonathan: Right. OK.
Heidi: They only showed the one style, so it may be that down the line, there are more styles available.
Heidi: Whereas right now, it’s just different sizes.
Jonathan: And the thing about this device is that this is Tim Cook’s famous post-iPhone era he’s trying for here. This is a very big play on Apple’s part. I’m going to be monitoring my Apple stock with considerable interest over the next few days. [laughs]
So my idea is that when you’re wearing this as a sighted person, suddenly, even though you’ve just got this tiny thing on your head, or relatively tiny thing, it appears like you’ve got a big screen because when you watch things, it’s like they appear in your environment.
So there was a pretty impressive presentation from Disney there where the Disney characters, it seems to a sighted person like they’re actually in your living room.
Heidi: Yes. Yeah. They did do some things with that. Yeah. Like I think they had an example of Mickey, and he was like dancing on a chair or something, like so they can lock to actual objects in your environment. So it looks like they’re on them.
Jonathan: And the audio description said that one of the characters actually tapped you on the wrist at one point. So it’s incredibly immersive from a visual point of view.
And I actually got a text message. I was down here taking notes. I got a text message from Bonnie saying this thing’s kind of freaking me out [laughs] because it is quite a big change.
I guess the question is, what does it mean for blind people? And I’m not overly worried about whether this thing is going to have accessibility features. I think even though people are a bit grumpy with Apple right now for a very buggy iOS 16, I don’t think they will have left us out in the cold entirely. And I think there are some quite interesting use cases, particularly for apps like Aira. If third-party apps can get access to all of this, you could have some incredible visual interpretation.
I don’t recall them mentioning specifically that LiDAR is in there but it would be, wouldn’t it?
Judy: They didn’t mention LiDAR. But what I’m wondering is for $3,499, what are we paying for that we can’t use?
Mike: It did say that there were something like 12 sensors.
Judy: 12 cameras, 5 sensors, and 6 microphones.
Mike: Yeah. So if one of those sensors would be for distance, you’d need to be able to really accurately measure distance to do some of the image placements and things.
Judy: [laughs] Oh boy!
Heidi: Okay. So it does have a LiDAR.
Jonathan: It does? Okay.
Heidi: And 2 true-depth cameras.
Jonathan: Okay. So this is really interesting because of the LiDAR features announced at Global Accessibility Awareness Day and in that release, and the idea that you can walk up to different appliances, say touchscreen appliances, or even buttons on a device you’re not familiar with, and have this thing tell you precisely what it’s doing. You’ve got people detection. You’ve got door detection. If you’re walking around with this thing and it’s giving you all of that, it could really be a companion when you’re hooning around, getting on the bus, doing day-to-day things, trying to find a door in an unfamiliar place.
Judy: It’d be nice not to have to hold your iPhone up.
Jonathan: Yeah. I mean, you look a bit sort of dweebish holding your phone out doing those things, don’t you?
Jonathan: You get on a bus and you’re trying to find a spare seat, and you’re holding your iPhone out in front of you, and you kind of think okay. I mean, nobody cares anymore about what geeky things people do, but that’s a little bit ostentatious and potentially risky to be just holding your phone out in front of you like that, scanning for a seat.
So, I mean, it’s an enormously high price tag, $3,499 US, and they’re doing an Optima (sorry, Optima), pitching it now but not selling it until early next year. So we have to wait. And even then, it’s only available in the United States initially.
Judy: I wonder if we looked at the program for all of WWDC, if there’s going to be a session on the accessibility features in it.
Jonathan: For those listening to this, we are recording this immediately after the keynote. And so I would encourage people to check the Apple Accessibility website, where there’s more coming on Apple Vision Pro and how it works, and what VoiceOver on this thing is like. Presumably, there is VoiceOver on this thing.
That’d be interesting, because one of the things you do to control it is use your eyes, but you can also use your hands. So I imagine that you might just hold your hand out in front of you and use the VoiceOver gestures that we’re all familiar with. Would that be possible?
Heidi: I think so. They showed a lot of gestures. It’s got cameras that point in different directions to see your hands, like facing straight down, so your hands can be in your lap while you’re looking forward, for example. And a lot of the gestures they used had the thumb and the pointer finger. So you’d hold them together, and then scroll like that, so moving your hands doesn’t accidentally trigger it. But there’s not a lot of information yet about that.
Jonathan: I mean, it would be amazing if I can perform, say, a two finger double tap on Bonnie’s knee and mute her.
Oh no, no, no. This is a bit beyond that. Yeah.
So this is going to be interesting. I mean, look, this stuff excites me because it’s been a very long time since we’ve had a whole new category of device like this, something very, very new. For the last many years, it’s been bigger, better, more powerful, faster, all that kind of stuff. But this is something very, very exciting!
Mike: I can certainly imagine a lot. Like, they could either do gestures that mimic the VoiceOver taps and just see where your fingers are pointing, or they could go another direction and have a totally different sort of gestures that might be more natural. Now, whether they’re natural to sighted versus blind people, that’s another thing that they’re going to have to think through pretty carefully.
And then, there’s typing, of course. If it can see exactly where your fingers are, you don’t need a keyboard. You could just type in the air. I wonder how well that will work. [laughs]
Jonathan: Yeah. So how was that working visually, Heidi? You can do dictation on this thing, but then is there kind of a virtual keyboard in the air as well?
Heidi: Yeah. So they showed a virtual keyboard. So it just sort of like appeared, and the guy was typing on it, even though it wasn’t there. So I assume, as long as you could like line it up with a gesture or something, so like it knew where your fingers were, you should be able to just type on a virtual keyboard.
Jonathan: So you’d have to have an understanding of that. Well, could you choose the size of the keys then, potentially? Like you could potentially determine how big your keyboard is, which may have some advantages.
Heidi: I don’t know, but that would be cool.
Heidi: They also showed that you could just use like a Bluetooth keyboard with it as well.
Jonathan: Well, how very sort of 2022?
Heidi: Yes. But just as another thing.
Jonathan: Yeah, alright. What else can we say about this? There are many more questions than answers.
Can I get your impression of the device that has to go in your pocket? Actually, it reminded me of the Horizon smart glasses. [laughs]
Judy: Yes. With a flexible cable.
Jonathan: [laughs] Yeah. For those who didn’t use that, you had a pair of glasses lightweight on your head, but then you had a cable going to a Samsung device that you’d have to put on your person somewhere.
Judy: It wasn’t all that flexible. [laughs]
Jonathan: Right, right. So what’s this like, Heidi? Did you get a visual sense of that?
Heidi: Essentially, there’s a fun new-style port on the side of the device, and then it has the fun cable coming off it. And then, it’s got like a battery bank was what I saw, like the size of an iPhone. And then, they just slipped it in their pocket, like their back pocket, or whatever.
Jonathan: Yeah, yeah. So it’s a bit unwieldy. This has always been a challenge for headsets that you don’t want all that bulk, particularly when it comes to wireless devices.
So presumably, this thing has Wi-Fi, does it? I mean, where’s it getting its connectivity from? They didn’t mention whether it had Wi-Fi, or cellular, or anything like that.
Heidi: No, it didn’t.
Mike: Not that I heard.
Heidi: Oh, back on the speakers. Sorry, I’m just doing research as we talk.
Heidi: It’s not a conduction speaker. It’s like a down-facing speaker, because it points towards your ear, but it’s not in your ear.
Jonathan: Right. Well, it’s a lot of money, and I will be interested to see who gets this in the blind community to have a play. You up for it, Judy?
Judy: Yes. [laughs]
Jonathan: [laughs] Me, too. Yeah.
Judy: Yeah. If there’s anything we can do with it.
Jonathan: I want to be on the ground floor with this thing. Although I won’t be able to be, of course, because it’s only available in the US initially.
Mike: In the US, yeah.
Judy: And other countries soon thereafter.
Jonathan: Yes. Yeah.
Mike: The first thought I had with this is this would be great for anyone with low vision, you’d think, because there’s so much there to customize and it would keep the text really sharp. Like you could do so much resizing and customizing that surely, that would be a benefit, you’d think.
Jonathan: Hmm. Because you can use corrective lenses as well. And then, you can presumably make a bit of the screen as large as you want, if the gestures exist to do that. I mean, it could be a huge deal for people with low vision.
Jonathan: Yeah. So it’ll be interesting to see where this all goes.
Optic ID is another thing that caught my attention for people with prosthetic eyes, or who, for some reason, don’t have eyes that focus too well. This is, I guess, another version of Face ID, except it’s much more personal. They say even identical twins have different whatever they’re measuring in the eye. So I presume there will be a way to work around that, just like there was a way to turn off attention mode with Face ID.
Heidi: Yeah. I wonder how it would work for people like you, Dad, who have cataracts.
Heidi: Like you can’t see the iris at all. I wonder if it would still work sort of thing.
Jonathan: Well, I hope it doesn’t lock me out or I’ll be sending it back, I tell you.
Judy: I would think not. But those of us who have completely artificial eyes, it’s not going to work either. So they need to come up with something because you’re right, Jonathan, they certainly managed to do it for face ID.
Jonathan: Yeah. I have confidence that we won’t be locked out of this completely.
I think, what will be telling is what will third-party apps do with this? What can we do with all the LiDAR functions? That kind of thing. But I think it has some promise. I guess the jury’s going to be out on is it 3 and a half thousand US dollars worth of promise? I mean, gosh! That’s going to be like well over $5,000 in New Zealand, probably close to $6,000 New Zealand. That’s an awful lot to pay.
Judy: Maybe we can buy it without the display.
Jonathan: [laughs] Yeah, right.
This is obviously a new era and this is version 1, so you are going to have some challenges, I’m sure, with this first device, and they’ll keep improving on it. But it’s a bold play from Apple.
And what was interesting is they have a lot of confidence in this thing, obviously. Clearly, they’ve got some of the best marketing in the world, and they spent a lot of time thinking about how to sell this.
But I found it interesting. The rest of the world is talking about large language models, and they’re going AI-crazy, and Apple’s just doing its own thing. They weren’t sucked into that today.
Judy: Didn’t even mention it.
Jonathan: No, they didn’t.
Anything else on Vision Pro before we move on?
Right. Well, we will watch it with interest. We really will. I’m sure there’ll be a lot more information in the coming days and weeks.
Let’s go back to the beginning then, and talk about new Macs.
There’s a 15-inch MacBook Air this time, with the M2 processor. It’s the world’s thinnest 15-inch laptop with 2 Thunderbolt ports and a headphone jack. What an innovative feature!
Jonathan: Immersive spatial audio, 18 hours of battery life. And it’s available next week, so you don’t have to kind of hold your breath like you do with Vision Pro.
There’s a completion here today of the transition to Apple Silicon. It’s been a very carefully managed transition. And now, all the Macs, including Mac Studio and the Mac Pro, which just sounds absolutely eye-watering in terms of all that it can do, have been transitioned. They’re lovely machines, with lots of lovely battery life, and they’ve really set the benchmark that laptops with Intel and AMD chips need to emulate.
It’s just a shame I still feel concerned about the quality of VoiceOver on Mac.
Any thoughts from anybody on the Macs? You want one, Heidi?
Heidi: I mean, I’ve been wanting a Mac since my Mac died, but I can’t justify it. [laughs]
Jonathan: Right. You didn’t choose a Mac when it died because of the butterfly keyboard. And I see Apple has now settled a lawsuit over the butterfly keyboard because they agree it was dodgy.
Heidi: Yeah, it was a really bad keyboard at the time, so I couldn’t justify getting a new Mac.
But now, I see all the cool new M2 Macs and stuff, and it’s just like, oh, so tempting, but don’t have the money.
Jonathan: And you’re in the ecosystem, that’s the thing. I mean, with things like handoff and continuity, all that kind of stuff, it’s a very attractive proposition. Apple just makes everything so seamless.
You know, now there is this new app. What do they call it? Phone Link in Windows, that lets you pair your iPhone to your PC, and you can do iMessages, sort of, and get your notifications on your PC. And I really like that when I’m working around my ThinkPad.
Judy: Have you tried it?
Jonathan: Yeah, yeah. It’s great because I can be working around my ThinkPad, something plugged in to the headphone jack, and I get my notifications spoken by JAWS.
Judy: Ooh, that’s exciting.
Jonathan: It is, but it’s got limitations. It’s nowhere near as cool as the handoff feature on the Mac. But it’s something. So that’s built into Windows now, for people who want to check out Phone Link.
So that’s the Mac. I guess it’s typical WWDC, really bigger, better, faster. Not too much else to say about that. But we’ll come back to Mac OS when we get there.
So then we talked about iOS 17. And who knew? The iPhone actually can make old-fashioned phone calls. I’d forgotten that it even did that.
Yeah. So now, we have some improvements to the phone app. I can’t even remember the last time there were improvements to the phone app, apart from banners and notifications.
So the first thing is these personal contact posters that you can now make. So not only will people see these (those who can see them) when you’ve designed a poster for yourself and you make a call in the Phone app. It also works with CallKit. So that means that when you make a WhatsApp call, or a Signal call, or use something else that supports CallKit, people will get this as well.
Heidi, I feel like we need to rely on you here to talk about these posters, and what you can do, and how you design them.
Heidi: So the design process feels very similar to customizing the lock screen, as was introduced in iOS 16. So you choose your photo, and you can type out what name you want it to display, whether it’s like your full name or a nickname, and you choose the fonts, and the background colors, and things like that. And then really, it’s just a pretty picture that people see.
Jonathan: Okay. So how would you do that, making it your own in terms of a blind person? You specify, I guess, you’ll be able to scroll through color palettes, different things like that.
Judy: Does it have information on it?
Heidi: No, it’s just something to look at.
Heidi: It’s not something useful. It’s not useful outside of like, seeing the name and a picture of the person.
Jonathan: In the good old days, if somebody called and you were expecting someone to call that you didn’t want to hear from necessarily, you would be sitting there and the phone would ring for say, 4 or 5 times. And then, your answering machine would pick up and eventually, you’d hear beep. And then, you could hear the person on the speaker who was calling.
And if it wasn’t the person you were trying to avoid, you could just pick the phone up and say hi, and the answer phone would switch off.
And now, what’s old is new again because that’s exactly what we’ve got with this new thing called live voicemail.
Jonathan: So you get a transcription in real time as they speak. And then if you decide that they’re worth talking to, then you can actually interrupt and answer the call. That’s cool!
Jonathan: Yeah. I wonder whether that will work everywhere, because some carriers don’t support visual voicemail. And I wonder whether this is all built into the visual voicemail type platforms. We don’t have any carriers in New Zealand that do visual voicemail, so I’ll be intrigued by the rollout of this and where it’s available.
Mike: Oh wow! I’ve had it for, geez, years and years now. It’s so convenient.
Judy: Yeah, so have I.
Mike: Now the transcripts, I can see, you know, that should work well with VoiceOver. I guess, it’s coming to FaceTime, too. You’ll be able to leave messages the same way. Like, people can get them later, if they miss your call. So that I can see as being pretty cool.
Jonathan: I don’t know whether it was visually clear or not, but I wondered about when you leave a message from FaceTime, which is a very welcome feature. In the past, it just times out, and if you wanted, you could send a message separately. But I wonder whether that uses iMessage. You just record a message at that point, and it gets sent as an iMessage.
Judy: Or is there actually a voicemail now in FaceTime?
Jonathan: Record a greeting, yeah.
Heidi: It didn’t show where you’d receive them, but it is a video message that you record.
Heidi: At least, with what they showed with just FaceTime. They didn’t do a FaceTime audio example.
Jonathan: Right. So that’s good. That’s a bit of a gap in Apple’s offerings that they’ve now fixed the ability to leave messages.
And obviously, the transcription feature is quite nice for those with hearing impairments as well, who may like to know who is calling before they pick up. Because sometimes, a hearing impaired person will pick up and somebody will introduce themselves in a mumbly, garbled way, and you don’t know who you’re talking to still. So with this transcription feature, that’s a lovely accessibility touch as well.
The messages app. We’ve got more powerful search, which will be great because I have to tell you, I have had experiences where I’ve searched for things that I knew I’d messaged in an iMessage, and I just haven’t been able to find it very easily. So that’s good.
Do we see much about that user interface, Heidi?
Heidi: So it’s just using the same search field that’s already in iMessages, I think. But what you can do is you can add multiple terms. So you could add someone’s name, and then a specific topic. So like you could search for Bonnie and dog or something, to get just messages where she mentions dogs or things like that.
Jonathan: OK, right. So that’s good. So a more complex search.
Jonathan: You can jump to the first message that you haven’t seen yet in a group chat. That is very, very handy.
Mike: I like that. Yeah.
Judy: Yes.
Jonathan: Yeah. It can be very difficult to scroll backwards and try and remember what you’ve read and what you haven’t.
What else is there?
Oh, the transcription of audio messages. So again, you know, a really good accessibility addition here. So for people who find it difficult to hear audio messages, and maybe they’re in a group chat, then this transcription feature will be available.
The check-in feature. This is another cool little one. You can check in with a family member or friend to let them know when you get home safely. I guess this works so you check in with them at the beginning of your journey, and then you can track it to the end. Is that how it works, Heidi?
Heidi: Pretty much, yeah. So you add the check-in. It’s like a widget-y, fun thing inside the message. And you add your check-in, and you say what time you expect to be in what location. So that person knows what you expect.
And then when you actually get there, it notifies them that you’ve made it. And it also notifies them if you don’t make it and you don’t update the check-in.
Jonathan: Good for the parents of teenage kids as well.
Does anybody here use stickers?
Jonathan: Oh, that makes me feel better.
Do you use stickers, Heidi?
Jonathan: You don’t? Okay. I thought it might be a cool kids thing.
Heidi: Well, I’m not a cool kid.
Jonathan: Yes, you are.
Jonathan: So new sticker experience.
Every time I talk to people on software these days, they always talk about the “experience” of whatever. So this is the buzzword.
You can react with messages to a sticker. You can basically turn anything into a sticker.
Why? I mean, why? Why? What’s the point of this?
Heidi: Because it’s what the cool kids are doing, I guess?
Jonathan: Okay then. Right. Enough said. Enough said.
Okay. So you can take a picture of yourself, and then you turn it into a sticker. And then, what do you do with it? I mean, you just put it in the message just because?
Heidi: So the way stickers work is you can put them on top of other people’s messages. It’s like they’re like reactions, but not quite the same as reactions.
But now, you can make stickers a reaction. So there’s that, too.
Jonathan: Right. Okay.
Mike: Well, the one thing I got with that bit of section with the iMessages too, is that it sounds like it’ll be a bit more organized. Like the apps and the messages, they’ll be sort of neatly hidden away until you really need them. And it kind of sounded like they updated that a bit.
Judy: That’s actually good.
Heidi: Yeah. So now, they’re hidden away under a single button next to the text field, I think, instead of being a big ribbon of apps as they’re currently displayed.
Jonathan: That’s good because it is quite time consuming to flick through them if you have them visible, so I tend not to.
Jonathan: But one real regression (well, to me it’s a regression; I guess they just consider it a feature change in iOS 16) that I didn’t like was that they took the record button away generally, and they kind of made it an app in the long, long list of apps unless you had received an audio message. And then the record button was there to reply. And I found that quite unwieldy.
Alright, let’s talk about AirDrop. Now, I remember in the very early days of iPhone, there was an app called Bump. Does anyone else remember Bump?
Judy: I do. Yes.
Jonathan: Yeah, yeah. Is that still around?
Judy: I don’t think so.
Jonathan: The idea of Bump was that you would bump your iPhones together.
Judy: Your other phone.
Judy: And then, that would transfer something.
Jonathan: Yeah. Transfer your contacts, I think, predominantly.
Jonathan: Now, AirDrop has Sherlocked Bump.
Judy: I love it!
Jonathan: I don’t know whether Bump’s in the front or not.
So the idea is that you can now easily exchange contact information, and it works with the Apple Watch as well. Did you see much about this, Heidi, in terms of I guess it just does what it says on the tin, right?
Heidi: Yeah. You bring the devices close together, and then you can transfer stuff. That’s right.
Judy: Do you think you have to touch them to each other, or you just bring them close?
Heidi: It didn’t have them touching. It just had them like, really close.
Jonathan: It’s in the share sheet, I presume, somewhere there that you would initiate this.
Heidi: I think so.
Heidi: It wasn’t entirely clear.
Mike: And then if you move out of range, it just continues with internet transfer. I like that idea. If you start something and you’re not quite finished, and you move out of range, it’s not a lost process by the sound of it.
Jonathan: Yeah. And it’s good that they’ve made SharePlay aware of AirDrop as well, so that if, for example, you’re watching a stream with one significant other and you just want to share that with them, then you can just use AirDrop, and that sounds a lot easier to get that SharePlay going.
Now, there is a change to report with respect to input in iOS 17. AutoCorrect is more accurate. I thought that was a bit edgy of Craig Federighi to talk about the ducking.
He’s good. I always give a little cheer when Craig comes on because he’s quite a good presenter.
Although, one of the things I like is the slickness of the presentations they have now, and the fact that they’re audio-described. But it takes all the unpredictability completely away, doesn’t it? Because every so often, something would go wrong on stage, you know. And now, obviously, that can’t happen.
Jonathan: But AutoCorrect is more accurate. You’ve got sentence-level AutoCorrection, so that fixes more types of grammatical mistakes. That’s a really cool thing.
Heidi: Yeah, like using the wrong kind of your.
Jonathan: Right. I wonder if it’ll go as far as correcting when people use the word I when they should be using me, like “between you and I”, which should be “between you and me”. That’s one of my pet peeves. But they might get in trouble if they start correcting that one.
What else have we got?
Now, dictation apparently has a new speech recognition model to make it more accurate. Hallelujah!
Mike: Yes. [laughs]
Jonathan: Because there are a lot of people who use dictation, and appear not to want to correct their dictation errors.
Judy: Oh, yes.
Jonathan: So this will be interesting to try out. And the developer beta is available today, so I’ll put that in my test machine and have a play.
Mike: I like the dictation correction. I’ll certainly, probably make extensive use of that. But I’ve always basically had autocorrect mainly off. I just trust my own typing and the spell check. I’d rather do that than autocorrect and have it automagically start, you know, sixing up my millables and langing my, you know. [laughs]
Judy: I also keep it off because I don’t want it to think it knows more than it might.
Jonathan: Yes, I do, too. I don’t use autocorrect very much.
I hope that they don’t further break Braille screen input because for the last few releases of iOS, it sounds like when you’re Brailling away on Braille screen input, there’s some sort of bubble that pops up. It sounds like it’s an autocorrection type noise to me, but I don’t know what’s causing it and what it’s doing. But sometimes, you can even be popped right out of Braille screen input. And sometimes, it gets its grade of Braille wrong, and all sorts of things. So I hope they address that.
Mike: Well, maybe this new transformer engine will work on that. [laughs]
Jonathan: Yeah, dude.
Now, you got me into the Day One Journal app, Mike.
Jonathan: And, I wish I had a button to press. But Day One Journal, you have been Sherlocked. Yes.
Here we go. This is a new journaling app.
Now, Mike, are you excited about this? Would you switch from Day One?
Mike: Absolutely, because this keeps everything on your device. And I trust iCloud enough to trust it with anything I would journal to keep it safe and secure. So I have good vibes about that.
It’s built in, so you’re not going to have to pay for a subscription. And it sounds like there’s all sorts of things it ties in: photos, people, locations. It gets data from your phone, from your activities, and suggests topics. I’ll be very interested to see what it makes of me in terms of that, and what it might suggest for me to write about prompt-wise. So that, I think, could really help.
I’m glad there is that focus on kind of the mental health aspect. I think that could be of serious benefit to people, keeping track of moments in life and having it learn from you, make suggestions, try to be helpful. I’ll be very interested to see what Apple incorporates.
Judy: But is it going to look at your text messages? If you wrote to someone and said, “I had a really bad day,” will it ask, “Would you like to write about your bad day?”
Mike: Well, I did notice that they were very clear to say that if you don’t want some aspect of life monitored, you can exclude it from what it uses to make suggestions.
Mike: So you have that power. And everything is on your device. So nothing, in theory, is going back to Apple.
Jonathan: I keep a gratitude journal, and I do write in that every day. I absolutely require myself to write down 10 things that I’m grateful for today, and I find that very grounding and very helpful.
But then, I also keep a sort of more long-form journal. And every so often, life gets in the way. And I find that a few weeks have gone by and I haven’t written in that big journal to chronicle my very mundane, boring life.
And so I think if there were these prompts that popped up, or you could simply just add certain events to the journal so there was some degree of AI involved and it chronicled it, it would probably incentivize you to use the journal more.
For me, a key thing is going to be, can I import all the stuff I’ve written in Day One?
Mike: [laughs] Well, it has good export features. So theoretically, you’d think that would be possible to do. I guess we’ll find out.
It might be like the podcast app, where they don’t sort of let you import/export stuff. I hope they don’t take that approach, especially if they’re Sherlocking Day One.
Mike: There are going to be loads of people just like me who will want to try it just to see how it stacks up, if nothing else.
Jonathan: Yeah. I’d be really intrigued to play with this journal app, and it’s one of the first things I’ll check out once I install the developer beta.
The first thing is always looking to see whether they’ve changed anything in VoiceOver. I mean, nobody was expecting Eloquence last year.
Mike: No, that’s true. [laughs]
Jonathan: And there it was. So who knows what else might be lurking about in VoiceOver?
So that’s the journal app. Really looking forward to it.
There’s a standby mode that kind of sounds the same as the Apple Watch standby mode, where you turn the thing on its side. And I actually always turn mine off. But yeah, I guess it’s designed to be used so that if you wake up in the night and you’re sighted, and you can… well, if you woke up in the night sighted and you were blind, that would probably keep you awake.
Mike: That would wake you up really well.
Jonathan: But anyway, if you’re permanently sighted and you wake up in the night, you can look at the phone on its side because you turn it on its side when it’s charging. And particularly with the new iPhone model starting with the, what are we up to, 14? You’ve got the always on display.
So I can see that’s quite useful. You can use it as a clock. You can put widgets on there.
Any thoughts on that, Heidi? Live activities?
Heidi: I think you covered it. It feels like you covered it.
Jonathan: Yeah, okay. Yeah.
Mike: I don’t see how that… You wouldn’t want it working with VoiceOver and announcing stuff at regular intervals, or something like that. [laughs]
Jonathan: No, this one’s a highly visual feature.
Jonathan: But it might be nice for those who just wake up and just want to check the time. I do that with my watch. I sleep with my watch on because it’s monitoring my sleep and everything. But every so often, when I wake up, I find that when you’re in sleep mode, the ability to tap for the haptic time or taptic time, whatever they call it, is unreliable in sleep mode. And sometimes, it speaks when I want vibrations.
Mike: The tapping? Yeah, I’ve found that as well. [laughs]
Jonathan: Yeah. Now, how do we explain this without breaking the world? The H-E-Y bit of the H-E-Y Siri command is being deleted. And from now on, all you have to do to summon your personal assistant is say, Siri. I think this is a terrible idea.
Judy: This is a bad idea. Yeah. [laughs]
Judy: Why would they be doing that?
Jonathan: Because now, whenever I’m talking about Siri in my podcast, I’m going to have to give it another name. So I’ll find one. I mean, we’ve got Soup Drinker for the other one.
Jonathan: I’ll find one for Siri. But I don’t know why. You could easily set it off. Say you’re walking with your cane, right? And you accidentally tap somebody, or bump into them, or whatever. You say sorry, or something. And it sounds to the phone like Siri.
And next thing you know, you’ve got your phone waffling away at you. What’s the point of this?
Judy: You’ve got to wonder what problem they’re trying to solve.
Jonathan: Yes. Yeah, precisely. So I don’t know.
Anyway, iPadOS is coming to the number 17, iPadOS 17. And I have to say, it’s kind of a catch-up release, isn’t it?
Jonathan: We’ve got widgets. Well, iOS has had widgets for a while.
The Health app finally comes to the iPad. It seems to be a glaring omission to have left Health off the iPad for so long.
Improved editing of PDFs. I don’t know what the accessibility implications of that might be.
It looked pretty good. Like, it could automatically detect fields and detect the label of the fields, and then auto-fill the information. So if it said name, it could fill in Jonathan Mosen in the field, like on the line.
Heidi: Yeah. And it said it worked with scanned documents, as well as properly accessible generated ones.
Jonathan: Okay. That could be huge potentially then, for filling in forms. Now, I guess the key will be whether you, as a blind person, can independently verify that the fields are what they should be.
Heidi: Yeah, that would be the fun part.
Jonathan: Yeah, yeah. Interesting.
That was about it with the iPad, wasn’t it? Basically, we’re catching up on putting things in the iPad that we’ve had in the phone for a wee while.
Heidi: It’s been like that for a while, though. iPad’s been a year behind on a lot of features for a while.
Jonathan: You invested in an iPad, didn’t you?
Heidi: I did. I love my iPad.
Jonathan: Yeah, yeah.
Heidi: I use it all the time.
Jonathan: What do you use it for?
Heidi: I use it for all sorts of stuff. I play games on it, I watch movies and TV shows, I’ve been drawing up floor plans of potential houses I’m looking at, I filled out PDF documents. I use it for all sorts of things.
Jonathan: Okay. So essentially, it’s taking the place of your Apple Vision Pro headset until you get one.
Heidi: Yeah. It’s like that.
Jonathan: Mike, you’re not an iPad fan, are you? You ditched your iPad.
Mike: Yeah, I really had no way to really use it productively. My spatial sense is, for some reason, the iPad is just big enough to throw me. I haven’t really tried a mini. That would be interesting. It might be small enough. But I’m not willing to spend the money to chance it. The phone kind of does everything I need it to do.
Jonathan: It’s about the size of an iPad mini now.
Mike: Oh, I hope not.
Judy: Not much difference between the Max and a mini.
Jonathan: Yeah, yeah, yeah. Exactly.
How do you feel about yours, Judy?
Judy: I don’t use it very much. The only real advantage is having a built-in speaker that’s significantly better than the phone. I sometimes play podcasts with it and just kind of put it on the desk or something, but I generally don’t use it much.
Jonathan: So I will name drop someone. I’ve been very interested in reading on Mastodon a woman named Wenwei Fisher who is blind, and uses her iPad extensively in really interesting ways. I mean, she’s quite inspired. (I shouldn’t use that word in the context of any blind person.)
But I do find it interesting that I guess the way that her mind works, she’s got that spatial awareness and really prefers knowing exactly where everything’s laid out on the screen. She’s making extensive use of iPad for a lot of content creation, and that’s interesting.
For me, I sold my iPad when Apple decided to take the headphone jack away and moved us on to MFI hearing aids, that kind of thing. Because for me, the handoff between iPhone and iPad for made for iPhone hearing aids never worked properly. And basically, the first device to grab it was the one that kept it. And that made it really difficult to use the iPad. So I no longer have one.
Alright. Let’s talk macOS Sonoma. Sonoma. I looked this up on the web. So it’s S-O-N-O-M-A, right?
Mike: Oh, okay.
Judy: Right. Sonoma, like Sonoma County.
Jonathan: Right. You’ve been there, Judy?
Jonathan: Apparently, the wine’s good there.
Judy: That’s right. [laughs]
Jonathan: And again, some of the things that we’ve talked about with iOS are in macOS Sonoma.
Widgets. So one thing that’s absolutely clear: Apple’s done their research, and they’re all in on Widgets. Widgets are everywhere. And Widgets are now coming to the Mac. We’ll talk about the watch in a little bit. I guess it will work in a very similar way, Heidi, to the way it works on the iPhone?
Heidi: Yeah. So there were Widgets on the Mac, but they were hidden away in a little sidebar thing.
Jonathan: It’s true. Yeah, there were.
Heidi: And now they can be on the desktop, and they’re interactive. So you can check something off a checklist, or things like that.
Jonathan: But am I right that Mac Widgets were different?
Mac actually has had Widgets for a while, now that you mention it, but they weren’t the same as the iOS Widgets. And now, it seems like they’re more like the iOS Widgets. Is that right?
Heidi: I think they’ve slowly been becoming more and more iOS-like. And now, they’re very iOS-like.
Jonathan: I love the little signals that Apple sends sometimes. And the one that intrigued me that they sent today is that they specifically called out Apollo for Reddit. And that is a very political statement to make right now because Reddit is under fire for making it virtually impossible for app developers to continue with Reddit because they’re charging a bomb coming next month for access to Reddit’s API. And Apollo, which is a much-loved app, is under threat.
You know, Apple never does these things by accident. For them to specifically mention Apollo for Reddit today, they will have done that deliberately. So I found that very, very cool because once the third-party apps disappear from Reddit, blind people are essentially locked out of Reddit because the default properties that Reddit produce are not accessible.
So it’s an existential threat to the access that blind people have to Reddit. So good on Apple for doing that.
What else have we got? I’m just going through my notes here.
Oh, so there’s also this beautiful sort of seamlessness. You can see Mac widgets on your iPhone, and iPhone widgets on your Mac. So that’s cool.
When you share content in a video conference, there’s some sort of improved experience that looks highly visual. Did you capture this, Heidi, in terms of what that’s all about?
Heidi: I managed to capture some of it. So essentially, it’s just a new way of displaying the content.
So say, you’re sharing your screen or your presentation. And right now, what happens is it takes up the whole screen, and maybe there’s like a little picture of you in the video in a box somewhere.
And so what Apple’s done is tried to make it all lovely and beautiful. So they’ve got like a little circle of just you that you can move around that can be placed anywhere on the screen to not block the presentation. But it’s inside the same screen. So the presentation can be full screen.
And then, the other version they have is where it’s almost playing on that sort of AR vibe where it looks like it’s in the room with you. So like you’re sitting in front of the presentation, and the screen is behind you.
Jonathan: Okay. That sounds pretty impressive for those who need that sort of thing.
Jonathan: Now, I’ve been waiting to say this all day because I thought of it when I was watching the keynote, and I thought, oh man, that’s clever.
So in terms of the Safari team, we can honestly say Safari’s busy.
This is a complaint that blind Mac users have all the time. You run Safari, and it says Safari busy all the time.
Jonathan: So anyway, they’ve been busy with Safari. Private browsing locks the window when you’re not using it. That’s a great security feature.
You can securely share passwords and pass keys with people you’re closest with. Have any of you dabbled in pass keys at this point?
Jonathan: Yeah. Something I mean to do.
You can now set up multiple profiles. That’s a really nice feature for those people who use their device for work and for home. So if you switch, for example, to your work profile, your cookies, the extensions that you use, all those sorts of things are specific to that profile. So if you have a work G Suite account, for example, and a home Gmail account, then it’s really handy to be able to switch from one to the other just by changing profiles. That’s great.
Web apps on Mac. You can make a website appear in the dock, and you can create a web app for any website that you want. You can also get notifications as well.
So unless I’m misunderstanding it, this is a little bit of catch up, I think. I think Chrome and Edge have had these sorts of things for a while. Is that right, Heidi?
Heidi: I don’t know if they do. I don’t use them. [laughs]
Jonathan: Alright, then. What are you using?
Heidi: Well, I use Chrome, but I don’t use web apps.
Jonathan: Right, right, right. Okay, yeah.
And you’ve been able to save a website to your home screen for quite a long time, I think since the beginning of the iPhone, to the best of my knowledge.
Judy: That’s right, you can.
Jonathan: Yeah, yeah. Not many people know that.
Judy: It’s in the share sheet.
Jonathan: Yes, yeah. And it’s really handy.
Judy: It is very handy.
Jonathan: Yeah. I have a folder of websites that I visit regularly, and I just put them in there. Is that in your book, Mike?
Mike: I believe I had it in there. [laughs] Certainly, I can include it in the next edition.
Jonathan: Oh good, good, good.
Mike: I’m sure I’ll be dreading starting, but yeah. [laughs]
Jonathan: So here’s that word experiences again, but you’ve got to say it with a California accent. Experiences. Oh my word!
Jonathan: So now, we’re talking about home and audio experiences. [California accent]
And some very cool things happening to AirPods. You know, when I listened to this thing about AirPods, I thought, they are so close to making a hearing aid and totally disrupting the hearing aid industry.
Judy: That’s just cool!
Jonathan: I wonder if they ever will, because that could be very exciting in terms of price and functionality.
Jonathan: So there’s a thing called Adaptive Audio. Dynamic? Yeah, yeah. It dynamically blends transparency and active noise cancellation.
You’ve got to be careful as a blind person using this, right? Because it sounds like it does some quite aggressive filtering of traffic noise.
Judy: I mean, how does it know what you want to hear and what you don’t want to hear?
Judy: I mean it sounds pretty exciting, but I just don’t know that I can trust it.
Jonathan: Yeah. Well, in the sense that obviously, blind people are using their ears for different things.
Judy: And we want to hear different things.
Judy: What a sighted person might think is annoying, we might really need to know about.
Jonathan: Yes. And this is the challenge for blind hearing aid wearers, of course, that a lot of these programs like the automatic programs are optimized for making unpleasant traffic noise go away. But of course, that unpleasant traffic noise is an absolutely essential cue for a blind person.
Jonathan: So you know, proceed with caution, but that will be interesting.
I like the idea of when somebody starts speaking, you have this conversational awareness thing where it will lower the music that you’re listening to, and also lower background noise and focus on the person speaking in front of you.
Judy: I like that, too. But I wonder if you can pick and choose which aspects of this whole thing you want to use and which ones you don’t.
Mike: And which AirPods? Like I can use 3rd-gen AirPods along with my hearing aids. It’s a tight fit, but I can do it.
I can’t do that with AirPods Pros, like the ones that go right into your ear canal. I would have to forego my hearing aids.
So either Apple takes a full leap and makes AirPods full hearing aids, or I might be just locked out of some of those features if they’re limited to the more advanced, the Pros.
Heidi: And I think, this is a Pro thing because I don’t think there’s a noise cancellation on the non-Pro ones.
Jonathan: Yeah, that’s so close. They’re tantalizingly close to doing it.
Judy: Did it say which AirPods this was for? I don’t think they said.
Heidi: They demoed it with AirPods Pro.
Jonathan: Right. How much do they cost?
Mike: Like, $200 and something?
Jonathan: They’re incredibly popular AirPods, aren’t they?
Judy: They are.
Judy: And they are really nice.
Jonathan: There’s improving automatic switching across the ecosystem, so that’s nice. Wouldn’t it be nice if all of that improvement trickled down to made for iPhone hearing aids? [laughs]
Jonathan: Maybe I could get an iPad back. So that’s AirPods.
Now, onto AirPlay. There’s this on-device intelligence thing which memorizes your AirPlay behavior. So if you are cooking dinner and you like to have something going to your HomePod, or your Apple TV, your Sonos, or whatever, then it will suggest that, I guess.
Judy: So is it going to give you a notification, or is it just your list of AirPlay devices reorders itself depending on the time of day?
Heidi: The example popped up like a little notification at the top of the screen.
Jonathan: Right, like a Siri suggestion.
Jonathan: Sometimes I get those, you know, inviting me to do some radical thing.
Judy: I turn them off.
Jonathan: Yes. Yeah. [laughs]
AirPlay in hotels. This is nice.
Judy: I love this! I love this! This is great.
Judy: But where is this QR? They said all you had to do is scan the QR code. Where’s the QR code going to be? Is it on the television?
Jonathan: Yeah, I think so.
Heidi: It appears on the screen of the television.
Jonathan: Ah. Because you know, you walk into a hotel room when your room’s been reset and normally, there’s some default thing on the screen that welcomes you, and that kind of thing.
Jonathan: So the QR code is going to be on that screen.
Judy: I hope it stays in the same place.
Jonathan: I read this really cool book once called Get the Picture.
Judy: Ah, yes.
Jonathan: And what I learned from that was if I just step back far enough away from the screen, I’ll get a full view of the screen and it should capture the QR code.
Judy: And it probably will. I’ve scanned QR codes from being in the audience in PowerPoint presentations, you know, 20 feet away and it’s been absolutely amazing.
Jonathan: Yeah. It’s cool, isn’t it?
Judy: I love QR codes. I get such a kick out of scanning them.
Jonathan: Yeah. And in fact, that’s how you set up this new, what’s it called, the Phone Link app. They put a QR code on the screen, and you just snap the QR code, and it sets it all up for you. It does the pairing. It does the whole thing.
Judy: I thought I had to wait for the newest version of Windows 11.
Jonathan: I believe it’s been rolled out to all the Windows 11.
Judy: I don’t think it’s come to my computer yet.
Jonathan: What? What? How can they do that?
Judy: How can they do that?
Jonathan: It’s come to mine.
Now, what else can I tell you? Oh, so you can beam stuff to the big screen in your room with this. So a lot of the entertainment systems in hotels are not accessible, so this is very nice.
Heidi: I was going to say, it also automatically connects you to the Wi-Fi.
Jonathan: That’s right. So that’s also good.
Judy: Oh, that’s cool.
Judy: I didn’t know that. That’s neat.
Jonathan: Yes. Because that can be a bit of a hassle when you’re trying to find in the room that little piece of paper that tells you what the passphrase is, or something like that. So that’s good.
Jonathan: Apple Music. SharePlay is coming to the car. Okay. This is going to be a recipe for family friction.
If you’re in a car with say, two parents or one parent and a bunch of kids, and they’ve all got access to what gets played on the system in the car, oh, fun and games. So you can use SharePlay, and everybody can have input into the music being played.
Right. Nicola will want Taylor Swift. Heidi will want the Abba, maybe. Nicola likes Abba, too.
And on Apple TV, there’s a redesigned tvOS, it looks like. Did you get much of a visual look at this redesign, Heidi?
Heidi: I don’t know how different it is because I haven’t looked at an Apple TV in a while.
Heidi: I can do some comparisons, actually, but it’ll take me a minute. But it looks pretty. I don’t know.
Judy: What I was intrigued about in this Apple TV section was the talk about how apps like Zoom are now going to be on the Apple TV. Well, my Apple TV doesn’t have a microphone.
Judy: So either I’m going to have to pair it with headphones, or I use my phone also, which kind of somewhat defeats the purpose. I think the whole purpose of Zoom being on the Apple TV is so people can watch their Zoom meeting on a big screen.
Jonathan: Yes. So no Apple TV has a mic. And the idea is that you will use your phone or your iPad as what they call a continuity camera. And you use the camera of your device and the microphone. And that’s how you’ll do your conferencing.
Judy: So there’s no advantage for us?
Jonathan: No, no. I suppose one thing is, I was thinking about this with a newly minted granddaughter. And I was thinking that if Bonnie and I kind of want to FaceTime with her and the family as they get older, and it’s kind of on the big screen and on the Sonos, of course, (because the Sonos is connected to the Apple TV) so it will probably sound quite realistic. But most of it’s visual, isn’t it?
Judy: I would have thought.
Jonathan: Yeah, yeah. I mean, because you could just AirPlay to those speakers, to be fair.
Heidi: Okay. So the redesigned bit just seems to be that there’s easier access to some of the little controls. So to like, the headphones that are synced, or the home controls, like setting scenes and stuff, and changing the do not disturb settings, and switching profiles.
Judy: I wonder what the interface will be like. Because right now, all we have access to… I use an Apple TV a lot. But on the home screen of the Apple TV, you just have the apps.
Heidi: Yes. So they’ve added this to the home screen. Like at the top of the home screen above the apps are some quick options to use, rather than diving into settings.
Judy: Ah. Well, that’s nice.
Jonathan: As an avid Apple TV user, Judy, do you think that some of the user experience has kind of regressed a little bit over time? It doesn’t seem as intuitive to use as it used to be. And some of the apps, they’ve gone self-voicing, and it’s all gone a bit strange.
Judy: Some of the apps are self-voicing, and I wish they wouldn’t do that. I really just prefer to use the Apple TV voice for everything. But I mean, still, I really like my Apple TV. I have it paired to two large homepods, and it just sounds great. It’s just such a little compact thing that does so much. It’s amazing!
Jonathan: [laughs] Yes.
Let’s talk watchOS 10.
And here we go. Widgets. Yay! Widgets are also coming to the Apple Watch. Yes. Is it very similar to the way that it works on iOS, Heidi?
Heidi: I don’t know. So the way they showed it is you have your watch face, and then you scroll the digital crown, and you get like a whole bunch of little baby widgets that you can then tap on to open up the related apps, or you can just glance at the information on it.
Jonathan: What’s the difference between this and Apple Watch complications?
Heidi: These are not always showing, like you have to scroll up to see.
Judy: Or the dock. Well, the dock’s just where you can put a few more apps that you want to get to quickly, which sounds like what all this is, except you use the crown to get there instead of the side button.
Jonathan: Right. Yeah.
Heidi: When you use the dock, does it tell you information, or just tell you what the apps are?
Judy: No. It just tells you the name of the apps.
Heidi: So this one would also give you information like it says how long is remaining on your timer, for example, or that you have an appointment from 12 to 1 on your calendar, that sort of thing.
Heidi: Whereas a complication is permanently shown.
Judy: Yeah. For a VoiceOver user, the fact that they’re small isn’t going to be relevant.
Mike: Well, and then there’s whether you want the crown navigation on or off. Maybe that’s going to be changing a bit. But yeah, like I can sort of see, you know, turning the crown to get at different bits of information, and presuming you can rearrange them to your liking, and things like that. I guess with the mindfulness app, it sounds like they’re really going into the mood tracking kind of stuff, which apparently is quite helpful for people as well.
Jonathan: Yeah. So we’ll talk about mindfulness in a sec. Just going through in the order that they covered it.
Cycling workouts now show up as a live activity on the iPhone. And I realize why that would be a very cool thing visually because you could just have your iPhone propped up somewhere, particularly on a stationary bike, and glance at it. So that makes sense.
One thing that really caught my attention that I thought was super handy. And this is that when you’re doing a hiking workout, when you’re out and about, a cellular connection waypoint gets set and that marks the last place that you had cell service. That’s brilliant.
Judy: Ah, yes.
Jonathan: Yup. And then another one, you can get an
Judy: Emergency waypoint.
Jonathan: Yes, which will show the last place that you could get an emergency call out, presumably through satellite. We’ve now got this here. We’ve just got the satellite coverage a few weeks ago.
Heidi: I think this was separate to satellite. They were talking about the way emergency calling works: it lets you use any network, not just the network your SIM connects to. So I think it’s the last place it detected any possible network, which you can make emergency calls on.
Judy: That makes sense.
Jonathan: Right. Yeah, that makes a lot more sense. Okay. Right. Now, that’s very handy. Yes. Because when you dial whatever your code is in your country, normally, there’s a sort of a law or an understanding that even if you don’t have an account with the network that it can find, you can still get an emergency call through. Very well thought through.
Now, developers have access to the high frequency motion data, so we’ll be interested in how that works.
Let’s talk about this mental health stuff. This is great to see, because there’s still so much unfortunate stigma associated with mental health things.
If people break their arm, or hurt their leg, or do anything like that, people understand. They go to the doctor. They get the appropriate treatment.
And yet, for mental health, we’re still not there, sadly, as much as we should be. So this is good stuff.
You can log your emotions and your daily mood from the mindfulness app in iOS 10. I’m sorry, watchOS 10. [laughs] And not only how you’re feeling, but what’s making you feel that way.
And the state of mind logging also is available in the Apple Health app. So if you don’t have a watch, you’ll still have this in health as well. Presumably now on the iPad as well.
You can take standardized assessments that are often used in clinics from the health app.
So yeah, brilliant. It’s good to see mindfulness expanding because the meditation, the breathing stuff has been very good.
Mike: And this really directs people to, like it was talking about directing you to articles and further resources if you need more help as well. So this is a nice private bridge that you can sort of approach knowing that your information is safe, and kind of explore those options without having to go to a psychologist or something first. You can see, you know, get an idea of what you need. And then, work from there, which is, I think, a lot more approachable for people than just jumping to going to see this other person who is now going to help you with this stuff.
Jonathan: Yeah, because certain mental health conditions may actually act against talking to anybody at all. And so if you can do that, as you say, in the privacy of your home and investigate these things, it’s a really good thing.
So well done to Apple for doing this.
They’ve got a couple of features here relating to vision health so that hopefully, fewer people will be around to complain about the VoiceOver bugs.
Heidi: Oh I guess that’s their real take on it.
Judy: That’s their motivation, yeah. [laughs]
Jonathan: Not that I’m cynical.
You can measure the amount of time spent in daylight. And I mean, what parent doesn’t want their kid to get more sunshine?
Jonathan: You can monitor your child’s focus on… what am I saying? You can monitor your child’s ability to see a device, so how closely they’re holding the screen to their face.
Judy: Are you going to buy them an Apple Watch to do this? [laughs]
Jonathan: Yeah. [laughs]
Heidi: The Apple Watch is for the daylight.
Judy: The daylight, for the daylight, even if they don’t have an iPhone.
Jonathan: Right. Yeah.
So it’s good to see those health categories expanding.
I must say, I really feel some sense of security wearing the watch with all the things it’s monitoring, you know, and it’ll only get better. Of course, the big one that so many people are waiting for is the diabetes stuff.
Judy: Oh, yes. I really look forward to that.
Jonathan: It will be a big game-changer for many, many blind people.
And that takes us around, essentially, to what was covered in WWDC. It was extraordinarily action-packed.
Heidi: I have one more thing for watchOS.
Jonathan: One more thing?
Heidi: Yeah. They put up all the fun things at the end, and they don’t always mention everything.
One thing they added but didn’t talk about is that you can now take group FaceTime audio calls on the watch.
Jonathan: Oh, good.
Judy: Oh, that is cool.
Jonathan: That is a good little nugget.
Jonathan: No mention yet of pairing made for iPhone hearing aids with the watch. And I think that is a very regrettable oversight, because for many people with hearing impairments who use VoiceOver, even on the Ultra (which, even though I said I wouldn’t get one, I did), the speaker is still tinny, and in loud, noisy environments it can be difficult. So it is unfortunate that Apple is dragging the chain.
I don’t think there’s a technical reason for it because surely, the watch is powerful enough to pair with made for iPhone hearing aids.
Judy: I bought one and gave it to my husband.
Jonathan: Oh, you bought one. Oh, so hang on. So you bought one, and then you used it. Is that right?
Judy: I bought one, and I hated it. I gave it to Doug.
Jonathan: Oh, so it wasn’t some sort of altruistic thing. You didn’t buy it for Doug.
Judy: No, I hated it. I didn’t buy it for him. I bought it for me. [laughs]
Jonathan: Right, right. Why did you hate it?
Judy: It’s just huge.
Jonathan: It is.
Judy: I mean, it was, I just, I felt like I was wearing an iPhone on my wrist. I mean, it was just, no, it was way too big and way too heavy. I did not like it.
Jonathan: But the battery life’s phenomenal.
Judy: Oh, my battery life is just fine. I still have a 6. I haven’t bought an Apple Watch in 3 years.
Jonathan: I mean, but I can go on a trip for the weekend and not even take a charger with me.
Judy: Yeah, I can do that.
Jonathan: Shall we do a round robin and just summarise what is a very busy WWDC?
And I might just start this off by saying it’s so easy to get sucked into the vortex. And there are such capable, talented people at Apple doing really innovative things that actually improve people’s lives.
But I really hope that in iOS 17, we get the long-awaited, desperately needed cleanup of VoiceOver, because it’s getting worse and worse. And it’s tough because at least for me, as a user of a HID Braille display, there’s nowhere else to go. Android is not an option for me yet. And then there’s the fact that I’ve got made for iPhone hearing aids. So there are 2 really significant reasons why, even in a perfect world, it would be quite a big change for me to make.
So I’m reliant on Apple to actually sort out their soup, and it’s taking way too long. And it’s disappointing because I think there are probably 2 departments here. You’ve got the people designing the features and doing all the innovation. And then somehow, when it comes to the QA, the quality assurance of the accessibility features, something’s letting the side down. And I think there is some sort of systemic problem with the reports that we so hopefully send in. You know, blind people take their time to submit such detailed bug reports, and they’re hopeful they’re going to get addressed. And for yonks, nothing of consequence happens. And if anything, things can get worse. And it is demoralizing when there’s so much innovation going on.
But we’re reaching another inflection point that I think hasn’t been seen since about iOS 8 or 9, whenever it was that we got that business where you couldn’t even answer a call. And we’re almost back there. So I’m hopeful that iOS 17 cleans things up. I’m forever the optimist.
Judy: It’s not just VoiceOver that’s suffering right now. I mean, Siri has also sunk to an amazingly low level.
And there are a number of things that are really not going that well. And you’ve got to wonder what’s going on.
Jonathan: Siri’s developed a hearing impairment much like mine.
I mean, so I’ve got the WaterMinder app. I love my WaterMinder app.
And I say to it, log 450 milliliters of water. And sometimes, it will say, OK, log 450 milliliters of water, because I’ve got a Siri shortcut.
But then sometimes, when I say it in exactly the same way, it will say, sorry, none of your devices can do this. Or I found this on the web for, you know.
And I can give it multiple commands in the way I clearly always have. And it simply doesn’t hear. It’s just not acceptable. Rant over.
Heidi, what are your thoughts on the whole keynote as we wrap?
Heidi: I mean, a lot of it was… The exciting stuff was the Apple Vision Pro for me. But there were some pretty good parts in there, too, with all the little things.
Jonathan: Would you like to get an Apple Vision Pro?
Heidi: I would like to try one out. I don’t know if I would use it long-term, but like I’d have to trial it first.
Jonathan: Yeah. So obviously, as the one sighted person on the call here who could fully experience this, do you want it because it’s just cool, it’s very new, it’s on the cutting edge, or can you see it actually benefiting you? You know what I’m saying? A bit of a distinction.
Judy: Or would it just be fun?
Heidi: I think right now, it’s very much a fun factor, but I can see how it could be really functional, how it could like replace how I use my iPad, for example. But I don’t know if it would do it so much better that I could justify the price tag.
Jonathan: Alright. And Judy, what’s your take on it all?
Judy: Yeah, my take is pretty similar to Heidi’s. I think the Vision Pro is just so exciting, and it has so much potential. And I think, it may take several years before we realize the benefits that it could be to us. And I mean, it’s all incremental.
And there were a few little nuggets in the other stuff. Not much, really.
But I’m not a Mac user, so I didn’t get very excited about the Mac parts.
But I do have a watch, an Apple TV and an iPhone, so they were pretty cool.
Jonathan: How might this work, or would it work in the context of rehabilitation programs in the United States? Could there ever be a case where the Vision Pro might be so consequential for some blind people, in particular use cases, that the rehab system might fund one of these things for a blind person?
Judy: Time was when $3,500 devices were part of the norm. I mean, think of the Optacon, think of something like that. And what a valuable device that was.
Jonathan: And Braille displays.
Judy: And Braille displays.
So the price tag may, I mean, if this does cool enough things, the price tag may not be the biggest stumbling factor.
Jonathan: Because there’s always been this reluctance on the part of most agencies that fund these things to go for mainstream devices. I know that there was a feeling for a very long time, we can’t give iPhones to blind people. Because somehow, people make a distinction between a mainstream-manufactured device and something blindness-specific, not really having caught up with the fact that these companies like Apple and Google have become assistive technology companies.
Judy: They have.
Jonathan: And Mike, what’s your thinking on it all? Anything? Do you have to rewrite your book?
Mike: Well, you know, probably not. There’s nothing that screams at me, get on it right now. Like, none of these things are going to really fundamentally change the beginners’ experience.
What really would set my alarm bells off is if something came out that totally changed how VoiceOver behaved as you started using it, to the point where nothing I’d written previously could match.
I didn’t hear anything in that which said that yeah, this is the time. So I think I’ve got another year to coast.
You know, some of the things like the Mac stuff, I mean, especially for power users, you know, the Mac Studio and everything, the massive speed, the power that unleashes for high-end creative work. Like, that’s so far above what the average user would need, it’s beyond belief. So I can’t imagine a use case where a blind person could put that power to use that they couldn’t with a lower-end machine. I could be wrong about that.
The changes to iOS, I am hopeful that the help they’re giving with dictation and things like that will be beneficial in a major way. And maybe there’s more lurking that they didn’t mention in terms of artificial intelligence improvements. This is something I will be paying careful attention to for the rest of this week as they go through all the presentations, trying to ferret it out. You know, in terms of: are there more accessibility things that haven’t been mentioned yet? Are there aspects of these changes that might help us that just aren’t obvious from the very brief glossing over they get in these presentations?
So it’s a case where I think the keynote gives us a peek. But really to get a fuller picture, we’re going to have to pay attention to the next sessions for the rest of the week that are coming to developers, and hope that we can, you know, understand enough of it as lay people and as mere mortals to figure out what else is coming.
And that kind of gives me hope, you know. The fact that there are no massive new ground-breaking things other than this Vision Pro kind of says they might be focusing a lot more on cleaning up the mess, right? On getting rid of the longstanding bugs, on really getting us back to a stage where things work.
They did have that artificial intelligence conference earlier, and we didn’t really hear publicly what came out of that. But, you know, time is certainly ticking for Apple on that score, and I’ll again be paying attention to what comes out of WWDC about that, if anything does.
So certainly, we’ve got indications of some of the big things coming. I’ll be interested in watchOS 10, seeing how much that changes the VoiceOver experience of using the watch with all these navigational changes and such. So that’ll be a big thing for me.
But iOS is looking pretty stable in terms of nothing earth-shattering going to pop out at us in the fall. I do like the journal app. I do like some of the other things that they’re going to do.
So yeah, it was interesting. I wasn’t bored, you know, even through the real tech stuff. [laughs]
Jonathan: That is an interesting point you make. Because when you actually dissect what’s new in iOS 17, there aren’t a lot of really big features at all. There are kind of nice little tweaks that hopefully will be useful. And so you’re right. I mean, I guess there are 2 options. One is that all the software development team have been siphoned off onto Vision Pro. Or the other is that there is some work going on to try and straighten iOS out, and it badly needs straightening out. So we’ll be forever the optimist.
We are, of course, interested in finding out what you think. And if you would like to give us your comments on WWDC, on Vision Pro, on any of these issues, you’re welcome to be in touch. opinion@LivingBlindfully.com is the email address. You can attach an audio clip, or write it down. You can also call the listener line. 864-60-Mosen in the US. We’d be delighted to hear from you and get your perspectives.
And thank you to our most erudite panel for an interesting discussion.
We will also keep a very close check on Apple’s website for further information about accessibility pertaining to Vision Pro.
So thank you all very much for being a part of it once again.
Judy: Thank you!
Mike: It’s a pleasure.
Voiceover: If you’ve enjoyed this episode of Living Blindfully, please tell your friends and give us a 5-star review. That helps a lot.
If you’d like to submit a comment for possible inclusion in future episodes, be in touch via email. Write it down, or send an audio attachment: email@example.com. Or phone us. The number in the United States is 864-60-Mosen. That’s 864-606-6736.