Transcripts of Living Blindfully are made possible by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at http://PneumaSolutions.com.

You can read the full transcript below, download the transcript in Microsoft Word format, or download the transcript as an accessible PDF file.

 

Contents

Welcome to Our 250th Episode

Country and Area Code 250

Recording Using the Samson Q2U on the iPhone 15 Pro Max

The Focusrite VoCaster 2 Connected to iPhone 15 Pro Max in the Studio

The Front Cardioid Microphone in the iPhone 15 Pro Max

The Bottom Omnidirectional Mic in the iPhone 15 Pro Max

The Front Omnidirectional Microphone of the iPhone 15 Pro Max

Configuring and Using the Action Button

Greater Control of Battery Health in iPhone 15 Models

VoiceOver Point and Speak and Magnifier Features

Improved Configurability of VoiceOver Voices

More Control Over VoiceOver Verbosity

Braille Improvements

Locate the Centre of the Screen From Your Keyboard

Lock Your Rotor in Place

Siri Loses the Hey in Some Countries

Two New UK Siri Voices

Vary Siri’s Speaking Speed

Personal Voice

Liven Up Your Home Screen With Interactive Widgets

New Ring and Alert Tones

Changes to the Messages App

Manage Your Grocery List in the Reminders App

Notes in Notes Can Now Link to Other Notes

Track Your Mental Health in the Health App

Live Voicemail and FaceTime Voicemail

Improved Predictive Text and Dictation

Improved Sharing With People Close By

Sharing AirTags

Standby Mode

Apple Music Changes

Some Safari Improvements

Closing and Contact Info


[music]

Advertisement: Chris Peltz here with the Blind Grilling Experience – where we talk all things cooking, grilling, and barbecue; tips, tricks, and techniques for the blind and visually impaired, all things accessible technology centered around food.

If you like brisket and breads, you like pizzas and pies, folks, we will leave you hungry and wanting more.

Check out the Blind Grilling Experience on your favorite podcast app, or visit our website at BlindGrilling.com.

Welcome to Our 250th Episode

[music]

Voiceover: From Wellington, New Zealand, to the world, it’s the Living Blindfully podcast – living your best life with blindness or low vision. Here is your host, Jonathan Mosen.

Jonathan: Hello, and a special welcome to this – our 250th episode.

This week: my iPhone 15 Pro Max has arrived. I’ll give you my initial impressions and perform a couple of demos. And a comprehensive look at some of the new iOS 17 features from a blindness perspective.

I am feeling absolutely full of gratitude that we have made it to episode 250. It wouldn’t be possible if we hadn’t built this incredible community of people who responded so well to the Living Blindfully changes earlier this year.

To everybody who contributes to Living Blindfully Plus, whether it be $1 a month (which, at the current exchange rate, is actually now about US$0.59), or very, very generously (and we do have some remarkably generous Living Blindfully Plus contributors), thank you very much. We are in a cost of living crisis, and the fact that you are willing to support the podcast financially with your Plus subscriptions means an awful lot. And it means, in fact, that the podcast has been able to continue because the load that I carry is a little less now.

And in that regard, I want to thank in particular, Derek, who’s been editing our interviews.

And also to Hannah, who has been doing a tremendous job with the transcription. If you have noticed a significant improvement in the quality of the transcription, you have Hannah to thank for that, as do I. I really appreciate all the work that she is putting in.

I also want to thank our advertisers; and we’ve got a wonderful little pool of advertisers who are contributing to the show and helping to keep it afloat as well.

I think current events in our community are so important to cover. This podcast is a very niche podcast. We know what we’re here to do, and that is to discuss a range of perspectives on living life as a blind person.

We do talk a lot of tech. Oh boy! Are we going to talk a lot of tech today.

But we talk other things, too. We’ve been talking about guide dogs recently, we’ve been talking about disclosure when it comes to seeking work, and we’ve talked about many different issues over the years.

And even though it is a bit of effort to put this together [laughs], that’s what keeps me here. I think we’re providing a really important service.

It’s thanks to the contributions of our community that it’s all possible. And by contributions, I don’t just mean financial, although that has made a big difference. I also mean people who contribute content and their range of perspectives to the podcast.

So yes, I’m a bit sentimental because it’s episode 250. I really appreciate everybody’s support of this effort so so much. It’s kind of like a dream, really.

I was saying to Bonnie the other day, “If I could go back to my 10-year-old self who was so obsessed with radio, and telephones and different things like that [laughs], and say in decades from now, you will have your own studio with lots of great equipment (and my 10-year-old self would have thought that was pretty exciting because I was fascinated by all the gear as well), and that you would be able to do a program that would be heard in 113 countries, all from a studio in your house, I would have thought that was just nonsense and science fiction, and pretty unbelievable.”

So we’ve got many struggles in our community, but we’ve also come a heck of a long way in quite a short time.

Thank you very much for allowing me to live this dream, and I hope I’ll continue to be able to provide content that you find useful and worth your while.

Country and Area Code 250

Now, episode 250 – that means we should look at area code 250.

We’re going way up north to Alaska. [laughs] It’s an Alaskan area code. You can see Russia from your house, possibly.

So welcome if you are in Alaska, where it will be starting to get cool, I guess now. Welcome to episode 250, and enjoy your moment in the sun.

And as we look at country codes, we continue our trip around Africa. Rwanda is country code 250. Like much of Africa, they completed a census just this year, in 2023, and there are around 14.1 million people in Rwanda.

Advertisement: We can make transcripts of Living Blindfully available, thanks to the generous sponsorship of Pneuma Solutions. Pneuma Solutions, among other things, are the RIM people.

If you haven’t used Remote Incident Manager yet, you really want to give it a try. It is a fully accessible, screen-reader-agnostic way to either get or provide remote assistance.

It’s not only screen-reader-agnostic. It’s multi-platform, too. You can use it on your Windows or your Mac machine, and people can assist each other regardless of the platform they’re on.

These days, not a day goes by that I’m not using RIM. And one of the ways I use it is to either receive or provide technical support with family members.

I’m kind of the tech support guy in our family, so I quite often get questions from family members that they want me to solve. It’s not realistic to expect them to install a specific screen reader, even the demo. So before RIM came along, I found myself having to try and talk them through what they needed to do.

Now, I can tell them to go to GetRIM.app (that’s G-E-T-R-I-M dot app), install a simple application on their Windows PC, and just by exchanging a code word, I can have a look at what’s going on. I can either run Narrator on their system or, if you’re using NVDA, you don’t even have to do that.

It’s an amazing tool, so do check it out. RIM from Pneuma Solutions, available now for your PC or your Mac at GetRIM.app. That’s G-E-T-R-I-M dot app.

Recording Using the Samson Q2U on the iPhone 15 Pro Max

We’ll begin our look at the iPhone 15 Pro Max on the iPhone 15 Pro Max. And you may find it hard to believe that we’re recording on the iPhone, but we actually are. This recording is being made on my brand new, shiny iPhone 15 Pro Max using Ferrite. And all I had to do was connect a Samson Q2U microphone to the USB-C port. It was that simple. No need for any kind of adapters.

I’ve got a USB-C cable that connects to the other end of the Samson Q2U. I plugged it in. Immediately, VoiceOver started coming through the headphone jack of the microphone. I was able to load Ferrite, press record.

I’m getting monitoring – in my case, in my hearing aids. It could be headphones if you don’t wear hearing aids. That tells me if ever the signal starts to clip.

And this is the hallmark feature of the iPhone 15 range in my view – the USB-C port, and the fact that it is so easy to plug peripherals in.

So I am just standing here in the living room, and talking into the Samson Q2U. We’ll do some more recording experiments in a minute. But first, let me talk about my first impressions while they’re fresh in my mind.

It’s noticeably lighter. This is the iPhone 15 Pro Max in white, and I wasn’t really expecting it to feel as light as it does. It’s a trick. I don’t think it’s that much lighter. But apparently, the ergonomics are slightly different, the way the phone is balanced. And if you have the 14 Pro Max in one hand and the 15 Pro Max in the other, you definitely notice a difference.

The action button, which we’ll come back to a little bit later, is at the top of the phone. It feels really a lot like the old slider switch, except that it presses instead of slides. And we will customize that a little bit later.

Setup was relatively straightforward. It was kind of weird. The first thing I plugged into the iPhone was the cable that went from my PC, where I have my iTunes encrypted backup, to the phone. And of course, that had to be a new cable – not Lightning anymore, but USB-C.

And I still restore from an encrypted backup on my computer because I find it does require me to enter fewer passwords. That doesn’t mean I get away without entering any at all, but it is significantly better than any other option that I’ve tried.

In terms of the USB port, it really does some pretty cool stuff. For example, if you connect an iPhone 15 with its USB-C port to an older iPhone, the iPhone 15 will supply charge to the iPhone that has the Lightning port. So if you’re out and about and you see somebody is running short on battery life, you can donate a bit of juice to it. Bonnie made the comment that it’s a bit like the old jumper leads, you know – [laughs] when somebody needs some sort of assistance to get their car going. So you can charge someone else’s phone for them that way.

If you have 2 iPhone 15s, or an iPhone 15 and an Android device that supports this particular standard that’s been developed, what happens is that you connect the 2 devices together, information is exchanged about the battery level of both devices, and the one with more charge will charge the battery that has the least charge at the time that the connection is made. So that’s pretty cool.

While it is cool, bear in mind that you’re only getting about 4 and a half watts maximum of charging, so it’s not a super fast way to charge by any means. It will take some time, but it’s better than running out of juice completely.
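
If you like to put a rough number on that, here’s a quick back-of-the-envelope sketch. The 4.5 watt figure is the one mentioned above; the battery capacity is just an assumed round number for illustration, not an Apple specification, and charging losses are ignored.

```swift
import Foundation

// Back-of-the-envelope only: ~4.5 W sharing power (from above), an assumed
// ~13 Wh battery in the phone being topped up, and no allowance for losses.
let sharePowerWatts = 4.5
let assumedBatteryWattHours = 13.0
let fractionToAdd = 0.30 // say we want to add 30% charge

let hoursNeeded = (assumedBatteryWattHours * fractionToAdd) / sharePowerWatts
print(String(format: "Roughly %.1f hours for that top-up.", hoursNeeded)) // about 0.9 hours
```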

You can also use this method to charge some smaller accessories, including Apple Watches. So if your Apple Watch goes flat, you can use its charging puck with a USB-C cable. And the same goes for AirPods with the USB-C case.

So I’m now going to go down to the studio, and the next thing I’m going to try is to connect an audio interface to the iPhone and see if we can make that work.

The Focusrite VoCaster 2 Connected to iPhone 15 Pro Max in the Studio

Hello from the studio.

I’m using the same microphone that I would normally be using if I were recording in Reaper – the Heil PR40, and I’ve got it plugged into the Focusrite VoCaster 2 which is now plugged into the iPhone, and it’s getting its power from the iPhone.

I’m using Ferrite and it is seeing 8 Focusrite VoCaster 2 channels at the moment. You can toggle each on and off. So that’s pretty impressive.

The downside is that there is not a huge amount of gain at all. With the Q2U which is just a simple USB mic, there was heaps of gain, to the point that I could take the maximum level down from 100% in Ferrite to stop the clipping. In this case, everything’s maxed out, and we don’t have a lot of gain.

That could be because the Heil PR40 is a dynamic mic. It doesn’t have a lot of power. It also could be because there are ways of getting improved performance that I’m still going to have to play with later.

The key thing is, though, that this audio interface is connected to the iPhone. The VoCaster 2 itself is quite portable. So if you can get this working correctly, you have quite an accessible, portable little setup. And it’s very powerful as well.

The USB-C thing is just such a significant development. I mean, anytime you look like going flat, you can take anything from a laptop charger to just the simple braided cable that comes with the iPhone, which is actually the cable I’m using at the moment, and get some charge. Or plug in a peripheral.

So there’ll be plenty more playing, I’m sure, with different audio accessories and options like this. But so far, it’s extremely promising. This is just your basic USB-C port; Apple doesn’t seem to have done anything proprietary with it. And that gives you a lot of flexibility.

The Front Cardioid Microphone in the iPhone 15 Pro Max

Now most of the time, you won’t want a fancy schmancy peripheral connected. You just want to record on the microphones that your iPhone has. And yes, there are microphones (plural). So let’s have a listen to some of them.

Currently, this is the front microphone. It’s a cardioid mic.

And while I’m recording on it, I will make the point that I’ve been doing further experiments, and I had the Samson Q2U connected when I made a phone call. That worked beautifully. I could hear the phone call through the microphone’s headphone jack, and the person I was calling could hear me from the microphone. So the audio was great.

Not so much luck with FaceTime audio, though. When I made a FaceTime audio call, everything sounded very odd and distorted to me, and people reported I sounded like I was running on slow speed. In other words, there’s some sort of sampling incompatibility there.

That may vary from accessory to accessory or app to app, and I will have to experiment more with apps like Teams and Zoom and see what’s possible.

Let’s go through some of the other microphones now.

The Bottom Omnidirectional Mic in the iPhone 15 Pro Max

Now, I’m speaking through the bottom omnidirectional microphone in the iPhone 15 Pro Max. So we’ll just talk a little bit longer, and you can get a feel for how this microphone sounds. And then, we’ll switch to another microphone.

The Front Omnidirectional Microphone of the iPhone 15 Pro Max

Now, we’re recording on the front omnidirectional microphone of the 15 Pro Max.

Of course most of the time, you don’t have to worry about all of this sort of stuff. The appropriate microphone is selected for whatever you are trying to do. But the audio geeks among us find this a fascinating endeavour.

Configuring and Using the Action Button

Jonathan: Now, I’m back in the studio on the PC, recording the normal way.

I’ve got my iPhone connected to the mixer, as usual. Well, not quite as usual because of course, I’ve had to get a USB-C to 3.5mm adapter (since Lightning is a thing of the past for me). But it is all connected, and this is the first time that I have used the adapter to record any kind of demo.

Let’s take a look at some of the notable new features in iPhone 15.

We’ll go to the Control Centre first.

VoiceOver: Control center.

Jonathan: And it looks very much like it always does on any other phone except that when we get past the lock rotation button, we’ve got this.

VoiceOver: Mute switch button, off.

Jonathan: And that’s because of the action button, which is assignable. And we’ll come to that in just a moment.

So if I double tap, …

VoiceOver: Silent mode.

Jonathan: And I can feel the normal haptic vibration that would have occurred, had I flicked the slider switch in previous versions of the iPhone to silent.

I’ll double tap.

VoiceOver: On.

Jonathan: That would appear to be a bug, because actually, if I go back and forward, …

VoiceOver: Mute. Switch button, off.

Jonathan: It’s off. The mute is now off, and the ringer is no longer silent.

Next to the Control Centre icon, in the Settings app, we now have this.

VoiceOver: Action button, button.

Jonathan: This will only appear, obviously if you have an iPhone 15.

I’ll double tap.

VoiceOver: Selected.

Jonathan: And now, I’ll go to the top of the screen.

VoiceOver: Settings. Back button.

Jonathan: We’ll flick right.

VoiceOver: Silent mode. Adjustable.

Jonathan: There’s a slider control which determines what the action button will do. If I flick right, we’ll get an explanation.

VoiceOver: Silent mode. Switch between silent and ring for calls and alerts.

Jonathan: I’ll flick left now.

VoiceOver: Silent mode. Adjustable.

Jonathan: And I’ll flick up.

VoiceOver: Focus.

Jonathan: Now, let’s see what focuses we can assign. I’ll flick right.

VoiceOver: Focus. Turn focus on to silence notifications and filter out distractions.

Jonathan: How do we determine what focus is turned on when we press the action button? That’s done by the next control.

VoiceOver: Do not disturb button.

Jonathan: I’ll double tap.

VoiceOver: Minimal notifications button. Personal button. DND with VIP button.

Jonathan: What I’m really pleased about (because this wasn’t clear to me when I read some of the reviews) is that if you’ve set up custom focuses like I have (and I’ve spent quite a bit of time configuring them), you can assign one of them to this action button.

I have one called DND with VIP, which I use quite a bit when I’m recording. It means that Bonnie and the kids can text me or contact me by phone, but no one else can. I also get the video doorbell notifications and other important things.

Other than that, my notifications are right off because I’m busy recording and I don’t want to be distracted. So it’s cool that I can assign the action button to any of these focuses.

I’ll go back now, and let’s have a look at what else is here.

VoiceOver: Focus. Adjustable. Camera.

Jonathan: If you assign the camera to the action button, the first long press of it will launch the Camera app. And then, the second press will actually take the photo.

VoiceOver: Torch.

Jonathan: And you can turn the torch on. In some markets, this is called flashlight.

VoiceOver: Voice memo.

Jonathan: This is a handy blindness-related feature in my view, because many people do like to take a quick voice note and act on it later.

Or you may just want a way to record something in case anything bad happens. I have Just Press Record on my watch face for this very purpose so that if we get a guide dog refusal from an Uber driver, I can start recording and it’s pretty unobtrusive.

Well, you can do all of this with voice memos as well, if you want to, and assign it to the action button.

VoiceOver: Magnifier.

Jonathan: As we’ll be discussing very soon, the magnifier has all sorts of VoiceOver-specific capabilities. And certainly, if you have a Pro version of the iPhone 15, you’re getting more and more things, thanks to LiDAR.

VoiceOver: Shortcut.

Jonathan: You can assign a shortcut to the action button. Now, the world’s your oyster here because there are so many things that you can do when you create a shortcut.

Because I like being a rebel, one of the first things I thought when I saw that you can assign a shortcut to an action button is, you know what I’m thinking of doing? I might assign the Google Assistant to that button because you can launch it with a shortcut. And then if you want to get something from Google Assistant, you’ve got it right there on your phone. It’s just as convenient to launch as Siri is.

VoiceOver: Accessibility.

Jonathan: Now, let’s see what we can do here. I will flick right.

VoiceOver: Accessibility. Quickly use an accessibility feature. Choose a feature button.

Jonathan: I’ll double tap.

VoiceOver: Double tap to dismiss pop-up window button.

Jonathan: I’m going to perform a read all, and we will hear all of the accessibility features that you can assign to the action button.

VoiceOver: Double tap to dismiss pop-up window button.

Accessibility heading. Heading. Image.

Vision.

Classic invert. Classic invert reverses the colours of the display button.

Colour filters. Colour filters can be used to differentiate colours by users who are colour blind, and to aid users who have difficulty reading text on the display button.

Detection mode. Start detection mode on the magnifier app button.

Increase contrast. Increase colour contrast between app foreground and background colours button.

Reduce motion. Reduce the motion of the user interface, including the parallax effect of icons button.

Reduce transparency. Improve contrast by reducing transparency and blurs on some backgrounds to increase legibility button.

Reduce white point. Reduce the intensity of bright colours button.

Smart invert. Smart invert reverses the colours of the display, except for images, media, and some apps that use dark colour styles button.

VoiceOver. VoiceOver speaks items on the screen button.

Zoom. Zoom magnifies the entire screen button.

Motor.

Switch control. Switch control allows you to use your iPhone by sequentially highlighting items on the screen that can be activated through an adaptive accessory button.

Voice control. Voice control allows you to use your voice to control your iPhone, even when you are on calls button.

Full keyboard access. Use an external keyboard to control your iPhone button.

Assistive touch. Assistive touch allows you to use your iPhone if you have difficulty touching the screen, or if you require an adaptive accessory button.

Apple Watch mirroring. Share and control your Apple Watch screen from a paired iPhone button.

Control nearby devices. Control nearby devices allows you to use another device on the same iCloud account from this iPhone button.

Hearing.

Background sounds. Plays background sounds to mask unwanted environmental noise. These sounds can minimise distractions and help you to focus, be calm, or rest button.

Left/right balance. Adjust the audio volume balance between left and right channels button.

Live captions. Your iPhone will use on-device intelligence to automatically display captions across all apps. Accuracy of live captions may vary, and should not be relied upon in high-risk or emergency situations button.

General.

Guided access. Keep iPhone in a single app and control which features are available button.

Live speech. Type phrases for your iPhone to speak aloud button.

Jonathan: Apple, no doubt, has a very comprehensive suite of accessibility tools. Those are the ones that can be assigned to the action button.

As I was listening to that, there was one that was absent that I’m disappointed about. And that is Live Listen. I was hoping that one option would be to switch that on from the action button whenever Live Listen might be helpful.

So let’s go back.

VoiceOver: Settings. Back button. Accessibility. Adjustable. No action.

Jonathan: Now, you can actually turn the action button off. It’s like the thing in the Hitchhiker’s Guide to the Galaxy where you press a button and a sign lights up saying, “Please do not press this button again.”, or something. [laughs]

But surely, you can find a use for this action button, especially since you can make a shortcut to pretty much make it do what you want.

So I think for now, I will set this up to turn on my DND with VIP focus. I really like this idea because I use this sometimes for important meetings as well, where I just don’t want to be disturbed. So just having that on a handy button would be super.

So I’ll flick right.

VoiceOver: Focus. Turn focus on to silence notifications and filter out distractions. Do not disturb button.

Jonathan: And double tap.

VoiceOver: Minimal notifications button. Personal button. DND with VIP button.

Jonathan: Double tap to select it.

VoiceOver: Settings. Back button.

Jonathan: And it’s all good to go, so we’ll go back out of this.

Now, I’ll go to the home screen from Settings, and I’m going to press the little action button, which is right at the top above the volume up key. I’ll press and hold that.

I don’t get any confirmation at all other than a little bit of haptic feedback. But VoiceOver isn’t saying anything. If I check in control center, …

VoiceOver: Focus button.

Jonathan: Focus is now off. So there’s no focus set.

There are too many focus references at once here, because I was going to say I’m going to keep VoiceOver focus on the focus button and see if the focus changes. [laughs] Hopefully, you get what I mean. I’m going to keep VoiceOver’s cursor on this button.

And now, I’m going to press the button again.

VoiceOver still didn’t speak anything. But I think if I go forward and back, …

VoiceOver: Selected. Focus, DND with VIP button.

Jonathan: So that’s a bit disappointing, because I would like some sort of verbal confirmation. So what I’d like it to say is “DND with VIP on”. And then, “DND with VIP off” when I toggle it.

Let me push it again.

Yeah. Unfortunately, the haptic feedback when you toggle the focus on and off again is exactly the same. So a blind user has no way of telling other than going into control center to verify whether they have just toggled this focus on or off.

I’m sure this is something that could be remedied in a future version of iOS, if Apple has a mind to do that.

I paused the recording. And just to experiment, I’ve set this now to voice memos.

So I’m going to tap and hold that button. You do have to press it in for a bit. And that’s, I guess, to avoid having the button pressed accidentally.

VoiceOver: Action button.

Jonathan: And that looks like we are recording. And we definitely got some feedback there.

VoiceOver: Current position: 4 seconds.

Jonathan: Yeah, that’s fine. It just started to record a voice memo. So all you have to do is hold the button down and start recording.

Can I hold the button down and stop recording?

VoiceOver: Aeroplane mode. Switch button, off.

Jonathan: So that’s very nice. Being able to tap and hold this action button, you immediately start to record. If you press the button again, you’ll drop back to where you were before you started recording. And the recording is saved, and it remains in your voice memos app.

So that is the action button. I quite like this.

I know some people are lamenting the lack of a slide switch. But I mean, if you want to mute your phone still, you can assign the button to that very thing. And it’s always now in control center, similar to the way that iPad has worked.

Greater Control of Battery Health in iPhone 15 Models

Finally, before we move away from iPhone 15-specific things and we move on to iOS 17 things, let’s talk about battery health.

Apple has been increasingly interested in providing users with more data and protection around battery health since the famous BatteryGate stuff of a few years ago. Now, there are a couple of interesting things pertaining to battery health this time around.

Let’s go and explore Settings, Battery, and what is, at least in the iPhone 15, now called Battery Health and Charging. I’ll flick right.

VoiceOver: Battery health and charging, heading.

Phone batteries, like all rechargeable batteries, are consumable components that become less effective as they age. Learn more.

Maximum capacity: 100%.

Jonathan: Well, I’d be pretty gutted if that wasn’t the case, given that I just got the phone today.

VoiceOver: This is a measure of battery capacity compared to when it was new. Lower capacity may result in less usage hours between charges.

Peak performance capability. Built in dynamic software and hardware systems will help counter performance impacts that may be noticed as your iPhone battery chemically ages.

Charging optimization, optimized button.

Jonathan: Now, I can only presume that there’s something new about the battery technology in the iPhone 15, because this is not available anywhere else in iOS 17 on earlier models.

You’ve got some choice now about the way that you can try and optimize your battery performance. I’ll double tap.

VoiceOver: Selected. Optimized battery charging.

Jonathan: This is something that’s been in iPhone for quite a few years now, where it tries to learn your habits. And if you choose to have your iPhone plugged in overnight, it will look at, say, if an alarm is set, or when you tend to wake up. And if your phone reaches 80%, it’ll sort of be like, you know, the old space shuttle launches where they would often go into a hold at T-9 minutes. And then, just before you wake, it will start to charge up the remaining 20% (because it’s not a good idea to keep your phone plugged in and charged at 100% for a long time).
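
If it helps to see that “hold at 80%, then finish just before you wake” idea written down, here is a tiny sketch of the concept. Apple hasn’t published how Optimized Battery Charging actually makes this decision, so the function and its parameters below are purely illustrative.

```swift
import Foundation

// Illustrative only, not Apple's algorithm: stay held at 80% until there is
// just enough time left to finish the last 20% by the predicted wake time.
func shouldResumeCharging(now: Date,
                          predictedWake: Date,
                          hoursToFinishLastTwentyPercent: Double) -> Bool {
    let resumeAt = predictedWake.addingTimeInterval(-hoursToFinishLastTwentyPercent * 3600)
    return now >= resumeAt
}
```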

Now, there’s a second option here which I’m going to enable.

VoiceOver: 80% limit.

Jonathan: I’ll double tap to enable it.

VoiceOver: Selected. 80% limit.

Jonathan: And the phone may well explain this, so we’ll keep going.

VoiceOver: None.

Jonathan: And you can turn any optimization off altogether.

VoiceOver: Your iPhone will only charge to about 80%. Learn more.

Jonathan: And you can read the article if you have an iPhone 15, or you will be getting one in the near future.

But the key thing here is that if you ever do want to give your phone a full 100% boost, (and I know I certainly will when I want to do some extensive international travel and I just need my phone to work), then I can always just come in here and switch it off. Otherwise, I’m quite happy to leave my phone plugged in, knowing that it will never get above 80% when it is. So I think this is a great addition.

There’s also more information that you can glean about your battery if you have an iPhone 15. To find it, we’ll go into general, and then about.

And while we’re in here, I should say that every year, I try to give my phone an inventive name. My last iPhone was called Black Pearl because it was a black iPhone 14 Pro Max, and Pearl was my guide dog a long time ago now. And I used to call her Black Pearl because, as we all know, black pearls are valuable.

So this time, with the iPhone being the 15 and, you know, they call the 15th of March the Ides of March, so I decided to call my phone …

VoiceOver: Name: the Ides of Phone button.

Jonathan: The Ides of Phone. Tremendous!

Let’s navigate by heading.

VoiceOver: Physical SIM, heading. Available SIM heading. Battery, heading.

Jonathan: Now, there’s some interesting stuff here about batteries.

VoiceOver: Manufacture date: August 2023. First use: September 2023. Cycle count: 1.

Jonathan: And it’s that cycle count that’s the interesting data because the more cycles your battery has completed, the more likely you are to start seeing some degradation in the time that your iPhone runs for. And eventually, the iPhone might start to slow down to try and manage performance that way. So you can keep track of this.

The combination of the battery health that’s been there for a while now and this definitive data on how many cycles the battery has had should be very helpful.
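
For anyone wondering what actually counts as a cycle: a charge cycle is completed when you’ve used an amount of energy equal to 100% of the battery’s capacity, even if that usage is spread across several partial charges. Here is a minimal sketch of that bookkeeping, with made-up daily figures.

```swift
// Minimal sketch of cycle counting: partial discharges add up, and every
// cumulative 100% of capacity used counts as one cycle. Figures are made up.
let dailyDischargePercentages = [60.0, 45.0, 80.0, 15.0] // % of capacity used each day

let totalPercentUsed = dailyDischargePercentages.reduce(0, +) // 200
let cyclesCompleted = totalPercentUsed / 100.0                // 2 full cycles
print("Cycles completed: \(cyclesCompleted)")
```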

So far, so good with the iPhone 15 Pro Max. It is very very early days. The moment I got this, I started to work on this demo for you, so I could make it available to our plus subscribers just as quickly as possible.

But now, let’s segue into some general iOS 17 things.

We have, by the way, been segmenting and will continue to segment this episode extensively by chapter. So if you want to skip around, you can use this as a kind of a reference because each major section is denoted by a chapter.

And if you’ve got a podcast app or device that supports chapters, you can whiz around this audio and get what you need.

VoiceOver Point and Speak and Magnifier Features

Jonathan: We’ll look now at the VoiceOver features that are new in iOS 17.

The one that a lot of people were excited about and may have caused people to go out and purchase Pro models is the point and speak feature. Point and speak only works on iPhone Pro models because it requires LiDAR, and you’re only going to get LiDAR on iPhone Pro models.

In recent versions of iOS, we’ve seen features like people detection and door detection come to the iPhone Pro models for VoiceOver users, and that’s all handled in the magnifier app. iOS 17 brings us another LiDAR-based feature – point and speak. And I have to say, from my preliminary testing, it is a real disappointment.

Apple first announced this back in May, when it previewed some new accessibility features coming to iOS 17 as part of its work on Global Accessibility Awareness Day. It highlighted the benefits for blind people of being able to point your finger at something unfamiliar, like an inaccessible touchscreen or even buttons on a device. You may not know what those buttons do, and it will tell you what they do.

In my admittedly fairly limited testing of this, it really has not worked very well at all. I must say, I was somewhat relieved to hear Judy Dixon saying the same thing in episode 248, because Judy’s one of the best totally blind people with an iPhone camera that I know, and she was having similar problems to me.

I would happily have taken you to the kitchen and done a recording of touching the microwave, or going to the laundry to show you the washing machine, but it’s very much hit and miss for me and quite unreliable.

If you want to try this for yourself, though, then you can perform a 4-finger triple tap to invoke the magnifier. Let’s do that now.

VoiceOver: Detection mode. Magnifier. View finder. Image. Detection mode on. Done. Point and speak on. Detection mode settings.

Hand too close. Hand too close. Hold your finger further from the camera. Line flat on the text.

Jonathan: I’ll just pause this and let you know what’s going on. I’ve been using point and speak, and it’s currently the only option that I have on in the magnifier. We’ll show you how to toggle the various options in just a moment.

The phone is lying on the desk, so it’s not getting a view of anything. And it’s saying that the hand’s too close, and all those things because really, we’re not set up to use the magnifier at the moment.

But the first thing to do if you want to try this out is to select the correct mode. And probably to avoid some clutter, deselect the modes that you don’t want.

I’m going to perform a 4-finger tap to get to the top of the screen.

VoiceOver: Detection mode settings button.

Jonathan: And we’ll come back to that in a moment. But I’ll flick right.

VoiceOver: Torch off button. Done button. People detection off button.

Jonathan: Now we have the different modes, and you can select them. You heard iPhone playing a sound there because I have the actions available message turned off in favour of that little sound telling me when there are actions. So I can flick down.

VoiceOver: Expand. Hide controls. Pause detection mode. Activate, default.

Jonathan: If I want to turn people detection mode on, which can be very useful when you’re out travelling, when you’re trying to find a seat on a bus, that kind of thing, then you can double tap to turn this on.

If I flick right, …

VoiceOver: Door detection off button.

Jonathan: Door detection is off at the moment.

VoiceOver: Image descriptions off button.

Jonathan: As you travel around, you can get descriptions of what’s going on around you. We have demonstrated these features in previous editions of this feature where we’ve gone through new versions of iOS.

VoiceOver: Text detection off button. Point and speak on button.

Jonathan: And point and speak is on at the moment.

So if you want to try this, you might want to disable some of the other modes so you don’t get a lot of verbiage, and turn point and speak on.

Let me know how you get on with this. I’d be curious to know what testing Apple did with blind people in real-world conditions on this because I’m just not able to make it do anything of particular value. And if anything, the VizLens app seems to be a bit more useful. You do have considerable configurability, though.

So we’ll go to the top of the screen again.

VoiceOver: Detection mode settings button.

Jonathan: And double tap that button.

I’ll flick right.

VoiceOver: Detection modes, heading. Done button. People detection button. Door detection button. Image descriptions button. Text detection button. Point and speak button.

Jonathan: There are options in here for each of these modes. We’re interested in point and speak, which is new in iOS 17. So I’ll double tap.

VoiceOver: Magnifier. Detection feedback, heading.

Jonathan: And we’ll flick right and have a look at the detection feedback options.

VoiceOver: Sound switch button, on. Labels switch button, off. Speech switch button, on. Haptics switch button, off.

Point and speak will detect the position of your finger and describe what is being pointed at by the finger in the camera view. Point and speak should not be used for navigation, or in circumstances where you could be harmed or injured.

Jonathan: There’s more on the screen.

VoiceOver: Location, heading. Selected. Under. Above.

Jonathan: I haven’t tried changing the setting. So it might be that if you change this so that point and speak is telling you what’s above your finger, it may be a bit easier to try and get some useful information from touchscreens where obviously, if you’ve got something that’s switched on and you touch, then you’re going to be activating something and you may not be sure what it is you’ve activated.

VoiceOver: Point and speak will detect text that is near your finger.

Automatic torch switch button, on. Magnifier will identify when the environment is too dark to detect text, and will turn the torch on for 10 seconds to illuminate the surroundings.

Jonathan: I imagine in American English, that’ll be saying flashlight and not torch, which is what we say here.

VoiceOver: Border switch button, on.

Color: vibrant yellow 82 button. Select a color to outline detected text.

Jonathan: That’s what’s on this screen. If I go to the previous screen by performing a 2-finger scrub, …

VoiceOver: Point and speak button.

Jonathan: We’ll go to the top.

VoiceOver: More info button.

Jonathan: And there’s a more info button. I’m going to double tap that, and you’ll hear that it gives us a description of what some of these features do.

VoiceOver: Detection mode, heading.

Jonathan: I’m going to perform a 2-finger flick down for a read all now.

VoiceOver: Detection mode, heading. Point and speak. To use point and speak, point to a specific section of text, or move your pointer finger around in the camera view. Your iPhone will detect the text near your finger, and read this text to you.

Text detection. Use your iPhone’s camera to hear and read the text detected nearby.

People detection. Your iPhone can let you know when it detects people nearby with sound, speech, and haptic feedback. Feedback will be more frequent when the person is closer to you.

Door detection. Your iPhone can let you know when it detects doors nearby with sound, speech, and haptic feedback. Feedback will be more frequent when the door is closer to you.

Image descriptions. Image descriptions provide a live description of the scenes detected in the camera view.

All detection modes. Double tap the screen with 2 fingers to pause or resume detection. Continue button.

Jonathan: That’s what’s on that screen. If I perform a double tap on the continue button, we’ll go back to the previous screen.

So that’s point and speak in iOS 17. A little bit underwhelming in my early tests. But hopefully, we can crowdsource the best way to use this. And if you’re having some great luck with it, we’d all be interested, I’m sure, in hearing about that.

Another new thing you’ll notice for VoiceOver users is that when you go into VoiceOver settings, you do get tips. Currently, this is the tip I’m getting.

VoiceOver: tip. Detection mode. Use magnifier to speak text under your finger, describe scenes, and detect people, doors, and text using your iPhone’s camera. 4-finger triple tap to start detection mode.

Jonathan: Now you heard that sound, indicating that actions are available. I’ll flick down.

VoiceOver: Use detection mode in magnifier.

Jonathan: If I double tap at this point, …

VoiceOver: Magnifier. Viewfinder. Detection.

Jonathan: It’s actually context-aware, so it’s taken me back into the magnifier app. So I’m going to close the magnifier. I’ll go to the app switcher by performing a long flick up from the bottom of the screen.

VoiceOver: App switcher. Magnifier. Close magnifier.

Jonathan: And double tap.

VoiceOver: Settings. App switcher. settings. Activate, default.

Jonathan: We’ll go back into settings.

VoiceOver: Settings. Tip. Detection mode.

Jonathan: Now, I can flick down.

VoiceOver: Use detection mode in magnifier. Hide tip.

Jonathan: And hide this tip.

VoiceOver: VoiceOver switch button, on.

Jonathan: And at the moment, there are no other tips that are displaying. So perhaps, that’s the only one I’m ever going to get, or maybe other tips will come up periodically.

Improved Configurability of VoiceOver Voices

Jonathan: One feature that users who like their speech to sound exactly the way they want will appreciate is that many iOS 17 voices are now much more configurable in terms of how they sound.

For this demonstration, I’m using the Vocalizer Karen voice. So let’s take a look at how she can be customized first.

Not too surprisingly, to do this, you will double tap the speech settings under VoiceOver settings. And then, you can do this for either your primary voice, or for any English voice that you have on your rotor.

I’ve highlighted Karen now, so I’ll double tap.

VoiceOver: English Australia, heading. Selected. Karen, premium.

Jonathan: Now this is actually quite buried, and some people have been quite surprised when I’ve pointed out to them that you can do this.

So you’ve now got to double tap again.

VoiceOver: English Australia, heading.

Jonathan: I’ll flick right.

VoiceOver: Karen. Download Karen enhanced, 157.3 megabytes button.

Selected. Karen, premium. Using 213.5 megabytes.

Jonathan: Now, that’s the key message, the actions are available. So I’ll flick down.

VoiceOver: Speak sample.

Jonathan: We don’t need to do that right now because we’re hearing Karen speak as we do this demo. But if I flick down one more time, …

VoiceOver: Open the voice settings.

Jonathan: We can open the voice settings for Karen and indeed, for most other voices. I’ll show you a couple of other examples of this.

I’ll double tap.

VoiceOver: Preview settings button.

Jonathan: This has actually set focus to the last control on the screen, which is to preview settings. That’s handy if you’ve made a bunch of changes, and you want to hear the cumulative effect. You can double tap and do so.

But I’m going to go to the top of the screen now by performing a 4-finger single tap on the top half of the screen.

VoiceOver: Karen. Back button.

Jonathan: And I’ll flick right.

VoiceOver: Voice settings, heading. Timbre: 100. Decrement 100 button.

Jonathan: I’m a big fan of the flexibility. Not a big fan of the user interface that Apple have put together here, because I think it would be far easier to just have a slider so you can slide up and down to control each of these. Instead, you have to find the decrement button to double tap that and decrease, and the increment button to double tap and increase.

So let’s have a look at what the timbre does. I’m going to double tap.

VoiceOver: Timbre: 99. Decrement 99. timbre: 98. Decrement 98.

Jonathan: So it’s a slow process, but I’m going to double tap quite a lot of times.

VoiceOver: Timbre, timbre, timbre, timbre, timbre, timbre, timbre: 90. Decrement 90.

Jonathan: I paused the recording and double tapped lots of times. And now, we can really hear it happening.

VoiceOver: Timbre 67.

Jonathan: I’m kind of curious how low we can go. I’ll pause the recording again and keep tapping until I can’t tap anymore.

Alright. It’s gone all the way down to 50. And now, it sounds like this. [laughs]

VoiceOver: Timbre: 50. Decrement 50 dimmed button. Timbre: 50. Increment 50 button.

Jonathan: So it was saying that the decrement button is now dimmed because this is how low it can go. I’ve quickly pressed the reset button which has got the timbre back to 100.

We can go higher as well.

VoiceOver: Timbre: 101.

Jonathan: I’ll pause the recording again, and we’ll see how high the timbre can go and what that sounds like.

And oh my word, it does sound pretty peculiar.

VoiceOver: Reset voice defaults button. Reset previous settings button.

Jonathan: The timbre is set to 200 now.

VoiceOver: Reset voice defaults button.

Jonathan: Let’s do that reset to voice defaults. [laughs]

VoiceOver: Reset. Timbre: 100. Decrement 100 button.

Jonathan: The next control we have in this screen …

VoiceOver: Sentence pause: 1. Decrement 1 button.

Jonathan: is the sentence pause option. I find this quite useful because I’m a JAWS user, and JAWS tends not to have a very long pause between sentences which I appreciate. I can tell from the inflection that a sentence has ended. I don’t need myself slowed down by an unduly long pause.

So we can turn this down to 0, if we want. If you prefer longer pauses between sentences, you can increase that as well.

VoiceOver: Sentence pause: 1. Increment one button. Preview settings button.

Jonathan: And there’s the preview settings button. So let’s just see what happens.

VoiceOver: Hello! My name is Karen. I am an Australian English voice.

Jonathan: I haven’t looked at every single Vocalizer voice on the system, but this seems pretty typical of the options available to Vocalizer voices.

For other non-Vocalizer voices, you’ll have different options. Let’s have a look at Eloquence because there is a lot of change there.

I’ve got Reed highlighted in my US English voice. So if I flick down, …

VoiceOver: Speak sample.

Eloquence: Hello! My name is Reed.

Jonathan: I’ll flick down again.

VoiceOver: Open the voice settings.

Jonathan: And we can open the voice settings, and there’s a lot to play with here.

VoiceOver: Preview settings button.

Jonathan: And now, I’ll go to the top of the screen,…

VoiceOver: Eloquence. Back button.

Jonathan: and flick right.

VoiceOver: Voice settings, heading. Rate multiplier: 110. Decrement 110.

Jonathan: Some of these parameters have quite cryptic names, actually. So I’m going to take this down a few, and we’ll see what difference it makes.

And actually, to make that more apparent, I will switch to Eloquence for this part of the demo so you can hear what’s happening.

The lowest that this parameter goes …

VoiceOver: Rate multiplier: 50. Decrement 50, dimmed.

Jonathan: I’m gonna listen to that.

VoiceOver: Reset to voice default. Preview settings button. Hello! My name is Reed.

Jonathan: So that seems to have slowed the voice down. Let’s go to the top end of this one, which is 200.

VoiceOver: Rate multiplier: 200. Increment. Head size: 50.

Jonathan: And now, if I do a preview, …

VoiceOver: Reset to voice defaults. Preview settings button. Hello! My name is Reed.

Jonathan: So what appears to be happening here is you have your main speed control which is on the rotor and also in the iOS VoiceOver settings, and then the multiplier has an influence on that setting as well. In other words, if you’ve got the multiplier high, then 55 on the rotor is going to be much faster than when you’ve got the multiplier low.
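
In case a worked example helps, here is a tiny sketch of that relationship, assuming the multiplier simply scales the rotor rate. Apple doesn’t document the exact formula, so the arithmetic and numbers below are purely illustrative.

```swift
// Assumption-laden illustration: treat the per-voice multiplier as a simple
// scaling factor on the main VoiceOver speaking rate from the rotor.
let rotorRatePercent = 55.0        // main speaking rate
let rateMultiplierPercent = 200.0  // Eloquence rate multiplier (50 to 200)

let effectiveRate = rotorRatePercent * (rateMultiplierPercent / 100.0)
print("Effective rate (arbitrary units): \(effectiveRate)") // 110 here; 27.5 with a 50% multiplier
```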

The next option is …

VoiceOver: Head size: 50. Decrement 50 button.

Jonathan: the head size. And if we take that all the way down, …

VoiceOver: Head size: 0. Decrement 0 dimmed, button.

Jonathan: He does have a small head now, doesn’t he? [laughs]

If we go down, …

VoiceOver: Reset to voice default. Preview settings button.

Jonathan: And preview …

VoiceOver: Hello! My name is Reed.

Jonathan: If we go all the way up, he’s obviously going to have a very big head.

VoiceOver: Head size: 100. Increment 100 dimmed, button.

Jonathan: And if we preview, …

VoiceOver: Preview settings. Hello! My name is Reed.

Jonathan: We’ll reset it to defaults.

VoiceOver: Reset to voice defaults button. Rate multiplier: 100. Head size. Pitch base: 65 hertz.

Jonathan: We have pitch base.

VoiceOver: Pitch base: 65 hertz. Pitch range: 30.

Jonathan: Pitch range, which will give you greater inflection.

VoiceOver: Breathiness: 0. Decrement 0 dimmed, button.

Jonathan: And breathiness. Let’s see what happens if we increase the breathiness of Eloquence.

VoiceOver: Breathiness: 100. Decrement 100 button. Breathiness: 100. Increment 100. Reset to voice defaults button. Rate multiplier.

Jonathan: And obviously, you can add a little bit of breathiness because that was way at the other end. So if you like your Eloquence to sound a little breathy, you can give that a little bit more.

You can also adjust …

VoiceOver: Roughness: 0. Decrement 0. Roughness: 0. Increment 0 button.

Jonathan: Let’s increase the roughness and see what it’s like.

VoiceOver: Roughness: 100. Increment 100 dimmed, button.

Jonathan: Reset that to default.

VoiceOver: Reset to voice defaults button. Rate multiplier: 100. Decrement 100 button.

Jonathan: You can also enable …

VoiceOver: Higher sample rate switch button, off.

Jonathan: If we do that, this is what it sounds like.

VoiceOver: On. Roughness: 0. Increment. Higher sample rate switch button, on. Phrase prediction switch button, off.

Jonathan: We’ll turn that off.

VoiceOver: Higher sample rate switch button, on. Off.

Jonathan: There’s also phrase prediction which is designed to make things sound a bit more natural.

VoiceOver: Phrase prediction switch button, off.

Jonathan: I’ll flick right.

VoiceOver: Abbreviation dictionary switch button, on. Community dictionary switch button, off.

Jonathan: You can disable the abbreviation dictionary which I will do, actually…

VoiceOver: Abbreviation dictionary switch button, on. Off.

Jonathan: because I prefer to hear what’s actually written on the screen, not have my text-to-speech engine assume it knows what’s on the screen. The classic example of this is the old KeyNote. I can’t remember whether Pulse Data International had switched to the KeyNote Gold, or whether it was still using an Artic 263-based text-to-speech synthesizer. But it used to talk about phone number Aviv whenever it saw Tel Aviv.

VoiceOver: Community dictionary switch button, off.

Jonathan: There is a community dictionary. You can enable that, if you wish.

VoiceOver: Preview settings button.

Jonathan: And then, there’s a preview settings button, so you can have a lot of fun customizing Eloquence exactly the way you like it.

Another popular voice is Alex, and there’s some customization there.

I have switched to Alex now, and let’s see what customizations are available for that voice.

VoiceOver: Pitch range: 100. Decrement 100 button. Pitch range: 100. Increment 100 button.

Jonathan: Let’s take that way up and see what a wider pitch range sounds like for Alex.

VoiceOver: Pitch range: 200. Decrement 200 button. Pitch range: 200. Increment 200 dimmed, button.

Jonathan: Actually, I don’t think that sounds too bad at all. Let’s just go and do a preview.

VoiceOver: Reset to voice defaults button. Preview settings button. Most people recognize me by my voice.

Jonathan: And we’ll reset.

VoiceOver: Reset to voice defaults. Pitch range: 100. Decrement 100 button.

Jonathan: I’ll flick right.

VoiceOver: Pitch range: 100. Increment 100 button. WPM minimum: 50. WPM decrement 50 dimmed, button. WPM minimum: 50. WPM increment 50 button.

Jonathan: There’s words per minute minimum.

VoiceOver: WPM maximum: 500. WPM decrement 500 button. WPM maximum: 500. Preview settings button.

Jonathan: And that’s all we have in terms of Alex. I wish there were some kind of community dictionary for Alex because I quite like the voice in some ways. But I found that it mispronounces more than any other text-to-speech engine that I’ve come across, so a community dictionary would be great.

So this is fun if you like to tinker with your voices and have them sound just so. Apple has given us a lot of flexibility in iOS 17.

More Control Over VoiceOver Verbosity

Those of us who enjoy customizing exactly what our screen readers say will always celebrate new features in that regard. And there are some to talk about in iOS 17. You can find these new features in VoiceOver’s main settings screen, and then going to …

VoiceOver: Verbosity button.

Jonathan: Verbosity.

I’m repeating it because I’m not sure Karen is saying that very clearly.

I’ll double tap.

VoiceOver: Punctuation button.

Jonathan: And flick right.

VoiceOver: Customize how punctuation is output.

Speak hints switch button, off. System notifications button.

Jonathan: And here, we have system notifications.

This is something that we’re all familiar with. You could be reading a book or a news article, and a notification pops up, and it interrupts your speech, and you really wish it didn’t.

There was some customization for this scenario before. But now, there’s a lot more, and it seems to be far more effective – with one massive disappointment. So significant, in fact, that I do hope it’s fixed sometime during the iOS 17 cycle. I’ll explain that in a bit.

But I’m going to double tap on this.

VoiceOver: Notifications when locked, heading.

Jonathan: The first heading governs what VoiceOver will speak when your phone is locked and a notification comes in.

In my experience, these settings are honored most of the time, but not all the time. I have seen some notifications that come in that don’t seem to wake the phone up, and I can’t really account for that. But most of the time, it works.

VoiceOver: Selected. Speak.

Jonathan: This means that if I lock my screen and a notification comes in, it will wake up and speak the notification.

VoiceOver: Speak count.

Jonathan: I’m not really concerned with how many notifications there are. That’s just verbiage I don’t want, so I’ve unselected that. But if you want to hear that number, of course, enable it.

VoiceOver: Selected. Braille.

Jonathan: And I want notifications sent to my Braille display.

The final option is…

VoiceOver: Do nothing.

Jonathan: So you get no verbosity at all when you get a notification when your phone is locked. And some people may definitely prefer that.

I like the setting to be the way I’ve got it because I usually use my iPhone with made-for-iPhone hearing aids, and I don’t mind having notifications coming to my ears. But if you use your phone on speaker and your phone is locked, and this thing is unlocking all the time to speak, that could be not only distracting, but actually inappropriate in certain circumstances. So you can switch this off.

So all this governs what happens when your phone is in standby mode.

If we continue to flick right, …

VoiceOver: Determines what VoiceOver will output for notifications when your iPhone is locked.

Jonathan: Well, exactly.

VoiceOver: Banner notifications, heading.

Jonathan: And now, we’re on to banner notifications. These are the ones that pop up at the top of your screen when you’ve got your phone unlocked.

VoiceOver: Selected. Speak.

Jonathan: I have mine speaking.

And it’s worth noting that, at least when I upgraded to iOS 17, speak was not enabled. And I was quite concerned thinking that there was a bug because I would hear the notification sounds, but I’d no longer hear the notifications.

Turns out there’s this new option. And for whatever reason, they’ve chosen not to automatically speak notifications anymore. I don’t know whether that’s by design, or some sort of bug that might be fixed eventually.

But if you’re not hearing your notifications, you need to go in here and fix that to get things to the way they were before.

VoiceOver: Selected. Play haptic.

Jonathan: And a haptic as well.

VoiceOver: Selected. Braille.

Jonathan: You can have any number of these selected, or not selected. So for example, your preference could be that you don’t want to be interrupted by a banner notification, but you would like a nice haptic to tell you when a notification has come in, so you can tend to it at your leisure.

VoiceOver: Do nothing. Determines what VoiceOver will output when a banner notification appears.

Use ring switch switch button, off. When the ring switch is set to silent, no notifications will be spoken.

Jonathan: Now, I’m actually recording this part of the demonstration before my iPhone 15 has arrived. So I’m using an iPhone 14 Pro Max, and I imagine the language will be different when I get the iPhone 15. But since most people will have older devices in the initial stages, I’m going to leave that in.

Pretty cool that you can use the ringer switch to silence your notifications when that’s appropriate, and you can enable that feature if you want.

I mentioned that there’s one big disappointment, and that is that these features are not configurable in activities. If you’ve not used activities before, they allow you to customize the behavior of many VoiceOver settings, depending on a scenario like the application that you’re running.

One really easy example is that I usually have hints disabled on VoiceOver because I’ve been using it a long time, and navigating the home screen while hearing “double tap to open” all the time is tedium. Tedium, I tell you. So I have it switched off.

However, when I use the third-party podcast app Overcast, they use hints to speak an excerpt from the show notes when you flick through a podcast or a playlist of podcasts. I want to hear those show notes. So I’ve got an activity set up that, when I go into Overcast, will enable hints. And I don’t have to think about it now that it’s set up. It just happens in the background.

What I really think is missing here is the ability to customize these notifications based on activity. Let me give you a very simple scenario I’m sure many people will be able to relate to.

Let’s say that in most cases, your preference is to have your notifications spoken automatically when a banner comes in at the top of your screen. But you don’t want that to happen when you’re reading a book.

When you go into Apple’s Books app or the Amazon Kindle app, what you want is just a haptic to tell you that a notification has come in. And then, you can tend to it later. If these notification settings were configurable in activities, then you could do exactly that.

One sort of workaround that doesn’t do exactly the same thing is to set up a focus that triggers for book apps.

We’ve covered the focus feature in the last couple of iterations of this thing where we’ve gone through new versions of iOS, because I think focuses are one of the coolest new things to be added to iOS in a very long time. I use them extensively.

So when I go into my Amazon Kindle app or the Apple Books app, I have very minimal notifications. When certain people call or text me, or certain applications like DoorDash or Uber Eats send a push, I want that notification. But most of my notifications are silent.

The thing about that, though, is that I have to go into notification center to check if other notifications have come in. Whereas if this activity could be set up so that I can get haptic feedback, that would be much more immediate. But still, not intrusive. Still, this extra control of verbosity is very welcome.

While we’re in verbosity, I’m just going to go back.

VoiceOver: System notifications button.

Jonathan: And if we go quite a long way through this excellent screen of customization options, we have something else that’s new.

VoiceOver: Predictive text feedback button.

Jonathan: Apple’s done a lot with predictive text in iOS 17 to make it smarter and adaptable to you, and I’ll talk a bit more about that later in this episode. But for now, let’s take a look at what VoiceOver can do.

We’ll double tap.

VoiceOver: When predictive text appears, speak, play a sound button.

Jonathan: At the moment, it will speak the prediction and it will play a sound. If we go in here, …

VoiceOver: Selected. Speak. Selected. Play a sound. Change pitch. Braille. Do nothing.

Jonathan: So you’ve got quite a few options there. We’ll go back.

VoiceOver: When predictive text appears, speak. VoiceOver will output predictive text when it appears.

Jonathan: And the next option, …

VoiceOver: When predictive text feedback is entered, speak, change pitch button.

Jonathan: There’s an explanation.

VoiceOver: VoiceOver will output predictive text when entered.

Jonathan: If you want to make the most of some of the new predictive text smarts that are in iOS 17, this is very useful to know about, so that you can get the experience exactly how you like it.

Braille Improvements

Taking a look at Braille in iOS 17, there is a tip when we go into Braille settings.

VoiceOver: Tip: Launch apps with your Braille device in the home screen. Press return on your Braille device, start typing the name of an app, then press return to open the app.

Jonathan: This doesn’t work for me. I think it’s because I’m using a Mantis, which is essentially posing as a Bluetooth keyboard. But if you have a standard Perkins-style Braille keyboard, then this is very handy.

I’ll flick down.

VoiceOver: Hide tip.

Jonathan: And double tap.

VoiceOver: Input: contracted button.

Jonathan: And the tip has now gone away.

There’s another big Braille feature that’s been introduced in iOS 17, and it’s this.

VoiceOver: Sound curtain switch button, off.

Jonathan: Be careful with this one.

I’ll flick right.

VoiceOver: Sound curtain ensures that your phone does not play sound from music or sound effects when the Braille display is connected. Emergency alerts will still play sounds.

Jonathan: If you are a Braille user and you don’t want any kind of sound playing on your phone when you’re using your Braille display (sometimes, that can be very useful when you’re in a meeting, or some sort of environment where any kind of sound is not appropriate), then you can enable sound curtain. You’ll get a warning when you try to do so.

And in fact, let me try that now.

VoiceOver: Sound curtain switch button, on. Alert: sound curtain.

Jonathan: I’ll flick right.

VoiceOver: Sound curtain will disable all audio including VoiceOver, when a Braille display is connected. Are you sure you want to continue?

Jonathan: Now, I’m going to do that and just make sure that it works the way they say it will. So I’ll flick right.

VoiceOver: Cancel button. Okay button.

Jonathan: Double tap.

I’m still reading everything on the screen. But I’m flicking right. I’ll go to the bottom of the screen.

And even when I’m flicking right at the bottom of the screen, there is no sound indicating that I’m at the bottom of the screen, which is what you would normally get.

I’m flicking left, and I’ve found sound curtain, which I’m sure you can assign to a keyboard if you’re going to use this regularly. And sound curtain is on, so i will press VO with the spacebar.

VoiceOver: Off.

Jonathan: And now, I’ve got my speech back. So that’s sound curtain.

Also, if I continue to go right, …

VoiceOver: Enable Bluetooth on start switch button, off.

Jonathan: And there’s an explanation about what this feature does.

VoiceOver: Enable Bluetooth when VoiceOver starts, so that Braille displays can connect.

Jonathan: I tend not to disable my Bluetooth anyway, so I will leave this feature off. But it will be useful for some with Braille. You can always find a few nuggets buried away under the more info for your Braille display.

So if I flick right, …

VoiceOver: Choose a Braille display.

Jonathan: to that section, and we’ll keep going.

VoiceOver: APH Mantis Q40.

Jonathan: I’ll flick down.

VoiceOver: More info.

Jonathan: And double tap on more info.

VoiceOver: Braille commands button.

Jonathan: In this case, I’m just going to continue to flick right.

VoiceOver: Disconnect on sleep switch button, off.

Jonathan: This option, disconnect on sleep, is new to iOS 17. And if you enable it, it’ll do, well, what it says on the tin. When you lock your screen on your phone using the power button, or the phone just times out and locks if you have auto-lock enabled, your Braille display will disconnect. That will save some battery for the phone and the Braille display. When you power the phone back on again, it will try and re-establish the connection.

That could take a while. Hopefully not. And depending on the Braille display, it may be a little bit unreliable having the connection come and go like that.

So if you are concerned about maximizing your battery life on both devices, you can give this a try and see if it has any adverse effects. If it does, you can always switch it back again.

When I was exploring what’s new in iOS 17, I found this first when I was looking around under this Braille command setting. Because actually, this interface came to Braille first. You could configure Braille commands before Apple released the ability to configure the rest of VoiceOver.

Now, it’s a highly configurable system, and you can create your own VoiceOver user interface if you really want to. But good luck if you do that and try and call Apple Support because it’s going to be very difficult for Apple Support to help you if all your commands are different. Anyway, it’s a highly configurable system.

Locate the Centre of the Screen From Your Keyboard

Now, and one of the wishes that I have had ever since I started using iOS was that there’d be a way from the keyboard to get to the center of the screen. This is something that I do very regularly in a wide range of apps such as Lary, my RSS reader, Mona for Mastodon, many applications actually where if you’re reading something and you go to the previous screen and you want to return to that position, often just tapping the center of the screen will get you to your last red position. There hasn’t really been an easy way to do this from the keyboard, which has made using the iPhone completely touchscreen free, a little bit challenging for a VoiceOver user.

Now that has been dealt with, what I’m going to do is show you this from the general interface, and not the Braille one. I mention it here though because I first discovered it when I was fossicking around in the Braille commands for new things.

But for everybody to get to this, even if you don’t have a Braille display, you go into your main VoiceOver settings screen. And then, go to …

VoiceOver: Commands button.

Jonathan: I’ll double tap on commands.

VoiceOver: All commands button.

Jonathan: And I’ll double tap all commands …

VoiceOver: Interaction button.

Jonathan: And flick right.

VoiceOver: Basic navigation button.

Jonathan: to basic navigation. I will double tap.

VoiceOver: Move in button.

Jonathan: And the one that we want is called …

VoiceOver: Move to item at center button.

Jonathan: It currently doesn’t have a keyboard command assigned to it. But if you want to do that, we can double tap.

VoiceOver: Touch gestures, heading. Add gesture button.

Jonathan: And flick right.

VoiceOver: Keyboard shortcuts, heading. Add keyboard shortcut button.

Jonathan: And you can double tap and add a keyboard shortcut, if you want to.

Lock Your Rotor in Place

There is a new feature in Rotor Settings. I’ve gone in there now, and I’ll flick right.

VoiceOver: Rotor items button. Change rotor with item switch button, on.

Jonathan: And we’ll get an explanation of this.

VoiceOver: Changes the selected rotor based on the VoiceOver focused item.

Jonathan: On is the default, and it’s how it’s always worked.

For example, if you are in an app and actions become available, the actions rotor automatically gets focused so that you can flick down and perform the actions.

If you switch this off, then you would have to manually rotate to the actions settings. This could be useful if you have a preference to just keep the rotor the way it is, unless you expressly choose to rotate somewhere else and change it.

So those are some of the VoiceOver changes in iOS 17.

[music]

Advertisement: When I finish the show, I’ll be rewarding myself by spending more time playing my favorite game of the moment. It’s called TimeCrest.

If you’re old enough to remember the classic text adventures, it’s kind of like that, but way better – with an epic soundtrack and many twists and turns.

In Time Crest, you’re the main character. A young mage named Ash contacts you frantically through a pocket watch, asking for your help. Ash’s world, Alincia, is about to be destroyed by falling meteors. But you demonstrate the ability to save Ash’s world by turning back time.

Chatting with Ash is kind of like texting a friend who needs help. And I must confess, I’ve come to look forward to the text conversations and what’s happening in Ash’s life.

Play Time Crest on your iPhone, iPad, or Apple Watch. Play it for hours at a time. And believe me, I have. Or you can dip in and out when you have a few minutes.

This is such a cool game.

Search for TimeCrest in the App Store, or find out more at TimeCrest.com. That’s TimeCrest.com.

Siri Loses the Hey in Some Countries

There are a couple of notable new things where Siri is concerned. One is that in most English-speaking countries, you no longer have to prefix “Siri” with “hey”.

Unfortunately, with the English New Zealand Siri, you still have to. But I’ve got my Siri speech set to English UK, which seems to suit my voice well enough. And that means I have been able to drop the “hey” from the beginning of the phrase.

If you want “hey Siri” back again, you’re absolutely able to do that. Let’s have a look at this.

Siri, open Siri settings.

VoiceOver: Settings. Reset hidden suggestions button.

Jonathan: I’ll go to the top of the screen.

VoiceOver: Settings. Back button.

Jonathan: And flick right.

VoiceOver: Siri and search, heading. Ask Siri, heading. Listen for: Siri or hey Siri button.

Jonathan: I’ll double tap.

VoiceOver: Selected. Siri or hey Siri. Hey Siri, off.

Jonathan: You can turn listening off completely. If you want the hey Siri back again for whatever reason, you’re able to set that here as well.

Of course, in reality, if it’s listening just for Siri, then if you say “hey Siri”, it’s also going to work. It just ignores the “hey”.

Most people will have this option, but if it’s not available for you, then you’ll need to change your Siri input language to something other than the one you’re using.

VoiceOver: Listen for: Siri or hey Siri button.

Two New UK Siri Voices

If you like your Siri to speak with a UK accent, you’ve got 2 more choices now. Let’s have a listen to those.

VoiceOver: Siri voice: American, voice 4 button.

Jonathan: We’ll double tap.

VoiceOver: Variety, heading.

Jonathan: And flick right.

VoiceOver: Selected. American. Australian. British. Indian. Irish. South African. Voice, heading.

Jonathan: So those are the voices I’ve got access to. I’ll flick left.

VoiceOver: Irish. Indian. British.

Jonathan: Choose British.

Siri female: Hi! I’m Siri. Choose the voice you’d like me to use.

Jonathan: Now, we’ve got 4 UK Siri voices this time round.

VoiceOver: Voice, heading. Voice 1. Voice 2. Voice 3. Voice 4.

Jonathan: VoiceOver didn’t speak it but actually, it’s voice 3 that’s selected. So that’s one of the new Siri voices.

We’ll go to voice 4, …

VoiceOver: Voice 4.

Jonathan: And try this.

Siri male: Hi! I’m Siri. Choose the voice you’d like me to use.

Jonathan: Let’s give this a bit more of a workout. Siri, what’s the weather today?

Siri male: Looks like it will be pretty windy today. Daytime temperatures will hover around 17 degrees, with overnight lows around 12.

Jonathan: And it really is windy. If you can hear a bit of wind outside while I’m recording this, that’s why. It’s blowing a gale out there.

Let’s ask the other voice. Siri, what’s the weather today?

Siri female: Looks like it will be pretty windy today. Daytime temperatures will hover around 17 degrees, with overnight lows around 12.

Vary Siri’s Speaking Speed

Jonathan: There are other settings pertaining to Siri, and they live under the Siri part of accessibility settings. I’m going to take you there now because there’s a feature that many of us will appreciate.

VoiceOver: Speaking rate: 100%. Adjustable.

Jonathan: Thankfully, they’re using a standard slider control instead of those inefficient buttons that they’ve put into the speech settings.

So I’m going to go down.

VoiceOver: 90%. 80%.

Jonathan: It goes down to 80%. Let’s see what that sounds like.

Siri, what’s the weather today?

Siri female: It will be windy today. Daytime temperatures will hover around 17 degrees, with overnight lows around 12.

Jonathan: And it goes all the way up to 200%, which sounds like this.

Siri, what’s the weather today?

Siri: Looks like it will be pretty windy today. Daytime temperatures will hover around 17 degrees, with overnight lows around 12.

Jonathan: So for those of us who like our speech cranked up, it is good to see Siri offering this feature in accessibility settings.

Personal Voice

Personal voice is another feature announced back in May, when Apple previewed some of the accessibility features in iOS 17. Personal voice is intended for people with speech difficulties who may be losing their voice and want to preserve an electronic version of it, so that they can type material they want spoken and have a voice that sounds something like their own speak it.

We’re all familiar, of course, with Professor Stephen Hawking, and how he made that DECtalk-like speech his own, and people associate that speech now with him. He had ALS, of course. If he had been able to use personal voice at the time he was diagnosed, but before he lost his power of speech, we would have been able to hear his voice instead of that electronic voice.

Once you’ve created your voice, it’s stored securely on your device. You can, if you want to, choose to share it with other devices that are logged into your iCloud account.

We’ll talk you through this process, how it works, and some of the VoiceOver accessibility limitations of the current implementation.

I’m on the main accessibility screen of my iPhone now, and I’m going to navigate by heading.

VoiceOver: Vision, heading. Physical and motor, heading. Hearing, heading. Speech, heading.

Jonathan: Not surprisingly, Speech is where personal voice lives.

If I flick right, …

VoiceOver: Live speech off button.

Jonathan: Live speech is associated with personal voice, but you don’t have to set personal voice up in order to use live speech. When you enable live speech, you can type or select from a preselected series of phrases that you’ve given the phone in the past, and the phone will speak them. You can use any voice on the system, or you can use your personal voice.

So if I flick right now, …

VoiceOver: Personal voice button.

Jonathan: This is the personal voice screen.

I’ll double tap.

VoiceOver: Accessibility. Back button.

Jonathan: And flick right.

VoiceOver: Personal voice, heading. Jonathan’s personal voice 1, the 24th of August, 2023 button.

Jonathan: I created my personal voice during the iOS 17 beta cycle.

I’ll flick right.

VoiceOver: Create a personal voice button.

Jonathan: Let’s create a voice. And we’ll take you through some of this process, so you understand what’s involved.

VoiceOver: User authentication. Try again. Alert.

Jonathan: Let me look at my phone.

VoiceOver: Face ID authentication.

Jonathan: There we go.

VoiceOver: Create your personal voice, heading. Record yourself. Read 150 phrases aloud, which may take about 15 minutes.

Generate your personal voice. iPhone will create and securely store your personal voice.

Communicate with live speech. Type to speak using personal voice through your device’s speaker, or in FaceTime, phone, and assistive communication apps.

Jonathan: That’s a good quick summary of what this feature does. So you can use this in real time not just on your phone’s speaker. If you’re making a phone call and you need to be able to use your voice there, then you are able to do that.

I understand some people have actually been able to use these to authenticate with banking services and other systems that require you to authenticate by voice. So in that sense, it appears to be fairly accurate.

Although, in my opinion, it’s not a patch on the natural sounding voices that ElevenLabs have been producing. And we’ve talked about ElevenLabs on this podcast before.

If I were to double tap the Continue button, it would give me 150 phrases that I need to say. And if you’re wearing headphones or something like that, you can hear the phrases, double tap, record the phrase, and then go on to the next phrase. Of course, it works beautifully with a Braille display as well.

When you’ve recorded your 150 phrases, you’ve then got to plug your phone in and lock it, because all of the processing happens on your device, and it’s pretty intense. So I suspect that how long this takes may depend on the speed of your device. And in fact, some earlier devices will not work with personal voice because they’re not powerful enough.

My test phone during the cycle was an iPhone 11, and I couldn’t use personal voice until I actually put it on my primary device which at the time was an iPhone 14 Pro Max.

When you’ve created the voice, you can double tap it. You can export the recordings that were used to create the voice. You can stipulate whether you want this shared across your iCloud account, and you can also delete the voice if you want to.

Now I’m going to back out of this screen.

VoiceOver: Apple Watch mirroring button.

Jonathan: And just go back to speech.

VoiceOver: Hearing, heading. Speech. Live speech off button.

Jonathan: We need to turn live speech on, so I’ll double tap this button.

VoiceOver: Accessibility. Back button. Live speech, heading. Live speech switch button, off.

Jonathan: It’s off at the moment.

VoiceOver: Triple click the side button to show live speech. Favorite phrases button.

Jonathan: If you were using this for its intended purpose, you might have a favorite phrase, especially if you were making a phone call that said, “Hello! My name is Jonathan. I’m not able to use my own voice, so you’re hearing a copy of it.

It may take me a while to type my responses to you, so please be patient if there’s a pause between when you finish saying something and when I reply.”

It would be a nuisance to have to type that every time, so you could store that as a frequently used phrase.

I’ll double tap.

VoiceOver: Hello! My name is Jonathan Mosen. I’m speaking with the assistance of Apple’s personal voice feature.

Jonathan: I’ll flick left.

VoiceOver: Add button.

Jonathan: And I’m going to add one.

VoiceOver: Add. Save, dimmed button. Phrase. Text field.

Jonathan: I’m going to pause the recording and type a fairly lengthy phrase into this, so we can hear what the voice sounds like.

I’ve typed in that phrase. It was quite a long phrase, actually. I don’t know whether there’s any kind of character limit on this. I haven’t come across it yet.

So I’ve backed out to the previous screen, and we need to enable live speech now.

VoiceOver: Live speech switch button, off.

Jonathan: I’ll double tap.

VoiceOver: On.

Jonathan: And now, I’ll go back to the home screen.

VoiceOver: Settings.

Jonathan: Now that that’s switched on, we triple click the side button, and you will get a choice of turning VoiceOver on and off, or live speech. But of course in the iPhone 15, with its action button, you could assign live speech to the action button.

So I’m going to triple click the side button.

VoiceOver: Alert. Accessibility shortcuts. Selected. VoiceOver button. Live speech button.

Jonathan: Live speech is not selected at the moment, so I’ll double tap it to select it.

VoiceOver: Launcher. Call Bonnie button.

Jonathan: Now, we have a window that’s sort of popped onto the screen. I think it floats, and it can be quite difficult to locate by touch. But it seems to be quite close to the dock, so I’m going to try and find it.

VoiceOver: Dock. Drafts. Close. Hello!

Jonathan: There we go. I found it just to the right of the dock, actually.

VoiceOver: Hi! My name’s Jonathan, and I’m the host of the Living…

Jonathan: There’s the second phrase that I just typed in. Now I’m going to double tap that. It should start to speak through my personalized voice, and you can be the judge of how good it sounds.

Personal Voice: Hi! My name’s Jonathan, and I’m the host of the Living Blindfully podcast.

If you haven’t heard of Living Blindfully before, it’s a podcast, usually published every week, that looks at a wide range of issues from a blindness perspective.

Sure, we talk about technology, because who doesn’t enjoy geeking out regularly? We also talk philosophical, political, and other important issues.

I hope you’ll give it a listen. It’s available in all the places where the groovy podcast can be found.

Jonathan: Now, I don’t think that’s particularly impressive. It does sort of sound like me, but the inflection’s really strange, and it’s not saying everything particularly well. So I’d like to have another go at recording the voice to see if it’s any better.

There’s also an edit field.

VoiceOver: Hi! Hello! Close button. Keyboard button.

Jonathan: You can double tap the keyboard button.

VoiceOver: Launcher. Call Bonnie button.

Jonathan: Now, I’ll go back.

VoiceOver: Messages. Dock. Mail.

Jonathan: Find the dock.

VoiceOver: Drafts. Phrases button. Close button. Type to speak. Text field. Is editing. Insertion point at start.

Jonathan: I’ve got my Mantis connected, so I have my Bluetooth keyboard. And I’ll just say tap, tap, tap. Is this thing on?

Now if I press enter, …

Personal Voice: Tap, tap, tap, is this thing on?

Jonathan: There we go. It now speaks it.

So actually, it has become a little bit more accessible since the beta, which is, I guess, why we have betas.

So it is usable. You just locate the dock, and then flick right, and you will get focus into this little personalised voice section once it’s come up on the screen.

Now that it is a bit more accessible, I am inclined to have another go and see if the voice can be made to sound much better than it currently is.

But let’s put this in perspective. If you’ve got a choice between completely losing any semblance of your voice and having this on your phone at any time so that people can hear you, then it’s a no-brainer, and this is a really fantastic accessibility tool that Apple’s come up with.

Like many accessibility features, other developers are seeing the potential. And it is possible to grant third-party apps access to your personal voice. This doesn’t happen automatically. You’ll have to grant permission. A notification will come up every time an app wants to use your personal voice.

One app that is very quick out of the gate in this regard is Carrot Weather, which is a fantastic, very flexible weather app. It’s famous for the snark that it introduces into its weather forecast.

But in addition to it being a fun app, it’s a very powerful, configurable weather app. And with the version that was just released on iOS 17 release day, you can now choose to have the weather forecast read in your voice.
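If you’re a developer wondering how an app like Carrot Weather gets permission to do this, the hook is Apple’s speech synthesis framework. Here is a minimal Swift sketch, based on the AVSpeechSynthesizer personal voice APIs in iOS 17, of how an app might ask for access and then speak with whatever personal voice the user has created. I haven’t seen Carrot Weather’s code, so the class and method names here are purely illustrative.

import AVFoundation

// A hypothetical helper that asks the user for access to their personal voice,
// then speaks a phrase with it if permission is granted.
final class PersonalVoiceSpeaker {
    private let synthesizer = AVSpeechSynthesizer()

    func speakWithPersonalVoice(_ text: String) {
        // iOS 17 requires explicit authorization before an app can see personal voices.
        // This triggers the system prompt mentioned above.
        AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
            guard status == .authorized else {
                print("Personal voice not available: \(status)")
                return
            }

            // Personal voices appear alongside the system voices,
            // flagged with the isPersonalVoice trait.
            let personalVoices = AVSpeechSynthesisVoice.speechVoices()
                .filter { $0.voiceTraits.contains(.isPersonalVoice) }

            guard let voice = personalVoices.first else {
                print("No personal voice has been created on this device.")
                return
            }

            let utterance = AVSpeechUtterance(string: text)
            utterance.voice = voice
            self.synthesizer.speak(utterance)
        }
    }
}

The permission is per app, which lines up with what I described: each app has to ask before your personal voice even shows up in its list of available voices.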

Liven Up Your Home Screen With Interactive Widgets

Gradually, over the last few years, Apple has been embracing widgets.

And I can hear all the Android listeners saying, “Yeah, we’ve been doing that for years.” And indeed, they have. Android had widgets long before iOS did.

Widgets first came to the Today screen, which in fact, was designed for widgets. Then, they were allowed onto the Home screen. They eventually made it to the Lock screen. And in iOS 17, widgets are interactive, which is a big feature in my view that’s going to add a lot of value over time as developers embrace this new concept.

What is a widget? Well traditionally, when widgets got started on iOS, they were bits of information that you could have readily available to you because they were on your Home screen. They were a little bit of an app that could appear for you without you having to go into the app.

But increasingly, widgets are about more than just providing you with a bit of information. They allow you to interact with an app in some way as well.

This really challenges those of us who’ve had iPhones for a very long time, or who’ve never really gone beyond the defaults to think about our Home screens as more than a static grid of apps.

Over the last few versions of iOS, Apple has been acting on its notion of what a new Home screen should be like. They really encouraged us to think differently about the Home screen when they introduced the App Library. Because with the App Library, even when you get an app from the App Store, it doesn’t have to appear anywhere on your Home screen if you don’t want it to. You can leave it in the App Library, and go and look at it if you want to. You can launch it using Braille Screen Input, or Spotlight Search, or Siri, and it need not ever appear on your Home screens at all.

Some of us organise our apps by pages. So if you have a few apps pertaining to travel, then you might put them all together on a particular page or two. If you have apps relating to food delivery services like Uber Eats or DoorDash, you might cluster those together.

Those of us who really are neat freaks (and I count myself among this group) put our apps into folders grouped by subject matter. I have a blindness folder, a productivity folder, a utilities folder, one full of audio recorders, and so on. And I have hundreds of apps tucked away like that.

On page 1 of my Home screen, traditionally, I’ve just had the apps I use the most and that I want available to me regularly. Widgets have encouraged me to rethink this concept because there are so many ways of launching an app. When you think about it, it’s a waste of precious real estate to just have this static grid of apps. Or at least, you can have some apps while also having some information you want to use regularly.

So I’ll show you what I’ve done, and then we’ll talk about how we make it happen.

I used the Today screen right from the beginning, and I still use it in the way that Apple intended.

So I’m on page 1 of my Home screen, where my apps and some of my widgets are now. And if I perform a 3-finger flick right, I’ll get to the Today view.

VoiceOver: Showing Today view.

Waterminder. Your progress is 83%.

Jonathan: I’m interested in this. It tells me that I have consumed 83% of my water intake.

I’ll flick right.

VoiceOver: Batteries. Black Pearl. 46% charged.

Jonathan: That’s my iPhone 14 which is a black iPhone, so it’s called Black Pearl.

VoiceOver: The blind watch wearer. 59% charged.

Jonathan: That’s my watch.

VoiceOver: Fitness. Move ring. 114%. Exercise ring. 106%. Stand ring. 66%.

Jonathan: That’s the fitness, and I always like to keep track of how my fitness is doing. So it’s right here on my Today view, and I can consult at any time.

I’ll flick right twice because if I flick right once, it will just give me the same data again, this time expressed in numerical value.

VoiceOver: 32 minutes out of 30 minutes. Fantastical. Today, Sunday. 17/9/23 button.

Jonathan: I have my Fantastical widget here. We’ve talked about Fantastical on the podcast before. It is a calendar app.

It also integrates with the Reminders app, and that’s one of the reasons I like having this widget on my Today screen because with one widget, I’ve got my Reminders and my appointments.

To give you a quick example of that, …

VoiceOver: 2:15 PM. Reset the MushroomFM PC button.

Tomorrow. Monday, 18/9/23 Button. 7 AM. Take out glass recycling button.

Jonathan: So I’ve got the Reminder to take the recycling out.

VoiceOver: 9 o’clock to 9:30 AM. Microsoft Teams meeting.

Jonathan: And then, I’ve got my first meeting of the day at 9 AM on Monday. And I’ll be able to scroll through my appointments.

Next on my Today screen, I have my share portfolio updating in real time with a service I use in New Zealand called Sharesies. It tells me the dollar value of my share portfolio and the return on my investment.

And after that, I have the Parcel app.

VoiceOver: Parcel. Selected. iPhone 15 Pro Max.

Jonathan: There we go. And it’s telling me about the iPhone 15 Pro Max, where that is, and other information as well. So that’s the Parcel app showing me in real time where parcels that I’m tracking are.

It’s really easy to create your Today screen, and what you’ll be able to add will depend on the apps that you have installed on your phone.

I’ll flick down.

VoiceOver: Edit mode.

Jonathan: And we’ll go into Edit mode.

VoiceOver: Started editing. Parcel. Is editing. Widget.

Jonathan: Now, I can see all the widgets that are on my Today screen. If I go to the top of the screen, …

VoiceOver: Add widget button.

Jonathan: There’s an Add Widget button. I’ll double tap.

VoiceOver: Search widgets. Search field.

Jonathan: You can search for the widgets that are installed on your system. Some apps will have widgets associated with them, some will not. And there are some apps that I wish did have widgets, but don’t.

VoiceOver: Music. Smart stack widget. Stack. Medium. Suggested button.

Jonathan: A smart stack is a particular kind of widget. When you install it, the data that it shows in its main view will vary depending on a range of factors such as location and time of day. If you want to see something else on the stack, you can flick up and down, and change the widget that has focus.

If I flick right, …

VoiceOver: 10%. Up next. Widget description. Pick up where you left off in your most recent course widget. Small. Suggested button.

Jonathan: We’re getting some suggestions here. But if I continue to flick right, we’ll get into an alphabetical list of the apps that I have on my device that offer widgets.

VoiceOver: Widgetsmith. Small widgets. Widget description. Once added, tap on it to configure which of your Widgetsmith widgets is shown here.

Widget. Small. Suggested button.

Jonathan: Widgetsmith is one of 2 utilities that I have on my iPhone that allow you to create widgets. I’ll talk about another of these a little bit later.

VoiceOver: Calm. All day calm. Widget description. Start each day with a daily calm. Take an afternoon break with a few deep breaths, and fall asleep each night to a new sleep story. Widget. Small. Suggested button.

Weather gods, active button. Amazon Alexa button.

Jonathan: Okay. Now we are in the alphabetical list.

VoiceOver: American button. AnyList button. App Store button. Audible button.

Jonathan: And I have quite a few widgets here. If I double tap the name of the app, for example, Audible, you may have several (quite a large number in some cases) widgets that you can install.

Now, widgets can be on your Today screen, or they can just be on any page that you like of your home screen. And some widgets that are on your Today screen can be different, in terms of their functionality, from those which are able to appear on your home screen or on your lock screen.

I’m going to get out of this mode. I’ll go to the home screen.

VoiceOver: Add widget button. Finished editing.

Waterminder. Your progress is 83%.

Jonathan: So I’m back on my Today view now. I’ll go back home.

VoiceOver: Launcher. Call Bonnie button.

Jonathan: Now, you will have heard this before when we were traversing the home screen, and it said, “Launcher. Call Bonnie.” This is the second app that I have on my phone that allows me to create widgets. And this widget allows me to, with a double tap, call Bonnie. It does confirm the number that I’m going to call, because you don’t want to accidentally double tap the wrong thing and make a phone call to someone. But right at the top of page 1 of my main home screen, I’ve got a little widget that calls Bonnie. It’s extremely useful.

If I flick right, …

VoiceOver: Messages. 1 unread message.

Jonathan: You’ve got your standard message screen there.

VoiceOver: Settings. My Roger mic. App Store. Music. Sonos. MetService. Paparangi. 17 degrees. Arc condition: windy. Large. Image. 16 degrees. Arc thermometer. 11 degrees.

Jonathan: So in the middle of the apps, I’ve got a widget from the New Zealand MetService, which provides very local weather to us, much more so than Apple’s Weather app. And I’ve got the weather conditions right on page 1 of my home screen.

So you can mix and match. You can have some apps, you can have some widgets.

To understand the effect that this is going to have on your home screen, you need to understand that there are 3 sizes of widgets. The iPhone home screen grid is 4 apps wide by 6 apps tall, for a maximum of 24 app icons on one page.

A small widget is 2 wide by 2 tall. So if you install a small widget onto your home screen, it’s going to take the space of four app icons.

A medium sized widget is 4 wide by 2 tall. So it’s taking the space of eight app icons, 2 rows of apps.

And a large widget is 4 wide by 4 tall. So it’s taking the space of 16 apps or 4 rows of apps.

A small widget can share horizontal space: the remaining 2-wide-by-2-tall area alongside it can hold another small widget, or up to four app icons.

So if you start installing widgets onto pages of your home screen that already have apps, you may find that there’s no room for your apps. In which case, those apps will be nudged onto subsequent pages of your home screen.

So on the first page of my regular home screen, that’s not the today view but the first page that can include apps as well as widgets, I’ve got essential things. I’m going to perform a 3-finger flick left to get to page 2 of my home screen.

VoiceOver: Page 2 of 4. home. Broadcasts. MushroomFM. Recently played button.

Jonathan: I’ve got essential things on page 1. But on page 2, I have 2 widgets for 2 apps that I use a lot.

The first is the Broadcasts app. This is a great app for listening to Internet radio. It is my favorite Internet radio app by a long way, and it remembers the stations I’ve been listening to recently.

So what I need to do is double tap, for example, MushroomFM, and it will start to play without me having to go into the app.

If I flick right, …

VoiceOver: BBC Radio 4 button. RNZ Radio, New Zealand in Parliament button. BBC Radio 5 live sports extra button.

Jonathan: So it’s super handy to have these stations that I listen to regularly, just right there on my home screen for me to access. Thanks to the Broadcasts large widget.

If I listen to something new, then the stations will reorder and the oldest listened to station will drop off.

And of course, if I just double tap the Broadcasts label at the beginning of the widget, I can launch the full Broadcasts app, which means there’s no need to have the Broadcasts app itself on my home screen because I can launch it from the widget.

Next to that, I have this.

VoiceOver: Overcast. Resume Q+A. The 17th of September, 12 minutes left. David Seymour: welfare, crime, and ACT’s lost candidates. Image.

Jonathan: I’m running the beta of Overcast. And at the time I’m recording this, this is a beta that is iOS 17 compatible.

And this is an example of an interactive widget. If I double tapped here, it would get me into the Overcast app. But if I flick right, …

VoiceOver: Play button.

Jonathan: There’s a play button. And by double tapping the play button, I don’t even go into the Overcast app at all. It’ll just start playing that particular podcast.

I’ll flick right.

VoiceOver: New. The Sunday Session with Francesca Rudkin. the 17th of September, 16 minutes. Tammy Nielsen. Country star recaps adventure filled 2023 ahead of her rock and roll review to… image.

Jonathan: I’ll flick right.

VoiceOver: Play button.

Jonathan: And there’s a play button for that podcast as well.

And this is all happening on my home screen. It really has made my home screen come alive.

I’m going to perform a 1-finger triple tap.

VoiceOver: Remove widget button.

Jonathan: There’s a remove widget button.

VoiceOver: Edit home screen.

Jonathan: and an edit home screen button.

Sometimes on some widgets, there is an edit widget button. You don’t always see this, but you do sometimes. And that can allow you to customize the specifics of the widget.

For example, Drafts, which we’ve featured extensively on Living Blindfully before, has a whole lot of widgets, one of which allows you to bring up a specific draft on a specific page of your home screen.

So I actually have a page on my home screen now for Living Blindfully. And it does two things.

First, there’s a widget for the mail app that allows you to specify which mailbox you have on your home screen. And you can have multiple implementations of this widget, by the way.

But on the Living Blindfully page, I have it showing me the most recent messages that have come into the Living Blindfully mailbox right there on my home screen, without me even having to open the mail app.

And the second widget that’s on that same page pertaining to the podcast is my Living Blindfully ideas scratch pad in drafts. Whenever I think of someone I might want to interview or a point I might want to cover on the podcast, I write it in this draft.

Having it right there on the Living Blindfully page means that I can just double tap, go straight into the draft, and write something down. It’s effortless, and it makes the home screen so much more useful.

But to configure that, I had to triple tap on the Drafts widget, choose edit widget, and specify the name of the draft that I wanted that particular instance of the widget to open.

I’ll go back to the home screen.

VoiceOver: Broadcasts.

Jonathan: If you want to add widgets to your home screen, it’s very easy. We’ll just flick down, …

VoiceOver: Edit mode.

Jonathan: I’m going to edit mode by double tapping.

VoiceOver: Started editing. Broadcasts. Is editing.

Jonathan: And at the top of the screen, …

VoiceOver: Add widget button.

Jonathan: There is now an add widget button. You can double tap.

Now, we’ve got our list of widgets on the screen that are applicable to a home screen page. And since we’ve been talking about drafts, …

VoiceOver: Drafts button.

Jonathan: I’ll double tap that one.

VoiceOver: Grid.

Jonathan: Having chosen the app, we’ve now got all the widgets that that app offers. And if I go to the bottom of the screen,

VoiceOver: Close button. Add widget button. Page 1 of 9. Adjustable.

Jonathan: We see that there are 9 widgets available. We’re on page 1, which is a grid view of the drafts.

If I flick back, …

VoiceOver: Drafts grid. Widget description. Configurable access to commands, workspaces, and actions. Widget. Small button.

Jonathan: Apple does require the developer to provide a description of the widget, so you know what it does. And it will always tell you the size of the widget. Sometimes, the same kind of widget will be available in multiple versions – small, medium, and large.

So if I go to page 2, …

VoiceOver: Add widget button. Page 1 of 9.

Jonathan: We’re now on page 2.

VoiceOver: Page 2 of 9.

Jonathan: And we’ll go back.

VoiceOver: Drafts grid. Widget description. Configurable access to commands, workspaces, and actions. Widget. Medium button.

Jonathan: Now, we’ve got a medium sized widget, which obviously takes more space on your home screen but offers more functionality.

And if I go up again, we’ll get the large version of the same widget which takes up a lot of space.

Now, there are quite a few news apps that offer widgets. So if you want to, you can have a couple of pages of your home screen assigned to bringing you the latest breaking news.

I also have a page on my home screen where I have a couple of audio apps that offer widgets, including Just Press Record. So if I ever need to record something in a hurry, there’s a widget that activates Just Press Record, and then activates the record button without me having to do anything else.

Widgets are a lot of fun to play with. They add a lot of functionality. They’re very flexible. And with the addition of interactivity in iOS 17 so that you can actually have controls for your app as part of the widget, I think that you’ll find a lot more apps now getting into the widget space.
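For the developers in the audience, the interactivity comes from Apple’s App Intents framework. In iOS 17, a button or toggle inside a widget can run an app intent in the background, without opening the app, which is essentially what the Overcast play button is doing. Here is a small, hypothetical Swift sketch of a podcast-style widget with a play button. The intent name and what it would do are made up for illustration; this isn’t Overcast’s actual code.

import WidgetKit
import SwiftUI
import AppIntents

// A made-up intent. The system runs perform() in the background
// when the widget's button is double tapped, without launching the app.
struct PlayLatestEpisodeIntent: AppIntent {
    static var title: LocalizedStringResource = "Play Latest Episode"

    func perform() async throws -> some IntentResult {
        // A real app would start audio playback here.
        return .result()
    }
}

struct EpisodeEntry: TimelineEntry {
    let date: Date
    let episodeTitle: String
}

struct EpisodeProvider: TimelineProvider {
    func placeholder(in context: Context) -> EpisodeEntry {
        EpisodeEntry(date: .now, episodeTitle: "Latest episode")
    }

    func getSnapshot(in context: Context, completion: @escaping (EpisodeEntry) -> Void) {
        completion(placeholder(in: context))
    }

    func getTimeline(in context: Context, completion: @escaping (Timeline<EpisodeEntry>) -> Void) {
        completion(Timeline(entries: [placeholder(in: context)], policy: .never))
    }
}

struct PodcastWidgetView: View {
    let entry: EpisodeEntry

    var body: some View {
        HStack {
            Text(entry.episodeTitle)

            // Giving a Button an intent is what makes the widget interactive in iOS 17.
            // VoiceOver presents this as a play button, much like the Overcast example above.
            Button(intent: PlayLatestEpisodeIntent()) {
                Label("Play", systemImage: "play.fill")
            }
        }
        .containerBackground(.fill.tertiary, for: .widget)
    }
}

struct PodcastWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "PodcastWidget", provider: EpisodeProvider()) { entry in
            PodcastWidgetView(entry: entry)
        }
        .configurationDisplayName("Now Playing")
        .description("Play the latest episode without opening the app.")
    }
}

Before iOS 17, tapping anywhere on a widget could only deep-link into the app, so a control like that play button simply wasn’t possible.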

There’s one final thing I want to talk about that’s pertinent to widgets, and that is something that was added a couple of versions ago but people may not know about.

Once you start to play with widgets and you have large widgets that cause your apps to be moved onto pages you may not have anticipated, you may want to move the pages of your home screen around.

Now, it’s actually very easy to do that. I’m going to go back to the edit screen.

VoiceOver: Add widget. Search. Page 2 of 5. Adjustable.

Jonathan: Now, I’m on the adjustable page list and I’m going to double tap here.

VoiceOver: Home screen page hiding. Done button.

Jonathan: Now, we can flick right and we have a grid showing us our different home screen pages.

VoiceOver: Page 1, visible. Page 2, visible.

Jonathan: If you want to, you can hide any of these pages.

I’ll double tap.

VoiceOver: Page 2, hidden.

Jonathan: Double tap again.

VoiceOver: Page 2, visible.

Jonathan: It didn’t confirm automatically, but it is now visible again.

And if I flick down, …

VoiceOver: Jump to page 2. Drag page. Activate.

Jonathan: The drag page is a critical one. It does allow you to reorder the pages very easily.

Where this is handy is that you can also associate pages with focuses, and it’s extraordinary how configurable iOS has become. This might be useful, for example, if you’re traveling and you can set up a focus for travel, which has a page of apps providing real-time information.

Perhaps you are flying with an airline that’s giving you real-time information via widgets.

Perhaps like me, you use the TripIt app. I’ve been using TripIt since the Symbian days. And if you’re not familiar with that, it’s an app that allows you to forward your confirmation emails from airlines, hotels, and other travel-related providers to them, and they build an itinerary for you. With TripIt Pro, they’re constantly monitoring things and sending you push notifications if your gate changes, or if your flight’s delayed, any kind of thing like that.

Now, I have a TripIt widget, along with widgets from Hilton and the airlines I use regularly, and I don’t necessarily need to have that page visible when I’m not traveling. But if I have a travel focus, then I can make that my primary page so that as I travel, all the pertinent information is right there. The combination of widgets, page reordering, and focuses is a very, very powerful mix.

Advertisement: Living Blindfully is brought to you in part by Aira, and I thank them for their sponsorship of the podcast.

You know we’ve become used to some businesses offering free Wi-Fi. It’s a nice touch, and it makes us feel valued whenever we come across it.

And I know similarly that when I learn about a business that has purchased Aira Access, it’s like putting out a massive “Blind people are welcome here.” sign. I know that if I need it, I’ve got a trained professional agent available to me to provide assistance, and that means that the business cares enough to pay for that. I appreciate that.

From airports, to Starbucks, to Target, and more, Aira Access can assist you to navigate, shop, browse and not be reliant on friends, family or others who may not understand our needs as well. And don’t forget that as well as the offerings in physical locations, there are other businesses providing Aira Access that can be used from home.

So you should check out the app to find out what’s available. To do that, just open the Aira Access section in the Aira Explorer app on your favorite device. You can also visit Aira’s website to find out more at Aira.io. That’s A-I-R-A.I-O.

New Ring and Alert Tones

One way that you can make your phone truly yours is to personalize it with tones.

You can download tones, there are tones built into iOS, and I actually make all my own ringtones.

In iOS 17, there are a lot of new tones, both for ringing and shorter alerts like text messages and other things like that. And they’ve also remastered some of the other ones which have been moved into the classic section.

These new tones have been generally well received. But there is one criticism coming from some quarters, and that is that the notification sound has changed. The traditional tri-tone that’s been around in various forms forever has gone, and you can’t change it back.

Some people are saying that the new notification sound, which you’ll hear the moment you start to use iOS 17, (assuming you haven’t already) is way too quiet, and there’s nothing that people can do because you can’t actually specify a default notification sound. Some apps will let you specify their notification sound, but it’s not something that’s in the operating system that says, “By default, I want all notifications to play this sound.”
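For context, the apps that do let you pick their own notification sound are supplying that sound themselves when they schedule or send the notification. As a rough Swift sketch (the sound file name and the function here are made-up examples), a local notification with a custom sound looks something like this, which is why the choice lives with each app rather than in a system-wide setting.

import UserNotifications

// A hypothetical reminder that plays the app's own bundled sound
// instead of the default iOS 17 notification tone.
func scheduleReminder() {
    let content = UNMutableNotificationContent()
    content.title = "Time to check in"
    content.body = "Your hourly reminder."
    // "chime.caf" is an illustrative file name; the file has to ship inside the app.
    content.sound = UNNotificationSound(named: UNNotificationSoundName("chime.caf"))

    let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 3600, repeats: false)
    let request = UNNotificationRequest(identifier: "hourly-reminder",
                                        content: content,
                                        trigger: trigger)

    UNUserNotificationCenter.current().add(request) { error in
        if let error {
            print("Could not schedule notification: \(error)")
        }
    }
}

But back to the sounds you can actually choose yourself.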

I won’t show you all of these because you can have fun playing with these yourself and choosing something that you like. But we’ll show you some of them, which you can find under the Sounds and Haptics settings screen that is on the main settings screen of iOS.

VoiceOver: Ringtone. Penny button.

Jonathan: Contacts who call me regularly all have their own personalized ringtone that I’ve picked and made myself. But my main contact ringtone is this.

VoiceOver: Penny.

[upbeat music]

Jonathan: I’m gonna stop that before I get pinged for it. But it is the piccolo trumpet solo from Penny Lane. Oh, I like that.

And my text tone is also the beep beep from Baby You Can Drive My Car.

But there are lots of new tones that you can choose from. Let’s have a listen to some.

VoiceOver: Arpeggio button.

[ring tone plays]

Jonathan: They’re quite modern-sounding.

VoiceOver: Braking button.

[ring tone plays]

Jonathan: And the haptics are doing some great things. They feel great when you’re getting the haptic feedback. [laughs]

VoiceOver: Canopy button.

[ring tone plays]

Jonathan: We could go on for a long time with this ’cause there are a lot. I think there are about 20 new tones. So you can play with these.

If you want to go through the old ones, they’re under classic. Some of them are sounding better than ever.

Changes to the Messages App

There are numerous adjustments to iOS 17’s Messages app. You’ll come across many of these,just in your regular use of the Messages app. But I want to highlight a couple, in particular.

To show you the first, we’ll go into our settings screen and find …

VoiceOver: Messages button.

Jonathan: I’ll double tap.

VoiceOver: Allow messages to access, heading.

Jonathan: This is the standard screen you’ll be well used to if you’ve gone to configure your messages app before.

There is one button I want to spend some time with, though, that is new, and it’s this one.

VoiceOver: iMessage apps button.

Jonathan: This allows you to configure the iMessage apps that are visible when you are using messages.

Just as some apps offer widgets, some also offer iMessage apps. But that screen can get very cluttered, and you find yourself having to navigate a series of apps that you know you will never use.

This has particularly become an issue since iOS 16, when Apple decided to move audio recording into the iMessage apps area and offer the ability to record an audio message via an app, rather than having a dedicated button that’s always visible. I do wish that Apple would take it back to the way that it was, but we have what we have at the moment.

So let’s have a look in the screen.

VoiceOver: Included with an app, heading.

Jonathan: And we’ll flick right.

VoiceOver: American switch button, off.

Jonathan: Now, these are the apps that I have installed that offer iMessage apps. I can toggle the ones off that I will never use.

VoiceOver: Apple store switch button, off. Cardhop switch button, off.

Jonathan: I’m pretty conservative with my iMessage apps.

So you can go ahead and toggle off all the apps that you don’t want visible.

When you have just the apps that you want to use, there are two other things you should know.

One is that the list is no longer permanently visible in each iMessage conversation. If you flick left from the edit field where you type your message, you’ll find an add button just to its left. Visually, it looks like a plus. You double tap that to expose the iMessage apps every time you want to use them. I think that’s a good feature.

And the second thing you should know is that this screen is reorderable. Even though at this stage, there’s no actions rotor for this, you can double tap and hold an app and drag it around the screen. The first thing I did was reorder the audio app, so that it’s at the top of the list.

So now, when I double tap the add button in an iMessage conversation, focus is immediately placed on audio because it’s at the top of the list, and I can start recording a voice message.

We’ll show you this as we look at the next big feature I want to highlight in messages, and this is called check-in.

If you’re a parent, you’ve probably said to your kids on numerous occasions, “Please let me know when you get there, just so I know you’re okay.” But this can apply to a loved one as well, or anyone that you care about.

For the check-in feature to work, both of you need to be running iOS 17. So if you’re following the steps that I’m about to outline and you find that it isn’t working in the iMessage conversation that you’ve chosen, it could be because the person that you’re communicating with is on an older version of iOS.

I’m in a conversation now with my nephew, Anthony, who is running iOS 17. And I’ve currently got focus on the edit field where I can type in my next message.

VoiceOver: Message. Text field. Is editing. iMessage. Insertion point at start.

Jonathan: When I flick left, …

VoiceOver: Apps button.

Jonathan: There’s an add button. If I double tap, it will expand to what is now, thank goodness, a much smaller list of available iMessage apps.

VoiceOver: Message. Text field. Is editing. iMessage. Character mode. Insertion point at start.

Jonathan: I find that focus gets a bit stuck. So sometimes, it’s necessary to put my finger somewhere else on the screen.

VoiceOver: YouTube.

Jonathan: Okay. That’s got me unstuck, so I now go to the top of the screen with a 4-finger single tap on the top half of the phone.

VoiceOver: Camera.

Jonathan: I did actually move the audio app back to its usual place, so that you can hear probably what you’re most likely to hear depending on the apps that you chose to keep visible. But remember, you can move these apps around.

VoiceOver: Photos. Stickers. Audio. Location. More.

Jonathan: Now we’re going to double tap the more button. That’s where we’ll get to the check-in feature.

VoiceOver: More.

Jonathan: Once again, focus is getting stuck here. But this could be something that is fixed eventually. So I’m going to just tap around the screen with my finger.

VoiceOver: YouTube. Store. Hashtag images. Check-in.

Jonathan: There is check-in. It’s a little bit difficult to find at the moment. But I found the check-in option under more, and I’m going to double tap.

VoiceOver: Back button. Attached app. Check-in. Timer. Around 8:45 AM button.

Jonathan: It is showing me this because I’ve played with the check-in app before.

But if you’re seeing something different, that’s because the first time you use the check-in feature, Apple will take you through an onboarding process explaining how check-in works. The process tells you that it will automatically notify the person that you’re setting this up with when you reach your destination.

If you stop making progress while you’re on your way, Messages will check in with you first, to see what’s going on.

If there’s no response from you (I think it waits about 15 minutes or so), the check-in feature will start to send information to the person that you’ve set it up with, including your battery level, where precisely you are, and also your cellular status.

When you set this up, you do have some control on the amount of data that is shared. You can choose to share only limited data. And that will share location, network signal, and battery level. And if you opt for the full option, it will share all of that, plus the route that you’re traveling on, the location that the phone was last unlocked, and the location where the Apple Watch was last removed.

So now that we’ve added this app, we have this button here.

VoiceOver: Attached app. Check-in. Timer. Around 8:45 AM button.

Jonathan: You can hear that actions are available from that sound. Those actions are …

VoiceOver: React. Remove. Activate.

Jonathan: And that’s the default, of course, so we’ll activate.

VoiceOver: Cancel button.

Jonathan: And flick right.

VoiceOver: Check-in, heading. Done button. When I arrive button. 1 of 2. Selected. After a timer button. 2 of 2.

Jonathan: There are two scenarios, both equally important. We’ve highlighted the one that’s most common, I think, which is that you’re going to travel somewhere, and you have the ability to have someone monitor your journey, make sure that you’re okay, and take action if it turns out you don’t get to where you need to be.

The other one has all sorts of scenarios. That’s the one that’s currently selected, actually. And that is on a timer. What have we got now?

VoiceOver: Timer, heading. 1 hour. Picker item. Adjustable. 0 minutes. Picker item.

Jonathan: And no minutes, so exactly an hour.

If you set this, and then the phone checks in with you and you don’t respond, then it can alert the person that you’ve set this up with. This could have all sorts of benefits if, for example, you’re not feeling well. I can immediately think of something like you’ve had surgery, and maybe you’re on some medication, and you want to set a timer so that if the phone contacts you and you don’t respond, then it will contact this other person who can take any kind of action that might be necessary.

So the check-in feature does actually go beyond travel, and it’s a very useful one. It’s worth knowing how to use it. You never know when it might come in handy.

Manage Your Grocery List in the Reminders App

Reminders apps, to-do list apps, are a popular genre in the App Store. And there’s a real science to a lot of these task lists and project management type things.

Apple’s Reminders app, for a long time, was very basic in iOS. But in recent years, it’s become a lot more capable.

For several years, Bonnie and I have been using a third-party app called AnyList. It’s accessible, it’s powerful, and it also integrates with your Amazon personal assistant’s shopping list. So whether you’re using your phone or your soup drinker device, you can add things to your shopping list, and it’s a communal, collective shopping list. So when one of us notices that we’ve run out of a particular item that we need to purchase from the supermarket, we can add it to that list, and we both see the same list.

You can do the same kind of thing in Reminders, because you can share lists in iCloud. Bonnie and I have some shared lists as well that we do actually use in Reminders, relating to household tasks like making sure that the rubbish goes out on rubbish day, that kind of thing.

But it hasn’t really been as powerful for grocery lists as AnyList is, which is just a superb app. I highly recommend it. The developers are very responsive to feedback.

In iOS 17, AnyList has been Sherlocked a bit.

[laughs] This is the first time I get to use the word “Sherlocked” in this particular review or demonstration. It’s a common term that refers to when Apple comes along, notes what’s going on with one particular third-party app or a category of them, and kind of intervenes and puts something similar in the operating system. And that’s difficult for third-party developers because obviously, when a feature is just there in the operating system and it’s sort of good enough, it’s a disincentive for people to download third-party apps.

Let’s take a look at what’s happened with Reminders in iOS 17, because for the scenario that I’ve just outlined – maintaining a grocery list – it really is very powerful now.

I’m going to perform a 4-finger tap to go to the bottom of the screen.

VoiceOver: Add list button.

Jonathan: And we have an Add List button. I’ll double tap.

VoiceOver: Workbridge button.

Jonathan: The first question that I’m asked is where I want to create this list, because I’m actually syncing my tasks in Microsoft Outlook with the Reminders app. I do find that very helpful because I have a range of tasks that are work-related that I might want to deal with in Outlook, or I might want to deal with them on my phone, and I can do both.

This is definitely not a work-related issue, so I’ll flick right.

VoiceOver: iCloud button.

Jonathan: And I want to store this in iCloud. Those are the 2 choices that I have right now.

I’ll double tap.

VoiceOver: List name. Text field. Is editing. Insertion point at start.

Jonathan: I’m going to call this Shopping List, and I’ll type that in on my Mantis keyboard.

We’ll just confirm.

VoiceOver: Shopping list.

Jonathan: And we’ll flick right.

VoiceOver: Clear text button. List type: standard.

Jonathan: It’s perhaps a little less than exemplary in terms of accessibility, because VoiceOver is not announcing that this is a button or anything actionable. But it is actionable, and it’s critical that we get this right for this demonstration. It’s asking for the kind of list.

I’ll double tap.

VoiceOver: Standard button. Groceries button. Smart list button.

Jonathan: And those are the 3 options.

I’m going to flick left.

VoiceOver: Groceries button.

Jonathan: Back to groceries, and double tap.

VoiceOver: List type: groceries.

Jonathan: Now, if I flick right, we’ll see the implications of having chosen groceries.

VoiceOver: Organize shopping items automatically using sections.

Jonathan: And this really does work. I’ll show you this in just a moment.

VoiceOver: Colors: red.

Jonathan: Now, if I continue to scroll through the screen, we can choose the color for the list, and also the icon. I’m not bothered by those. I will accept the defaults.

So I’m going to go back to the top of the screen.

VoiceOver: Cancel button.

Jonathan: And flick right.

VoiceOver: New list, heading. Done button.

Jonathan: And double tap the Done button, which will create this list.

VoiceOver: Lists. Share list button.

Jonathan: Having created this list, I’m now on a screen pertaining to it. And the first option I have is to share this list. Now clearly, if I was going to be using this (and actually, we may; we’ll see how we go), I would share this with Bonnie so that we can both see the list and add to it.

I’ll flick right.

VoiceOver: More button.

Jonathan: If we double tap the More button, we can delete this list, we can sort it, and do a range of other tasks.

VoiceOver: Shopping list, heading. No items. Items added to this list are automatically categorized using sections. Toolbar. New item button.

Jonathan: Let’s double tap New item, and we’ll add a few.

VoiceOver: New item.

Jonathan: This is a bit unusual, and it makes me wonder whether there are a few accessibility things that need to be tidied up with this app. But I am actually in an edit field, and I can tell that because on my Braille display, the little edit cursor is bouncing up and down. [laughs]

So I’m going to type steak and press Enter.

VoiceOver: Categorized steak as meat.

Jonathan: Fair enough, too. Now I’m going to type oranges. I’m in an edit field again, so I’ll just type oranges and press Enter.

VoiceOver: Categorized oranges as produce.

Title. Text field. Is editing. Insertion point at start.

Jonathan: Let’s type something a bit more complicated, like dishwasher tablets.

VoiceOver: Categorized dishwasher tablets as household items.

Title. Text field. Is editing.

Jonathan: We can’t fool this thing. Alright, let’s type potatoes.

VoiceOver: Categorized potatoes as produce.

Title. Text field. Is editing. Insertion point at start.

Jonathan: All right, we would want to buy some kombucha. So shall I type kombucha and see if it knows that? Let’s do that.

VoiceOver: Categorized kombucha as coffee and tea.

Title. Text field. Is editing. Insertion point at start.

Jonathan: Yeah, okay. I mean, kombucha is fermented tea, so we’ll give that one to the artificial intelligence that’s making this work.

I can keep doing this. I can keep typing items in my list and pressing Enter because I have a Bluetooth keyboard. And then, we can go on to the next item.

But now, I’ll go to the top of the screen.

VoiceOver: Lists. Back button.

Jonathan: And flick right.

VoiceOver: Shopping list, heading. Share list button. More button. Done button.

Jonathan: And now flick right.

VoiceOver: Steak. Incomplete. In section, meat.

Jonathan: There are actions available. And also, something interesting has happened here. Focus has skipped.

If I flick left, …

VoiceOver: Meat. Expanded, heading.

Jonathan: The meat section is expanded. But when I flicked right, I didn’t hear that.

So let’s take a look at the options here.

VoiceOver: Delete. Rename. Drag item. Activate, default.

Jonathan: We can delete the whole section if we want to.

And if I double tap, …

VoiceOver: Meat. Collapsed, heading.

Jonathan: The meat section is collapsed. So it makes it easy to collapse whole sections of products, based on the aisle in the supermarket that they’re likely to be in.

I’ll double tap.

VoiceOver: Meat. Expanded, heading.

Jonathan: And meat is expanded again.

And if I flick right, …

VoiceOver: Steak. Incomplete. In section, meat.

Jonathan: If I shop online for my groceries, or I go to the supermarket and get some assistance to do that, then I can double tap the items and check them off as I shop.

I’ll flick down.

VoiceOver: Delete. Flag. Show details. Edit title. Enter detail.

Jonathan: It’s saying enter details. And that’s because even though I just sat here and typed steak, oranges, etc and pressed Enter after each one, you can move through and add additional notes. So if there’s a particular brand of product that you want, or a particular amount of steak that you want, you can add that in fields other than the title.

But my goal was to show you just how quickly you can assemble this list.

So let’s flick right.

VoiceOver: New item in meat button. Produce. Expanded, heading.

Jonathan: You’ll notice that now that we have a meat section, you can add a new item to the meat section. But you can also just continue to choose the general new item option, and the software will work out where things need to go.

So now we’re in the produce section, and we added a couple of things.

VoiceOver: Oranges. Incomplete. In section, produce. Potatoes. Incomplete. In section, produce.

Jonathan: And I’ll flick right.

VoiceOver: New item in produce button. Household items. Expanded, heading. Dishwasher tablets. Incomplete. New item in household items. Coffee and tea. Expanded. Kombucha. Incomplete. In section, coffee and tea.

Jonathan: The beauty and the logic of this is that you and other people in your household, if you have others who share this shopping list, can add items whenever they occur to you. They’ll be sorted into the aisles in the supermarket so that when you go and do your shop, all the items are grouped together without you having to do anything.

And as VoiceOver said, when we were navigating, each of these sections is its own heading.

And this is why I’m a big fan of setting up gestures to navigate by heading, rather than just keeping headings on the rotor. Because if you do this and you’re in your supermarket and you can navigate by heading using gestures, you can quickly get to the section of your list that corresponds to where you are in the supermarket. What did I want to buy from the meat section? Navigate by heading until you hear meat, and then all the items are grouped together.

This is a pretty cool new feature of the Reminders app in iOS 17.
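And for the curious, here’s a very rough sketch in Swift of the idea behind that automatic categorization. Apple hasn’t documented how it actually does this, and it’s clearly much smarter than a simple lookup table, so treat this only as a way of picturing the concept; the keywords and section names are invented for this example.

// Purely conceptual: Apple's categorization is machine-learned, not a lookup table.
let sectionKeywords: [String: [String]] = [
    "Meat": ["steak", "chicken", "mince"],
    "Produce": ["oranges", "potatoes", "apples"],
    "Household Items": ["dishwasher tablets", "paper towels"],
    "Coffee & Tea": ["kombucha", "coffee", "tea"]
]

func section(for item: String) -> String {
    let lowered = item.lowercased()
    for (name, keywords) in sectionKeywords {
        if keywords.contains(where: { lowered.contains($0) }) {
            return name
        }
    }
    return "Other"
}

print(section(for: "Dishwasher tablets"))   // prints "Household Items"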

Notes in Notes Can Now Link to Other Notes

As you know if you listen to the podcast regularly, I’ve become a fan of the Drafts app. I use it for most of my writing on iOS, and I do use Ulysses for some more advanced word processing.

But the Notes app in iOS 17, which is available to everybody who has iOS, has become increasingly capable over time.

And there’s one feature that I want to highlight that’s been added this year. To do that, we will open Notes.

Open Notes.

One thing they don’t seem to have added, or changed, or fixed is that you can’t press Command N when you’re using a Bluetooth keyboard to create a new note. So to do that, I’ll go to the bottom of the page.

VoiceOver: New note button.

Jonathan: and double tap the New Note button.

VoiceOver: New note. Note. Text field. Is editing. Character mode. Insertion point at start.

Jonathan: We heard by that sound that actions are available here. That’s important in terms of this demonstration. But we’ll come back to that.

Right now, I’m going to type Notes app demo and a new line. So Notes app demo effectively becomes the title of this document.

VoiceOver: New line.

Jonathan: And now, I’m going to type, “I am typing a quick note for this Living Blindfully demo.”

We’ll just read that back.

VoiceOver: “I am typing a quick note for this Living Blindfully demo.”

Jonathan: Now, I’m going to create another new note.

So I’ll back out of this, and we’ll go down.

VoiceOver: New note button.

Jonathan: I’ll double tap.

VoiceOver: New note. Note. Text field. Is editing. Character mode. Insertion point at start.

Jonathan: And I’ll type, Living Blindfully demo. This is my demo of iOS 17 for Living Blindfully.

VoiceOver: This is my demo of iOS 17 for Living Blindfully.

Jonathan: I’ll make a new paragraph here.

VoiceOver: New line.

Jonathan: And I will say, it includes a demo of the Notes app.

And now, I will select that line.

VoiceOver: It includes a demo of the Notes app. Selected.

Jonathan: What I’m going to do now is flick down on the touch screen.

VoiceOver: Add link.

Jonathan: And we have an add link option.

I’ll double tap.

VoiceOver: Cancel. Text field. Is editing. Enter a URL or note title. Insertion point at start.

Jonathan: You will recall that we called the first document we created Notes app demo. So I’m going to type the word Notes.

VoiceOver: Link suggestions available.

Jonathan: I’ll flick right.

VoiceOver: Link to Note, Notes app demo. [8:26] AM button.

Jonathan: And there we go. Notes app demo is there because I typed its partial name.

I’ll double tap.

VoiceOver: Notes app demo name, heading.

Jonathan: And now, what have we got?

VoiceOver: It includes a demo of the Notes app. Text field. Close button. Use note title as name switch button, off.

Jonathan: Now, we don’t want to use the note title as name. We want that link to stay as it is.

VoiceOver: Name, heading. Text field. Notes. Notes app demo. Link to, heading. Done button.

Jonathan: And we can double tap done at the top of the screen.

VoiceOver: Note. Living Blindfully demo. This is my demo of iOS 17 for Living Blindfully. It includes a demo of the Notes app.

Jonathan: I’m going to double tap.

VoiceOver: Note. Text field. Is editing. Living Blindfully demo. This is my demo of iOS 17 for Living Blindfully. It includes a demo of the Notes app. Insertion point at end.

Jonathan: You will have heard that click before it includes a demo of the Notes app. That’s because I have my iOS set up to represent hyperlinks with a click. If I perform the gesture that I’ve assigned to navigate by link which is a 4-finger flick down and up, …

VoiceOver: It includes a demo of the Notes app.

Jonathan: There’s the link.

And I can double tap.

VoiceOver: It includes notes. Back button. Share button. More button. The 19th of September. Note. Notes app demo.

Jonathan: And there we go. So if you were creating the master note, which is the Living Blindfully demo for iOS 17, you could create a note for each of the sections that we’re covering in this demo. And then, create hyperlinks to all those notes in your main note.

It really is a very powerful new feature, and it effectively turns Notes into a wiki system. So that’s Notes in iOS 17.

Track Your Mental Health in the Health App

With every release of iOS, and for that matter, watchOS, the Health app seems to become more capable. One major improvement in this year’s version of the Health app pertains to mental health. Let’s have a look at what’s possible.

I’m in the Health app now, and we need to go to the Browse tab.

VoiceOver: Browse tab. 3 of 3.

Jonathan: I’ll double tap.

There is an increasing number of categories in this list, and I won’t scroll through them all.

But the one that we’re interested in is this.

VoiceOver: Mental well-being button.

Jonathan: I’ll double tap.

VoiceOver: Today, heading. Exercise minutes. [6:19] AM. 1 minute. Audiograph available.

The x-axis is date. The y-axis is value. There is 1 data series.

Jonathan: Obviously, exercise has a significant impact on your mental health.

I’m recording this quite early in the morning, before my day gets busy, and I haven’t done my exercising today.

VoiceOver: Sleep: [2:52] AM.

Jonathan: Yes, it was an early start for me this morning because I knew I had a lot to record.

Then, we get onto this heading.

VoiceOver: Past 7 days, heading.

Jonathan: This will show your time in daylight which is known to have a positive effect on mental health, as well as your mindful minutes over the last week.

VoiceOver: Past 30 days, heading.

Jonathan: If you’re taking the time to log your mood in the health app (and we’ll talk more about this in a minute), you’ll be able to see here your state of mind. And also, your risk for depression and anxiety.

VoiceOver: Get more from health, heading. Mental health questionnaire.

Jonathan: This is a standard mental health questionnaire of the sort often administered by your GP or a mental health professional. And if you flick right, you’ll get an explanation of this.

VoiceOver: Along with regular reflection, assessing your current risk for common conditions can be an important part of caring for your mental health.

Jonathan: And if I flick right, …

VoiceOver: Take questionnaire button.

Jonathan: Let’s just have a quick look at the opening of this questionnaire.

VoiceOver: Mental health questionnaire, heading. This assessment uses standardized questions to give you a sense of your risk for two very common and treatable conditions – anxiety and depression. Begin button.

Jonathan: I’ll double tap again.

VoiceOver: Over the last 2 weeks, how often have you been bothered by the following problems?

Question 1 of 16.

Feeling nervous, anxious, or on edge.

Not at all.

Several days.

More than half the days.

Nearly every day.

Question 2 of 16.

Jonathan: So you would double tap the response that applies to you.

VoiceOver: Not being able to stop or control worrying.

Not at all.

Several days.

More than half the days.

Nearly every day.

Question 3 of 16.

Worrying too much about different things.

Jonathan: I won’t go through all 16 questions. If you’d like to review it on your own, you can do that, and complete the questionnaire as often as you feel you need to. And the Health app will come up with some determinations about your likelihood of conditions such as anxiety or depression.
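Apple doesn’t spell out in the app exactly how it turns your answers into a risk rating. But the response scale you just heard matches standardized instruments like the GAD-7 anxiety questionnaire, which conventionally score each answer from 0 to 3 and compare the total against published thresholds. Here’s a minimal Swift sketch of that conventional scoring, purely to illustrate the idea; it is not Apple’s implementation.

// Conventional GAD-7-style scoring – an illustration, not Apple's implementation.
enum Frequency: Int {
    case notAtAll = 0, severalDays, moreThanHalfTheDays, nearlyEveryDay
}

func anxietyRiskBand(for answers: [Frequency]) -> String {
    let total = answers.reduce(0) { $0 + $1.rawValue }
    switch total {
    case 0...4:   return "minimal"
    case 5...9:   return "mild"
    case 10...14: return "moderate"
    default:      return "severe"
    }
}

// Seven answers, one per GAD-7 question:
let band = anxietyRiskBand(for: [.severalDays, .notAtAll, .severalDays,
                                 .notAtAll, .notAtAll, .severalDays, .notAtAll])
print(band)   // prints "minimal" (total score of 3)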

I’ll go back by performing a 2-finger scrub.

VoiceOver: Mental health questionnaire, heading.

Jonathan: Let’s do one more.

VoiceOver: Browse. Back button.

Jonathan: And let’s just navigate by heading.

VoiceOver: Mental well-being, heading. Today, heading. Past 7 days. Past 30 days. Get more from health, heading. About mental well-being, heading.

Jonathan: Let’s listen to this explanation. I’ll perform a read all.

VoiceOver: About mental well-being, heading. Learning about mental health. Understand what contributes to your mental health and why it matters. Common concerns about mental health. Learn about common mental health conditions and what to pay attention to. Mental health questionnaires. How they’re an important part of caring for your mental well-being.

Jonathan: One of the things I really like about the Health app is that, increasingly, there’s a lot of authoritative material to read.

Sometimes, when you consult Dr. Google, they try and prioritize good results. But sometimes, dodgy things do slip through.

You can read the articles in the Health app and have a higher degree of confidence that you’re reading authoritative sources. This applies to all sorts of things. There are good articles about sleep, and about all sorts of different data points that you might want to collect.

Whether it’s on the iPhone or the Apple Watch, you can log your mood. And you can be prompted to log your mood, if you like. To do this, we need to be on the mental well-being screen and locate this item.

VoiceOver: State of mind.

Jonathan: If you’ve not done this before, you’ll double tap, and you’ll be taken through some explanations about logging your state of mind, and also given the option to receive notifications to remind you to do it.

I’m going to double tap.

VoiceOver: Today, the 19th of September.

Jonathan: I’m going to go to the top of the screen.

VoiceOver: Mental well-being. Back button. State of mind, heading. Calendar button. Learning about mental health. Understand what contributes to your mental health and why it matters. Common concerns about mental health. Learn about common mental health conditions and what to pay attention to. Mental health questionnaires. How they’re an important part of caring for your mental wellbeing.

Jonathan: This is another place to get to some of these other items.

VoiceOver: The difference between emotion and mood. Learn about how emotions and moods can affect you. Caring for your mental health. Steps you can take to support your mental well-being. Options, heading. Selected. Add to favourites button.

Jonathan: The favourites part of the Health app is a really useful tool. I have the Apple Watch and a smart scale from Withings, and we have the Qardio Arm blood pressure monitor that we’ve talked about on the show before. You can actually get a really good picture of your health with all these smart things monitoring it, and you can put the appropriate data points in your favourites so that when you open the Health app and choose favourites, you can scroll through and see how well you are running at the moment.

Now, I’ll flick right.

VoiceOver: State of mind will appear as a favourite in summary. Options button.

Jonathan: Let’s take a look at the options for this.

VoiceOver: State of mind, heading.

Jonathan: This is governing whether you will receive notifications to invite you to log your state of mind.

VoiceOver: During the day switch button, on. End of day switch button, on.

Jonathan: You want to be careful about using these because they can have a bad effect on your state of mind. [laughs] If you’re prompted all the time to log your mood, and you don’t really want to, and you find it distracting, it may have the opposite effect from that which you intend.

VoiceOver: Add reminder button.

Jonathan: You can add a custom reminder.

VoiceOver: You can receive a reminder to log your state of mind around the middle of your day, at the end of your day, or at a specified time. Manage mindfulness in the Apple Watch app button. You can add reminders for other mindful activities in the settings app on your Apple Watch.

Jonathan: I’ll go back.

And let’s have a look at how logging your mood works. To do that, we’ve got to double tap the button called …

VoiceOver: Log button.

Jonathan: I’ll do that.

VoiceOver: Cancel button.

Jonathan: And flick right.

VoiceOver: State of mind logging. Image. An illustration of a group of coloured flowers.

Jonathan: Aww!

VoiceOver: Log an emotion or mood.

Jonathan: The first thing you have to determine is whether you’re logging an emotion or a mood. What’s the difference? Apple helpfully tells us.

VoiceOver: Selected. Emotion. How you feel right now button. Mood. How you’ve felt overall today button. Next button.

Jonathan: We’re going to log an emotion.

VoiceOver: Back button.

Jonathan: And I’ll flick right.

VoiceOver: Emotion, heading. Cancel button. Choose how you’re feeling right now. Neutral. Neutral. Adjustable.

Jonathan: Now we’ve got a slider.

So if I go down, …

VoiceOver: Slightly unpleasant.

Jonathan: So what’s happened here is that the slider is in the centre. In other words, I’m not feeling super great. I’m not feeling super bad. I’m just kind of, yeah, there. It’s neutral.

We’ll keep going down.

VoiceOver: Unpleasant.

Very unpleasant.

Jonathan: And that’s as far as we can go down.

If I slide the slider up, …

VoiceOver: Very unpleasant.

Jonathan: We’re going back up.

VoiceOver: Unpleasant.

Slightly unpleasant.

Jonathan: Sometimes it errors out and doesn’t move up, but I just keep going.

VoiceOver: Neutral.

Jonathan: And we’re on neutral.

VoiceOver: Slightly pleasant.

Slightly pleasant.

Pleasant.

Very pleasant.

Jonathan: And mate, it doesn’t get any better than very pleasant.

I’ll flick right.

VoiceOver: Next button.

Jonathan: And we’ll double tap next.

VoiceOver: Very pleasant image. An illustration of a gold circle against a brown background. An illustration of a flower on a brown background. Very pleasant. What best describes this feeling? Info button.

Amazed button.

Excited button.

Surprised button.

Passionate button.

Happy button.

Joyful button.

Brave button.

Proud button.

Confident button.

Hopeful button.

Amused button.

Satisfied button.

Relieved button.

Grateful button.

Content button.

Calm button.

Peaceful button.

Show more button.

Jonathan: Well, we could go on. I think I would just do calm.

VoiceOver: Calm button. Selected. Calm.

Jonathan: You can select multiple items, so you can select as many as apply to your emotion at the moment.

VoiceOver: Next button. Very pleasant. Image. A brown surface with a light shining on it. An illustration of a flower on a brown background. Very pleasant. Calm.

What’s having the best impact on you? Info button.

Health button.

Fitness button.

Self care button.

Hobbies button.

Identity button.

Spirituality button.

Family button.

Friends button.

Partner button.

Dating button.

Tasks button.

Work button.

Education button.

Travel button.

Weather button.

Current events button.

Money button.

Done button.

Jonathan: Well, I think what I’ll do is I’ll go back and do education, since we’re doing this tutorial type thing.

VoiceOver: Education button. Selected.

Jonathan: And, …

VoiceOver: Current events.

Jonathan: Current events because it’s iOS, it’s new, it’s current.

VoiceOver: Selected. Current events. Done button.

Jonathan: And then, we’ll double tap done.

And that’s all there is to logging your state of mind.

I suspect that there will be, optionally, some quite close integration between your state of mind logging and other info in here, and the new Journal app that Apple is promising.

But it has not yet been released. It’s not part of the initial release of iOS 17. Apple’s Journal app is coming later.

I’m very much looking forward to seeing how good that is.

I’m a fan of the idea that if you can’t measure something, you can’t improve it. So taking the time to log your mood and your emotional state using this feature is a great idea. And over time, you can perhaps gain a better, more objective picture of the things that might be triggering for you, the things that you might want to avoid.

Before we leave the Health app, just a brief mention of an extension to a feature that was launched last year, which we covered extensively: logging medication.

If you now define a medication as critical and you haven’t logged it as taken at the time you’re expected to, you can get a reminder half an hour later to tell you that, as far as the phone is aware, you’ve not taken your medication, and that it’s important to do so.

[music]

Advertisement: Transcripts of Living Blindfully are brought to you by Pneuma Solutions, a global leader in accessible cloud technologies. On the web at PneumaSolutions.com. That’s P-N-E-U-M-A solutions dot com.

Live Voicemail and FaceTime Voicemail

Now, I’m going to summarize a range of other things that I won’t demonstrate for a variety of reasons, but are still note-worthy.

One is a feature I’d love to demonstrate, but we have no carrier in New Zealand that offers visual voicemail, so it’s not available to me, sadly. It’s called Live Voicemail.

If you have visual voicemail (and most people do), it is on by default. But you can disable it in the phone app.

When someone’s leaving you a voicemail, you’ll be able to read a real-time transcription of what’s being said. This is analogous to the old days of answering machines where, even if you were home, you could listen on the speaker to someone leaving a message to see if you really wanted to pick up or not. [laughs] If it is someone you want to talk to, you can pick up right away, interrupt the voicemail, and talk to the person concerned.

I can’t test this so I don’t know if it’s accessible, but I haven’t heard that it’s not. And I suspect I would have heard if there were accessibility issues.

So that’s live voicemail in iOS 17.

Speaking of voicemail, now, if you call someone via FaceTime or FaceTime audio and they don’t pick up, then eventually, the call will time out and you’ll be given an option to leave a video message for them.

Improved Predictive Text and Dictation

Apple has announced some changes relating to the way you get data into the phone if you use dictation, or the standard virtual keyboard on the device.

I use neither very often. My primary method for getting data into the phone is Braille Screen Input (when it’s behaving properly, or even when it’s not, I still try and persist with Braille Screen Input). [laughs] Or if I’ve got my Mantis connected, which is a Braille display with a QWERTY keyboard, then I will enter data that way.

I will very occasionally use dictation if I’m in a hurry and I’m on the move.

And celebrate! Holy duck! Apple has improved autocorrect in iOS 17. Yes, it will be a bit less aggressive about filtering certain terms that you may choose to use. Apple is trying to step back from being the moral police.

And it also has some new technology that learns over time how you personally like to write – your writing style. So the more you use iOS 17, the more useful the autocorrect features are likely to be.

And it’s not only correcting constant misspellings. It’s also making a much better effort to guess what it is you’re typing. When you hear the word that you want, you can now just press the spacebar, and the rest of the word will be completed.

And although Apple has been careful not to jump on the AI hype that’s been around lately, there is a lot of AI around in Apple. And this is AI at work here, because as it learns more about the way you like to write, you’ll find that it will start to complete sentences. So if it’s offering to complete a sentence and it’s got it right, just press the spacebar, and the rest of the sentence will be filled in.

It also says that it’s going to try and correct your grammar and commonly confused words, such as T-H-E-R-E versus T-H-E-I-R, and affect versus effect. It will correct those now.

You may well be saying, “I don’t need no grammar checker.” and I respect that. But for those people who do, it’s now built into iOS 17.

So if you like to use the virtual keyboard, do check in with keyboard settings to make sure that everything is set up the way that you like, and give it a try. The more you use it, apparently, the more intuitive it’s going to be about the way you like to write.

Let us know how you get on.

Apple also says that the same AI that’s helping it to understand how you write will also, over time, further improve how accurately dictation works. That’d be wonderful, because I still see a lot of people dictating and making quite egregious errors that they choose not to correct. So any help in that regard will be marvelous.

Improved Sharing With People Close By

Apple has made several improvements to sharing content with people who are close by, and even continuing to share when they’re not. At least, that’s coming later in the year.

Let’s first talk about NameDrop. This reminds me a bit of an app called Bump that I used way back when I first got an iPhone.

NameDrop is a new AirDrop feature for exchanging your contact information. It works on iPhone, and it’ll also work with the Apple Watch if it’s running watchOS 10. You can choose the specific phone numbers and email addresses that you want to exchange.

So you might have a master contact card, for example, with everything in it – your home address, your home phone number, your work information. If you’re exchanging contact information with a business associate, you don’t want them to have the home details. You want them to have the work details.

And possibly vice versa, if you’re exchanging information with somebody you know in a social setting. You want to have that kind of granular control, and you do with NameDrop.

If you’ve set up a contact poster, which is another feature in iOS 17 where you can actually dictate how your contact is represented when it’s sent to other people or even when you call someone, then that poster can also be transmitted this way. I think this is something many people have been doing in more complex ways – sending their contact along to others.

It’s more accessible for many of us than business cards, most of which aren’t in Braille and which, of course, consume trees as well.

So this is good news, and it’s an accessible way to get contact information onto your device, and onto someone else’s.
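NameDrop itself isn’t something you drive from code, of course. But conceptually, choosing which numbers and addresses to exchange is like handing over a trimmed-down copy of your contact card. Here’s a small Swift sketch using the Contacts framework just to make that idea concrete; the names and numbers are made up.

import Contacts

// A trimmed-down, work-only copy of a contact card. Illustrative values only.
let workOnlyCard = CNMutableContact()
workOnlyCard.givenName = "Jonathan"
workOnlyCard.familyName = "Example"
workOnlyCard.phoneNumbers = [
    CNLabeledValue(label: CNLabelWork,
                   value: CNPhoneNumber(stringValue: "+64 4 555 0100"))
]
workOnlyCard.emailAddresses = [
    CNLabeledValue(label: CNLabelWork,
                   value: "jonathan@example.com" as NSString)
]
// The home address and personal number are simply left off this copy,
// which is the kind of granular control NameDrop gives you interactively.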

There’s also a new feature called proximity sharing. It’s kind of like Name Drop, but for files.

The way this works is that if you hold your iPhone close to another one, you can set up an AirDrop file transfer, which will let you share photos and other documents with somebody. It will work with videos as well, and those could take a long time to transfer.

Now, that’s where the internet transfer comes in. Later this year (it’s not in the initial release), you will be able to set up an AirDrop transfer using the proximity feature, and finish it over iCloud if you need to move away from the person.

So you can imagine if you’re sending a large file, like a video or even an audio file, and you just want to start this thing off and walk away. Then you’re able to do that, and iCloud will take care of the rest. For this to work, both participants are going to have to be signed in to iCloud.

If you want to set this up, we go to our old friend, the Settings screen. And then, General. And under General, you’ll find …

VoiceOver: AirDrop button.

Jonathan: We’ll double tap.

VoiceOver: Selected. Receiving, off. Selected. Contacts only. Everyone for 10 minutes. AirDrop lets you share instantly with people nearby. You can be discoverable in AirDrop to receive from everyone, or only people in your contacts. Start sharing by, heading. Bringing devices together switch button, on.

Jonathan: If for some reason, you don’t want this feature to work for security reasons or whatever, (and I can understand that some IT administrators may be nervous about this), you can disable it in here.

VoiceOver: Easily swap numbers with NameDrop. Share photos and more by holding the top of your iPhone close to another iPhone.

Jonathan: So it’s on by default, it’s good to go, and I can exchange contact information using NameDrop.

SharePlay also benefits from this feature.

If you haven’t used SharePlay yet, it is a very cool feature that, in my view, came about a year too late. If Apple had had SharePlay in the operating system during the height of the lockdowns, that would have been huge. But it’s still a good feature.

Bonnie, for example, is heading to the United States in November, and we are big fans of For All Mankind. Using SharePlay, we’ll be able to watch it together on Apple TV+.

If you are in the same place and you want to SharePlay something, you can now do that using this method of bringing the two iPhones together.

This can be useful for a couple of reasons. One is, if you use hearing aids, or you have difficulty hearing, then being able to play something through SharePlay while somebody else is listening through their device is handy. It could also be handy if you want audio description, but the person that you are sharing this with does not, because that setting is specific to the device.

So I can watch For All Mankind with audio description on. If I were watching it with a sighted person who doesn’t want the audio description, they can watch it on their device with it off. And we can initiate that process simply by bringing the phones together and choosing SharePlay.

Sharing AirTags

And speaking of travel, you may use AirTags in your suitcase, and I highly recommend that you do, because some of the stories that we’re getting from people who have put AirTags in their luggage are extraordinary. The luggage has become lost, the airline has flat out lied about where that luggage is and what’s happened to it, and some people have even taken matters into their own hands and gone and found their luggage themselves because, thanks to AirTags, they know precisely where it is.

And when Bonnie and I travel together, one of the downsides of AirTags is that we haven’t been able to share them. So I’ve had the AirTags on my account for my suitcase. She’s had them on hers if she’s traveling with a separate suitcase. And I mean, it kind of works. But it would be nice if we could share AirTags between us.

Now, you can. You can have an AirTag shared with up to 5 people. So that’s similar to the limits on family sharing.

So now, if you’re traveling with a communal suitcase, you’ll both be able to track that luggage.

That’s all done, of course, in the Find My app, which allows you to find people, items, and your devices.

Standby Mode

Let’s talk briefly about standby mode, which kicks in when your phone is locked, on charge, and positioned in landscape orientation. A sighted person can see a range of information, and they can specify precisely what is displayed, including notifications and some widgets.

I’ve chosen to make sure this is completely disabled, because I can only imagine that it’s going to drain some battery without much benefit to me.

So if you want to do that, you can go into display settings and ensure that the always-on display is disabled. And then, just to be sure, you can also go into standby mode (there’s a standby option under the main Settings screen) and check that it’s disabled there.

Apple Music Changes

Many of us enjoy listening to music, so it’s notable that there are some changes to the Apple Music app this time.

You will eventually, finally, be able to collaborate with others on playlists. This hasn’t made the first cut in iOS 17, but it is coming.

Also notable is that Apple Music will now crossfade. You can go into the settings for Apple Music and set this up.

For me, this is a disappointment, having been used to crossfading all the way back in the 1990s with plugins like the SQR plugin for Winamp. And obviously, as a broadcaster, I’m using Station Playlist Studio. And prior to Apple Music, there was a very good app in the App Store that did crossfading very well. I wish I could remember what that app was called, but I think it’s gone now because it doesn’t have Apple Music integration.

Now, all of those things that I mentioned have what I call intelligent crossfading. By that, I mean that the software is listening, if you will, to the sound of the previous file, making a judgment as to when it reaches a decibel threshold, then starting the next track. Not fading in the next track, but just starting it because of the decibel level that the previous track has reached.

What we have in Apple Music now, in iOS 17, is not that. It’s a very crude instrument. Basically, you can set the number of seconds over which one track fades out and the next fades in. I don’t like the effect very much.

It is a shame that with all of Apple’s software resources, they haven’t given us true intelligent crossfading that is making its judgments based on the specific sonic qualities of the outgoing and incoming track. But it’s there, and some people will appreciate the fact that there are now no gaps between tracks if you set this up.
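For anyone curious about how the intelligent approach differs in code terms, here’s a rough Swift sketch of level-triggered segueing using AVAudioPlayer’s metering. It’s only a sketch of the concept I described; it is not what Apple Music does, and it’s not production-ready.

import AVFoundation

// Meter the outgoing track, and start the next one the moment the level
// decays below a threshold, rather than fading over a fixed number of seconds.
final class SegueEngine {
    private var timer: Timer?

    func playWithSegue(outgoing: AVAudioPlayer,
                       incoming: AVAudioPlayer,
                       thresholdDb: Float = -30) {
        outgoing.isMeteringEnabled = true
        outgoing.play()

        timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { [weak self] t in
            outgoing.updateMeters()
            // averagePower is in dBFS, where 0 is full scale.
            if outgoing.averagePower(forChannel: 0) < thresholdDb {
                incoming.play()   // start the next track at full volume
                t.invalidate()
                self?.timer = nil
            }
        }
    }
}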

It’s difficult for me to demonstrate this without being pinged for royalty infringement, but I’m sure you’ll be able to have a play with this.

Some Safari Improvements

As usual, there have been some improvements to Safari this time round, and I want to highlight a couple.

One is the creation of profiles. This is a big step up from tab groups, which still exist.

The idea is that increasingly, particularly with eSIMs where it’s possible to have your work line and your personal line on the same device, you might want to have completely separate profiles for various things.

You can now set this up in Safari. You do this by going to Safari settings and creating the profile. When you do this, the profile becomes available in Safari itself. You can specify your separate set of favorites for each profile, and a range of other settings as well.

So just as you can have a separate focus for work and make some apps disappear during work hours and stop some push notifications during work hours, now you can have a separate profile for work, or any other purpose that you can think of.

There’s also a nice little addition to the page menu, which includes the Safari reader. The reader’s a long-standing feature in Safari. I use this a lot because some pages can get very busy.

If you go, for example, to a news website and there’s an article on that page, you often find that it’s broken up by all sorts of verbiage. A sighted person can just glance past it, but it’s more time-consuming and distracting for a blind person to skip past it. The reader can often eliminate a lot of that excess verbiage.

One new feature that’s been added in iOS 17 is the ability for Siri to read the webpage. Now that you can vary the speed of Siri, that could be a feature that you’ll really enjoy. You may want a more responsive voice, a more traditional screen reader voice running when you’re using your phone most of the time. But if you’re sitting back and having an article read to you, then Siri can now do that from within Safari.

And the final thing I’d highlight about Safari that’s interesting to me is that you can now specify a different search engine for when you’re browsing privately. The use case for this one is fairly obvious, I think, but I will outline it. You may prefer to use Google, or some other search engine that perhaps is not as privacy-focused as others.

One of the most famous privacy-focused search engines is DuckDuckGo. I’ve personally tried to get used to using DuckDuckGo because I like what they stand for. But I just find that Google gives me better results. So I find myself gravitating back to Google over time.

However, if you want a truly private browsing experience and you’re wondering whether Google’s collecting anything, you can now set a separate search engine for when you have a private tab open, and that is settable from within Safari settings.

Closing and Contact Info

And there you have it – some of the new things that are in iOS 17. I hope you found this helpful.

As you play with iOS, I’d be very interested to know what you think. Do be in touch and let me know your findings.

In the meantime, it’s time for me to go because this one is a long one.

Thanks for being here for episode 250.

Remember that when you’re out there with your guide dog, you’ve harnessed success. And with your cane, you’re able.

[music]

Voiceover: If you’ve enjoyed this episode of Living Blindfully, please tell your friends and give us a 5 star review. That helps a lot.

If you’d like to submit a comment for possible inclusion in future episodes, be in touch via email. Write it down, or send an audio attachment, to opinion@livingblindfully.com. Or phone us. The number in the United States is 864-60-Mosen. That’s 864-606-6736.

[music]