
How Does the Apple Watch Series 3 with Cellular Work?

watchOS 4 ceramic series3 control center connected to device

The third generation of Apple Watch has just been released, and with it comes the ability to use the watch on a cellular network such as Verizon, Sprint, AT&T, or T-Mobile. But how exactly does this device communicate with Apple to get content to your watch?

Experimentation

The first thing I did when I turned on my new watch was, obviously, to pair it with my phone, but I then tried to use the cellular features within the same room as my phone. I turned off my phone’s radios and tried several different ways to get the watch to work, and all it would say was, “Disconnected.” I was discouraged and a bit disappointed, because I could not get the watch to work with LTE out of the box. Yesterday I called Verizon, and we went through the setup process again. This time we saw more options that had to be set during setup that were not there the first time, which was promising, but I still could not figure out how to use LTE after setup was complete.

The Solution

When I woke up this morning, I gave the problem some more thought, and I realized that the watch still relies on the iPhone to work. This means that both devices must be on the Internet, and that both devices have to be in communication for the watch to work properly. I left the apartment and went for a walk, and sure enough, I was able to see a green cellular icon in control center with a signal strength icon at the top. I then tried to use Apple Music, but of course, that was promised for next month, so that did not work at this time.

Conclusion

While I was disappointed with the watch at the time of unboxing, I think the watch is growing on me now that I have had time to play with it. I think they could do much more with it like making it completely independent from the iPhone, but I think they have come a long way from the Series 0 watch that everyone loved.

What’s New in VoiceOver in iOS 11

iOS 11

On Tuesday, September 19th, Apple released iOS 11, bringing with it some new VoiceOver features and improvements.

Apps

You can now drag and drop apps using VoiceOver. To do this:

  1. While on the home screen, double tap and hold to enter edit mode.
  2. Find an app you wish to move.
  3. Set the VoiceOver rotor to “Actions” if it does not happen automatically, then flick up or down to “Drag app name.”
  4. Navigate to where you wish to drop the app and choose an option. You can drop an app before the app the VoiceOver cursor is focused on, after it, or create a folder containing the focused app and the one you’re dragging. If you wish to drag more than one app, choose the final option, “Add to Drag Session.” You can use this same method to drag files from one app to another, minus the double tap and hold.

Verbosity

VoiceOver includes several new verbosity settings you can change. They are located in Settings > General > Accessibility > VoiceOver > Verbosity.

These options include:

  • Speak hints: on by default; double tapping this setting turns hints off.
  • Punctuation: after double tapping this option, you can set it to all, some, or none.
  • Speak detected text: determines whether automatically detected text in the focused item is spoken. For example, if you are on an app with an unlabeled button, VoiceOver will announce something like, “Button. Possible text: View menu.”
  • Capital letters: changes what VoiceOver does when encountering a capital letter. You can choose from speak cap, play sound, change pitch, and do nothing.
  • Deleting text: you can choose from speak, play sound, change pitch, and do nothing.
  • Embedded links: you can choose from speak, play sound, change pitch, and do nothing.
  • A Table Output heading groups options related to the reading of tables: you can toggle the reading of table headers and of row and column numbers.
  • As in iOS 10, you can turn the emoji suffix on or off, depending on whether you want VoiceOver to speak the word “Emoji” when one is encountered.

Mail

You no longer need to three-finger tap on a message to hear the preview.

When reading a message, you have VoiceOver actions to reply, archive, flag, mark as read/unread, and activate the message.

If you use threaded messaging, you have a rotor option, “Expand/Collapse Thread”. When expanded, you can manage all of the messages inside a thread on an individual basis.

Smart Actions Rotor

VoiceOver in iOS 11 has a new feature that keeps the last used actions rotor item selected. This is useful for deleting large numbers of messages. This feature also appears in the App Switcher.

What have we missed?

Know something that is not on this list? Let us know by emailing feedback@iaccessibility.net or tweeting @iaccessibility1.

Maybe it’s not Apple with the problem: Maybe it’s us

iOS 11

A few days ago, I discovered an article written by someone in the blind community criticizing a new feature that has been implemented in iOS 11.

Background:

In iOS 11, Apple has implemented a new feature dealing with the way VoiceOver manages the Actions Rotor in the Mail App. If you are familiar with the way the App switcher worked in iOS 10, you’ll recognize this behavior. Now, if you delete one message or perform any other action on it, that action will stay selected until you manually change it. This is a big change in behavior from before, when the action would return to the default setting after performing an action on a message.

What we’re seeing now

The blind community has become very divided and upset over this feature. Claims have been made that this is inconsistent with typical rotor behavior, that it is half-baked, that it’s a step backward in accessibility, and that it sets a dangerous standard for Apple to model.

The reality is this: this new feature is a change. It is a deviation from the typical behavior that we have seen in the rotor for years. Unfortunately, it seems that members of the blind community find it difficult to deal with these changes and retrain their brains. This is a big part of being successful in the world: we must have the capability to adapt and problem solve. Things will not always remain the same. We cannot expect Apple to hold back on a potentially game-changing feature just so that its user base does not have to deal with a new environment. The feature is a huge productivity boost for those of us who receive hundreds of messages a day from mailing lists and delete a bunch of emails at once. It’s faster than selecting the edit button and then selecting the messages to delete. The feature is buggy, yes, but it will continue to improve, that is, if we don’t stop it in its tracks with our constant complaints.

Yes, people struggle with the rotor. Yes, older people may have difficulty, but if we are going to train someone in using technology, it’s less about training them in how to do something than it is training them in how to solve problems that come up. No one can plan for every single situation that arises. No one can teach for every single possible quirk, crash, or inconsistency in an operating system. Apple’s operating system has never been aimed at the older age group specifically, so the comments I have seen about this feature being detrimental to older blind folks are ridiculous. We, as blind people, have a hard enough time getting mainstream developers to listen to us and make their products accessible without complaining that advances they’re trying to make in our productivity are bad things. Let’s learn to adapt and remember that sighted people have to deal with these changes, too. With each new OS release come new features, new ways of doing things, and, in some cases, new bugs, for all people, not just the disabled. It’s not just us who have to learn. If we insist that things remain the same, how can we expect companies to develop new and innovative technology for us? How can we expect anything to get any better without a little brain retraining every now and again?

Known iOS 11 Bugs

iOS 11

Each year, Apple releases a new version of iOS with new features that change the operating system in ways that can cause bugs for users. While we all wish this did not happen, it is inevitable, and it is good to know what works and what does not. In this article, you will find the bugs we currently know of in iOS 11.

  • iPad multitasking with Split View apps is currently broken with VoiceOver. A workaround can be found at https://iaccessibility.net/guide-getting-around-ipad-multitasking-bug-ios11/
  • Zoom does not always rotate between portrait and landscape properly. Sometimes, when the device is rotated to landscape and back, the screen returns to portrait but the zoom window stays in a landscape state.
  • A VoiceOver bug causes drag and drop of icons to not work correctly in the Dock and on the Home screen. This can hang the entire device for an unpredictable amount of time, though it does not always occur. The only known fix is to reset your device’s settings.
  • A VoiceOver bug causes Messages on iOS to say, “null” when a conversation is deleted. This does not display visually but does show with VoiceOver. @mcourcel on Twitter reports that closing Messages after deleting a conversation fixes the null conversation.
  • A bug causes iOS to relaunch SpringBoard when a user activates the App Switcher, or opens an app from it, while VoiceOver is running. This does not happen when VoiceOver is off, and it does not always happen when switching apps.

What bugs have you found? Let us know and we will post them here. To send us your bugs, email us at feedback@iaccessibility.net or tweet us at @iaccessibility1 on Twitter.

Guide: Getting Around iPad Multitasking bug in #iOS11

iPad Pro Front and Back

iOS 11 has changed the way users work with apps by introducing an improved Dock and a new way to work with two apps at once, which lets users stretch and shrink apps to the size they want. You can also have an app float on top of another app. Unfortunately, these features do not work properly with VoiceOver at this time. Our hope is that Apple will fix this bug in iOS 11.1, but until then, there is a workaround that still lets you use two apps at once with VoiceOver. Here is how it works.

  1. Find the status bar and swipe up with three fingers to reveal Control Center.
  2. With one finger, find the app that you are currently using.
  3. With another finger, find the app you would like to use alongside the first app. This second app will be in the Dock.
  4. Double tap and hold on the new app, then drag it toward your other finger.
  5. Keep holding your finger on the new app until VoiceOver opens the first app and plays a sound.
  6. You can then move the new app to where you want it, or move it left or right to place it on the left or right of the main app. At this point you may lift your finger.

During this process, VoiceOver will not read the status of the drag and drop, so you will have to follow this guide closely for the feature to work. Hopefully Apple will fix this in a future update.

iPhone X Accessibility: What We Know So Far

Picture of iPhone X with gradient wallpaper

The iPhone X is Apple’s brand-new iPhone, with an edge-to-edge 5.8-inch screen and no bezel. Perhaps the biggest change is that Apple has removed the beloved Home button, so how can this phone be accessible? Here is what we know so far. This article will be updated as information becomes available.

  • Triple-click Home, the customizable accessibility shortcut used to activate functions such as VoiceOver and Zoom, has become a triple press of the side button on the right side of the phone.
  • To open Notification Center, find the left side of the status bar, then flick down with three fingers.
  • To open Control Center, find the right side of the status bar, then flick down with three fingers.
  • Face ID supports VoiceOver: you can turn off the requirement that your eyes be focused on the screen in accessibility settings.

Again, we will continue to update this article as new information becomes available. Let us know on Twitter or in the comments if you know of something, and we will update the article accordingly.

#iOS11 is not as social as iOS 10

iOS 11

Background

In the past, iOS contained built in social media sharing features which included Twitter and Facebook. These features made it easy for apps to connect to social platforms in order to share content with features built right in to iOS. In iOS 11, the latest version of Apple’s mobile operating system, these features have been removed. As a result, users will now be responsible for installing and using social network apps from the App Store that have share extensions and account management.

While, on the surface, this seems like a big deal, it has really been coming for some time. For a while now, Facebook has been authenticating third-party app logins through its own app or its mobile webpage, instead of using the iOS login prompts. You can even sync your contacts through the Facebook app, which used to be managed in iOS Settings.

So What Does This Mean for Apps?

At the time of this writing, a lot of Twitter applications use the built-in Twitter APIs in iOS. They will need to be updated so that they do not give an error saying there are no Twitter accounts set up in the Settings app.

Conclusion

While I do not think this change to the handling of social sharing will affect everyday consumers much, it was great to be able to manage all social account logins from one place in the Settings app. Android has this, and a user can even add account types that were not included with the OS. I doubt Apple will go that far, but I really hope the engineers at Apple have something in mind to replace the previous social framework and will surprise us in a future version of iOS.

Hear That? It’s an Eclipse!

Eclipse Soundscapes

On August 21, 2017, millions of people will get the chance to see the total solar eclipse as it passes through the United States. However, not everyone will be able to enjoy it. Those of us with impaired or no vision will certainly miss out on this great spectacle. Or will we? As Apple’s ads used to say, “There’s an app for that.” No, seriously!

The app, called Eclipse Soundscapes, aims to give its users “a multisensory experience of this exciting celestial event.”

From the App Store

The Eclipse Soundscapes Project app is specially designed so that people who are blind and visually impaired can share in the awe and wonder of astronomical events in real time with their sighted peers. The app is a joint effort between the Smithsonian Astrophysical Observatory (SAO), NASA’s Heliophysics Education Consortium (HEC), the National Center for Accessible Media (NCAM), and the National Park Service (NPS). Features include an interactive “Rumble Map”; audio descriptions of key features of the eclipse; a play-by-play description of the total solar eclipse as it is happening in the user’s area; and a countdown clock to the next upcoming eclipse.

The “Rumble Map” gives the user the sensation of “feeling” the Sun during an eclipse. Our technology translates images of key eclipse features into a series of unique frequency modulated tones that map out variations in light and dark as the user explores the image with their fingertips. These tones are specially designed to make the user’s mobile device shake, or rumble, in response to the changes.

After the eclipse, the Eclipse Soundscapes app will provide access to a database of soundscape recordings from U.S. National Parks and other urban and rural locations so that users can experience how eclipses change the behavior of different species, including humans. During the next five years, the app will expand to include other eclipses and astronomical objects of interest, giving people who are blind and visually impaired – and everyone else – a new way to engage with the universe around them.
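The App Store description does not publish the app’s exact mapping from light to sound, but the core idea of the Rumble Map, turning brightness under a fingertip into a tone, can be sketched in a few lines. Note that the linear mapping and the 110–880 Hz range below are my own assumptions for illustration, not values from the Eclipse Soundscapes project:

```python
def brightness_to_tone(brightness, f_min=110.0, f_max=880.0):
    """Map a normalized pixel brightness (0.0 = dark, 1.0 = bright)
    to a tone frequency in Hz: dark shadow regions get low, rumbling
    tones and bright sunlit regions get higher ones."""
    if not 0.0 <= brightness <= 1.0:
        raise ValueError("brightness must be in [0, 1]")
    return f_min + brightness * (f_max - f_min)

# Sampling three points across a hypothetical eclipse image:
shadow = brightness_to_tone(0.0)    # 110.0 Hz (deep shadow)
partial = brightness_to_tone(0.5)   # 495.0 Hz (partial eclipse)
corona = brightness_to_tone(1.0)    # 880.0 Hz (full sunlight)
```

In the real app, a tone like this would also drive the device’s vibration motor, which is what produces the “rumble.”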

You can grab it from the App Store for free.

You can find out more about the Eclipse Soundscapes project here.

I’m really excited to see an app like this because I’ve always had an interest in what events like this look like. I can’t wait to use it during the eclipse. I think it’s going to be a really neat experience! Don’t you?

Let us know what you think.

Freeing everyone’s ability to learn VoiceOver for iOS

VO Starter app icon with blue gradient with letters VO in center

Today we decided that VO Starter should provide training to all iOS users who want to learn VoiceOver, so we have made VO Starter a free app. One reason is that everyone should know how to use their device; another is that the other app that helps with this training is now two years old and could use an update. iAccessibility hopes to provide the best training opportunities for students through VO Starter and future VoiceOver training apps, so from this point on, VO Starter is a free app. If you are a trainer of the blind, this will be a great asset to your students and to everyone who needs some extra training on iOS devices.

VO Starter on the App Store – iTunes – Apple

Hooke Audio – 3D Binaural Microphone for iOS

Hooke Audio Logo

In the past, 3D audio recording has been something that many have not had access to. It seems that we are now seeing many different products on the market that can do 3D audio, and Hooke Audio is one company that provides such a solution. Thanks to the great folks at Hooke, the iAccessibility Management Team has had the opportunity to use the Hooke Verse headset and record several demos using the iOS app. We will soon have demos of the headset’s recordings on the iA Cast podcast, but I wanted to talk today about the Hooke Audio app and how amazingly accessible it is.

How The App Works

To explain how the app works, I must first explain the Hooke Verse headset. The Hooke Verse is a Bluetooth headset that can record in 3D, also known as binaural, audio, which means that the sound can come from any angle instead of just left or right. The headset appears to use specific APIs to communicate with an iOS device, like an iPhone, to send audio to the device. From what I can tell, the Hooke Verse does not work like the microphones in many headphones, because it cannot be used to make calls or to work with TeamTalk or FaceTime on iOS. Once the user initiates recording in the Hooke Audio app, the headset starts recording in 3D audio. Recording on iOS does require the Hooke Audio app, but you can use the share button in the app to send recordings to other apps.
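Hooke’s internal processing isn’t public, but the main cue that makes binaural recordings sound three-dimensional is one that two in-ear microphones capture automatically: the tiny difference in arrival time between the ears. As a rough sketch (not Hooke’s actual algorithm), Woodworth’s classic far-field approximation estimates that delay from the direction of the sound; the head radius and speed of sound below are assumed typical values:

```python
import math

HEAD_RADIUS_M = 0.0875    # average adult head radius, ~8.75 cm (assumed)
SPEED_OF_SOUND = 343.0    # meters per second in room-temperature air

def interaural_time_difference(azimuth_deg):
    """Woodworth's far-field approximation of the interaural time
    difference (ITD): the extra time sound takes to reach the far ear
    for a source at the given azimuth (0 = straight ahead,
    90 = directly to one side). Returns the delay in seconds."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND) * (theta + math.sin(theta))

# A source straight ahead arrives at both ears at once (ITD = 0);
# a source at 90 degrees is delayed by roughly two-thirds of a
# millisecond, which the brain reads as "off to the side."
```

Because the Hooke Verse’s microphones sit in your ears, these delays (along with level and spectral differences) are baked into the recording itself, which is why playback over headphones recreates the original directions.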

Hooke Audio Accessibility

One of the big advantages of the Hooke Audio app is that it is fully accessible with VoiceOver. Hooke has worked very closely with the Blind community to make their app and device as accessible as possible.

When you enter the Hooke App for the first time, you will be guided through a tutorial on how to set up and use your headset. Please read through this tutorial as there are some interesting suggestions on how to properly use the headset.

Typically, we would explain where to find the buttons in an app in our reviews, but I think you will find a pleasant surprise when you open the Hooke app for the first time with VoiceOver on. I personally am a visual app user, but I found it very easy to learn the app with VoiceOver when I opened it and it showed me a dialog explaining how to use the app with VoiceOver.

Conclusion

Hooke Audio has created a great headset and companion app for recording 3D audio. I would like to see it have more features that worked with other apps like GarageBand and have AudioBus support, but I think that this is a great start to something that makes an iOS device into more than what it was designed for.

We will be reviewing the headset itself in great detail in a future article and in several podcast episodes.

Link to Hooke Audio

 

Bring Your Adventures to the Mac

iAccessibility Logo

Everyone likes games, and when gaming is accessible, games are even better. MUDs have long been an accessible way for people to play games online with friends, but there has been a lack of accessible MUD clients for the Mac. This week, iAccessibility is proud to announce a MUD client that everyone can use on the Mac. Yesterday, iAccessibility released MUDAbility, a low-priced MUD client for macOS that works with or without VoiceOver. While this is the first release, we plan to update the app frequently to give you the most features possible. Our plan is to have it ready for the Mac App Store soon. If you are interested, head over to https://iaccessibility.net/downloads/mudability-accessible-mud-client-for-macos/

What the iA staff would like to see from #WWDC17

WWDC17 poster

Each year, Apple announces its latest software at its Worldwide Developers Conference, and WWDC 2017 starts today. The management staff here at iAccessibility have written down what they would like to see, below.

Matt’s wishlist

Monday, June 5 is Apple’s Worldwide Developers Conference (WWDC), where we will most likely see some new hardware and software. Here are a few of my hopes for the conference announcements.
iOS

  1. Updated Braille translation support – I should be able to type quickly on my Braille display without waiting for the operating system/display to sync with each other.
  2. Siri – Hopefully in iOS 11, Siri will support more third-party applications with SiriKit. I would love to be able to say, “Hey Siri, play podcasts with Overcast,” or, “Hey Siri, play my Audible book.” Also, while using Hey Siri, you should be able to continue speaking to Siri after you’ve asked it a question. This should be true on whichever device you use, especially if the rumor is true about Apple releasing a Siri speaker.

MacOS

  1. No more lag – there shouldn’t be any lag when moving with VO, whether in Safari, Pages, or TextEdit. At the moment there is also a lag when switching activities with both Apple voices and Vocalizer voices.
  2. “Hey Siri” – Apple is always about the “universal experience across devices.” Hopefully in the next versions of both macOS and tvOS we will see “Hey Siri” added.

These are just a few things I would love to see at today’s event. It will be great to see what’s added in the upcoming operating systems.

Jason’s Wishlist

As we all know, Apple’s WWDC conference is happening on June fifth. Here are some things I’d like to see.

iOS

I’d like to see Apple put the “Pro” into iPad Pro. From Apple’s website: “iPad Pro is more than the next generation of iPad — it’s an uncompromising vision of personal computing for the modern world. It puts incredible power that leaps past most portable PCs at your fingertips. It makes even complex work as natural as touching, swiping, or writing with a pencil. And whether you choose the 12.9-inch model or the 9.7-inch model, iPad Pro is more capable, versatile, and portable than anything that’s come before. In a word, super.”

For me, two things come to mind that would make the iPad “super”: 1. Finder for iOS, and 2. Xcode for iOS. If Apple wants the iPad Pro to be an “uncompromising vision of personal computing,” adding those two features to iOS would, I think, really help Apple meet that goal.

Siri

I’d like to see Siri get smarter, and more capable.

More app categories being added to SiriKit would be nice, especially if Apple is indeed going to announce a Siri speaker.

I think Siri needs a more natural conversation style than it currently has, and that style should be hands-free. We can already say something like, “Hey Siri, what’s the weather?” and it will respond. But if you want to ask something else, you have to keep saying “Hey Siri.”

Maybe Siri could listen for a couple seconds to see if you have something else to ask before waiting for the “Hey Siri” command again?

I can already see a downside to my idea, but it leads me to the last thing I’d like to see for Siri, and that would be to improve Siri’s ability to only respond to your specific voice.

Conclusion

These are just a few things I’d like to see announced at WWDC. Will they happen? Well, all I can say is it’s going to be fun to watch it and find out.

Michael’s Wishlist

WWDC is one of my favorite announcements of the year, and I can’t wait to see what Apple brings us today. There are so many things that I would like to see released today, so I will separate them out by OS.

iOS

  1. I have to agree with what Jason said about the iPad Pro. I would really like to see it become more of a pro device for developers and others.
  2. I would like to see a full file system come to iOS.
  3. I would like to see full audio support come to iOS, where multiple media types can play at the same time, so I could have TeamTalk and music playing simultaneously, or record from multiple inputs.
  4. Better support for Siri to learn new skills.
  5. The ability to remove Apple Music cache files. You can do this on Android, so iOS should have it as well.
  6. Less reliance on iTunes to get items like ringtones onto your device.

macOS

  1. Support for “Hey Siri”
  2. Support for touch-screen Macs

watchOS

  1. Allow the watch to unpair and pair with different phones.
  2. Speed improvements
  3. Siri Improvements

Windows

  1. Support for iMessage for Windows

Android

  1. Support for iMessage
  2. Support for Apple Watch
  3. iCloud contacts and sync for Android

While I am sure much of my list will not appear at today’s event, I can hope that some of it becomes reality.

Conclusion

I think our team has put together some awesome ideas of things we would like to see today at WWDC, and we will see what we get in an hour.

WWDC 2017 – Apple
