
Control Your iPhone with a Scrunched Nose or a Stuck-Out Tongue in iOS 26

Apple is rolling out new accessibility controls in iOS 26 that let you map facial expressions to everyday actions on iPhone and iPad. Think raising your eyebrows to go Home, or puckering to the right to scroll down. If you like hands-free control or need alternative inputs, this one is worth a look.

What this is

Head Tracking lives under Accessibility → Switch Control and uses the front camera to watch for expressions such as Raise Eyebrows, Open Mouth, Smile, Stick Out Tongue, or Pucker Lips left or right. You can assign those to actions such as Single Tap, Home, Open App, Scroll, or even your Accessibility Shortcut. Only devices with a TrueDepth front camera are supported.
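
For the technically curious, this is the kind of signal Apple already exposes to developers through ARKit's TrueDepth blend shapes, where each expression reports a value from 0 (neutral) to 1 (fully expressed). The sketch below only illustrates that underlying signal; it is not Apple's Head Tracking code, and the 0.6 threshold and the printed action mappings are placeholder assumptions.

  import ARKit

  // Sketch: reading TrueDepth blend-shape values with ARKit.
  final class ExpressionWatcher: NSObject, ARSessionDelegate {
      let session = ARSession()

      func start() {
          // Face tracking requires a TrueDepth camera, matching the feature's requirement.
          guard ARFaceTrackingConfiguration.isSupported else { return }
          session.delegate = self
          session.run(ARFaceTrackingConfiguration())
      }

      func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
          guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
          let shapes = face.blendShapes   // 0.0 (neutral) ... 1.0 (fully expressed)

          if (shapes[.browInnerUp]?.floatValue ?? 0) > 0.6 {
              print("Raise Eyebrows -> e.g. go Home")              // placeholder mapping
          }
          if (shapes[.tongueOut]?.floatValue ?? 0) > 0.6 {
              print("Stick Out Tongue -> e.g. Accessibility Shortcut")
          }
          if (shapes[.mouthPucker]?.floatValue ?? 0) > 0.6 {
              print("Pucker Lips -> e.g. scroll")
          }
      }
  }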

How to turn it on

  1. Update to the latest iOS on your device.
  2. Go to Settings → Accessibility → Switch Control.
  3. Turn on Head Tracking, then open Actions to assign expressions to the actions you want.
  4. Adjust Sensitivity to control whether slight or only exaggerated expressions trigger an action.

Avoid accidental taps

If resting your gaze on something for too long triggers a selection you did not mean, review the Dwell settings under Switch Control so you are not selecting items by accident.
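
If it helps to picture what dwell is doing, here is a conceptual sketch of the idea: a selection fires only after the pointer has rested on an item for a continuous interval, and moving away first cancels it. This is an illustration of the concept, not Apple's implementation; the 1.5-second interval and the names used are placeholders.

  import Foundation

  // Conceptual dwell timer: select only after the pointer rests on an item
  // for a full interval; leaving the item before then cancels the selection.
  final class DwellTimer {
      var dwellInterval: TimeInterval = 1.5   // placeholder; the real value is set in Settings
      private var pending: DispatchWorkItem?

      func pointerEntered(_ item: String, onSelect: @escaping (String) -> Void) {
          pending?.cancel()
          let task = DispatchWorkItem { onSelect(item) }
          pending = task
          DispatchQueue.main.asyncAfter(deadline: .now() + dwellInterval, execute: task)
      }

      func pointerLeft() {
          pending?.cancel()
          pending = nil
      }
  }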

A quick on/off you’ll actually use

Add the Accessibility Shortcut control to Control Center, then set Switch Control as one of the features that shortcut toggles (Settings → Accessibility → Accessibility Shortcut). That gives you a fast way to switch Head Tracking on and off when you need it.

Why parents will care

  • Sticky hands, busy kitchen, still open the camera with a smile.
  • Stroller in one hand, scroll a recipe with a quick pucker.
  • Small accessibility win for kids who benefit from hands-free controls.
