I Love My Wife, but I’m Not Sharing AirPods With Her Again Thanks to This iPhone Trick


When my wife and I travel, we usually watch shows and movies on our flight, but we always have to split a set of AirPods. So while one of our ears is enjoying the show, the other is forced to endure the crying baby behind us on the plane. But thanks to Apple’s Audio Sharing feature, we can enjoy a show together while using our own sets of AirPods, AirPods Pro or other headphones. 


Apple introduced Audio Sharing with iOS 13 back in 2019. It allows two pairs of compatible AirPods or Beats headphones to pair with a single iPhone or iPad, so you and another person can immerse yourselves in music, movies or a TED Talk without disturbing those around you.

Here’s how to share audio with another set of AirPods or Beats headphones.

How to enjoy music and podcasts together 

1. Open Music, Podcasts, Spotify or any similar app with your AirPods in your ears and connected to your iPhone or iPad.
2. Start playing music or a podcast.
3. Open your Control Center by swiping down from the top right corner of your screen. 
4. Go to your audio controls page — it’s your second Control Center page by default.
5. Tap the silhouette of two people on the right side of your screen.

The Weird Little Guys podcast in the iOS 26 Control Center. You can share your audio with another pair of headphones by tapping this symbol. (Apple/Screenshot by CNET)

6. Bring the other pair of AirPods in their case, or the other Beats headphones, near your device. You might need to press and hold the button on the back of the AirPods case to put them into pairing mode. 
7. Tap Share Audio when your device detects the other AirPods or Beats headphones. 

Each listener can adjust their own volume using their headphones' controls, or you can set the volume for each pair from the connected device. Either way, you can enjoy a podcast or music together without splitting a pair of AirPods or disturbing the people around you. 


How to watch a movie or show with separate headphones

Pairing another set of headphones to your device to watch a film or TV series is similar. Here’s how. 

1. Open Apple TV, Netflix, YouTube or another streaming app with your headphones on. 
2. Start playing a film or show. 
3. Open your Control Center by swiping down from the top right corner of your screen. 
4. Go to your audio controls page — it’s your second Control Center page by default.
5. Tap the silhouette of two people on the right side of your screen.
6. Bring the other pair of AirPods in their case, or the other Beats headphones, near your device. You might need to press and hold the button on the back of the AirPods case to put them into pairing mode. 
7. Tap Share Audio when your device detects the other AirPods or Beats headphones. 

Now you and another person can watch a video together without disturbing the people around you and without splitting a pair of AirPods. 

For more iOS news, here’s what to know about iOS 26.4 and iOS 26.3. You can also check out our iOS 26 cheat sheet for other tips and tricks.


A new class-action lawsuit, filed on Monday by three teenage girls and their guardians, alleges that Elon Musk’s xAI created and distributed child sexual abuse material featuring their faces and likenesses with its Grok AI tech.

“Their lives have been shattered by the devastating loss of privacy, dignity, and personal safety that the production and dissemination of this CSAM have caused,” the filing says. “xAI’s financial gain through the increased use of its image- and video-making product came at their expense and well-being.”

From December to early January, Grok allowed many users of the AI tool and the X social media platform to create AI-generated nonconsensual intimate images, sometimes known as deepfake porn. Reports estimate that over a period of nine days, Grok users made 4.4 million "undressed" or "nudified" images, accounting for 41% of all images generated in that span. 

X and xAI, including the company's safety and child safety teams, did not immediately respond to a request for comment.

The wave of “undressed” images stirred outrage around the world. The European Commission quickly launched an investigation, while Malaysia and Indonesia banned X within their borders. Some US government representatives called on Apple and Google to remove the app from their app stores for violating their policies, but no federal investigation into X or xAI has been opened. A similar, separate class-action lawsuit was filed (PDF) by a South Carolina woman in late January.

The dehumanizing trend highlighted just how capable modern AI image tools are at creating content that seems realistic. The new complaint compares Grok’s self-proclaimed “spicy AI” generation to the “dark arts” with its ease of subjecting children to “any pose, however sick, however fetishized, however unlawful.”

“To the viewer, the resulting video appears entirely real. For the child, her identifying features will now forever be attached to a video depicting her own child sexual abuse,” the complaint reads.


The complaint says xAI is at fault because it did not employ industry-standard guardrails that would prevent abusers from making this content. It says xAI licensed use of its tech to third-party companies abroad, which sold subscriptions that led abusers to make child sexual abuse images featuring the faces and likenesses of the victims. The requests ran through xAI’s servers, which makes the company liable, the complaint argues.

The lawsuit was filed by three Jane Does, pseudonyms given to the teens to protect their identities. Jane Doe 1 first learned that abusive, AI-generated sexual material of her was circulating on the web from an anonymous Instagram message in early December. According to the filing, the same anonymous user told her about a Discord server where the material was being shared. That led Jane Doe 1 and her family, and eventually law enforcement, to find and arrest one perpetrator.

Ongoing investigations led the families of Jane Does 2 and 3 to learn their children’s images had been transformed with xAI tech into abusive material.
