Apple's iOS 17 expands protections against unsolicited nudes

Apple's iOS 17 makes it easier to share content, but it also has new safeguards to prevent abuses of that newfound power. The company has revealed that its upcoming software will add a Sensitive Content Warning feature that helps adults avoid unsolicited nude photos and videos. If you receive something potentially concerning, you can decline it, agree to see it or learn about ways to get help.

Communication Safety, the company's existing safeguard for kids, now extends beyond the Messages app. The feature will use machine learning to detect and blur sexually explicit material sent and received through AirDrop, Contact Posters, FaceTime video messages and the Photos picker, and the technology can now recognize videos in addition to still images. If this content arrives, children can message trusted adults for help or find useful resources.
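For developers, Apple exposes the same on-device detection to third-party apps through the SensitiveContentAnalysis framework in iOS 17. The sketch below shows how a messaging app might screen an incoming image before displaying it; the helper function and its fail-open error handling are illustrative choices, not Apple's own implementation, and the code assumes the app holds the Sensitive Content Analysis entitlement.

```swift
import Foundation
import SensitiveContentAnalysis

// Illustrative helper: decide whether to blur an incoming image.
// Analysis is a no-op unless the user (or a parent, for a child
// account) has enabled one of the sensitive-content features.
func shouldBlurIncomingImage(at fileURL: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // The policy is .disabled when neither Sensitive Content Warning
    // nor Communication Safety is turned on, so skip the work entirely.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // Detection runs entirely on-device; the image never leaves the phone.
        let analysis = try await analyzer.analyzeImage(at: fileURL)
        return analysis.isSensitive
    } catch {
        // Failing open on error is an app-level choice here; a stricter
        // app could return true and blur by default instead.
        return false
    }
}
```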

Both Sensitive Content Warning and Communication Safety process media on-device, and Apple says it doesn't have access to the material. Communication Safety requires that you enable Family Sharing and mark certain accounts as belonging to children.
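The framework also reports which of these two modes is active. The sketch below assumes the SCSensitivityAnalysisPolicy cases documented for iOS 17, where .simpleInterventions corresponds to an adult opting in to Sensitive Content Warning and .descriptiveInterventions to Communication Safety being enabled for a child account via Family Sharing; the string descriptions are illustrative.

```swift
import SensitiveContentAnalysis

// Illustrative mapping from the system policy to the kind of
// intervention an app would present when sensitive media arrives.
func interventionDescription() -> String {
    switch SCSensitivityAnalyzer().analysisPolicy {
    case .disabled:
        return "No intervention; analysis is unavailable."
    case .simpleInterventions:
        return "Brief warning with decline/show options for adults."
    case .descriptiveInterventions:
        return "Expanded guidance for children, including ways to get help."
    @unknown default:
        return "Unknown policy; treat as disabled."
    }
}
```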

Apple unveiled its plans to curb unsolicited nudes in 2021 alongside a plan to flag photos uploaded to iCloud when they contained known child sexual abuse material (CSAM). The company scrapped this plan at the end of 2022 amid concerns governments could pressure it to scan for other image types, not to mention risks of false positives. Communication Safety and Sensitive Content Warning don't have those issues — they're only meant to prevent creeps from traumatizing others.

Legislators have aimed to criminalize unwanted nudes, and individual services have their own anti-nude detection tools. In that light, Apple is mainly filling gaps in the deterrence system. In theory, shady characters won't have much success blasting iPhone users with unwanted explicit images.
