Apple's iOS 17 expands protections against unwanted nudes

Apple's iOS 17 makes it easier to share content, but it also adds new safeguards to prevent abuses of that newfound power. The company has revealed that its upcoming software will add a Sensitive Content Warning feature that helps adults avoid unsolicited nude photos and videos. If you receive something potentially concerning, you can decline it, choose to view it anyway or learn about ways to get help.

Communication Safety also protects kids beyond the Messages app. The feature will use machine learning to detect and blur sexually explicit material sent and received through AirDrop, Contact Posters, FaceTime messages and the Photos picker. The technology can now recognize videos in addition to still shots. If this content arrives, children can message trusted adults for help or find useful resources.

Both Sensitive Content Warning and Communication Safety process media entirely on-device, and Apple says it has no access to the material. Communication Safety requires that you enable Family Sharing and mark certain accounts as belonging to children.
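Apple also exposes this same on-device detection to third-party apps through the SensitiveContentAnalysis framework in iOS 17. Below is a minimal Swift sketch of how an app might check a received image before displaying it; the `shouldBlur` function and `incomingImageURL` parameter are illustrative assumptions, not code from Apple, and the analyzer only reports results when the user has turned on Sensitive Content Warning or Communication Safety.

```swift
import SensitiveContentAnalysis

// Minimal sketch: ask the on-device analyzer whether an image is
// sexually explicit before showing it. Requires the
// com.apple.developer.sensitivecontentanalysis.client entitlement.
// `incomingImageURL` is a hypothetical file URL for a received photo.
func shouldBlur(incomingImageURL: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // The policy is .disabled unless the user has enabled Sensitive
    // Content Warning (adults) or Communication Safety (children).
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // Analysis runs entirely on-device; the media never leaves the phone.
        let analysis = try await analyzer.analyzeImage(at: incomingImageURL)
        return analysis.isSensitive
    } catch {
        // If analysis fails, err on the side of blurring.
        return true
    }
}
```

A caller would typically blur the image view and overlay a warning when this returns true, mirroring the intervention Apple's own apps present.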

Apple unveiled its plans to curb unsolicited nudes in 2021 alongside a plan to flag photos uploaded to iCloud when they contained known child sexual abuse material (CSAM). The company scrapped this plan at the end of 2022 amid concerns governments could pressure it to scan for other image types, not to mention risks of false positives. Communication Safety and Sensitive Content Warning don't have those issues — they're only meant to prevent creeps from traumatizing others.

Legislators have moved to criminalize the sending of unwanted nudes, and individual services have their own anti-nude detection tools. In that light, Apple is mainly filling gaps in the deterrence system. In theory, shady characters won't have much success blasting iPhone users with explicit photos and videos.
