The Westside Gazette
    News

Apple Will Scan iPhones, iPads For Images Of Child Sex Abuse

August 9, 2021

    WASHINGTON — Apple will add a series of new child-safety features to its next big operating system updates for iPhones and iPads.

    As a part of iOS 15 and iPadOS 15 updates later this year, the company will implement a feature to detect photos stored in iCloud Photos that depict sexually explicit activities involving children.

    “This will enable Apple to report these instances to the National Center for Missing and Exploited Children,” the company said in a notice on its website.

    “NCMEC acts as a reporting center for child sexual abuse material and works in collaboration with law enforcement agencies across the US.”

    Its method of detecting known child sexual abuse material is “designed with user privacy in mind,” as per Apple.

    The company said it is not directly accessing customers’ photos but instead is using a device-local, hash-based matching system to detect child abuse images.

    The hashing technology, called NeuralHash, analyzes an image and converts it to a unique number specific to that image, as per the notice.

Only another image that appears nearly identical can produce the same number; for example, images that differ in size or transcoded quality will still have the same NeuralHash value.
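The property described above — that a resized or re-encoded copy of an image maps to the same value — is what distinguishes a perceptual hash from a cryptographic one. As a rough illustration only (NeuralHash itself is a neural-network-based system whose details Apple has not fully published), here is a toy "average hash" in Python; the function name and grid size are illustrative choices, not anything from Apple's scheme:

```python
# Toy perceptual hash (simplified "average hash"): downscale the image
# to a small grid, compare each cell against the overall mean, and pack
# the resulting bits into an integer. Near-identical images (e.g. the
# same picture at a different size) land on the same value.

def average_hash(pixels, grid=4):
    """pixels: square list-of-lists of grayscale values (0-255)."""
    n = len(pixels)
    cell = n // grid  # side length of each averaging cell
    means = [
        sum(pixels[y][x]
            for y in range(gy * cell, (gy + 1) * cell)
            for x in range(gx * cell, (gx + 1) * cell)) / cell**2
        for gy in range(grid) for gx in range(grid)
    ]
    overall = sum(means) / len(means)
    bits = 0
    for m in means:                      # one bit per grid cell
        bits = (bits << 1) | (m > overall)
    return bits

# An 8x8 "image" and the same image upscaled 2x (16x16):
img = [[(x * 31 + y * 17) % 256 for x in range(8)] for y in range(8)]
img2x = [[img[y // 2][x // 2] for x in range(16)] for y in range(16)]

assert average_hash(img) == average_hash(img2x)  # resize-invariant
```

The device would then compare such hashes against a database of known-image hashes, rather than transmitting or inspecting the photos themselves.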

Using another technology called threshold secret sharing, the system ensures that Apple cannot interpret the contents of the encrypted "safety vouchers" uploaded with each photo unless the iCloud Photos account crosses a threshold of known child sexual abuse content, the notice read.

    The cryptographic technology allows Apple to interpret the contents of the safety vouchers associated with the matching child sexual abuse material images only when the threshold is exceeded.
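To illustrate the threshold idea, here is a textbook sketch of Shamir's secret sharing in Python. This is not Apple's actual voucher protocol (which is more involved); it only demonstrates the property the article describes: a key split into shares can be reconstructed once t or more shares are available, and not before. The function names and parameters are illustrative:

```python
# Textbook Shamir secret sharing over a prime field: a secret is the
# constant term of a random degree-(t-1) polynomial, each share is a
# point on that polynomial, and any t points pin the polynomial down.
import random

P = 2**61 - 1  # prime modulus; all arithmetic is in the field GF(P)

def split(secret, n, t):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):  # Horner's rule
            acc = (acc * x + c) % P
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def combine(shares):
    """Lagrange-interpolate the polynomial at x = 0 to recover f(0)."""
    secret = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * -xj % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = split(secret=123456789, n=5, t=3)
assert combine(shares[:3]) == 123456789  # threshold met: recoverable
# With only t-1 shares, every candidate secret is equally consistent
# with the points held, so below-threshold shares reveal nothing.
```

In the scheme Apple describes, each matching image contributes one share, so the decryption key for the vouchers only materializes once enough matches accumulate.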

    Apple also said it can’t actually see user photos or the results of such scans unless there’s a hit.

    “Suppose there’s a match between a user’s photos and the child sexual abuse material database. In that case, Apple then manually reviews each report to confirm sexually explicit images of children, then disable the user’s account and send a report to the National Center for Missing and Exploited Children,” said the tech giant.

If a user feels their account has been mistakenly flagged, as per Apple, "they can file an appeal to have their account reinstated."

The system provides a high level of accuracy, ensuring less than a one-in-one-trillion chance per year of incorrectly flagging a given account, as per the tech giant.
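The match threshold is what pushes the account-level error rate this low: even if a single innocent photo occasionally collides with the database, many independent collisions are needed before an account is flagged. The numbers below (per-image false-match rate, library size, threshold) are illustrative assumptions, not Apple's published parameters; they only show how a binomial tail shrinks past a threshold:

```python
# Back-of-the-envelope: probability that an account with n innocent
# photos accumulates at least t false matches, given a per-image
# false-match probability p. All three values are assumed for
# illustration.
from math import comb

p = 1e-6      # assumed chance a single innocent photo falsely matches
n = 10_000    # assumed photos in the account's library
t = 30        # assumed reporting threshold

# Binomial upper tail, truncated once terms become negligible:
tail = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(t, t + 50))
print(f"chance an innocent account is falsely flagged: ~{tail:.1e}")
```

With these assumptions the tail is vastly smaller than one in a trillion, which is why a per-account threshold can tolerate a much higher per-image collision rate.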

    In addition, with Apple’s iOS 15 update, the iPhone’s Messages app will add new tools to warn children and their parents if they are receiving or sending sexually explicit photos.

    “When receiving this type of content, the photo will be blurred, and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo,” Apple said.

"As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos."

    Apple’s iOS 15 also will provide updates to Siri and Search to “provide parents and children expanded information and help if they encounter unsafe situations.”

    Siri and Search will intervene when users try to search for child sexual abuse material, displaying prompts that will “explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.”

The iOS 15 update is slated for release in the fall of 2021 and will be available for iPhone 6s and later models.

    (With inputs from ANI)

    Edited by Amrita Das and Pallavi Mehra



The post Apple Will Scan iPhones, iPads For Images Of Child Sex Abuse appeared first on Zenger News.
