A New York Times column that year prompted Google executives to devote resources to the problem, organizing projects, including one called Sparrow, to help victims permanently exclude content from search, three former employees say. The product manager confirmed that leaders sometimes pushed teams to improve Google's handling of NCEI.
Google has made its opt-out form easier to access, understand and use, according to sources. The search giant stripped out legalese and dropped the term "revenge porn," now considered outdated because pornography generally implies consent. The company added instructions on submitting screenshots and more details on the review process.
The form also became reachable from the menu that appears next to each search result. Requests increased 19-fold during an initial test, according to one source. A second source says it became one of Google's most heavily used abuse-reporting forms and that, after the changes, a much higher share of requests led to results being removed. Google disputes those figures but declined to share complete NCEI data.
Government-mandated transparency reports show that Google has removed most of the nearly 170,000 search and YouTube links flagged for unwanted sexual content in South Korea since December 2020, the earliest data available, and took down nearly 300 pieces of content in response to 380 user complaints in India since May 2021. Limited data suggest Google deems a higher share of reports credible than its smaller search rival Microsoft, which took action on 52% of the nearly 8,400 cases received globally for Bing and other services from 2015 to June 2023.
Launched in late 2021, the StopNCII system has built a database of more than 572,000 hashed photos and videos and has prevented that material from being shared more than 12,000 times across 10 services, including Instagram and TikTok. Google has not adopted the tool to block content from search because of concerns about what the database actually contains, according to three sources.
To protect victims' privacy, StopNCII does not review the content they report, and the hashes reveal nothing about the underlying material. Google fears it could end up blocking something innocuous, according to the sources. "We don't know if it's just a picture of a cupcake," one said. The sources add that Google has also chosen not to fund a system it would consider better, despite internal suggestions that it do so.
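To make the hash-matching model concrete, here is a minimal, hypothetical sketch in Python. It uses an ordinary cryptographic hash (SHA-256) purely as a stand-in; the actual hashing scheme, database format and interfaces used by StopNCII and participating platforms are not described in this reporting. What it illustrates is the trade-off above: a platform can block exact copies of a reported file without ever seeing it, which also means it cannot tell what a submitted hash depicts.

```python
import hashlib

# Hypothetical illustration: participating platforms hold only hashes submitted
# by victims, never the images or videos themselves. SHA-256 stands in here for
# whatever hashing scheme the real service uses.
hash_database: set[str] = set()

def register_content(file_bytes: bytes) -> str:
    """Run on the victim's own device: hash the file locally and share only the hash."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    hash_database.add(digest)
    return digest

def should_block(upload_bytes: bytes) -> bool:
    """Run by a platform at upload time: compare the upload's hash to the database.
    The platform never learns what the hashed file depicts -- a match could be
    intimate imagery or, as the Google source put it, 'a picture of a cupcake'."""
    return hashlib.sha256(upload_bytes).hexdigest() in hash_database

# Usage
original = b"...victim's photo bytes..."
register_content(original)
print(should_block(original))         # True: an exact copy is blocked
print(should_block(original + b"!"))  # False: any change defeats an exact-match hash
```

The last line hints at why real systems favor perceptual rather than exact hashes, but it also shows the opacity Google cites: nothing in the database reveals whether a blocked hash corresponds to abusive material or to something entirely innocent.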