A new system is helping to combat child sexual abuse images


Every day, one British analyst team faces a seemingly endless stream of horrors. The team of 21, working at the Internet Watch Foundation's (IWF) office in Cambridgeshire, spends hours trawling through images and videos containing child sexual abuse. And every time they find a photo or clip, it needs to be assessed and labeled. Last year alone, the team identified 153,383 web pages with links to child sexual abuse imagery. This creates a huge database, which can then be shared internationally to stem the flow of abuse. The problem? Different countries classify images and videos in different ways.

Until now, analysts at the UK-based child protection charity have checked whether the material they find falls into one of three categories: A, B, or C. These groupings are based on British law and sentencing guidelines for child sexual abuse and broadly set out the types of abuse involved. Images in category A, the most severe classification, cover the worst crimes against children. These classifications are then used to work out how long someone convicted should be sentenced for. But other countries use different classifications.

Now the IWF believes a breakthrough in how it handles data could remove some of these differences. The organization has rebuilt its hashing software, called Intelligrade, to automatically match images and videos to the rules and laws of Australia, Canada, New Zealand, the United States, and the United Kingdom, also known as the Five Eyes countries. The change should mean less repeated analysis work and make it easier for technology companies to prioritize the most serious abuse images and videos.

“We believe that we can better share data so that more people can use it in a meaningful way, rather than all of us working on our own little islands,” said Chris Hughes, director of the IWF's reporting hotline. “Currently, when we share data, it is very difficult to make any meaningful comparisons, because the data simply doesn't mesh correctly.”

Countries weight images differently depending on what is happening in them and the age of the children involved. Some classify images according to whether a child is prepubescent or pubescent as well as the crime taking place. The UK's most severe category, A, includes penetrative sexual activity, bestiality, and sadism. It does not necessarily include acts of masturbation, Hughes said; in the United States, that content falls into a higher category. “At the moment, the US requesting IWF category A images would be missing out on that level of content,” Hughes said.

Every photo and video the IWF looks at is assigned a hash value, essentially a unique code, which is shared with technology companies and law enforcement agencies around the world. These hashes are used to detect known abusive content and block it from being uploaded to the web again. The hashing system has had a major impact on the spread of child sexual abuse material online, and the IWF's latest tool adds a great deal of new information to each hash.
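To illustrate the general idea of hash-list matching, here is a minimal sketch. It uses SHA-256 as a stand-in; systems like the IWF's typically rely on perceptual hashes that survive resizing and re-encoding, and the list contents and function names below are assumptions for illustration, not the IWF's actual interface.

```python
import hashlib

# Illustrative only: a set of hashes of known abusive files, as shared by
# hotlines with platforms. The entries here are placeholders, not real data.
KNOWN_HASHES = {
    "placeholder-hash-value",
}

def file_hash(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def should_block_upload(path: str) -> bool:
    """Check an incoming upload against the shared hash list."""
    return file_hash(path) in KNOWN_HASHES
```

A platform would run every incoming upload through a check like this and block or escalate any match, which is why a shared, internationally consistent hash list matters.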

The IWF's secret weapon is metadata. This is data about data: it can describe what is in an image, who is in it, how it was made, and when. Metadata is a powerful tool for investigators because it lets them spot patterns in people's behavior and analyze trends. Among the biggest proponents of metadata are spies, who say it can be more revealing than the content of people's messages.

Hughes said the IWF has increased the amount of metadata it creates for each image and video it adds to its hash list. Every new image or video it looks at is assessed in more detail than ever before. As well as working out whether sexual abuse content falls within the UK's three groups, its analysts now add up to 20 different pieces of information to their reports. These fields match what is needed to determine an image's classification in the other Five Eyes countries; the charity's policy staff compared each country's laws and worked out what metadata is required. “We decided to provide a high level of granularity around describing the age, describing what is taking place in the image, and confirming gender,” Hughes said. A sketch of how such a record might look appears below.
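The following sketch shows one way a hash record carrying extra metadata could be mapped onto different national schemes. The field names, categories, and the mapping rule are assumptions made for this example, not the actual Intelligrade schema or any country's real thresholds.

```python
from dataclasses import dataclass, field

# Illustrative only: field names and categories are assumptions for the sketch.
@dataclass
class HashRecord:
    hash_value: str                               # shared hash of the image or video
    uk_category: str                              # "A", "B", or "C" under UK guidelines
    metadata: dict = field(default_factory=dict)  # e.g. age band, acts depicted, gender

def category_for(record: HashRecord, jurisdiction: str) -> str:
    """Map a record onto another country's scheme using its metadata.

    The rule below is a hypothetical placeholder showing the idea: the same
    metadata lets each Five Eyes country apply its own thresholds.
    """
    if jurisdiction == "US" and record.metadata.get("act") == "masturbation":
        # Hypothetical rule reflecting the article's point that the US treats
        # this content more severely than the UK's A/B/C split does.
        return "US-most-severe"
    return record.uk_category

# Example use with made-up values:
record = HashRecord(
    hash_value="placeholder-hash-value",
    uk_category="B",
    metadata={"age_band": "prepubescent", "act": "masturbation"},
)
print(category_for(record, "US"))  # -> "US-most-severe"
print(category_for(record, "UK"))  # -> "B"
```

The design point is that one shared record, annotated once with fine-grained metadata, can be re-read under each country's rules instead of being re-analyzed by each country separately.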
