The Australian Federal Police (AFP) and researchers from Monash University’s Faculty of Information Technology have worked together to create My Pictures Matter.
Did you know that by uploading your childhood photos, you could help police combat online child sexual exploitation?
My Pictures Matter, an initiative of the AiLECS Lab, is a crowdsourcing campaign with a difference: people aged 18 and above can contribute photographs of themselves as children.
The research behind My Pictures Matter is funded by Monash University, the Australian Federal Police and the Westpac Safer Children Safer Communities grant program.
AiLECS Lab Co-Director Associate Professor Campbell Wilson said that to develop AI capable of identifying exploitative images, researchers need a very large number of children’s photographs.
“By obtaining photographs from adults, through informed consent, we are trying to build technologies that are ethically accountable and transparent.”
According to the AFP media release, these pictures will be used “to train artificial intelligence (AI) models to recognise the presence of children in safe situations, to help identify unsafe situations and potentially flag child exploitation material.”
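The AFP release does not describe the model itself. Purely as an illustration, a classifier of this general kind is often built by fine-tuning a pretrained image network on labelled examples; the sketch below assumes PyTorch and hypothetical folder names, and is not the project’s actual code.

```python
# Illustrative only: a transfer-learning classifier of the general kind the
# article describes, trained to flag whether a child is present in an image.
# Dataset paths and class names below are hypothetical, not from the project.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],   # ImageNet statistics
                         std=[0.229, 0.224, 0.225]),
])

# Expects folders such as training_photos/child/ and training_photos/no_child/
train_set = datasets.ImageFolder("training_photos", transform=transform)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Start from a pretrained backbone and replace the final layer with a
# two-class head (child present / not present).
model = models.resnet18(weights="DEFAULT")
model.fc = nn.Linear(model.fc.in_features, 2)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

model.train()
for epoch in range(5):
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```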
Further, this world-first ethically sourced and managed image bank will support Australian police officers and the children they are trying to protect from exploitation.
People who contribute photos to this initiative will receive details and updates about each stage of the research.
Contributors who no longer wish to be associated with the project can, according to the AFP, “opt to revoke their research consent and withdraw images from the database at a later date.”
Project lead and data ethics expert Dr Nina Lewis adds that the principles of data minimisation have been applied to maintain privacy around the images submitted.
“We are not collecting any personal information from contributors other than the email addresses associated with consent for research use, and these email IDs will be stored separate to the images.”
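The lab has not published its storage design, but the principle Dr Lewis describes can be pictured with a toy sketch: consent emails and images sit in separate stores, tied only by an opaque ID, so a later withdrawal can remove both without the images ever carrying personal details. The names and structure below are assumptions for illustration only.

```python
# Toy sketch of the data minimisation approach Dr Lewis describes: the only
# personal detail kept is the contributor's email address, and it lives in a
# separate store from the images, linked by an opaque submission ID.
# This is not the lab's real system.
import uuid

consent_records = {}   # submission_id -> contributor email (consent register)
image_store = {}       # submission_id -> image bytes (no personal details)

def submit(email: str, image_bytes: bytes) -> str:
    """Store an image and its consent record separately; return the link ID."""
    submission_id = uuid.uuid4().hex
    consent_records[submission_id] = email
    image_store[submission_id] = image_bytes
    return submission_id

def withdraw(email: str) -> None:
    """Honour a withdrawal: remove every image tied to this contributor."""
    withdrawn = [sid for sid, addr in consent_records.items() if addr == email]
    for sid in withdrawn:
        consent_records.pop(sid, None)
        image_store.pop(sid, None)
```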
In 2021, the AFP-led Australian Centre to Counter Child Exploitation received more than 33,000 reports of online child exploitation and each report can contain many images of children being sexually assaulted or exploited.
Investigators need to review that material, which can be a slow and horrific process. It can also cause significant psychological distress.
AiLECS Lab co-director and AFP Leading Senior Constable Dr Janis Dalins said the ultimate goal is to more rapidly identify victims and material not previously seen by law enforcement.
“This will enable police to intervene faster to remove children from harm, stop perpetrators and better protect the community.”
Associate Professor Ritesh Chugh, Information and Communications Technology expert at CQ University, told The Australia Today that law-enforcement agencies globally are using AI to fight crime.
“Data generated through surveillance cameras, social media communication and audio conversations is a gold mine for assistance in public safety.”
Dr Chugh warns that there is a fine line to tread between public safety and privacy, which agencies need to consider in their AI applications and usage. He is hopeful that, with time, law-enforcement agencies will be able to perfect their machine learning tools and AI systems to counter online child exploitation.
“Nevertheless, we’ll continue to see an increased use of AI and machine learning by law-enforcement agencies as they train their automated systems to identify forensic trends and correlations missed by the human eye.”
The researchers at AiLECS Lab are aiming to have a database of at least 100,000 ethically-sourced images for training the AI algorithm by the end of this year.