This study examined the intersection of artificial intelligence (AI) and child sexual abuse (CSA) through a rapid evidence assessment of research on the use of AI to prevent and disrupt CSA, and on the ways AI is used in CSA offending. Research published between January 2010 and March 2024 was reviewed, and 33 empirical studies were identified.
All studies that met the inclusion criteria examined AI for CSA prevention and disruption, specifically how technology can be used to detect or investigate child sexual abuse material or child sexual offenders. No studies examining the use of AI in CSA offending were identified.
This paper describes the state of current research at the intersection of AI and CSA, and provides a gap map to guide future research.
References
URLs correct as at November 2024
*Included in review
*Agarwal N, Ünlü T, Wani MA & Bours P 2022. Predatory conversation detection using transfer learning approach. In G Nicosia et al. (eds), Machine learning, optimization, and data science. Lecture Notes in Computer Science vol. 13163. Springer International Publishing: 488–499. https://doi.org/10.1007/978-3-030-95467-3_35
*Al-Nabki MW, Fidalgo E, Alegre E & Alaiz-Rodriguez R 2023. Short text classification approach to identify child sexual exploitation material. Scientific Reports 13(1): 16108. https://doi.org/10.1038/s41598-023-42902-8
*Anderson P, Zuo Z, Yang L & Qu Y 2019. An intelligent online grooming detection system using AI technologies. 2019 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), New Orleans, LA, USA: 1–6. https://doi.org/10.1109/FUZZ-IEEE.2019.8858973
*Brewer R et al. 2023. Advancing child sexual abuse investigations using biometrics and social network analysis. Trends & issues in crime and criminal justice no. 668. Canberra: Australian Institute of Criminology. https://doi.org/10.52922/ti78948
Butler J 2023. AI tools could be used by predators to ‘automate child grooming’, eSafety commissioner warns. The Guardian, 20 May. https://www.theguardian.com/technology/2023/may/20/ai-tools-could-be-used-by-predators-to-automate-child-grooming-esafety-commissioner-warns
*Cardei C & Rebedea T 2017. Detecting sexual predators in chats using behavioral features and imbalanced learning. Natural Language Engineering 23(4): 589–616. https://doi.org/10.1017/S1351324916000395
*Cubitt T, Napier S & Brown R 2021. Predicting prolific live streaming of child sexual abuse. Trends & issues in crime and criminal justice no. 634. Canberra: Australian Institute of Criminology. https://doi.org/10.52922/ti78320
*Cubitt T, Napier S & Brown R 2023. Understanding the offline criminal behavior of individuals who live stream child sexual abuse. Journal of Interpersonal Violence 38(9–10): 6624–6649. https://doi.org/10.1177/08862605221137712
*Dalins J, Tyshetskiy Y, Wilson C, Carman MJ & Boudry D 2018. Laying foundations for effective machine learning in law enforcement. Majura: A labelling schema for child exploitation materials. Digital Investigation 26: 40–54. https://doi.org/10.1016/j.diin.2018.05.004
Edwards G, Christensen L, Rayment-McHugh S & Jones C 2021. Cyber strategies used to combat child sexual abuse material. Trends & issues in crime and criminal justice no. 636. Canberra: Australian Institute of Criminology. https://doi.org/10.52922/ti78313
Ganann R, Ciliska D & Thomas H 2010. Expediting systematic reviews: Methods and implications of rapid reviews. Implementation Science 5: 56–66. https://doi.org/10.1186/1748-5908-5-56
*Gangwar A, González-Castro V, Alegre E & Fidalgo E 2021. AttM-CNN: Attention and metric learning based CNN for pornography, age and child sexual abuse (CSA) detection in images. Neurocomputing 445: 81–104. https://doi.org/10.1016/j.neucom.2021.02.056
Garriss K & DeMarco N 2023. FBI warns of using AI deepfakes as part of sextortion schemes. Yahoo! News, 6 July. https://www.yahoo.com/news/fbi-warns-using-ai-deepfakes-212047097.html
*Granizo SL, Valdivieso Caraguay ÁL, Barona López LI & Hernández-Álvarez M 2020. Detection of possible illicit messages using natural language processing and computer vision on Twitter and linked websites. IEEE Access 8: 44534–44546. https://doi.org/10.1109/ACCESS.2020.2976530
*Grubl T & Lallie HS 2022. Applying artificial intelligence for age estimation in digital forensic investigations. https://doi.org/10.48550/arXiv.2201.03045
*Guerra E & Westlake BG 2021. Detecting child sexual abuse images: Traits of child sexual exploitation hosting and displaying websites. Child Abuse & Neglect 122: 105336. https://doi.org/10.1016/j.chiabu.2021.105336
Henseler H & de Wolf R 2019. Sweetie 2.0 technology: Technical challenges of making the Sweetie 2.0 chatbot. In S van der Hof, I Georgieva, B Schermer & BJ Koops (eds), Sweetie 2.0: Using artificial intelligence to fight webcam child sex tourism. Information Technology and Law Series, vol. 31. The Hague: TMC Asser Press: 113–134. https://doi.org/10.1007/978-94-6265-288-0_3
High-Level Expert Group on Artificial Intelligence 2019. A definition of AI: Main capabilities and disciplines. Brussels: European Commission. https://digital-strategy.ec.europa.eu/en/library/definition-artificial-intelligence-main-capabilities-and-scientific-disciplines
Internet Watch Foundation 2023. How AI is being abused to create child sexual abuse imagery. https://www.iwf.org.uk/about-us/why-we-exist/our-research/how-ai-is-being-abused-to-create-child-sexual-abuse-imagery/
*Isaza G, Muñoz F, Castillo L & Buitrago F 2022. Classifying cybergrooming for child online protection using hybrid machine learning model. Neurocomputing 484: 250–259. https://doi.org/10.1016/j.neucom.2021.08.148
*Jin P, Kim N, Lee S & Jeong D 2024. Forensic investigation of the dark web on the Tor network: Pathway toward the surface web. International Journal of Information Security 23(1): 331–346. https://doi.org/10.1007/s10207-023-00745-4
*Laranjeira C, Macedo J, Avila S & dos Santos JA 2022. Seeing without looking: Analysis pipeline for child sexual abuse datasets. arXiv. Presented at the 5th Conference on Fairness, Accountability and Transparency (FAccT), 2022. https://doi.org/10.48550/arXiv.2204.14110
Long C 2023. First reports of children using AI to bully their peers using sexually explicit generated images, eSafety Commissioner says. ABC News, 16 August. https://www.abc.net.au/news/2023-08-16/esafety-commisioner-warns-ai-safety-must-improve/102733628
Lupariello F, Sussetto L, Di Trani S & Di Vella G 2023. Artificial intelligence and child abuse and neglect: A systematic review. Children 10: 1659. https://doi.org/10.3390/children10101659
*Macedo J, Costa F & dos Santos J 2018. A benchmark methodology for child pornography detection. Paper presented at the 2018 31st SIBGRAPI Conference on Graphics, Patterns and Images: 455–462. https://doi.org/10.1109/SIBGRAPI.2018.00065
*Meyer M 2015. Machine learning to detect online grooming (Master’s thesis). https://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-260390
Milmo D 2023. AI-created child sexual abuse images ‘threaten to overwhelm internet’. The Guardian, 25 October. https://www.theguardian.com/technology/2023/oct/25/ai-created-child-sexual-abuse-images-threaten-overwhelm-internet
*Murcia Triviño J, Moreno Rodríguez S, Díaz López DO & Gómez Mármol F 2019. C3-Sex: A chatbot to chase cyber perverts. 2019 IEEE International Conference on Dependable, Autonomic and Secure Computing, International Conference on Pervasive Intelligence and Computing, International Conference on Cloud and Big Data Computing, International Conference on Cyber Science and Technology Congress: 50–57. https://doi.org/10.1109/DASC/PiCom/CBDCom/CyberSciTech.2019.00024
Murphy M 2023. Predators exploit AI tools to generate images of child abuse. Bloomberg, 23 May. https://www.bloomberg.com/news/articles/2023-05-23/predators-exploit-ai-tools-to-depict-abuse-prompting-warnings#xj4y7vzkg
*Ngejane CH, Eloff JHP, Sefara TJ & Marivate VN 2021. Digital forensics supported by machine learning for the detection of online sexual predatory chats. Forensic Science International: Digital Investigation 36: 301109. https://doi.org/10.1016/j.fsidi.2021.301109
*Ngo VM, Gajula R, Thorpe C & Mckeever S 2024. Discovering child sexual abuse material creators’ behaviors and preferences on the dark web. Child Abuse & Neglect 147: 106558. https://doi.org/10.1016/j.chiabu.2023.106558
Okolie C 2023. Artificial intelligence-altered videos (deepfakes), image-based sexual abuse, and data privacy concerns. Journal of International Women’s Studies 25(2): 1–16. https://vc.bridgew.edu/jiws/vol25/iss2/11
*Oronowicz-Jaśkowiak W et al. 2024. Using expert-reviewed CSAM to train CNNs and its anthropological analysis. Journal of Forensic and Legal Medicine 101: 102619. https://doi.org/10.1016/j.jflm.2023.102619
*Peersman C, Schulze C, Rashid A, Brennan M & Fischer C 2016. iCOP: Live forensics to reveal previously unknown criminal media on P2P networks. Digital Investigation 18: 50–64. https://doi.org/10.1016/j.diin.2016.07.002
*Pereira M, Dodhia R, Anderson H & Brown R 2021. Metadata-based detection of child sexual abuse material. https://doi.org/10.48550/arXiv.2010.02387
*Polastro M de C & Eleuterio PM da S 2012. A statistical approach for identifying videos of child pornography at crime scenes. 2012 Seventh International Conference on Availability, Reliability and Security: 604–612. https://doi.org/10.1109/ARES.2012.71
*Puentes J et al. 2023. Guarding the guardians: Automated analysis of online child sexual abuse. https://doi.org/10.48550/arXiv.2308.03880
*Razi A et al. 2023. Sliding into my DMs: Detecting uncomfortable or unsafe sexual risk experiences within Instagram direct messages grounded in the perspective of youth. Proceedings of the ACM on Human-Computer Interaction 7: article 89. https://doi.org/10.1145/3579522
*Rodríguez JI, Durán SR, Díaz-López D, Pastor-Galindo J & Mármol FG 2020. C3-Sex: A conversational agent to detect online sex offenders. Electronics 9(11): 1779. https://doi.org/10.3390/electronics9111779
*Rondeau J, Deslauriers D, Howard III T & Alvarez M 2022. A deep learning framework for finding illicit images/videos of children. Machine Vision and Applications 33(5): 66. https://doi.org/10.1007/s00138-022-01318-6
*Sae-Bae N, Sun X, Sencar HT & Memon ND 2014. Towards automatic detection of child pornography. 2014 IEEE International Conference on Image Processing (ICIP): 5332–5336. https://doi.org/10.1109/ICIP.2014.7026079
*Seedall M, MacFarlane K & Holmes V 2019. SafeChat system with natural language processing and deep neural networks. https://sure.sunderland.ac.uk/id/eprint/10968/
*Seigfried-Spellar KC et al. 2019. Chat analysis triage tool: Differentiating contact-driven vs. fantasy-driven child sex offenders. Forensic Science International 297: e8–e10. https://doi.org/10.1016/j.forsciint.2019.02.028
Singh S & Nambiar V 2024. Role of artificial intelligence in the prevention of online child sexual abuse: A systematic review of literature. Journal of Applied Security Research: 1–42. https://doi.org/10.1080/19361610.2024.2331885
Thiel D, Stroebel M & Portnoff R 2023. Generative ML and CSAM: Implications and mitigations. Internet Observatory Cyber Policy Center, Stanford. https://fsi.stanford.edu/publication/generative-ml-and-csam-implications-and-mitigations
Thorn 2024a. Introducing Safer Predict: Using the power of AI to detect child sexual abuse and exploitation online. https://www.thorn.org/blog/introducing-safer-predict-using-the-power-of-ai-to-detect-child-sexual-abuse-and-exploitation-online/
Thorn 2024b. Youth perspectives on online safety, 2023: An annual report of youth attitudes and experiences. https://www.thorn.org/research/library/2023-youth-perspectives-on-online-safety/
*Ulges A & Stahl A 2011. Automatic detection of child pornography using color visual words. 2011 IEEE International Conference on Multimedia and Expo: 1–6. https://doi.org/10.1109/ICME.2011.6011977
*Westlake B et al. 2022. Developing automated methods to detect and match face and voice biometrics in child sexual abuse videos. Trends & issues in crime and criminal justice no. 648. Canberra: Australian Institute of Criminology. https://doi.org/10.52922/ti78566
Acknowledgements: This research was conducted as part of the National Office for Child Safety’s Child Safety Research Agenda.