{"id":15517,"date":"2024-07-23T05:22:35","date_gmt":"2024-07-23T12:22:35","guid":{"rendered":"https:\/\/worldcampaign.net\/?p=15517"},"modified":"2024-07-23T05:22:36","modified_gmt":"2024-07-23T12:22:36","slug":"ai-generated-videos-of-child-sexual-abuse-a-stark-vision-of-the-future-internet-watch-foundation","status":"publish","type":"post","link":"https:\/\/worldcampaign.net\/?p=15517","title":{"rendered":"&#8220;AI-generated videos of child sexual abuse a \u2018stark vision of the future\u2019&#8221;, Internet Watch Foundation"},"content":{"rendered":"\n<p>Published:&nbsp;&nbsp;Mon 22 Jul 2024<\/p>\n\n\n\n<ul>\n<li><strong>Real victims\u2019 imagery used in highly realistic \u2018deepfake\u2019 AI-generated films<\/strong><\/li>\n\n\n\n<li><strong>First fully synthetic child sexual abuse videos identified<\/strong><\/li>\n\n\n\n<li><strong>Offenders share AI models for more than 100 child sexual abuse victims<\/strong><\/li>\n\n\n\n<li><strong>Instances of more extreme AI-generated images on the rise<\/strong><\/li>\n<\/ul>\n\n\n\n<p>AI-generated imagery of child sexual abuse has progressed at such a \u201cfrightening\u201d rate that the Internet Watch Foundation (IWF) is now seeing the first convincing examples of AI videos depicting the sexual abuse of children.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">These incredibly realistic \u2018deepfake\u2019, or partially synthetic, videos of child rape and torture are made by offenders using AI tools that add the face or likeness of another person to a real video.<\/h2>\n\n\n\n<p>The IWF, the UK\u2019s front line against online child sexual abuse, was among the first to raise the alarm last year over how realistic AI-generated images of child sexual abuse have become and the threat that misuse of the technology poses to both existing victims of child sexual abuse and to potential new victims.<\/p>\n\n\n\n<p>Now,&nbsp;<a 
href=\"https:\/\/www.iwf.org.uk\/about-us\/why-we-exist\/our-research\/how-ai-is-being-abused-to-create-child-sexual-abuse-imagery\/\">a new report update<\/a>&nbsp;from the IWF shows how the pace of AI development has not slowed as offenders are using better, faster and more accessible tools to generate new criminal images and videos, some of which are being found on the clear web.<\/p>\n\n\n\n<p>Disturbingly, the ability to make any scenario a visual reality is welcomed by offenders, who crow in one dark web forum about potentially being able to \u201c\u2026create any child porn<sup>1<\/sup>&nbsp;we desire\u2026 in high definition\u201d.&nbsp;<\/p>\n\n\n\n<p>In a&nbsp;<a href=\"https:\/\/www.iwf.org.uk\/about-us\/why-we-exist\/our-research\/how-ai-is-being-abused-to-create-child-sexual-abuse-imagery\/\">snapshot study<\/a>&nbsp;between March and April this year, the IWF identified nine deepfake videos on just one dark web forum dedicated to child sexual abuse material (CSAM) \u2013 none had been previously found when IWF analysts investigated the forum in October.<\/p>\n\n\n\n<p>Some of the deepfake videos feature adult pornography which is altered to show a child\u2019s face. Others are existing videos of child sexual abuse which have had another child\u2019s face superimposed.<\/p>\n\n\n\n<p>Because the original videos of sexual abuse are of real children, IWF analysts say the deepfakes are especially convincing.<\/p>\n\n\n\n<p>Free, open-source AI software appears to be behind many of the deepfake videos seen by the IWF. The methods shared by offenders on the dark web are similar to those used to generate deepfake adult pornography.<\/p>\n\n\n\n<p>The report also underscores how fast the technology is improving in its ability to generate fully synthetic AI videos of CSAM. One \u201cshocking\u201d 18-second fully AI-generated video, found by IWF analysts on the clear web, shows an adult male raping a girl who appears about 10 years old. 
The video flickers and glitches but IWF analysts describe the activity as clear and continuous.<\/p>\n\n\n\n<p>While these types of videos are not yet sophisticated enough to pass for real videos of child sexual abuse, analysts say this is the \u2018worst\u2019 that fully synthetic video will ever be. Advances in AI will soon render more lifelike videos in the same way that still images have become photo-realistic.<\/p>\n\n\n\n<p>Since April last year, the IWF has seen a steady increase in the number of reports of generative AI content. Analysts assessed 375 reports over a 12-month period, 70 of which were found to contain criminal AI-generated images of the sexual abuse of children<sup>2<\/sup>. These reports came almost exclusively from the clear web.<\/p>\n\n\n\n<p>Many of the images were being sold by offenders on the clear web in place of \u2018real\u2019 CSAM. These sales took place on dedicated commercial sites and on forums which include links to subscription-based file hosting services.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/www.iwf.org.uk\/media\/petajsl5\/susie-hargreaves-obe-iwf.jpg?mode=max&amp;width=773&amp;rnd=133656933176970000\" alt=\"Susie Hargreaves OBE, IWF CEO\"\/><figcaption class=\"wp-element-caption\">Susie Hargreaves OBE, IWF CEO<\/figcaption><\/figure>\n\n\n\n<p><strong>Internet Watch Foundation CEO Susie Hargreaves OBE<\/strong>&nbsp;said:&nbsp;<em>\u201cGenerative AI technology is evolving at such a pace that the ability for offenders to now produce, at will, graphic videos showing the sexual abuse of children is quite frightening.&nbsp;<\/em><\/p>\n\n\n\n<p><em>\u201cThe fact that some of these videos are manipulating the imagery of known victims is even more horrifying. 
Survivors of some of the worst kinds of trauma now have no respite, knowing that offenders can use images of their suffering to create any abuse scenario they want.<\/em><\/p>\n\n\n\n<p><em>\u201cWithout proper controls, generative AI tools provide a playground for online predators to realise their most perverse and sickening fantasies. Even now, the IWF is starting to see more of this type of material being shared and sold on commercial child sexual abuse websites on the internet.&nbsp;<\/em><\/p>\n\n\n\n<p><em>\u201cThe right decisions made now \u2013 by government, the tech industry and civil society \u2013 to ensure that the safety of children is a priority in the design of AI tools could stave off the devastating impact that misuse of this technology will have on global child protection.\u201d&nbsp;&nbsp;<\/em><\/p>\n\n\n\n<p>The snapshot study also assessed more than 12,000 new AI-generated images posted to a dark web forum over a month. Most of these (90%) were so convincing that they could be assessed under the same law as real CSAM<sup>3<\/sup>. Analysts confirmed that more than 3,500 images were criminal and depicted the sexual abuse of children, the majority of whom were girls.<\/p>\n\n\n\n<p>Analysts further found that the severity of the AI-generated child sexual abuse imagery had increased since they first investigated the dark web forum. More than 955 pseudo-photographs (32%) were graded as Category A \u2013 depicting penetrative sex, bestiality or sadism \u2013 an increase of 10 percentage points<sup>4<\/sup>.<\/p>\n\n\n\n<p><strong>IWF Internet Content Analyst Alex<\/strong><sup>5<\/sup>&nbsp;said:&nbsp;<em>\u201cThis is an indication of rapid advances in technology and expertise. Previously, some AI-generated scenarios were difficult to portray with much realism, but now we are seeing offenders experiencing success generating more extreme, \u2018hardcore\u2019, pseudo-photographs of child sexual abuse. 
These complex scenes usually show sexual penetration and involve more than one person.&nbsp;<\/em><\/p>\n\n\n\n<p><em>\u201cSo-called deepfake videos, which now use AI technology to alter existing imagery, can also be very difficult to separate from \u2018real\u2019 child sexual abuse material, even under an expert analyst eye. Sometimes it is because we already recognise the victim that we can determine the difference.<\/em><\/p>\n\n\n\n<p><em>\u201cIt\u2019s no matter that the fully AI-generated videos that we are now seeing are at a rudimentary stage, they are a stark vision of the future, are still criminal and shocking to view.\u201d<\/em><\/p>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/www.iwf.org.uk\/media\/itvphfol\/deborah-dennis.jpg?mode=max&amp;width=773&amp;rnd=133658755007330000\" alt=\"Deborah Denis, CEO of Lucy Faithfull Foundation\"\/><figcaption class=\"wp-element-caption\">Deborah Denis, CEO of Lucy Faithfull Foundation<\/figcaption><\/figure>\n\n\n\n<p><strong>Deborah Denis, CEO of Lucy Faithfull Foundation, said:&nbsp;<\/strong><em>\u201cAdults viewing and sharing sexual images of children is a major problem and one that AI is making worse. AI and its capabilities are rapidly evolving and there is an unacceptable lack of safeguards within the technology which allows online child sex offenders to exploit it every day. It\u2019s vital that tech companies and politicians do more to address these dangers as a matter of urgency.<\/em><\/p>\n\n\n\n<p><em>\u201cOur research shows there are serious knowledge gaps amongst the public regarding AI &#8211; specifically its ability to cause harm to children. 
The reality is that people are using this new, and unregulated, technology to create some of the worst sexual images of children, as well as so-called \u2019nudified\u2019 images of real children, including children who have been abused.<\/em><\/p>\n\n\n\n<p><em>\u201cPeople must know that AI is not an emerging threat \u2013 it\u2019s here, now. We need the public to be absolutely clear that making and viewing sexual images of under-18s, whether AI-generated or not, is illegal and causes very serious harm to real children across the world.\u201d<\/em><\/p>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/www.iwf.org.uk\/media\/o4jlpzeb\/vicki-green-mff-2023.jpg?mode=max&amp;width=773&amp;rnd=133656966628070000\" alt=\"Victoria Green, Marie Collins Foundation Chief Executive \"\/><figcaption class=\"wp-element-caption\">Victoria Green, Marie Collins Foundation Chief Executive&nbsp;<\/figcaption><\/figure>\n\n\n\n<p><strong>Marie Collins Foundation Chief Executive Victoria Green<\/strong>&nbsp;said:&nbsp;<em>\u201cThe impact of AI generated child sexual abuse images on victims and survivors cannot be overstated. When images or videos of child sexual abuse are created, the permanency and lack of control over who sees them creates significant and long-term impacts for those with lived experience. They are revictimised every time these are viewed, and this is no different with AI images.<\/em><\/p>\n\n\n\n<p><em>\u201cTo know that offenders can now use easily available AI technology to create and distribute further content of their abuse is not only sickening for victims and survivors but causes immense anxiety.<\/em><\/p>\n\n\n\n<p><em>\u201cWe urgently need big tech and government to take a joint approach to regulate the use of AI tools. 
Victims and survivors have a right not to live in fear of revictimisation by technology which should be safe by design.\u201d<\/em><\/p>\n\n\n\n<p>Offenders on the dark web forum investigated by the IWF openly discussed and shared advice on how to use generative AI technology to develop child sexual abuse imagery.<\/p>\n\n\n\n<p>Step-by-step instructions are given for offenders to make their own \u2018child porn<sup>1<\/sup>\u2019, and requests are made for fine-tuned CSAM models of particular, named victims or celebrities. IWF analysts have recognised one victim,&nbsp;<a href=\"https:\/\/www.iwf.org.uk\/news-media\/news\/the-horror-and-the-heartbreak-how-one-child-sexual-abuse-survivor-s-torment-will-never-end-thanks-to-ai\/\">\u2018Olivia\u2019<\/a>, whose story was told in the IWF\u2019s 2018 annual report.<\/p>\n\n\n\n<p>One offender shared links to fine-tuned models for 128 different named victims of child sexual abuse.<\/p>\n\n\n\n<p>However, both tutorials for generating AI CSAM and the making of fine-tuned AI CSAM models currently remain legal. The UK prohibition on paedophile manuals does not include pseudo-photographs of children, and a gap in UK law means that offenders cannot be prosecuted for making AI models fine-tuned on CSAM.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Notes<\/h3>\n\n\n\n<p><sup>1&nbsp;<\/sup>The IWF uses the term child sexual abuse to reflect the gravity of the images and videos we deal with. \u2018Child pornography\u2019, \u2018child porn\u2019 and \u2018kiddie porn\u2019 are not acceptable descriptions. A child cannot consent to their own abuse. 
Read&nbsp;<a href=\"https:\/\/www.iwf.org.uk\/about-us\/our-campaigns\/no-such-thing\/\">more<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Published:&nbsp;&nbsp;Mon 22 Jul 2024 AI-generated imagery of child sexual abuse has progressed at such a \u201cfrightening\u201d rate that the Internet Watch Foundation (IWF) is now seeing the first convincing examples of AI videos depicting the sexual abuse of children. These incredibly realistic \u2018deepfake\u2019, or partially synthetic, videos of child rape and torture are made by [&hellip;]<\/p>\n","protected":false},"author":1001004,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[53],"tags":[],"_links":{"self":[{"href":"https:\/\/worldcampaign.net\/index.php?rest_route=\/wp\/v2\/posts\/15517"}],"collection":[{"href":"https:\/\/worldcampaign.net\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/worldcampaign.net\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/worldcampaign.net\/index.php?rest_route=\/wp\/v2\/users\/1001004"}],"replies":[{"embeddable":true,"href":"https:\/\/worldcampaign.net\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=15517"}],"version-history":[{"count":1,"href":"https:\/\/worldcampaign.net\/index.php?rest_route=\/wp\/v2\/posts\/15517\/revisions"}],"predecessor-version":[{"id":15518,"href":"https:\/\/worldcampaign.net\/index.php?rest_route=\/wp\/v2\/posts\/15517\/revisions\/15518"}],"wp:attachment":[{"href":"https:\/\/worldcampaign.net\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=15517"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/worldcampaign.net\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=15517"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/worldcampaign.net\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=15517"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel
}","templated":true}]}}