"
data-check-event-based-preview=""
data-is-vertical-video-embed="false"
data-network-id=""
data-publish-date="2019-03-16T22:26:53Z"
data-video-section="politics"
data-canonical-url="https://www.cnn.com/videos/politics/2019/03/16/white-supremacy-white-isis-new-zealand-reaction-sot-wajahat-ali-nr-vpx.cnn"
data-branding-key="key-moments-from-air"
data-video-slug="white supremacy white isis new zealand reaction sot wajahat ali nr vpx"
data-first-publish-slug="white supremacy white isis new zealand reaction sot wajahat ali nr vpx"
data-video-tags="ana cabrera,companies,continents and regions,discrimination,international relations and national security,misc people,national security,new york times co,new zealand,oceania,racism and racial discrimination,societal issues,society,terrorism,terrorism and counter-terrorism,terrorist attacks,unrest, conflicts and war,isis,misc organizations"
data-details="">
"
data-check-event-based-preview=""
data-is-vertical-video-embed="false"
data-network-id=""
data-publish-date="2019-03-16T22:26:53Z"
data-video-section="politics"
data-canonical-url="https://www.cnn.com/videos/politics/2019/03/16/white-supremacy-white-isis-new-zealand-reaction-sot-wajahat-ali-nr-vpx.cnn"
data-branding-key="key-moments-from-air"
data-video-slug="white supremacy white isis new zealand reaction sot wajahat ali nr vpx"
data-first-publish-slug="white supremacy white isis new zealand reaction sot wajahat ali nr vpx"
data-video-tags="ana cabrera,companies,continents and regions,discrimination,international relations and national security,misc people,national security,new york times co,new zealand,oceania,racism and racial discrimination,societal issues,society,terrorism,terrorism and counter-terrorism,terrorist attacks,unrest, conflicts and war,isis,misc organizations"
data-details="">
"
data-check-event-based-preview=""
data-is-vertical-video-embed="false"
data-network-id=""
data-publish-date="2023-05-30T19:56:54Z"
data-video-section="business"
data-canonical-url="https://www.cnn.com/videos/business/2023/05/30/artificial-intelligence-pose-risk-of-extinction-humanity-salin-sot-nc-vpx.cnn"
data-branding-key=""
data-video-slug="DO NOT USE artificial intelligence pose risk of extinction humanity salin sot nc vpx"
data-first-publish-slug="DO NOT USE artificial intelligence pose risk of extinction humanity salin sot nc vpx"
data-video-tags=""
data-details="">
Video Ad Feedback
Experts warn AI could pose 'extinction' risk for humanity
"
data-check-event-based-preview=""
data-is-vertical-video-embed="false"
data-network-id=""
data-publish-date="2023-02-14T10:58:28Z"
data-video-section="business"
data-canonical-url="https://www.cnn.com/videos/business/2023/02/14/flirt-artificial-intelligence-ai-dating-apps-hinge-tinder-orig.cnn-business"
data-branding-key="nightcap"
data-video-slug="flirt artificial intelligence ai dating apps hinge tinder orig"
data-first-publish-slug="flirt artificial intelligence ai dating apps hinge tinder orig"
data-video-tags="artificial intelligence,business and industry sectors,business, economy and trade,cnn,companies,computer science and information technology,domestic alerts,domestic-business,domestic-health and science,iab-artificial intelligence,iab-business and finance,iab-computing,iab-industries,iab-software and applications,iab-technology & computing,iab-technology industry,international alerts,international-business,international-health and science,mobile apps,mobile technology,openai,software and applications,technology,warnermedia"
data-details="">
Video Ad Feedback
CNN tried an AI flirt app. It was shockingly pervy
"
data-check-event-based-preview=""
data-is-vertical-video-embed="false"
data-network-id=""
data-publish-date="2023-02-11T02:28:49Z"
data-video-section="business"
data-canonical-url="https://www.cnn.com/videos/business/2023/02/11/deepfake-newscast-ai-chinese-messaging-wang-pkg-ac360-vpx.cnn"
data-branding-key=""
data-video-slug="deepfake newscast AI chinese messaging wang pkg ac360 vpx"
data-first-publish-slug="deepfake newscast AI chinese messaging wang pkg ac360 vpx"
data-video-tags=""
data-details="">
Video Ad Feedback
These newscasters you may have seen online are not real people
"
data-check-event-based-preview=""
data-is-vertical-video-embed="false"
data-network-id=""
data-publish-date="2023-01-28T11:20:56Z"
data-video-section="business"
data-canonical-url="https://www.cnn.com/videos/business/2023/01/26/nightcap-chatgpt-students-clip-orig-nb.cnn"
data-branding-key="nightcap"
data-video-slug="nightcap chatgpt students clip orig nb"
data-first-publish-slug="nightcap chatgpt students clip orig nb"
data-video-tags="artificial intelligence,business and industry sectors,business, economy and trade,companies,computer science and information technology,domestic alerts,domestic-business,domestic-health and science,education,education systems and institutions,iab-artificial intelligence,iab-business and finance,iab-computing,iab-education,iab-education industry,iab-industries,iab-technology & computing,iab-technology industry,international alerts,international-business,international-health and science,openai,primary and secondary education,teachers and teaching,technology"
data-details="">
Video Ad Feedback
Hear why this teacher says schools should embrace ChatGPT, not ban it
"
data-check-event-based-preview=""
data-is-vertical-video-embed="false"
data-network-id=""
data-publish-date="2023-01-24T14:05:56Z"
data-video-section="business"
data-canonical-url="https://www.cnn.com/videos/business/2023/01/24/what-is-chatgpt-cnntm-pkg-yurkevich-contd-vpx.cnn"
data-branding-key=""
data-video-slug="what is chatgpt cnntm pkg yurkevich contd vpx"
data-first-publish-slug="what is chatgpt cnntm pkg yurkevich contd vpx"
data-video-tags="artificial intelligence,business and industry sectors,business, economy and trade,companies,computer science and information technology,domestic alerts,domestic-business,domestic-health and science,iab-artificial intelligence,iab-business and finance,iab-computing,iab-industries,iab-technology & computing,iab-technology industry,international alerts,international-business,international-health and science,openai,technology"
data-details="">
Video Ad Feedback
He loves artificial intelligence. Hear why he is issuing a warning about ChatGPT
Among the many tragedies of the massacre at two New Zealand mosques on Friday is a bitter irony: The terrorist who killed at least 50 people in an Islamophobic attack resembled, in many ways, a member of ISIS. Had his life gone differently in some way, he might well have ended up as one, killing people somewhere else in its name. The types of extremism and hatred are of course different. But they have at least one thing in common: the internet as a tool of radicalization.
There is still much we don’t know about the suspect and his background. But before anything at all was known about him, anyone who has studied or covered extremism and these kinds of attacks could have given you an educated guess about what kind of person he was: Male. Probably in his 20s. Decent chance of at least a minor criminal record. More than likely a history of hatred toward or violence against women. Oh, and one more thing — probably spent a fair amount of time on the internet.
People could easily become radicalized before social media. Many are still radicalized without it. But social media, often in combination with other factors, has proven itself an efficient radicalizer, in part because it allows for the easy formation of communities, and in part because of its algorithms, which are designed to convince people to stay just a little longer, watch one more video, click one more thing, and generate a little more advertising revenue.
The recommendations that YouTube provides, for instance, have been shown to push users toward extreme content. Someone who comes to the site to watch a video about something in the news can quickly find themselves watching a conspiracy theory clip instead. (In January, YouTube said it was taking steps to remedy this.) A few years ago, someone looking for information about Islam could soon find themselves listening to a radical preacher.
Combine those algorithms with men who are disaffected, who may feel that the world owes them more, and you have a recipe for creating extremism of any stripe.
“They’re picking up an ideology that helps them justify their rage, their disappointment, and it’s something available,” Jessica Stern, a research professor at Boston University’s Pardee School of Global Studies and the co-author of “ISIS: The State of Terror,” told CNN Business Friday. “Terrorism runs in fads. We noticed that people were picking up the ISIS ideology who weren’t even Muslim, they were converting to Islam. The ISIS ideology was an attractive way for some of these men to express their rage and disappointment. This is another ideology that is becoming very popular, it’s another fad.”
For all the much-deserved criticism they've gotten recently over all the things they've failed to act upon, the social networks did step up and take real and impressive action when faced with a deluge of ISIS supporters and content.
“The issue on mainstream sites is for the most part there’s been an aggressive takedown” of ISIS-related content, Seamus Hughes, the deputy director of the Program on Extremism at George Washington University, said. “That same dynamic hasn’t happened when it comes to white supremacy.”
The companies could take similar action against white supremacists now. Indeed, they could go on forever like that, playing whac-a-mole with each new movement that pops up and begins radicalizing their users, moving against it only after enough people have been killed. It would be easier for them to do that than to deal with the underlying problem: the algorithms designed to keep people around.
“It makes sense from a marketing perspective; if you like Pepsi then you’re going to watch more Pepsi videos… but you take that to the logical extreme with white supremacy videos,” Hughes said. “They’re going to have to figure out how to not completely scrap a system that has brought them hundreds of millions of dollars of ad revenue while not also furthering someone’s radicalization or recruitment.”
Perhaps the most disheartening aspect of this is that the companies have been told, over and over again, that they have a problem. Ben Collins, a reporter with NBC News, tweeted Friday, “Extremism researchers and journalists (including me) warned the company in emails, on the phone, and to employees’ faces after the last terror attack that the next one would show signs of YouTube radicalization again, but the outcome would be worse. I was literally scoffed at.”
So what should the platforms do now?
Asked that question, Bill Braniff, the director of the National Consortium for the Study of Terrorism and Responses to Terrorism (START) and a professor of the practice at the University of Maryland, said, “What I believe we should be asking them to do is to continue to minimize the salience or the reach of violent extremist propaganda that calls for violence… but not to limit themselves to just content takedowns as the way to do that. What happens when [a] large platform takes down this content or these views is that the content just shifts to smaller platforms. … Maybe fewer people will be exposed over time, and that’s a good thing, but that’s not the same as a comprehensive solution.”
Content takedowns alone can both contribute to a persecution narrative and drive people to smaller, more radical sites, Braniff noted. And he thinks that means giving up an opportunity to use the algorithms to redirect, rather than reinforce.
“We know that people… can actually be addressed through counseling [and] mentorship,” he said. “If instead of directing people who might be flirting with extremism to support, if you censor them and remove them from these platforms you lose… the ability to provide them with an off-ramp.”
While noting that platforms should still take down content that explicitly calls for violence, which also violates their terms of service, Braniff said, “There’s some content that doesn’t violate the terms of use, and so the question is, can you make sure that information is contextualized with videos before and after it on the feed?”
The comprehensive solution he sees is a change to the algorithms, so that they could point people to differing views or, in some cases, to support such as counseling.
“Algorithms can either foster groupthink and reinforcement or they can drive discussion,” he said. “Right now the tailored content tends to be, ‘I think you’re going to like more of the same,’ and unfortunately that’s an ideal scenario for not just violent extremism but polarization … We’re only sharing subsets of information and removing the middle ground, the place where we come together to discuss different ideas… [a] massive part of violent extremism is polarization, and it’s really dangerous.”
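To make that contrast concrete, here is a minimal, purely illustrative sketch in Python, not any platform's actual system, of the two approaches Braniff describes: a ranker that optimizes only for "more of the same" engagement, and one that caps how much of the feed can echo the viewer's history and surfaces an off-ramp such as counseling when flagged content appears. Every name, field, and number in it (the Video class, the similarity and watch-time scores, the "counseling" off-ramp) is invented for the example.

# Illustrative sketch only: a toy contrast between "more of the same" ranking
# and ranking that mixes in differing views and an off-ramp. All data and
# scores are invented; no real platform works exactly like this.
from dataclasses import dataclass


@dataclass
class Video:
    title: str
    topic: str                    # e.g. "news", "conspiracy", "counseling"
    similarity: float             # closeness to the viewer's watch history, 0-1
    predicted_watch_time: float   # minutes the model expects the viewer to watch


def engagement_only_rank(candidates):
    """'More of the same': sort purely by expected engagement, which tends to
    reward content ever closer to, and often more extreme than, what came before."""
    return sorted(candidates,
                  key=lambda v: v.similarity * v.predicted_watch_time,
                  reverse=True)


def diversified_rank(candidates, flagged_topics=("conspiracy",), offramp_topic="counseling"):
    """The alternative Braniff points toward: still use engagement, but cap the
    echo chamber, keep differing views in the feed, and insert an off-ramp
    when flagged content is present."""
    ranked = engagement_only_rank(candidates)
    echo = [v for v in ranked if v.similarity > 0.8]
    diverse = [v for v in ranked if v.similarity <= 0.8]
    feed = echo[:1] + diverse + echo[1:]   # reorder the feed rather than amplify the echo
    if any(v.topic in flagged_topics for v in ranked):
        feed.insert(1, Video("Talk to someone (support resources)", offramp_topic, 0.0, 0.0))
    return feed


if __name__ == "__main__":
    pool = [
        Video("Mainstream news recap", "news", 0.5, 4.0),
        Video("Same-channel follow-up", "news", 0.9, 6.0),
        Video("Conspiracy deep dive", "conspiracy", 0.95, 12.0),
        Video("Opposing viewpoint panel", "news", 0.3, 5.0),
    ]
    print([v.title for v in engagement_only_rank(pool)])
    print([v.title for v in diversified_rank(pool)])

Run on this toy pool, the engagement-only feed serves the conspiracy clip followed by more of the same channel, while the diversified feed follows its top result with a support link and with differing views instead of a run of near-identical recommendations.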