
Role of disruptive technology in preventing violent extremism

Sajid Amit & Lumbini Barua | Wednesday, 13 November 2019


For many Bangladeshis, the spectre of terrorism appeared to be a remote phenomenon until it hit close to home with the Holey Artisan Bakery (HAB) attack of July 2016. It was the first instance of a large-scale, highly coordinated and well-planned attack by militants in the heart of Dhaka city. The city, and the country at large, was gripped by fear, which faded only gradually as the government launched a successful crackdown, arresting potential perpetrators, busting militant dens and hide-outs, and foiling further terrorist attempts. Bangladesh has also been internationally recognised for the government's successful counter-terrorism (CT) efforts.
That said, both common sense and empirical evidence suggest that a nation once afflicted by the malaise of violent extremism cannot afford to be complacent or to assume that extremism can be extinguished once and for all. While the government continues to be watchful, it is important that more and more private sector initiatives take root, aimed at preventing and countering this malaise.
At a time of increasingly smart technologies and global interconnectedness through social media, extremist groups have a growing number and variety of channels with which to disseminate propaganda and recruit people to their cause. The fight against radicalisation and violent extremism therefore ought to shift to the digital space as well.
Recently, we conducted a study to map disruptive digital technologies deployed globally for preventing or countering violent extremism (P/CVE). These technologies attempt to identify, censor and remove extremist content online; posit counter-narratives to debunk extremist ones; and create platforms to connect P/CVE stakeholders.
For instance, in the United States, an enterprising consortium of researchers, a communications company and a technology company has developed an initiative called the "Redirect Method." The Redirect Method identifies persons actively searching for content shared by IS and redirects them to a YouTube playlist of videos debunking radical ideologies. The interesting angle is that the Redirect Method avoids censoring any content. The pilot phase of this project generated 500,000 minutes of video views, with an average viewing time of 8 minutes 20 seconds.
It is worth noting that the problem with censoring extremist content is that the removed content simply gets uploaded again from a different website, using different online identities. Recently, however, a work-around to this has emerged.
In 2018, Dr. Hany Farid, chair of the Computer Science Department at Dartmouth College, New Hampshire, United States, developed a robust algorithm-based technology named "eGLYPH." First, he created a database of digital signatures known as "hashes," derived from hundreds of videos and keywords related to IS. Next, a web crawler searches the web for any content that matches the hashes in the database. It does this every 20 minutes, around the clock, rendering it virtually impossible to re-upload any version of already-flagged content.
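The core idea of hash-based flagging can be sketched in a few lines of Python. This is a simplified illustration, not eGLYPH itself: it uses an exact cryptographic hash (SHA-256), which only catches byte-identical re-uploads, whereas eGLYPH uses robust hashing that also survives re-encoding, cropping and other edits. The function names and sample data below are hypothetical.

```python
import hashlib

def content_hash(data: bytes) -> str:
    """Compute a digital signature ('hash') for a piece of content."""
    return hashlib.sha256(data).hexdigest()

# Step 1: build a database of hashes from already-flagged content.
flagged_content = [b"flagged-video-bytes-1", b"flagged-video-bytes-2"]
hash_database = {content_hash(item) for item in flagged_content}

# Step 2: a crawler checks each newly seen upload against the database.
def is_flagged(upload: bytes) -> bool:
    """True if the upload matches a known signature in the database."""
    return content_hash(upload) in hash_database
```

A re-upload of known content (`is_flagged(b"flagged-video-bytes-1")`) returns True, while unseen content returns False. The hard part a real system must solve, and which this sketch omits, is making the signature stable under the small modifications uploaders use to evade detection.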
Popular social media platforms have also been quite active in P/CVE programming, and in coordinated ways. In July 2017, Facebook, Microsoft, Twitter, YouTube and several other large technology companies came together to form a network called the "Global Internet Forum to Counter Terrorism" (GIFCT) to counter violent extremist propaganda. The network uses artificial intelligence to identify content that can incite people to violence, often deactivating accounts more proactively than was practised earlier. In 2018, Facebook deactivated 583 million accounts that it suspected of behaviour harmful to the community.
In terms of mobile apps, there are several examples of effective ones from different parts of the world. There is an app named "UNDP Africa Toolkit: Tackling Extremist Narratives," which aims to guide civil society organisations in conducting campaigns to tackle extremist narratives. There is an app called "MASAR" by Hedayah, the well-known international center of excellence for CVE, which connects CVE practitioners so that they can share knowledge and best practices. There is also an interesting UK-based app called AVE ("Against Violent Extremism"), which connects real-life stories of former extremists and survivors with "at-risk" youth.
In Bangladesh, there are a handful of interesting initiatives worth pointing out. There is a UNDP project called Partnerships for a Tolerant and Inclusive Bangladesh (PTIB), which has supported ICT-based interventions and startups interested in promoting P/CVE. Moreover, USAID's Obirodh Project has actively tried to invest in entrepreneurs interested in using technology for P/CVE. Furthermore, an NGO named Young Power in Social Action (YPSA) has launched an app known as the "YPSA CVE Initiative" in Cox's Bazar to combat VE through community engagement. This app was created to raise awareness of VE and promote youth engagement. Last but not least, there is an interesting app created by the Counter Terrorism and Transnational Crime (CTTC) unit of the Dhaka Metropolitan Police (DMP), known as "Hello City." Developed in 2016, the app can be used to report incidents of extremism, cybercrime and other crimes, while the informant's identity remains protected. There are also other apps and online tools being built by Bangladeshi entrepreneurs and stakeholders.
While such initiatives are highly laudable, there is space for much more. For instance, on the curricular front, research suggests that blended, multimedia-based and interactive courses that offer online and offline learning opportunities for youth can be effective. The interactive and multimedia dimensions are important to hold youth interest. Educational institutions can be deployed to roll out such courses en masse. There also needs to be a greater number of centres that can combat fake news and teach people to spot it. Fake news has already been identified as a driver of intercommunal violence and radicalisation in Bangladesh. Beyond curricular interventions and fake news awareness, there is the entire area of internet safety, which can be explored in depth and which offers a holistic perspective on digital literacy.
While tech companies, communications companies, educational institutions, government offices and various private sector entities engage in this space, it is also important not to ignore religious leaders. It is possible to include religious leaders in technology-based interventions, for instance, by forming networks of tolerant preachers, which can in turn serve as a platform for other aspiring preachers and theologians. Ultimately, the realm of possible interventions is as unlimited as the creativity and resourcefulness of practitioners.
..............................................
Sajid Amit is Director, Center for Enterprise and Society (CES), University of Liberal Arts Bangladesh (ULAB).
[email protected]
Lumbini Barua is Research Associate, Center for Enterprise and Society (CES), University of Liberal Arts Bangladesh (ULAB).
[email protected]