Extremist Explorer gives researchers who study extremist groups, individuals, and actions a system that makes their work more efficient and collaborative. The tool collects data from a variety of social media sites, forums, and chatrooms, and algorithmically flags violent hate speech within that data. Integrated into Extremist Explorer are tools such as the research platform Spiderfoot, facial recognition, and a databasing function.
If you are a researcher or journalist who focuses on extremism, you may qualify to use this tool. Please email [email protected] for more information.
In partnership with Lambda School, Human Rights First is building a tracker that collects and visualizes law enforcement use-of-force incidents posted on social media. Incidents are identified by an artificial intelligence model and then approved manually by human reviewers. The tool gives journalists and the wider public a real-time, continuously updated database of law enforcement use of force across the United States. The underlying database will be available for any researcher to access.
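The flag-then-approve pipeline described above can be sketched in a few lines. This is a minimal illustration, not the project's actual code: the keyword heuristic stands in for the AI model, and all function names (`classify`, `triage`, `approve`) are hypothetical.

```python
def classify(post: str) -> float:
    """Stand-in for the AI model: score how likely a post reports a
    use-of-force incident (here, a trivial keyword heuristic)."""
    keywords = ("baton", "tear gas", "pepper spray", "shoved")
    return 1.0 if any(k in post.lower() for k in keywords) else 0.0

def triage(posts, threshold=0.5):
    """Posts scoring above the threshold enter a pending queue for
    manual review; everything else is discarded."""
    return [p for p in posts if classify(p) >= threshold]

def approve(pending, reviewer_decisions):
    """Only incidents a human reviewer approves reach the public database."""
    return [p for p, ok in zip(pending, reviewer_decisions) if ok]

posts = [
    "Officers used tear gas on the crowd downtown.",
    "Great weather at the rally today!",
]
pending = triage(posts)              # AI flags candidate incidents
database = approve(pending, [True])  # human reviewer confirms each one
```

The key design point is that the model only nominates candidates; nothing enters the database without a human decision.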
Video evidence of human rights violations and use of force is voluminous, and we are creating an AI model to help human rights activists quickly sift through the mountain of footage likely in their possession. Human Rights First is building a computer vision algorithm that identifies use of force within videos, along with use-of-force objects, such as guns and batons, used by law enforcement. The AI will also detect relationships between objects in videos (e.g., an officer with a baton on top of a protester).
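The relationship-detection step above can be illustrated with plain bounding-box geometry: once a detector has labeled the objects in a frame, overlap between boxes is one simple way to infer a relation such as "officer on top of protester". This is a hedged sketch under assumed conventions, not the project's model: the `(x1, y1, x2, y2)` box format and the class labels are illustrative.

```python
def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    def area(r):
        return (r[2] - r[0]) * (r[3] - r[1])
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def officer_on_protester(detections, min_iou=0.1):
    """Flag a frame if any officer-with-baton box overlaps a protester box.
    Detection dicts (label + box) mimic a generic detector's output."""
    officers = [d["box"] for d in detections if d["label"] == "officer_with_baton"]
    protesters = [d["box"] for d in detections if d["label"] == "protester"]
    return any(iou(o, p) >= min_iou for o in officers for p in protesters)

frame = [
    {"label": "officer_with_baton", "box": (40, 10, 80, 60)},
    {"label": "protester", "box": (50, 40, 90, 90)},
]
```

A production system would use a learned relation model rather than a fixed overlap threshold, but the pairing of detected boxes is the common starting point.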
This database tracks instances of COVID-19 in US Immigration and Customs Enforcement (ICE) detention centers across the country. The tool's mapping provides insight into how the pandemic infiltrated those facilities. The project also reports data on the administration of tests and tracks the lived experiences of detainees.