Nadia Chung, April 18

What happens when we give technology the power to magnify and hyper-exploit our biases? The question of these technologies outpacing their utility and perpetuating structural violence is no longer a dystopian hypothetical, but rather a terrifying reality of contemporary society. The roots of predictive policing can be traced back to early experimentation in the s.
However, empowered by the growth of artificial intelligence, this technology came into formal use in police departments. In using algorithms to analyze data and identify targets for police intervention, predictive policing aims to prevent crime by establishing when, where and by whom future crimes might occur.
I see four key issues with predictive policing: entrenching bias, inaccuracy, lack of transparency and human rights abuses. First, predictive policing further entrenches bias and prejudice in the criminal justice system.
This is, in part, the result of its fundamentally flawed methodology. The use of algorithms to analyze massive quantities of data creates feedback loops that reinforce themselves over time. When police are constantly sent to the same neighborhoods, those become the neighborhoods where police see the most crime, simply because they happen to be present. Thus, the algorithm learns to send even more police to those same neighborhoods. What is most terrifying is that none of these concerns are merely theoretical: they are founded in real-life statistics and in the disproportionate wrongful arrests of Black civilians.
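A minimal sketch of that feedback loop, under toy assumptions I am introducing purely for illustration: a handful of hypothetical neighborhoods with identical underlying crime rates, and a "predictive" step that simply allocates next week's patrols to wherever the most crime was recorded last week. None of the numbers come from any real system.

```python
import random

# Toy model: every neighborhood has the SAME underlying crime rate,
# but recorded crime depends on where patrols happen to be sent.
NEIGHBORHOODS = 5
TRUE_CRIME_RATE = 0.3          # identical everywhere (assumption)
patrols = [1] * NEIGHBORHOODS  # start with equal patrol levels
recorded = [0] * NEIGHBORHOODS

random.seed(0)

for week in range(50):
    # Crime is only "seen" where officers are present.
    for n in range(NEIGHBORHOODS):
        for _ in range(patrols[n]):
            if random.random() < TRUE_CRIME_RATE:
                recorded[n] += 1

    # The "predictive" step: send next week's patrols to wherever the
    # most crime was recorded, i.e. wherever patrols already were.
    total = sum(recorded) or 1
    patrols = [1 + round(9 * r / total) for r in recorded]

print("recorded crime by neighborhood:", recorded)
print("final patrol allocation:       ", patrols)
# Despite identical true crime rates, early random fluctuations get
# amplified: the most-policed neighborhood ends up labeled "highest crime".
```

Running the sketch shows the self-reinforcing pattern the essay describes: whichever neighborhood records a crime first attracts more patrols, which produces more recorded crime there, which attracts still more patrols.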
According to the ACLU, predictive policing fails to address white-collar crime, under-investigating and overlooking these offenses even though they occur at higher frequencies. Illegal activities such as cocaine use and prostitution are more likely to occur in boardrooms or fancy hotels than on the streets of poor neighborhoods. Consequently, the rich get away with whatever they please, while the poor are over-policed and often wrongfully accused of crimes.
The use of predictive policing programs is also plagued by extraordinarily high rates of false positives. Even if the vast majority of these false positives end with the suspect being released or found not guilty, sociological and criminological research has found that the mere process of being accused can lead to stigmatization and even the development of self-fulfilling prophecies.
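To see why false positives pile up, here is a back-of-the-envelope calculation. The 1% prevalence, 90% sensitivity and 10% false-positive rate are assumed numbers chosen only to illustrate the base-rate problem; they are not statistics from any real program.

```python
# Hypothetical figures, chosen only to illustrate the base-rate problem.
population     = 100_000   # people scored by the system
prevalence     = 0.01      # 1% will actually be involved in a targeted crime
sensitivity    = 0.90      # 90% of true offenders get flagged (assumed)
false_pos_rate = 0.10      # 10% of innocent people get flagged (assumed)

actual_offenders = population * prevalence
innocent_people  = population - actual_offenders

true_flags  = actual_offenders * sensitivity     # correctly flagged
false_flags = innocent_people * false_pos_rate   # wrongly flagged

precision = true_flags / (true_flags + false_flags)
print(f"flagged in total: {true_flags + false_flags:,.0f}")
print(f"wrongly flagged:  {false_flags:,.0f}")
print(f"share of flags that are correct: {precision:.1%}")
# With rare events, even a seemingly accurate model produces far more
# false alarms (9,900) than correct flags (900): roughly 8% precision.
```

Under these assumptions, more than nine out of ten people flagged by the system are innocent, which is exactly the dynamic that drives the stigmatization described above.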
Worse yet, the algorithm was found to be no more accurate than a coin flip once misdemeanors are taken into account. Even on its face, feeding bad data into the system undermines any potential accuracy. And taking predictive policing at its best-case scenario, if it somehow managed to be accurate, the methodology itself still fails.
This is because of a double bind: the more accurate the program is, the less tactical utility it offers, since an accurate forecast of where routine crime concentrates tells officers little they do not already know, while the rarer crimes that would make prediction genuinely useful are precisely where the forecasts break down. At that point, one must question whether the program is benefiting anyone at all. Predictive policing software is provided to police departments by private companies such as PredPol or Clearview. Immediately, this raises red flags, because private companies are inherently less accountable to the public and hold a greater capacity to shield information from outside scrutiny. The implications are potentially terrifying, as we have no way to check the programs that are constantly surveilling us.
This denies citizens privacy and arguably denies us a certain level of autonomy. It is also particularly contradictory for democratic countries to use this technology, because secrecy fundamentally prevents public participation and oversight.
Furthermore, this runs counter to democracy because government surveillance without due cause fosters distrust and disunity. The fourth key issue that I see with predictive policing is how it facilitates the propagation of human rights abuses. Consider the case of China, where predictive policing is fueling a crackdown on ethnic minorities and dissenters. Any sign of political disloyalty can be tracked through Wi-Fi activity, bank records, vehicle ownership and security cameras with facial recognition. By inputting this data into a predictive algorithm, the Chinese police have tortured and punished thousands of civilians.
In America, as of this year, only four cities have banned predictive policing, all of which are in California.
For every other city, predictive policing is completely legal and can be adopted by any police department at any moment. Looking toward the future of this technology, I find it difficult to picture a sequence of events where the algorithm sheds its methodological flaws and no longer harms innocent people. I can picture a world where predictive policing is repurposed and used exclusively to solve past crimes. However, in its current state, and in any adaptation that continues to aim at predicting the future, predictive policing is unjust.