Artificial Intelligence Meets Three-Fifths

With respect to incarceration ‘risk assessment’—along with matters like quality of health care provided African–Americans [17], suitability for mortgage lending, and so on—given the dominating context provided by the evolutionary descendant of slavery and Jim Crow, any reasonably sophisticated AI entity examining big data sets on individual African–Americans will recover the unbroken historical trajectory leading back to the foundational statement in the US Constitution that an enslaved person is to be counted as ‘three-fifths of a man’ for representational purposes.

For the US polity, this obscenity appears written—and repeatedly rewritten—in stone.

The above is quoted from “‘The names have changed, but the game’s the same’: artificial intelligence and racial policy in the USA” by R. Wallace, available through the National Library of Medicine’s National Center for Biotechnology Information.

The world is changing at an alarmingly rapid pace. What was trending two months ago is now obsolete. People no longer work at one company until retirement and then collect a pension that lets them live out the remainder of their lives comfortably. They change jobs quicker than a drunken square dance partner on their third round of do-si-do. Along with the change in jobs, relationships are just as fluid. The concept of lifelong friends now vanishes as quickly as a blocked post, gram, or tweet. Suddenly, haters appear with digital pitchforks in hand at the push of a button.

Introduce technology, from gaming to finely tuned masters of your favorite playlist, movies, and virtual reality goggles, and we all feel like Sonic the Hedgehog, going around the world in a blink. With text-to-speech, I sped through the ebooks in my reading room faster than a New York minute. Even the top entertainer is named Taylor Swift.

Life is good, right? Or do we need to pause amidst all of this swiftness before we begin to celebrate the tech revolution as the answer to everything?

Artificial intelligence (AI) is now here as our world continues to change. Yet amid all of this change, some things don’t change. Things such as bias, racism, and white supremacy. They exist in the hearts and minds of some who would want to turn back the hands of time. There was a time when blacks and whites were not allowed to drink from the same fountain or use the same bathroom. Numerous reports say the infrared technology that activates a squirt of soap to cleanse our hands in public restrooms does not work well for people with darker complexions. That would seem like a throwback to a different time in the modern world. Some would say, “That’s just silly, or at worst, just a coincidence,” but let’s look at facial recognition.

A December 19, 2019, Washington Post article by Drew Harwell, entitled “Federal study confirms racial bias of many facial-recognition systems, casts doubt on their expanding use,” says:

“The faces of African American women were falsely identified more often in the kinds of searches used by police investigators where an image is compared to thousands or millions of others in hopes of identifying a suspect.

Algorithms developed in the United States also showed high error rates for “one-to-one” searches of Asians, African Americans, Native Americans and Pacific Islanders. Such searches are critical to functions including cellphone sign-ons and airport boarding schemes, and errors could make it easier for impostors to gain access to those systems….

Women were more likely to be falsely identified than men, and the elderly and children were more likely to be misidentified than those in other age groups, the study found. Middle-aged white men generally benefited from the highest accuracy rates.”

Just imagine the nefarious activity that can develop out of these inaccuracies. For people of color, this echoes what was supposed to be a bygone era. The entryway into the world of AI presents ever more hurdles and pitfalls for already vulnerable communities. Though under constant attack, laws have been put in place for protected classes, including marginalized communities of color. Backdoor hacks that bypass those protections can be found in algorithms that use systems analysis and despicable Bell Curve-type input to dictate how these systems will communicate through and with minority populations. Racial bias that is illegal can be programmed right into AI. What segregationists Jesse Helms, George Wallace, Bull Connor, and many others could never pull off, developers of some of the newest AI technology have, whether purposefully or not.

“AI-generated images: Reinforcing racial and gender stereotypes

Bloomberg Graphics investigated AI bias using text-to-image conversion with Stable Diffusion, an open-source AI platform. The results were alarming, with the AI system exacerbating gender and racial stereotypes, surpassing those found in the real world. When prompted with terms like “CEO” or “prisoner,” the generated images consistently exhibited biases.

The investigation revealed underrepresentation of women and individuals with darker skin tones in high-paying job-related images, while overrepresentation occurred in low-paying job-related images. In searches related to crime, the AI disproportionately generated images of darker-skinned individuals, despite a more diverse prison population in reality.

These findings demonstrate that AI algorithms, driven by biased training data and human-programmed tendencies, reinforce societal prejudices rather than mitigating them.

Unveiling the roots of AI bias

The bias in AI systems can be traced back to their learning process, which relies on examples and data input. Humans play a pivotal role in shaping AI behavior, either intentionally or unintentionally, by providing data that may be biased or stereotypical. The AI then learns and reflects these biases in its results.

Reid Blackman, an expert in digital ethics, cited the case of Amazon’s AI resume-reading software, which unintentionally learned to reject all resumes from women. This example highlights how AI can inadvertently perpetuate discrimination if it learns from biased examples.

Addressing AI bias requires comprehensively examining AI systems’ data, machine learning algorithms, and other components. One crucial step is assessing training data for bias, ensuring that over- or underrepresented groups are appropriately accounted for.”

From Cryptopolitan, a story entitled “AI Bias: How Technology Reflects and Reinforces Prejudices”

by Glory Kaburu
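
The last paragraph of that excerpt points to one concrete, technical step: checking whether groups are over- or under-represented in a model’s training data before the model ever learns from it. As a rough illustration only, here is a minimal sketch in Python. The records, group names, and ten percent tolerance are all made up for the example and do not come from the article; the sketch simply counts each group’s share of a toy dataset and flags gaps against an expected share.

from collections import Counter

# Hypothetical training records for illustration only; a real audit would
# load the actual dataset the model is trained on.
training_records = [
    {"outcome": "hired", "group": "group_a"},
    {"outcome": "hired", "group": "group_a"},
    {"outcome": "hired", "group": "group_a"},
    {"outcome": "rejected", "group": "group_a"},
    {"outcome": "hired", "group": "group_b"},
    {"outcome": "rejected", "group": "group_b"},
]

# Assumed reference shares: what fraction of the data each group would hold
# if the data mirrored the relevant population.
expected_share = {"group_a": 0.5, "group_b": 0.5}

def representation_report(records, expected, tolerance=0.10):
    """Compare each group's share of the records against its expected share
    and flag groups whose gap exceeds the tolerance."""
    counts = Counter(record["group"] for record in records)
    total = sum(counts.values())
    report = {}
    for group, expected_frac in expected.items():
        actual_frac = counts.get(group, 0) / total
        gap = actual_frac - expected_frac
        if gap > tolerance:
            status = "over-represented"
        elif gap < -tolerance:
            status = "under-represented"
        else:
            status = "within tolerance"
        report[group] = (actual_frac, status)
    return report

for group, (share, status) in representation_report(training_records, expected_share).items():
    print(f"{group}: {share:.0%} of records, {status}")

On the toy data above, the sketch reports group_a as over-represented and group_b as under-represented. A real audit would, of course, run on the actual training corpus and would also compare outcomes within each group, not just headcounts.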

Now, as we take a fresh look at this article’s initial quote, we must take note of an important pattern. The premise is, in many respects, set for the installation of a permanent racial underclass. Extending the original purpose of the Three-Fifths Compromise in the 1787 U.S. Constitution to other people of color will become the predictable outcome as AI continues to interpret big data to create a narrative. “Narrative” here is a simplified way of describing the basis on which policies are built, budgets are justified, and culture is influenced.

It would be a big mistake for a person of color who is not of African descent to feel safe, because there is plenty of historical and cultural data that would place these other groups in the same underclass. That, combined with big data, could prove equally detrimental to all people of color unless an awakening takes place in these early stages of AI development.

The BIPOC community needs to become more proactive in education, resource allocation, and cultural change to bring more color into these fields of study and into job preparedness for AI development and big data management. However, communities of color cannot do this alone. It is of the utmost importance that allies and accomplices within the majority culture put their implied privilege to purpose and become side-by-side partners in the procurement and allocation of funding, the development of resources, and policy change in this fight.

AI does not have to be the devil that many have characterized it as, just as television and the internet were once characterized. To ensure its potential benefits reach everyone, all people must be represented in its development.

This will not be a one-time foray by Three-Fifths Magazine into this transformative moment in America. We will continue to be on the front line and sound the alarm.

“And I heard the voice of the Lord saying, ‘Whom shall I send, and who will go for us?’ Then I said, ‘Here I am! Send me.’”

Isaiah 6:8 ESV

By Kevin Robinson, Founder and Editor/Publisher

17. Madhusoodan J. Is a biased algorithm delaying health care for Black people? Nature. 2020;588:546–547. doi: 10.1038/d41586-020-03419-6.
