
The Human Algorithm: Why Diversity is AI’s Missing Code




Ashutosh Upadhyay
AI Automation Consultant & Marketer
ashutosh@thealgohype.com


One morning in 2015, a Google engineer’s inbox pinged with a message that would shake the tech world: their AI was labelling Black people as “gorillas.” This wasn’t just a technical glitch but a stark warning that we were building artificial intelligence with real human blind spots. It was the moment that forced the tech world to confront an uncomfortable truth: the most sophisticated code in the world is worthless if it doesn’t understand the humans it serves.


The Hidden Code in AI Development

Imagine building a universal translator while only speaking one language. Sounds absurd, right? Yet that’s precisely what happens when homogeneous teams build AI systems to serve a diverse world. The code might be flawless, but the context is fatally flawed. We’ve seen this play out time and again—healthcare AIs missing heart attacks in women because they present symptoms differently than men, facial recognition failing in different lighting conditions because it was tested primarily on lighter skin tones, and voice assistants creating digital barriers for entire communities because they struggle with accents.


The Stakes Have Never Been Higher

As AI systems increasingly shape our world—from judicial decisions to healthcare access—the cost of homogeneous development teams becomes exponentially higher. Every biased algorithm doesn’t just make a mistake; it perpetuates and amplifies societal inequities at scale. A biased hiring AI doesn’t just reject one candidate; it shapes thousands of careers. A skewed medical AI doesn’t just misdiagnose one patient; it impacts countless lives.

These aren’t hypothetical scenarios—they’re already unfolding around us with alarming consequences. Consider Amazon’s AI recruitment tool, which became a cautionary tale of gender bias in tech. Trained on a decade of male-dominated hiring data, the system taught itself that being male was a mark of success. It began automatically downgrading resumes that included terms like “women’s” or mentioned all-women colleges, effectively slamming doors before deserving women candidates could even reach them.

Even more troubling is the healthcare algorithm that affected millions of Americans. What seemed like an objective measure—using past healthcare costs to determine medical needs—revealed a dangerous blind spot. Because Black patients historically