Systemic racism is a hotly debated topic these days, in the news, on social media, and in the streets of America. I am not going to try to educate anyone on the historical racial tensions in this article. I have my opinions of what is or is not working, just like everyone else. What I will try to convey is that no matter what your stance is, you can reinforce biased behaviors without even realizing it, even if you are as “woke” as they come.

Especially if you work for someone other than yourself, you must treat your technology and data science projects as if bias is real, because it is. If you choose to look away, you may release a product into this world that harms others. You may not even realize it is happening until something egregious occurs. By then, it is far too late. You have damaged your personal and professional reputation. You may have harmed individuals with your medical, financial, or fraud predictions. You may feel regret and remorse for years to come.

Learning how to avoid the worst-case scenario requires some background into our brains.

Our brains are biased for a reason.

I have biases because I am human. Humans sort and process large torrents of information each day, every day. Our brains quickly direct input into buckets. If we tried to think deeply about every piece of information, we would be paralyzed. Our brains are efficient at taking shortcuts.

Some of this incoming information may be landing in the wrong bucket. It mixes and mingles with our thoughts and actions, resulting in implicit bias. We might be unaware that we have a preference for or against people, things, or activities.

So if we are unaware of our implicit biases, how can we prevent them from leaking into our data, code, and algorithms? Unfortunately, there is no magic wand for removing bias. Self-awareness is a good first step. Harvard has a fascinating study going on right now. Project Implicit researches this space, and there are some online ‘tests’ that report possible bias in your responses.
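Self-awareness applies to models as well as people. As one concrete illustration (mine, not from the article), a simple demographic-parity check can surface skew in a model's outputs before release. The group labels, predictions, and the loan-approval framing below are entirely hypothetical:

```python
# A minimal sketch of one pre-release sanity check: compare positive-prediction
# rates across groups (demographic parity). All data here is illustrative.

def selection_rates(groups, predictions):
    """Return the positive-prediction rate for each group."""
    totals, positives = {}, {}
    for g, p in zip(groups, predictions):
        totals[g] = totals.get(g, 0) + 1
        positives[g] = positives.get(g, 0) + int(p)
    return {g: positives[g] / totals[g] for g in totals}

def parity_gap(groups, predictions):
    """Largest gap in selection rates between any two groups."""
    rates = selection_rates(groups, predictions)
    return max(rates.values()) - min(rates.values())

# Hypothetical loan-approval predictions for two groups.
groups      = ["A", "A", "A", "A", "B", "B", "B", "B"]
predictions = [1,   1,   1,   0,   1,   0,   0,   0]

print(selection_rates(groups, predictions))  # {'A': 0.75, 'B': 0.25}
print(parity_gap(groups, predictions))       # 0.5
```

A large gap does not prove discrimination on its own, but it is exactly the kind of signal that is cheap to compute and easy to ignore until something egregious happens.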

Project Implicit: implicit.harvard.edu

With new experiences and self-reflection, you can move the misplaced information to the bucket you want. Imagine the next time someone calls you out for using a questionable phrase. While you blurt out an apology, your brain is frantically searching for that phrase and moving it into the ‘bad’ bucket.

It takes work, and it takes time.

#technology #racism #bias #data #data-science #data-analysis

How you can prevent systemic racism in your own small way