A blog dedicated to speaking our truth and exposing the world


Timnit Gebru – Forced Leave For Revealing Problematic Undertones of Google’s AI Model?

Timnit Gebru, an accomplished and respected figure in computer science, is particularly known for her revolutionary work in artificial intelligence (AI) and its ethics. A strong advocate for expanding diversity in technology, she is also the co-founder of Black in AI, an organization of Black researchers working on AI. At Google, she helped build one of the most diverse AI teams, made up of leading experts, and she often challenged mainstream AI practices, such as the sale of face recognition to police despite its inaccuracy in recognizing women and people of color, which raises the possibility of misuse and discrimination against POC.

Recently, however, Gebru left the multinational technology company amid a complicated dispute. A series of tweets, leaked emails, and online articles has revealed that her exit was the result of a conflict over a paper she co-authored.

Jeff Dean, the head of Google AI, stated that the paper “didn’t meet our bar” and “ignored too much relevant research” on efforts to minimize the environmental impact and bias of large language models; Google then quickly cut off Gebru’s access to her work email after a series of internal email conversations (MIT Technology Review). Gebru, however, along with many others, claims that she was wrongfully fired and forced out of the company. In fact, more than 1,400 Google staff members and 1,900 external petitioners have signed a letter in her support.

According to the MIT Technology Review, which had access to early drafts of the paper (unavailable to the public as of now), it covered a series of ethical issues and risks around the use of large language models, which Google has been building and using over time. For some context, natural language processing models are essentially machine learning systems built to sort through sample text and draw conclusions, or “predictions,” from it. So, the larger the system, the more text it needs.
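
To make that idea concrete, here is a rough sketch in Python of what “learning patterns from text to make predictions” means at its simplest. It is purely illustrative: a toy word-counting model with a made-up sample sentence, nothing like the enormous neural networks (such as BERT) the paper is actually about.

    from collections import Counter, defaultdict

    # Toy "language model": count which word tends to follow which in the
    # sample text, then predict the most common continuation. Real models
    # are huge neural networks, but the core idea is the same: patterns in,
    # predictions out.
    sample_text = "life goes on and on and the world goes on"

    words = sample_text.split()
    next_word_counts = defaultdict(Counter)
    for current, following in zip(words, words[1:]):
        next_word_counts[current][following] += 1

    def predict_next(word):
        # Return the word seen most often right after `word` in the training text.
        if word not in next_word_counts:
            return None
        return next_word_counts[word].most_common(1)[0][0]

    print(predict_next("goes"))  # prints "on", because that is the pattern it was shown

The more text, and the more varied text, you feed in, the more patterns the model can pick up, which is exactly why these systems keep growing.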

The four main issues discussed in the paper are environmental and financial costs, enigmatic or inscrutable datasets, misdirected research effort, and the potential for these models to deceive users and spread misinformation.

Firstly, not only can training machine learning models be extremely costly, it also produces a great deal of CO2. According to the MIT Technology Review, training one variation of Google’s language model BERT, which supports the company’s search engine, produced 1,438 pounds of CO2. Worse, this number should be viewed as a minimum: it reflects a single training run, and in practice models are trained and retrained many times during development.

Secondly, and this is where everything begins to intertwine, there is the problem of using massive amounts of text to train the machine. Larger language models need more text samples to sort through and learn from. Today, much of that data comes from the Internet and all the websites that come with it; even I used a sample from Amazon for my own RNN project. Nonetheless, the problem isn’t really the Internet itself, but the content the machine might be sorting through. It’s important to keep in mind that an AI model learns whatever it is given; it cannot tell racist, sexist, or abusive language apart from anything else in the dataset.
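
To see why, here is a continuation of the toy sketch from above, again purely illustrative with a made-up “scraped” sample: the model absorbs an abusive sentence exactly as readily as an ordinary one, because all it ever sees are word counts.

    from collections import Counter, defaultdict

    # Same toy setup as before, but "trained" on a scraped-looking sample that
    # mixes an ordinary sentence with an abusive one (kept mild here). The
    # model has no concept of which is which; it only counts word patterns.
    scraped_text = "the team works hard . those people are lazy and stupid"

    words = scraped_text.split()
    counts = defaultdict(Counter)
    for current, following in zip(words, words[1:]):
        counts[current][following] += 1

    def generate(start, length=5):
        # Chain most-likely next words together to "write" a short phrase.
        out = [start]
        for _ in range(length):
            options = counts.get(out[-1])
            if not options:
                break
            out.append(options.most_common(1)[0][0])
        return " ".join(out)

    print(generate("those"))  # -> "those people are lazy and stupid", echoed back with no filter

Filtering that kind of content out of billions of scraped web pages is the hard part at scale, and that is exactly the risk the paper points to.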

Obviously, given the expanse of the internet and the massive datasets drawn from it, there is no guarantee such language will not end up in the training set. Simply put, teaching an AI model that such language is “ok” would be bad. Beyond normalizing that language, these AI models will not be able to catch the nuances of the new anti-sexist and anti-racist vocabulary that has arisen from recent political movements, such as MeToo and Black Lives Matter. Furthermore, these models will miss the complexities of the languages, cultures, and norms of marginalized groups, those with less access to the Internet and smaller linguistic footprints online. Such AI models will only produce homogenized language, disproportionately reflecting richer countries and communities and failing to properly mirror the diversity we find today.

This disproportion is a core limitation of AI: these language models do not understand language, they merely analyze the patterns within it. The real problem is that Google is willing to ignore these potential consequences and profit from these models anyway. By commercializing this skill of analyzing data and language, the focus for many companies, and likely Google as well, becomes solely increasing the accuracy of these models rather than building models that aim to understand language and overcome these limitations for POC. Not only will such companies sweep in massive profits from these models, there are also no guidelines as to how such models can be used. These models can also learn to mimic human beings and spread misinformation or other malicious content.

As another person of color and an aspiring computer scientist, I find this not only disheartening to hear but also concerning. AI ethics is already a touchy subject, and with the increasing use of and research into AI, it’s crucial we get the ethical foundations right and fair. Technological innovations designed to benefit only one group cannot be considered innovations; they simply become a disadvantage for another group. Technology should benefit the broad majority and constantly aim for the betterment and consideration of all groups, regardless of background, race, ethnicity, or gender. As the world gravitates towards increased use of tech, it’s only fair that it’s built to include everyone, not a select few.

– Janet

“Life Goes On,” It Really Does

Whether you are one of the many K-pop “ARMYs” or not, I am sure someone has introduced the song “Life Goes On” by BTS to you. If no one has, I suggest listening to it, because it will leave you feeling wholesome and hopeful for the coming year.

“Life Goes On,” released on November 20th, is the first track of BTS’ mini album – BE – and is personally one of my favorites to relax to at the end of a long day. The song accurately captures the past year, saying: “어느 날 세상이 멈췄어, 아무런 예고도 하나 없이 [One day, the world stopped, Without any warning].” Immediately, the beginning draws listeners in and relates to their feelings of loss. Many of us have felt like the world stopped, and with it the lives we were looking forward to. We feel robbed of a year…but “Life Goes On” gifts it back to us. 

The song continues: “끝이 보이지 않아, 출구가 있긴 할까? [There’s no end in sight, Is there a way out?]” I personally resonate with these lines. 2020 has been like the photo calendar you hang on your wall or keep at your desk. Except, instead of seeing pictures of kittens or Harry Potter characters at the start of each month, we saw wildfires, police brutality, illness, homelessness and political instability. The government has not been as forgiving as I would like in its response to these events, and unfortunately many people lack basic empathy for those affected by 2020. I often asked myself if the cycle of events would ever end so I could return to my normal life. However, after listening to this song, I no longer doubt that it will.

“Like an echo in the forest, 하루가 돌아오겠지, 아무 일도 없단 듯이 [The day will come back around, As if nothing happened].” BTS sings these lines towards the end of the song, and although I am not an optimist, I believe in what they sing. With Covid-19 vaccinations underway, the world will eventually return to normal. I will be able to go to Quickway with my friends and sit in the lounge, drinking sweet mango slushies with them. And you will be able to stop scrolling through TikTok and go outside to experience the world we have neglected in our time at home.

That being said, I disagree with the line that it will be “[as if nothing happened].” Society, particularly the medical and science fields, will remember this. People who have been forced onto the streets, or who work multiple jobs, or who struggle to keep food on the table will remember this, because it will be a struggle to go back to “normal.” Although this is not a political article, I will say that I do not believe the US government is doing enough to help those affected by Covid-19, a virus that emerged naturally and spread because the government did not respond fast enough. But I digress; “Life Goes On” gives me hope for a better future. This year has been a struggle for sure, but with the lows come the highs, and I think we are all due for a good year.

– Ava
