‘Just Google it’ not always best path for information

Today, when children ask difficult questions about what’s happening in the world, the usual response is “Just Google it!”

Well, research has shown that this response may not be the best route.

“Maybe they should go to a library, read a book or ask their parents,” Dr. Safiya Umoja Noble says, explaining her research, which is outlined in the book “Algorithms of Oppression: How Search Engines Reinforce Racism.”

“We have more data and technology than ever in our daily lives and more social, political and economic inequality and injustice to go with it,” Noble writes in her book.

“Today, young people use the internet to sort out their identity,” she told an audience at a recent downtown tech summit, noting that the web now operates as a major vehicle for learning.

“It’s a misnomer to call it a ‘search engine,’” Noble says of Google, which she says operates from a framework of profit at all costs. “It is, in fact, an advertising platform.”

Noble was inspired to write her book in 2012, when she was struck by the results of a Google search for the words “Black girls.” She had been trying to find content on the internet that might interest her stepdaughter and nieces.

“The first page is almost all pornography,” she said. “And that’s not just Black girls, but Asian girls and Latina girls, too. Women on these sites are not girls.”

Since then, Google has suppressed pornography in the results for that search.

Society may think algorithms are neutral or objective, but we sometimes forget that human beings develop the digital platforms we use every day.

Discrimination can be embedded in computer code.

According to Noble, data discrimination is a real social problem: the derogatory stereotypes surfaced by algorithms help form and entrench inaccurate opinions in the minds of readers.
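
To see how bias can ride along in ostensibly neutral code, consider this minimal, hypothetical Python sketch (not Google’s actual ranking system): results are ordered purely by historical click counts, so stereotyped content that was heavily clicked in the past keeps rising to the top. Every name and number below is an illustrative assumption.

```python
# Hypothetical sketch of a "neutral" ranker that inherits bias from its
# data: results are sorted only by past clicks, so whatever users
# clicked historically -- including stereotyped content -- ranks first.
# This is an illustration, not Google's actual system.

from dataclasses import dataclass

@dataclass
class Result:
    url: str
    clicks: int  # historical click count, the only ranking signal

def rank(results: list[Result]) -> list[Result]:
    # No human judgment appears here, yet the ordering reproduces every
    # bias already present in the click data it sorts by.
    return sorted(results, key=lambda r: r.clicks, reverse=True)

results = [
    Result("informative-site.example", clicks=1_200),
    Result("stereotyped-content.example", clicks=9_800),  # heavily clicked
]

for r in rank(results):
    print(r.url)  # the stereotyped site prints first
```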

“Some of the very people who are developing search algorithms and architecture are willing to promote sexist and racist attitudes openly at work and beyond,” her book states.

Noble is an associate professor at UCLA in the departments of Information Studies and African-American Studies, and a visiting faculty member at USC’s Annenberg School for Communication and Journalism.

Before she became a university professor, Noble had a 12-year career in multicultural marketing, where she helped sell products to African-Americans and Latinos.

Her book has been widely reviewed in journals and periodicals and was recognized by Bustle magazine as “one of the 10 books about race to read instead of asking a person of color to explain things to you.”

Noble told the audience that these so-called “search engines” are easy to manipulate, especially for anyone with some capital and time. Biased results that privilege “Whiteness” and discriminate against people of color are simple to plant.

“White supremacist and nationalist organizations spread misinformation,” Noble said. “These disinformation sites post intentionally false information to mislead you.”

In 2015, while President Obama was in office, a search for “nig*a house” took users to a map of the White House.

Google apologized for the offense, which was caused by vandalism of its maps tools, and eventually removed the result.

By intentionally confusing the Google system, users can inject their own results; Google itself did not appear to have played any role in adding the racist listing.

In another example, Noble displayed a PowerPoint slide of an old “disinformation” result posted after the 2016 election, claiming that Trump had won the popular vote. The claim was entirely false.

In other countries, Noble said, the web is more regulated and offending posts are quickly removed. Large U.S. search engines reportedly have no incentive to change because they make money keeping their “unedited, open” results as they are, surrounded by profitable advertising.

“If the endgame to the playbook is consumer manipulation, then we have to ask ourselves ‘what is the future of humanity?’” Noble said.

In another example of oppressive algorithms, Noble told the audience that a number of companies are utilizing automation to help them select which resumes to consider.

First, they create a baseline of an “ideal employee.” Then automated software reviews resumes and deselects those that don’t match the gender, race and age profile of that baseline.
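
What follows is a minimal, hypothetical sketch of the kind of screen Noble describes. The Resume fields, the BASELINE values and the passes_screen function are all invented for illustration; no real vendor’s software is being quoted.

```python
# Hypothetical sketch of the resume screen Noble describes: a baseline
# "ideal employee" profile is defined, then resumes that don't match its
# demographic attributes are quietly deselected. All names and values
# here are illustrative assumptions, not any real vendor's system.

from dataclasses import dataclass

@dataclass
class Resume:
    name: str
    gender: str
    age: int
    ethnicity: str

# Baseline derived from the company's notion of an "ideal employee."
BASELINE = {"gender": "male", "ethnicity": "white", "max_age": 35}

def passes_screen(resume: Resume) -> bool:
    # Demographic filtering like this is discriminatory (and illegal in
    # many jurisdictions); the point is how easily it can hide inside a
    # routine-looking function.
    return (
        resume.gender == BASELINE["gender"]
        and resume.ethnicity == BASELINE["ethnicity"]
        and resume.age <= BASELINE["max_age"]
    )

applicants = [
    Resume("A. Candidate", "female", 41, "Black"),
    Resume("B. Candidate", "male", 29, "white"),
]

shortlist = [r for r in applicants if passes_screen(r)]
print([r.name for r in shortlist])  # only "B. Candidate" survives the screen
```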

Noble said the rise in racism can be partly attributed to the fact that courses in ethics, justice and morality are not generally taken seriously in university settings; her own ethics class, she noted, was taught by an adjunct professor.

Hence, today’s algorithms (digital decisions) reinforce oppression and enact new modes of racial profiling. She terms this “technological redlining.” And it’s on the rise.

Noble asked the audience to use the power of the technical legitimacy they hold in their jobs at local companies and to work to create better systems:

1) Break Google’s monopoly status by creating other search engines that show a variety of points of view.

2) Think about ways to take on the “giant.”

3) Learn to be truth-tellers and fact-checkers.

“This conscious bias is creating barriers and stopping us from moving forward,” Noble said of the algorithmically driven data failures found online.

“We need to stay diligent with these things,” Noble told the audience. “Social inequality will not be solved by an app.”
