Impartial or I'm partial⚖️ [HTC #37]
Hello World! Welcome to our 37th edition of Hold the Code.
In this edition, we cover a recent, unexpected deal between Uber and taxi drivers, the connection between technology and nature, and the cultural biases that exist in many AI systems.
Cabbies and Uber Unite!
Uber has recently made a deal with New York taxis to allow cabbies to use their platform to find passengers. The deal has the potential to benefit both Uber and the cabbies by providing the app with more drivers amidst a labor shortage and providing taxis with a wider pool of passengers.
A Two-for-One Deal
Uber has launched similar deals in other markets, including Spain, Colombia, Germany, Austria, and Hong Kong. The deal lets cabbies use Uber’s platform to find rides. Unlike traditional ride-share drivers, cabbies are allowed to view the fare and the location of the ride before accepting. This means passengers may order an Uber but have a yellow cab show up.
Uber may be eager for the new labor source provided by taxis. Since the pandemic, many of its ride-share drivers have quit, and with gas prices rising, many who remain have been questioning whether ride-sharing is profitable enough to continue. The taxis therefore give Uber a welcome boost of labor in New York.
Taxi drivers will also see benefits, including a wider pool of customers and more profitable into-the-city trips. Typically, passengers hail cabs for trips that take them out of the city. Uber may help taxis balance out their travel, enabling drivers to pick up paying passengers on the return trip into the city.
A Rocky Past…
Even a few years ago, a deal like this would have seemed far-fetched. Uber’s ex-CEO, Travis Kalanick, was once quoted as saying “[Uber’s] opponent is an asshole named Taxi.”
Over the years, the two groups have not exactly played nice. Taxis have blamed Uber for the damage done to their industry in recent years, going so far as to blame the company for a number of taxi driver suicides. Uber hasn’t taken a backseat either, mobilizing its user base to file complaints with New York City Hall in response to a crackdown on ride-sharing.
Not Perfect Yet
While the deal may seem like the perfect, fairytale ending for the taxi-versus-Uber storyline, there are still a number of concerns about the treatment and compensation of app workers, a group that includes ride-share drivers (and now New York cabbies as well). According to the New York Taxi Workers Alliance (representing over 21,000 drivers), cabbies will make 15% less on average on the Uber app compared to their traditional fares.
Emma Woods, a spokesperson for Justice for App Workers, says, “Whether we started driving for Uber five years ago or five minutes ago, what app drivers have in common is that we are underpaid and under-protected by app companies in their relentless pursuit of profits. We are fighting for all app workers, and we welcome the yellow cab drivers to join our movement.”
Bringing Nature into Intelligence
When you hear phrases like artificial intelligence, cloud computing, and the Internet, do you ever associate them with nature?
Greenhouse Gas Giant
Computing is energy-hungry. Today, the many devices and servers that our modern computing infrastructure relies on, such as the Internet, produce greenhouse gas emissions roughly equal to those of the entire airline industry - yikes!
How can we change this?
Nature Comes to the Rescue
Recently, NYU researchers launched the Solar Protocol, a project spotlighting just how much energy our global computing industry consumes and offering one way to mitigate the industry’s climate impact. In contrast to traditional large-scale, high-volume web services that algorithmically direct network traffic to whichever server gives the quickest response time, Solar Protocol runs a network of solar-powered servers and directs traffic to wherever the sun is shining. In our increasingly data-driven world, Solar Protocol treats the sun’s interaction with Earth as key input data for the algorithm deciding where to send computing traffic.
In a way, Solar Protocol is enabling us to reconnect with nature and listen to its inherent natural algorithms - it’s beautiful, isn’t it?
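To make the idea concrete, here is a minimal sketch of sun-aware routing. This is not the actual Solar Protocol code; the server names, locations, and `solar_watts` readings are all hypothetical stand-ins for telemetry a solar-powered server might report.

```python
# Sketch: route traffic to the server with the most sun, not the lowest latency.
# (Illustrative only - not the real Solar Protocol implementation.)

from dataclasses import dataclass

@dataclass
class SolarServer:
    name: str
    location: str
    solar_watts: float  # hypothetical reading from the server's charge controller

def pick_server(servers):
    """Return the server currently generating the most solar power."""
    online = [s for s in servers if s.solar_watts > 0]
    if not online:
        raise RuntimeError("no server currently has sunlight")
    return max(online, key=lambda s: s.solar_watts)

network = [
    SolarServer("ny", "New York, USA", 0.0),       # nighttime: no generation
    SolarServer("syd", "Sydney, Australia", 42.5),  # midday sun
    SolarServer("dub", "Dublin, Ireland", 8.1),     # overcast morning
]

print(pick_server(network).name)  # → syd
```

The only change from conventional load balancing is the selection key: instead of minimizing response time, the router maximizes available sunlight.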
Weekly Feature: Navigating Bias
Despite what you may think, computers are not as impartial and objective as they are often labeled to be. AI algorithms tend to reflect the bias of their programmers, sorting and classifying images in ways that align with the perspective of the programmers.
As AI becomes more embedded in everyday life, the potential bias within these algorithms is beginning to concern the general public.
Image Recognition
Native American technology high school and college students gathered for a conference in Phoenix, AZ, where they were tasked with applying labels to Native American cultural images. These labels were compared with those generated by a “major image recognition app,” which performed poorly.
“...one app identified a Native American person wearing regalia as a bird,” said Chamisa Edmo, a technologist and citizen of the Navajo Nation who is also Blackfeet and Shoshone-Bannock.
With failure at this level, one is likely to ask: how can such an advanced algorithm mistake an image of a person for a bird? The answer is bias.
Widening Scope
For an algorithm, data amounts to the experiences we learn from in everyday life. When an algorithm is “trained” on certain data, the model learns its patterns from that data, and any errors that surface are typically only corrected for inputs similar to the training data.

As one would expect, bias can often creep in during this training process. The diversity of images in a program’s training data is essential to its performance.
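A toy example can show how this happens. The sketch below is not a real image model; the hand-made “features” are hypothetical stand-ins for what a vision system might extract. Because no training example of a person with feathers exists, the closest match to a person wearing feathered regalia is a bird.

```python
# Toy illustration of dataset bias: a 1-nearest-neighbor "classifier"
# trained on unrepresentative data. (Not a real image model.)

def nearest_neighbor(train, query):
    """Predict the label of the training example closest to the query."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(train, key=lambda ex: dist(ex[0], query))[1]

# Hypothetical features: (has_feathers, has_wing_shapes, has_human_face)
train = [
    ((1, 1, 0), "bird"),
    ((1, 1, 0), "bird"),
    ((0, 0, 1), "person"),  # every "person" example lacks feathers
]

# A person wearing feathered regalia: feathers, wing-like shapes, AND a face.
query = (1, 1, 1)
print(nearest_neighbor(train, query))  # → bird
```

The model isn’t “wrong” about its training data; the training data simply never showed it a person who looks like this, so the biased prediction follows directly from the gap in representation.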
“It’s a little bit of a chicken and an egg problem because you need the data to really build a big system that demonstrates value,” said W. Victor H. Yarlott, an A.I. researcher at Florida International University. “But to get all the data, you need money, which only really starts to come when people realize that there’s substantial value here.”
Most training data for image recognition software today lacks appropriate representation of minority groups - hence the misclassification of a Native American person wearing regalia as a “bird.”
Moving Forward
To address bias in image recognition, tech companies must train their models on a diverse range of images that represent cultures equitably. Given how long it takes to train an algorithm on extremely large datasets, it is essential that this shift begins as soon as possible.
Once there is value in this data, we could begin to see “cultural engines”: software that encourages and fosters the spread of tradition. This could look like chatbots backed by databases of culture-specific resources on topics such as cooking or traditional practices.
Love HTC? ❤️
Follow RAISO (our parent org) on social media for more updates, discussions, and events!
Instagram: @Raisogram
Twitter: @raisotweets
RAISO Website:
https://www.raiso.org
Written by Molly Pribble, Larina Chen, and Dwayne Morgan
Edited by Dwayne Morgan
Hold the Code
Responsible AI Student Organization (RAISO)
Northwestern University
You received this email because you signed up for our newsletter.