
SCIENCE & TECH

SATURDAY, OCTOBER 13, 2018


Tech's gender bias nothing new

Marie Hicks

A recent report revealed Amazon's AI recruiting technology developed a bias against women because it was trained predominantly on men's resumes. Although Amazon shut the project down, this kind of mechanized sexism is common and growing - and the problem isn't limited to AI mishaps.

Facebook allows the targeting of job ads by gender, resulting in discrimination in online job advertisements for traditionally male-dominated jobs from construction to policing. The practice has long been illegal in traditional print media - but Facebook's targeting tools encourage it. Not only can this affect whether women and non-binary people can see ads; it also affects male job-seekers who are older and therefore viewed as less desirable by many employers. Facebook has come under fire for illegal advertising practices in the past: notably, it scrapped thousands of microtargeting categories after a 2016 ProPublica report showed how it allowed racial discrimination in housing ads.

The platform has repeatedly refused to take responsibility for what people do on it, echoing the behavior of other Silicon Valley companies. Gendered and racialized harassment online goes largely unchecked. Likewise, Google's YouTube has come under fire for algorithms that appear to push radicalizing far-right content onto casual viewers, while Google itself has faced accusations that its image search and autocomplete features rely on and strengthen racist and sexist stereotypes.

Online platforms are stripping away civil rights protections intended to correct biases in earlier forms of communication - a clear example of the dangerous tendency of our current, and supposedly progressive, technologies to re-create discriminatory patterns of the past. Indeed, these problems fit a pattern in the long trajectory of the history of technology.

Today, jobs in computing, if advertised on Facebook, would likely be targeted to men because these jobs are located in an already male-dominated field. In the early days of electronic computing, however, the work was strongly associated with women. It was feminized because it was seen as deskilled and unimportant. This quickly began to change as computers became indispensable in all areas of government and industry. Once it became clear that those who knew how to use them would have great power and influence, female programmers lost out despite having all the requisite skills. Britain's computerization is a cautionary tale: women were repeatedly and progressively denied promotions or boxed out of their jobs, particularly when they married or had children.

Top executives of Facebook, Amazon and Alphabet Inc during a meeting at Trump Tower. Photo: Shannon Stapleton

When they left, they were replaced by men. This created disastrous labor shortages that ultimately forced Britain's decline as a computing superpower.

Women continued to program, but they had to do it without the support of major institutions. One example was the entrepreneur Stephanie "Steve" Shirley, who used a masculine nickname to sidestep sexism. Shirley started a freelance programming company with an explicitly feminist business model after finding herself unable to advance in government and industry. She employed hundreds of other women who had similarly had to leave the workforce. Shirley gave these women an opportunity to use their skills in the service of the nation's economy by giving them the option to work from home, filling some of the gaps left by this exodus of trained computer professionals from full-time computing work.

Shirley's business, built on women's labor and expertise, went on to become a multimillion-dollar corporation that did mission-critical programming for government and private industry. As the government scrambled for male computing talent, for instance, a team of her female programmers, led by Ann Moffatt, programmed the black box for the Concorde jet. As Shirley's business flourished, many other companies and even the British government itself suffered for lack of programming talent.

The irony is that this shortage had been intentionally engineered by the refusal to continue to employ female technologists in these newly prestigious jobs. Throughout history, when jobs are seen as more important, or are better paid, women are squeezed out - hence the need for protective legislation that ensures equality of opportunity in hiring and job advertisements.

In computing today, a field that claims to value diversity, engineers at Facebook and other companies are building tools that roll back the advances of women in the workforce, as the industry undoes the civil rights protections enacted to ensure that what happened in early computing does not happen again.

When industries ignore their pasts, they tend not only to repeat previous mistakes, but also to worsen current problems. Silicon Valley's gender problems are well known, and despite companies' claims that they are trying to address the problem, progress has been slow and uneven. This is not surprising when we consider the context.

Although the industry is facing a reckoning today, for decades the stories that we told about computing technology focused on inexorable success, rather than taking seriously the possibility that our new technologies were failing us. High technology became virtually synonymous with progress, and the greater application of computing to all manner of social problems was seen as a good in and of itself. As a result, we are largely blind to the errors of the past. We fail to see the problems in our present and the reasons behind them because we are too accustomed to seeing computing as a success story.

The refusal to talk about computing's failures in the past has not served us, or present-day computing, well. Rather, it has hidden problems that have plagued the field since its inception. Facebook's discriminatory practices towards female users in everything from job advertisements to harassment can be traced back to its predecessor, the beta site set up by Mark Zuckerberg while at Harvard that stole female undergrads' pictures from internal Harvard servers. The site, known as Facemash, objectified the women for an audience invited to rate their relative attractiveness. When we consider Facebook's current problems in this light, they not only seem less surprising but also potentially more solvable.

Lessons like this are critical today because high technology has an outsize effect on every aspect of our daily lives, and it is also, in many ways, steadily moving us back towards a past that we thought we had left behind. Much of the anti-racist and anti-sexist legislation of the 20th century has been invisibly rolled back by tech infrastructures that invite users to see their online actions as unmoored from real life - whether in the realm of hate speech or job advertisements.

Strong representation of women in the labor market is key, historically and today, for women to be able to assert their rights in all aspects of their lives. Companies like Facebook cannot be allowed to divide and conquer by gender, race, sexuality, age, disability, or any other number of categories people have fought to protect by law as deserving of equal rights.

Do DWeb programs use as much energy as cloud-based services?

A technician checks a server in a data centre. Photo: Juice Images

Jack Schofield

The main aim of the decentralised web (DWeb) is to remove the power of centralised "gatekeepers" such as Facebook and Google, who hoover up the world's data and monetise it by selling advertising. It reminds me of the original concept of the web, where every computer would be both a client and a server, sharing information on a more or less equal basis. Of course, that is not how real life works. What actually happens is that you get a power law distribution with a few large entities and a long tail of small ones.

As Clay Shirky wrote in 2003: "In systems where many people are free to choose between many options, a small subset of the whole will get a disproportionate amount of traffic (or attention, or income), even if no members of the system actively work towards such an outcome. This has nothing to do with moral weakness, selling out, or any other psychological explanation. The very act of choosing, spread widely enough and freely enough, creates a power law distribution."

The web still has plenty of variety, but almost everyone is familiar with one giant search engine, one giant retailer, one giant auction site, one giant social network, one giant encyclopaedia, and so on. Indeed, there is only one giant internet where there used to be dozens of competing networks using many different protocols.

Obviously, it would be better if we all agreed these things in advance, based on open standards. However, people vote with their wallets, and competition results in de facto standards instead of de jure ones. Examples include Microsoft Windows, Google Search and Facebook. Each triumphed in a competitive marketplace. I am not saying this is the ideal solution, just that, in most cases, it's inevitable.

One of the problems with returning to a decentralised web is that the internet is no longer decentralised. It has been redesigned around giant server farms, high-speed pipes and content delivery networks. It looks increasingly like a broadband television network because that is what it actually does most of the time.

Today's web is being optimised for the delivery of Netflix movies, BBC programmes on iPlayer, Spotify music, live streams of every major sporting event, and so on. You can upload your own live streams, but communications are asymmetric: your downloads are much faster, and much more reliable, than your uploads. It's really easy to watch 1TB of movies but an exercise in frustration trying to upload a 1TB hard-drive backup.
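To put rough numbers on that asymmetry, here is a back-of-envelope calculation. The speeds are my own illustrative assumptions (a common 10:1 download/upload broadband tier), not figures from the article.

```python
# Transfer time for 1 TB on an asymmetric consumer connection.
# Speeds are illustrative assumptions, not measured values.

TERABYTE_BITS = 1e12 * 8  # 1 TB expressed in bits

def hours(bits: float, mbps: float) -> float:
    """Transfer time in hours at a sustained rate of `mbps` megabits/s."""
    return bits / (mbps * 1e6) / 3600

down_mbps, up_mbps = 100.0, 10.0  # assumed 10:1 down/up asymmetry
print(f"download 1 TB: {hours(TERABYTE_BITS, down_mbps):6.1f} h")
print(f"upload   1 TB: {hours(TERABYTE_BITS, up_mbps):6.1f} h")
```

At those assumed speeds the download finishes in under a day, while the upload takes well over a week of sustained transfer - hence the frustration.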

If you really want to save energy and internet resources, stop streaming stuff. Broadcast TV and radio can reach tens of millions of people, and adding another million adds relatively little in the way of extra power consumption. There is a school of thought that it is better for the environment to use CDs or DVDs for albums or films you go back to again and again, or you could at least use digital files stored on your PC or smartphone.
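The scaling argument can be made concrete. In the sketch below, every energy figure is an invented placeholder chosen purely to show the shape of the comparison, not a measured value: unicast streaming costs grow linearly with the audience, while a broadcast transmitter's draw is essentially flat.

```python
# Unicast streaming vs broadcast: linear vs constant scaling in audience size.
# All figures are illustrative assumptions; real numbers vary widely by study.

STREAM_WH_PER_VIEWER_HOUR = 25.0  # assumed network + data-centre energy per stream
BROADCAST_KW_TRANSMITTER = 50.0   # assumed fixed transmitter power for one channel

def streaming_kwh(viewers: int, hours: float = 1.0) -> float:
    """Unicast: every additional viewer adds a full extra stream."""
    return viewers * STREAM_WH_PER_VIEWER_HOUR * hours / 1000.0

def broadcast_kwh(viewers: int, hours: float = 1.0) -> float:
    """Broadcast: transmitter cost is fixed regardless of audience size."""
    return BROADCAST_KW_TRANSMITTER * hours  # independent of `viewers`

for audience in (1_000, 1_000_000, 10_000_000):
    print(f"{audience:>10,} viewers/hour: streaming {streaming_kwh(audience):>9,.0f} kWh, "
          f"broadcast {broadcast_kwh(audience):>5,.0f} kWh")
```

The point is not the particular numbers but the shapes: whatever the true per-stream cost, streaming's total rises with every viewer, while the millionth broadcast receiver costs the transmitter nothing extra.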


And rather than using Graphite to replace Google Docs or Microsoft Office, just use a word processor offline. If you run Windows, you already have a text editor (Notepad) and a simple word processor (WordPad), and there are plenty of free alternatives. That will reduce global energy use and increase your privacy.

It's really simple. If you don't want Google to read your documents, don't write your documents on Google's computers. And if you don't want cloud servers using energy, don't use the cloud.

Companies such as Amazon AWS, Microsoft and Google are covering the world with server farms to make information more easily available. That's harder to do with real distributed systems because the thousands or millions of separate computers may be turned off or otherwise unavailable when you need the data they are storing. Worse, unless it's replicated, you could lose data.

It's true that server farms consume an ever-growing amount of electricity, much of it used for cooling purposes. However, the cost is a powerful incentive for operators to use cheaper renewables, such as solar panels, and to reduce their power consumption in other ways. For example, Facebook has built a data centre in the north of Sweden where the air is freezing cold, while Microsoft is experimenting with underwater data centres that are easier to cool. Microsoft is also sponsoring tree planting in Ireland as part of its commitment to becoming carbon neutral.

Weaponized AI enabling perpetual wars

Ben Tarnoff

Last month marked the 17th anniversary of 9/11. With it came a new milestone: we've been in Afghanistan for so long that someone born after the attacks is now old enough to go fight there. They can also serve in the six other places where we're officially at war, not to mention the 133 countries where special operations forces have conducted missions in just the first half of 2018.

The wars of 9/11 continue, with no end in sight. Now, the Pentagon is investing heavily in technologies that will intensify them. By embracing the latest tools that the tech industry has to offer, the US military is creating a more automated form of warfare - one that will greatly increase its capacity to wage war everywhere forever.

On Friday, the defense department closes the bidding period for one of the biggest technology contracts in its history: the Joint Enterprise Defense Infrastructure (Jedi). Jedi is an ambitious project to build a cloud computing system that serves US forces all over the world, from analysts behind a desk in Virginia to soldiers on patrol in Niger. The contract is worth as much as $10bn over 10 years, which is why big tech companies are fighting hard to win it. (Not Google, however, where a pressure campaign by workers forced management to drop out of the running.)

At first glance, Jedi might look like just another IT modernization project. Government IT tends to run a fair distance behind Silicon Valley, even in a place as lavishly funded as the Pentagon. With some 3.4 million users and 4 million devices, the defense department's digital footprint is immense. Moving even a portion of its workloads to a cloud provider such as Amazon will no doubt improve efficiency.

But the real force driving Jedi is the desire to weaponize AI - what the defense department has begun calling "algorithmic warfare". By pooling the military's data into a modern cloud platform, and using the machine-learning services that such platforms provide to analyze that data, Jedi will help the Pentagon realize its AI ambitions.

The scale of those ambitions has grown increasingly clear in recent months. In June, the Pentagon established the Joint Artificial Intelligence Center (JAIC), which will oversee the roughly 600 AI projects currently under way across the department at a planned cost of $1.7bn. And in September, the Defense Advanced Research Projects Agency (Darpa), the Pentagon's storied R&D wing, announced it would be investing up to $2bn over the next five years into AI weapons research.

So far, the reporting on the Pentagon's AI spending spree has largely focused on the prospect of autonomous weapons - Terminator-style killer robots that mow people down without any input from a human operator. This is indeed a frightening near-future scenario, and a global ban on autonomous weaponry of the kind sought by the Campaign to Stop Killer Robots is absolutely essential.

But AI has already begun rewiring warfare, even if it hasn't (yet) taken the form of literal Terminators. There are less cinematic but equally scary ways to weaponize AI. You don't need algorithms pulling the trigger for algorithms to play an extremely dangerous role.

To understand that role, it helps to understand the particular difficulties posed by the forever war. The killing itself isn't particularly difficult. With a military budget larger than that of China, Russia, Saudi Arabia, India, France, Britain and Japan combined, and some 800 bases around the world, the US has an abundance of firepower and an unparalleled ability to deploy that firepower anywhere on the planet.

The US military knows how to kill. The harder part is figuring out whom to kill. In a more traditional war, you simply kill the enemy. But who is the enemy in a conflict with no national boundaries, no fixed battlefields, and no conventional adversaries?

This is the perennial question of the forever war. It is also a key feature of its design. The vagueness of the enemy is what has enabled the conflict to continue for nearly two decades and to expand to more than 70 countries - a boon to the contractors, bureaucrats and politicians who make their living from US militarism. If war is a racket, in the words of marine legend Smedley Butler, the forever war is one of the longest cons yet.

Automation has greatly increased the US military's capacity to wage war everywhere forever. Photo: Getty
