Debate Recap / December 12, 2019

Peace Tech Debate Recap

Technology infiltrates every aspect of our lives, from personal interactions on social media to the global ramifications of satellite imagery. The development of new technology often brings with it a certain idealism and the promise of a better future, but in reality, technology can also be used for harm. Doha Debates teamed up with UNESCO-Qatar and the Innovation Cell of the United Nations’ Department of Political and Peacebuilding Affairs to host a special debate event on the role of technology in the context of peace. Can emerging technologies be deployed for the good of humanity? Or do they spell disaster?

We brought together three experts to discuss and debate whether technology can help prevent violence and build peace. Consensus-building and searching for common ground are the aims of Doha Debates, and our three experts united around the idea of increased regulation of technology but differed on how to get there. The debate sparked probing questions from the student audience, revealing both a desire to harness technology for good and a distrust of governments’ and tech companies’ ability to regulate effectively. Each speaker offered a different philosophy and framework for technology’s role in peacebuilding.

Allison Puccioni has been an imagery analyst for over 25 years, analyzing remotely sensed images to discern activity involving nuclear proliferation, military events, refugee migration and environmental degradation. She has worked for governments, academic institutions and the media. She argued for the use of technology in peacebuilding, both now and in the future. “I am here to propose to you that technology can facilitate peace,” she said in her opening statement. “More specifically, the increasing presence and access to technology has helped to democratize information that can ultimately lead to a more equitable world.”

Technology is no longer solely in the hands of governments and militaries, and its transition into the hands of average citizens and businesses marks a momentous and positive shift, Allison argued. “We now have agency from this data. We have access to satellite imagery that comes from space. We in the research and academic communities apply this data to promote human safety, to monitor government activity and to hold state actors accountable for their actions, and to better inform ourselves about the events that affect and shape our lives.”

Allison pointed to companies leading the way forward, advancing rocket and space travel, machine learning research, space-based Earth observation and computing technology. She stressed the need to navigate the “Wild West of informational advancement” with pragmatism. “It is not and it will not be straightforward. But with prudence, in the right hands, in enough hands, in our hands, these newly accessible technologies can lead to more transparent research, to better informed analysis, and to the enfranchisement of a populace — us — to become a bigger part of our global narrative of peacebuilding.”

Our second speaker, Subbu Vincent, believes that technology is just a means to an end, and that if we’re going to fix anything, it should first be humanity, not the technology. Subbu is the director of Journalism and Media Ethics at Santa Clara University’s Markkula Center for Applied Ethics in Santa Clara, California, but he began his career as a computer science engineer. He argued that technology alone will not bring peace, reflecting on how social media’s initial promise to bring people together curdled into today’s disinformation, hate speech and remarkable capacity to divide. Those harms are perhaps at their worst in social media’s effects on elections. “May I remind all of us here today that elections are important for peaceful transition of power — as you well already know — from one administration to another administration,” he said. “If we interfere with the elections of another country, we’re actually threatening peace. So there are fundamental issues with how social media technology, for example, has played itself out.”

Subbu pointed out that technology is fundamentally amoral. “In the hands of good actors it can be used for good. In the hands of bad actors it will be used for bad, as we have seen. So what I ask you to think about is the difference in ethics which we call as ‘means’ and ‘ends.’ Technology is a means to an end. So if you want to talk about peace, if you really want peace, the desire for peace comes from us, we have to start talking about the ends first.”

Our final expert, Ariel Conn, brought a wide-ranging resume to the debate, with work spanning fields that include artificial intelligence (AI) safety and policy and the “existential threat to humanity.” She was the least optimistic of our speakers, pointing to technology’s use by governments and militaries to bring destruction. “Technology is primarily developed for military prowess, for profit, or for both,” she said in her opening statement. “And unfortunately, war is far more profitable than peace. We only have to look at recent history to see this connection between technology and warfare.” Computer science took off during World War II and advanced significantly during the Cold War as governments sought to get the upper hand on other nations. “Even today’s beneficial technology — satellites, GPS, computers — these were all advanced during the Cold War and connected to various arms races also associated with the Cold War. And now many people fear that we are on the brink of a new arms race, this time with artificial intelligence and lethal autonomous weapons.”

And while today it is entrepreneurs and companies that have taken the lead on new technological development, Ariel doesn’t see that as a reason to celebrate. The tech coming out of Silicon Valley has just as much potential for harm as the older iterations. “We’re already seeing examples of this, with problems arising with social media, with deep fakes, with algorithmic bias, with image and facial recognition software. And these technologies when used badly don’t just hurt the people using them. We’re already seeing that they can upend democratic procedures, and they enable human rights violations. If we continue to develop technology as we have, war and destruction will not only guide technological advancement — it will follow in its wake.”

“War is far more profitable than peace.”

—Ariel Conn

Doha Debates correspondent Nelufar Hedayat challenged each of the speakers and broadened the conversation to include the more tangible aspects of technology that a layperson can discuss, focusing on human rights abuses by the Chinese government aided by facial recognition technology, and on the use of drones in war zones. Nelufar challenged Allison’s perspective by invoking her home country. “I’m from Afghanistan, where drones have been used to kill tens of thousands, if not hundreds of thousands, of my people,” Nelufar said. “They don’t share your perspective because they reap the rewards of what this technological advance has done. What do you say to an average Afghan citizen?”

A tense exchange followed, with Allison arguing that more of the technology, especially satellite imaging, is becoming accessible to more people, and Nelufar trying to pin down how that materially helps her compatriots being slaughtered by this technology. “I can’t argue that strike drone technology is going to enfranchise peace,” Allison admitted. “But I can argue that the movement of developmental technology, from governments into the open source community, and the access to technology — from solely governments into the open source community — is a tremendous advantage for people like you, Nel, to be able to look at the satellite imagery that was only available for the militaries 10 years ago. It puts agency in your hands, it puts agency in the hands of your countrymen that wasn’t there in 2005.”

Nel wondered who the “we” is in Allison’s view — who gets to decide how the technology is used, and who truly has access to it, even when it is technically open source? Each of the speakers agreed that technology will only become more ethical when a wide cross-section of the world’s citizens can actively participate in shaping and molding it. Allison used the phrase “Wild West” more than once to describe the current phase we’re in with the technology that’s emerged in the past 15 years. “Almost everything we’ve ever done — transportation systems, communications systems — have had this sort of phase, this nascent phase, where it was ungovernable,” Allison said. “And it was often up to governments to regulate that. Is it better that it’s now up to governments to regulate new technology? Or should this be in the hands of us? So I do think that transparency does confer an advantage.” Ariel pushed back, pointing out that the kinds of technologies we’re discussing are quite different from the mechanical nature of a car. “The risks associated with technology are greater than they were before,” she said. “These new technologies have the potential to be more harmful than, say, a car accident.”

“The fact that I know what’s happening in your world and you know what’s happening in my world is going to be better for us all.”

—Allison Puccioni

Govinda Clayton, a senior researcher in peace processes within the Center for Security Studies at ETH Zurich, and the debate’s bridge-builder and “connector,” focused on moving away from specific positions and encouraged the experts to speak to each other’s understandings. The speakers easily agreed on the why — why regulation is needed to improve the ethical use of new technologies — but the how proved more difficult.

Audience members asked engaging questions and voted by a show of cards on which speaker they agreed with. Subbu drew the most agreement, both after the three opening statements and again toward the end of the debate.

Subbu pointed to the economic factors underpinning technology and its ability to be used for good or bad. “One of the things we’ve got to look at when we talk about technology is the funding system that funds entrepreneurship in these situations — it’s called capitalism. But it’s become market fundamentalism, it’s become investment for exponential growth for technological solutions at any cost.” It’s not solely the responsibility of the people building and using new forms of technology to come up with solutions and regulations, Subbu said, but also the responsibility of the business side. Allison agreed: “There are thousands of examples where capitalist-governed technology is a terrible idea in the short term. I do think in the long term we have to open up the field for a lot of different people to understand technology.”

While Ariel voiced the most caution about how technology is used for harm — she pointed to autonomous weapons deployed along contentious borders in Gaza and between North and South Korea — she was optimistic about technology’s potential to bring people together. Subbu reminded the audience that making decisions about right and wrong takes time. Technology moves fast, but ethics is slower. “People come together and discuss right and wrong from multiple types of models and standards — fairness, justice, virtues, rights — and it’s not easy to say ‘use technology in that service’ unless you accommodate conversations that are deliberative and participative and bring lots of people with different histories into a conversation and then build out what the regulation will be,” Subbu said. “That is hard to do, but if it is designed well, then what you aspire to is possible.”

“Ethics is deliberative.”

—Subbu Vincent

The debate ended as constructively as it began: as a conversation, not a contest, and as an example of how deliberative debates can reveal the intersections of different philosophies. The debate continues right now at @DohaDebates with the hashtag #DearWorld. We’ll see you there, and we’ll see you next year for more Doha Debates in 2020.

Watch the full debate (1:07:25): Can technology unlock world peace?