

Responsible tech in the age of hostile tech

From multi-million-dollar ransomware payouts to data breaches that expose hundreds of millions of users’ private information, the headlines surrounding hostile tech are certainly eye-catching. But they don’t tell the whole story.


The problem is that there’s far more to hostile tech than deliberate, targeted attempts to disrupt your systems and steal your data. Hacking, ransomware attacks, data breaches, and DDoS attacks dominate the hostile tech narrative. They can undoubtedly cause severe reputational damage, but in practice they’re just one piece of a much bigger picture.


Recognizing all forms of tech hostility


It’s important not to get too alarmed by the term hostile tech. We’re not just talking about the kind of direct attacks our technology assets suffer every day. We need to think about hostility far more broadly.


Hostile tech doesn’t just encompass things that are illegal. We’re not even talking about things that are necessarily malicious. For example, some people are perfectly happy to be surveilled online if it means they get more personalized ads. Others will go to incredible lengths to prevent any kind of digital surveillance from tracking them, because they see surveillance as unethical.


Hostile tech is sometimes completely unintentional. Nobody set out, for example, to build image recognition software that performs far less accurately on the faces of Black women. They didn’t maliciously set out to make a biased product; they simply trained it on an unrepresentative data set.
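One practical countermeasure is to report model performance per demographic group rather than as a single aggregate number. The sketch below illustrates the idea only; the model object, the sample format, and the group labels are all hypothetical placeholders, not a reference implementation.

```python
# A minimal sketch of a disaggregated evaluation: instead of one overall
# accuracy figure, break results down by group so that uneven performance
# is visible before release. Everything here is illustrative.
from collections import defaultdict

def accuracy_by_group(model, samples):
    """samples: iterable of (features, true_label, group) tuples."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for features, true_label, group in samples:
        total[group] += 1
        if model.predict(features) == true_label:
            correct[group] += 1
    return {group: correct[group] / total[group] for group in total}

# An aggregate score can look healthy while hiding a gap like:
# {'lighter-skinned men': 0.99, 'darker-skinned women': 0.65}
```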


These aren’t malicious projects — and they’re certainly not direct attempts to undermine or damage the brands deploying them. In many cases, they aren’t even a failure of design or planning. Overwhelmingly, hostile tech emerges because teams haven’t fully considered how a tech decision could have different impacts across all the potential stakeholder groups.


And that’s a critical point. When we begin our software or technology projects, we typically have a specific stakeholder group in mind whose needs we’re trying to meet directly. What we don’t often think about is the impact of that product on other stakeholders.


Maybe there’s an environmental impact of your product that you haven’t considered. Training a single natural language processing model can carry the same CO2 emissions footprint as 125 round-trip flights between New York and Beijing, an impact few teams account for.
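Teams can at least estimate this impact before committing to a training run. Below is a back-of-the-envelope sketch using the common power × time × carbon-intensity approach; every figure in it is an illustrative assumption rather than a measured value.

```python
# Rough estimate of training-run emissions. All constants are assumptions
# chosen for illustration; substitute measured values for a real estimate.
NUM_GPUS = 8
GPU_POWER_KW = 0.3          # assumed average draw per GPU, in kilowatts
TRAINING_HOURS = 720        # assumed: roughly a month of continuous training
PUE = 1.5                   # assumed data-center power usage effectiveness
GRID_KG_CO2_PER_KWH = 0.4   # assumed carbon intensity of the local grid

energy_kwh = NUM_GPUS * GPU_POWER_KW * TRAINING_HOURS * PUE
emissions_kg = energy_kwh * GRID_KG_CO2_PER_KWH
print(f"{energy_kwh:,.0f} kWh of energy, roughly {emissions_kg:,.0f} kg CO2e")
# With these assumptions: 2,592 kWh and roughly 1,037 kg CO2e for one run.
```

Even a crude estimate like this makes the environmental cost visible at decision time, before the invisible stakeholder pays it.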


Maybe there’s an equity issue you hadn’t accounted for. For example, one of the things we saw with home schooling as we went into the pandemic was that households without good internet service simply couldn’t support two or three children attending online lessons while their parents also worked online. Many saw a wonderful revolution in digital education; they didn’t see those left behind by digital inequality.


Why now is the right time to embrace truly ethical tech


The digital inequalities exposed by the pandemic and the ongoing climate crisis are just two reasons why now is the time for organizations to acknowledge and address the hostile impacts of their technology decisions.


Technology is now deeply rooted in virtually every aspect of our lives. Medical decisions, credit decisions, probation and sentencing decisions: all of these have huge impacts on human lives, and all are now massively shaped by our technology choices.


The stakes are very real, and the impacts on the stakeholders we fail to account for when making these decisions are immense. That’s why it’s so important for organizations of all kinds to embrace a responsible tech mindset.


Defining responsible tech


Responsible tech — sometimes referred to as ethical tech or equitable tech — is an umbrella term that encompasses multiple notions, all centered around doing the right thing in and with technology. That could mean anything from taking steps to make an application more accessible, to implementing policies to help consistently deliver equitable tech experiences.


On paper, it’s a relatively simple concept, yet it remains clouded by misconceptions. I recently read a claim that responsibility is easy to define in areas like civil engineering, where your responsibilities are to ensure that buildings are stable, don’t collapse, and don’t otherwise harm the lives of citizens, with the implication that it’s somehow harder to define in software or technology.


Yes, we’re not bound by any kind of Hippocratic Oath like the medical profession. But we are often guilty of giving ourselves a little too much leeway when it comes to making ethical and responsible decisions with our technology. Take the Volkswagen emissions scandal, for example. As the company’s leadership acknowledged, the decision to implement software designed to alter vehicle emissions under test conditions was deeply flawed from an ethical perspective.


What steps can we take to be more responsible?


The biggest thing that businesses today need to do to reduce unintentional tech hostility and make responsible decisions is to explicitly think about the ‘invisible’ stakeholders that could be impacted by any given technology decision. That means considering:


  • The groups that products and services are tested with: are they truly reflective of the end user groups we anticipate will use the product? And are all stakeholder groups represented and given a voice in that process?

  • The quality and accuracy of the data sets used to power data-driven services: are they free from bias, and can they enable truly reflective and inclusive experiences for all?

  • Whether we’re designing with equality and ease of use in mind, and whether complex features or capabilities are coming at the cost of overall usability and accessibility.

  • Whether the decision introduces any kind of non-human hostility. For example, is it aligned with our sustainability goals, or is it likely to have a negative environmental impact?


I also like to encourage organizations to make an explicit statement about what they care about and what they want their technology to help achieve. As Cathy O’Neil, author of Weapons of Math Destruction, says, there are times when you have to trade off fairness and profit.


It’s up to you where you want to sit along that spectrum, but what’s important is to make your goals and intentions clear. I worked with one organization that developed a framework expressing their values and principles around the use of customer data, clearly laying out how they intend to operationalize that data and why those decisions were made. It took months to develop, but it made their intentions and ethical position completely clear, and easy to stay aligned with.


Reducing tech hostility is our collective responsibility


Hostile tech can take many forms and can easily creep into any technology decision. As technology decision-makers, it’s up to us to ask the right questions and consider how the tech we deploy will be used by everyone, and how it’s likely to impact their lives and experiences, so we can reduce this hostility over time.


Shifting to this responsible approach and mindset is relatively simple in theory, but it will take real dedication from organizations and professionals across our industry before meaningful results are seen.


Just as it’s our mandated responsibility to safeguard customer data from malicious threats, it’s our ethical responsibility to do what we can to prevent hostile tech experiences and build an equitable, accessible digital world for all.


If you’re interested in learning more about responsible tech, Thoughtworks’ Responsible Tech Playbook is a great place to start. Alternatively, you can take a look at our Perspectives article on ethical technology to find out more about what it really means to act ethically in the digital space.

