Brief summary
We often describe our high-tech and digitally mediated world as "complex," but we rarely spend much time considering how that complexity can be cleverly deployed as a means of duping or manipulating us. However, trends like NFTs have brought this into clearer view. This is not to say it's a novel phenomenon — from dark patterns in UX design to pages and pages of end-user license agreements (EULAs), leveraging complexity for nefarious ends has long been an unsavory aspect of the technology industry.

In this episode of the Technology Podcast, Mike Mason and Neal Ford are joined by ºÚÁÏÃÅ designer Kate Linton and ºÚÁÏÃÅ North America Head of Legal Jeremy Gordon to discuss what they describe as "the weaponization of complexity." Together, they grapple with the cynicism at the center of such activities and discuss some of the ways we can tackle it.
Episode transcript
[Music]
Mike Mason: Hello and welcome to the ºÚÁÏÃÅ Technology Podcast. I'm one of your regular hosts, Mike Mason, and I'm joined today by another co-host, Neal Ford. Hi, Neal.

Neal Ford: Hello, everyone. Welcome back to the podcast.
Mike: Today is a bit of an unusual podcast. The topic is going to be the weaponization of complexity, and we have a couple of guests for you today who are maybe not the usual guests that you would have on the ºÚÁÏÃÅ Technology Podcast. First of all, we have Jeremy Gordon, who's a lawyer. Jeremy, why don't you tell us what you do with ºÚÁÏÃÅ?

Jeremy Gordon: Sure. Like you said, I'm Jeremy Gordon. I'm head of legal for ºÚÁÏÃÅ North America, mainly focused on client-facing things, and I sit on our North America leadership team. What I do is influence, negotiate, teach and serve ºÚÁÏÃÅ' mission to create extraordinary impact on the world through culture and technology excellence.

Mike: Awesome. Thanks, Jeremy. We're also joined by Kate Linton, who is a designer. Kate.

Kate Linton: Hi, Mike and Neal. Thank you for having me. Yes, I'm a designer. I lead our global design practice. I'm joining this podcast from Sydney. I spend a lot of time thinking and talking about how we put a human lens on technology.
Mike: Awesome. Thanks very much, Kate. Since we do have a lawyer on the podcast today, we should probably make a little disclaimer: we are going to be talking about some things that involve legal stuff, but we are not attempting to give legal advice. We are coming to you today speaking as individuals with our own opinions, which are not necessarily the opinions of ºÚÁÏÃÅ. Jeremy, is that a reasonable disclaimer, or would you add anything to that?
Jeremy: I wouldn't add anything, I think that's perfect. We're not your lawyers and we're not speaking for ºÚÁÏÃÅ right now. We're having a fun conversation about an interesting topic.
Mike: Neal, why don't you introduce the topic, weaponization of complexity? That's a big one.
Neal: Indeed. This is a great example of one of those topics that comes up during our Doppler meetings when we're trying to put together the Technology Radar. Our most recent meeting was right around the time when there was a lot of news about NFTs, and the hype around them had bubbled up and burst. We originally had some conversation about this and thought about creating a theme, but it wasn't appropriate for a theme. It was much more appropriate for a podcast and voila — here we are.

It was the realization that NFTs seemed to be the latest manifestation of building something really complicated to hide the fact that you are running a scam or doing something nefarious behind the scenes. It just seems like the latest great example of that. I know Jeremy has a few opinions about these kinds of things — [laughs]

Jeremy: When you brought up NFTs, I went to crypto in general, and how, for me, it's essentially the platformization of being able to commit financial crimes. I think of crypto as a greater fool scam: it's a worthless asset that only has exchange value, no use value. Whatever I paid for it, I can only benefit financially if I find someone else who will pay me more later.

That's the baseline, but then within that, there are multiple additional scams, and they run the gamut from plain old fraud, to putting a virus as a payload in an NFT, to a pump and dump, to a rug pull. You need a glossary just to know what some of these terms mean, but it's like a Russian nesting doll of financial crime, is how I describe it.

Neal: It's a great example of our premise here: it's so complicated that most people don't understand what an NFT is, because it's based on a blockchain, and at some point — what's the Arthur C. Clarke quote? — any sufficiently advanced technology is indistinguishable from magic. At some point this just becomes magic, and it's easy to scam people with magic.

Mike: I think within the crypto space as well, when pushed on some of this stuff, crypto proponents actually do say, "you obviously don't understand it," which is very close to "have fun staying poor." It's always those two phrases: you obviously don't understand this, and the other one.

Jeremy: That's the standard response: you don't understand this, and that's why you won't make any money off of it. And they're not wrong, actually, but what they mean, or what they ought to mean, is that you don't understand the scam, and therefore you won't make any money off it. If you understood the scam, you'd get in early, knowing that it's a greater fool scam, and that the only people who can really profit off of it are the people in at the beginning.

Again, it's one of those kinds of scams where you could be on either the front end or the back end. You could be the person who is buying early to then pump and dump it, or you could just be told you're the person who's buying in early when really you're the person it's getting dumped on, and you never really know. People hear cryptography, and they hear blockchain, and then they hear that mashed up with some utopian vision of a technologically perfect future, and they think: well, of course I want this, and of course I could spend $500 to get some kind of hideous ape NFT, because it's going to be worth a million dollars at some point.
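
[Editor's note: to make the arithmetic of a greater fool scam concrete, here is a minimal sketch in TypeScript. All the numbers are invented for illustration; the point is only that each buyer's gain has to come from a later buyer, so the last holder absorbs everyone else's profits.]

```typescript
// Hypothetical "greater fool" chain: each buyer can only profit by selling
// to a later buyer at a higher price. Whoever holds the asset when the
// chain stops has no one left to sell to.
function greaterFoolProfits(prices: number[]): number[] {
  return prices.map((paid, i) => {
    const soldFor = prices[i + 1] ?? 0; // no greater fool left: worth ~0
    return soldFor - paid;
  });
}

// Invented prices paid by four successive buyers of the same asset:
console.log(greaterFoolProfits([100, 500, 2000, 10000]));
// -> [400, 1500, 8000, -10000]
```

[Counting the original seller, the chain nets to zero: the early entrants' gains are funded entirely by whoever is left holding the asset, which is the sense in which only the people in at the beginning can profit.]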
Neal: One of the examples that we cited during our meeting was what we're calling the Dune thing; I did a little research for this podcast. This was the NFT story where a crypto group bought a copy of a rare book, one of only 20 in existence, and they were going to slice it up, create NFT tokens of the pages, then burn the physical copy to create some sort of alternative thing and sell NFTs of the actual pages. Then they were notified that all they had actually purchased was the NFT.

They did not own a copyright; they could not publish any of the contents of the book they had purchased; they had basically just bought a rare book. That's all they had: a copy of a book, and an official pointer on a blockchain somewhere that said they were the owners of this book. They were mad about that.

Kate: Can I just point out the language that we often use with complex technology like this? Take "NFT": an acronym that the average person probably doesn't understand. Acronyms are widely used in technology, and they're alienating; they're not user-friendly language. Most people don't understand what a non-fungible token is, and when it comes to JPEGs on the internet, they really don't understand where the intrinsic value is.

And the truth is, sadly, we've discovered there is no intrinsic value. It's a JPEG on the internet; it can be copied. Sure, it's on the blockchain, we can see who owns it, but owning it has very little value in reality. We've seen the value diminish even though buyers bought into the promise that this would be an investment that would go up over time. Part of this complexity is the language that we use. We all work in technology, this is the language we talk day-to-day, but the average person really doesn't understand it.

Jeremy: I love that. Think about what "ownership" is. The law understands ownership to be a bundle of rights: the right to exclusive use, the right to exclude others, the right to do things with what you own. With intellectual property, there are a number of other rights that come with it. We talk about owning an NFT, but what do you own? If it's a picture, you don't have the right to prevent others from copying it or using it or modifying it. Do you really own anything?

It seems to me like what you've done is paid someone — you've given them something ostensibly of value, or cryptocurrency — in exchange for an entry on a ledger that can't be changed, unless some other stuff happens, in which case that entry doesn't get changed but a new entry comes along, and then I guess it's different.
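
[Editor's note: Jeremy's "entry on a ledger" framing can be sketched in a few lines. This is a deliberately naive model, not how any real blockchain works, and the token and party names are invented. It shows that what the record conveys is a pointer, not the legal bundle of rights.]

```typescript
// A naive append-only ledger: entries are never edited, and "ownership"
// is simply whatever the latest entry for a token says.
interface TransferEntry {
  tokenId: string;
  from: string;
  to: string;
}

const ledger: TransferEntry[] = [];

function transfer(tokenId: string, from: string, to: string): void {
  ledger.push({ tokenId, from, to }); // old entries stay; a new one is appended
}

function ownerOf(tokenId: string): string | undefined {
  // Replay the ledger; the most recent transfer wins.
  return ledger.filter(e => e.tokenId === tokenId).at(-1)?.to;
}

transfer("rare-book-token", "minter", "alice");
transfer("rare-book-token", "alice", "bob");
console.log(ownerOf("rare-book-token")); // "bob"
```

[Nothing in that record carries the right to copy, publish or exclude; it is just the pointer described above.]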
Mike: I think the language thing is interesting, because we could sit here dunking on crypto all day long, and that's not the whole point of the podcast. One of the things that we started to realize, though, is that harnessing complexity to do slightly nefarious things is not new, but it's happening in very specific ways in the technology world at the moment.

One of the things that we thought of fairly early on was a language thing: the language of end-user license agreements and terms of service, and I think that's something that our listeners will be familiar with. You've got these big reams of legalese that you need to scroll through and click "yes, I've read all this stuff and I understand these things," and it struck us that this was another weaponization of complexity. I thought we could talk about that.

Jeremy: Yes. I would ask Kate to start with the design of EULAs and terms of service. I find that to be fascinating.

Kate: To begin with, I would always recommend that a designer and a lawyer work together when constructing an end-user contractual agreement. It needs to be designed with the user's perspective in mind, and it needs to be a collaborative process. The reality we all know is that customers don't read large amounts of information online, and online is usually where they encounter terms and conditions and end-user agreements.

We know that information needs to be chunked down and summarized. We need bullet points, we need plain language, and we need to test this out with users. We need to test that they understand, that they comprehend, that they're going to spend the time to read them. We continually fall into the trap of hiding behind these contracts because, when companies create them, it's to protect the company. It's not to protect the rights of the end user.

When we think about financial products, and really quite complex products like insurance, users don't understand what they're buying. They don't understand their rights. How do we work together to make sure they really understand what they're purchasing, upfront, in plain language? There are a lot of simple design patterns you can use to chunk down the information: creating a really good index at the start so people can go to the sections they find most interesting and relevant, making it user-friendly, and making it work online.

That's the first thing. If you want to see examples of who's doing this really well, they actually can't be found in financial services; you need to go to Facebook or Google. They're actually pretty good at this. They've managed to present their privacy information in fairly plain language, beautifully presented with illustrations and chunked down into simple bullet points. They do present a very human face to their contractual terms and conditions. Of course, there's a lot in there that they don't tell customers.
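
[Editor's note: one way to read Kate's advice is as a content structure. A minimal sketch follows, with every field name and clause invented for illustration: each section pairs the legal text with a plain-language summary, and the summaries double as the scannable index she describes.]

```typescript
// Hypothetical shape for a "chunked" agreement: every clause carries a
// short plain-language summary shown up front; the full text sits behind it.
interface AgreementSection {
  title: string;
  plainSummary: string; // short, and tested with users for comprehension
  legalText: string;    // the full clause, available on demand
}

const agreement: AgreementSection[] = [
  {
    title: "Your data",
    plainSummary: "We collect your email address and usage statistics, nothing else.",
    legalText: "The Provider shall collect and process the following categories...",
  },
  {
    title: "Cancelling",
    plainSummary: "Cancel any time; you owe nothing after the current month.",
    legalText: "Termination may be effected by either party upon written notice...",
  },
];

// Render the index: summaries first, details one step away.
for (const { title, plainSummary } of agreement) {
  console.log(`${title}: ${plainSummary}`);
}
```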
Jeremy: When I think about EULAs, or accepting cookies, things like that, there's a saying that I love: people do what they feel like and then come up with a reason for it afterwards. Typically, the terms of service, the EULA or the cookie button is presented in the way of something that you want to get to. You want to access the app, or the website, or you want to do something, and you have to click this thing and read through all this language first. You just click yes, because you want to get to the thing that you want.

Then afterwards, you rationalize it, "Well, I don't really care about that. I don't have anything to hide. I'm okay with — they're only going to use my information to provide me with the service, which is what I want anyway, so it's okay." And then I don't worry about those things until later when the hidden… it's not really hidden, but the things that have been obfuscated come back and can cause an outcome that I don't like.
Mike: Well, it's funny, I bought a new vehicle a couple of summers ago. I actually bought a Ford Transit van (hashtag van life) that I'm trying to convert. When I bought the thing, I'm at the dealership, and they hand me a stack of papers and say, "sign here," and I sit down and start reading. They were very confused that I was reading the paperwork. I'm like, "This is not a small purchase. I've got no idea what any of this stuff says. I'm not just going to sign at the bottom." I'm not a lawyer, I'm not a vehicle purchase expert, but I thought I would at least skim-read the 12-page document before signing.

Jeremy: I agree. I am a lawyer and I bought a car a few years ago and about halfway through these 7,000 pages of documents they were asking me to sign, I just stopped caring because I was like, "Look, I know what's going to happen. If there's a problem then I'm going to have some rights or I'm not." At a certain point, it's just like, "Look, yes, give me the extended warranty too, it's basically an insurance program. I'm probably going to run into something."
Neal: That is the payload of the weaponization of complexity right there. [Laughs]
Jeremy: Absolutely, yes. I probably agreed to all manner of — I probably can't sue them; there's probably a binding arbitration clause; I probably can't join any class action lawsuits; they probably get to use my location data for things.
Neal: That's another great example: arbitration versus the ability to really go and sue somebody.

Jeremy: Well, yes. As a service provider, the company is acting from a position of great power, with lots of resources. But what they're providing, regardless of how much profit they're making (that's irrelevant here), is low-dollar transactions at high volume, for hundreds of thousands or millions of people.

The cost of litigation in the United States, and in most places, is extremely high. Taking a lawsuit all the way through could cost a million dollars, and binding arbitration is not that much less expensive, unless you do it all the time and you're set up to do it all the time, or better yet, you sell something that's not worth arbitrating over.

For example, if I'm a telco and I sell a two-year contract for $50 a month, that's a fair amount of money, but the cost for me to arbitrate if I don't perform is fine; okay, I would pay it. No reasonable consumer, though, would spend $100,000 over a $2,000 cell phone contract. So part of what this is, is using the common law doctrine of freedom to contract to lock people into a resolution that is meaningless to them. It also stops them from joining together, because everyone has to agree to arbitrate individually with the company: no class action lawsuits.

Kate: What I find really interesting from a design perspective, when it comes to reading these warranties (the fine print, so to speak), is that there's this common convention of putting them in all caps, in really tight blocks of paragraphs with no paragraph breaks and no line spacing. Jamming it up really tight, as if we deliberately don't want that block of text to be legible to the user. That's a dark pattern — we know how to make content legible. It's so easy to make this information scannable, and yet when you look at warranties, that's a really common convention: all caps, no paragraph breaks. Why is that?

Jeremy: Interestingly, it comes out of old court decisions saying that people wouldn't know to look at a warranty disclaimer if it was set in the same type as all the rest of the contract. You were supposed to put it in all caps to make it stand out: this is really important, you need to make sure that you read this. Because judges aren't designers. Some judge somewhere said, "Well, of course they didn't read that. It wasn't conspicuous, so the rule is to make it conspicuous: put it in all caps."

Neal: Well, this brings us to a fascinating place that I didn't even know existed before we started talking about this in our meeting, which Kate just mentioned: this idea of dark patterns. There are catalogs of nefarious ways to do bad things to you, from a user interface perspective or a workflow perspective. There's a great website we'll put in the show links, called deceptive.design. Can you share a few of those dark patterns with us, Kate? I think you're the one who knows the most about this.

Kate: Look, if you have a look at that site, there are 12 common dark patterns that they identify, and I think there are actually more out there than what's on the website. These dark patterns are design decisions made to steer end users along a journey that is in the best interests of the business, not necessarily the end user: to complete a transaction, to encourage the user to accept the cookies, to complete the purchase in their shopping cart. There are so many commonly understood ways that we can do that as designers. Just the design of buttons, for example, can be a dark pattern.

So, for example, on cookie banners, "accept all cookies" is generally the really visible button. That suggests there's a right action that you should take: just go for the big colorful button. Otherwise, there's this annoying little link to read about your options. Those are never presented as equally valid journeys. That's a dark pattern right there. If you go to the ºÚÁÏÃÅ website and look at our cookie banner, we give three options, and each of those buttons looks identical in terms of the design, because they're all equally valid.

There is no right or wrong decision for the user in this situation; the best decision is to actually understand how the cookies will use your data. That's just one dark pattern. Another one, which isn't even in that list and which I personally love, is infinite scroll. Infinite scroll is a way to capture users' attention and keep them on the page: with infinite scroll, we can keep your attention forever. That's a dark pattern that advertisers take advantage of. We know that with infinite scroll, you can keep a user on the page 20% longer than they would normally choose to be.
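
[Editor's note: the button example translates directly into markup. Below is a hypothetical sketch, not the actual ºÚÁÏÃÅ implementation, and the labels are invented: three equally valid choices get three identically styled buttons, where the dark-pattern variant would make "accept all" a large primary button and bury the others behind a small link.]

```typescript
// Sketch of an even-handed cookie banner: every choice gets the same
// element and the same class, hence the same visual weight.
const choices = [
  "Accept all cookies",
  "Reject non-essential cookies",
  "Manage preferences",
];

const banner = document.createElement("div");
banner.setAttribute("role", "dialog");

for (const label of choices) {
  const button = document.createElement("button");
  button.textContent = label;
  button.className = "cookie-choice"; // identical styling for all three
  banner.appendChild(button);
}

document.body.appendChild(banner);
```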
Jeremy: I have a question about this concept. I try to use Ghostery or Privacy Badger (browser plugins from the EFF and other organizations) to keep track of the cookies. At first it worked great, but over time, what I started to notice is that more and more websites simply don't work. It's not the ones that say, "Oh, hey, I've noticed you've got an ad blocker, could you please turn it off because we make money from ads?" It's other websites that I pay for, or I'm a member of and I log into, and they just won't work if I don't turn on a bunch of tracking cookies. Is that not a dark pattern?

Neal: That's the sneaky thing, because it might be. It really depends on how well the site is implemented, because the original intent of cookies was to add state to something that's stateless: a web connection is stateless, and you can add that state by dropping cookies. Then sites started sharing cookies, and then people found nefarious uses for them. Now the question is: is this site just poorly designed and reliant on cookies to work correctly, or is it in fact refusing to work unless it can snoop on you?
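
[Editor's note: Neal's point about state is easy to see at the protocol level. Here is a minimal sketch using Node's built-in http module; the session logic is invented for illustration. HTTP forgets you between requests, so the server hands the browser a session id in a Set-Cookie header, and the browser repeating that id back is what creates the state.]

```typescript
// HTTP is stateless: the server forgets you between requests. A cookie
// carries the state: the server issues a session id once, and the browser
// echoes it back on every subsequent request.
import { createServer } from "node:http";
import { randomUUID } from "node:crypto";

const visits = new Map<string, number>(); // server-side state, keyed by cookie

createServer((req, res) => {
  const match = /sessionId=([\w-]+)/.exec(req.headers.cookie ?? "");
  const sessionId = match?.[1] ?? randomUUID();

  const count = (visits.get(sessionId) ?? 0) + 1;
  visits.set(sessionId, count);

  res.setHeader("Set-Cookie", `sessionId=${sessionId}; HttpOnly`);
  res.end(`This is request ${count} of your session\n`);
}).listen(3000);
```

[That same mechanism, shared across sites, is what got repurposed for the tracking Jeremy describes.]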
Jeremy: The disturbing part is, it's my brokerage site. It's got like 50 cookies, and it doesn't work right if you don't enable them. I think they got bought by someone recently so…
Mike: That to me is — I don't know what the exact phrase is — never attribute to malice what can be explained by incompetence. In that particular case, I'd go for incompetence rather than malice, but it's a good question.

Neal: It's a good segue into the why: what are they hiding with this complexity? Another phrase that came out of this conversation was surveillance capitalism, which is something that is very popular with social media sites; they do a lot of hiding of what they're actually gathering about you through various means.

Kate: It's interesting: surveillance capitalism sounds like such a pejorative term, and companies like Facebook and Google would never present their service to the customer in that light. But at the end of the day, Facebook and Google are advertising companies; that's how they make their revenue. They provide a free service in which, effectively, you, the user of that service, are the product. Your data is being productized. It's used to sell to advertisers, to serve up ads that are highly personalized and customized based on the evidence of your online behavior.

And that's not just your behavior while you're using the app; they're tracking your behavior across your whole online experience: the websites that you go to, the search terms that you use. The average user just doesn't understand this; it's a really complex concept. But that's how they've chosen to make money. Recently, Facebook was very angry with Apple, which gave customers the option to turn off tracking on their mobile phones, and declared that this could cost them $10 billion in revenue every year. $10 billion! And that's just from the people who turn off tracking on their Apple devices, a small percentage of their advertising revenue.

Jeremy: What I think about with the complexity here is all of the things that you don't see, that we don't see. You can trace it through the evolution of web 2.0 — this isn't something that I came up with, it's a common narrative — where back in the old days, a website was basically static text that you just put up there. Then it evolved into something much, much more than that, to the point where regular people just can't go and build a website that does the things they want to do: share photos and cat memes, message their friends, send a payment, split a check, whatever. But we have these platforms that do.

When you're interacting with one of them, an astonishingly complex set of things has to happen for it to feel easy: I just sent the money, it just happened. Underneath that, the companies making money off it are happy for you not to experience that complexity, and ultimately, we're happy not to experience it. So many things that we take for granted just wouldn't happen without it, but we are also materially disadvantaged as a result of all of that complexity being hidden from us.

The prices that we see are not the same for everybody, because we don't all have the same disposable income, and for other reasons as well that I won't speculate about; we live in a deeply broken, globally connected society, and people do things for all kinds of reasons. That complexity you can't see: to me, that's the weaponization of it. Maybe it's a few cents here or a few dollars there that I pay extra, or that you pay extra, because of the price the platform presented, but all that money collectively turns into billions and billions of dollars of profits for the platform owners, whoever they might be. I think this brings it full circle, right?

It can feel like, or look like, some grand conspiracy, with a tight little cabal of people all working together to rip us off, but they don't have to be working together. They can just be normally self-interested, and they all have the same end goal, similar motivations. Powerful people don't need a conspiracy to want to maintain their power. I think that's a key theme that runs throughout this idea of the weaponization of complexity: humans are wired, I think, to detect patterns, and we detect patterns in behavior.

Neal: Yes. Sharks don't collaborate to get the emergent behavior that a group of sharks ends up showing. Rich Hickey famously makes this distinction between simple and easy. A lot of the time, when you have something that's complex, you can make it easy, but to do so, you insert yourself as a middle person, and that's where you can start really weaponizing that middle. Inserting yourself into a negotiation means that you start getting leverage in it, and there you go: that's the lever point that a lot of companies use to insert themselves.

Jeremy: When we think about weaponization, or weapons, typically there's a goal. Someone's using it to do something, and it's not necessarily just to make money. We see this across misinformation and disinformation campaigns designed to manipulate people's actions, to get people to do things: to go out and vote, say, or not vote. There's a whole range of activities in society that people are spurred to do. In the past two or three years we've seen a number of instances, in different countries, where capital cities have been overrun by angry mobs in the wake of a contested election. It's not just limited to the United States; we can see this kind of thing happening all over.

Kate: It's been going on for a long time. Think back to Cambridge Analytica and its involvement in Brexit and in the Trump campaign. This is an organization that harvested data on millions and millions of users, created profiles, and sent personalized, targeted messages to influence their decisions in a democratic process, using misinformation to influence those decisions. We know that this happens, and it is true weaponization.

Neal: Well, before everyone gets too depressed and turns off our podcast, let's talk about ways that you can spot this and avoid being taken advantage of by the weaponization of complexity. I'll start with the first and most obvious piece of advice: if it sounds too good to be true, it probably is [laughs]. That applies to cryptocurrency and lots of other things in the world, not just technology assets. But there are other ways you can spot this as well.

Jeremy: I have a less optimistic opinion on it. It's not that I don't think we can overcome it; it's that there are so many complications. I was thinking about this this morning: if it sounds too good to be true, it probably is, but there's an issue with that these days. There's so much misinformation regularly shared as fact that "too good to be true" just doesn't mean what it used to. You can look around, and our environment is full of lies and liars. There are books written about it: the lying liars and the lies that they tell.

We have to become better at critically analyzing not just the thing presented to us, but really all of the information in our environment, to determine what "too good to be true" really means. And I think people can do that. We're really good at it, actually, if we slow down and think slowly about it.

Neal: Well, I think there are parallels in history. Back when the printing press first came out, pamphleteering was extremely popular, and a lot of people believed everything that was printed on paper because, "Hey, if it's on paper, it carries weight." Then people slowly learned to distinguish actual journalism from less reliable material. We haven't achieved that yet in the digital world, because it's so much easier to hide the differences.

This is part of this weaponization: you can make videos on YouTube of a perfectly reasonable-sounding person, in a nice studio, arguing with a straight face that the world is indeed flat and that every airline pilot is in on a vast conspiracy, and there is a significant portion of people who will believe it, because it's spoken in a soft voice in a nice studio. The ability to make things look good is a little bit of this weaponization of complexity too.

Kate: I think there are things that individuals can do to protect themselves. I don't think we can expect your average user to understand or spot surveillance capitalism or to be able to understand all of the terms and conditions before they purchase a product. But there's a lot that users can do to protect themselves. One thing that I encourage people to do is just turn off their notifications so that they're not continually getting interrupted by Facebook and Instagram when they're at work.
When you're at work, you don't need to know that someone in New York just had a terrific blueberry bagel. That's not important. You might pick up your phone if it's a loved one ringing you, but you don't need those notifications in your life. Just turn them off and turn off tracking on your mobile phone and just maybe engineer your life so that you're spending less time on the advertising platforms that are productizing your data. Just being aware of how your data gets used. These little things are within our control. I always encourage people to do that.
A lot of people don't want to take themselves off Facebook altogether, because it does provide a meaningful connection to their friends and families. One day, I would love to think that Facebook would offer a premium service that was ad-free, and that their revenue model could change towards a user-pays model for that service. We've seen that happen elsewhere; LinkedIn, for example, offers premium services, and I think there's an opportunity for these big advertising businesses to change their business model and diversify. In the meantime, I think it's up to individuals to educate themselves about their own privacy and take small steps to protect themselves.

Neal: It's even worse in places like India, where Facebook and the internet are essentially the same thing; particularly if you're doing any kind of e-commerce, it's always on the Facebook platform. Conversely, they have many, many fewer people looking at hate speech and other bad things in India than they do in the West, because they're more heavily regulated in the West about those things. Regulation is supposed to be our proxy for individuals understanding really complex things, and in fact, a lot of financial scams are illegal. But in the US, Section 230 has really crippled the government's ability to address this. For those of you who are not familiar, it's a piece of legislation that basically said a social media site is not responsible for the content on its platform, so they can basically go wild.

Now, maybe something like GDPR is finally going to hold them to account. What they're doing is interesting, because these companies are so big that a billion-dollar fine is a rounding error, but Europe has started charging them a percentage of global income as a fine. I think that's a true disincentive, because the bigger you get, the bigger the fine gets, and that starts really stinging. And if you have to change your behavior in Europe, a lot of the time you have to change your behavior everywhere. Maybe the proxies who are supposed to be protecting us around these things are finally coming around to seeing some of this weaponization of complexity.
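
[Editor's note: the difference Neal describes is easy to show with invented numbers. For context, GDPR's top penalty tier is capped at a percentage of annual global turnover, up to 4% for the most serious infringements; everything else in this sketch is hypothetical.]

```typescript
// A flat fine stays constant as a company grows; a fine keyed to a share
// of global revenue grows with the company. All figures are invented.
const flatFineUsd = 1e9; // the "rounding error" billion-dollar fine

function revenueBasedFine(globalRevenueUsd: number, rate = 0.04): number {
  return globalRevenueUsd * rate; // e.g. an up-to-4%-of-turnover ceiling
}

for (const revenue of [10e9, 100e9, 500e9]) {
  console.log(
    `revenue $${revenue / 1e9}B: flat $${flatFineUsd / 1e9}B, ` +
      `revenue-based $${revenueBasedFine(revenue) / 1e9}B`,
  );
}
// revenue $10B: flat $1B, revenue-based $0.4B
// revenue $100B: flat $1B, revenue-based $4B
// revenue $500B: flat $1B, revenue-based $20B
```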
Jeremy: I think there's a great metaphor there for me. There's the individual things we can do, and then there's the things that we need to do as citizens in a society. The metaphor I think about is, if you saw someone get hurt at an intersection, maybe they were on their bike, and they got hit by a car, you would never just run over and stand above them and scream at them, "I'm calling the local politician to make sure we get a stoplight here." You would go and offer them assistance immediately because there's two sides of that problem. There's that individual responsibility that we can all take, but then there's a systemic issue. Both of those things are what ought to happen.
For me, I have a hard time believing that companies can be relied on to do anything to change this, because of the way the system is set up. It's not that they want to do bad things, or that they want to weaponize complexity; it's a system in which they have to compete with each other, and when the competitors are doing it, you have to follow suit. A group of people who are organized to make money are always going to come into a situation where the choice is between making money and doing the right thing, however you define that.

We should protect them from having to make that decision by acting as citizens to make those kinds of behaviors illegal or criminal. We do it all the time: it's a crime to rob someone, so maybe it should also be illegal to collect someone's data without their explicit permission every single time, and that is the law in Canada. So for me, there are those two elements, two prongs of action.

We do need to take some personal responsibility, understand the world, think critically, change our behavior to mitigate those dark patterns, but also, we have to come together as citizens to get the one entity that has a monopoly on the use of force, the government, to go and force us all to behave properly, maybe to reduce some of this complexity, maybe to say that really unfair bargaining positions — those contracts just won't be enforceable.
We already have these doctrines; they just haven't been ported over — to use a technology word — from analog to digital yet. There's this thing called a contract of adhesion. You can't come across somebody who's stuck in a flooded river and say, "I'll help you, I'll help you. It's going to be $5,000 and then I'll help you. You agree, right?" They say, "Of course, I agree." You pull them out, and then you can't demand $5,000 for that, right?!

Kate: I think we know that citizen action can actually motivate government regulation. We've seen that in the past. We've seen citizens get organized. It's resulted in marriage equality in many countries, for example. We're seeing it in terms of the environment. Citizens who listen to the science get organized and motivate their governments to make change. We know that it can work. It's more of a long-term solution. The short-term thing is just as individuals, you need to also protect your own privacy. Think about your children and not exposing them too much to those advertising platforms and just making small changes in our own life.
Jeremy: There's another long-term trend that is helpful in this specific context. If you look at intelligence tests that were administered 100 years ago, the average scores are much lower than today's. What intelligence tests generally measure is people's ability to synthesize and deal with complexity, and as the world becomes more and more complex, people on average get better at dealing with it. That's a hopeful trend: as things get more complex, we do get better at dealing with that complexity. Maybe in another 100 years, we'll be that much more intelligent, that much more capable of handling this complexity, and we won't necessarily see the same outcomes.

Neal: The only problem is that the nefarious actors will be that much more sophisticated too… [Laughter]
Mike: It'll be our AI overlords by that point. Well, on that note, I think we should wrap. I'd like to say thank you very much to Kate Linton and Jeremy Gordon, our guests, and of course to Neal my co-host. Thank you, all.
Neal: Great, pleasure. Thanks very much.
Mike: Thank you.
Kate: Thank you.
Mike: This is wonderful.
Kate: It's been fun.
[END OF AUDIO]