"Think about the websites, apps, or communications platforms you use most. What behavioral metric do you think they’re trying to maximize in their design of your attentional environment? I mean, what do you think is actually on the dashboards in their weekly product design meetings?
"Whatever metric you think they’re nudging you toward—how do you know? Wouldn’t you like to know? Why shouldn’t you know? Isn’t there an entire realm of transparency and corporate responsibility going undemanded here?
"I’ll give you a hint, though: it’s probably not any of the goals you have for yourself. Your goals are things like “spend more time with the kids,” “learn to play the zither,” “lose twenty pounds by summer,” “finish my degree,” etc. Your time is scarce, and you know it.
"Your technologies, on the other hand, are trying to maximize goals like “Time on Site,” “Number of Video Views,” “Number of Pageviews,” and so on. Hence clickbait, hence auto-playing videos, hence avalanches of notifications. Your time is scarce, and your technologies know it.
"But these design goals are petty and perverse. They don’t recognize our humanity because they don’t bother to ask about it in the first place. In fact, these goals often clash with the mission statements and marketing claims that technology companies craft for themselves.
"These petty and perverse goals exist largely because they serve the goals of advertising. Most advertising incentivizes design that optimizes for our attention rather than our intentions. (Where advertising does respect and support user intent, it’s arguable whether “advertising” is even the right thing to call it.) And because digital interfaces are far more malleable (by virtue of their basis in software) than “traditional” media such as TV and radio ever were, digital environments can be bent more fully to the design logic of advertising. Before software, advertising was always the exception to the rule—but now, in the digital world, advertising has become the rule.
"I often hear people say, “I use AdBlock, so the ads don’t affect me at all.” How head-smackingly wrong they are. (I know, because I used to say this myself.) If you use products and services whose fundamental design logic is rooted in maximizing advertising performance—that is to say, in getting you to spend as much of your precious time and attention using the product as possible—then even if you don’t see the ads, you still see the ad for the ad (i.e. the product itself). You still get design that exploits your non-rational psychological biases in ways that work against you. You still get the flypaper even if you don’t get the swatter. A product or service does not magically redesign itself around your goals just because you block it from reaching its own.
"So if you wanted to cast a vote against the attention economy, how would you do it?
"There is no paid version of Facebook. Most websites don’t give you the option to pay them directly. Meaningful governmental regulation is unlikely. And the “attention economy” can’t fix itself: players in the ecosystem don’t even measure the things they’d need to measure in order to monetize our intentions rather than our attention. Ultimately, the ethical challenge of the attention economy is not one of individual actors but rather the system as a whole (a perspective Luciano Floridi has termed “infraethics”).
"In reality, ad blockers are one of the few tools that we as users have if we want to push back against the perverse design logic that has cannibalized the soul of the Web…."