View from
The Center

A Millennial Technologist’s View on Jaron Lanier’s “Who Owns the Future?”

“Lanier draws a straight line from the idealism and optimism that ‘information wants to be free’ to the dystopian problems facing today’s world, including but not limited to a declining middle class, filter bubbles, and dysfunctional politics.”

Among the few attitudes many of us share today is a general apprehension about the role and influence of large technology platforms. Thinkers from Jonathan Haidt to Noam Chomsky have criticized social media for undermining the foundations of democracy. Frances Haugen’s leak of internal Facebook documents triggered widely covered congressional hearings. How did we get here?

Jaron Lanier, a quirky polymath technologist who was “in the room where it happened” as the World Wide Web took shape, tried to warn us nearly a decade ago about what was coming. His 2013 book “Who Owns the Future?” puts forward a compelling analysis of the economic systems we have set up around information technology, warns us about the dystopia that awaits if we follow the current path, and presents a human-centered vision for an alternative future. 

Lanier argues that the main outlines of today’s technology world can be traced to a decision between two competing technical and economic models. The model that ultimately shaped the Internet as we know it today is represented by the maxim “information wants to be free.” That radical idea proposed that the world’s accumulated knowledge (and intellectual property more generally) should be free for everyone to access. Such a future was easy to envision at a time when technology enabled copying and sharing information at a level never before imaginable. Indeed, by the turn of the century, not only could songs and digital newspapers be transferred in seconds, but “libraries of congress” had become a unit of measurement for boasting about the size or speed of one’s server or internet connection.

For proponents of free information (that is, the majority of Silicon Valley and technologists in the aughts), the protections of copyrights, patents, and commercial rights seemed about as useful as Chesterton’s hypothetical old fence in the middle of the road, ready to be cleared away to make progress toward a freer, more creative, and more connected future. In technical communities, for example, the recording industry was portrayed as a cartel for its efforts to enforce commercial rights over music, and it was easy to cheer for the underdog in this seemingly David-versus-Goliath battle for freedom. The battle over the price of information echoes plainly today in France’s and Australia’s recent laws demanding payments for links to news sites, which were met with strong opposition from the technology industry; Google stated that the Australian law “would break the fundamental principle of the web—that it should be free and easy to link to websites.”

Fallout from the free flow of bits and bytes across the World Wide Web was soon evident, particularly in the music, journalism, and photography industries. Piracy cost musicians steady streams of royalties, forcing them, precariously, into what we now know as the gig economy. Positions that supported a middle class, like those for studio musicians and editors, disappeared. Newspapers collapsed—no longer needed as intermediaries—as news and classified advertisements shifted online, often to advertisement-supported free websites. These stories are all common knowledge by now, and a common refrain holds that they are the necessary and inevitable costs of progress—that, overall, technology creates more opportunity and wealth in the long term than it destroys. Here is where Lanier’s argument takes a surprising turn.

The contraction of these industries—Lanier argues—was not inevitable but instead followed from the choice to adopt the principles of free information. Specifically, there was a decision by major players in the technology industry about how to assign commercial rights for content on the Internet. On seeing the term “commercial rights,” one might be forgiven for glazing over as if reading an end-user license agreement, but this concept is key to the argument. Commercial rights ensure that musicians—if their songs are not pirated—receive royalties when a third party plays their music for commercial benefit, for example, at a restaurant or in an advertisement. On the other hand, users of social networks freely surrender their creations—be they cat videos, wedding photos, or political opinions—and even their own likeness in return for the privilege of accessing the network and for the manipulative reward of positive reinforcement by likes. Is this a fair exchange?

Lanier draws a straight line from the idealism and optimism that “information wants to be free” to the dystopian problems facing today’s world, including but not limited to a declining middle class, filter bubbles, and dysfunctional politics. These are, of course, complex issues, but here is a rough sketch of Lanier’s argument for the causal connection:

  1. When information is free, whoever has the most powerful computers will be able to capture the information’s economic value. Without any artificial barriers, it’s a winner-take-all world; the best search engine attracts all the web surfers, the online store with the lowest prices gets all the consumers, and the best video streaming platform draws in all the viewers. After adding in network effects that increase barriers to entry, centralized platforms are inevitable.
  2. Centralized platforms intensify competition by putting everyone in the same global arena, meaning that the most talented musician, the most entertaining video streamer, the most engaging calculus tutorial, and the best-rated cell phone apps win outsize audiences. Success and economic gains flow to fewer and fewer.
  3. Centralized platforms sit at arm’s length to push operating risks onto others and drive creative work off the books. For example, digital marketplaces match buyers to sellers, a position which may allow them to evade responsibility for managing inventories or ensuring quality. Ride-sharing services push the risk of unsteady income onto the gig workers who sign up as drivers. The users who generate content like product reviews are rarely rewarded monetarily for the value they bring to the system.
  4. Platforms are monetized via advertising or by taking a cut from the transactions they facilitate, since charging for access to the platform itself violates the principles of free information.
  5. To maximize profits in the advertising regime, a platform’s business model is to choose content for the benefit of the third-party advertisers that generate the revenue. However, this is not the same model as that of a billboard owner or printed newspaper of old. Digital platforms have access to unprecedented data on user behavior and demographics, all of which is used as input for algorithms that modify human behavior for the benefit of advertisers.
  6. Interaction between a user and a platform that is optimized for the economic benefit of a third party becomes inevitably creepy. Furthermore, manipulative algorithms will unavoidably exploit human psychology, calling forth the worst in groupthink by rewarding people for acting in accordance with the expectations of the platform.

What would the alternative world look like? We could start with some red lines. In a discussion with former presidential candidate Andrew Yang recorded in May 2020, Lanier argues for the outlawing of so-called behavior modification algorithms that “measure human behavior in order to customize content for the benefit of a third party who’s paying.” Instead, social networks and search engines could be subscription services that operate to serve users rather than advertisers. Such an alignment of interests between the platform and the user would preclude situations where the platform needs to find a trade-off between features that benefit users and features that maximize advertising revenue. Unsurprisingly, these decisions often fall in favor of the latter, as documented in Frances Haugen’s recent leak of Facebook documents.

However, Lanier’s proposal goes much further than this to outline a future with a new economic model for the web based on universal, inalienable commercial rights for data. In this future, a search engine might send micropayments to the sources of information it cites. A blog author might receive royalties for a widely linked article. A landscaper would be offered a stream of revenue for her creative garden designs, which could themselves be implemented by robots. A drug trial participant might earn a tiny fraction of the revenue of a successful treatment. A misdirected traveler might be financially rewarded for correcting a mistaken address on a digital map. A retired truck driver might see a stream of deposits in his bank account for the data he recorded that trained the autonomous driving system now used throughout the country.

As suggested by some of these examples, a model of compensating humans for their data contributions provides a mechanism for the benefits of technology to be widely enjoyed rather than concentrated in fewer and fewer hands. The solution, according to Lanier, is not a Marxist top-down redistribution of capital or universal basic income but a recognition that in a digital world, production of data is labor that should be on the books. While such accounting may be complex, it is surely technically possible; the decision of which system to use rests with us.
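The “on the books” accounting described above can be made concrete with a toy sketch. The following Python ledger (entirely illustrative and not from the book; all names, weights, and amounts are hypothetical) logs data contributions to a product, such as driving data used to train an autonomous vehicle, and splits attributable revenue pro rata among contributors:

```python
# Toy sketch of data-as-labor royalty accounting (illustrative only).
# Contributions to a data product are logged with a weight, and revenue
# attributable to that product is split pro rata among contributors.

from collections import defaultdict

class RoyaltyLedger:
    def __init__(self):
        # Maps product name -> {contributor: accumulated weight}.
        self.contributions = defaultdict(dict)

    def record(self, product, contributor, weight):
        """Log a data contribution (e.g., one recorded driving trip)."""
        prev = self.contributions[product].get(contributor, 0.0)
        self.contributions[product][contributor] = prev + weight

    def distribute(self, product, revenue):
        """Split revenue for a product in proportion to logged weights."""
        weights = self.contributions[product]
        total = sum(weights.values())
        return {c: revenue * w / total for c, w in weights.items()}

ledger = RoyaltyLedger()
ledger.record("driving-model", "retired_trucker", 3.0)  # three logged trips
ledger.record("driving-model", "commuter", 1.0)         # one logged trip
payouts = ledger.distribute("driving-model", 100.0)
# The retired trucker, with 3 of 4 total trips, receives 75.0;
# the commuter receives 25.0.
```

The hard problems Lanier acknowledges—tracking provenance at scale and valuing heterogeneous contributions—live in how `weight` is assigned, which this sketch leaves as a simple input.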

So, who owns the future? Will we follow our current path where centralized platforms may view humans as if they were vessels of data to be manipulated, or will we design interfaces that treat people as the true source and end of meaning? Will we hasten a technological “singularity” that results in human extinction (as many, including Stephen Hawking and Elon Musk, have warned) or will we use technology as a medium that enables creative endeavors, provides dignity in work, and is subservient to human desires? These are questions for our time.

No easy path exists for reaching the possible future that Lanier describes, and we do not know what unforeseen flaws may lurk in such a new, untested system. Nevertheless, Lanier convinces us that our society is grappling with the unintended consequences of the information-wants-to-be-free ethos that guided the design of the web. At the very least, “Who Owns the Future?” helps us to understand some of the forces that are shaping our world and, as a result, provides more concrete problems to solve, helping dispel a general malaise and helplessness that one might feel when reading the news.

In the nine years since the publication of “Who Owns the Future?”, we have seen both a continuation of the forces previously set in motion and some faint signs in favor of Lanier’s project. Many observers claim that recent social upheavals have been triggered or amplified by manipulative algorithms (see also Lanier’s 2018 book “Ten Arguments for Deleting Your Social Media Accounts Right Now”). On the other hand, the idea of data dignity, a new name for Lanier’s and others’ proposals, has taken hold in state-level legislation and is being championed by Andrew Yang. Researchers now debate how much income Americans might earn from a data dividend scheme. A boom of subscription-based services for music, video, and newsletters suggests a sustainable model for content creators. Non-fungible tokens (NFTs) illustrate the demand for some way to own data in a digital world, regardless of how useful these particular digital assets are. As we consider how we want to integrate and regulate information technology within our society, we must look beyond easily visible, albeit important, questions like privacy and censorship, and instead develop an economic and moral framework for how technology and humanity overlap.

Miles Lubin is an applied mathematician and open source contributor. After completing his Ph.D. in Operations Research at the Massachusetts Institute of Technology, he worked at Google as a research scientist between 2017 and 2022. Opinions are his own.
