A range of input from some respondents covered additional aspects of these issues.

‘The emotional climate around privacy and security will only increase’

Mike Liebhold, senior researcher and distinguished fellow at the Institute for the Future, wrote, “There will be many political, technological, and service efforts to improve privacy but, likely, even greater efforts by dotcoms to mine personal data, by black hat intruders to steal whatever they can, and by governments to pervasively surveil the entire Internet. There will undoubtedly be some very lurid tragedies as a result of mining, stealing, or surveillance, so the emotional climate around privacy and security will only increase.”

‘It is absurd to believe this is solvable at the technical infrastructure level’

Seth Finkelstein, a programmer, consultant, and winner of the Electronic Frontier Foundation’s Pioneer Award, responded, “This is a classic case of bargaining power imbalance and asymmetric information. There is such an enormous disparity between individuals supposedly making these ‘choices’ for their information, and the businesses profiting from the monetization, that it is absurd to believe this is solvable at the technical infrastructure level. Every such proposal I have ever seen has struck me as ‘ending up’ replicating what happened with ‘license agreements’—that is, creating a take-it-or-leave-it system, where a person is essentially powerless to do anything but completely accept the corporation’s terms, which are constrained only by consumer-protection law (which has been very much weakened over the years)…”

2025 debates about privacy will be more sophisticated than they are today

Jamais Cascio, a writer and futurist specializing in possible-futures scenarios, wrote, “I have little doubt that policy makers and technology innovators will have attempted to create a ‘secure, etc.’ information infrastructure by 2025, but I do not believe that it will yet be simultaneously secure, accepted, and trusted. We will likely see myriad smaller efforts, attempts to provide secure and acceptable service within a narrower framework (e.g., for a particular hardware vendor, within a particular community), but incompatibilities will continue to confound users, and multiple interested parties (including, but not limited to, governments and advertisers) will continue to push for exceptions and special access. I also suspect that, by 2025, we will have experienced at least one massive breach-of-trust incident, where a supposedly secure and trusted system will be broken open in an especially damaging way (e.g., Google’s Gmail archives are cracked open and released). This may not set back the technical efforts, but it will severely undermine any hope for public trust in these systems. The common understanding about privacy is that it is an issue of ‘visibility’—can I, or my information, be seen by others? While that is superficially true, it is not the entirety of the issue. Privacy is about ‘control’—can I decide who gets to see my information, or is that decided for me, without my knowledge or consent? I suspect that, by 2025, the debates about privacy will be more sophisticated than they are today and will focus on this control aspect (versus the crude fear-mongering about teenagers taking selfies, etc.).”

‘Capacities of the surveillance state will always exceed protections of the people’

Jason Pontin, editor-in-chief and publisher of MIT Technology Review, responded, “The end you describe is highly utopian and combines a large number of goods, each of which would be wickedly hard to achieve. To give one example, a really ‘secure’ Internet does not exist and could not be built on the current infrastructure; we would need another Internet. On the other hand, I do anticipate significant progress on some of these goods. For instance, I think there will be renewed demands for more privacy controls from consumers and citizens, and I believe that companies and policymakers will have to satisfy those demands. At the same time, I expect the capacities of the surveillance state to always exceed the protections of ordinary people. Perhaps people will come to think of their private information as an asset, which they will selectively release to organizations and companies in exchange for certain conveniences or services.”

Business models are about tracking; we have not yet seen the backlash

Joe Touch, director of the Information Sciences Institute’s Postel Center at the University of Southern California, replied, “Privacy is in direct opposition to the business models of the largest Internet companies. The Internet does not require a login, birthdate, or username, yet these companies continue to create ‘walled gardens’ that do—to create the information that fuels their revenue stream… The issue is not about policymakers and corporations, but rather, whether the public will continue to be comfortable exposing that information. Such norms already vary widely, and I continue to be surprised at the extent to which posts within the frame of a personal video screen, and thus to the entire world, exceed what would be posted—by the same person—to their own front door. I think we have not yet seen the backlash of the current norms of personal public exposure; we might when that generation shifts from being ‘kids just posting stuff’ to being in the position of establishing and protecting their company’s reputation as managers.”

An ‘arms race’ between surveillance and personal protection that goes on and on

Brian Behlendorf, Internet pioneer and board member of several nonprofits and for-profits, wrote, “This struggle for the boundary of personal digital space—the digital equivalent of the boundary of my own home, in both legal and technical senses, but also the boundary of my own body and brain—i.e., the Fifth Amendment—will be an ongoing debate, unresolved and only more vigorous in 2025. We will likely give up the notion of public physical location as personal data, due to both official location tracking by governments (e.g., toll road payments, police car license plate scanning) and private-sector tools that track phone IDs, faces, and other personally identifiable bits of data when people walk by or into retail shops or other interesting points. But, in the other direction, we will have even stronger rules and societal expectations against surveillance (government and private) upon the activities within people’s homes or other enclosed spaces. There will be no tolerance for drone peeping toms, sniffing the wireless emissions from tablets, displays, and more. There will continually be new technologies for surveillance—each of which will spawn demand for counter-technologies. This arms race will become more a part of our national conversation about human rights, the concept of the confidential vote, and the rights of private individuals and companies to not be compelled to become agents of the surveillance state. I feel like I am compelled to answer, ‘Yes,’ because the question posits the existence of something we have today and will always have—but it does not ask any qualifying questions, such as the quality of those choices, the cost of different levels of privacy, what ‘easy to use’ means, etc. It also assumes that policy makers and technology innovators would work together on this, when, in reality, they may take diametrically opposed actions, as they often do today.”

‘There will be no privacy to speak of. We will have given it all away’

Rex Miller, a thought leader and principal at a consultancy, responded, “The idea of nation-states will undergo major redefinition. The idea is now obsolete. They have been transcended by global commerce and global platforms like Google, Facebook, etc. These will provide secured enclaves as a value-added service. Policymakers move too slowly in the current structure and cannot coordinate between different governance structures to be effective. There will be no privacy to speak of. We will have given away all of it, and groups will arise to protect the different interests of vulnerable populations.”

Many social and economic cues will depend on disclosure of private information

Jerry Michalski, founder of REX, the Relationship Economy eXpedition, wrote, “Data is easily copied anywhere. The idea that some entity is going to relent and not store our data, and that we will have confidence that our data is not replicated for nefarious use somewhere, is naive. I do not think governments and businesses, motivated as they are today to collect as much personal data as they possibly can, store it, and analyze it, will come to a reasonable understanding that works for citizens. At best, there may be a citizen revolt that sets whole new guidelines, but I am not optimistic that it will happen. By 2025, you will be considered a non-person if you do not have embarrassing photos or videos online from your misspent youth. People who were very parsimonious about sharing personal information will be less credible, and will be trusted less, because others will not be able to see any of their indiscretions—the things that make them human and more trustworthy.”

‘Every person’s actions tracked and monetized continuously and pervasively’

Fred Baker, Internet pioneer, longtime leader in the IETF, and Cisco Systems Fellow, responded, “The Chinese startup Face++ is creating a technology from which it is easy to imagine that, by 2025, we will see a world similar to the one described in the movie Minority Report, in which every person’s actions are tracked and monetized continuously and pervasively. I am hesitant to make predictions beyond this: if we cannot counter it, we must expect it to become reality. Per its website, ‘Face++ uses the cutting-edge technology of computer vision and data mining to provide three core vision services (Detection, Recognition, and Analysis).’ If we must assume continuous and pervasive service-based and crowd-sourced surveillance, and monetization of its results, we must also assume that the information gleaned will be available to anyone who can pay to obtain it. That essentially creates a ‘small town’ dynamic on a global scale—people become more careful about what they reveal, and everybody knows the dirty secrets anyway.”

‘Invasion of privacy will be normed by public acceptance’

Alison Alexander, a professor at the Grady College of Journalism and Mass Communication at the University of Georgia, wrote, “Privacy infrastructure will be ever evolving and never finished. Hardware and software changes will present new issues on a regular basis. Governments’ need for data will continue to raise questions for the First and Fourth Amendments. Corporations will try new ideas with intended and unintended consequences. Currently, social views on privacy vary dramatically, ranging from, ‘Nothing to hide,’ to, ‘Be careful: it lasts forever,’ to the ‘Right to be forgotten.’ … Invasion of privacy will be normed by public acceptance of what was previously considered improper. Privacy will continue to be threatened by new ways to learn more about everyone.”

The ‘other’ 1% will emerge and want to live off the grid—and that will bring scrutiny

William Schrader, the co-founder and CEO of PSINet Inc., the first commercial ISP, observed, “A small percentage of the world’s population, perhaps a tiny fraction of 1% of mankind, will attempt to go off-grid or in some way disengage from big data. To accomplish this, they must own nothing that is tracked by government, such as real estate or autos, have no utilities in their name, have no bank account, and not earn a living by receiving a check or direct deposit. In short, they would only use cash, not own a phone, not have a tax identification number, etc. It is a challenging existence by today’s normal standards, and it is not one that is easy to maintain without sincere discipline. I expect that these off-grid people will be treated by authorities worldwide as suspect in some way, simply because they choose not to be tracked. That alone, being off-grid, will likely be made a serious crime… The original concept of privacy is dead. The new concept of privacy is: ‘Only the government and my friends know.’”

The change in norms will even affect dinner parties and dates

Alexander B. Howard, an expert on digital issues and government, wrote, “A much higher percentage of the public will understand that any action taken in view of another human with a connected smartphone or made upon a social media platform online could end up on YouTube or the evening news nearly instantly and potentially irrevocably. The ability of politicians and other public figures to keep the public’s business private will be substantially hindered, although wealthy and powerful people will continue to have the ability to pay to keep their private lives somewhat obfuscated. Social norms will evolve to a point where participants in dates and dinner parties will need to explicitly ask for agreement that conversations or other interactions be kept unrecorded.”

‘Transparency’ will replace ‘privacy’ as the social norm and ideal

Marc Prensky, director of the Global Future Education Foundation and Institute, wrote, “This genie is now out of the bottle: Protection of ‘private’ information will become almost (or, perhaps, completely) impossible because those who want it will always be ahead of those trying to protect it. So, as the last pre-Internet generation cedes control to the new global Internet generations, attitudes toward security, privacy, and intellectual property will be very different than the way we have thought of them in the past. In many, or most, areas, transparency will replace secrecy as the norm. These changes will not happen, though, at Internet speeds but more gradually as the last pre-Internet generation slowly dies off. In the future, there will be no privacy of information as we now know it and have known it in the past—any data put online will become transparently available to all, despite any and all efforts to prevent this. ‘Transparency’ will replace ‘privacy’ as the social norm and ideal.”

‘Technical innovation is outpacing regulators’ ability to act and react’

Glenn Edens, a director of research in networking, security, and distributed systems within the Computer Science Laboratory at PARC, a Xerox Company, wrote, “A major overhaul of the architecture of the Internet is required to meet the goals of privacy and the rampant use of personal information by commercial interests. It is not clear that these issues can be resolved by 2025 at our current pace. Technical innovation is outpacing regulators’ ability to act and react. It is not clear in which direction public norms about privacy will evolve. There is evidence of change, as well as a lack of interest or education about the issues. Scott McNealy once said that ‘privacy is dead’—in some respects he might have been right.”

A ‘third option’ might emerge in independent data warehouses

Bryan Padgett, research systems manager for a major US entertainment company, wrote, “The current two-sided security-versus-privacy pendulum will be replaced by a third option—perhaps independent warehouses of data controlled by independent parties, fed by data providers, and accessed by government only when necessary. With increasing amounts of data being generated for and by all users worldwide, it will continue to be used for good and bad in increasing amounts… I can see a future where it is accepted that anonymity has fallen by the wayside as the online world and the real world become even more fused; however, along with the loss of anonymity, the ability to remove and prevent others from seeing and/or using your data (or data about you) will emerge to become clearer and easier to manage from a single entity. If that comes to pass, it would only come from a government or international agreement, with academia and the private sector creating the technical solution that allows it to work.”

Media literacy will be the key as technological evolution keeps changing the rules

Pamela Rutledge, PhD, director of the Media Psychology Research Center, responded, “The privacy horse is out of the barn, in spite of the people arguing whether or not the barn door should be open or closed. A more critical issue is overcoming our anxiety over ‘the way things were’ and evaluating what needs to be protected for individuals, institutions, and governments. Policymakers do not have the expertise, or the incentive structure, to create adaptive regulations in an evolving environment. Technology innovators have the burden of financial accountability and will continue to balance the expansion of technology capabilities and features with the majority of consumer demands. Public perception is understandably narrow; most see privacy as being about Facebook settings or identity theft. We have unleashed a powerful tool on society without bothering to teach people how to use it. Media literacy will increasingly become the key to creating the demand for a reasonable balance of privacy and information control with commercial interests and personal experience. Our perceptions about privacy change as technology creates more things that define us and more ways to share. With technology increasingly reflecting our identity, privacy becomes equated with liberty, heightening our sensitivity to having choice. Social norms about privacy will change because increased technology adoption reduces technophobia, and technology use increases individual agency across all sectors of society. Individuals will increasingly demand to decide for themselves where, and when, the benefits of sharing outweigh the costs.”

‘I am not even sure if we really have a problem’

Nicholas Bowman, a professor at West Virginia University, commented, “I do not believe this will ever be a ‘solved’ problem, given that there is a diametric opposition between ‘monetization’ and ‘personal information.’ A wise scholar, Steve Jones (of the University of Illinois at Chicago), once said that ‘if you don’t pay for content, you are the content,’ and there is an enduring truth to this… At least in 2013, the only monetary value of the Internet seems to be for nano-casted advertising, which is only possible when users tell us who they are… Unless we find a different economic model to base our information infrastructure on, there will be no solution. Frankly, I am not even sure if we really have a problem. By definition, norms are always subject to the influence of time—in the early 1900s, it was inappropriate for one to show their ankles in public at a beach, yet, by the 1940s, the revealing two-piece bikini was sold to the public as a way to conserve water-proof fabrics for the war effort; in just 40 years, one’s skin went from being a private affair to an exhibitionist one.”

Two Internets—or more—might evolve

D.K. Sachdev, a consultant and adjunct professor in satellite systems, wrote, “The objectives, as stated in the question, are technically achievable; however, they conflict with the business plans of major social networks that, in fact, encourage users to act against their own privacy! I believe that there is scope for two separate networks: one totally secure and the other driven by social media. In a way, that is what Blackberry created, and government users all over the world relied on it. Unfortunately, it is shrinking because of the conflicting objectives of the marketplace.”

Thorlaug Agustsdottir, public relations manager for the Icelandic Pirate Party, replied, “There will be an alternative Web or alternative software that will offer people protection from snooping companies, though the ability to mask an IP address will probably still require somewhat ‘advanced’ knowledge, and true anonymity will be, as it is today, just a myth (at least down to the IP level, if not to some kind of a central ‘official’ personal identification level). In 2025, people will be ever more concerned with online privacy.”

Generational change will shift norms—a president’s ‘drunk selfies’ won’t matter

Erin Stark, a respondent who shared no additional identifying details, said, “We have been sharing so much, with so little concern, for so long; personal information is no longer owned by the individual, but by the Googles, Amazons, and Facebooks of the world… We have grown up online… Today’s concern with privacy will be non-existent by 2025; our presidents will have drunk selfies made public, and our Supreme Court justices will have tweeted and blogged their hopes and fears. This is going to result in total openness.”

‘The sting of revealing too much will lessen’

Pamela Wright, chief innovation officer for the US National Archives, wrote, “A new way of looking at privacy may be established. The Internet will know you—your family, your doctor, your bank, where you got coffee this morning, everything substantive and seemingly trivial about your life and what you do—and that will erase your privacy, but will also protect you. This is a frightening concept, but it is already well down the road. Norms are already changing due, in part, to the ubiquity of social media use. What my generation considered strictly private is completely shareable for the next generation… I was recently taken aback when I saw a colleague had sent out a picture on social media less than two hours after she gave birth. By 2025, this will be considered a very private way to handle the news, as everything about the birth will be available online as it is happening—from pictures to all kinds of health data. We may be more forgiving of people as we see everyone’s personal foibles everywhere… In the future, I expect that no one will be able to control one’s image online enough to be spotless, and the sting of revealing too much will lessen.”

A ‘peace architecture’ is best for an ‘always-on’ world

Chris Uwaje, president of the Institute of Software Practitioners of Nigeria, wrote, “The stabilizing element of the future of the new, ‘always-on’ (AO) world will be overwhelmingly determined by a ‘peace architecture’ that has stubbornly eluded humanity. Therefore, today, policy makers and technology innovators may not have the ability to create a secured, popularly accepted, and trusted privacy-rights infrastructure by 2025 without first of all understanding that a global peace architecture is fundamental to the stability and survivability of the future world. Today, we assume that ‘peace’ is an integral part of human behavior, which, by extension, negates the philosophy of security, liberty, and privacy. Therefore, the AO world must deal with a ‘peace engineering infrastructure’ as the survivability tool of the future.”

This is a tipping-point moment—teetering on the brink of effective ‘mind control’

Mikey O’Connor, one of two elected representatives to ICANN’s GNSO Council, wrote, “The public will cheerfully trade massive invasion of their personal privacy in exchange for goods and services they would otherwise have to pay for. In so doing, they will also increasingly compromise the privacy of those they interact with, albeit inadvertently. If the current privacy-awareness surge is turned back by the well-organized coalition of private and governmental surveillance lobbyists, it seems quite possible that this will be the tipping point, beyond which there is no return. Thus, by 2025, this battle will be lost—and much of our humanity with it. On the other hand, let us think positively. Global climate change may have reached that tipping point, as well—in which case, we can be spectators in a race to see which exterminates us first—humans or Mother Nature. We are at a tipping point. We are teetering on the brink of truly effective mind control. Once we have actually arrived there, the concern about privacy will simply be scrubbed off the agenda, and privacy will become ever less of a concern as the older, less plugged-in generation dies off.”

‘Privacy will be perceived as a part of exchange’

Polina Kolozaridi, a faculty member at the Center for New Media and Society, based in Russia, responded, “The idea of what privacy is could change noticeably by 2025. Partly, privacy will be perceived as a part of exchange. It will be more difficult to have a self-image without a public profile, at least one open to some institutions (starting with the educational and healthcare systems). Partly, there will appear new sorts of private information (like thoughts, if neuro-tech advances fast enough). The problem of ‘whom I can trust here’ will probably remain… It will be like a Pacific Ocean of transparency with some big islands (or even continents) of abilities to hide one’s personal data. It will not be easy to use such abilities… Having some profiles with information we consider private will be like owning an ID or a passport. It will be OK to trust some corporation or state to hold it, but not OK to share it in some public profiles; social networks, like Facebook or Instagram, will not disclose more than they do now. But there will appear chill-out, or media-out, zones when and where one may be out of all digitalization.”

A distributed system of user rights will replace the current hierarchical system

One extended answer came from Doc Searls, director of ProjectVRM at Harvard University’s Berkman Center for Internet & Society. He outlined how citizen-centered privacy protection can be created:

“There will be a privacy rights infrastructure in place long before 2025. I believe it will materialize within the next three to five years. It will not be a top-down system, however—meaning that it will not come from big companies, or from policy makers in the United States, the European Union, or other familiar targets of today’s privacy activism. Instead, it will come from new technological approaches that enable individuals and organizations to operate in full privacy without fear of surveillance. These approaches will be distributed, rather than centralized.

The spread of these approaches will follow the rules of heterarchy more than those of hierarchy. Adriana Lukas defines heterarchy as ‘a network of elements in which each element shares the same horizontal position of power and authority, each playing a theoretically equal role.’ In fact, this is neither new nor unfamiliar. It is embodied in the Internet’s founding protocols and is why the Internet grew so rapidly, wildly, and outside the control of companies and government.

Key to our emerging privacy-creating system will be the ability of individuals to assert their own terms, policies, and preferences in dealings with others, including companies and governments—and for equal consenting parties to work out norms that do not require intervention or control by large companies or governments. The principles and practices here are also not new. They are at the heart of freedom of contract, which was abandoned by large mass-marketing and mass-manufacturing companies in the Industrial Age, when scale required ‘contracts of adhesion,’ such as those we ‘accept’ without reading. Adhesive contracts brought ease to Industrial Age hierarchical systems but are obsolete in the Internet age, when everybody brings their own unique assets to the market’s vast table, as well as growing power over what can be done with those assets.

Freedom of contract is also central to a free and open society, as well as to the architecture of the Internet’s founding protocols. It is also anathema to the defaulted approaches of the phone and cable companies, by whose graces we enjoy access to the Internet. Fortunately, the Internet’s system is deeper than theirs and, therefore, will prevail. Oceans outlive boats—even the biggest ones.

The end state will be one in which individuals will enjoy far more control of their personal data, and privacy in general, than they do today, and that will be good for business.

We now live in two worlds:

One is the physical world that has been around since the Big Bang, and where we have operated civilization for the last few thousand years. The norms around privacy are highly developed, and very deep, in this world. The technologies providing privacy—clothing, doors, windows, curtains, shades, shutters, and so on—are familiar and easy for everybody to understand and to use. In this world, most of us also understand and respect private spaces, even when we can see and hear into them. This is why we at least try to ignore sounds made by people sitting at the next table at a restaurant or in line at a theater. Without these small courtesies, civilization would be much less civilized.

The other is the virtual world. This world is composed of binary math—ones and zeroes—and is structured around the Internet, which puts every end at a functional distance of zero from every other end, at a cost that veers toward zero as well. This world coexists with the physical one and is very new. We can date it from the appearance of the graphical browser, the ISP, and universal email, which came together in 1995. This world is going on nineteen years old, and no norms within it approach the maturity of those in the physical world. Behavioral norms in the virtual world are provisional, immature, and far from civilized. A store on Main Street, for example, would never plant a tracking beacon in a customer’s pants to report back on what the customer does after they leave the store; yet, this rude behavior is normative today on the commercial Web. By 2025, however, this kind of rudeness will be as gone in the virtual world as living naked in caves is gone in the physical world, simply because we will have invented the digital equivalents of clothing, doors, windows, sealed envelopes, and simple human courtesies.”

New identification standards will emerge

Another extensive analysis of how a system could be created—and what the obstacles might be—came from Francis Heylighen, a Belgian cyberneticist investigating the evolution of intelligent organization. He wrote:

“A key enabling technology for the future Internet will be a universal, secure standard for unambiguously establishing a person’s identity. Several, albeit uncoordinated, steps have already been made in order to create such a standard, including Web-enabled electronic identification cards in several European countries, the OpenID standard, and ORCID, an attempt to ensure that publications are attributed to the right author.

The reasons why standardization is slow to emerge tend to be social, economic, and political, rather than technological, as different corporations, governments, and organizations are not inclined to exchange the valuable information they hold. An additional obstacle is people’s legitimate fear of invasion of privacy and abuse.

However, without universal regulation, abuse of private information by hackers, corporations, or governments is more, rather than less, likely, as no one knows who has access to which personal information, and as hardly any laws exist that specify what organizations can and cannot do with the information they possess.

Technologically, it is perfectly possible (albeit non-trivial) to develop secure schemes that anonymize data so that only the ones that really need information about an individual can get access to the specific data they require, and to nothing else. For example, a doctor who finds you collapsed in the street should be able to consult your medical record and to send a message to your next of kin, but should not have access to your financial record. Your bank, on the other hand, should know the transactions made from your account, but not your state of health.

Next to the technological challenge, the larger challenge will be to institute a system of rules and laws that specify exactly who can use which information about a person. This system should be perfectly transparent to the individual so that you can find out exactly what happens with your data and have the right to withhold information that is not crucial to the functioning of an organization.

The general principle is that you should be able to act anonymously for any non-crucial transaction, but that the distributed intelligence system should be able to maximally extract the collective (anonymous or non-anonymous) information that will help it to make better decisions, while also being able to securely and transparently address a specific individual with personalized recommendations.

Such a regulatory standard for data protection is, at this moment, being developed by the European Union. Once such a computational and legal technology is in place, interactions across the Internet are likely to become much safer and more efficient. People are less likely to worry about the free use of public, anonymized data, such as which kinds of people are most likely to get diabetes or to buy motorcycles, but will be less willing to tolerate commercial or government organizations claiming property or control over their personal data.”