A large share of respondents said that technology alone cannot improve the information environment. Among these respondents, most pointed to two areas of concern: 1) the need for better funding of and support for journalism that serves the common good. The attention economy of the digital age does not support journalism of the general quality of the late-20th-century news media, which was fairly well respected for serving the public good with information that helped create an informed citizenry capable of informed decisions; 2) the need for massive efforts to imbue the public with much better information-literacy skills, an education effort that reaches people of all ages, everywhere.
Funding and support must be directed to the restoration of a well-fortified, ethical and trusted public press
Many respondents said the information environment can’t be improved without more well-staffed, financially stable, independent news organizations capable of rising above the clamor of false and misleading content to deliver accurate, trusted content.
The credibility of the journalism industry is at stake and the livelihood of many people is hanging in the balance of finding the tools, systems and techniques for validating the credibility of news.
Thomas Frey
Susan Landau, a North American scientist/educator, wrote, “The answer to the underlying question – whether this dissemination will expand or not – lies with many players, many in the private sector. How will the press handle ‘fake news’? How will the internet companies do so? And how will politicians, at least politicians post-Trump? The rise of ‘fake news’ is a serious threat to democracy. Post-election [U.S. 2016], some in the press have been pursuing news with the same care and incisiveness that we saw in the Watergate era, but others are not. We have a serious threat here, but it is not clear that interests are aligned in responding to it. And it is not cheap to do so: securing sites against hacking is very difficult when the threat comes from a powerful nation state. Is there a way to create trusted, unhackable verification systems? This depends on what the use case is; it is not a 0-1 answer, but an answer in scales of grey. … If society cannot adequately protect itself against the co-opting of public information by bad actors, then democracy itself is at serious risk. We have had this problem for quite some time. … What has changed is the scope and scale of these efforts, partially through domestic funding, partially through foreign actors and partially through the ability of digital technologies to change the spread of ‘false news.’ What is needed to protect society against the co-opting of public information is not only protecting the sources of the information, but also creating greater public capability to discern nonsense from sense. … I do not see a role for government in preventing the spread of ‘fake news’ – that comes too close to government control of speech – but I do see one for government in preventing tampering with news and research organizations, disrupting flows of information, etc.”
Timothy Herbst, senior vice president at ICF International, noted, “We have no choice but to come up with mechanisms to improve our information environment. The implications of not doing so will further shake trust and credibility in our institutions needed for a growing and stable democracy. Artificial intelligence (AI) should help but technological solutions won’t be enough. We also need high-touch solutions and a reinforcement of norms that value accuracy to address this challenge.”
Peter Jones, associate professor in strategic foresight and innovation at OCAD University in Toronto, predicted, “By 2027 decentralized internet services will displace mainstream news, as corporate media continues to erode trust and fails to find a working business model. Field-level investigative journalism will be crowdfunded by smaller consortiums, as current news organizations will have moved into entertainment, such as CNN already has.”
A senior international communications advisor commented, “I don’t believe that the next 10 years will yield a business model that will replace the one left behind – particularly with respect to print journalism, which in the past offered audiences more in-depth coverage than was possible with video or radio. Today, print journalists effectively work for nothing [and] are exposed to liability and danger that would have been unheard of 25 years ago. Moreover, the separation between the interests of those corporations interested in disseminating news and editorial has all but closed – aside from a few noteworthy exceptions. Moreover, consumers of media appear to be having a harder time distinguishing spurious from credible sources – this could be the end result of decades of neglect regarding the public school system, a growing reliance on unsourced and uncross-checked social media or any number of other factors. Bottom line is that very few corporations seem willing [to] engage in a business enterprise that has become increasingly unfeasible from a financial point of view.”
A futurist/consultant based in Europe said, “News has always been biased, but the apparent value of news on the internet has been magnified and so the value of exploiting it has also increased. Where there is such perceived value, the efforts to generate misleading news, false news and fake news will increase.”
An anonymous respondent wrote, “There are too many pressures from the need to generate ‘clicks’ and increase advertising revenue.”
There were complaints about news organizations in survival mode that neglect their role of informing the public in favor of pandering to it to stay afloat. Other experts worried about the quality of reporting in an age when newsrooms have been decimated.
An anonymous respondent wrote, “The talent pool the media system draws its personnel from will further deteriorate. Media personnel are influenced by defective information, and – even more – the quality of inferences and interpretations will decrease.”
Some expressed concerns about finding unbiased information about the world in an online environment that grows ever more cluttered with content that does not offer it. An anonymous survey participant wrote, “I worry that sources of information will proliferate to the point at which it will be difficult to discern relatively unbiased sources from sources that are trying to communicate a point of view independent of supporting facts.”
Thomas Frey, executive director and senior futurist at the DaVinci Institute, replied, “The credibility of the journalism industry is at stake and the livelihood of many people is hanging in the balance of finding the tools, systems and techniques for validating the credibility of news.”
Eileen Rudden, co-founder of LearnLaunch, wrote, “The lack of trust in established institutions is at the root of the issue. Trust will need to be re-established.”
An international internet policy expert said, “Demand for trusted actors will rise.”
This is not an easy fix, by any means. Kelly Garrett, associate professor in the School of Communication at Ohio State University, said, “Although technology has altered how people communicate, it is not the primary source of distrust in authority, expertise, the media, etc. There are no simple technical solutions to the erosion of trust in those who produce and disseminate knowledge.”
Rob Lerman, a retired information science professional, commented, “The combination of an established media which has encouraged opinion-based ‘news,’ the relative cheapness of websites, the proliferation of state-based misinformation and the seeming laziness of news consumers seems like an insurmountable obstacle to the improvement of the information environment.”
Elevate information literacy: It must become a primary goal at all levels of education
A number of participants in this canvassing urged an all-out effort to expand people’s knowledge of the ways in which misinformation is prepared and spread – an education in how to be wise and well-informed citizens in the digital age.
The only way is to reduce the value of fake news by ensuring that people do not fall for it …
Jacqueline Morris
Jeff MacKie-Mason, university librarian and professor of information science and economics at the University of California, Berkeley, commented, “One wonder of the internet is that it created a platform on which essentially anyone can publish anything, at essentially zero cost. That will become only more true. As a result, there will be a lot of information pollution. What we must do is better educate information consumers and provide better systems for reputation to help us distinguish the wheat from the chaff.”
Sharon Roberts, a Ph.D. candidate, wrote, “Social changes will be the ones that will affect our perception of the information environment. Just like there is still 1-888 psychic call-line content on television or ‘Nigerian princes’ promising money sending me email, it’s a social understanding that those are scams that has curtailed their [proliferation], not any actual TV or email technology ‘trusted methods.’”
Sharon Haleva-Amir, lecturer in the School of Communication at Bar Ilan University in Israel, said, “I fear that the phenomenon of fake news will not improve due to two main reasons: 1) There are too many interested actors in this field (both business and politics wise) who gain from dispersion of false news and therefore are interested in keeping things the way they are; 2) Echo chambers and filter bubbles will continue to exist as these attitudes are typical to people’s behavior offline and online. In order to change that, people will have to be educated since early childhood about the importance of both [the] credibility of sources as well as variability of opinions that create the market of ideas.”
Sandra Garcia-Rivadulla, a librarian based in Latin America, replied, “It will be more important to educate people to be able to curate the information they get more effectively.”
Jacqueline Morris, a respondent who did not share additional personal details, replied, “I doubt there will be systems that will halt the proliferation of fake news. … The only way is to reduce the value of fake news by ensuring that people do not fall for it, basically, by educating the population.”
Mike O’Connor, a self-employed entrepreneur, wrote, “The internet is just like real life; bad actors will find ways to fool people. Healthy skepticism will be part of the mix.”
Tomslin Samme-Nlar, technical lead at Dimension Data in Australia, commented, “I expect the information environment to improve if user-awareness programs and campaigns are incorporated in whatever solutions that are designed to combat fake news.”
Geoff Scott, CEO of Hackerati, commented, “This isn’t a technical or information problem; it’s a social problem. Fake news works because it supports the point of view of the people it targets, which makes them feel good, right or vindicated in their beliefs. It takes critical thinking to overcome this, which requires effort and education.”
Andreas Vlachos, lecturer in artificial intelligence at the University of Sheffield, commented, “I believe we will educate the public to identify misinformation better.”
Iain MacLaren, director of the Centre for Excellence in Learning & Teaching at the National University of Ireland, Galway, commented, “The fact that more people are now fully aware of the existence of fake news, or propaganda, as it used to be known, means that there is increasing distrust of unverified/unrecognised providers of news and information. … I would like to hope, therefore, that a more sophisticated, critical awareness is growing across society, and I certainly hear much to that effect amongst the young people/students I work with. This also shows the importance of education.”
Greg Wood, director of communications planning and operations for the Internet Society, replied, “The information environment will remain problematic – rumors, false information and outright lies will continue to propagate. However, I have hope that endpoints (people) can become more sophisticated consumers and thus apply improved filters. The evolution of email spam and how it has been dealt with provides a rough analogy.”
Some respondents said, though, that information-literacy efforts, while helpful in some cases, will have little effect in many others.
Sam Punnett, research officer at TableRock Media, replied, “The information environment will improve but what will determine this will be a matter of individual choice. Media literacy, information literacy, is a matter of choosing to be educated.”
David Manz, a cybersecurity scientist, replied, “Technology exists and will be created to attribute statements to their source in an easy-to-understand manner. However, this will still require the public to want to know the quality and source of their information.”
Carol Chetkovich, professor emerita of public policy at Mills College, commented, “My negative assessment of the information environment has to do primarily with my sense that consumers of media (the public at large) are not sufficiently motivated and well-enough educated to think critically about what they read. There will always be some garbage put out by certain sources, so – even though it’s important that garbage be countered by good journalism – without an educated public, the task of countering fake news will be impossible.”
Peter and Trudy Johnson-Lenz, founders of the online learning community Awakening Technology, combined on this response: “If we rely on technological solutions to verify trust and reliability of facts, then the number of states of the control mechanisms must be greater than or equal to the number of states being controlled. With bots and trolls and all sorts of disinformation, that’s virtually impossible. There are probably some tech solutions, but that won’t solve the entire problem. And walling off some sections of the information ecosystem as ‘trusted’ or ‘verified fact-filled’ defeats the purpose of open communication. … If you study microtargeting during the 2016 election, it’s clear that Facebook in particular was used to spread disinformation and propaganda and discourage voting in a very effective manner. This kind of activity is hard to discern and uncover in real time, it adds greatly to the polluted ecosystem and it is virtually impossible to control. Ultimately, people are going to have to make critical-thinking discernments themselves. Unfortunately, there are people who have no interest in doing that, and in fact discourage anyone else from doing that. The echo chamber is noisy and chaotic and full of lies. The only hope is some combination of technological advances to trust and verify, people being willing to take the time to listen, learn and think critically, and a rebuilding of trust. In our accelerating world, that’s a very big ask! For an eye-opening perspective on acceleration, see Peter Russell’s recent essay, ‘Blind Spot: The Unforeseen End of Accelerating Change.’”
Bruce Edmonds, a respondent who shared no additional identifying details, noted, “Lack of trust and misinformation are social problems that will not be solved with technical or central fixes. Rather, political and new normative standards will need to be developed in society.”
Anonymous respondents wrote:
- “Bad information has always been produced and promulgated. The challenge remains for individuals to stay skeptical, consider numerous sources and consider their biases.”
- “The way to solve the issue is not so much in designing systems for detecting and eliminating fake news but rather in educating people to manage information appropriately. Media and information literacy is the answer.”
- “Continued misinformation will help people to learn first-hand how bad information functions in any system.”