The Pew Research Center often receives questions from visitors to our site and users of our studies about our findings and how the research behind them is carried out. In this feature, senior research staff answer questions relating to the areas covered by our seven projects, ranging from polling techniques and findings to trends in media, technology, religion, demographics and global attitudes. We can’t promise to respond to every question we receive from you, our readers, but we will try to answer the most frequently received inquiries as well as those that raise issues of particular interest.
If you have a question related to our work, please send it to info@pewresearch.org.
Q. The Pew Internet & American Life Project has tracked the growing importance of the Internet and digital media and how these tools have been used during past campaigns. What kinds of new trends are you expecting to see in the 2012 campaign?
One of the most exciting things about our work studying politics is that each campaign cycle since 1994 has generated its own internet story. We expect that to happen again in 2012.
A little history: in 1994, the story was the appearance of the first campaign web site, by Senator Dianne Feinstein. In 1996, web politics turned presidential. One of the top stories that year occurred when Republican nominee Bob Dole gave the wrong URL for his site during a nationally televised debate. But he didn’t seem to suffer any serious damage from the gaffe, demonstrating how peripheral internet politics was to the basic structure of a campaign just a decade and a half ago.
By 1998, some clear internet effects began to emerge. Independent Jesse Ventura stunned the Minnesota political establishment by conducting an insurgent campaign that featured email communications. Two years later, GOP presidential candidate John McCain proved that candidates could raise a lot of money online and harness the power of the Internet as a source of political news and information.
In the 2004 presidential race, Howard Dean’s campaign demonstrated how social networking tools like blogs and meetups could be effective in generating voter interest, recruiting and motivating volunteers, and changing the interplay between citizens and campaigns.
The 2006 midterm election campaign was perhaps most famous for the rise of online video, highlighted by Virginia Republican George Allen’s “macaca” video. Additionally, robo-calls (pre-recorded telephone calls soliciting votes) became a prominent part of the political environment.
The main story of the historic 2008 presidential race was the way in which all kinds of social media tools came to prominence: Our survey and report demonstrated the importance of candidate Facebook pages, Twitter feeds and texting services. And the 2010 race was notable for innovation in the mobile space and the growing reliance of voters on social networking sites and Twitter.
As we plan for the 2012 race, we’ll be paying attention to how these previous trends extend themselves. We’ll be looking for changes in social media use of all kinds and for more activity tied to mobile connectivity; not only will that include the use of texting (which we recently found is increasingly important to people in their general communication exchanges), but we will also ask people for the first time about mobile apps that are tied to political activity. There is already an interesting debate brewing about political apps that has been well covered elsewhere. It will be fascinating to watch which candidates try to exploit this new tool.
Perhaps the biggest phenomenon we’ll try to explore is something that worries a lot of observers: the degree to which people live in political echo chambers, sealing themselves in info-bubbles where they encounter only people who share their political views and exchange only similar ideas.
We took a serious stab at exploring this phenomenon in 2004 and came away with findings that did not match the views of those who worry about echo chambers. That was before the rise of Facebook and other social networking sites, so there is reason to return to our study of this concept with fresh ideas and new questions.
Lee Rainie, Director, Pew Research Center’s Internet & American Life Project
Q. What is the value of polls that match an incumbent or specific candidate against a “generic” Republican or Democrat? Some of these polls give respondents an extra option by offering the answer “it depends on the candidate,” while others leave “undecided” or “no opinion” as the only other choice. I assume that makes a difference.
The so-called “generic ballot” provides a snapshot of how an incumbent president would fare against an unnamed challenger from the opposing party. In this regard, the question is simultaneously posing a referendum on the incumbent and a test of the popularity of the other party. The trend on this question can be informative. In Pew Research Center polls in July and August, Barack Obama ran about even with an unnamed Republican; he held a significant lead earlier in the year. At the same time, however, the generic ballot has no predictive power, particularly at this very early stage in the presidential race.
Generic questions that invite respondents to say “it depends” will certainly get a much higher percentage declining to choose between the candidates. That may be a more honest reading of public opinion at this point in the campaign, but is even further from simulating the actual choice that voters will face in November 2012.
The so-called “generic ballot” tends to be a good predictor of the outcome of congressional elections in the off-years. Many survey organizations, including Pew Research Center, use such questions to help gauge the size of possible swings in party representation in Congress. But because presidential elections ultimately hinge on a much more personal choice, with far more information available to the average voter, generic questions are rarely used once the field of nominees has been winnowed.
Scott Keeter, Director of Survey Research, Pew Research Center
President, American Association for Public Opinion Research, 2011-2012
Q. There is so much always changing about the Internet and how people use it, plus the rise of social media and all the new devices people use. How does the Pew Internet Project decide what topics and trends are important to study?
Our first broad strategy is simply to track technology adoption itself: we conduct regular national surveys measuring who uses the internet, what devices people own and what they do online. In the course of doing those surveys, we always collect demographic data, and we frequently issue reports and statistics about teens, seniors, men and women, digital divide issues, rural technology use, and a host of other subjects tied to tech-user data.
The second broad strategy driving our research is to focus on six subjects that cover key aspects of the way the internet is affecting people. We look at the impact of technology on 1) families; 2) communities, both in the real world and the virtual world; 3) health and health care; 4) education, both formal and informal; 5) civic and political life; and 6) workplaces.
Our writ from The Pew Charitable Trusts is to try to generate data and analysis that will be useful to policy makers, scholars, important organizations of all kinds, and interested citizens. However, we do not do that research with policy recommendations in mind. We do not take positions on policy matters, or promote (or challenge) particular technologies or companies. So, we do our research in a way that we hope those communities might find useful and will interpret in their own way.
From time to time, we feel that this mandate from The Pew Charitable Trusts necessitates that we try to get survey readings on important policy issues such as privacy and identity matters, the way people use and think about e-government services, and the impact of spam. We pick those topics when we believe that insights from technology users will help inform policy debates, so we try to be topical and timely.
We are always assessing the technology environment to see what new gadgets, activities and applications are emerging, and we change our questions based on our sense of when these have reached a critical mass of adoption in the general population. One of the key tools we employ to explore what’s coming next is to ask experts every so often about their views about the future of the internet and the likely social impacts that will occur. This is one of the best ways we know to keep our eyes on the horizon.
We are interested in hearing from stakeholders about the kind of research questions we might tackle. We invite you to send your ideas to info@pewresearch.org/internet. And I invite you to sign up to participate in occasional surveys that we conduct of long-time technology users. Email me at lrainie@pewresearch.org/internet if you’d like to participate in those surveys.
Lee Rainie, Director, Pew Research Center’s Internet & American Life Project
Q. With the 2012 election approaching, shouldn’t the Pew Research Center now concentrate on conducting opinion surveys among voters, rather than all adults? What do I need to know in comparing an “all adults” to a “registered voters” poll?
The presidential campaign is moving into high gear and many polling organizations are, in fact, conducting surveys only of registered voters, or even in some cases only likely voters. This is not the approach of the Pew Research Center, which studies public attitudes toward politics, the press and policy issues. On subjects ranging from the debt crisis to the war in Afghanistan — issues that affect all Americans, voters and non-voters alike — the Center attempts to get the broadest possible measure of public attitudes.
This is not to say that the Center does not track the electoral preferences of voters. While we ask election questions — indeed, all questions — of the public, we present the results of election questions based on registered voters in our survey reports. Registered voters are adults who say that they are “absolutely certain” they are registered to vote in their precinct or election district (usually around three-quarters of all survey respondents). Typically, the views of registered voters are not very different from those of the general public. Yet on election questions, we feel it is important to present the results based on those who are at least certain they are registered to vote; a person who tells us they are not registered, or is not sure, is very unlikely to cast a ballot. As the election approaches in fall 2012, we will increasingly report on the preferences and attitudes of likely voters as well as registered voters. More detail on how we identify likely voters is available on our website.
In our July 28 survey report — see “Obama Loses Ground in 2012 Reelection Bid” — 41% of registered voters said they would like to see Obama reelected while 40% said they would prefer that a Republican candidate win the election. The accompanying topline questionnaire shows the responses of all adults — as well as registered voters — on this question: 42% of all adults said they favored Obama’s reelection while 37% preferred a Republican. The slight Obama edge among all adults is not unusual; those who tell us they are not registered to vote include more young adults and minorities — groups that tend to be more supportive of Obama.
On election questions, the Pew Research Center wants to provide an accurate gauge of voter intentions. Yet the Center also has conducted extensive research on non-voters — who they are, what they think, and why they do not vote. For more, see “The Party of Non-Voters” from October 2010 and “Who Votes, Who Doesn’t, and Why” from October 2006.
Carroll Doherty, Associate Director, Pew Research Center for the People & the Press
Q. I am always frustrated by polls asking whether one is a liberal, moderate or conservative. My feeling is that about two-thirds of Americans are liberal on social issues and conservative on economic issues. (In other words, they are actually Libertarians.) Can’t you ask this question better? Even laying out “litmus test” questions on gun control, abortion, the effect of more or fewer taxes and deficits, gay marriage, national defense (foreign adventures), space exploration, size of government, global warming (and what to do about it, assuming it exists), etc. I fear that many people answer “moderate” because they are taking an average, so to speak, while having very strong but inconsistent and diverging opinions that are anything but moderate.
The summary measure of political ideology you refer to has been in use — in one form or another — since the 1930s. It is useful to us for summarizing trends in ideology, and when used in conjunction with party affiliation provides a powerful way of segmenting the public. We certainly find that self-labeled conservatives tend to take conservative positions on issues, while self-described liberals tend to take liberal positions. Moderates, as you suggest, often express a mix of views.
But the question is far from perfect. For one thing, some people do not understand the terms “liberal” and “conservative.” More important, the views of some people do not fit neatly into what we have come to think of as the liberal or conservative traditions, as you suggest in your question.
In order to account for the many dimensions of public attitudes, we periodically take a look at the public through the lens of our “political typology” (see “Beyond Red vs. Blue: The Political Typology”), which uses a series of questions about basic political values to divide the public into nine core political groups. There is indeed a group of Americans — whom we labeled “Libertarians” — who express fairly liberal views on social issues and conservative views on economic issues (see the profiles of all the groups in the report). But this GOP-leaning group accounts for only 9% of the U.S. public. There also are two Democratic-leaning groups who hold a blend of socially conservative values and moderate-to-liberal economic values. In fact, relatively few people hew strictly to consistent liberal and conservative opinions on all issues. That doesn’t mean that the terms have completely lost their utility in American politics, but it’s one of the reasons that our approach to the study of public opinion tends to focus more heavily on questions about specific political issues than on broader questions of ideological sentiment.
Scott Keeter, Director of Survey Research, Pew Research Center
President, American Association for Public Opinion Research, 2011-2012
Q. My wife and I did not receive the 2010 census form and were not polled/interviewed/counted. How many U.S. residents were missed in the census?
The 2010 Census certainly did miss some people; in that respect, it is like every other census. The Census Bureau conducted an independent follow-up survey in order to estimate how many people were not included; results will be published next year. Some other quality indicators are already out — for example, the bureau said it obtained at least some usable information from 99.62% of the nation’s housing units, an increase from 99.45% in 2000.
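Coverage follow-up surveys of this kind are commonly analyzed with dual-system (“capture-recapture”) estimation: the census and an independent survey each count people in the same areas, the two lists are matched, and the overlap is used to estimate how many people both efforts missed. A minimal sketch with purely hypothetical numbers (this is the general technique, not the Census Bureau’s actual figures or procedure):

```python
def dual_system_estimate(census_count, survey_count, matched):
    """Lincoln-Petersen estimator of the true population size.

    If the census counts N1 people, an independent survey counts N2
    people in the same areas, and M people appear on both lists, the
    estimated total population is N1 * N2 / M.
    """
    return census_count * survey_count / matched

# Hypothetical block: 950 people counted by the census, 900 by the
# follow-up survey, 880 matched on both lists.
total = dual_system_estimate(950, 900, 880)
missed = total - 950  # estimated net undercount in the census
print(round(total), round(missed))
```

The intuition: the smaller the overlap between the two independent counts, the more people both must have missed, so the estimated total rises.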
If the Census Bureau does not receive a completed form from a particular address, and census-takers cannot reach anyone there after multiple visits or calls, the agency tries to obtain information about the household from neighbors or a building manager. If that fails, the Census Bureau resorts to a last-ditch statistical technique called “imputation” to fill in the missing data. As this article on our “All Things Census” page explains, agency analysts use imputation when they do not even know whether someone lives at a particular address, when they know someone lives there but not how many people, and when they know how many people live somewhere but do not know their race or other characteristics. Imputation is based on what bureau analysts know about the size and type of neighboring households. In the 2010 Census, 1.16 million people were added to the count through imputation, or 0.39% (less than half of one percent) of the total.
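The idea of borrowing a value from similar nearby households is a form of what statisticians call “hot-deck” imputation. A toy sketch of that idea, with illustrative names and data rather than the Census Bureau’s actual procedure:

```python
import random

def impute_household_size(neighbors, rng=random):
    """Fill in a missing household size by sampling from the sizes of
    neighboring households that did respond (hot-deck imputation)."""
    known = [size for size in neighbors if size is not None]
    if not known:
        return None  # nothing to borrow from
    return rng.choice(known)

# A hypothetical block where one household did not respond (None):
block = [2, 4, 1, None, 3]
imputed = [size if size is not None else impute_household_size(block)
           for size in block]
```

Here the nonresponding household is assigned one of its neighbors’ sizes at random, so the filled-in value is always plausible for that block; the bureau’s real procedure is far more elaborate, but this is the basic logic.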