How pollsters have adapted to changing technology and voters who don’t answer the phone

As the U.S. presidential election approaches, news reports and social media feeds are increasingly filled with data from public opinion polls. How do pollsters know which candidate is ahead in what swing state or with which key demographic group? Or what issues are most important to as many as 264 million eligible voters[1] across a vast country?

In other words: How do pollsters do what they do?

At Emerson College Polling, we lead[2] a dynamic survey operation that, like many others, has continuously evolved to keep pace with shifting trends and technologies in survey research. At the inception of survey research – about 100 years ago – data was primarily collected through mail and in-person interviews[3]. Neither method plays much of a role today.

In the early days of the survey industry, being asked to participate in a poll was novel, and response rates were high. Today, we’re bombarded with survey requests via email, text, online pop-ups, and phone calls from unknown numbers. With fewer landlines, busy parents juggling work and family, and younger adults who rarely answer calls, preferring text communication, it has become much harder to engage respondents. This shift in behavior reflects the evolving challenges of reaching diverse populations[4] in modern survey research.

[Photo: An overhead view of a crowd of people. The goal is to describe a diverse community with a variety of viewpoints. ferrantraite/E+ via Getty Images[5]]

Evolution of data collection

In the broadest possible terms, polls and surveys have two elements – choosing whom to contact, and reaching them in a way that’s likely to get a response. These elements are often intertwined.

In the 1970s, after household telephones had become widespread[6] in the U.S., survey operators adopted a random-sampling method called random digit dialing[7]: the survey's designers chose the area codes they wanted to reach, and live operators randomly dialed seven-digit phone numbers within those area codes.
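
A minimal sketch of that procedure, assuming a North American-style 10-digit number; the area codes and the exchange-digit rule in the comments are illustrative assumptions, not a description of any specific pollster's system:

```python
import random

def random_digit_dial(area_codes, n_numbers):
    """Generate random phone numbers within designer-chosen area codes.

    The survey designer fixes the area codes; the remaining seven
    digits are drawn at random, as early RDD systems did.
    """
    numbers = []
    for _ in range(n_numbers):
        area = random.choice(area_codes)
        # Under the North American Numbering Plan, the three-digit
        # exchange cannot begin with 0 or 1.
        exchange = random.randint(200, 999)
        line = random.randint(0, 9999)
        numbers.append(f"({area}) {exchange:03d}-{line:04d}")
    return numbers

# Example: draw 5 candidate numbers from two hypothetical area codes.
print(random_digit_dial(["617", "857"], 5))
```

Many of the numbers generated this way were disconnected or belonged to businesses, which is exactly the inefficiency the article describes next.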

By the 1990s, pollsters began moving away from random digit dialing, which was time-consuming and expensive because the random selection often picked phone numbers that were out of service or not useful for opinion surveys[8], such as businesses or government offices. Instead, pollsters began adopting registration-based sampling[9], in which public voter registration records were used to compile the lists from which respondents were randomly selected.

The information in these and other associated public records, such as those detailing gender, age and educational attainment, allowed a refinement of random sampling called stratified sampling[10]. That’s where the one big list was split into subgroups based on these different characteristics, such as party affiliation, voting frequency, gender, race or ethnicity, income or educational attainment.

Survey-takers then chose randomly from among those subgroups in proportion to the population as a whole. So if 40% of the overall population have college degrees and 60% do not, a poll of 100 people would randomly select 40 people from the list of those with a college degree and 60 from the list of those without.
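
Here is a short sketch of that proportional draw, assuming the registration-based lists have already been split into strata; the field names and voter lists are illustrative placeholders:

```python
import random

def stratified_sample(strata, proportions, n):
    """Draw from each stratum in proportion to its population share.

    strata: dict mapping stratum name -> list of people in that stratum
    proportions: dict mapping stratum name -> population share (sums to 1)
    n: total sample size
    """
    sample = []
    for name, frame in strata.items():
        k = round(n * proportions[name])  # e.g. 40 of 100 for a 40% stratum
        sample.extend(random.sample(frame, k))
    return sample

# Example matching the article: 40% with college degrees, 60% without.
strata = {
    "college": [f"college_voter_{i}" for i in range(5000)],
    "no_college": [f"no_college_voter_{i}" for i in range(7500)],
}
sample = stratified_sample(strata, {"college": 0.4, "no_college": 0.6}, 100)
print(len(sample))  # 100 respondents: 40 with degrees, 60 without
```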

Other advances in ways to reach respondents emerged late in the 20th century, such as interactive voice response[11], which did not require live operators[12]. Instead, automated systems played recordings of the questions and registered the spoken responses. In 2000, internet-based polling[13] also began to emerge, in which participants filled out online forms.

From probability to nonprobability sampling

Over the past two decades, the rise of cellphones, text messaging and online platforms has dramatically changed survey research. The traditional gold standard of using only live operator telephone polls has become nearly obsolete. Now that phones display who is calling, fewer people answer calls from unknown numbers, and fewer of them are willing to talk to a stranger about their personal views.

Even the random sampling that was once standard has given way to a nonprobability sampling[14] approach based on increasingly specific population proportions. So if 6% of a population are Black men with a certain level of education and a certain amount of household income, then a survey will strive to have 6% of its respondents match those characteristics.

In quota sampling[15], participants are not necessarily selected at random; instead, they are recruited because they have specific demographic attributes. This method is less statistically rigorous and more prone to bias, though it can yield a representative sample with relative efficiency. By contrast, stratified sampling randomly selects participants within defined groups, reducing sampling error and providing more precise estimates of population characteristics.
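
To make the contrast concrete, here is a minimal sketch of quota filling, assuming respondents arrive in an uncontrolled stream (as in an opt-in panel) and are accepted only while their group's quota is unfilled; the group labels and quota shares are illustrative:

```python
def fill_quotas(respondent_stream, quotas, n):
    """Accept arriving respondents until each demographic quota is full.

    respondent_stream: iterable of (respondent_id, group) pairs
    quotas: dict mapping group -> target share of the sample (sums to 1)
    n: total sample size
    """
    targets = {group: round(n * share) for group, share in quotas.items()}
    counts = {group: 0 for group in quotas}
    sample = []
    for respondent, group in respondent_stream:
        if group in counts and counts[group] < targets[group]:
            sample.append(respondent)
            counts[group] += 1
        if len(sample) == n:
            break
    return sample

# Example matching the article: a group that is 6% of the population
# should make up 6% of a 1,000-person sample, i.e. 60 respondents.
stream = [(f"r{i}", "target_group" if i % 10 == 0 else "other")
          for i in range(20000)]
sample = fill_quotas(stream, {"target_group": 0.06, "other": 0.94}, 1000)
print(len(sample))  # 1000
```

Note that nothing in this loop randomizes who gets in: whoever happens to arrive first fills the quota, which is the source of the bias the article mentions.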

To help polling operations find potential respondents, political and marketing consulting firms have compiled voter information, including demographic data and contact details. At Emerson College Polling, we have access to a database of 273 million U.S. adults, with 123 million mobile numbers, 116 million email addresses and nearly 59 million landline numbers.

A newer technique pollsters are using to reach respondents is something called river sampling[16], an online method in which individuals encounter a survey during their regular internet browsing and social media activity, often through an ad or pop-up. They complete a short screening questionnaire and are then invited to join a survey opt-in panel whose members will be asked to take future surveys.

[Photo: A digital figure emerges from a long stream of other digital figures. Databases compile large amounts of information about many U.S. voters. da-kuk/E+ via Getty Images[17]]

Emerson College Polling methodology

Our polling operation has used a range of approaches to reach the more than 162,000 people who have completed our polls so far this year in the United States.

Unlike traditional pollsters, Emerson College Polling does not rely on live operator data collection, except in small-scale tests used to evaluate and improve new survey methods.

Instead, like most modern pollsters, we use a mix of approaches[18], including text-to-web surveys, interactive voice response on landlines, email outreach, and opt-in panels. This combination allows us to reach a broader, more representative audience, which is essential for accurate polling in today’s fragmented social and media landscape. This diverse population includes younger individuals who communicate through various platforms distinct from those used by older generations.

When we contact the people in our stratified samples, we take into account differences between each communication method. For example, older people tend to answer landlines, while men and middle-aged people are more responsive to mobile text-to-web surveys. To reach underrepresented groups[19] – such as adults ages 18 to 29 and Hispanic respondents – we use online databases that they have voluntarily signed up for, knowing they may be surveyed.

We also use information about whom we sample, and how, to calculate the margin of error, which measures the precision of poll results. Larger sample sizes tend to be more representative of the overall population and therefore produce a smaller margin of error.

For instance, at a 95% confidence level, a poll of 400 respondents typically has a 4.9% margin of error, while increasing the sample size to 1,000 reduces it to about 3%, offering more accurate insights[20].
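
Those figures follow from the standard margin-of-error formula for a proportion at a 95% confidence level, z·√(p(1−p)/n) with p = 0.5 and z ≈ 1.96. A quick check (the function name is mine):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion p with sample size n."""
    return z * math.sqrt(p * (1 - p) / n)

print(f"{margin_of_error(400):.1%}")   # 4.9%
print(f"{margin_of_error(1000):.1%}")  # 3.1%, roughly the 3% cited
```

Because the margin shrinks with the square root of the sample size, cutting it in half requires roughly quadrupling the number of respondents.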

The goal, as ever, is to present to the public an accurate reflection of what the people as a whole think about candidates and issues.

References

  1. ^ 264 million eligible voters (bipartisanpolicy.org)
  2. ^ we lead (emersoncollegepolling.com)
  3. ^ primarily collected through mail and in-person interviews (www.pewresearch.org)
  4. ^ evolving challenges of reaching diverse populations (www.salon.com)
  5. ^ ferrantraite/E+ via Getty Images (www.gettyimages.com)
  6. ^ household telephones had become widespread (www.statista.com)
  7. ^ random digit dialing (doi.org)
  8. ^ not useful for opinion surveys (doi.org)
  9. ^ registration-based sampling (doi.org)
  10. ^ stratified sampling (www.investopedia.com)
  11. ^ interactive voice response (www.nationalreview.com)
  12. ^ did not require live operators (www.wsj.com)
  13. ^ internet-based polling (today.yougov.com)
  14. ^ nonprobability sampling (doi.org)
  15. ^ quota sampling (doi.org)
  16. ^ river sampling (resources.pollfish.com)
  17. ^ da-kuk/E+ via Getty Images (www.gettyimages.com)
  18. ^ we use a mix of approaches (emersoncollegepolling.com)
  19. ^ reach underrepresented groups (doi.org)
  20. ^ offering more accurate insights (www.mediaethicsmagazine.com)

Authors: Spencer Kimball, Associate Professor of Communications, Director of Emerson College Polling, Emerson College

Read more https://theconversation.com/how-pollsters-have-adapted-to-changing-technology-and-voters-who-dont-answer-the-phone-240283

Metropolitan republishes selected articles from The Conversation USA with permission

Visit The Conversation to see more