Comparing and Combining Text-to-Web and Panel-to-Web Surveys (Part 2)
Kevin Collins


In our last blog post, we described an experiment we conducted in 2024 and presented at AAPOR this past May comparing (and testing strategies for combining) text-to-web and panel-to-web data. We found that the matched panel product we used was less expensive than text-to-web, but it struggled to hit target numbers of completed interviews in two of the three mid-sized states in the study. Panel respondents were more professional: they answered much more quickly, passed attention checks at somewhat higher rates, and reported taking many more other surveys on average. But panel respondents were also less likely to report having a college degree and were less politically engaged.

Comparing and Combining Text-to-Web and Panel-to-Web Surveys (Part 1)
Kevin Collins

Last week at the annual meeting of the American Association for Public Opinion Research, we presented internal research conducted just before the November 2024 election. Although we focus on using text messages to field surveys, the vast majority of the research we do for clients is part of mixed-mode projects, which is why we have pursued a multi-year research agenda to identify best practices for combining data across modes. For example, at last year’s AAPOR conference we presented research on comparing and combining text-message and phone surveys conducted in 2023.

How long should we stay in the field? (Part II)
Kevin Collins

Today we return with Part II of “How long should we stay in the field?”, looking at how long to leave text-to-web surveys open.

In the lead-up to the 2023 Kentucky gubernatorial election, we fielded an experiment randomizing a voter-file-sampled list to receive either text messages (or, for landlines, IVR calls) or live-interviewer calls. We wrote about some other findings from this survey previously. In the first round, we let the initial text messages sit for a couple of days before sending a follow-up to non-responders.

How long should we stay in the field? (Part I)
Kevin Collins

Today we want to share some research, presented at last year’s AAPOR conference, on when to stop fielding.

The persistence and asynchronous nature of text message communication makes this a compound question: how long should we leave open the online component for a text-to-web survey, and should we send follow-up requests for participation among non-responders?

Improving Survey Performance with a Better Texting Interface
Kevin Collins

Our software is purpose-built to conduct surveys, and we are always on the lookout for improvements that help us serve our sole mission: conducting surveys efficiently and accurately. Over the past several months, we have overhauled several aspects of our application to increase throughput and interviewer speed, reducing costs for our clients, whether they are running election polls, media measurement, or customer feedback surveys.

Attention Checks
Kevin Collins

A forthcoming article in the Journal of Survey Statistics and Methodology by Tobias Rettig and Annelies Blom raises important concerns about the still all-too-frequent practice of screening out respondents on the basis of attention checks in web surveys.
