How to ask about attitude change

Did your new marketing campaign make customers more likely to buy your product?

Measuring change the traditional way

When something changes in your business strategy or in the world at large, it can cause shifts in attitudes, opinions, and behaviors. It could be a viral marketing campaign that you hope influences purchasing decisions, a new benefit you just introduced to your employees that might improve retention, a major political event that could change the results of an election, or a new disease that makes people more worried about their health. Whatever it is, you need to ask questions to know the impact.

The simplest way to ask about change may be to just ask:

[Image: example survey question in the change format]

Let’s call this type of question the “change format.” It’s a standard type of survey question.

Interpreting the data for this type of question is straightforward; if 65% of people who saw your commercial say it made them more likely to buy your product, that’s a win! But what does that number mean? Are those people more likely to buy your product because of the commercial they saw, or are they expressing enthusiasm for the product in general? 

Survey respondents sometimes answer the question they want to answer, rather than the question they were asked; this is called “response substitution.” In our example above, they might answer with their overall impression of the product rather than how the commercial changed their opinion. Those who said “more likely” might be trying to say that they are very likely to buy the product, while those who say “less likely” might be expressing that they don’t want to buy the product at all – not taking the commercial into consideration. This can inflate your metrics and mislead you into thinking that your commercial had a stronger impact than it did.

An alternative way to ask about change

There’s another way to make sure you are measuring the real impact: ask your respondents about their current opinion, then pose a hypothetical situation in which they didn’t know about the event and ask them to imagine what their opinion would be in that situation.

What would that look like for our commercial example above? You would ask two questions:

[Image: question 1, asking respondents’ current likelihood of buying the product]
[Image: question 2, asking how likely they would be to buy if they had not seen the commercial]

This is the “counterfactual format,” a new method of measuring change. This format works differently from the change format because respondents are given the chance to express their current attitude, not just whether their attitude changed. Respondents get to answer both the question they were asked and the question they wanted to answer.

Using this format, you can calculate the percentage of respondents whose response to question 2 was lower on the scale than their response to question 1; this is equivalent to the “more likely” group in the change format. For example, if a respondent selected “Very likely” in question 1 and “Somewhat likely” in question 2, they were more likely to purchase your product after seeing the commercial. You can do the same thing in the opposite direction to find your “less likely” group, and those who didn’t change their answer are your “no impact” group.

[Table: how combinations of question 1 and question 2 responses map to the “more likely,” “less likely,” and “no impact” groups]
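The grouping logic above is easy to automate once each answer option is mapped to a position on the scale. Here is a minimal sketch in Python; the five-point scale and its ordering are assumptions for illustration, since the post doesn’t show the actual answer options:

```python
from collections import Counter

# Assumed five-point scale; position 1 is the top of the scale.
SCALE = {
    "Very likely": 1,
    "Somewhat likely": 2,
    "Neither likely nor unlikely": 3,
    "Somewhat unlikely": 4,
    "Very unlikely": 5,
}

def classify(current, counterfactual):
    """Compare the current opinion (question 1) with the imagined
    opinion had the respondent not seen the commercial (question 2)."""
    diff = SCALE[counterfactual] - SCALE[current]
    if diff > 0:
        # Question 2 sits lower on the scale than question 1,
        # so the commercial increased purchase intent.
        return "more likely"
    if diff < 0:
        return "less likely"
    return "no impact"

# Each tuple is (question 1 answer, question 2 answer) for one respondent.
responses = [
    ("Very likely", "Somewhat likely"),
    ("Somewhat unlikely", "Somewhat unlikely"),
    ("Somewhat likely", "Very likely"),
]

counts = Counter(classify(q1, q2) for q1, q2 in responses)
shares = {group: n / len(responses) for group, n in counts.items()}
print(shares)
```

With this mapping, the example from the text (“Very likely” in question 1, “Somewhat likely” in question 2) lands in the “more likely” group, and the three shares sum to 100% of respondents.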

Experimenting with the counterfactual format

When we wanted to understand how the emergence of a new COVID-19 variant affected levels of concern about the pandemic, we were curious about how the counterfactual format would compare to the traditional change format. Randomly assigning respondents to one format or the other, we asked the following questions:

Change format

[Image: change-format question about how the new variant affected concern about COVID-19]

Counterfactual format

[Image: question asking respondents’ current level of concern about COVID-19]
[Image: question asking what their level of concern would be if the new variant had not emerged]

We found that the counterfactual group showed significantly lower rates of attitude change than the change format group, with 73% of the counterfactual group vs. 43% of the change format group showing no effect of the emergence of the BA.5 variant on their concern about COVID. Only 17% of the counterfactual group vs. 39% of the change format group were more concerned about COVID, and 8% of the counterfactual group vs. 16% of the change format group were less concerned. 

[Chart: rates of attitude change under the counterfactual format vs. the change format]
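To get a feel for how large the gap between the two formats is, you can run a standard two-proportion z-test on the “no impact” rates reported above (73% counterfactual vs. 43% change format). This is a hypothetical sketch: the post doesn’t give the sample sizes, so n = 500 per group is an assumption:

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """z statistic for the difference between two sample proportions,
    using the pooled proportion to estimate the standard error."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# 73% "no impact" in the counterfactual group vs. 43% in the change-format
# group; n = 500 per group is assumed for illustration.
z = two_proportion_z(0.73, 500, 0.43, 500)
print(round(z, 2))  # |z| > 1.96 means the gap is significant at the 5% level
```

At any plausible sample size for a survey experiment like this, a 30-point gap produces a z statistic far beyond the 1.96 cutoff, consistent with the significant difference the authors report.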

Along with researchers from Temple and Yale Universities, we tested another version of the counterfactual format in measuring changes in political attitudes and found a similar reduction in attitude change (see Barari, Coppock, Graham, & Padgett, 2023).

The takeaways

The reduction in the magnitude of change for the counterfactual group suggests that the counterfactual format is more accurate than the change format. If we had used the change format alone, we would have overestimated the differences in levels of concern before and after the new variant emerged. 

We’ve looked at examples from advertising, public health, and political polling, but these findings are applicable to surveys in any field. The next time you want to know whether an event, a new promotion, an HR decision, or a marketing campaign has changed people’s attitudes and opinions, consider trying out the counterfactual format.