
1 Introduction

The 2020 Census was the first U.S. census to use an online reporting option as the primary data collection vehicle. The online census questionnaire was available between March and October of 2020, with Census Day being April 1, 2020. Most residential addresses in the U.S. were mailed notifications about answering the census. These mail pieces included the URL of the online census questionnaire along with an authentication code, called the Census ID, specific to that address. The public could report their census information even without a Census ID, but they then had to enter their address. In May 2020, Census Bureau staff noticed higher-than-expected use of the online census path in which users did not enter a Census ID, and these respondents were more likely to exit the census questionnaire before reaching the end of it. The former problem could have been due to confusion on the questionnaire’s login screen or to problems with the mail materials where the authentication code was printed. The latter problem could also have reflected a usability problem with that questionnaire path. To examine whether there were usability issues with the online census questionnaire or the mail materials, staff created and disseminated a short, three-minute survey, called the 2020 Census User Experience (UX) Survey, to measure satisfaction with the online census experience.

Because the UX survey was not originally part of census production planning, and because mobility was restricted during the COVID-19 pandemic, staff needed a convenient way to notify the sample that minimized unanticipated costs and additional staffing needs. Contact via text message appeared to be a viable option, as the online census had requested respondents’ phone numbers in case any official follow-up was needed. The UX survey was the Census Bureau’s first experience with text-only notification for an online survey.

While we did not select a sample for the UX survey based on respondent demographic characteristics, this paper repurposes the data to look at age-related differences. One goal is to determine whether older adults who answered their census online would be more or less likely to use text messaging as a communication vehicle to access another online survey. Measuring any limitations associated with notification modes is important to reduce total survey error [1]. The other goal of this analysis is to see whether older adults differed in their satisfaction with the online 2020 Census questionnaire compared to middle-aged and younger adults. As this was the first census to use an internet questionnaire as the primary mode of response, documenting if the online survey was satisfactory for all respondents, regardless of age, was important.

To be in the sampling frame for the UX survey, the respondent had to initiate or complete the online version of the 2020 Census, which included entering a cell phone number as their contact number. There was some concern that the online census questionnaire would pose a burden for older adults, as internet access is not as prevalent in that age group [2, 3]. However, older adults traditionally have been considered “good” responders to the census in general, with high response rates. In a study measuring the attitudes and behaviors of the U.S. public, older respondents were more likely to report that they intended to participate in the census, while younger householders were the least likely to do so [4,5,6]. These findings suggest that while some older adults will not be able to use the online census questionnaire, those who do have internet access should be more likely than younger adults to answer the census online.

To access the UX survey, the census respondent also had to respond to a text message sent to the cell phone number collected by the online census questionnaire. As with internet access, older adults are less likely to use text messaging, with differences especially apparent by health and socioeconomic status: those with fewer physical limitations and higher education levels use technology more [7]. Other studies show that older adults who use text messaging do so successfully to build social relationships [8] and improve health experiences [9, 10]. Given that older adults have been shown to be “good” responders, and that these particular older adults are technically savvy because they already completed the 2020 Census online, we predict age-related differences in engaging with a survey through a text notification. Our hypothesis is that older adults will be more likely to engage with the text message than middle-aged or younger adults. Engagement includes actions such as refusing to participate (engaging with the text by replying STOP) or participating (accessing, initiating, or completing the web survey by selecting the link). We also examine participation as measured by breakoffs within the survey (the difference between those who initiate the survey and those who complete it).

We also wanted to determine whether user satisfaction with the online census questionnaire differed by age. Researchers who have examined the “positivity effect” associated with older adults [11] have observed a shift in behavior and attitude from a negativity bias early in life to a positivity bias in middle and late adulthood [12]. While there is some debate in the literature about whether this phenomenon occurs because older adults forget or suppress negative experiences more easily, or because they simply assess those experiences more positively, studies tend to point to the latter [13,14,15,16]. In prior Census Bureau research with online mobile web surveys, adults 50 years and older rated their survey experience more positively than adults under 50 [17]. Based on this literature, our second hypothesis is that older and middle-aged adults will be more satisfied with their online census experience than younger adults.

2 Methods

Below are highlights of methods relevant to the online follow-up satisfaction survey.

2.1 User Experience Survey Questions

The UX survey took on average three minutes to answer and was programmed in Qualtrics, an off-the-shelf software application for designing web surveys. The first question in the UX survey (Fig. 1) asked for the respondent’s satisfaction with their online census questionnaire experience. The UX survey measured respondent satisfaction with the online census questionnaire to examine why census respondents were not using their Census ID. Because respondents who did not use their Census ID were also less likely to click “submit” at the end of the census questionnaire, the UX survey also sought to better understand whether something about this path led to increased exiting prior to selecting “submit” at the end of the questionnaire [18].

Fig. 1. Satisfaction question in the 2020 Census User Experience Survey

2.2 User Experience Survey Sample

We selected a representative nationwide sample of 2020 Census respondents who reported online. This sample consisted of 153,000 phone numbers that would receive the User Experience Survey via a web link in a text notification. These were likely cell phone numbers because, before sampling, we matched the phone numbers collected in the online census to an administrative list of possible cell phone numbers in order to remove landlines. Online responses from Puerto Rico were also removed because our satisfaction survey was in English only.

We stratified and sorted the frame before sampling. We did not aim to sample particular demographic groups, but rather to include 2020 Census online responders who might have had different experiences, such as those who used a mobile phone versus those who used a larger device (such as a laptop or PC) to answer the census. We also wanted to oversample those who did not use the Census ID as the authentication code and those who did not fully submit their census during the session (that is, those who answered a number of the questions in the questionnaire but failed to click the “submit” button).

In terms of timing, the “early responders” (those reporting in March through June) were sent the text notification to participate in the UX survey in August. For the later responders, text notifications were sent in one of three additional waves (September, October, or November 2020) based on when they answered the census. Those texts were sent about a month after they had answered the census. Each wave had an 11-day field period from the time the first text was sent until the closeout of that wave.

2.3 User Experience Survey Notification Method

For this UX survey, we sent up to three texts. Figure 2 shows how the first text message looked on a phone. Once respondents finished the UX survey, they did not receive the subsequent texts. Recipients could also reply STOP, which removed them from any subsequent messages.

Fig. 2. Example of the first text notification for the 2020 Census User Experience Survey

Texts were sent through the SMS texting capability in Qualtrics. Recipients saw a five-digit number on their phone, like the one shown at the top of Fig. 2. The wording of the three text messages underwent expert review and limited pretesting. We sent the texts at 12 noon or 6 pm in the time zone of the address associated with the phone number, on weekdays only, never on weekends or holidays. We sent the first two texts two or three days apart and the third text about a week later. See [18] for more details about the text messages.

2.4 Sample for These Analyses

To study age-related differences in the use of text message notifications to access a web survey, we examined three age groupings of respondents: 18–29, 40–51, and 65–76. These groupings are far enough apart to detect age-related differences and were used in other age-related analyses at the U.S. Census Bureau [19]. Of the 153,000 phone numbers in the UX survey sample, 70,392 fell into one of the three age ranges used in our analysis, as shown in Table 1.

Table 1. Recipient age groups used in analyses

As stated earlier, the original UX frame was stratified and sorted by characteristics that could affect user satisfaction with the online experience, in order to determine why users did not use an authentication code or did not select the submit button. Table 2 shows the breakdown of the original UX survey sample by these characteristics, and then the breakdown of those characteristics for the 70,392 cases used in the analyses reported in this paper. The percentages for our age-related subset differ slightly from the original sample: our dataset includes a higher percentage of larger-device users, of users who used the authentication code to access the census, of users who fully submitted the census, and of users who responded early to the census.

Table 2. Session characteristics for the full UX survey sample and for the age-group subset

2.5 Analysis Methods

To address the first hypothesis, we first conduct a Chi-square test of independence to examine whether text message usage differs across the three age groups, specifically for the following measures (an illustrative sketch of this test appears after the list):

  • Churn rate (the percent who replied STOP to the text message to remove themselves from receiving future text messages from the Census Bureau);

  • Access rate (the percent who selected the survey link in the text message – they may or may not have completed any of the survey);

  • Minimum completion rate (the percent who answered at least the first question in the UX survey, including those who answered or got to the last question in the UX survey);

  • Full completion rate (the percent who answered or got to the last question in the UX survey); and

  • Breakoff rate (of those who started the UX survey, the percent who answered at least the first question but who did not finish the UX survey). This is the difference between those who reached minimum survey completion and those who reached full survey completion.
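As an illustration of this analysis step, the sketch below shows a Chi-square test of independence for one of the measures (churn) using Python’s scipy; the contingency counts and column names are hypothetical placeholders, not the actual UX survey tallies.

```python
# Illustrative sketch of the Chi-square test of independence applied to one
# engagement measure (churn). The counts are hypothetical, not UX survey data.
import pandas as pd
from scipy.stats import chi2_contingency

# Rows: age groups; columns: replied STOP vs. did not reply STOP.
counts = pd.DataFrame(
    {"replied_stop": [3200, 3600, 4100], "no_stop": [20000, 21000, 18500]},
    index=["18-29", "40-51", "65-76"],
)

chi2, p_value, dof, _expected = chi2_contingency(counts)
print(f"chi-square({dof}) = {chi2:.1f}, p = {p_value:.4f}")
```

The same test is repeated with an analogous two-column table for each of the other engagement measures.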

We then use logistic regression models to check for possible confounding factors, examining the same five measures. The independent variable of interest is the age grouping. The older adult age group is the reference group so that we can compare older adults to younger adults and older adults to middle-aged adults. The models control for other characteristics from Table 2 that might influence responding to the UX survey: the device used, use of an authentication code, whether the census was fully submitted, and when the respondent answered the census.
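A minimal sketch of one such model (survey access) appears below. It assumes a respondent-level DataFrame with hypothetical, simulated columns and simply illustrates treatment coding with the older group as the reference; it is not the authors’ production code.

```python
# Illustrative logistic regression sketch with the older age group (65-76) as
# the reference category. Data and column names are simulated/hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "age_group": rng.choice(["18-29", "40-51", "65-76"], n),
    "device": rng.choice(["mobile", "larger"], n),
    "used_census_id": rng.integers(0, 2, n),
    "fully_submitted": rng.integers(0, 2, n),
    "response_wave": rng.choice(["Aug", "Sep", "Oct", "Nov"], n),
    "accessed": rng.choice([0, 1], n, p=[0.94, 0.06]),  # roughly a 6% access rate
})

model = smf.logit(
    "accessed ~ C(age_group, Treatment(reference='65-76')) + C(device)"
    " + used_census_id + fully_submitted + C(response_wave)",
    data=df,
).fit(disp=False)
print(model.summary())  # each age-group coefficient compares that group to 65-76
```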

For the second hypothesis, we first examine the five-level satisfaction scores (from very satisfied to very dissatisfied) by the three age groupings, using a Chi-square test of independence to see whether there is a relationship between satisfaction and age. Then we use a proportional odds model to explore whether there are significant differences in satisfaction by age, again controlling for the same variables mentioned earlier.
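The sketch below illustrates a proportional odds (ordered logit) model of a five-level satisfaction item with the same hypothetical controls, using statsmodels’ OrderedModel. The simulated data, category labels, and variable names are assumptions for illustration only, not the authors’ exact specification.

```python
# Illustrative proportional odds (ordered logit) sketch for a five-level
# satisfaction item. Data, category labels, and column names are hypothetical.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(1)
n = 3000
levels = ["Very dissatisfied", "Somewhat dissatisfied",
          "Neither satisfied nor dissatisfied", "Somewhat satisfied", "Very satisfied"]
df = pd.DataFrame({
    "satisfaction": pd.Categorical(rng.choice(levels, n), categories=levels, ordered=True),
    "age_group": rng.choice(["18-29", "40-51", "65-76"], n),
    "device": rng.choice(["mobile", "larger"], n),
    "used_census_id": rng.integers(0, 2, n),
    "fully_submitted": rng.integers(0, 2, n),
    "response_wave": rng.choice(["Aug", "Sep", "Oct", "Nov"], n),
})

# Dummy-code the categorical controls and drop the reference levels by hand so
# the older age group (65-76) is the comparison group. OrderedModel estimates
# thresholds instead of an intercept, so no constant column is added.
exog = pd.get_dummies(df[["age_group", "device", "response_wave"]])
exog = exog.drop(columns=["age_group_65-76", "device_larger", "response_wave_Aug"])
exog = exog.join(df[["used_census_id", "fully_submitted"]]).astype(float)

result = OrderedModel(df["satisfaction"], exog, distr="logit").fit(method="bfgs", disp=False)
print(result.summary())
```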

In the analysis, we consider a result significant when p ≤ 0.05.

3 Results

There was a 5.9% access rate and a 15.7% churn rate across all 70,392 recipients, with a small amount of overlap between the two groups. The remainder, 78.6%, did not engage with the texts at all. Most of the people who accessed the survey answered the first question (5.3% of all recipients). Of those who answered the first question, 16.7% broke off before completing the entire UX survey. Overall, satisfaction with the online census questionnaire experience was high; 87.4% reported being either very or somewhat satisfied.
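As a rough reconciliation of these figures (our own arithmetic on the rounded percentages, not a value reported by the survey system), recipients who engaged in any way totaled 100% − 78.6% = 21.4%, while the access and churn rates sum to 5.9% + 15.7% = 21.6%, implying an overlap of roughly 0.2 percentage points, that is, recipients who both selected the link and replied STOP.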

3.1 Hypothesis 1: There Are Age-Related Differences in Engaging with a Text-To-Web Survey

Table 3 includes the percent of each age subgroup who replied STOP to the text, accessed the UX survey link from the text, answered the first question, and finished the survey. The bottom row provides the Chi-square test of independence statistic for each column. Each of the Chi-square statistics is significant, so we reject the null hypothesis of independence: there is evidence that engaging with the text-to-web survey depends on age for these four measures.

Table 3. Text-to-web survey engagement rates by age groups

Logistic regression models allow us to examine differences between age subgroups while controlling for fixed effects that might have also influenced participation in the UX survey. For churn, we found that younger adults were less likely to reply STOP than older adults (β = −.33, p < .01). We found no evidence that middle-aged adults were more or less likely to reply STOP than older adults (β = −.009, p = .5). On survey access, we found that younger adults and middle-aged adults were both less likely to access the UX survey than older adults (β = −.4, p < .01 and β = −.06, p < .01, respectively). This same pattern held for answering the first question in the survey. However, for fully completing the survey, while young adults were less likely to fully complete the UX survey than older adults (β = −.04, p < .01), there was only marginal evidence that middle-aged adults were less likely to fully complete the survey compared to older adults (β = −.04, p = .06).

We separate breakoffs from the other engagement measures in Table 3 because, for breakoffs, we subset the data to examine only those who answered the first question in the survey. Table 4 includes the percent of each age subgroup who broke off before completing the entire UX survey, with the Chi-square test of independence statistic. In the table we see a pattern of younger adults having the highest breakoff rate, followed by middle-aged adults and then older adults. However, the Chi-square statistic was not significant for age groupings by breakoffs, so we find no evidence that the tendency to break off depends on age. When controlling for the other factors in the logistic model, we confirm this finding; that is, we did not find evidence that the age groupings differ in their breakoff rates (Wald χ2(2) = 1.5, p = .5). Younger adults were no more or less likely to break off within the UX survey than older adults (β = −.05, p = .5), and the same held for middle-aged adults, who were no more or less likely to break off than older adults (β = −.03, p = .6).

Table 4. Respondent age groups by breakoffs

3.2 Hypothesis 2: Older Adults and Middle-Aged Adults Will Be More Satisfied with Their Online Census Questionnaire Experience Than Younger Adults

Table 5 contains the satisfaction results for the three age groupings: younger adults, middle-aged adults, and older adults. The Chi-square statistic was significant (χ2(8) = 99.0, p < .01), indicating that satisfaction differed by age group.

Table 5. Satisfaction by age groups

The proportional odds model found age to be associated with satisfaction when controlling for the other fixed effects. Younger adults were more likely to report lower satisfaction than older adults (β = −0.3849, p < .01), but middle-aged adults and older adults did not differ in their satisfaction ratings (β = 0.0037, p = .5). This provides evidence in favor of Hypothesis 2.

4 Discussion

The purpose of this research was two-fold: to learn more about older adults’ use of text messages for survey notification and to explore whether these older adults had a satisfactory experience using the 2020 Census online questionnaire, or whether there were age-related differences in satisfaction. While the population 65 and older generally has the lowest rate of internet access in the U.S., more than 60% of this group has internet access, and many of them used it to complete the 2020 Census online.

Most of the U.S. was notified about the online census through letters and postcards mailed through the U.S. Postal Service. In a follow-up survey measuring user satisfaction, the Census Bureau was able to study how text notification for an online survey would work for the public. We defined engaging with the text messages as either selecting the survey link and answering some or all of the survey, or replying STOP to end the text message notifications. We found that older adults (defined in our analyses as those 65–76 years old) who answered the census online and who had a cell phone were more engaged with the text messages than either younger adults (18–29 years old) or middle-aged adults (40–51 years old). Based on these data, we conclude that text messaging is a viable communication tool for older adults who are already online and who have a cell phone.

It is impossible, however, to know whether using text messages to notify older adults of other survey opportunities, ones not tied to census follow-up, would result in the same level of engagement. Past research has shown that older adults are aware of the census and its importance, and these factors likely contribute to the high rate of self-response among this age group. Nevertheless, knowing that it is possible for older adults to engage with text messages for survey purposes is useful for other survey research activities. Also, once engaged in the online survey, older adults were no more or less likely to break off before completing it than the other age groups studied.

In these analyses we examined satisfaction with the online census questionnaire experience. Similar to prior research findings [17], we continue to find that older adults report higher satisfaction with their survey experience than younger adults; middle-aged adults (40–51 years old) did so as well. As suggested in the literature, older adults exhibit a “positivity effect,” which could help explain these findings.

5 Conclusions

This research set out to examine the use of text messaging for survey notification and to gather more data on how satisfaction with survey experiences differs by age, specifically whether we could replicate a previous result that older adults tend to rate survey experiences as more satisfactory than younger adults do. For older adults who are able to answer online questionnaires, like the census, and who have cell phones, text notifications about an online survey can be as effective as they are for other age groups. We also found greater satisfaction among those 40 years and older compared to those 18 to 29 years old, which may be explained by the general “positivity effect” found in middle-aged and older adults. Greater satisfaction might also be explained by differences in household composition between younger adults and middle-aged and older adults. Future research should address whether controlling for covariates such as household size and the relationships within the household affects the differences in satisfaction by respondent age. Finally, the data used in this analysis continue to show how engaged older adults are with the U.S. Census.

6 Limitations and Implications for Future Research

Our research on older adults is limited to those who use the Internet and who have a cell phone to receive text messages. Older adults without these characteristics might behave differently.