Board of Governors of the Federal Reserve System

Insights into the Financial Experiences of Older Adults: A Forum Briefing Paper

Appendix A: Older Adult Survey Methodology

The Older Adult Survey was administered online to a sample of respondents from RAND's American Life Panel (ALP). The ALP is an Internet panel that includes more than 5,000 people age 18 and older who have agreed to participate in periodic online surveys. RAND has used several different methods to recruit panelists. These include inviting respondents to other surveys (e.g. University of Michigan Surveys of Consumers and the Stanford/Abt SSRI National Survey Project) to join the panel; snowball sampling and respondent-driven sampling to recruit contacts of existing members of the panel; and recruiting from other sources drawn from nationally representative samples. While some additional members of respondent households have been recruited to join the panel, the ALP is intended to be a survey of individuals, not households.72

Respondents are paid an incentive for their participation and, when necessary, are provided with Internet access that can be used for responding to surveys, as well as for general access to e-mail and the Internet.73 At the time that the Older Adult Survey was collected, over 300 surveys had been fielded on the ALP since January 2006. Topics covered on prior surveys include financial decision making, the effect of political events on self-reported well-being, inflation expectations, joint retirement decisions, retirement preferences, health decision making, Social Security knowledge and expectations, measurement of health utility, and numeracy.74 Panelists generally receive about two invitations a month to respond to surveys. In recent years, the majority of the core instruments of the Health and Retirement Survey (HRS) have been fielded on the panel, providing a rich set of variables on health, disability, retirement, pensions and other topics that can be matched by respondent identifier for analysis in connection with other survey data collected on the ALP.75

The Older Adult Survey was developed by staff from the Division of Consumer and Community Affairs (DCCA) at the Federal Reserve Board. RAND was consulted at various stages in order to maximize the response rate. In November 2012, a draft survey was pre-tested with 150 panelists. The resulting data were reviewed for completion time, response incidence, completion rates, and other response issues. Based on this review, the survey instrument was revised. The final survey consisted of 62 questions, although individual respondents may have received fewer questions based on skip patterns.

E-mail invitations to participate in the Older Adult Survey were sent to 2,328 ALP panelists. The sample was constructed to be representative of adults age 40 and above, while also collecting a sufficient number of responses within each of four age ranges to allow for statistical comparisons. The number of panelists sampled within each of the four age groups was as follows: 617 panelists age 40-49, 627 panelists age 50-59, 633 panelists age 60-69, and 451 panelists age 70 and above. Given the relatively small number of panelists age 70 or older, the survey invitation was sent to all of the ALP panelists in this age group. Invitations were sent to only a subset of ALP panelists in the younger age groups. The sample was drawn so that each age group matched the U.S. population distribution on gender and income, and that all the age groups when aggregated matched the U.S. population distribution on race and ethnicity.76

Invitations to respond to the final survey were e-mailed to panelists on December 3, 2012. Generic e-mail reminders for ALP panelists to respond to all of their open surveys were sent on December 10 and 17; specific e-mail reminders for the Older Adult Survey were sent on December 10, 18, and 21. The survey was closed to additional responses on December 23. Respondents were paid an incentive of $20 per half hour, pro-rated to the estimated length of the survey. Median response time for respondents who completed the survey was 10 minutes, and the middle 50 percent of response times ranged from 7 to 15 minutes.
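For concreteness, the pro-rating described above can be sketched as follows. The linear pro-rating formula is an assumption based on the description; only the $20-per-half-hour rate comes from the text.

```python
# Pro-rate a $20-per-half-hour incentive to a survey's estimated length.
# The linear pro-rating formula is assumed, not documented in the source.
def incentive(estimated_minutes: float, rate_per_half_hour: float = 20.0) -> float:
    """Return the incentive in dollars for a survey of the given length."""
    return round(rate_per_half_hour * estimated_minutes / 30.0, 2)
```

Under this assumption, a survey estimated at 15 minutes would pay $10.00.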

A total of 1,821 panelists responded to the e-mail invitation and finished the survey, for a completion rate of 78 percent. Respondent ages ranged from 40 to 94, with a mean and median age of 59; respondents over the age of 70 made up 17 percent of the sample. With respect to gender and race, 52 percent of respondents were female and 16 percent were non-white. The median family income fell in the $50,000 to $59,999 range. None of the panelists responding to the Older Adult Survey had been provided with Internet access or computer hardware by RAND in order to participate in the ALP.

Table A.1 includes information on the number of invitations sent out by age group, along with the number of panelists who completed the survey. Callegaro and DiSogra (2008) note that Internet panels like the ALP involve several stages of response, including the response to the initial invitation to join the panel, completion of an initial profile, and response to a particular survey invitation. The completion rates in Table A.1 reflect only the share of panelists contacted for this particular survey who completed it. These are not cumulative response rates, as they do not factor in the response rates at earlier stages of recruitment to join the ALP or its precursor surveys (e.g., the University of Michigan Survey of Consumers).77 Therefore, the completion rates listed below are not directly comparable to response rates from surveys that do not involve multiple stages of recruiting.

Table A.1. Completion rates for the Older Adult Survey

Age group   Invitations sent   Surveys completed   Percent completing
40s                617                432                 70%
50s                627                486                 78%
60s                633                536                 85%
70s+               451                367                 81%
Total            2,328              1,821                 78%
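As an arithmetic check, the rates in Table A.1 can be recomputed directly from the invitation and completion counts:

```python
# Recompute the Table A.1 completion rates from the raw counts.
invited = {"40s": 617, "50s": 627, "60s": 633, "70s+": 451}
completed = {"40s": 432, "50s": 486, "60s": 536, "70s+": 367}

for group, n in invited.items():
    print(f"{group}: {completed[group] / n:.0%}")  # 70%, 78%, 85%, 81%

total_rate = sum(completed.values()) / sum(invited.values())
print(f"Total: {total_rate:.0%}")  # 78%
```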

The sampling strategy described above was designed to make the sample representative of the U.S. population age 40 and older along a few dimensions such as gender and household income. However, the final sample of respondents may not mirror the selected sample of individuals invited to take the survey, because response rates may differ across groups with different characteristics (e.g. females may be more willing to respond than males). Furthermore, in order to infer population-level statistics from the sample data, it is appropriate to consider how the sample distributions of other key demographics besides those used in the selection scheme compare to their population counterparts and to adjust for any observed discrepancy. For this purpose, RAND provided sampling weights to be used to make sample statistics representative of the population.

Sampling weights were generated using an iterative raking algorithm. That is, each survey respondent was assigned a weight such that the weighted distributions of specific socio-demographic variables in the sample matched their population counterparts (benchmark or target distributions). The benchmark distributions were derived from the Current Population Survey (CPS) Annual Social and Economic Supplement administered in March 2012. The socio-demographic variables whose distributions were matched include four two-way marginals (interaction variables): gender (male, female) by age (40-49, 50-59, 60-69, 70+); gender by race (non-Hispanic white, other); gender by education (high school or less, some college, bachelor's degree or more); and household income (less than $30,000, $30,000-$59,999, $60,000 or more) by number of household members (single, couple, 3 or more members).78
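The raking step can be illustrated with a simplified sketch. This version uses two one-way margins (gender and age) rather than the four two-way marginals actually used, and the respondents and target shares are hypothetical:

```python
# Simplified iterative raking (iterative proportional fitting).
# Hypothetical five-person sample and target shares; the actual ALP
# weights rake over the four two-way marginals listed above.
respondents = [
    {"gender": "F", "age": "40s"}, {"gender": "F", "age": "50s"},
    {"gender": "M", "age": "40s"}, {"gender": "M", "age": "50s"},
    {"gender": "M", "age": "50s"},
]
targets = {
    "gender": {"F": 0.5, "M": 0.5},   # population shares to match
    "age": {"40s": 0.6, "50s": 0.4},
}

weights = [1.0] * len(respondents)
for _ in range(100):  # alternate margin adjustments until they converge
    for dim, shares in targets.items():
        total = sum(weights)
        margin = {lvl: sum(w for w, r in zip(weights, respondents)
                           if r[dim] == lvl) for lvl in shares}
        weights = [w * shares[r[dim]] * total / margin[r[dim]]
                   for w, r in zip(weights, respondents)]

# After convergence, the weighted shares match the targets on each margin.
total = sum(weights)
female_share = sum(w for w, r in zip(weights, respondents)
                   if r["gender"] == "F") / total
```

Each pass rescales the weights so one margin matches its target; alternating over the margins converges to weights that satisfy all of them at once.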

While weighting is designed to make the sample representative of the population with respect to particular socio-demographic characteristics, the sample still may differ from the general population in other respects. For example, Internet use has increased rapidly across all age groups in recent years; however, older age groups are still less likely to be online than younger cohorts.79 Specifically, more than one-half (53 percent) of American adults age 65 and older now use the Internet or e-mail, up from 38 percent in 2008, with most of the growth occurring since 2011. In the next oldest age group, ages 50 to 64, more than three-quarters (77 percent) use the Internet. This compares with rates of 90 percent or higher for age groups younger than 50.

In a recent study of Internet access and cognitive ability, researchers find that cognitive ability has a significant and positive relationship with Internet access, after controlling for differences in demographic characteristics and economic factors.80 Furthermore, they find that this effect increases with age. Because cognitive abilities are important for financial decisionmaking, these findings imply that using an Internet-only sample to examine the financial decisions of an older population is likely to produce measures of older adults' decisions, and of differences across age groups, that differ from those that would be found in the general population, even with the use of population weights.

The RAND ALP supplemental data provide cognition scores for 1,690 of the 1,821 Older Adult Survey respondents. This report uses fluid and crystallized cognitive ability measures from the ALP based on the Woodcock-Johnson III (WJ-III) test, part of a cognitive battery that measures individual ability relative to a nationally normed sample. Fluid cognitive abilities are assessed by giving participants a sequence of numbers with a blank somewhere in the sequence. Crystallized cognitive abilities are assessed by way of a Picture Vocabulary test. A Verbal Analogies test, which assesses both fluid and crystallized cognitive abilities, was also given to participants. Each test uses a Block Adaptive Testing (BAT) format in which respondents first receive three items of varying difficulty and are then routed to one of four other three-item sets, of increasing difficulty, based on the number they answered correctly in the first set. Performance is summarized in a W score, normed against a national sample, where higher W scores indicate greater cognitive ability.
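The block-adaptive routing described above can be sketched as follows. The block labels and the exact routing rule are illustrative assumptions; only the three-item blocks and the four follow-up sets come from the text.

```python
# Illustrative block-adaptive routing: a three-item routing block is
# scored 0-3, and that score selects one of four follow-up three-item
# blocks of increasing difficulty. Block labels are hypothetical.
FOLLOWUP_BLOCKS = {
    0: "set A (easiest)",
    1: "set B",
    2: "set C",
    3: "set D (hardest)",
}

def route(num_correct: int) -> str:
    """Map the routing-block score to a follow-up item set."""
    if num_correct not in FOLLOWUP_BLOCKS:
        raise ValueError("score on a three-item block must be 0-3")
    return FOLLOWUP_BLOCKS[num_correct]
```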

Concerning the representativeness of Internet panels in general, questions have been raised about whether repeated survey-taking can cause panelists to become less representative of the general population over time.81 Such "panel conditioning" could result in seasoned respondents approaching the task of completing a survey differently from new respondents. Learning about the general process of responding to survey questions could, in principle, result in either higher- or lower-quality survey responses. Furthermore, prior exposure to a topic covered in earlier surveys could make respondents more knowledgeable about that topic and could alter their behavior as well. Thus, while earlier surveys of ALP respondents provide a rich source of additional measures that can be used to augment the data collected from this survey, it is possible that prior surveys about financial topics altered the respondent knowledge and behaviors that this survey sought to measure.

Results from the respondents to the Older Adult Survey should be considered with these caveats about representativeness in mind. The use of an Internet panel provides a relatively quick and cost-effective way to investigate a broad set of issues concerning older adults. It also helps identify areas for further research and can inform the design of follow-up surveys using other quantitative or qualitative methods. When possible, survey results presented in this report are discussed along with related measures from other sources to provide context or some basis of comparison for the estimates.


References

72. More information on recruiting for the ALP is available at https://mmicdata.rand.org/alp/index.php?page=panelcomposition

73. For an overview of how surveys are conducted on the ALP, see https://mmicdata.rand.org/alp/index.php?page=main

74. More information on the development of the ALP is available at https://mmicdata.rand.org/alp/index.php?page=panel

75. More information on the HRS questions on the ALP is available at https://mmicdata.rand.org/alp/index.php?page=hrs

76. The distribution of race in the general U.S. population is 67 percent non-Hispanic white and 33 percent other, based on the 2012 Current Population Survey (CPS).

77. Response rates at the first stage of recruitment to join the ALP vary across the different recruitment methods. For example, through August 2008, when RAND discontinued recruiting from University of Michigan Monthly Survey respondents, about 30 percent of the Michigan Survey respondents who were invited to join the ALP became ALP panelists. In comparison, about 46 percent of the Stanford panel participants chose to become ALP panelists. In addition to the inclusion of new panelists from recruiting efforts, the panel composition also changes with panel attrition, as inactive panelists are removed from the panel after a year. More information on panel recruitment and attrition can be found at https://mmicdata.rand.org/alp/index.php?page=panelcomposition and https://mmicdata.rand.org/alp/index.php?page=panelattrition

78. More information about weighting for the ALP is available at https://mmicdata.rand.org/alp/index.php?page=weights

79. Zickuhr and Madden (2012).

80. Hsu, Willis, and Fisher (2011).

81. Dennis (2001); Toepoel, Das, and van Soest (2008); Baker et al. (2010).

Last update: July 30, 2013
