Board of Governors of the Federal Reserve System

Consumers and Mobile Financial Services
March 2013

Appendix 1: Technical Appendix on Survey Methodology

To create a nationally representative probability-based sample, GfK's KnowledgePanel® has selected respondents using both random digit dialing (RDD) and address-based sampling (ABS); since 2009, new respondents have been recruited using ABS. To recruit respondents, GfK (formerly Knowledge Networks) sends mailings to a random selection of residential postal addresses. Of every 100 mailings, approximately 14 households contact GfK and express an interest in joining the panel, and three-quarters of those who make contact complete the process and become panel members.6 If a contacted person is interested in participating but does not have a computer or Internet access, GfK provides him or her with a laptop and Internet access. Because panel members are continuously lost to attrition and new members are added to replenish the panel, the recruitment and enrollment rates may vary over time.
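Taken together, these figures imply that roughly 10 or 11 of every 100 mailed addresses ultimately yield a panel member. The short calculation below is illustrative only; both input rates are approximate and, as noted above, vary over time.

```python
# Illustrative arithmetic only; both rates are approximate figures from the
# text, and actual recruitment and enrollment rates vary over time.
contact_rate = 14 / 100      # households contacting GfK per 100 mailings
enrollment_rate = 3 / 4      # share of contacts that complete enrollment

overall_yield = contact_rate * enrollment_rate
print(f"Approximate panel yield per mailed address: {overall_yield:.1%}")  # ~10.5%
```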

For this survey, the number of KnowledgePanel® members invited to complete the survey and the invitation response rates are presented in table 1 (see main text). A total of 4,030 e-mail solicitations were sent to a random selection of KnowledgePanel members, and data collection was terminated once the quota of 2,600 fully completed surveys was reached, yielding a "cooperation rate" of 65 percent. To enhance the cooperation rate, GfK sent e-mail reminders to non-responders on days three and six of the field period.
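The reported cooperation rate follows directly from the counts above; the short calculation below simply reproduces it.

```python
# Cooperation rate implied by the counts reported above.
invitations = 4030   # e-mail solicitations sent
completes = 2600     # fully completed surveys (the quota)

cooperation_rate = completes / invitations
print(f"Cooperation rate: {cooperation_rate:.1%}")  # ~64.5%, reported as 65 percent
```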

As with any survey method, probability-based Internet panel surveys are subject to potential survey error, such as non-coverage and non-response arising from the panel recruitment methods and from panel attrition. To address these potential sources of error, a post-stratification adjustment is applied based on demographic distributions from the most recent Current Population Survey (CPS) data (November 2011 for re-interview cases and October 2012 for the fresh sample). The variables used include gender, age, race/ethnicity, education, census region, residence in a metropolitan area, and access to the Internet. The resulting Panel Demographic Post-Stratification weight, which is designed for sample-selection purposes, is applied prior to a probability proportional to size (PPS) selection of the study sample from KnowledgePanel.
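As a rough illustration of how a weight-driven PPS selection can work, the sketch below draws a systematic PPS sample using the panel weight as the size measure. The function name, variable names, and the systematic-sampling approach are assumptions for illustration; the report does not describe GfK's actual selection algorithm.

```python
import numpy as np

def pps_systematic_sample(size_measures, n, rng=None):
    """Systematic PPS selection: each panel member's chance of selection is
    proportional to its size measure (here, the post-stratification weight).
    A common textbook method, shown for illustration only."""
    rng = np.random.default_rng() if rng is None else rng
    sizes = np.asarray(size_measures, dtype=float)
    cum = np.cumsum(sizes)                 # cumulative size measures
    step = cum[-1] / n                     # sampling interval
    start = rng.uniform(0, step)           # random start within the first interval
    points = start + step * np.arange(n)   # n equally spaced selection points
    return np.searchsorted(cum, points)    # indices of the selected members

# Hypothetical usage: invite 4,030 panel members, selected by PPS on their
# Panel Demographic Post-Stratification weights.
# invited = pps_systematic_sample(panel_weights, 4030)
```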

Once the sample has been selected and fielded, and all the study data are collected and finalized, a post-stratification process is used to adjust for any survey non-response as well as any non-coverage or under- and over-sampling resulting from the study-specific sample design. Demographic and geographic distributions for the non-institutionalized, civilian population ages 18 and over from the most recent CPS are used as benchmarks in this adjustment.

Comparable distributions are calculated using all completed cases from the field data. Using the base weight as the starting weight, this procedure adjusts the sample data back to the selected benchmark proportions. Through an iterative convergence process, the weighted sample data are optimally fitted to the marginal distributions.
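This kind of iterative fitting is commonly implemented as raking (iterative proportional fitting). The sketch below, with hypothetical variable names and simplified CPS-style margins, shows the general idea; it is not GfK's production weighting code.

```python
import pandas as pd

def rake(df, base_weight, margins, tol=1e-6, max_iter=100):
    """Raking (iterative proportional fitting): starting from the base weight,
    repeatedly adjust weights so the weighted distribution of each variable
    matches its benchmark proportions, until the adjustments converge."""
    w = df[base_weight].astype(float).copy()
    for _ in range(max_iter):
        max_change = 0.0
        for var, targets in margins.items():
            total = w.sum()
            for cat, target_prop in targets.items():
                mask = df[var] == cat
                current = w[mask].sum()
                if current > 0:
                    factor = (target_prop * total) / current
                    w[mask] = w[mask] * factor
                    max_change = max(max_change, abs(factor - 1.0))
        if max_change < tol:
            break
    return w

# Hypothetical benchmark proportions (illustrative values, not actual CPS figures):
# margins = {"gender": {"male": 0.48, "female": 0.52},
#            "region": {"northeast": 0.18, "midwest": 0.21,
#                       "south": 0.37, "west": 0.24}}
# df["final_weight"] = rake(df, "base_weight", margins)
```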

After this final post-stratification adjustment, the distribution of the calculated weights is examined to identify and, if necessary, trim outliers at the extreme upper and lower tails of the weight distribution. The post-stratified and trimmed weights are then scaled so that they sum to the total number of eligible respondents.
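A minimal sketch of the trimming and rescaling step follows. The percentile cut points are placeholders; the report does not state the actual trimming rule.

```python
import numpy as np

def trim_and_scale(weights, lower_pct=1, upper_pct=99):
    """Cap weights at illustrative lower/upper percentile cut points (the
    report does not specify the actual rule), then rescale so the weights
    sum to the number of eligible respondents (mean weight of 1)."""
    w = np.asarray(weights, dtype=float)
    lo, hi = np.percentile(w, [lower_pct, upper_pct])
    trimmed = np.clip(w, lo, hi)
    return trimmed * (len(trimmed) / trimmed.sum())
```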

A probability-based Internet panel was selected for this survey over alternative survey methods for several reasons. First, these types of Internet surveys have been found to be representative of the population.7 Second, the ABS Internet panel allows the same respondents to be re-interviewed in subsequent surveys with relative ease, as they remain in the panel for several years. Third, Internet panel surveys have numerous existing data points on respondents from previously administered surveys, including detailed demographic and economic information, which allows additional information on respondents to be included without increasing respondent burden. Finally, collecting data through an ABS Internet panel survey is cost-effective and can be done relatively quickly.

References

6. For further details on the KnowledgePanel sampling methodology and comparisons between KnowledgePanel and telephone surveys, see www.knowledgenetworks.com/accuracy/spring2010/disogra-spring10.html.

7. David S. Yeager, Jon A. Krosnick, LinChiat Chang, Harold S. Javitz, Matthew S. Levendusky, Alberto Simpser, and Rui Wang (2011), "Comparing the Accuracy of RDD Telephone Surveys and Internet Surveys Conducted with Probability and Non-Probability Samples," Public Opinion Quarterly, vol. 75 (4), pp. 709-47.
