Board of Governors of the Federal Reserve System

Experiences and Perspectives of Young Workers

Appendix A. Technical Appendix on Survey Methodology

The Survey of Young Workers was designed by Federal Reserve Board staff and administered by GfK, an online consumer research company, on behalf of the Board. To create a nationally representative probability-based sample, GfK's KnowledgePanel® selected respondents using both random digit dialing and address-based sampling (ABS). Since 2009, new respondents have been recruited using ABS. To recruit respondents, GfK sends mailings to a random selection of residential postal addresses. Out of 100 mailings, approximately 14 households contact GfK and express interest in joining the panel. Of those who contact GfK, three-fourths complete the process and become members of the panel.58 If the person contacted is interested in participating but does not have a computer or Internet access, GfK provides him or her with a laptop and Internet access. Panel members are continuously lost to attrition and added to replenish the panel, so the recruitment and enrollment rates may vary over time.

A probability-based Internet panel was selected over alternative survey methods for several reasons. First, these types of Internet surveys have been found to be representative of the population.59 Second, the ABS Internet panel allows the same respondents to be re-interviewed in subsequent surveys with relative ease, as they remain in the panel for several years. Third, Internet panel surveys have numerous existing data points on respondents from previously administered surveys, including detailed demographic and economic information, which allows additional information on respondents to be included without increasing respondent burden. Last, collecting data through an ABS Internet panel survey is cost-effective and relatively quick.

A total of 4,135 KnowledgePanel® members received e-mail invitations to complete this survey. The e-mail invitations included 1,139 respondents who participated in the Board's 2013 Survey of Young Workers (excluding those who exceeded the age limit) and an additional 2,996 randomly selected KnowledgePanel® respondents who did not participate in the previous survey. A total of 2,035 people responded to the e-mail request to participate and completed the survey, yielding a final-stage completion rate of 49 percent.
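The completion-rate figure above can be reproduced directly from the counts reported in this paragraph:

```python
# Final-stage completion rate for the Survey of Young Workers.
returning = 1139   # respondents from the 2013 Survey of Young Workers
new_draw = 2996    # additional randomly selected KnowledgePanel members

invited = returning + new_draw        # 4,135 e-mail invitations
completed = 2035                      # completed surveys

completion_rate = completed / invited
print(f"{invited} invited, {completed} completed -> {completion_rate:.1%}")
# -> 4135 invited, 2035 completed -> 49.2%
```

Rounded to the nearest whole percent, this matches the 49 percent rate reported.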

To enhance the completion rate, GfK sent e-mail reminders to non-responders over the course of the field period.60 GfK maintains an ongoing modest incentive program to encourage KnowledgePanel® members to participate. Incentives take the form of raffles and lotteries with cash and other prizes.

Significant resources and infrastructure are devoted to the recruitment process for the KnowledgePanel® so that the resulting panel can properly represent the young adult population of the United States. Consequently, the raw distribution of the KnowledgePanel® mirrors that of U.S. adults fairly closely, barring occasional disparities that may emerge for certain subgroups because of differential attrition rates among recruited panel members.

The selection methodology for general population samples from the KnowledgePanel® ensures that the resulting samples behave as equal probability of selection method (EPSEM) samples. This methodology starts by weighting the entire KnowledgePanel® to benchmarks secured from the latest March supplement of the Current Population Survey (CPS) along several dimensions, so that the weighted distribution of the KnowledgePanel® matches that of U.S. adults. Typically, the geo-demographic dimensions used for weighting the entire KnowledgePanel® include gender, age, race/ethnicity, education, census region, household income, homeownership status, metropolitan area status, and Internet access.

Using the above weights as the measure of size (MOS) for each panel member, a probability proportional to size (PPS) procedure is then used to select study-specific samples. It is the application of this PPS methodology with the above MOS values that produces fully self-weighting samples from the KnowledgePanel®, for which each sample member can carry a design weight of unity. Moreover, in instances where the study design has required any form of oversampling of specific subgroups, such departures from an EPSEM design are corrected by adjusting the corresponding design weights accordingly, with the CPS benchmarks serving as reference points.
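To make the PPS step concrete, the following is a minimal sketch of one common PPS technique, systematic PPS selection, using panel weights as the measure of size. The function name, the tiny set of weights, and the use of the systematic variant are all illustrative assumptions; GfK's production procedure is not described in that level of detail here and may differ.

```python
import random

def pps_systematic_sample(measures_of_size, n):
    """Systematic PPS selection: choose n units with probability
    proportional to each unit's measure of size (here, its panel
    weight). Illustrative sketch only, not GfK's actual procedure."""
    total = sum(measures_of_size)
    step = total / n                    # sampling interval on the MOS scale
    start = random.uniform(0, step)     # random start within the first interval
    points = iter(start + i * step for i in range(n))

    selected, cumulative = [], 0.0
    point = next(points)
    for idx, mos in enumerate(measures_of_size):
        cumulative += mos
        # every selection point falling in this unit's MOS interval selects it
        while point is not None and point <= cumulative:
            selected.append(idx)
            point = next(points, None)
    return selected

# Hypothetical MOS values (panel weights) for a toy panel of 8 members:
weights = [0.5, 1.2, 0.8, 2.0, 1.0, 0.9, 1.1, 1.5]
print(pps_systematic_sample(weights, 3))  # indices of the 3 selected members
```

A member whose weight exceeds the sampling interval can be hit by more than one selection point; production systems handle such "certainty" units separately, which this sketch omits.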

Once the sample has been selected and fielded, and all the study data are collected and made final, a post-stratification process is used to adjust for any survey non-response as well as any non-coverage or under- and oversampling resulting from the study specific sample design. The following variables were used for the adjustment of weights for this study: gender, age, race/ethnicity, education, census region, residence in a metropolitan area, household income, and access to the Internet. Demographic and geographic distributions for the noninstitutionalized civilian population ages 18 and over from the CPS are used as benchmarks in this adjustment.
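Post-stratification adjustments of this kind are often carried out by raking (iterative proportional fitting): the weights are repeatedly rescaled so that the weighted share of each category, along each adjustment dimension, matches the corresponding population benchmark. The sketch below illustrates the idea on two hypothetical dimensions (gender and region) with made-up benchmark shares; the report does not specify GfK's exact algorithm, so this is an assumption about the general technique, not its implementation.

```python
def rake_weights(base_weights, sample_categories, benchmark_shares, iterations=20):
    """Raking / iterative proportional fitting: adjust respondent weights
    so weighted category shares match population benchmarks (e.g., CPS
    margins). Illustrative sketch only, not GfK's production algorithm."""
    weights = list(base_weights)
    for _ in range(iterations):
        for dim, targets in benchmark_shares.items():
            total = sum(weights)
            # current weighted share of each category on this dimension
            current = {cat: 0.0 for cat in targets}
            for w, cats in zip(weights, sample_categories):
                current[cats[dim]] += w / total
            # rescale each respondent's weight by target share / current share
            weights = [w * targets[cats[dim]] / current[cats[dim]]
                       for w, cats in zip(weights, sample_categories)]
    return weights

# Hypothetical respondents and benchmark shares (not actual CPS figures):
respondents = [{"gender": "F", "region": "NE"},
               {"gender": "M", "region": "NE"},
               {"gender": "F", "region": "S"},
               {"gender": "M", "region": "S"}]
benchmarks = {"gender": {"F": 0.51, "M": 0.49},
              "region": {"NE": 0.40, "S": 0.60}}
adjusted = rake_weights([1, 1, 1, 1], respondents, benchmarks)
```

After raking, the weighted gender and region shares of the sample match the stated benchmarks, which is exactly the property the post-stratification step is designed to deliver.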

Although weighting allows the sample to match the U.S. population on observable characteristics, it remains possible, as with all survey methods, that non-coverage or non-response produces differences between the sample population and the U.S. population that the weights do not correct.


References

58. For further details on the KnowledgePanel® sampling methodology and comparisons between KnowledgePanel® and telephone surveys, see www.knowledgenetworks.com/accuracy/spring2010/disogra-spring10.html. Return to text

59. David S. Yeager, Jon A. Krosnick, LinChiat Chang, Harold S. Javitz, Matthew S. Levendusky, Alberto Simpser, and Rui Wang, "Comparing the Accuracy of RDD Telephone Surveys and Internet Surveys Conducted with Probability and Non-Probability Samples," Public Opinion Quarterly 75, no. 4 (Winter 2011): 709-47. Return to text

60. Additional e-mail reminders to non-responders were sent on days 6, 9, and 12 of the field period. Return to text

Last update: February 2, 2017
