
Pretest 1 Report

2003 SURVEY OF SMALL BUSINESS FINANCES

PREPARED FOR:
BOARD OF GOVERNORS OF THE FEDERAL RESERVE
PREPARED BY:
NORC at the University of Chicago
MAY 13, 2004

Table of Contents

1. Introduction
2. CATI Questionnaires
2.1 Screening Questionnaire
2.2 Main Interview
3. Training
3.1 Screening
3.2 Main Interview
4. Contact Materials
4.1 Pre-Screening Mailing to Business Owners
4.2 Worksheet Mailing
4.3 SSBF Internet Sites
5. Sample
5.1 Frame Acquisition and Preparation
5.2 Sample Selection
5.3 Sample Data Quality
5.4 Sample Size
6. Data Collection
6.1 Data Collection Procedures
6.2 Results of Data Collection
7. Interviewers
7.1 Recruiting and Training
7.2 Telephone Interviewer Performance
7.3 Interviewer Debriefing Session
8. System Integration
8.1 Delivery of Sample to IT
8.2 Delivery of Respondent Data to the Mailing Center
9. Data Delivery
10. Appendices
10.1 Appendix A: New or Revised Materials
10.2 Appendix B: Interviewer Debriefing Meeting Materials


1. Introduction

This document describes the procedures and results of the first pretest of the 2003 Survey of Small Business Finances (SSBF). The SSBF is a study of the factors affecting the availability of credit to small businesses. The main purpose of the first pretest was to test the CATI instruments, training materials, and data collection procedures for the study. In conjunction with the first pretest, some of the sampling procedures, including matching the sample data obtained from Dun & Bradstreet with data from InfoUSA, a compiler of business yellow pages, were also tested.

Specific pretest objectives were to:

The body of this document is organized into nine sections, corresponding to the main steps of the survey. In each section, we describe what we did for the first pretest, how it worked, and finally, the changes we propose, if any, for the next pretest or the main survey. Appendix A contains documents that are either newly developed as a result of our experience conducting the first pretest, or significantly modified from those used in the first pretest.


2. CATI Questionnaires

The 2003 Survey of Small Business Finances uses two separate computer-assisted telephone interviewing (CATI) questionnaires: a screening questionnaire to identify firms that meet the eligibility criteria for the survey (see Section 5 for a description of the target population), and a main interview questionnaire. The performance of each questionnaire in Pretest 1 is discussed in this section.

2.1 Screening Questionnaire

The screening questionnaire is a brief instrument designed to identify firms that meet the eligibility criteria for the survey. During the pretest, we were particularly concerned with how this instrument performed in terms of helping the interviewer navigate past gatekeepers to reach an appropriate survey respondent, the wording and order of the questions, and the performance of the zip code look-up, particularly its accuracy and response time. Each of these issues is discussed in the following sections.

2.1.1 Introduction Script

Pretest interviewers reported having some difficulty getting past gatekeepers, locating respondents and securing cooperation. Unfortunately, the introduction script in the screening questionnaire did not make these tasks easier for interviewers. Not surprisingly, therefore, the interviewers frequently deviated from the introduction script, using their own words to try to find a way past gatekeepers or to convince respondents of the importance of the interview. The pretest therefore revealed that NORC needs to develop more effective scripts and techniques - that interviewers will want to follow - to get a high percentage of respondents to complete the eligibility screener. In revising the introduction script, NORC will consider the following recommendations that emerged from the interviewer debriefing:

NORC will consider these recommendations when revising the introduction script and the beginning of the screening questionnaire for Pretest 2.

2.1.2 Zip Code Look-Up

The zip code look-up is a Paradox application that is "called" in both the screener and main interview questionnaires. Its purpose is to verify the accuracy of the zip codes reported by the respondent for the sampled firm and for the financial institutions used by the firm. The application is first called in the CATI version of the screening questionnaire at question A11.1.1, when confirming and capturing the mailing address of the firm. This sequence is repeated at question A3 of the main instrument, but only if the screener was completed by an owner's proxy. The zip code look-up is next called at question A3.3.1 of the main interview questionnaire, when the physical address differs from the mailing address. Finally, it is also called during the financial institution look-up in the main interview questionnaire.

The zip code look-up linked a reported zip code to the following information:

During main study data collection, if a zip code has changed between June 2003 and the date of the interview, two separate entries will be created.

The zip code look-up worked well in production, for both the screening and main interviews. Interviewers reported negligible response time, and respondents confirmed that the correct city and state were being returned by the program. No revisions to the zip code look-up are planned for Pretest 2.
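
To make the look-up behavior concrete, the following is a minimal Python sketch of the logic described above. It is our illustration only: the production look-up is a Paradox application, and the table structure, field names, and example entry here are hypothetical.

    from datetime import date

    # Each zip code maps to one or more dated entries. A zip code whose
    # city/state assignment changed between June 2003 and the interview
    # date would carry two entries, as described above.
    ZIP_TABLE = {
        "60515": [
            {"city": "DOWNERS GROVE", "state": "IL",
             "effective": date(2003, 6, 1)},
        ],
    }

    def lookup_zip(zip_code, interview_date):
        """Return the city/state entries on file for a reported zip code."""
        entries = ZIP_TABLE.get(zip_code, [])
        # Keep only entries already in effect on the interview date, so the
        # interviewer sees the assignment current at the time of the call.
        return [e for e in entries if e["effective"] <= interview_date]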

2.1.3 Question Order

The screening questionnaire was designed to first identify the owner or owner proxy; next, confirm that we are calling the correct firm; third, determine the eligibility of the firm; and finally, collect the information from eligible firms needed for the worksheet mailing. The question order in the screening interview seemed appropriate in production, as designed. During the pretest debriefing, the interviewers did not report any problems with the order of the screener questions. Also, a review of the item frequencies revealed that the bulk of the ineligible firms were identified early in the screening interview, which is how we would like the screening interview to perform.

2.1.4 Question Wording

Likewise, the wording of the screener questions generally worked well. Two questions gave interviewers some difficulty, however. The first of these was the question asking to speak with the owner of the firm. The problem with this question was that the response options did not handle all the situations encountered by interviewers during the pretest. Specifically, there were no response options to handle the following situations:

Response options to handle these situations will be added to this question for the next pretest.

The second question that gave interviewers difficulty was the question asking whether we had reached the main headquarters of the company. The problem with this question was that it may have screened out eligible firms. This error would have occurred when we reached the owner at a location other than the main headquarters (such as at home, or at another, separate business owned by that individual), or were interviewing the firm's accountant who did not work at the firm. This question needs to be re-worked for the next pretest.

2.1.5 Sensitive Questions

We did not expect any of the screener questions to be considered sensitive by respondents, as the screener generally confirms information that is publicly available about the firm. Our experience with the screener in Pretest 1 supported this expectation. There were very few questions that respondents refused to answer. Also, during the debriefing, the interviewers did not report that any of the screener questions were considered sensitive by respondents.

Many respondents, however, did not provide an email address. We believe at least some of these respondents had email accounts but were uncomfortable providing their email address, out of concern that NORC might inundate them with email messages or send their address to a junk email company. We recommend re-working the email questions for the next pretest to make them less threatening, and to provide an interviewer prompt that explains how NORC will use email addresses.

2.1.6 Screener Logic

The logic in the CATI version of the screening questionnaire generally worked as specified. However, interviewers had difficulty with the logic for the first question, requesting to speak with the owner. The logic for this question required the interviewer to make three attempts to reach the owner before the CATI would allow the interviewer to proceed with the rest of the screening interview. This proved to be cumbersome in situations where the owner would never be available during the data collection period, but the interviewer was speaking with a viable proxy respondent. For the next pretest, we recommend that this logic be changed to allow limited, situation-specific exceptions to the three-attempts rule. These exceptions will be discussed during interviewer training.

In some cases, interviewers had difficulty identifying an address to which the worksheet package could be sent to the owner by Federal Express. The screener collects a mailing address, which may or may not be a valid address for a Federal Express delivery. Post Office (P.O.) boxes and rural routes are two examples of address types to which Federal Express typically will not deliver. We recommend verifying whether the FRB needs to capture a firm's mailing address. If not, we recommend revising the screener:

2.1.7 Question-by-Question Specifications (QxQs)


Pretest interviewers made only one comment about the question-by-question specifications (QxQs) in the screening questionnaire: the suggestion to reverse the interviewer prompts and QxQs for question A10.1, about organization type. This suggestion arose because the tax-form-based QxQs for this question were shorter than the interviewer prompt, and the interviewers found the shorter definitions in the QxQs more useful for respondents. This change will be made for the second pretest.


2.2 Main Interview

This section discusses the performance of the main interview questionnaire during the first pretest. Of particular concern were the introduction script, the institution look-up, question order, question wording, questionnaire logic, and the QxQs.

2.2.1 Introduction Script

As with the introduction script in the screening questionnaire, the introduction for the main questionnaire was found to be too long. Pretest interviewers commented that respondents do not need all the information in the script, and in fact, the current script length sometimes annoyed respondents. NORC proposes that this script be shortened for the next pretest.

2.2.2 Institution Look-up

The training version of the institution look-up worked as specified by the client, but interviewers had difficulty using and understanding it. Specifically, interviewers found it hard to identify the branch when the initial search brought up multiple main offices in different states across the country.

Immediately following training, FRB and NORC staff met to determine how to make the look-up procedure more effective, more efficient, and easier for interviewers to use. The session lasted one hour, and within two days NORC had implemented a new procedure for institution look-ups in the production version of the main questionnaire.

The following is a summary of the new features of the institution look-up.

The Bank ID query is possible only if the Get Bank (F4) option is selected, which automatically blanks out all search criteria except the state name. Executing this query pulls only institutions that match the specified Bank ID within the specified state. This includes any branches and possibly the main bank, and always includes the branch-unknown record. This is the only query that displays the branch-unknown record. The user can quickly go to the branch-unknown record by pressing the End key to get to the bottom of the list. (The Home key goes to the top of the list, i.e., the main bank.) The user also has the option to narrow a Bank ID search by adding additional search criteria, and these will be recognized.
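
The following Python sketch illustrates the Bank ID query logic just described. It is an illustration only: the production look-up is implemented in Paradox, and the record fields and "kind" values used here are hypothetical.

    def get_bank_query(records, bank_id, state, extra_criteria=None):
        """Return all institution records for one Bank ID within a state,
        including the 'branch unknown' record (shown only by this query)."""
        hits = [r for r in records
                if r["bank_id"] == bank_id and r["state"] == state]
        # Any additional search criteria the user supplies are recognized
        # and narrow the result set further.
        for field, value in (extra_criteria or {}).items():
            hits = [r for r in hits if r.get(field) == value]
        # Sort so the main bank lands at the top of the list (Home key)
        # and the branch-unknown record at the bottom (End key).
        order = {"main": 0, "branch": 1, "branch_unknown": 2}
        hits.sort(key=lambda r: order.get(r["kind"], 1))
        return hits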

Bank Name will be cleaned before the database is loaded into Paradox, to improve the quality of the matches. The following is a list of the cleaning operations that will be performed on Bank Name:

NORC also added a function key (F6), labeled Zip, on the result screen. It simply shows, for one second, the last zip code specified in the search criteria.

The following notes indicate how the partial matches worked in the Pretest 1 production version of the institution look-up:

During the pretest debriefing, interviewers reported that they could usually find, in the institution look-up table, the branch the respondent reported using, and that the response time for the look-up was good. Interviewers expressed surprise at the large number of respondents who knew the zip codes of their banks. NORC plans to use the same version of the institution look-up for Pretest 2. NORC expects to receive an updated database of depository institutions from the FRB for use in the main survey.

2.2.3 Question Order

The order of the questions in the main interview questionnaire appeared logical to both interviewers and respondents in the pretest, with one exception. During the pretest debriefing, one interviewer (who had previous experience as an owner of a small business) remarked that question C30, which asks if the firm is publicly traded, should be asked earlier because the respondent's answers to the questions immediately preceding question C30 sometimes make the answer to question C30 obvious. This issue will be reviewed and discussed with the client prior to Pretest 2.

2.2.4 Question Wording

Although the pretest revealed no serious problems with question wording in the main interview questionnaire, pretest interviewers reported during the debriefing that they found the questionnaire to be "over scripted" for knowledgeable respondents, such as accountants. It has been found in past rounds of this study, however, that many small business owners need financial terms defined and clarified in order to correctly answer the questions in the questionnaire. Therefore, this design feature of the questionnaire is unlikely to be modified.

Pretest interviewers also commented that the questionnaire contains too much repetition of the firm name and the phrase "the fiscal year ending." Pretest interviewers suggested that repeating question stems be placed in parentheses and read at the interviewer's discretion. NORC and FRB will consider these comments when revising the main interview questionnaire for Pretest 2.

Finally, pretest interviewers reported that the questions that require the respondent to compare company performance in different years are confusing as currently worded. The interviewers suggested that these questions be reworded to mention the most recent year first, then the prior year.

2.2.5 Questionnaire Logic

The logic of the main interview questionnaire worked well, for the most part, during the first pretest. Exceptions to this were parts of sections C, P, R, and S. Regarding section C, the logic in the set of questions identifying the owner with the largest, second largest, and so forth ownership share needs to be fixed to prevent the initial question from being re-asked inappropriately. In sections P, R, and S, there were problems with the questions that depend on data reported earlier in the questionnaire (e.g., firm age and tax forms used). Specifically, some of the text fills and skips in these sections did not work entirely correctly in the Pretest 1 version of the CATI questionnaire. NORC is working on fixing these sections for Pretest 2.

Other questionnaire changes involving logic issues that need to be made before the next pretest are the following:

2.2.6 Question-By-Question Specifications (QxQs)

In general, pretest interviewers reported that they found the QxQs in the main interview very helpful for defining financial terms to respondents. The only negative comment made by interviewers about the QxQs was that they need to be written in shorter sentences and reformatted to break up large blocks of text, to make it easier for interviewers to retrieve the needed clarification. NORC will work on making these changes to the QxQs in both the main interview and the screening interview for the main survey.


3. Training

This section describes the training of telephone interviewers for Pretest 1. NORC conducted separate training sessions for the screening and main interviews. The materials for each training session consisted of 1) a Training Agenda, 2) a Training Guide, and 3) an Interviewer Reference Manual. Each of these is described below, for each training session. We also discuss how well the materials worked and the changes we propose for the main study.

3.1 Screening

The training for the screening interview was a two-day session. NORC trained 12 interviewer candidates. (See Section 7 for a description of the interviewers.) The training session took place at NORC's Downers Grove Telephone Center on March 8 and 9, 2004. Participants included the 12 trainees, six members of NORC's project management staff, and two staff from the Federal Reserve Board.

3.1.1 Training Agenda

The training for the screening interview was divided into 18 modules. This training was designed to lay the foundation for the entire survey, both screening and interviewing. Therefore, considerable time and attention was devoted to basic information about the study, such as its purpose, the sponsoring organization, the types of firms that are eligible for the study, the type of data collected, and how the data will be used. NORC's goal in this training was to give interviewers all the information they would need to explain the study to sampled firms, gain the cooperation of the owner or the owner's proxy, and correctly administer the screening questionnaire.

For the most part, the number of modules, the content of each module, and the time allocated to each module were all appropriate. The only exception to this was the time allocated for Module 3: Overview of the Screening Process. NORC had allocated 30 minutes for this module, and it actually took almost 90 minutes to complete. Despite the amount of time required, we believe that the content of the module is appropriate in terms of the type and amount of material, and the level of detail. The actual length of the module, however, necessitated moving a later module from the first day of training to the second day. It also increased the overall length of training, since the estimated durations of the other training modules all turned out to be fairly accurate. The training ended up taking about 14 hours, across two days: eight hours on the first day, and six hours on the second day.

The changes that need to be made to the Training Agenda for the main study training are to move the Gaining Cooperation, Part 2 module from Day 1 to Day 2, and to reverse the order of the third mock interview and the module on the respondent worksheets so that the third mock interview is preceded by the module on worksheets.

3.1.2 Training Guide

The Training Guide was designed to be used by the trainer to present all the necessary information to trainees. Like the Training Agenda, it was divided into 18 modules. In addition to the trainer's script, the Training Guide also contained pictures of the PowerPoint slides used to present the training points, copies of the interviewer job aids, exhibits, and the trainer answer keys for the training exercises for each of the 18 training modules.

It is NORC's assessment that the Training Guide for the Screening Interview worked well. It was well organized; the trainers found it useful to have the PowerPoint slides, exhibits, job aids and exercise answer keys all embedded in the guide. Also, the scripting of the training points was at an appropriate level of detail; i.e., enough detail to easily remind the trainer of the main points to cover, but not so much detail that the trainer had to read every word.

The main change that needs to be made to the Training Guide for the Screening Interview is to re-order several of the modules so that the guide agrees with the revisions described above for the Training Agenda. The interviewers have also suggested that NORC add to the training a protocol for handling out-of-scope cases that do not complete the screening interview. In addition to these changes, there are a number of minor wording changes that need to be made to the Training Guide for the Screening Interview.

3.1.3 Interviewer Reference Manual

The Interviewer Reference Manual contained useful documents for interviewers to reference during the training session, as well as during actual interviewing. A list of the contents of the Interviewer Reference Manual appears in Table 1 below.

NORC's evaluation of the Interviewer Reference Manual for the Screening Interview is that it contained the appropriate documents. We recommend keeping all the same documents for the main study training. Two of the documents in the manual need some slight revision. Specifically, NORC recommends adding the following two questions to the Frequently-Asked-Questions Job Aid:

NORC will add these two items to the FAQ job aid, and will consider adding them to the Frequently-Asked Questions brochure for the main interview.

The other modification we recommend to a document in the Reference Manual for the Screening Interview is to add the following terms to the Glossary:

Fiscal year

For-profit company

Leased employees

Majority-owned subsidiary

Not-for-profit company

Privately-owned company

Temporary employees

Pretest interviewers commented that the Job Aids in the Reference Manual were not very accessible during production (i.e., during an interview, the interviewer does not have time to locate the appropriate Job Aid in the Reference Manual). Telephone Center supervisors made copies of the job aids for interviewers to post in their stations during Pretest 1. NORC will provide main study interviewers with a set of Job Aids to post at their interviewing stations.

Table 1. Contents of Interviewer Reference Manual for Screening Interview
Section Documents
Job Aids Tax Forms Used by Different Organization Types
Job Aids Eligibility Criteria for the Survey
Job Aids Frequently-Asked-Questions and Answers About the Survey
Job Aids TNMS Disposition Codes and Their Meanings
Job Aids Instructions for Logging In and Out of the TNMS
Job Aids Telephone Answering Machine Script
Exhibits The 2003 Survey of Small Business Finances and How It will be Used
Exhibits NORC Confidentiality Statement
Exhibits Pre-Screening Advance Materials
Exhibits Dun & Bradstreet Brochure
Worksheets Reduced to 8.5 x 11 inches in size
Screening Interview Questionnaire MS Word Version
Training Handouts In-class Quiz
Training Handouts Screener Check-out Evaluation Form
Training Exercises Eligibility Exercise
Training Exercises Identifying an Appropriate Proxy Respondent
Training Exercises Preserving Respondent Confidentiality
Training Exercises Assigning the Appropriate TNMS Disposition
Training Exercises Practice Writing Call Notes
PowerPoint Slides Formatted as handouts, 3 to a page.
Examples Practice Mock Interview Scripts
Reference Glossary of Terms

3.2 Main Interview

The telephone interviewer training for the main interview questionnaire placed heavy emphasis on administering the main interview questionnaire. There were reprises of the modules from the screener training on gaining respondent cooperation, maintaining respondent confidentiality, and using the Telephone Number Management System (TNMS). Most of the three-day training session, however, was devoted to understanding the concepts contained in the main interview questionnaire and practicing administering the questionnaire following the paths specific to each of the different organization types.

For Pretest 1, NORC trained 11 interviewers, all of whom had successfully completed the training for the Screening Interview and had spent the time between the screening training and the main interview training conducting screening interviews with firms sampled for the pretest. Main interview training was held at NORC's Downers Grove Telephone Center on March 15 through March 17, 2004. The same NORC and FRB staff who participated in the screener training also participated in the main interview training.

3.2.1 Training Agenda

The Main Interview questionnaire training contained 30 modules. After welcoming the interviewers to training and presenting a brief overview of the training session, the first substantive training module concerned questionnaire conventions, such as verifying all dollar amounts, using the exception procedure to handle failed hard range checks, and using the Institution Look-up. This was followed by a series of mock interviews. Each main section, or related group of subsections, was presented to trainees through a round robin mock interview. Interviewers then split into pairs and completed two duo mock interviews of the same section or subsection. The round robin and duo interview scripts for this portion of training were all based on C-corporation examples. The agenda was designed to have this pattern repeat until the entire questionnaire had been presented. Interviewers were then to complete duo mocks with sole proprietorships and partnerships. The final check-out mock interview was with an S-corporation. The agenda also included modules on gaining cooperation and respondent confidentiality.

Day 1 started late because it took 30 minutes to distribute the documents to be added to the Interviewer Reference Manual for the Main Interview. Additionally, each of the Day 1 modules took longer than estimated. The most significant deviation occurred with Module 6 (Round Robin Mock of Subsections E, F, MRL, and G), which took about three hours versus the 60 minutes estimated. Nothing untoward occurred in Module 6; NORC just grossly underestimated the time needed to get through the mock interview of these sections, partly because of the amount of lecture material defining financial terms that preceded the round robin mock interview.

Because of the additional time needed for each module, we were able to get through only seven modules on Day 1, rather than the 10 modules planned. Table 2 below shows the actual duration of each Day 1 module compared with the estimated time.

Nevertheless, other than the scripted mock interviews, the Training Agenda worked well for the first day and a half of training, until we got to Section H in the questionnaire, which contains the institution look-up function. The institution look-up, as it was designed and presented at training, was found to be very confusing to interviewers. (See Section 2.2.2 for a full discussion of this issue.) Additionally, the mocks and the training scripts did not always agree with the production CATI and in some cases were not adequately tested prior to training. This was most apparent in the institution look-up section (Section H).

Therefore, we ended up spending almost two hours on Module 8 (round robin of questionnaire Section H). NORC subsequently redesigned the institution look-up, and the new version worked well in production. NORC also designed a new job aid for the revised institution look-up (see Appendix A), and will revise the training materials to correspond to the revised look-up. With the revised look-up, we feel confident that we will be able to present a revised Module 8 in the time originally allocated for it.

Table 2. Actual vs. Estimated Time of Day 1 Training Modules
Module Estimated Time (Minutes) Actual Time (Minutes)
Module 1: Welcome/Introductions 30 35
Module 2: Questionnaire Overview 20 30
Module 3: Questionnaire Conventions 30 60
Module 4: Mock Interview of Section I 60 75
Module 5: Duo Mocks, Section I 40 45
Module 6: Mock Interview of Subsections E, F, MRL, and G 60 180
Module 7: Duo Mocks, Subsections E, F, MRL, and G 40 60
Module 8: Mock Interview of Subsection H 35 120
Module 9: Duo Mocks, Subsection H 30 45
Wrap-Up of Day 1/Agenda for Day 2 15 5

To make up for the additional time spent on Day 1 topics, NORC dispensed with the duo mocks on Day 2 for the remainder of the questionnaire sections. These sections (i.e., Trade Credit, New Equity Investment, the Balance Sheet subsections, and Credit History) were presented through a round robin mock only. However, trainees were still able to practice these sections during the duo mocks for sole proprietorships and partnerships. Since these sections are more straightforward than earlier sections of the questionnaire, the trainees progressed through the sole proprietorship and partnership mock interviews without difficulty.

For the main study training, NORC will revise the agenda to take into account the longer training time needed for each of the Day 1 modules (except for Section H). We plan to leave all the duo mocks in the agenda, but propose to use these flexibly for sections following Section H, depending on the time available and the trainees' need for additional practice. Finally, based on the pretest experience, we will add more training on the use of the proxy breakpoint screen and the exception keys.

3.2.2 Training Guide

The Training Guide for the Main Interview was very similar to the Training Guide for the Screening Interview, in terms of format. Like the Training Guide for the Screening Interview, it contained the training script, PowerPoint slides, training exercises, job aids and mock interview scripts for each of the 30 modules in the Training Agenda.

The lecture portions of the Main Interview Training Guide worked well in Pretest 1, as did the training exercises and job aids. NORC plans to make only minor modifications to these for main study training. The mock interview scripts will need substantial revision. Specifically, NORC plans to make the following changes to the mock interview scripts for main interview training:

Table 3 contains a summary of changes that NORC recommends be made to specific modules of the Main Interview Training Guide, based on Pretest 1 training and the interviewer debriefing at the end of Pretest 1 data collection.

Table 3. Recommended Changes to Training Guide
Module Title/Recommended Change
3 Questionnaire Conventions
3 Add section on the scripting of the probe: "Can you give me an estimate?"
3 Use screen shots from CATI rather than MS Word version of questions.
3 Create a slide with the answers to Exercise #6.
3 Revise section on Institution Look Up to match revisions to system and to include practice "mini-mock" exercises.
4 Questionnaire Section I
4 Add a description of the flow of section C.
15 New Equity Investments
15 Add a slide defining new equity investments.
16 Overview of Income & Expenses and Balance Sheet
16 Add an overview of a balance sheet similar to that used to introduce the income statement.

3.2.3 Interviewer Reference Manual

Table 4 contains a description of the documents added to the Interviewer Reference Manual for the Main Interview.

The materials added to the Interviewer Reference Manual for the main interview all proved to be useful in training except for the hardcopy version of the main interview questionnaire. There was never a time during training when it was useful to refer to the hardcopy version of the questionnaire. In addition to these materials, the interviewers requested that we also include a job aid containing the standard two-character state abbreviations. This job aid was created for interviewers during production. Additionally, a drop-down screen containing a state picklist will be added to the questionnaire for Pretest 2.

Table 4. Additions to the Interviewer Reference Manual for the Main Interview
Section Documents
Job Aids CATI Functions
Job Aids SSBF Important Codes/Telephone Numbers
Exhibits Worksheet Cover Letter
Exhibits Financial Services Roster for C-Corp Mock
Exhibits Interviewer Observations Questionnaire - Screener
Exhibits Interviewer Observations Questionnaire - Main Interview
Worksheets Actual full-size color worksheets (4 versions)
Training Handouts Day 2 Homework
Training Handouts Training Evaluation (Main Interview)
Training Exercises Translating Dollar Amounts (from words into numerals)
Training Exercises Deciding When to Write a Call Note
PowerPoint Slides Formatted as handouts, 3 to a page.
Practice Mock Interview Scripts Main Interview, C-Corp Duo #1
Practice Mock Interview Scripts Main Interview, C-Corp Duo #2
Practice Mock Interview Scripts Main Interview - Sole Proprietorship Duo #1
Practice Mock Interview Scripts Main Interview - Sole Proprietorship Duo #2
Practice Mock Interview Scripts Main Interview - Partnership Duo #1
Practice Mock Interview Scripts Main Interview - Partnership Duo #2
Main Interview Questionnaire MS Word version


4. Contact Materials

This section describes the materials that were mailed to respondents in Pretest 1. Respondents received a pre-screening advance mailing, and eligible respondents also received a worksheet mailing. Each of these mailings is discussed in detail in this section. This section also describes the project websites.

4.1 Pre-Screening Mailing to Business Owners

The pre-screening advance mailing consisted of a cover letter from the NORC project director, a letter from Federal Reserve Board Chairman Alan Greenspan, and a General Information Brochure containing answers to frequently asked questions about the study. These materials were modified versions of the documents used in the 1998 SSBF.

NORC and the FRB began revising the pre-screening letter from the project director (see Appendix A) immediately following Pretest 1 interviewer training, based on comments received from a new member of the NORC project team who has extensive experience in marketing research. The revised version of the letter is shorter, uses a slightly larger font than the original, and adds a summary sentence to the start of each paragraph. The intent of these changes is to make the project director letter more appealing to respondents who do not like to read. Also, we expect that even respondents who merely skim the revised letter will come away with the major points, including the study's purpose and awareness that respondents will receive a phone call from NORC.

NORC has proposed, and the FRB has accepted, two other changes to the pre-screening mailing. The first change is to the return address on the envelope; the address will now include the FRB's official seal and slightly altered wording. The intent is to use the FRB logo to convey the importance of the mailing - official business from a government agency, not "junk mail" - to both gatekeepers and recipients.

The other change to the pre-screening advance materials used in Pretest 1 is to test a "buck slip" in the pre-screening mailing for Pretest 2. About the size of a dollar bill, and with the FRB logo and three or four bulleted sentences, the buck slip is intended to increase the incidence of recipients who come away from the pre-screening mailing with the major communication points. The buck slip will be printed on high-impact colored paper and will be the first item a recipient sees when he or she opens the mailing. If a recipient looks at nothing but the buck slip, he or she will be likely to have registered the key communication points.

In Pretest 2, a random half of the respondents (n=300) will receive a pre-screening mailing with a buck slip, and the other half will receive a mailing without a buck slip. After Pretest 2, NORC will look for differences between the two groups on such measures as the average number of hours spent per completed screener, to help evaluate the effectiveness of using buck slips for the main study.
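
As a sketch of how that comparison might be computed (our illustration; the report does not spell out the calculation), the key measure reduces to total interviewer hours divided by completed screeners within each random half:

    def hours_per_complete(cases):
        """cases: list of dicts with 'hours' and 'completed' keys
        (hypothetical field names, not from the source)."""
        completes = sum(1 for c in cases if c["completed"])
        return sum(c["hours"] for c in cases) / completes

    # Evaluate each random half (n=300 per group) separately; a lower
    # value in the buck-slip group would favor using buck slips.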

Pretest 1 interviewers commented that respondents did not always remember receiving the pre-screening mailing, although respondents were not systematically asked if they had received it. The changes described above are all intended to make this mailing more memorable to respondents, and therefore, increase respondent cooperation with the screening interview. The results from Pretest 1 provided no indication for changing the Chairman Greenspan letter.

4.2 Worksheet Mailing

The worksheet mailing was sent to firms that were determined to be eligible for the survey on the basis of the screening interview. The worksheet mailing consisted of the following documents:

In general, the worksheet mailing seemed to work well in Pretest 1. It was the impression of the pretest interviewers that business owners were impressed by receiving this package of materials via Federal Express, and consequently tended to pay more attention to the materials. Interviewers also reported, however, that some owners (particularly those of very small businesses) were intimidated by the worksheets. Interviewers recommended that the worksheet cover letter be revised to make it more apparent that we do not expect every business owner to complete the entire worksheet. NORC will make this change for Pretest 2. Seventy of the 71 Pretest 1 respondents reported using the worksheet during the interview. The interviewers thought that proxy respondents who are accountants liked the worksheet. Twenty-one (21) worksheets (out of 211 mailed) were returned with some data; two of these were accompanied by tax forms. One company returned a tax form and general ledger with no worksheet.

4.3 SSBF Internet Sites

The pre-screening and worksheet letters from the project director refer respondents to two project websites, one hosted by the FRB (www.federalReserve.gov/ssbf) and the other hosted by NORC (www.norc.uchicago.edu/ssbf). The FRB website for the SSBF was online for Pretest 1, although at this point we do not know how many respondents went to the site. The NORC site for the SSBF was still under development; Pretest 1 respondents who may have clicked on the URL would have seen an "Under Construction" sign at the site.

The NORC project website will meet all requirements identified in NORC's "SSBF Functional Requirements" document by the time of the second pretest. In addition, with direction from the FRB, the NORC project website will be improved in both content and appearance from the website used for the 1998 round.

NORC is creating a Section-508-compliant project website to provide respondents with online information about the SSBF and, equally important, to help persuade respondents that the survey is legitimate and important and that their participation is valuable. Accordingly, the website's home page - which for many visitors may be the only part of the site they see - prominently displays the NORC and FRB logos with links to their respective sites. The home page has just two short text paragraphs, one explaining the study's purpose and the other a persuasive message that appeals to participants' self-interest, i.e., how their participation ultimately benefits them. In addition, the home page has a link marked "Letter from Chairman Alan Greenspan." Even if visitors do not click on the link, they are reminded that the survey has the explicit backing of Chairman Greenspan.

The NORC project website provides a wealth of information that is potentially useful to an SSBF respondent, including frequently asked questions (FAQs), letters from the project director, and information about NORC, the FRB, and the survey itself. As a persuasive tool, by Pretest 2 the website will have copies of endorsement letters from the National Business Association and the Small Business Administration. NORC is also attempting to secure an endorsement letter from the National Association of the Self Employed (NASE), and the FRB is attempting to secure an endorsement letter from the National Federation of Independent Businesses (NFIB). Both will be posted online.

The NORC website has downloadable copies of the four versions of the financial worksheet required for the main interview. The main purpose of adding the worksheets to the website is to allow respondents to gain an understanding of the kinds of information they will be asked to provide. Though the worksheets at the website are downloadable, they will print to 8.5 x 11-inch paper - too small for many respondents to use easily. NORC has posted a notice on the website that the intended size of each worksheet is 11 x 17 inches, with instructions to call the toll-free project hotline to have full-size worksheets sent through regular mail.

NORC is also adding a link to its SSBF site explaining its privacy policy. NORC is reviewing the possibility of adding an internal counter to its site, to track the number of visitors and which links are accessed most often. NORC is now checking whether using the counter software of a specific distributor would violate the privacy guidelines of either NORC or the FRB.

We anticipate that the NORC website will be a factor in the next pretest - the website will be accessible for the full duration of Pretest 2. During Pretest 2, NORC will use unprompted (volunteered) comments about the website from respondents to help ascertain the website's value and identify improvements.

The FRB project website "went live" during Pretest 1. The URL and logos help establish the SSBF as a legitimate FRB survey (as opposed to, for example, a market research survey). The FRB website provides an overview of the study and its objectives. Other information and links show how earlier SSBF survey waves have been used to influence public policy. The site includes excerpts of papers by Chairman Greenspan, Vice Chairman Ferguson, and Governor Bies that mention the SSBF, with links to the full text of each paper. Finally, the FRB website has links to Frequently Asked Questions about the SSBF and to publicly available references and abstracts of research that used the 1987, 1993, and 1998 SSBF.


5. Sample

This section describes the pretest sample and the procedures for frame acquisition and sample selection. We also discuss the quality of the pretest sample data and the adequacy of the sample size for the first pretest.

5.1 Frame Acquisition and Preparation

The pretest sample frame was purchased from Dun & Bradstreet (D&B). The target population was defined as all non-governmental, non-agricultural, non-financial, non-subsidiary, for-profit firms in the United States with fewer than 500 employees that were in business in December 2003. A technical description of the sample frame can be found in NORC's Sampling Plan.

The initial frame sent to NORC by D&B consisted of 8,037,809 records. The initial data were a limited-content abstract of the standard DMI file. Each record consisted of the data elements listed in Table 5.

Table 5. Content of D&B Abstract Data Record
Data Elements
DUNS Number
State FIPS Code
County FIPS Code
MSA FIPS Code (prior to 2000)
Physical Location Zip Code
Sales Volume
Code for Estimated Range
Employee Total
Code for Estimated Range
Employee Here
Code for Estimated Range
Primary SIC Code
Status Indicator
Subsidiary Indicator
Manufacturing Indicator
Legal Status a.k.a. Business Type
Transition Code
D&B Report Date

NORC first purged the abstract file of 81 records with out-of-scope primary Standard Industrial Classification (SIC) codes. NORC next categorized employee totals and purged an additional 16,506 records with 500 or more employees from the file. The remaining 8,021,303 records comprised the sampling frame for Pretest 1.
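
The purge amounts to two simple filters, sketched below in Python. The field names and the out-of-scope SIC set are hypothetical stand-ins, not D&B's actual layout or the project's actual exclusion list.

    # Illustrative out-of-scope primary SIC codes (e.g., agriculture and
    # depository institutions); the real exclusion list is much longer.
    OUT_OF_SCOPE_SIC = {"0111", "6021"}

    def prepare_frame(abstract):
        """abstract: list of dicts with 'primary_sic' and 'employee_total'
        keys; employee_total is None when unknown."""
        # Purge records with out-of-scope primary SIC codes.
        frame = [r for r in abstract
                 if r["primary_sic"] not in OUT_OF_SCOPE_SIC]
        # Purge records with 500 or more employees; records with unknown
        # employment are retained (they fall in the 0-19/unknown stratum).
        return [r for r in frame
                if r["employee_total"] is None or r["employee_total"] < 500]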

The process of acquiring the pretest sample frame from D&B was fraught with difficulties. Some of these difficulties were associated with the time of year: NORC and D&B were developing the sampling specifications and constructing the sample frame during the Christmas and New Year's holidays. Whether because staff at both NORC and D&B were out for the holidays, or for some other reason, there were communication problems that resulted in D&B programmers initially working from a non-current version of the specifications. This problem was rectified before D&B sent the first file to NORC. Nevertheless, the first file sent to NORC by D&B did not match the specified file layout, making it impossible for NORC to read the file. This took almost a full week to straighten out with D&B.

There was also a misunderstanding with respect to the zip code in the file, which NORC originally understood to be the zip code of the firm's mailing address (whether or not it is a physical location). It turned out to be the zip code of the firm's physical location. Finally, and most significantly, there was a problem with the SIC codes that D&B delivered to NORC. Specifically, as described above, there were 81 cases with out-of-scope SIC codes in the file. It was later discovered that during the frame-building process, D&B had replaced the primary SIC code for these firms with the secondary SIC code, because of the firms' values on a third variable that was not used in defining the target population and because of some antiquated code that could not be removed from D&B's selection program. These 81 firms actually had in-scope primary SIC codes and, therefore, technically should not have been purged from the file before sample selection.

5.2 Sample Selection

Sample selection for Pretest 1 proceeded smoothly. From the frame of over eight million records, NORC selected a sample of 500 companies for Pretest 1. NORC employed systematic, stratified random sampling to select the pretest sample. Six strata were created by crossing employment size (0-19 and unknown, 20-499) with business type (sole proprietorship, partnership, corporation). Four of the six strata were allocated sample sizes of 83; the other two strata were allocated sample sizes of 84. The frame was sorted by SIC code within strata, and selections were then made systematically within each stratum, as sketched below.
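
The following minimal Python sketch illustrates the selection procedure just described. It is our illustration, not NORC's production code; the record fields are hypothetical.

    import random

    def systematic_sample(frame, n):
        """Select n records systematically from an ordered frame."""
        step = len(frame) / n
        start = random.random() * step                # random start point
        return [frame[int(start + i * step)] for i in range(n)]

    def select_pretest_sample(frame, allocations):
        """frame: list of dicts with 'stratum' and 'sic' keys.
        allocations: dict mapping stratum -> allocated size (83 or 84)."""
        sample = []
        for stratum, n in allocations.items():
            members = sorted((r for r in frame if r["stratum"] == stratum),
                             key=lambda r: r["sic"])  # implicit SIC sort
            sample.extend(systematic_sample(members, n))
        return sample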

The resulting sample was checked to ensure that the sample sizes within strata were equal to the allocated amounts. The output was also spot-checked to make sure that the systematic approach was correctly implemented. Finally, the distribution of SIC codes in the sample was compared with the distribution in the frame. The sample "passed" each of these checks.

After four days of screening, NORC had identified only about 100 eligible firms compared with 200 expected. Although the eligibility status of most of the pretest sample was unresolved at this point, NORC decided to add an additional 250 cases to the sample to ensure the required 50 completed main interviews by the end of the pretest data collection period. These 250 cases were drawn from a sample of 500 cases originally intended for the second pretest.

The original Pretest 2 sample of 500 cases was selected from the same sample frame as the Pretest 1 sample, minus the 500 cases already selected for Pretest 1. After the first sample had been selected, the remaining 8,020,803 (8,021,303 - 500) records were subjected to a second systematic selection, carried out in a manner completely analogous to the first set of selections. Note that for the main sample, any previously selected firms will first be removed from the updated DMI frame. Then the entire sample (37,000 firms) will be drawn simultaneously.

To select the 250 cases to add to the Pretest 1 sample, the 500 cases originally selected for Pretest 2 were first randomly ordered (uniform distribution) and then split into replicates of size 50 each, as sketched below. Five of these replicates became the sample for part 2 of Pretest 1. The remainder was set aside for use in Pretest 2. Though each of the replicates is a valid random subsample of the original systematically sampled cases, due to time constraints there was no attempt to control the number of cases selected into each replicate by stratum or primary SIC distribution. The procedures will be modified for the main sample selection so that each replicate is representative of the target population within each stratum, preserving the implicit SIC stratification.
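
A minimal sketch of the replicate construction (our illustration; the case identifiers are arbitrary placeholders):

    import random

    def make_replicates(cases, size=50):
        """Randomly order the cases, then split into replicates of `size`."""
        shuffled = list(cases)
        random.shuffle(shuffled)                      # uniform random order
        return [shuffled[i:i + size] for i in range(0, len(shuffled), size)]

    replicates = make_replicates(range(500))
    released = replicates[:5]   # 5 x 50 = 250 cases added to Pretest 1
    reserved = replicates[5:]   # remaining 250 held for Pretest 2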

5.3 Sample Data Quality

In this section, the quality of the sample data is analyzed from three perspectives. In the first part of this section, we analyze the quality of the address data for the firms in the sample, based on the results of processing these data with SmartMailer™ software, which is designed to standardize address data and determine their completeness based on United States Post Office standards. In the second part, we analyze the quality of the D&B sample data based on the outcomes of the calls we made to sampled firms. In the third and final part, we analyze the quality of the D&B sample data based on a comparison of these data with data from InfoUSA, another source of firm listings.

5.3.1 SmartMailer Outcomes

SmartMailer checks for the existence of the subject address in the given city and state. If the address is found, the zip code is verified, and a Zip+4 is obtained if available. Often, amplifying address information is found (e.g., an apartment, suite, or floor number); this new information is provided in a separate field, without overwriting the existing address. If SmartMailer cannot verify the zip code, it suggests an alternative zip code that matches the address, city, and state. If SmartMailer cannot verify the address, it flags the address but does not change any of the information. SmartMailer does not verify the resident or occupant at the subject address.

The results from SmartMailer, for the first batch of 500 Pretest 1 cases, are summarized in Table 6 below.

Table 6. Results of Checking Dun & Bradstreet Address Data Using SmartMailer
Outcome N Percent
Address validated by SmartMailer as complete and accurate 231 46.2
Address lacked apt., suite, or floor number that SmartMailer could not provide 85 17.0
Address contained spelling or formatting error corrected by SmartMailer 79 15.8
Address lacked apt., suite, or floor number that SmartMailer could provide 69 13.8
Address could not be found by SmartMailer 36 7.2
Total 500 100.0

As Table 6 shows, slightly less than half of the addresses supplied by Dun & Bradstreet were validated as complete and accurate by SmartMailer. The most common problem with the address supplied by Dun & Bradstreet was a missing apartment, suite or floor number. SmartMailer was able to provide the missing information for about half (45%) of these cases. The next most common problem was a spelling or formatting error. These were all corrected by SmartMailer. Only a small fraction of the addresses supplied by Dun & Bradstreet (7.2%) could not be found at all by SmartMailer.

5.3.2 Calling/Mailing Outcomes

The results achieved by mailing to and calling the firms in the pretest sample are another measure of sample data quality. The mailing outcomes for the 750 cases fielded in Pretest 1 suggest that the address data from Dun & Bradstreet were of fairly high quality. Specifically, the mailing outcomes were as follows:

The calling outcomes similarly suggest that the sample data from Dun & Bradstreet were of high quality. These are summarized in Table 7. Only 37 firms (5%) could not be contacted to confirm that we were calling the correct business. An additional 15 firms (2%) were eventually located after data collection ended. Only 23 of the 750 firms (3.1%) fielded in Pretest 1 were confirmed as out of business.

Table 7. Calling Outcomes Related to Sample Data Quality
Calling Outcome N %
Disconnected/Wrong Number 32 4.3
Located After Data Collection Period Ended 15 2.0
Non-Contact With Busy and No Answer 5 0.7
Successfully contacted 698 93.0
Total 750 100.0

5.3.3 Matching with InfoUSA

In this part of the report, we analyze the quality of the D&B sample data based on a comparison of these data with the business data from InfoUSA, a compiler of business telephone directories.

BACKGROUND. Because NORC's past experience with D&B data suggested that these data contained errors, one of which was the inclusion of firms no longer in business, NORC's original proposal called for matching the D&B data to InfoUSA data. The idea was to use a second source of information to identify firms with a lower probability of being in business, so that these firms could be subsampled in an effort to reduce data collection costs. Firms in the D&B file for which no match could be found in the InfoUSA file would be considered suspect in terms of still being in business. The premise of such a match is that two sources showing a business in operation increase the ex ante probability that the business is genuinely in operation. Thus, our original approach involved matching our D&B sample to InfoUSA, treating all matches as highly likely to be in operation and including them in the fielded sample with certainty. For those businesses in D&B with no match in InfoUSA, we planned to assume a lower likelihood of being in operation and, therefore, to subsample from this group to reduce the number of non-existent businesses we would try to reach.

The strength of this plan was that if non-matches turned out to be mostly non-operational businesses, considerable resources might be saved by not calling so many former businesses. However, any completed cases from the non-match group would receive a large weight (due to subsampling reducing the probability of selection). Thus the risk was that if a large number of operational businesses comprised the non-match group, we would end up with a large number of cases with large weights, yielding a large variance in the weights, which would in turn reduce the effective sample size. This result could occur, for instance, if the non-matches were mostly comprised of businesses that had moved.
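
One standard way to quantify that loss, added here for clarity (the report itself gives no formula), is Kish's effective sample size for $n$ completed cases with weights $w_i$:

    $n_{\mathrm{eff}} = \dfrac{\left( \sum_{i=1}^{n} w_i \right)^{2}}{\sum_{i=1}^{n} w_i^{2}}$

When all weights are equal, $n_{\mathrm{eff}} = n$; when a few completes from the subsampled non-match group carry very large weights, $n_{\mathrm{eff}}$ falls well below the nominal number of completed cases.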

After the start of the project, an additional fact came to light, which is that InfoUSA collects information to flag businesses on their list as being non-operational, or out of business (OOB). For our purposes on the 2003 SSBF, this results in three categories of businesses: (1) D&B businesses that match to InfoUSA businesses as operational, (2) D&B businesses that match to InfoUSA businesses as non-operational, and (3) D&B businesses that do not match to InfoUSA businesses at all.

EVALUATION OF THREE APPROACHES TO MATCHING. In discussions between project staff and Board staff, many uncertainties were identified with regard to the methods used by InfoUSA to conduct a match. We therefore determined that pretesting was necessary. After many conversations with InfoUSA, we identified three match procedures to test, using the 1,000 sampled cases intended for the two pretests.

NORC sent InfoUSA the pretest samples and asked them to match these to their database using three matching methods. The matching was done independently for each method; the three methods are described below.

  1. Standard Match (SM). This approach required that 80% of the characters in the company name and company address match to one of their records. InfoUSA claims typically to match 60-65% of a file. They also claim that their standard match is designed to be very reliable, thus not producing false matches or multiple matches. (A rough code sketch of these matching criteria follows this list.)
  2. Loose Match (LM). This approach was analogous to the SM, but relaxed the character-match requirement for company name and address to somewhere below the 80% threshold (supposedly 60%). This match procedure was selected to account for different spellings, typos, and similar variations that might have caused a non-match.
  3. 6-digit Match (6M). This approach defined a match as a record that matched on company name (80%) and on the first six digits of the telephone number (exactly). The intention of this match was to capture firms that had moved but retained their phone numbers.
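
The following Python sketch, using character-level similarity ratios, renders the criteria described in the list above. InfoUSA's actual matching algorithm is proprietary and certainly differs in detail; this is only a rough illustration of the stated thresholds, with hypothetical record fields.

    from difflib import SequenceMatcher

    def similarity(a, b):
        """Character-level similarity ratio between two strings (0.0-1.0)."""
        return SequenceMatcher(None, a.upper(), b.upper()).ratio()

    def standard_match(db_rec, info_rec, threshold=0.80):
        # SM: name and address must each clear the 80% threshold.
        return (similarity(db_rec["name"], info_rec["name"]) >= threshold and
                similarity(db_rec["address"], info_rec["address"]) >= threshold)

    def six_digit_match(db_rec, info_rec, threshold=0.80):
        # 6M: name at 80%, plus exact match on the first six digits of the
        # phone number (captures firms that moved but kept their numbers).
        return (similarity(db_rec["name"], info_rec["name"]) >= threshold and
                db_rec["phone"][:6] == info_rec["phone"][:6])
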
As described above, NORC ended up fielding in Pretest 1 750 of the 1,000 firms originally selected for Pretests 1 and 2. By calling these firms to conduct screening interviews, NORC was able to classify most businesses in the Pretest 1 sample as either operational or non-operational. In addition, NORC conducted various locating activities to resolve any unknown classifications.

These procedures were intended to answer the following questions:

  1. What is the match rate? If InfoUSA cannot match a reasonable percentage of D&B businesses, matching is not worthwhile.
  2. What percentage of the records matched as in-business is actually in business? If this percentage is not high, matching is not worthwhile.
  3. What percentage of non-matched records is in business? If this percentage is high, then matching is not worthwhile.
  4. What percentage of the records matched as out-of-business is actually out of business? If this percentage is low, we cannot use the out-of-business flag. If it is quite high, we could totally eliminate these records from consideration, or at least subsample them at very low rates.
QUALITY OF THE MATCHES. NORC's analysis of the resulting matches raised questions about the Loose Match (LM) and the 6-digit Match (6M) that were never satisfactorily answered by InfoUSA. The Standard Match (SM), however, lived up to expectations. As InfoUSA predicted, the SM resulted in matches for approximately 60 percent of the D&B sample (n=592 of the 1000 sampled firms matched as in business). The SM also returned 19 out-of-business firms, which InfoUSA refers to as "nixies." Moreover, close inspection of the matched records revealed that these matches appeared to meet the specified matching criteria.

The LM proved to be completely unreliable. Based on the definition of the Loose Match, we anticipated that the LM matches would be a superset of the SM matches, but this was not the case. For many of the LM matches (and non-matches), no rationale behind the match status was evident. Although its matches appeared to be highly reliable, the 6M method was too conservative, as many potential matches were left as non-matches.

The following types of inconsistencies were identified with the Loose and 6-digit Matches:

  1. Standard matches that do not appear as loose matches (n=238).
  2. Standard matches for which the first 6 digits of the phone numbers match, but there is not a corresponding 6-digit match for that record (n=270).
  3. Loose matches with company names that are so different that the looseness criteria described should not have picked them up as matches (n=47 of the first 100 records).
  4. Loose matches that look good enough to be standard matches, but are not (n=37 of the first 100 records).
  5. Cases for which there is both a standard match and a loose match, where the standard match has the same firm name but the loose match has a different firm name that does not match the D&B firm name (n=34 of the first 100 records).
Because of these inconsistencies, the matching pretest was unable to fully answer question #1 above. However, because of its 60 percent match rate, the Standard Match still held promise for use in the sample design. NORC next looked at the results of Pretest 1 screening to evaluate the operational status of matches and non-matches, and also to evaluate the accuracy of InfoUSA's "Out-of-Business" flag.

EVALUATION OF MATCHING OUTCOME BASED ON PRETEST OUTCOME. The purpose of this analysis was to evaluate the usefulness of the matching outcome, using information obtained in the pretest about whether or not the firm was currently in operation, and to determine whether this varied by firm size or ownership type.

After screening was completed, NORC classified the businesses into three categories, based on screening outcome. Table 8 shows the number of cases in each category. To show how Operating Status was determined, Table 9 shows a cross-tabulation of operating status and final case disposition.

Table 8. Operating Status of Pretest 1 Cases
Operating Status Description N Percent
B Operating 688 91.7
N Not Operating 57 7.6
U Unknown Operating Status 5 0.7

Table 9. Operating Status of Pretest 1 Cases by Final Case Disposition
Final Disposition Operating Status N Percent
COMPLETE-Ineligible B 93 12.4
COMPLETE-Ineligible1 N 1 0.1
COMPLETE-Eligible B 302 40.3
LANGUAGE BARRIER B 3 0.4
COMPUTER TONE/FAX N 1 0.1
DISCONNECTED/WRONG NUMBER N 32 4.3
NO LONGER IN BUSINESS N 23 3.1
PRIVACY MANAGER B 6 0.8
LOCATED AFTER DATA COLL PD ENDED B 15 2.0
NON-CONTACT WITH BUSY AND NO ANSWER U 3 0.4
NON-CONTACT W/ALL NO ANSWER OR ALL BUSY U 2 0.3
FINAL UNAVAILABLE/AWAY FOR FIELD PERIOD B 82 10.9
NON-CONTACT W/ANSWERING MACHINE B 20 2.7
PROXY REFUSAL B 5 0.7
R/OWNER REFUSAL B 85 11.3
GATEKEEPER REFUSAL B 72 9.6
HOSTILE REFUSAL B 5 0.7

1 Screener data were used to determine that this case was ineligible because the firm was not in operation at the time of the screener interview. Return to Table

As Table 8 shows, 92 percent of the firms sampled from D&B for Pretest 1 were operating as a business at the time of the pretest. The driving force behind the matching with InfoUSA data was the assumption that the in-operation rate of firms in the D&B sample would be lower than 92 percent. The five percent follow-up subsample of the SSBF98 suggests an estimate of 88 to 90 percent.2 Nevertheless, eight percent of the main survey sample, which is expected to be about 37,000 firms, translates into 2,960 out-of-business firms. Therefore, being able to predict out-of-business firms, without having to go through the screening process, is still valuable. It is noteworthy that, for the 62 pretest cases finalized as either N or U, the average number of calls made to reach this disposition was 5.4 (minimum=1, maximum=26, mode=4). Therefore, there are potential cost savings in identifying firm characteristics that make an enterprise likely to be out of business.

Table 10 compares SM matches with SM non-matches in terms of operating status. As this table shows, there is a difference in the in-operation rates for SM matches and non-matches (95.6% for SM matches versus 85.8% for SM non-matches). However, the difference is not substantial. Furthermore, the percentage of non-matches that are in business is quite high. Therefore, the assumption that non-matches are less likely to be in business appears to be correct, but the SM non-matches cannot simply be assumed not to be viable businesses.
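For reference, the in-operation rates quoted above are computed from the column totals in Table 10:

$\frac{434}{434 + 17 + 3} = \frac{434}{454} \approx 95.6\% \qquad \frac{254}{254 + 40 + 2} = \frac{254}{296} \approx 85.8\%$

for the SM matches and SM non-matches, respectively.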

Table 10. Operating Status and Match Outcome by Final Case Disposition
Final Disposition Operating Status Non-Matches (N) Non-Matches (Percent) Matches (N) Matches (Percent)
COMPLETE-Ineligible B 46 15.5 47 10.4
COMPLETE-Eligible B 91 30.7 211 46.5
LANGUAGE BARRIER B 2 0.7 1 0.2
PRIVACY MANAGER B 5 1.7 1 0.2
LOCATED AFTER DATA COLL PD ENDED B 7 2.4 8 1.8
FINAL UNAVAILABLE/AWAY FOR FIELD PERIOD B 29 9.8 53 11.7
NON-CONTACT W/ANSWERING MACHINE B 7 2.4 13 2.9
PROXY REFUSAL B 2 0.7 3 0.7
R/OWNER REFUSAL B 36 12.2 49 10.8
GATEKEEPER REFUSAL B 28 9.5 44 9.7
HOSTILE REFUSAL B 1 0.3 4 0.9
TOTAL B 254 85.8 434 95.6
COMPLETE-Ineligible1 N 0 0.0 1 0.2
COMPUTER TONE/FAX N 1 0.3 0 0.0
DISCONNECTED/WRONG NUMBER N 25 8.4 7 1.5
NO LONGER IN BUSINESS N 14 4.7 9 2.0
TOTAL N 40 13.5 17 3.7
NON-CONTACT WITH BUSY AND NO ANSWER U 1 0.3 2 0.4
NON-CONTACT W/ALL NO ANSWER OR ALL BUSY U 1 0.3 1 0.2
TOTAL U 2 0.7 3 0.7

Based on these results, there appears to be little to gain from stratifying the 2003 SSBF sample by match status. However, since the sample design involves a variety of stratifications, NORC might still consider subsampling within a set of cells if there were sufficient differences between the matches and non-matches. Two possibilities would be to partition the SM matches and SM non-matches by employment size (0-19, 20-49, 50-99, and 100-499) and/or business type (sole proprietor, partnership, and corporation). The results in Tables 11-15 show a consistent pattern: most subgroups of the SM non-match group have lower in-operation rates than their SM match counterparts, but the non-match subgroups nevertheless have high in-operation rates. Thus, we cannot assume that any subgroup of the non-matches consists largely of enterprises that are not operating as a business.

Table 11. Operating Status by Firm Size and Match Outcome
Firm Size Operating Status Non-Matches (N) Non-Matches (Percent) Matches (N) Matches (Percent)
0-19 B 139 47.0 199 43.8
0-19 N 27 9.1 9 2.0
0-19 U 1 0.3 1 0.2
20-49 B 84 28.4 170 37.4
20-49 N 8 2.7 6 1.3
20-49 U 0 0.0 2 0.4
50-99 B 20 6.8 42 9.3
50-99 N 4 1.4 0 0.0
50-99 U 1 0.3 0 0.0
100-499 B 11 3.7 23 5.1
100-499 N 1 0.3 2 0.4
100-499 U 0 0.0 0 0.0

Table 12. Operating Status by Ownership Type and Match Outcome
Ownership Type Operating Status Non-Matches (N) Non-Matches (Percent) Matches (N) Matches (Percent)
Sole Proprietorship B 97 32.8 128 28.2
Sole Proprietorship N 18 6.1 5 1.1
Sole Proprietorship U 1 0.3 2 0.4
Partnership B 75 25.3 151 33.3
Partnership N 15 5.1 6 1.3
Partnership U 0 0.0 1 0.2
Corporation B 82 27.7 155 34.1
Corporation N 7 2.4 6 1.3
Corporation U 1 0.3 0 0.0

Table 13. Operating Status by Firm Size, Ownership Type, and Match Outcome
Firm Size Ownership Type Operating Status Non-Matches (N) Non-Matches (Percent) Matches (N) Matches (Percent)
0-19 Sole Proprietorship B 51 17.2 62 13.7
0-19 Sole Proprietorship N 11 3.7 2 0.4
0-19 Partnership B 36 12.2 72 15.9
0-19 Partnership N 11 3.7 3 0.7
0-19 Partnership U 0 0.0 1 0.2
0-19 Corporation B 52 17.6 65 14.3
0-19 Corporation N 5 1.7 4 0.9
0-19 Corporation U 1 0.3 0 0.0
20-49 Sole Proprietorship B 41 13.9 53 11.7
20-49 Sole Proprietorship N 5 1.7 3 0.7
20-49 Sole Proprietorship U 0 0.0 2 0.4
20-49 Partnership B 24 8.1 59 13.0
20-49 Partnership N 2 0.7 2 0.4
20-49 Corporation B 19 6.4 58 12.8
20-49 Corporation N 1 0.3 1 0.2
50-99 Sole Proprietorship B 3 1.0 8 1.8
50-99 Sole Proprietorship N 1 0.3 0 0.0
50-99 Sole Proprietorship U 1 0.3 0 0.0
50-99 Partnership B 8 2.7 13 2.9
50-99 Partnership N 2 0.7 0 0.0
50-99 Corporation B 9 3.0 21 4.6
50-99 Corporation N 1 0.3 0 0.0
100-499 Sole Proprietorship B 2 0.7 5 1.1
100-499 Sole Proprietorship N 1 0.3 0 0.0
100-499 Partnership B 7 2.4 7 1.5
100-499 Partnership N 0 0.0 1 0.2
100-499 Corporation B 2 0.7 11 2.4
100-499 Corporation N 0 0.0 1 0.2

Table 14. Operating Status by Firm Size (Percent calculation by firm size)
Firm Size Operating Status Non-Matches (N) Non-Matches (Percent) Matches (N) Matches (Percent)
0-19 B 139 83.2 199 95.2
0-19 N 27 16.2 9 4.3
0-19 U 1 0.6 1 0.5
0-19 TOTAL 167 100.0 209 100.0
20-49 B 84 91.3 170 95.5
20-49 N 8 8.7 6 3.4
20-49 U 0 0.0 2 1.1
20-49 TOTAL 92 100.0 178 100.0
50-99 B 20 80.0 42 100.0
50-99 N 4 16.0 0 0.0
50-99 U 1 4.0 0 0.0
50-99 TOTAL 25 100.0 42 100.0
100-499 B 11 91.7 23 92.0
100-499 N 1 8.3 2 8.0
100-499 U 0 0.0 0 0.0
100-499 TOTAL 12 100.0 25 100.0

Table 15. Operating Status by Collapsed Firm Size (Percent calculation by firm size)
Firm Size Operating Status Non-Matches (N) Non-Matches (Percent) Matches (N) Matches (Percent)
0-19 B 139 83.2 199 95.2
0-19 N 27 16.2 9 4.3
0-19 U 1 0.6 1 0.5
0-19 TOTAL 167 100.0 209 100.0
20-499 B 115 89.1 235 95.9
20-499 N 13 10.1 8 3.3
20-499 U 1 0.8 2 0.8
20-499 TOTAL 129 100.0 245 100.0

EVALUATION OF INFOUSA'S OUT-OF-BUSINESS FLAG. Finally, NORC used the information gained in the pretest about the operating status of the firms to evaluate InfoUSA's out-of-business flag. InfoUSA's term for those records flagged as not in operation is "nixie." In all, 19 of the 1000 firms in the pretest sample were flagged as nixies; 12 of these appeared among the 750 firms comprising the sample for Pretest 1. Of these 12, six were determined to be in business during the pretest and six were determined not to be in operation. Given that less than eight percent of the overall sample was found not to be in operation, a 50 percent out-of-operation rate for nixies shows that this flag has some predictive power. However, given the level of effort InfoUSA claims to make before it flags a record as out of business, a 50 percent in-business rate is a dismal failure. Given the small sample size, we hesitate to draw too firm a conclusion. It is possible that we encountered a very bad draw, i.e., that with a larger sample of nixies, the in-business rate would turn out much lower. However, based on the information we have, we do not consider the nixie flag to be of value for the 2003 SSBF sample design.
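To illustrate how wide that uncertainty is, the following sketch, which is our own illustration and not part of the pretest analysis, computes an exact Clopper-Pearson confidence interval for the nixie out-of-operation rate:

from scipy.stats import beta

def clopper_pearson(x, n, alpha=0.05):
    # Exact (Clopper-Pearson) confidence interval for a binomial proportion.
    lower = beta.ppf(alpha / 2, x, n - x + 1) if x > 0 else 0.0
    upper = beta.ppf(1 - alpha / 2, x + 1, n - x) if x < n else 1.0
    return lower, upper

# 6 of the 12 Pretest 1 nixies were out of business.
print(clopper_pearson(6, 12))  # approximately (0.21, 0.79)

With 6 of 12 nixies out of business, the 95 percent interval runs from roughly 21 to 79 percent, which is consistent with both a nearly useless flag and a fairly accurate one.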

CONCLUSION AND DECISIONS. Based on the results described above, NORC concluded that the information generated by matching the D&B file to InfoUSA's file would not enhance the sample design or the operational efficiency of data collection. First, only the Standard Match proved reliable, preventing any attempt to narrow the non-match stratum. Second, while in-business rates were lower for the non-match group, the in-business rate was still high for that group. Using the non-match group as intended would require either a high subsampling rate, defeating the purpose of the exercise, or a low rate that could produce substantial design effects. Third, the InfoUSA out-of-business flag (the nixies) proved unreliable: half of the 12 nixies we examined were in fact operating businesses.

On the basis of these findings, FRB and NORC staff decided that no matching to InfoUSA should be undertaken for the purposes of sampling. The potential benefits do not warrant the costs of designing and implementing such a scheme. Nor is it worth complicating the sample design and weighting procedures for such small potential savings.

5.4 Sample Size

As described above, the original pretest sample of 500 firms was augmented with an additional 250 firms due to a concern about achieving the required number of completed interviews in the relatively brief pretest data collection period. The final outcomes, by sample, are described in Section 6.2 below.


Data Collection

This section describes the data collection procedures used in Pretest 1 and the results of data collection.

6.1 Data Collection Procedures

The data collection procedures evaluated in Pretest 1 were the callback rules and disposition codes used in the Telephone Number Management System (TNMS), other TNMS design features, and the procedures for mailing respondent materials.

6.1.1 Callback Rules

Due to the much shorter data collection period in the pretest, NORC reduced the callback intervals in the Telephone Number Management System (TNMS) to approximately half of what they will be in the main survey, so that more calls would be made in a shorter period of time. For example, if the callback interval was 24 hours after the third ring-no-answer, we reduced this interval to 12 hours. Although NORC was able to complete the desired number of pretest interviews in the allocated time, pretest interviewers felt that cases were called back too frequently and after too short a period of time. This resulted in wasted calls and risked annoying respondents. Interviewers also questioned whether it is a good idea to put ring-no-answer cases into locating. Although we are not planning to change the callback rules for the second pretest, they will be modified for the main survey.
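As a minimal sketch of the halving rule: only the ring-no-answer interval below is taken from this report; the rule names and the other values are hypothetical placeholders, since the full TNMS callback table is not reproduced here.

# Callback intervals in hours, keyed by rule.
MAIN_SURVEY_CALLBACK_HOURS = {
    "third_ring_no_answer": 24,  # cited in this report
    "answering_machine": 48,     # placeholder value
    "busy_signal": 2,            # placeholder value
}

# Pretest rule: roughly halve each interval so that more calls are made
# within the shorter data collection period.
PRETEST_CALLBACK_HOURS = {rule: hours / 2
                          for rule, hours in MAIN_SURVEY_CALLBACK_HOURS.items()}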

6.1.2 Disposition Codes

NORC used standard disposition codes for this pretest, modified slightly to handle study-specific circumstances (e.g., proxy refusals). At the pretest debriefing, the interviewers made the following suggestions regarding disposition codes:

NORC plans to make these changes to the disposition codes for Pretest 2.

6.1.3 Other TNMS Design Issues

In addition to changes to the callback rules and disposition codes, pretest interviewers also suggested the following changes to the Telephone Number Management System (TNMS):

NORC will work on implementing as many of these changes as possible for Pretest 2. We expect to have them all implemented for the main survey.

6.1.4 Mailing Procedures

NORC's Mail Center in downtown Chicago prepared the mailings for the pretest. There were three types of mailings in Pretest 1:

  1. Pre-screening advance materials
  2. Worksheets and related materials
  3. Respondent incentives.
MAILING OF PRE-SCREENING ADVANCE MATERIALS. For the pre-screening advance letter mailing, the sample control file was read into NORC's Case Management System (CMS). NORC Mail Center staff accessed the sample information in the CMS and ran the company names and addresses through SmartMailer software (see Section 5.3.1 above) to standardize the address information and to identify incomplete and undeliverable addresses.

Once the address data were standardized, NORC used a mail-merge procedure to produce the cover letter from the Project Director. This letter contained the name and address of each individual small business owner in the pretest sample. Along with the Project Director letter, the pre-screening advance mailing contained a letter from Federal Reserve Board Chairman Alan Greenspan and a General Information Brochure. The content of these materials is described in Section 4.1 above. The three documents were placed in #10 business envelopes, metered, and mailed.
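The mail-merge step itself is conceptually simple. A minimal sketch in Python, with the file name and field names assumed for illustration and SmartMailer's address standardization taken as already done:

import csv
from string import Template

# Assumed letter template; the real letter is reproduced in Appendix A-1.
LETTER = Template(
    "$owner_name\n$street\n$city, $state $zip\n\n"
    "Dear $owner_name:\n\n[body of the Project Director letter]\n"
)

with open("pretest_sample_standardized.csv", newline="") as f:
    for row in csv.DictReader(f):
        # One personalized letter per sampled firm.
        with open(f"letter_{row['case_id']}.txt", "w") as out:
            out.write(LETTER.substitute(row))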

Pre-screening advance materials were mailed to the first batch of Pretest 1 respondents (n=500) on March 1, 2004, and to the second batch of pretest respondents (n=250) on March 22, 2004. The mailing of the pre-screening advance materials went smoothly for Pretest 1. NORC is planning to follow the same procedures for Pretest 2.

MAILING OF WORKSHEETS AND RELATED MATERIALS. Cases successfully screened and determined to be eligible for the main interview were mailed worksheets and related materials. The contents of the worksheet mailing are described in Section 4.2 above.

To determine which firms should be sent a worksheet mailing, NORC captured the outcome of the screening interviews electronically in NORC's SurveyCraft Telephone Number Management System (TNMS). In Pretest 1, a programmer created the files for the worksheet mailing, but this process will be automated for Pretest 2. There is one file for each worksheet type (i.e., sole proprietorship, partnership, C-corporation, and S-corporation), and a fifth file containing cases of unknown organization type. The worksheet mailing for companies with unknown organization type contained a copy of all four types of worksheets, plus a cover page explaining why four worksheets were enclosed and requesting that the respondent complete the appropriate version. Of the 211 respondents in Pretest 1 to whom we mailed worksheets, seven had unknown organization type.
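A minimal sketch of the file-splitting logic to be automated for Pretest 2, with the record layout and field names assumed for illustration:

from collections import defaultdict

WORKSHEET_TYPES = {"sole proprietorship", "partnership",
                   "c-corporation", "s-corporation"}

def partition_for_mailing(eligible_cases):
    # One output file per worksheet type, plus a fifth file for unknown
    # organization type (those firms receive all four worksheets and an
    # explanatory cover page).
    files = defaultdict(list)
    for case in eligible_cases:
        org = case.get("organization_type", "").lower()
        files[org if org in WORKSHEET_TYPES else "unknown"].append(case)
    return files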

The worksheet and related materials were placed in a special folder designed for this study and mailed to respondents overnight via Federal Express. In the first batch of sample, there were six cases whose mailing address was a Post Office box. Since Federal Express does not deliver to Post Office boxes, these cases were re-contacted by a Telephone Supervisor and asked to provide a street address. All six provided a street address when recontacted, and NORC subsequently sent these firms the worksheet mailing via Federal Express. The second batch of sample contained four such cases. These cases were not recontacted, however, because they were not needed to reach the goal of 50 completed pretest interviews. For Pretest 2, the screening questionnaire will be revised to collect a physical address rather than a mailing address. Also, an interviewer prompt will be added, reminding interviewers to probe for a street address if the respondent gives a Post Office box or rural route as the physical address.

Otherwise, the worksheet mailing for Pretest 1 went smoothly. NORC plans to use the same mailing procedures for Pretest 2.

6.2 Results of Data Collection

NORC completed 302 eligible screeners and 52 main interviews. An additional 19 main interviews were partially completed. The pretest was not a test of the response rates that might be achieved in the main study, for the following reasons:

Table 16 summarizes screening activity by day for Pretest 1. Pretest 1 screening started on March 10, 2004 and ended on April 6, 2004. Note that no screening was conducted during the period of main interview training (March 15-17). As this table shows, Saturdays were very low production days. Also, the high levels of production in the later part of the data collection period are probably due to the fact that an additional 250 cases were released for screening on March 24. Finally, hours per complete were double what NORC budgeted, probably due to shortening the interval between callbacks (thus increasing the number of calls) and strictly enforcing the "three-attempts" rule to reach the owner of the firm.
Table 16. Screener Activity by Day
Day, Date Cumulative Hours Cumulative Eligible Completes Cumulative Ineligible Completes Cumulative Total Completes Hours per Complete
Wednesday, March 10, 2004 47.15 33 10 43 1.1
Thursday, March 11, 2004 101.55 79 25 104 1.0
Friday, March 12, 2004 134.15 112 54 166 0.8
Saturday, March 13, 2004 149.58 115 54 169 0.9
Monday, March 15, 2004 NONE NONE NONE NONE NONE 
Tuesday, March 16, 2004 NONE NONE NONE NONE NONE 
Wednesday, March 17, 2004 NONE NONE NONE NONE NONE 
Thursday, March 18, 2004 187.58 143 62 205 0.9
Friday, March 19, 2004 203.61 154 67 221 0.9
Saturday, March 20, 2004 206.36 157 68 225 0.9
Monday, March 22, 2004 228.66 166 71 237 1.0
Tuesday, March 23, 2004 244.61 171 72 243 1.0
Wednesday, March 24, 2004 260.8 185 75 260 1.0
Thursday, March 25, 2004 303.48 215 75 290 1.0
Friday, March 26, 2004 322.48 232 84 316 1.0
Saturday, March 27, 2004 326.98 233 85 318 1.0
Monday, March 29, 2004 354.82 252 87 339 1.0
Tuesday, March 30, 2004 379.49 273 88 361 1.1
Wednesday, March 31, 2004 397.42 283 89 372 1.1
Thursday, April 01, 2004 424.57 288 92 380 1.1
Friday, April 02, 2004 433.25 291 93 384 1.1
Monday, April 05, 2004 449.75 302 96 398 1.1
Tuesday, April 06, 2004 450.67 302 96 398 1.1
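(For reference, the final column is simply cumulative hours divided by cumulative total completes; for the last row, $450.67 / 398 \approx 1.13$, which rounds to the 1.1 shown.)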

Table 17 summarizes main interview activity, by day, for Pretest 1. As the table shows, Saturdays were also low production days for main interviewing. Hours per case were appreciably lower than budgeted (3.7 vs. 5.0). This may be due to the high caliber of the interviewers (see Section 7), the effectiveness of the respondent materials, or both.

Table 17. Main Interview Activity by Day
Date Cumulative Hours Cumulative Completes Hours per Complete
Friday, March 19, 2004 27.7 7 4.0
Saturday, March 20, 2004 28.9 7 4.1
Monday, March 22, 2004 55.3 15 3.7
Tuesday, March 23, 2004 71.0 16 4.4
Wednesday, March 24, 2004 91.0 22 4.1
Thursday, March 25, 2004 110.8 28 4.0
Friday, March 26, 2004 125.2 37 3.4
Saturday, March 27, 2004 128.7 37 3.5
Monday, March 29, 2004 142.78 41 3.5
Tuesday, March 30, 2004 161.61 43 3.8
Wednesday, March 31, 2004 178.45 49 3.6
Thursday, April 01, 2004 204.22 54 3.8
Friday, April 02, 2004 222.07 63 3.5
Monday, April 05, 2004 247.74 68 3.6
Tuesday, April 06, 2004 262.81 71 3.7

The results of Pretest 1 screening, by final case disposition, are presented in Table 18, and the results of main interviewing are presented in Table 19. These tables also show the average number of call attempts made for each final disposition code.

Table 18. Results of Pretest 1 Screening by Final Case Disposition
Final Disposition Number of Cases Percent Cumulative Number of Cases Cumulative Percent Avg. Number of Calls
COMPLETE-INELIGIBLE 94 12.5% 94 12.5% 4.5
COMPLETE-ELIGIBLE 302 40.3% 396 52.8% 5.4
LANGUAGE BARRIER 3 0.4% 399 53.2% 3.0
COMPUTER TONE/FAX 1 0.1% 400 53.3% 6.0
DISCONNECTED/WRONG NUMBER 32 4.3% 432 57.6% 6.5
NO LONGER IN BUSINESS 23 3.1% 455 60.7% 3.7
PRIVACY MANAGER 6 0.8% 461 61.5% 6.3
LOCATED AFTER DATA COLL PD ENDED 15 2.0% 476 63.5% 6.2
NON-CONTACT WITH BUSY AND NO ANSWER 3 0.4% 479 63.9% 7.0
NON-CONTACT W/ALL NO ANSWER OR ALL BUSY 2 0.3% 481 64.1% 5.0
FINAL UNAVAILABLE/AWAY FOR FIELD PERIOD 82 10.9% 563 75.1% 11.8
NON-CONTACT W/ANSWERING MACHINE 20 2.7% 583 77.7% 14.9
PROXY REFUSAL 5 0.7% 588 78.4% 9.6
OWNER REFUSAL 85 11.3% 673 89.7% 6.9
GATEKEEPER REFUSAL 72 9.6% 745 99.3% 9.2
HOSTILE REFUSAL 5 0.7% 750 100.0% 8.8

Table 19 contains the results of data collection for the main interview, by final case disposition. As with the previous table, this table also contains the average number of calls made by final disposition. These results confirm the impression of the pretest interviewers, reported during the debriefing, that once a respondent starts the interview, he or she usually completes it. Note that the 72 cases counted as "complete" for this table were not subject to the completeness test required by the FRB, but merely made it all the way through the main interview questionnaire. For the one firm that was found to be no longer in business, the screening interview had been completed with a proxy respondent. The one ineligible firm had a new owner who was unable to report on the 2003 finances of the firm.

Table 19. Results of Main Interviewing by Final Disposition Code
Final Case Disposition Number of Cases % Cum Number of Cases Cum % Avg. Number of Calls
COMPLETE 72 32.6% 72 32.6% 5.4
NO LONGER IN BUSINESS 1 0.5% 73 33.0% 9.0
INELIGIBLE 1 0.5% 74 33.5% 5.0
PRIVACY MANAGER 2 0.9% 76 34.4% 5.5
PARTIAL COMPLETE 18 8.1% 94 42.5% 9.3
UNAVAILABLE DURING FIELD PERIOD 86 38.9% 180 81.4% 7.4
PROXY REFUSAL 4 1.8% 184 83.3% 10.5
OWNER REFUSAL 15 6.8% 199 90.0% 6.5
GATEKEEPER REFUSAL 22 10.0% 221 100.0% 6.6


7. Interviewers


7.1 Recruiting and Training


NORC recruited 12 trainees to conduct Pretest 1. Three trainees were experienced NORC telephone interviewers trained on other NORC projects, six were from employment agencies specializing in placing accounting, bookkeeping and other business professionals into temporary and permanent positions, and the remaining three trainees came from referrals, newspaper ads and other sources (see Table 20).

Table 20. Interviewer Trainees Who Completed Training by Source
Source Number of Trainees Number Who Completed Training*
Experienced NORC staffers 3 3
Accountemps 2 1
Office Team 2 2
Spherion 3 3
Referrals, newspaper ads and other 3 3
* Those certified to administer both the screener and the main instrument through check-out mocks with trainers.


All 12 trainees completed a six-hour course on general telephone interviewing practices and procedures. (The experienced NORC interviewers had completed this introductory training at the start of their employment with NORC; the other trainees completed it immediately before the SSBF-specific training.)

The SSBF screener training took two days and occurred on March 8 and 9, 2004. (See Section 3 for a description of the training.) Of the 12 trainees, 11 successfully completed the SSBF training and were certified to administer both the screener and the main instrument after check-out mock interviews with trainers. The trainee not certified to work on the study was recruited from one of the temporary agencies, Accountemps. The SSBF main interview training took three days and occurred on March 15 through 17, 2004.

7.2 Telephone Interviewer Performance 

Generally, interviewer performance in Pretest 1 was evaluated using NORC's standard criteria for screening candidates for interviewing positions and for evaluating interviewers' performance on the job. The former criteria include exhibiting enthusiasm, being able to read questions fluently and with appropriate affect, and the ability to follow directions and communicate clearly. The latter criteria include the number of completed interviews, hours per completed interview, and dials per hour.

The results of this evaluation suggest that recruiting some interviewers from agencies is a good practice for SSBF. The three top-performing interviewers in the pretest (based on a combination of subjective and objective measures) all came from outside agencies. These three individuals differed in their histories and interviewing styles, but all three had a background in business, through work, education, or both.

We believe that the pretest demonstrated that a business background is a useful, though not necessary, attribute of a successful SSBF interviewer. Individuals with business backgrounds are more likely to understand the financial concepts in the SSBF main questionnaire than interviewers without such a background. Moreover, given their choice of education and jobs, those with business backgrounds may have more affinity for the topic and be more likely to demonstrate a convincing interest in the subject.

It needs to be emphasized, however, that the pretest also showed that a business background, no matter how impressive or well aligned with small business financial practices, is insufficient in and of itself to prepare an interviewer for SSBF screening and interviewing. While a business background appears to be helpful, exhibiting the other qualities NORC demands of all its interviewers is a necessary requirement for being a top SSBF performer.

All trainees had to demonstrate fluency with numbers and percentages to qualify to work on the study. Trainees with business backgrounds were more likely to have this fluency coming into training, but those without business backgrounds were able to master and demonstrate numeric fluency over the course of the training.

NORC recommends that, for the main study, the SSBF interviewer pool be a mix of experienced NORC interviewers from other studies and qualified trainees with business backgrounds recruited from agencies. We recommend filling openings with available NORC interviewers first, then filling the remaining openings with agency candidates.

NORC also recommends that, before being invited to SSBF training, candidates demonstrate skill in reading the entire screener aloud, which takes only about five minutes and includes many of the words that may be unfamiliar or difficult for some interviewers, such as "proprietorship."


7.3 Interviewer Debriefing Session

NORC conducted a Pretest 1 interviewer debriefing on April 13, 2004. The full-day session was held at NORC's Telephone Center in Downers Grove, Illinois and was attended by the NORC project staff, John Wolken and Traci Mach from the FRB, and the 11 pretest telephone interviewers.

The agenda for the debriefing can be found in Appendix B-1 of this report. The session was organized into 16 modules covering different aspects of pretest interviewing, including what worked and what did not work, screener issues, main questionnaire issues, the TNMS, training materials, job aids, and gaining cooperation.

Participation in the session was high, with each of the interviewers offering observations and comments at various times in the session. Both FRB and NORC project staff felt that a lot of good information and feedback was collected during the debriefing that will be very useful in improving systems, materials, and processes for Pretest 2 and the main survey. The minutes from the debriefing can be found in Appendix B-2.


8. System Integration

In a survey as complex as the Survey of Small Business Finances, it is critical not only that each task occur smoothly, but also that the interfaces between tasks work smoothly. In this section, we discuss the interface between Sampling and IT in terms of getting the pretest sample loaded into the production systems, between IT and the Mailing Center for the mailing of respondent materials, and between the Telephone Center and Accounts Payable for the payment of respondent incentives.


8.1 Delivery of Sample to IT

Early in the process of sample design, NORC sampling and IT staff worked together to identify the information from Dun and Bradstreet that should be preloaded into NORC's Case Management and Telephone Number Management systems. In general, this information included data used for reporting (such as organization type and size), for sample management (such as replicate and batch number), and for gaining cooperation (such as Standard Industrial Classification and owner's name). Sampling and IT staff also agreed upon a data layout and a schedule for data delivery that would give IT enough time to load the sample information into the systems before data collection was to begin.

The close communication and joint planning of sampling and IT staff appears to have paid off, because this interface worked very smoothly in Pretest 1. NORC plans to follow the same procedures for Pretest 2 and the main survey.


8.2 Delivery of Respondent Data to the Mailing Center

As described above, the SSBF involves three separate mailings to respondents: 1) a pre-screening mailing to all sampled firms, 2) a worksheet mailing to those firms determined to be eligible based on screening, and 3) an incentive mailing to those firms that complete the main interview and select the $50 incentive option. For the pre-screening mailing, the sample control file was read into NORC's Case Management System (CMS). NORC Mail Center staff accessed the sample information in the CMS and ran the company names and addresses through SmartMailer software (see Section 6.1.4 above). This process ran very smoothly in Pretest 1. NORC plans to follow the same process for Pretest 2 and the main survey.

The worksheet mailing requires information about which firms screened in as eligible for the survey. The outcome of the screening interviews was captured electronically in NORC's SurveyCraft Telephone Number Management System (TNMS) and automatically transferred to the Case Management System (CMS). In Pretest 1, a programmer created the files for the worksheet mailing. This process ran smoothly, but NORC plans to automate it for Pretest 2.

For the incentive mailing, case information from the TNMS was saved in a file and emailed to Accounts Payable. Of the 71 pretest respondents who made it through the entire main interview questionnaire, 52 requested the $50 incentive option. This incentive choice was captured electronically in the TNMS during data collection. After pretest data collection ended, IT staff produced a file from the TNMS containing the information necessary for mailing a $50 check to each of the firms that selected this incentive option. This information included the owner's name and mailing address. Accounts Payable produced the checks and sent these, via interoffice mail, to the Mailing Center. The Mailing Center applied postage via postage meter and mailed the checks to respondents.
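A minimal sketch of the extraction step, with disposition and field names assumed for illustration, as the actual TNMS export layout is not described in this report:

import csv

def build_incentive_file(tnms_records, out_path="incentive_checks.csv"):
    # Keep respondents who completed the main interview and chose the $50 option.
    rows = [r for r in tnms_records
            if r["disposition"] == "COMPLETE" and r["incentive_choice"] == "$50"]
    fields = ["owner_name", "street", "city", "state", "zip"]
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        for r in rows:
            writer.writerow({k: r[k] for k in fields})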



9. Data Delivery

NORC delivered the following to the FRB for Pretest 1:

The format of the data delivery for Pretest 1 was driven by expedience and was not intended to resemble the format for the delivery of main survey data. During Pretest 2, NORC will work with FRB staff to define the requirements for main survey data delivery.

The FRB reported that the variable names assigned to the Pretest 1 data made it difficult to relate data items to questionnaire items. NORC will work with the FRB to revise the variable names for the next pretest.


10. Appendices


10.1 Appendix A: New or Revised Materials

A-1. Revised Letter from Project Director included in pre-screening mailing

A-2. Revised Letter from Project Director included in worksheet mailing

A-3. Job Aid #9: Institutional Look-Up


10.2 Appendix B. Interviewer Debriefing Meeting Materials

B-1. Interviewer Debriefing Meeting Agenda

B-2. Interviewer Debriefing Meeting Minutes

APPENDIX A-1
Revised Letter from Project Director Included in Pre-screening Mailing
The 2003 Survey of Small Business Finances Logo  A National Organization for Research at the University of Chicago Logo

[INSIDE ADDRESS]

Dear [RESPONDENT NAME]:

As the enclosed letter from Chairman Alan Greenspan explains, you are one of a select group of small business owners throughout the country being asked to participate in a study of the cost and availability of financial services.

Your participation is critical. Every part of the diverse small business community needs to be represented if policymakers in Congress are to have a complete picture of the economic health of small businesses in America today.

Expect a five-minute phone call from NORC next week. An NORC interviewer will contact you in the next week to determine your eligibility for the Federal Reserve Board's Survey of Small Business Finances. This call will be very brief - five minutes - and you will be asked for a few basic facts about your business, such as the number of employees and the end date of your fiscal year.

If your firm is selected for the main survey, we will contact you again and provide additional information.

Your confidentiality is assured. Participation is voluntary, and you can skip any question you choose not to answer. Let me emphasize that the information you provide will be kept strictly confidential. Data that identifies your firm will be accessible only to project staff at NORC and the Federal Reserve Board who have signed confidentiality agreements. Published data will be stripped of all identifying information.

Enclosed is a brochure containing more information about this scientific research study and about NORC. If you have questions, please call our toll-free hotline at 1-800-692-4192. For more information, visit our website at norc.uchicago.edu/SSBF or the Federal Reserve Board's website at FederalReserve.gov/SSBF. If you prefer, you can also send me an e-mail at [email protected].

Thank you for your assistance.

Sincerely,

Carol-Ann Emmons Signature

Carol-Ann Emmons

Project Director, Survey of Small Business Finances

National Opinion Research Center

Appendix A-2
Revised Letter from Project Director Included in Worksheet Mailing
The 2003 Survey of Small Business Finances Logo  A National Organization for Research at the University of Chicago Logo

[INSIDE ADDRESS]

Dear [RESPONDENT NAME]:

You recently spoke with an NORC interviewer about the Survey of Small Business Finances. Thank you for sharing your time and information. You received this package because your firm has been selected for the main survey. Your participation in the main survey will make it possible to inform policymakers about the credit needs and the availability of credit for businesses like your own. Even if you are not currently using credit, your responses can help to ensure that credit will be available to you in the future.

We value your participation. As a token of appreciation for your participation in the main interview, we invite you to choose either $50 or Dun and Bradstreet's Small Business Solutions® information package, which retails for $199.

The attached worksheet makes the interview go faster. Page 2 of this letter lists the materials included in this package. One of the items is your 11" by 17" SSBF worksheet. Ideally, we would like you to complete the worksheet prior to the interview. By completing the worksheet in advance, you will reduce the time required to complete the interview.

You may not need to complete the entire worksheet. Side 1 of the worksheet provides space to indicate the financial services you have used in the past year, and then name the financial institution(s) where you acquired these services. While the form is designed for all types and sizes of small businesses, many businesses only use one or two sources for one or two services. Side 2 provides space to record balance sheet information and indicates where you can find this information in your current tax records.

We will accommodate your schedule to complete the telephone interview. An NORC interviewer will be calling you soon to conduct the interview, which typically takes 30 to 45 minutes. While participation is voluntary and you may skip any question you choose, we encourage you to participate so that we can gain an accurate picture. We want to reassure you that your responses will be kept confidential. Because we understand that your time is valuable, your interviewer will work with you to arrange a time that is convenient for you. If necessary, the interview can be broken into shorter sessions.

After you have completed the interview, please use the enclosed prepaid envelope to return your worksheet to NORC. You may also fax records to 1-866-435-5637. If you have any questions about the study or need a different worksheet, please call our hotline at 1-800-692-4192. If you prefer, you can visit us online at norc.uchicago.edu/SSBF for more information, endorsements, and downloadable copies of the worksheets. You can also e-mail your questions to me at [email protected].

Again, thank you for participating in this important study. We look forward to speaking with you soon.

Sincerely,

Carol-Ann Emmons Signature

Carol-Ann Emmons

Project Director, Survey of Small Business Finances

National Opinion Research Center


LIST OF ENCLOSURES

Please find the following items in your SSBF folder:

  • Letter from Chairman Alan Greenspan, Board of Governors of the Federal Reserve System
Explains the importance of participating.
  • Frequently Asked Questions
Answers questions you may have about the study.
  • Worksheet (11" X 17")
Simplifies and speeds up the interview. Provides space on side 1 to record the financial services you used and the sources of these services, and space on side 2 to record data from your tax records.
  • The SSBF and how it will be used
Provides a detailed description of how the data are used, with examples from previous rounds of the study.
  • The Board of Governors of the Federal Reserve System
Describes the members and responsibilities of the Board of Governors.
  • Dun and Bradstreet Small Business Solutions® brochure
Describes the information package you may choose as an alternative to receiving $50 as a token of our appreciation.

As a token of appreciation for participating, please accept either
fifty dollars ($50)
or
Dun and Bradstreet's Small Business Solutions® information package,
which retails for $199.

When we call to conduct the interview, we will ask which gift you prefer. For more information about the information package, see the enclosed brochure, or view examples of the available reports by visiting D&B's Small Business Solutions® at www.dnb.com/smallbusiness/ssbf.

Appendix A-3
Job Aid # 9
INSTITUTION LOOK UP

The revised institution lookup program is designed for you to first submit a branch query and then, if necessary, a bank query.
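A minimal sketch of that two-stage logic, with field names and fallback criteria assumed for illustration, since the CATI program's actual matching rules are not specified in this job aid:

def institution_lookup(branches, city="", state="", zip_code="", bank_name=""):
    # Branch query: use everything the interviewer collected to find an
    # exact match down to the branch level.
    def hits(criteria):
        return [b for b in branches
                if all(b.get(k, "").lower() == v.lower()
                       for k, v in criteria.items() if v)]

    branch_hits = hits({"city": city, "state": state,
                        "zip": zip_code, "bank_name": bank_name})
    if branch_hits:
        return branch_hits
    # Bank query: fall back to a bank-level search on name and state only.
    return hits({"bank_name": bank_name, "state": state})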

BRANCH QUERY

The branch query runs when you press <F4> the first time. All the information you've been able to collect - City, State, Zip and Bank Name - appears in the data fields and will be used by the search program to try to find an exact match right down to the branch level.

The branch query has a second step that is automatically performed as needed.

BANK QUERY

NEW KEY


Appendix B-1
SSBF Pretest 1
Debriefing Agenda
April 13, 2004
Module Schedule Topic, Moderator and Note-Taker
1 9.00 - 9.15 Introduction and Ground Rules

Bill Sherman

2 9.15 - 9.35 General Observations and Suggested Topics

Carol Emmons (Bill as note-taker)

3 9.35 - 9.50 Building on Strengths: Identifying What Works and Why

Carol Emmons (Bill as note-taker)

4 9.50 - 10.35 Cooperation, Refusals and Gatekeepers

Bob Bailey (Terri as note-taker)

  10.35 - 10.45 Morning Break
5 10.45 - 11.10 Handling Unusual and Difficult Situations

Mireya Dominguez (Bob as note-taker)

6 11.10 - Noon Main Questionnaire Issues

Terri Kowalczyk (Carol as note-taker)

  Noon - 12.45 Lunch
7 12.45 - 1.00 Screener Issues

Terri Kowalczyk (Bill as note-taker)

8 1.00 - 1.30 Using the Institutional Look-up

Bob Bailey (Mireya as note-taker)

9 1.30 - 1.40 Fading Respondents: Regaining Cooperation

Terri Kowalczyk (Bob as note-taker)

10 1.40 - 2.10 TNMS

Mireya Dominguez (Bob as note-taker)

  2.10 - 2.25 Afternoon break
11 2.25 - 2.35 Job Aids

Mireya Dominguez (Terri as note-taker)

12 2.35 - 2.50 Contact Materials

Bill Sherman (Carol as note-taker)

13 2.50 - 3.00 Making the Most of Incentives

Terri Kowalczyk (Mireya as note-taker)

14 3.00 - 3.20 Interviewer Training and Training Materials

Carol Emmons (Terri as note-taker)

15 3.20 - 3.35 Other Questions and Topics

Bill Sherman (Mireya as note-taker)

16 3.35 - 3.45 Wrap Up

Carol Emmons


Appendix B-2
 A National Organization for Research at the University of Chicago Logo
Memorandum

April 29, 2004

To: John D. Wolken

From: Carol Emmons

Re: 2003 Survey of Small Business Finances

Minutes from Pretest 1 Telephone Interviewer Debriefing

Participants:

Federal Reserve Board: Traci Mach, John Wolken

NORC: Bob Bailey, Mireya Dominguez, Carol Emmons, Terri Kowalczyk, Bill Sherman, NORC Telephone Interviewers, and Phil Panczuk (by phone).

Minutes:

1. Introduction and Ground Rules

Bill Sherman opened the meeting and explained the ground rules. Introductions were accomplished by having each person in the room say their name and tell something interesting about himself or herself or about a recent vacation. Bill emphasized the importance of open and honest participation. He noted that the purpose of the debriefing is to evaluate the survey, not the interviewers.

2. General Observations and Suggested Topics for the Day

Carol Emmons led this portion of the meeting. She explained that the purpose was to identify topics that the interviewers wanted to make sure were discussed during the debriefing. The topics the interviewers identified were the following:

Main Interview:

TNMS Set Up:

Training:

Survey Procedures:

Screener:

3. Building on Strengths: Identifying What Works and Why

Carol Emmons also led this portion of the meeting. The purpose was to identify the things that interviewers found worked well during the first pretest. The interviewers identified the following items:

4. Cooperation, Refusals and Gatekeepers

Bob Bailey led this portion of the meeting. The purpose was to identify the main challenges involved in gaining the cooperation of gatekeepers and respondents during the pretest. Another purpose was to identify and share solutions for overcoming gatekeeper and respondent objections. This session covered a wide range of topics related to gaining cooperation.

Screener vs. Main Interview. In general, the interviewers found it easier to gain cooperation with the main interview than with the screener. It appears that once the owner or proxy completes the screener, he or she is willing to "go the rest of the way" and complete the main interview. Exceptions are owners for whom English is not their first language, owners who work directly serving customers, and owners who work in noisy establishments (e.g., auto repair shops).

Interviewers felt, however, that the time between the screener and main interview should be made as short as possible to take advantage of the rapport developed during the screener. Interviewers also felt that some owners were "turned off" by the worksheets because they look complicated and time-consuming to complete.

Survey Introduction. The interviewers all agreed that the screener introduction needed to be changed. They found that simply asking to speak to the owner of the business was much more effective than first introducing themselves and the study to a gatekeeper. The interviewers recommended that we postpone the introduction until we are speaking with the person we want to interview.

There were differences of opinion among the interviewers about whether it was harmful to mention the length of the interview in the introduction. One interviewer noted that respondents sometimes held him to that amount of time. One interviewer said that a strategy he found useful was to ask the respondent for five minutes of his or her time, rather than telling the respondent how much time was needed. Another interviewer thought it better to be vague, i.e., "This will only take a few minutes."

3-attempt Rule to Reach the Owner. Interviewers sometimes found the "3-attempt rule" to reach the owner to be a significant challenge during screening. In some instances, it was clear on the first call that the interviewer would never speak to the owner, yet the CATI system required three attempts to reach the owner before it allowed the interviewer to screen a proxy respondent. The interviewers felt that there needs to be a way to circumvent the 3-attempt rule when circumstances demand it.

Proxy Respondents. For some types of small businesses, such as physicians' and dentists' offices, interviewers found that there was no point in asking to speak to the owner, because the physician or dentist would never take the call. One interviewer noted that in such instances, he would say to the gatekeeper, "I want to make sure the owner knows what is going on, so please talk to the owner about this, and I will call back." Interviewers also felt that having the owner's title (i.e., Mr., Ms., Dr.) would allow them to ask for the owner in the proper way, e.g., "May I speak with Dr. So-and-so?"

One interviewer made the distinction between owner-designated proxies and self-designated proxies. This interviewer thought it would be helpful to have different introductions for different types of proxy respondents.

Reasons for Refusals. Interviewers mentioned the following as the most common reasons for refusing to do the screening interview:

Strategies for Gaining Cooperation. Interviewers mentioned the following strategies for gaining cooperation:

Changes to Help Interviewers Gain Respondent Cooperation. The interviewers identified the following changes they would like to see made, in the questionnaire, call management system, and survey procedures, to help them gain respondent cooperation with the survey:

Collecting Respondent Email Address. When asked how respondents reacted to being asked for their email address, interviewers commented that respondents often asked how this information would be used. Interviewers were confused about how to code a "no" response, i.e., as "no email address," or as a refusal to give the email address.

5. Unusual and Difficult Situations

Mireya Dominguez led this portion of the discussion. The purpose was to identify unusual situations that came up during pretest data collection and that interviewers felt ill-equipped to handle. The following situations were noted:

6. Main Interview Questionnaire

Bill Sherman led the discussion for this topic. The purpose was to identify questions or skip logic in the main interview questionnaire that interviewers found problematic. The interviewers made the following general comments about the main interview questionnaire.

The interviewers also made the following comments about specific sections in the questionnaire:

Section B: Organization Demographics
Question Comment
B3 If the number of owners entered at A10.1 is "2," CATI will not accept "sole proprietorship" as the firm type.

Section C: Personal Characteristics of Owners
Question Comment
C2 When a husband and wife are co-owners, either may be reluctant to name a majority owner. In this situation, some interviewers found it helpful to say, "Let's just start by talking about you."
C30 One interviewer thought this question (which asks if the firm is publicly traded) should be asked earlier because earlier questions sometimes make the answer obvious.
C32 Change QxQ to explain that we are looking to identify the owner who has owned the firm the longest time, and to find out when that owner took ownership.

Section F: Use of Credit and Financing
Question Comment
General Respondents rarely mentioned additional institutions after completing sections E and F.
F32.1 Since vehicle loans are usually collateralized by the vehicle, the vehicle should be the first response option.

Section MRL: Most Recent Loan
Question Comment
General Questions seem redundant with earlier questions in sections E and F.

Section G: Use of Other Financial Services
Question Comment
G11 Respondents confuse the bank with the bank holding company that provides them with credit card processing services. Interviewers should be warned about this in training.

Section L: Trade Credit
Question Comment
General Smaller firms are unfamiliar with the term "trade credit." Most refer to this as "having an account." This term and the term "invoices" should be added to the QxQ.

Section M: New Equity Investments in Firm
Question Comment
General The QxQs need to be written in shorter sentences.
READ 27 Needs to be shortened.
M1 Suggest re-wording as, "Did someone invest in your company?"

Section P: Income and Expenses
Question Comment
P6 and P8 One interviewer thought that using negative numbers to indicate a loss may be error-prone.

Section R: Assets
Question Comment
R2 Needs to be changed in CATI to match hardcopy version of questionnaire.

7. Screener Issues

This section was led by Terri Kowalczyk. The purpose was to identify problems and issues with the screening questionnaire. The following issues were identified by the interviewers:

8. Institution Look Up

This section was led by Bob Bailey. The purpose was to get interviewer reactions to the institution look-up function in the main interview CATI questionnaire. Interviewers reported that they could usually find the branch the respondent used in the look-up table. The response time for the look-up was good. Finally, a surprising number of respondents knew the zip codes of their banks, but fewer knew the zip codes of non-depository institution sources.

9. Fading Respondents, Regaining Cooperation

Due to earlier sections taking longer than planned, this section was skipped. However, interviewers reported that most respondents were willing to complete the interview once they started it. (See also module 4, Cooperation, Refusals and Gatekeepers.)

10. Telephone Number Management System (TNMS)

This session was led by Mireya Dominguez. The purpose was to discuss possible improvements to the TNMS to facilitate the interviewers' job. Interviewers made the following suggestions:

11. Job Aids

This discussion was led by Mireya Dominguez. The purpose was to identify necessary improvements to the interviewer Job Aids designed for Pretest 1 as well as to identify additional job aids that interviewers thought would be helpful.

Regarding existing job aids, the interviewers suggested adding the breakpoint function to Job Aid #7: CATI Functions.

They also suggested revising the answering machine script as follows:

Interviewers also suggested developing the following two new job aids for the main study:

Finally, the interviewers commented that the job aids were not very accessible in the binder. Job aids need to be posted at the interviewing station.

12. Contact Materials

Pre-screening Materials. Interviewers reported that these were not very memorable to respondents. They agreed that putting the Federal Reserve Board seal on the envelope would cause respondents to pay more attention to these materials. They also agreed that the advance letters seemed to help if the respondent recalled receiving them. Finally, the interviewers liked the revised version of the project director letter that will be used for Pretest 2.

Worksheet Materials. As described above, the interviewers reported that some respondents were put off by the worksheets because they look long and complicated. Accountants seem to like the worksheets.

13. Incentives

Terri Kowalczyk led this discussion. The purpose was to collect interviewers' impressions of how helpful the incentives were in gaining respondent cooperation.

Some interviewers thought the incentives were very helpful in gaining respondent cooperation; others felt that the incentives did not make much difference. Nobody thought the incentives were harmful. A few respondents declined the incentive. The $50 appeared to be more popular with respondents than the Dun & Bradstreet Small Business Solutions package. No respondents expressed concern about their names being sold by Dun & Bradstreet. The interviewers agreed that there need to be rules about who receives the incentive if a proxy completes all or part of the interview.

14. Interviewer Training and Training Materials

Comments and suggestions about interviewer training were made throughout the discussion of other topics. These comments are summarized in this section.

Using the TNMS:

Using CATI

Mock Interviews

Eligibility Criteria

Main Interview

Continuous Training



Footnotes

1. These descriptions are based upon verbal understandings of the InfoUSA match procedures. However, InfoUSA reviewed and agreed to a set of specifications based on these descriptions. In fact, it is clear from the analysis below that the loose match and 6-digit match did not resemble these descriptions. Return to Text
2. The difference may reflect different compositions of the samples. The pretest was evenly divided among ownership types, generating a greater percentage of larger establishments than in the main study. Large establishments are more likely to be in business. Return to Text

This version is optimized for use by screen readers. Descriptions for all mathematical expressions are provided in LaTeX format. A printable pdf version is available. Return to Text