More Information

Do students tell the truth on surveys?

Decades of research suggest that self-reports of alcohol use are an accurate source of data about drinking. Many surveys have been administered to assess drinking behavior, and studies have examined the validity, reliability, and effectiveness of these surveys. This research has shown that self-reported data on alcohol use are valid. In simple terms, this means that the survey questions regarding alcohol use accurately measure the behaviors they are intended to measure. In scientific terms, validity is demonstrated by showing that self-reported alcohol use is highly correlated with other measures of alcohol consumption, including:

  • reports by friends
  • statistics on alcohol-related problems
  • scientific testing

In addition, self-reports of drinking have been found to be reliable through test-retest studies (Sobell et al., 1988). That means that individuals respond consistently to questions about their drinking when asked at multiple points in time.

What has been done to ensure that Marist students tell the truth on this survey?

Research has shown that when people feel that their responses to survey questions are anonymous, they tend to respond honestly (Nurco, 1985). There is overwhelming evidence that students tell the truth about sensitive topics, such as alcohol use, when taking anonymous and voluntary surveys. The SCANB is completely anonymous: it is administered through the U.S. Mail and does not include any identifying markings on the survey instrument or reply envelope that would reveal a student's identity. All students are informed of the survey's anonymous and voluntary nature. Because it is a mail survey, students can complete the SCANB in privacy.

Won't some students lie as a joke or to mislead the researchers?

While some students may lie to mislead researchers, research has shown that students who take the time to respond tend to provide truthful responses. A small percentage of students may choose to exaggerate their responses; such students are generally not subtle, and their surveys are therefore easy to identify. To account for such responses, EDC researchers do a thorough "cleaning" of the data prior to any analyses to determine whether any students have provided impossible or illogical responses. These surveys are discarded. A small percentage of students may also under-report their behaviors, particularly if they feel their behaviors are socially undesirable. Surveys of this type are harder to detect, but the under-reporting tends to balance out the over-reporting that may also exist.

Was the data manipulated to support a cause?

The SCANB data is presented in its raw form with no statistical analyses. Only frequencies (the number of times each response occurs) are reported. The only exception is a cleaning of the data to determine whether any impossible responses have been given (e.g., a student states that he or she consumed 99 drinks on the last occasion when using alcohol). Other than that, what you are seeing is truly Just the Facts!
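For readers curious about what the cleaning step described in the two answers above might look like in practice, here is a minimal sketch in Python. The column names, the cutoff value, and the specific checks are assumptions invented for this illustration, not EDC's actual procedure.

```python
# Minimal sketch of the kind of data cleaning described above.
# Column names and thresholds are hypothetical; EDC's actual
# procedure is not specified in this FAQ.
import pandas as pd

MAX_PLAUSIBLE_DRINKS = 30  # assumed cutoff for an "impossible" response

def clean_responses(df: pd.DataFrame) -> pd.DataFrame:
    """Discard surveys containing impossible or illogical answers."""
    # Impossible value: e.g., 99 drinks on the last drinking occasion.
    impossible = df["drinks_last_occasion"] > MAX_PLAUSIBLE_DRINKS
    # Illogical combination: reports drinks but says he/she never drinks.
    illogical = (df["drinks_last_occasion"] > 0) & (df["ever_drinks"] == "no")
    return df[~(impossible | illogical)]

def report_frequencies(df: pd.DataFrame, column: str) -> pd.Series:
    """Report raw frequencies only (no further statistical analysis)."""
    return df[column].value_counts().sort_index()
```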

My school has strong reasons for wanting to portray drinking among students as moderate. How can I believe this data?

The survey was not conducted by Marist College. It was conducted by an outside institution, Education Development Center, Inc. (EDC), a highly regarded national public health research institution. EDC uses rigorous sampling and research techniques and conducts the survey from its offices in Newton, Massachusetts, following a strict methodology. EDC must follow guidelines for research set by the National Institutes of Health and must adhere to principles for protecting the rights of research participants (in this case, students who are asked to complete the survey). This means that students are made aware of the survey procedures, including the fact that it is both anonymous and voluntary, as well as any potential risks and benefits of completing the survey (both of which are minimal).

How can the results be accurate when only 300 students were surveyed?

One of the most fascinating aspects of survey research is that a surprisingly small number of people need to be surveyed to accurately represent a large population. Did you know that when you hear about opinion polls on the news, such as those determining which candidates are leading in pre-election times, often a sample of only 1,000 voters is used to describe the leanings of the entire nation (Newport, Saad, & Moore, 1997)?

As long as you select a random sample of students to complete a survey, then you can legitimately survey only a small number of students and still get results that are accurate to within a few percentage points in either direction. That is why only 300 students were randomly selected from among all current undergraduate students to participate in this survey. (To put the size of this sample in perspective, the nationally known Harvard School of Public Health's College Alcohol Study uses even smaller samples. Some schools that participate in the Harvard Study collect data from as few as 75 students.)
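To make "a few percentage points" concrete, here is the standard margin-of-error calculation for a simple random sample, assuming the worst-case proportion of 0.5 and a 95% confidence level. This is a textbook illustration of the claim above, not a figure taken from EDC's methodology.

```python
# Rough margin-of-error calculation for a simple random sample,
# using the standard 95% confidence formula with the worst-case
# proportion p = 0.5.
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Half-width of a 95% confidence interval for a proportion."""
    return z * math.sqrt(p * (1 - p) / n)

print(f"n = 300:  +/- {margin_of_error(300):.1%}")   # about +/- 5.7 points
print(f"n = 1000: +/- {margin_of_error(1000):.1%}")  # about +/- 3.1 points
```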

Furthermore, even though the sample size appears small, a random sample means that the results will be representative of students at your school. In fact, a random sample of 300 students will produce much better results than a convenience sample of 1,000 (or even more) students, such as a survey administered to students in particular classrooms or at a table in a campus building, a procedure used by many schools. This is because the students in the convenience sample may share common characteristics that skew or bias the data.
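The toy simulation below illustrates this point. Every number in it is invented purely to show the mechanism: if the students a convenience sample happens to reach drink more often than the campus as a whole, a convenience sample of 1,000 will miss the true campus-wide rate by far more than a random sample of 300.

```python
# Toy simulation of random vs. convenience sampling. All rates and
# group sizes here are made up to illustrate the bias mechanism;
# they are not estimates of actual student behavior.
import random

random.seed(42)

# Hypothetical campus of 5,000 students (1 = drank in the past week).
# The 2,000 students who pass the survey table drink more often
# than the other 3,000 (made-up rates of 70% vs. 40%).
near_table = [1 if random.random() < 0.70 else 0 for _ in range(2000)]
elsewhere = [1 if random.random() < 0.40 else 0 for _ in range(3000)]
population = near_table + elsewhere

true_rate = sum(population) / len(population)

# A random sample of 300 drawn from the whole campus.
random_sample = random.sample(population, 300)

# A convenience sample of 1,000 drawn only from students who
# happen to pass the table.
convenience_sample = random.sample(near_table, 1000)

print(f"True campus rate:       {true_rate:.1%}")
print(f"Random sample (n=300):  {sum(random_sample) / 300:.1%}")
print(f"Convenience (n=1,000):  {sum(convenience_sample) / 1000:.1%}")
```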

Not all students responded to the survey. How do we know that those who did respond portray the school accurately?

When the response rate for a survey is high, the results accurately portray the population being surveyed. The response rate for the SCANB is very high in comparison with other surveys of college students. Generally, college-based researchers report response rates of 20 to 60 percent for student surveys similar to the SCANB. The response rates for Marist College were 52% in spring 2001, 53% in spring 2002, 44% in spring 2003, and 46% in spring 2004, which places them at the upper end of that range. By comparison, Gallup, Roper, Louis Harris, and other major polling organizations usually produce response rates of about 35% for their telephone polls.

This data was collected during the spring 2001, 2002, 2003, and 2004 semesters. Is it still accurate?

Most survey research looking at trends in health behaviors is conducted on a biennial or triennial basis (i.e., every two to three years). Our nation's census is conducted only every 10 years! By these standards, data from one, two, three, or four years ago is still considered accurate. Given the amount of time it takes to process and report data after it has been collected, this is often as current as possible.