Frequently Asked Questions

  1. What is this study? 

The MASDER (Motivational Attitudes in Statistics and Data Science Education Research) team is studying attitudes in statistics and data science education. Since 2016, we have been developing a family of instruments to measure student attitudes and instructor attitudes in introductory statistics and data science courses at the undergraduate level. We are also collecting data about characteristics of the learning environment. These instruments will be used to collect nationally representative data and link it with measures of achievement.

  2. What is happening in Fall 2023 / Spring 2024? Why am I seeing this?

In Fall 2023 and Spring 2024, we are collecting data from a nationally representative sample of introductory statistics courses and data science courses: YOU have been selected to participate! This is the last and most critical phase of the project, and we hope you will agree to participate by sending surveys to your students and completing a few surveys about your attitudes and courses yourself.

  3. Why was I selected for this study?

In order to collect data that is widely representative of undergraduate introductory statistics and data science courses in the US, we drew a stratified random sample of such courses and instructors. Your institution, your course, and you, as an instructor, were randomly selected to participate in this study. The sampling frame was determined based on data from the College Scorecard and the Carnegie Classification systems.

After drawing these random samples of statistics and data science courses and instructors, we also extended the invitation to participate to instructors via various listservs.

  4. Why should I agree to participate in this study?

Your specific course matters! Our strata were created to capture a variety of educational experiences, structured around highest degree offered, minority-serving institution status, and school selectivity. Without you and your course, your type of institution will be underrepresented in the study, so we need your help to reduce non-response bias. Participation in this study will help improve teaching methods and materials in statistics and data science by supporting the creation of instruments that measure important aspects of the motivation and attitudes of students, instructors, and learning environments that impact student learning and retention.

After the term ends, you will receive a customized report showing how students’ attitudes in your class(es) compare to the nationally representative data. 

Also… if you participate, you will receive a $125 Amazon gift card.

  5. What am I being asked to do? What is the time commitment?

This study takes place during one class term, such as a semester or quarter. Prior to the start of your term, you will be asked to complete some surveys about your teaching and your course. Within the first week of the term, you will give your students a link to the student attitudes survey. (Students can take the survey outside of class time.) At the end of your term, you will again administer the survey to your students and complete one more survey about your class.

There is an additional $25 gift card available for opting into an optional component of the study: administering a content assessment to your students at the end of your term. (The content assessment must be assigned as a required class assignment.)

As an instructor, you will spend about 45 minutes in total completing surveys, and about 30 minutes total sending surveys to your students. Students will spend about 30 minutes total completing attitudes surveys. If your students also complete the content assessment, it will take them about 50-70 minutes. 

  6. Who is sponsoring this study? Who are the researchers?

The MASDER study is funded by the National Science Foundation under Grant No. DUE-2013392, through the NSF's Improving Undergraduate STEM Education (IUSE) program. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation or our host institutions.

You can read more about the MASDER research team on our website.

SOMA Surveys

  1. What surveys are being administered? How long do the surveys take to complete?

We are developing a family of six instruments, the Surveys of Motivational Attitudes (SOMA). The focus of each instrument is indicated by a prefix (S for Student, I for Instructor, E for learning Environment), and the discipline is indicated by a suffix (S for Statistics, DS for Data Science).

  • The student instruments measure student attitudes about learning statistics or data science, respectively. 
  • The instructor instruments measure instructor attitudes about teaching statistics or data science. 
  • The environment survey collects information on the course learning environment and is completed separately for each distinct course that is participating in the study. 
                | Student Instrument                                         | Instructor Instrument          | Environment Inventory
Statistics      | S-SOMAS                                                    | I-SOMAS                        | E-SOMAS
Data Science    | S-SOMADS                                                   | I-SOMADS                       | E-SOMADS
Timeline        | Students complete Pre and Post surveys in one course term | Instructors complete one time  | Instructors complete Pre and Post surveys in one term
Time Commitment | 15 minutes each time                                       | 15 minutes                     | 10 minutes for Pre; 20 minutes for Post*

*Additional time if you have multiple courses participating.

  2. What kinds of questions are asked on the surveys?

The student surveys gauge respondents’ attitudes about learning statistics or data science, and the instructor surveys gauge respondents’ attitudes toward teaching statistics or data science. These items are answered using a 7-point Likert-type disagree/agree scale and are designed to measure several components of attitudes and motivation aligned with an established psychological framework. They also include demographic questions.

  • Some example student survey items are:
    • Understanding statistics gives me a sense of satisfaction. [S-SOMAS]
    • I value statistics because it makes me an informed citizen. [S-SOMAS]
    • I am disappointed in myself when my code doesn’t work. [S-SOMADS]
    • I get excited to share things I have learned from data. [S-SOMADS]
  • Some example instructor survey items are:
    • I am confident in my ability to help students learn statistics. [I-SOMAS]
    • I teach statistics to prepare students for successful careers. [I-SOMAS]
    • Getting students to develop computational/algorithmic thinking is difficult. [I-SOMADS]
    • Teaching data science makes me better at doing data science. [I-SOMADS]

The environment surveys include questions to gather information about the characteristics of the course and the way it is being taught (i.e., the learning environment and pedagogy). Some example environment survey items are:

  • What is the mathematical prerequisite for this course?
  • How many students do you expect to enroll in this class?
  • In which modality is your class being offered? [choices provided]
  • Do you have lab/recitation/discussion sections for this course?
  3. Why do we need new attitude surveys when some already exist?

While some surveys for measuring student attitudes about statistics exist (such as the SATS-36), the field of statistics education has advanced substantially since they were released. Data science is a young discipline, and no surveys exist with published evidence supporting their use. Surveys of instructors’ attitudes toward statistics and data science also have not previously been developed: existing surveys focus on pedagogical practices or self-efficacy. We are developing a cohesive set of survey tools that can be used to examine the entire learning environment, so we can understand how student attitudes are related to instructor attitudes.

Study Logistics

  1. What are the criteria for receiving a gift card?

There are two levels of participation that receive gift cards: $125 for the core study, or $125 plus an additional $25 for the optional content assessment.

In order to receive a $125 Amazon gift card for participating in the study, the following criteria must be met by the end of your course term:

  • Complete the instructor attitudes survey
  • Complete the Pre and Post learning environment inventory
  • Administer the Pre and Post student attitudes survey to your students and receive at least a 75% completion rate each time

If you additionally assign the content assessment at the end of the term and receive at least a 75% response rate, you will receive an additional $25 gift card.

  2. How will I access all the surveys?

We have developed a web portal that will be used to access the surveys. After completing the Google Form, instructors who wish to participate will be sent the URL for the portal. 

  3. How should I assign the surveys to my students?

Via the web portal, you will receive links to the surveys to share with the students in your course(s). We will also provide you with a script to use when communicating with students to solicit their participation. We strongly recommend making the attitude surveys a class assignment and giving students credit for participation, because this helps to reduce non-response bias. Assigning the survey for credit is not required; you could also assign it for extra credit or no credit. However, assigning it for credit is the most reliable way to receive a high response rate.

  4. Can I get the students’ names for assigning credit?

Yes, you can get the students’ names. We will email you a list of names of students who completed the consent form so that you can give these students credit. Note: students do not have to complete the survey to be included on the list of names you are sent. There should not be any penalty for not giving consent.

  5. Will I be able to see my students’ results?

At the end of the term, you will receive a customized report showing aggregated results from your class with a comparison to national averages on the attitudes survey. At the end of the grant period (after October 2024), we will make de-identified survey data publicly available, but we are not able to share the raw data from any particular course for privacy reasons.

If you opt in to administering the content assessment, we will return the student scores to you, but not their specific answers to each question.

  6. Does this study have IRB approval?

This project has been approved by the Institutional Review Board (IRB) at California State University, Monterey Bay. If you would like to connect with your IRB before administering the survey, you may provide them with this folder that includes information they might need to determine your eligibility to administer the survey.

  7. Can any instructor participate in the Spring 2024 study?

Only courses at the undergraduate level that have been selected as part of our random sample can be included in this study. This means that only invited instructors can participate.  

  8. I participated in Fall 2023. Can I participate again in Spring 2024?

Yes, you can. You will be eligible for another gift card. You will need to complete all components of the study and receive at least a 75% response rate. However, you do not need to complete the instructor attitude survey a second time.

  9. I am teaching a statistics course and a separate data science course. Can both classes participate?

No, they cannot. Due to an overwhelming response, we are only able to offer one gift card per instructor.

Content Assessment

  1. What will I be asked to do if I agree to administer the content assessment?

Administering the content assessment is an optional add-on to this study. If your students complete the content assessment with at least a 75% completion rate, you are eligible for an additional $25 gift card.

If you agree to administer the content assessment, you will obtain a link via our website. We will provide you with a script to use when communicating with students to solicit their participation. You will need to assign the content assessment for credit during the last week of class in your term, and credit must be based on accuracy/correctness rather than completion. The content assessment will take students about 50-70 minutes to complete. After students complete the content assessment, our team will send you a list of students who completed it, along with their scores, so that you can assign credit.

  2. Why must the content assessment be assigned for course credit based on accuracy/correctness?

Unlike an attitudinal survey, a content assessment has correct and incorrect answers, so it is essential that students do their best work. Assigning credit based on accuracy gives students the buy-in needed to take the assessment seriously.

Contact

  1. Who should I contact if I have any questions or concerns?

Please contact Dr. April Kerby-Helm at Winona State University (akerby@winona.edu) if you have any questions about the project.