LfJ Series: Inclusive Data Streams: How Do Our Students Experience School?
This is the second episode in our 4-part Leading for Justice series. In the last episode, we talked about the impacts of shared leadership and the research on why we should invest in this approach to leadership. In this episode, we're talking about how students experience school and how we measure student perceptions and experiences. We have to ask our students how they experience school instead of assuming we already know what they are thinking or feeling.
What do you want to measure?
First, it’s important to consider the constructs you want to measure as you select a survey or design your own. For example, how do you take the construct of belonging and actually turn it into a scale (i.e., a series of survey questions) that accurately measures a student’s sense of belonging? It's far more accurate to have 3-5 questions that ask about different aspects of belonging, because asking just one question like, “Do you feel like you belong at school?” could lead students to interpret “belonging” in different ways and respond based on just one aspect of it.
Let’s look at an example. Panorama has a great student survey (it’s available on their site for free). Within this survey, they have several scales (topics), one of which is belonging. Their belonging scale consists of five questions that all measure belonging.
The survey builds a more complete understanding of what belonging is through this series of questions. It also enables us to look at the results and see differences in specific aspects of belonging. For example, I’ve worked with a number of school districts whose students reported feeling connected to adults, but not feeling respected by students.
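To make the scoring concrete, here is a minimal sketch of how multi-item scale data can be summarized. All of the numbers and the three-student dataset below are invented for illustration; this is not Panorama's actual scoring method, just the common approach of averaging item responses.

```python
# Hypothetical example: computing a scale score from multiple survey items.
# Each student answers five belonging items on a 1-5 scale. The scale score
# is the mean of a student's item responses, and per-item means across
# students reveal which aspect of belonging is strong or weak.
from statistics import mean

# Rows are students; columns are the five belonging items (1 = low, 5 = high).
# Invented data for illustration only.
responses = [
    [4, 5, 4, 2, 3],
    [5, 5, 4, 1, 2],
    [3, 4, 5, 2, 3],
]

# One scale score per student: the mean of that student's item responses.
scale_scores = [mean(student) for student in responses]

# One mean per item across all students: shows which aspect needs attention.
item_means = [mean(item) for item in zip(*responses)]

print(scale_scores)  # per-student belonging scores
print(item_means)    # per-item averages across students
```

Looking at `item_means` side by side is what surfaces patterns like the one above: a scale average can look healthy while one item (say, feeling respected by peers) lags well behind the others.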
So, define the construct(s) you want to measure. If you’re creating your own survey, I find it helpful to do some research. What does the research say? What are the pieces of each of these constructs? How do researchers define it? Scholars have already gone through the process of curating all of the theory and research that's out there in their literature reviews. This makes them good sources of thoughtfully detailed, yet succinct definitions.
Curate Scales from Existing Surveys
If you are thinking about creating your own survey, you can borrow from existing surveys. I would pull complete scales (all items that measure a topic), rather than picking individual questions you like. To start, you can see if you want to borrow one or more of my scales (the personal, interpersonal, or organizational student leadership capacity building scales which you can get for free at the bottom of this post) or use one or more of Panorama’s scales.
I definitely encourage you to add a few open-ended questions too. While they are time intensive to sort through and analyze, open-ended questions are a great way to ask students for more detail about their item/scale responses or to invite them to share what actions the school could take to improve in a particular area (or what the school is doing well!). These are great questions to create on your own, as you can personalize them for your context and the things you specifically want to learn about.
Principles of Survey Design
Now, let’s briefly talk about how to choose a quality survey or design your own quality survey. I’m going to walk us through some principles of survey design. Of course, you can use research reports which should include statistics that reflect how reliable and valid the survey is. However, reviewing some general principles feels more immediately practical than advising you to dive into the research data for each survey that’s out there.
Panorama Education’s research team, under the leadership of Dr. Hunter Gehlbach, has created a wonderful survey design checklist. I’ve highlighted a few of their recommendations and added some thoughts of my own below.
Survey Items (i.e., Questions)
The survey should use scales, not single items, to measure a construct. This means students respond to multiple items on the same topic instead of responding to one question on a topic.
Avoid negative wording. A negatively worded question is one in which a response that “disagrees” is a positive thing. For an item like, “I rarely feel like I belong,” we would say it’s a good thing for students to “strongly disagree.” However, it’s cognitively challenging for respondents to switch gears like this, and it’s harder to compute a score for the scale or survey because you will need to “reverse code” responses to these types of questions.
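If you do end up with a negatively worded item, reverse coding is a simple arithmetic flip. Here is a minimal sketch for a 1-5 response scale (the scale width and the example item are assumptions for illustration):

```python
# Hypothetical example: reverse-coding a negatively worded survey item.
# On a 1-5 agreement scale, "strongly disagree" (1) with "I rarely feel
# like I belong" is actually the most positive response, so the item
# must be flipped before averaging it with positively worded items.
SCALE_MAX = 5  # highest point on the response scale

def reverse_code(response: int) -> int:
    """Flip a 1..SCALE_MAX response so 1 becomes 5, 2 becomes 4, etc."""
    return SCALE_MAX + 1 - response

# A student who strongly disagrees (1) with the negative item earns the
# top score (5) after reverse coding; the midpoint (3) stays put.
print(reverse_code(1))  # 5
print(reverse_code(3))  # 3
```

Even with this fix available, keeping all items positively worded in the first place spares both your respondents and your spreadsheet.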
Students should be asked to respond using a continuum of answers. This is ideal because you can then measure a more precise degree of favorability, agreement, or frequency (whatever it is you’re asking about).
All response options should be labeled with words. Asking students to respond on a scale from 1 to 5 without saying what each number means is going to result in imprecise data.
Survey Length and Question Order
Keep it short. I aim for the survey to take no more than 10 minutes to complete.
Ask the most important questions early in the survey. Survey fatigue is real. Make sure the most important questions are asked up front. Do not put demographic questions at the front of the survey; they should come at the very end. It’s important to ask about the survey topic (e.g., belonging, leadership opportunities) immediately, but you also want the survey to flow like an engaging conversation.
Implementing the Survey
Communicate the importance of the survey (i.e., it will help us improve your experience at school—and back it up by taking action on the results!). I suggest using the introductory/consent page of the survey to highlight why this is important and how you plan to analyze and take action based on the resulting data.
Dedicate time within the school day to take the survey. If you’re surveying family members, dedicate time or space during an open house night to invite families to complete the survey. Finally, remind stakeholders to take the survey a few times during the “survey open” window.
Many schools wonder how often to give out a survey (especially as they seek to measure growth over time). Survey stakeholders as often as you are able to take action on the results. That might look like once in the fall and once in the spring. We don’t want to survey students every month if we're not thoughtfully analyzing and taking action on that data every month. Asking students to fill out a survey and then never acting on the responses is just as bad as never giving a survey in the first place.
Those are the basics of survey creation/evaluation and implementation. I'm so excited to see the survey you select or create and the data that you get from it! Please feel free to share in the comments below. Next week we're continuing with the series with an episode on how to create shared governance structures. The following week, we’ll be talking about data analysis through an adaptive leadership lens! If you’re not already subscribed to the podcast, now’s the time to do it, so you don’t miss the rest of the episodes in the series.
Thanks for reading! Continue the conversation below in the comment section and join our community of educational visionaries on Instagram, LinkedIn, and Facebook. Until next time leaders, continue to think big, act brave, and be your best self.
Lindsay Lyons (she/her) is an educational justice coach who works with teachers and school leaders to inspire educational innovation for racial and gender justice, design curricula grounded in student voice, and build capacity for shared leadership. Lindsay taught in NYC public schools, holds a PhD in Leadership and Change, and is the founder of the educational blog and podcast, Time for Teachership.