Editor's Note: Happy New Year! We are starting 2020 off with a new series of guest authors. We hope you enjoy this collaboration as much as we have enjoyed working on it. Our first guest author is Dr. Melissa Nantais from Michigan's Integrated Behavior and Learning Supports Initiative (MiBLSi).
I think I was drawn into the field of School Psychology because of the central role that data play in decision making and how I could be part of that work. It just made a lot of sense to me. I still enjoy the many hours I spend creating graphs related to academics and behaviors because they are used in our team meetings to help guide decision making. My favorite days of the school year were during school-level data review, where we systematically looked through reading, behavior, outcome, and fidelity data to inform our action plans for the coming school year. I was the proverbial “kid on Christmas morning” on each of these days, looking forward to the analysis of these data and the action planning that resulted.
I am proud to say that I come by this use of data for decision making quite honestly. My father is a retired Certified Public Accountant (CPA) and I used to love watching him use the adding machine and the larger ledger pages to review and balance budgets. On a recent visit with my parents I found further evidence of this family data-based decision trait: I went into the utility closet in my parents’ kitchen to get a light bulb and there, next to the box of new light bulbs, was a post-it note with data on it. My mother had recently switched to the LED light bulbs. They cost a little more but were supposed to last longer. She was keeping track of which light she had changed to LED and the date she had made the change in order to find out how long the lights actually lasted. Collecting data to make future purchasing decisions - I love it!
So, it should come as no surprise that I am a huge proponent of data-based decision making in schools. However, it turns out that not everyone shares my level of excitement when it comes to using school-wide behavior data for decision making. In fact, the sheer amount of data available can feel quite overwhelming. I hope to convince you that when teams are provided with a consistent set of questions to ask, and know the best sources of data with which to answer them, the process becomes much less overwhelming and, just maybe, a little fun! The goal is an efficient, effective, and equitable system for using data to make decisions and improve outcomes for students.
With anything that is implemented in schools, we strive to make its use efficient so that it can be effective. Researchers examined factors related to sustained implementation of School-wide Positive Behavioral Interventions and Supports (SWPBIS) beyond three years and found that a team's use of data during the first year of implementation is one of the strongest predicting factors.1
If we know data use is important, and that it should begin as soon as possible, how do we do it well? Developing a data "routine" is one way schools can become more effective and efficient in their use of data for problem solving when implementing SWPBIS. Recent research and practice help identify what a data routine might include. In a descriptive study of team problem solving using behavior data, researchers found that when teams had a routine of reviewing behavior data prior to their meetings, they were more precise in defining their problems.2
So, how do we build such a routine? Below, I suggest a way forward: a set of five questions teams can use to support data-based decision making.
Tip from the field: At the start of a school year, have the Data Analyst put a standing appointment in their calendar every 2-4 weeks to review the Data Integrity Report and address any problems that are flagged. Setting aside time for this work helps prioritize it, and it is easier to do before your calendar fills up once the school year is underway.
Question 1: Are our data accurate?
A common response when examining data, especially data that are not the most flattering, is to try to explain the data away. To help guard against this tendency, school teams should develop a data routine that starts with answering the question, "Are our data accurate?" Identifying one or two people who will set a routine [see sidebar below] to review the Data Integrity Report in SWIS and address any problems detected will go a long way toward building confidence in the accuracy of your school's discipline referral data.
Tip from the field: Consider developing a data routine that puts the school dashboard reports in front of the school staff each month. Having a routine where data are shared in the same format each month, much like instructional routines, helps lighten the cognitive load for staff who may be less comfortable reviewing data. Some schools put together a monthly infographic that is shared via email from the school's principal. Other school teams post a quick visual, with a statement or two describing the data in each dashboard report, in the same place each month (e.g., on a bulletin board above the copier or on the back of the stall door in the staff restroom - places all staff tend to visit throughout the day!).
Question 2: What is the current problem?
The SWIS dashboard is a great place to start when answering this question. Use the dashboard to identify an area in which the team wants to dig deeper, then use the Drill Down feature of SWIS to do the digging. I like to call the Drill Down feature The Next Best Thing Since Sliced Bread - because it is! The dashboard is good, but not enough. To really answer this question, you will want to use Drill Down to develop a precise problem statement that sets you up for more successful decision making. A precise problem statement provides your team (and staff) with a more robust data story about what is happening in your school and puts you on a more solid path toward addressing the situation. Make sure your precise problem statement answers the questions of what, when, where, why, and who.
Question 3: What is contributing to our precise problem?
Even with a precise problem statement, it is possible that an action plan may not be completely successful in improving outcomes. Slowing down long enough to consider what factors are contributing to the precise problem is critical. This problem analysis step, when done well, helps narrow the focus of the action planning. This is where a fidelity measure such as the School-wide PBIS Tiered Fidelity Inventory (TFI) can come into play. When considering school-wide needs at Tier 1, a team should look at the scale score report, subscale score report, and items report to help identify and prioritize the factors that may be contributing to the current problem.
Let’s take a look at an example. This elementary school’s data analyst developed the following precise problem statement from the school’s SWIS data:
“As of 1-17-19, 4th, 5th, and 6th grade students (both boys and girls) are engaged in the behaviors of disrespect and inappropriate language on most days in order to obtain adult or peer attention. This is happening most often between 7:35 a.m. and 8:09 a.m. during the morning arrival time while students are waiting in the gym during breakfast.”
When the team began considering the contributing factors, they looked to the SWPBIS TFI Tier 1 score as well as the Items Report to see what might be going on. On the Tier 1 scale report, they noticed that their score was below the 70% target.
The team then looked at their subscale report and noted that their lowest subscale was Implementation. To determine what about implementation needed to improve, the team turned to their Items Report. When the team took a closer look at the Items Report for Tier 1, two items in particular jumped out: Item 4, Teaching Expectations, and Item 9, Feedback and Acknowledgement.
Tip from the field: Organize a list of common contributing factors along with data sources, such as specific items from the TFI or other data, that could verify whether or not each factor is actually contributing to the current situation. This makes the process of answering this question much easier on teams, especially teams new to this work.
The team quickly realized the gap between previous school years' plans and the morning breakfast time in the gym: that time frame had no column on the behavior matrix and no lesson plans, and expectations had never been taught for this unique setting.
Question 4: What is the smallest amount of change for the biggest impact on student outcomes?
With a precise problem statement in mind and the contributing factors identified, the team is now charged with deciding the actions that will be taken to address the problem. Teams are encouraged to consider actions that can help prevent the problem from occurring, what might need to be taught (or retaught) to address the problem behavior, and how best to reinforce the behaviors defined in the school's expectations while avoiding inadvertently reinforcing behaviors that are not appropriate for the school setting. Teams are looking for the smallest amount of change that will produce the biggest improvement in outcomes for students. Teams need to keep the precise problem statement (question 2) as well as their hypothesis statement(s) (question 3) in the forefront when determining their action plan.
Tip from the field: Set up a standard meeting agenda and action plan that includes prompts to consider action items related to prevent, teach, and respond. This helps establish and support another part of the data routine. Your team may also want to look at the TIPS process as another routine that could support the use of data for decision making.
So, going back to our example: the team quickly developed a hypothesis that if they defined expectations for the morning gym, developed the matrix and lesson plans, and taught the expectations to the 4th, 5th, and 6th grade students, then they would see a decrease in the number of discipline referrals for disrespect and inappropriate language. The team further hypothesized that if they defined feedback and acknowledgement expectations for this setting, it would increase the ratio of positive interactions with adults and decrease the number of discipline referrals for disrespect and inappropriate language.
The team developed the matrix column and lesson plans for the morning gym along with an active supervision guide for the individuals assigned to supervise the morning gym. This active supervision guide included a defined goal for ratio of interactions as well as examples of how to provide behavior specific feedback.
Question 5: Did we implement our plan and is it working?
Have you ever been part of a meeting where an action plan was developed, only to come back together later with no real way to tell whether the plan worked? You are not alone! It is easy to come up with an action plan and forget to define goals for the plan, as well as the means for measuring whether it is working. This brings us full circle, back to having data for decision making. SWIS and the TFI will continue to be great sources of data, but there may be a need for additional information based on the plan that is developed. Do not leave the meeting without a clear plan for how you will answer these questions! Consider setting up a report template in SWIS Drill Down to make running your reports more efficient. Easy access to the data will help your team routinely use it to determine whether your plan is working.
Tip from the field: Get in the habit of starting your meeting with this question! Doing so puts the question center stage and pushes your team to make sure it is defined at the end of the previous meeting. Also consider taking all five questions and making a poster, or printing the questions on brightly colored, laminated sheets of paper, to guide your team's meetings.
Keeping with the earlier example, the team came back together every 3-4 weeks to review the plan, track whether they had completed the action items, and return to their saved SWIS Drill Down report template to watch discipline referrals during the morning gym decrease! The team also made sure to celebrate the progress, not only as a team but also with the staff and students.
Data is a four-letter word, but it doesn't have to be one that causes your team anxiety. Embrace a data routine using these process questions [or similar ones] and you should find your team looking forward to using your data to improve decision making and outcomes for students!
Melissa Nantais, PhD
Melissa is the Professional Learning Coordinator for Michigan's Integrated Behavior and Learning Supports Initiative (MiBLSi). Before MiBLSi, Melissa was a faculty member in the School Psychology program at the University of Detroit Mercy. She also worked for two years as a school psychologist with Kalamazoo Public Schools, and for seven years in the greater Cincinnati area as a school psychologist and as an educational consultant, trainer, and coach with a specialty in collaborative consultation for systems change related to Response to Intervention (RtI).
1. McIntosh, K., Mercer, S. H., Nese, R. N. T., Strickland-Cohen, M. K., Kittelman, A., Hoselton, R., & Horner, R. H. (2018). Factors predicting sustained implementation of a universal behavior support framework. Educational Researcher, 47(5), 307-316.
2. Todd, A. W., Algozzine, B., Horner, R. H., Preston, A. I., Cusumano, D., & Algozzine, K. (2019). A descriptive study of school-based problem-solving. Journal of Emotional and Behavioral Disorders, 27(1), 14-24.