Editor's Note: This is the third article in our four-part guest author series focused on using questions to guide challenging team conversations. The first article, exploring school-wide decisions, is available here. The second article, exploring disproportionality, is available here. This month's guest author is Dr. Susannah Everett, Assistant Professor in Residence at the University of Connecticut's Neag School of Education.

As a school psychologist, I was part of teams that struggled to implement and sustain behavior support plans for individual students. We often put lots of energy into developing a beautiful plan, only to have it sit in a folder; or we'd get a strong start and then things would fizzle out; or when the student's needs changed, we'd respond by talking about it at length and starting all over! Over the years I've learned from experience, research, and my colleagues that part of the answer lies in data. After many conversations on this topic, my wonderful colleagues Katie Conley, Sarah Pinkelman, and I wrote this article to help teams explicitly think through and plan the steps they should take to support plans that help students develop and practice the skills they need to be successful.1

Kaitlyn’s student support team is stuck. While they agree that they need to increase Kaitlyn’s behavioral supports, they are divided on whether to continue implementing her current plan with minor adjustments or to start over from scratch. Kaitlyn receives special education services for academic challenges and social-behavioral problems, and she receives the majority of her services in her general education classroom. Kaitlyn recently started responding to adult requests by asking off-topic questions, saying “no” loudly, putting her head down, and arguing. These behaviors are disruptive to the classroom, and lately, her behavior has escalated to a point that her teacher has asked Kaitlyn to leave the classroom several times a week. Kaitlyn’s team understands the importance of making collaborative decisions about Kaitlyn’s plan based on data, but doesn’t feel confident that the data they currently collect (office discipline referrals and work completion) will give them the information they need to make good decisions about her plan. The team calls a meeting to revisit their system for monitoring Kaitlyn’s progress.

[Figure: Individual Student Progress Monitoring Flowchart]
For students like Kaitlyn who receive individualized interventions targeting social and behavioral goals, behavior support plans (BSPs) are designed to teach and reinforce pro-social behaviors, as well as to prevent, correct, and reduce the impact of undesired behaviors. Student support teams like Kaitlyn’s have the task of designing plans that are technically adequate (for example, based on the results of a functional behavior assessment, or FBA) and contextually appropriate for the student, school, and family. An equally important task for the student’s team is to monitor the effectiveness and implementation of the plan in order to make adjustments, such as revising supports when new concerns arise or fading supports when goals are met. Monitoring individualized supports requires that teams plan how to design and embed routines for the collection and use of data.2 For student support teams, building, monitoring, and analyzing their implementation process is a critical component of progress monitoring.

First, Kaitlyn’s team works together to review the information they already have. They confirm the results of a prior FBA that hypothesized the primary function of Kaitlyn’s problem behaviors is avoidance of math tasks. Next, the team revises Kaitlyn’s BSP to ensure that the level of supports meet Kaitlyn’s increased needs. This revised BSP includes feasible strategies such as a seating change; brief, individualized instruction on difficult lessons; teaching Kaitlyn routines for taking a break and asking for help; and reinforcing academic engagement and task completion. However, the team still has several questions: 
  1. How will we know whether the interventions are working?
  2. How will we know if everyone is implementing the plan with fidelity?
  3. How will we know when to modify or fade supports? 
The team needs to design a progress monitoring decision system for Kaitlyn’s new plan, starting with evaluation of plan implementation. They need to determine: Are we doing what we said we’d do?

Step 1: Develop Fidelity Measures

A BSP like Kaitlyn’s outlines strategies staff can implement to improve student behavior. A BSP’s effectiveness depends on the extent to which staff follow the plan with fidelity (https://www.pbisapps.org/community/Pages/Fidelity.aspx). Monitoring and improving fidelity is not only related to better student outcomes; it also helps teams evaluate the effectiveness of interventions. Teams can start with simple indicators that are easy to measure and emphasize the most important parts of the plan. As a team, consider the following:
  1. What are the critical BSP components most likely to impact student behavior? 
  2. How will staff measure fidelity of these components? 
  3. What level of fidelity is considered acceptable and when should this goal be met? 
  4. How will staff let us know when and how they’ve had difficulty implementing the plan?
  5. Are fidelity goals specific, measurable, and realistic? 
Kaitlyn’s team discusses each BSP component and agrees that the fourth-grade teacher, classroom aide, and special education teacher will each use a checklist outlining the critical prevention, teaching, and response components of Kaitlyn’s BSP to assess fidelity. The checklist lists the six critical components of Kaitlyn’s BSP in one column; in another column, staff mark “yes,” “no,” or “N/A” to indicate whether each component was completed. The checklist also includes a place for staff to write additional information relevant to Kaitlyn’s behavior for that day. Team members agree each of them will use the checklist to self-monitor their implementation during math block.
[Figure: Kaitlyn’s Fidelity Measure]

Step 2: Develop Outcome Measures

Once a fidelity measure has been developed, the team can explore how they will assess the plan’s effect on student behavior. Outcome measures indicate a BSP’s effectiveness at:
  • Increasing pro-social behaviors and alternative/replacement behaviors
  • Decreasing behaviors that are undesired or interfere with a student’s social and academic success
As with fidelity measures, teams start by developing simple outcome measures that are easy to collect and sensitive to changes in behavior. Teams may want to develop one measure to capture student problem behavior, and another to capture student replacement behavior(s). Start by operationally defining each behavior and ensuring the behaviors are specific, observable, and measurable. Then, decide how staff can feasibly measure each of the behaviors. These questions can guide you in the process:
  1. Which target behaviors do we want to increase – both desired and replacement? Can staff realistically measure these? How?
  2. Which target behaviors do we want to decrease? Can staff realistically measure these? How?
  3. What are present levels of performance of the behaviors above?
  4. What are the desired (long-term) and acceptable (short-term) target goals and timelines? 
  5. Are outcome goals specific, measurable, and realistic?
Kaitlyn’s team members identify two replacement behaviors they want to see increase: using a break routine and asking for help when needed. They develop a point card with a rating scale for both routines during math period. Kaitlyn’s teachers will rate how well she follows the steps for each routine. Next, Kaitlyn’s team members turn their focus to the behavior they want to see decrease: time off-task during math. They want to know how many off-task incidents occur during math and how much time she spends away from instruction. The team also wants to include Kaitlyn and her family in discussions about her behavior, so at the bottom of the behavior point card there is space for Kaitlyn, her teachers, and a family member to initial or make a comment.

Step 3: Develop an Action Plan

Developing a comprehensive BSP complete with fidelity and outcome measures to monitor progress is a major accomplishment and should be celebrated. However, a plan’s success depends on how you implement it. An action plan details all of the tasks and resources associated with implementation of the student’s BSP. This is an opportunity to think through hurdles and to take advantage of existing systems. Some tasks and resources will be obvious, like training staff or creating lesson plans. Others may be more subtle, like making sure someone has time in their day for data entry. Time spent considering how the BSP (including data collection) will be put into action helps the team anticipate and address potential barriers and stumbling blocks. 

Action planning discussions focus on who will be involved in implementation, what they need to do, and how they can efficiently embed data collection into daily routines. Through this process, your team may realize the data collection, goals, and routines add up to unrealistic expectations. You may need to modify goals and data collection schedules to prioritize one or two key measures now, maintaining a less intensive goal and schedule for the rest until a later point. Throughout action planning, a theme for teams to embrace is parsimony: choose the smallest and simplest changes that will produce the largest effects toward goals. Consider the following questions to guide action planning:
  1. Who will implement each BSP component? What are the specific tasks and resources (e.g., time, training, coaching, materials) required to embed the BSP into daily routines? 
  2. Who will collect data? How often will data for each measure need to be collected? 
  3. Which existing procedures and resources for data collection are available? 
  4. What are the back-up plans when tasks can’t be done? 
Kaitlyn’s team builds an action plan, starting with a review of current routines and procedures during her math period and identifying where the BSP components make the most sense. For example, Kaitlyn is pulled out for reading just before math. The special education teacher schedules ten minutes after the reading lesson to provide an overview of the upcoming math lesson and complete one or two example problems together. The action plan identifies a weekly meeting to review the lesson plans and identify the information that will be most helpful to Kaitlyn. The team proposes to collect points and breaks daily, and fidelity ratings three times a week. The special education teacher will enter all of these data into I-SWIS and summarize them for the team before each weekly meeting.

Step 4: Establish Decision-Making Routines 

A decision system allows teams to efficiently use data to make important, timely, and responsive decisions. For a student’s plan to be effective, teams must commit to active, consistent progress monitoring. This includes a regular meeting schedule and agendas that specify where and how often to meet and which data to review.
When establishing routines, teams can consider the following questions: 
  1. What is the schedule for the student support team meetings? 
  2. Who will monitor and communicate summaries of fidelity and outcome data to the support team and how often? 
  3. How will the team know if the student’s plan is working? How will the team know if the plan is implemented by staff? 
  4. What are data indicators for 
    1. Continuing plan components?
    2. Modifying plan components?
    3. Adding plan components?
    4. Fading plan components?  
Starting each meeting with data review centers urgent discussion items and problems to address. When reviewing the data, teams may ask the following questions: 
  1. What are current levels of fidelity? Do steps need to be taken to improve fidelity? 
  2. Is the student’s progress on track to meet long-term goals? 
  3. Do the data match perceptions across team members and implementers?  
  4. What (if any) changes need to be made to meet plan goals and timelines?
Kaitlyn’s team already has a meeting schedule that works for the core members, and they share reports weekly. The team adopts general guidelines on formatting the data and agrees upon decision rules to guide data review and action planning. For example, if fidelity data are below the short-term goal for two consecutive weeks, the teachers and classroom aide will meet to review the procedures and provide feedback to the team regarding what’s not going well. All components of the plan will be intensified if Kaitlyn’s problem behavior continues and the number of requested breaks remains lower than the number of teacher-prompted breaks for over one week. The team members all commit to their individual tasks and agree that the changes will strengthen the BSP.

Successfully implementing individualized supports in schools can be challenging. Implementing and monitoring plans with the quality, efficiency, equity, and flexibility needed to meet student needs requires team commitment and time. Plan for and celebrate each step of the journey that promotes student academic and social success. 

Headshot of Susannah Everett, PhD

Susannah Everett, PhD

Susannah is an Assistant Professor in Residence at the University of Connecticut’s Neag School of Education. She started her career working with kids and families as a clinical psychologist in community mental health settings. Through a variety of opportunities to work in and collaborate with schools, she fell in love with school psychology and was fortunate to work as a school psychologist for ten years in both Oregon and Connecticut. At the University of Connecticut, Susannah teaches behavioral intervention and special education courses and works with schools and districts in the Northeast PBIS network to support implementation of advanced tiers of supports for students. She is a rabid UConn women’s basketball fan!


1. Conley, K., Everett, S., & Pinkelman, S. (2019). Strengthening Progress Monitoring Procedures for Individual Student Behavior Support. Beyond Behavior, 28(3), 124-133. doi:10.1177/1074295619852333
2. Pinkelman, S. E., & Horner, R. H. (2017). Improving Implementation of Function-based Interventions: Self-monitoring, Data Collection, and Data Review. Journal of Positive Behavior Interventions, 19, 228-238. doi:10.1177/1098300716683634