Abbey Ball, EdS, Graduate Student
Amity Noltemeyer, PhD, Professor & Chair
Anthony G. James Jr., PhD, Associate Professor & Interim Vice President of Institutional Diversity & Inclusion
Jerusalem Tukura, BA, Graduate Assistant
Dawna Cricket-Martita Meehan, PhD, Director, Center for School-Based
PBIS is a proactive framework demonstrated to be effective in reducing student behavioral problems over time. According to Sugai and Horner (2014), PBIS enhances student academic engagement and achievement by preventing problem behavior, actively teaching desired behaviors, and responding quickly to patterns of problem behavior. The framework equips students with tools to resolve problems and conflicts appropriately. When PBIS is implemented correctly, schools teach, observe, and recognize students for appropriate behavior (Positive Behavioral Interventions & Supports OSEP Technical Assistance Center, 2019). Rather than relying on disciplinarian strategies or punishment for misbehavior, PBIS strives to teach students appropriate behavior and give them problem-solving tools before punitive consequences become necessary.
PBIS is associated with positive outcomes, such as reduced problem behaviors and increased educational effort (Bradshaw et al., 2010). Another study of school-wide PBIS similarly found reductions in behavior problems and improvements in emotion regulation among students following training, further supporting the use of PBIS in schools (Bradshaw et al., 2012). PBIS has also been found to decrease the number of office discipline referrals in schools that implement it with fidelity (Flannery et al., 2014). Not only does PBIS benefit students; it supports teachers as well. In a study across 40 elementary schools, school-wide implementation of PBIS was associated with lower levels of teacher burnout (Ross et al., 2012).
Professional development on PBIS is increasingly needed, as roughly 20,000 schools nationwide are attempting to use PBIS and it is now required in certain states, such as Ohio (Horner, 2014). However, according to the Ohio Department of Education (ODE, 2016), implementation of PBIS has been inconsistent due to varied interpretation of the law. Professional development refers to experiences designed to enhance professional growth; such experiences allow professionals, including teachers, to gain knowledge about a new curriculum or strategies to use in classrooms. A 2016 study of professional development related to Response to Intervention (RTI), a similar multi-tiered framework implemented in schools, found that educator skill development plays a critical role in effective implementation of RTI (Castillo et al., 2016). That study also found job-embedded coaching to be effective: educators who received such coaching showed increases in RTI skills, suggesting that coaching is a necessary tool for skill development (Castillo et al., 2016). Similar research is needed to examine the effectiveness of professional development and coaching efforts with regard to PBIS implementation.
Research has described the importance of continuous professional development in the workplace (Boyle et al., 2004). However, there is a gap in this literature with regard to PBIS-specific professional development. Professional development is key to increasing teacher knowledge of classroom interventions, and professional development is more effective when teachers participate collectively in groups (Garet et al., 2001). At the same time, it is important to understand how teachers can realistically implement what they learn at professional development sessions. For example, coaching can be provided in classrooms to support teachers and help them improve (Croft et al., 2010). A 2007 study found that increasing the time teachers are given to plan for implementing techniques learned at professional development training led to increased implementation and fidelity (Penuel et al., 2007).
Currently, the ODE and Ohio’s 16 State Support Teams (SST) are providing PBIS professional development trainings to educators. This study is a continuation of Palmer and Noltemeyer’s (2019) research, which focused on the effectiveness of these PBIS trainings. Specifically, Palmer and Noltemeyer identified six factors that contribute to the effectiveness of PBIS training: knowledge, staff support, duration, coherence, active learning, and timing. They examined whether there is a relationship between these six factors and the effectiveness of training, and which combination of factors best predicts effectiveness. Palmer and Noltemeyer collected data through a survey sent to educators who had attended professional development training. Ultimately, the effectiveness of the trainings was significantly related to the incorporation of active learning, the coherence between activities at the training and structures in place at participants’ schools, increases in knowledge, and the date of training. While this study examined the effectiveness of PBIS training in Ohio, participants were surveyed only directly after sessions, with no follow-up after they returned to their schools. There is therefore a need to examine whether and for how long PBIS strategies are used after training and, if use decreases over time, why this occurs.
Palmer and Noltemeyer (2019) also noted that additional research is needed regarding administrator support. Increased support was related to training effectiveness, but their research did not determine which types of support are effective in producing change. A 2016 study found job-embedded professional development to be effective (Castillo et al., 2016), but this has not been examined for PBIS professional development specifically. There is a lack of literature regarding how coaching can increase the use of PBIS and what additional coaching is needed. Districts are the main providers of professional development opportunities, and more research is needed on how districts are involved in, and stay updated on, new practices used in classrooms (Whitworth & Chiu, 2015). Possible solutions include coaching, further training, and classroom observations. Palmer and Noltemeyer (2019) also noted the lack of research on PBIS use over time. Studies of how often and why PBIS strategies are used are essential to fully understand and incorporate these strategies in schools.
Present Study and Hypotheses
Altogether, these gaps in the literature reveal the need for increased research regarding PBIS use, PBIS professional development effectiveness, and additional supports needed to implement PBIS in schools. The research questions for this study include:
RQ1: How frequently were PBIS strategies implemented within the first four weeks following a PBIS training and within the four weeks prior to the follow-up survey, and did reported use of PBIS strategies change significantly between these two time periods?
RQ2: What supports did attendees report receiving following their initial training (e.g., coaching, consultation, performance feedback), and was such support significantly related to reported PBIS implementation?
RQ3: What additional supports did attendees feel were needed to increase use of PBIS?
Methodology
Sample
The sample consisted of 121 participants who attended an Ohio PBIS training during the 2017-2018 school year. The majority of participants were teachers (n = 71; 59%), followed by administrators (n = 27; 22%), related service personnel (n = 13; 11%), and other educational professionals (n = 10; 8%). Trainings were conducted in all 16 SST regions of Ohio, and participants represented rural, urban, and suburban schools and populations. On Jan. 22, 2018, 319 survey invitations were emailed to people who primarily completed training between August and November 2017, and 244 invitations were emailed on May 7, 2018, to people who primarily completed training between December 2017 and March 2018. The response rate for the study was 21.5%.
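As a quick check on the reported response rate, the calculation below uses only the figures given above (a minimal sketch, not part of the original analysis):

```python
# Response-rate check using the invitation and respondent counts reported above.
invitations_january = 319   # sent Jan. 22, 2018
invitations_may = 244       # sent May 7, 2018
respondents = 121

response_rate = respondents / (invitations_january + invitations_may)
print(f"Response rate: {response_rate:.1%}")  # -> Response rate: 21.5%
```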
Materials
A follow-up survey was developed for this study and administered after participants’ PBIS professional development (see Appendix for items). The time between a participant’s PBIS training session and the follow-up survey ranged from 2 to 6 months. The survey included demographic items about participants’ professional role and years of professional experience. It also asked participants to rate the frequency of their use of PBIS strategies within the first four weeks after training, and again within the prior four weeks. Participants also answered items regarding what support, if any, was given to them after training and what additional support they believed was needed. The survey was hosted online using Qualtrics software (Qualtrics, Provo, UT).
Procedures
Participant names and school districts were collected as part of training registration records from ODE and were provided in a spreadsheet to a faculty member at X University (university name de-identified for blind peer review) involved with evaluating PBIS training effectiveness in the state of Ohio. The researcher searched for participant email addresses on school district websites. A January 2018 email was sent individually to anyone who completed training from August to November 2017. In early April 2018, this process was repeated, and a list of participant email addresses was created. On May 7, 2018, an individual email was sent to anyone who completed training from December 2017 to March 2018. The email, which included language approved by the Institutional Review Board (IRB), described the study, invited the individual to participate, and provided them with a link to the Qualtrics survey. A reminder email was sent to non-responders one week after the initial email. Survey results were obtained anonymously through Qualtrics and exported to SPSS for data analysis. Participant email addresses were not attached to or associated in any way with the survey results. All procedures were approved by the X University IRB before data collection began.
Data Analysis
Answers to the survey were coded, and descriptive statistics were used to analyze the data, including measures of central tendency, frequency, and variability. A paired samples t-test was used to examine whether there were changes over time (from the first four weeks after training, as assessed by item 5, to the four weeks prior to the survey, as assessed by item 6) in the degree to which participants implemented PBIS strategies. An independent samples t-test was used to examine whether receiving support (item 8) was associated with PBIS use in the first four weeks after training (item 5) and the prior four weeks (item 6). A correlation was also conducted to examine the association between the degree of support participants received after the training and their degree of implementation of PBIS strategies (a) in the first four weeks after the training (item 5) and (b) in the four weeks before the survey was sent (item 6). Finally, a thematic analysis was conducted to examine what additional supports are needed to increase PBIS use in schools (item 13). Codes were created to identify important features of the data (such as administrators, professional development, etc.), and themes were identified among these codes to analyze the results.
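The analyses described above were conducted in SPSS. Purely as an illustration, the same tests could be run in Python with pandas and SciPy; the file name and column names below (pbis_followup_survey.csv, item5, item6, item8, item11) are hypothetical placeholders, not the study's actual data set or variable names.

```python
import pandas as pd
from scipy import stats

# Hypothetical data frame; column names are illustrative only.
# item5/item6: 1 = Never ... 5 = Very often; item8: "Yes"/"No"; item11: 1 = Very infrequently ... 5 = Very often.
df = pd.read_csv("pbis_followup_survey.csv")

# Paired samples t-test: implementation in the first four weeks after training (item 5)
# vs. the four weeks prior to the survey (item 6).
paired = df[["item5", "item6"]].dropna()
t_paired, p_paired = stats.ttest_rel(paired["item5"], paired["item6"])

# Independent samples t-test: implementation by whether any support was received (item 8).
supported = df.loc[df["item8"] == "Yes", "item5"].dropna()
unsupported = df.loc[df["item8"] == "No", "item5"].dropna()
t_ind, p_ind = stats.ttest_ind(supported, unsupported)

# Pearson correlation: frequency of support (item 11) with early implementation (item 5).
corr = df[["item11", "item5"]].dropna()
r, p_r = stats.pearsonr(corr["item11"], corr["item5"])

print(t_paired, p_paired, t_ind, p_ind, r, p_r)
```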
Protection of Human Subjects
The survey results remained anonymous as individual names and emails were not reported on the survey. Participants’ responses remained confidential and anonymous during data collection and analysis. The IRB also approved the survey, email, and study prior to data collection to ensure confidentiality and protection of participants’ rights.
Results
Responses to items 5, 6, 7, 8, 10, 11, and 12 are summarized in Table 1. The vast majority of participants (over 80%) reported that they occasionally, often, or very often applied information they learned at the training within the first four weeks after the training, and a similar proportion (over 80%) reported doing so within the four weeks prior to survey administration. Overall, the results indicate frequent use of PBIS training information in the initial four weeks following training and in the four weeks prior to the survey, although a small percentage of participants (less than 7%) reported that they never applied information from the training in either period. The results also indicate that the majority of participants (nearly 90%) used PBIS strategies more after attending training and that most (over 70%) received some type of support with implementation. For those who reported receiving support, this support was mostly provided by building administrators (42%) or SST staff members (24%), with less support provided by “other” sources (e.g., PBIS team staff; 16%), district administrators (14%), and external consultants (4%). While over 70% of participants reported receiving support, 45% reported receiving this support only occasionally. More than two-thirds of participants (over 70%) also reported that additional support would increase their use of PBIS.
A paired samples t-test was conducted to examine whether there were changes over time (from the first four weeks after training as assessed by item 5, to the prior four weeks before survey as assessed by item 6) in the degree to which participants implemented PBIS strategies. The results showed there was not a significant difference in PBIS implementation between the first four weeks after training (M = 3.44, SD = 1.054) and the prior four weeks before the survey (M = 3.49, SD = 1.064); t(116) = -1.00, p = 0.319. These results suggest that there were no significant changes over time in the degree to which participants implemented PBIS from the first four weeks after training to the prior four weeks before the survey.
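For reference, the paired samples t statistic is based on the mean and standard deviation of the within-participant differences, and the reported degrees of freedom imply 117 complete pairs of responses:

$$ t = \frac{\bar{d}}{s_d / \sqrt{n}}, \qquad df = n - 1 = 117 - 1 = 116 $$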
An independent samples t-test was conducted to compare how frequently participants used PBIS information learned at the training between those who received support from administration or other personnel and those who did not. There was a significant difference in frequency of use between participants who received support after training (M = 3.71, SD = 0.857) and those who did not (M = 2.87, SD = 1.306); t(113) = 3.985, p < .001. These results suggest that those who received support from administration or other personnel following PBIS training used strategies learned at training more frequently.
Additionally, a Pearson correlation was conducted to examine the association between the frequency of support participants received after the training (item 11) and their degree of implementation of PBIS strategies (a) in the first four weeks after the training (item 5) and (b) in the four weeks prior to the survey (item 6). There was a significant, positive correlation between items 5 and 11, r(81) = .388, p < .01; that is, greater frequency of PBIS implementation initially following training was associated with greater frequency of support. A significant, positive correlation was also found between items 6 and 11, r(81) = .374, p < .01. Therefore, frequency of support was positively correlated with frequency of PBIS use in the four weeks directly prior to the survey.
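For a Pearson correlation, degrees of freedom equal n - 2, so the reported r(81) values imply 83 participants with complete responses to both items in each analysis:

$$ df = n - 2 \;\Rightarrow\; n = 81 + 2 = 83 $$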
A qualitative analysis was conducted for Items 9 and 13. Item 9 asked participants to describe the type of support received following PBIS trainings. Three themes emerged among participant responses. One of the themes that emerged was team member support. For example, one participant stated, “Our principals formed a PBIS team and conduct monthly meetings along with activities to promote PBIS building-wide.” Participants described team member support from both school teams and SST members. In addition to school teams, another important theme identified was administrator support. One participant stated, “We have had a great amount of respect from administration with planning, giving resources, meeting times during school, and giving out surveys to families, teachers, and community members.” In addition, a theme of feedback also emerged. For example, a participant stated that they received “assistance with answering questions and feedback from administrators.”
Item 13 asked participants what additional support is needed to improve PBIS implementation school-wide. Three themes emerged among participant responses. The first theme was outside support. For example, one participant stated “someone coming in to help design a program” would be beneficial. Participants generally described outside support as additional training, coaching, and school-wide instruction from SST members. The second theme was the need for refreshers and/or continued education. For example, one participant stated a need for “continuing education related to effective positive behavior incentives.” Finally, the third theme that emerged was additional planning time among school teams. A participant described his or her school’s need for “a consistent time for a monthly meeting.” Many other participants also stated a need for additional meetings among teams to further improve PBIS implementation.
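The thematic analysis itself was conducted manually by the researcher. Purely as an illustration (not the authors' procedure), a simple keyword tally can help organize open-ended responses before manual coding; the keyword groups below are hypothetical and only loosely mirror the codes described above.

```python
from collections import Counter

# Hypothetical keyword groups loosely matching the codes described above.
CODE_KEYWORDS = {
    "administrator_support": ["administrator", "principal"],
    "professional_development": ["training", "coaching", "continuing education", "refresher"],
    "planning_time": ["meeting", "planning", "time"],
}

def tally_codes(responses):
    """Count how many open-ended responses mention each keyword group."""
    counts = Counter()
    for response in responses:
        text = response.lower()
        for code, keywords in CODE_KEYWORDS.items():
            if any(keyword in text for keyword in keywords):
                counts[code] += 1
    return counts

# Example usage with two illustrative responses.
print(tally_codes([
    "Someone coming in to help design a program",
    "A consistent time for a monthly meeting",
]))
```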
Table 1: Frequency of Item Responses for Entire Participant Sample

| Variable | Frequency | Percentage |
| --- | --- | --- |
| Item 5: Within first four weeks after training, how frequently did you apply information? | | |
| Never | 7 | 5.8 |
| Infrequently | 11 | 9.1 |
| Occasionally | 42 | 34.7 |
| Often | 40 | 33.1 |
| Very Often | 18 | 14.9 |
| No response | 3 | 2.5 |
| Item 6: In the past four weeks, how frequently are you applying information learned? | | |
| Never | 6 | 5.0 |
| Infrequently | 13 | 10.7 |
| Occasionally | 36 | 29.8 |
| Often | 42 | 34.7 |
| Very Often | 20 | 16.5 |
| No response | 4 | 3.3 |
| Item 7: Are you using PBIS strategies more or less after attending training? | | |
| More | 104 | 86.0 |
| Less | 8 | 6.6 |
| No response | 9 | 7.4 |
| Item 8: Following training, did you receive support to help implement PBIS? | | |
| Yes | 85 | 70.2 |
| No | 30 | 24.8 |
| No response | 6 | 5.0 |
| Item 10: Who provided this support?* | | |
| Building Administrator | 54 | 47.4 |
| District Administrator | 17 | 14.9 |
| SST Staff Member | 26 | 22.8 |
| External Consultant | 3 | 2.6 |
| Other | 14 | 12.3 |
| Item 11: How frequently was support provided?* | | |
| Very Infrequently | 3 | 2.5 |
| Infrequently | 10 | 8.3 |
| Occasionally | 42 | 34.7 |
| Often | 25 | 20.7 |
| Very Often | 3 | 2.5 |
| No response | 38 | 31.4 |
| Item 12: Do you believe additional support following your training would increase your use of PBIS? | | |
| Yes | 83 | 68.6 |
| No | 32 | 26.4 |
| No response | 6 | 5.0 |

*Only answered if responded “Yes” to item 8
Discussion
Using a sample of 121 PBIS training attendees in Ohio, this study examined the degree to which participants (a) reported having applied PBIS concepts after training, and whether this changed over time, (b) received implementation supports following their initial training, and whether such supports were related to use of PBIS, and (c) reported strategies they believe are needed to further increase use of PBIS following training. Results revealed a significant positive correlation between the reported frequency of implementation support received after the training and the frequency of PBIS use in the four weeks before the survey. A significant positive correlation was also found between the reported frequency of PBIS implementation following training and the frequency of support for PBIS implementation. Significant differences also existed in the reported frequency of using information learned at training between participants who did and did not receive support after the training, both within the first four weeks after the training and within the four weeks before the survey.
An important theme identified within the results of item 9 (see Appendix) was increased administrator support. Other themes identified were team member support and additional feedback. The majority of participants (72.54%) believed that additional support would increase their use of PBIS. Participants identified outside support, such as further training, coaching, and instruction from the SST, continuing education, and planning time as necessary to improve school-wide PBIS implementation.
According to survey results, there were no significant changes in the degree of PBIS implementation between the first four weeks after training and the four weeks prior to the survey. That is, participants reported implementing strategies at a consistent rate from the period immediately after training through the time they took the survey. These results are encouraging and suggest that implementation of PBIS strategies did not decrease over time following the training.
This study found that a greater level of support following trainings was positively associated with greater PBIS use in the classroom, suggesting that implementation support may help increase implementation. Eighty-nine percent of participants also reported using PBIS strategies more after the training, supporting the effectiveness of these Ohio trainings. Furthermore, this study identified supports viewed as crucial to increasing PBIS implementation, such as outside support, continuing education, and planning time. Participants identified further training, coaching, instruction from SSTs, additional refreshers related to behavior incentives, and increased time for teams to meet as important forms of support.
Implications for Practice & Future Research
Based on these results, future research is necessary regarding the role of school and district leaders in PBIS professional development. For example, a 2012 study examined the importance of principals and administrators in PBIS implementation and found that schools led by principals with a positive and supportive outlook on PBIS showed increased PBIS use (Richter et al., 2012). The results of the present study indicate that administrator support is important for classroom PBIS implementation, but future research could examine whether administrator buy-in is also associated with increased implementation. Future research could also disaggregate the results by participants’ profession (administrators, teachers, etc.). Furthermore, this study measured PBIS implementation over the span of one or more months; future research could examine how frequently PBIS is implemented for a year or more after training.
Educational professionals can use these results to identify strengths and weaknesses of PBIS implementation in their schools. Specifically, schools should examine what support, if any, is provided to teachers following PBIS professional development, who provides this support, and how often. Schools should also examine how administrators can support implementation of educational practices such as PBIS. Administrators can provide support to teachers by tending to basic team tasks and setting clear priorities, encouraging knowledge-based decisions through problem solving, encouraging instructional flexibility, and developing strong professional bonds among teachers (Boscardin, 2005). According to a 2016 study regarding factors that help and hinder administrator support of PBIS, administrators were more likely to become supportive after obtaining formal and informal knowledge of PBIS through multiple channels (McIntosh et al., 2016). Administrators also reported that networking with implementing schools, learning how PBIS aligns with personal values, experiencing its effectiveness firsthand, observing a need for PBIS, attending PBIS trainings, connecting with a PBIS coach, and attending team meetings helped participants implement PBIS in their schools. In order to increase administrator support of PBIS, additional trainings specifically tailored to administrators may be helpful, as well as developing a coaching network where administrators can share their successes and areas of improvement (McIntosh et al., 2016).
Limitations
This study provides specific insight about the frequency of PBIS implementation following trainings and what supports help increase this implementation. However, there are several limitations that need to be considered. First, there are limitations related to the sample itself. For example, it is important to consider participant response bias. Survey completers (compared to those who attended training but did not complete the survey) may be those who were more likely to have followed through on PBIS implementation. Also, the representativeness of the sample compared to all schools in Ohio or nationally (e.g., grade levels served, locale, type of school) is unknown. It is possible, for example, that elementary schools or suburban schools are overrepresented in the sample, thus limiting generalizability of the results; however, we do not have the data to explore whether this is true. Future research should strive for a representative sample.
Additionally, the time between the participants’ PBIS training session and the follow-up survey varied. Whereas those with longer time periods between training and survey completion may have had more difficulty recalling the first four weeks after training, participants who attended trainings closer to the survey did not have as much time to successfully implement PBIS strategies as other participants. Furthermore, this study included data from participants attending trainings in several different PBIS topics (e.g., Tier I, Tier II/III); although standard template materials were used to guide most of the trainings within each category, certainly not every participant (even within a training topic such as Tier I) received the exact same training. Future research should attend more to fidelity of training across participants, as well as ensure a more consistent time between the training and the follow-up survey.
There are also limitations with the self-report nature of the data. For example, participant self-report on behaviors may not accurately reflect reality. Furthermore, perceptions regarding what supports would increase participants’ use of PBIS were based entirely on their opinions. It would be beneficial to collect this information in a different way, so it is not opinion-based for future research (e.g., external observer checks). It is also unclear whether the methods of support participants identified would, in fact, improve their implementation of PBIS, and whether their schools would have the capacity to provide such support.
References
Boyle, B., While, D., & Boyle, T. (2004). A longitudinal study of teacher change: What makes professional development effective? The Curriculum Journal, 15(1), 45-68. https://doi.org/10.1080/1026716032000189471
Bradshaw, C. P., Mitchell, M. M., & Leaf, P. J. (2010). Examining the effects of schoolwide positive behavioral interventions and supports on student outcomes: Results from a randomized controlled effectiveness trial in elementary schools. Journal of Positive Behavior Interventions, 12(3), 133-148. https://doi.org/10.1177/1098300709334798
Bradshaw, C. P., Waasdorp, T. E., & Leaf, P. J. (2012). Effects of school-wide positive behavioral interventions and supports on child behavior problems. Pediatrics, 130(5), 1136-1145. https://doi.org/10.1542/peds.2012-0243
Castillo, J. M., March, A. L., Tan, S. Y., Stockslager, K. M., Brundage, A., Mccullough, M., & Sabnis, S. (2016). Relationships between ongoing professional development and educators’ perceived skills relative to RtI. Psychology in the Schools, 53(9), 893-910. https://doi.org/10.1002/pits.21954
Croft, A., Coggshall, J. G., Dolan, M., & Powers, E. (2010). Job-embedded professional development: What it is, who is responsible, and how to get it done well. Issue Brief. National Comprehensive Center for Teacher Quality. Retrieved from https://files.eric.ed.gov/fulltext/ED520830.pdf
Flannery, K. B., Fenning, P., Kato, M.M., & McIntosh, K. (2014). Effects of school-wide positive behavioral interventions and supports and fidelity of implementation on problem behavior in high schools. School Psychology Quarterly, 29(2), 111-124. https://doi.org/10.1037/spq0000039
Garet, M. S., Porter, A. C., Desimone, L., Birman, B. F., & Yoon, K. S. (2001). What makes professional development effective? Results from a national sample of teachers. American Educational Research Journal, 38(4), 915-945. https://doi.org/10.3102/00028312038004915
Horner, R. H. (2014, July). Using PBIS to make schools more effective and equitable. Paper presented at the Southern Region Student Wellness Conference, Indian Wells, CA.
McIntosh, K., Kelm, J. L., & Canizal Delabra, A. (2016). In search of how principals change: A qualitative study of events that help and hinder administrator support for school-wide PBIS. Journal of Positive Behavior Interventions, 18(2), 100-110. https://doi.org/10.1177/1098300715599960
Ohio Department of Education. (2016). Policy: Positive behavior interventions and support and restraint and seclusion. Retrieved from http://education.ohio.gov/Topics/Other-Resources/School-Safety/Building-Better-Learning-Environments/Policy-Positive-Behavior-Interventions-and-Support
Palmer, K., & Noltemeyer, A. (2019). Professional development in schools: Predictors of effectiveness and implications for statewide PBIS trainings. Teacher Development. https://doi.org/10.1080/13664530.2019.1660211
Penuel, W.R., Fishman, B.J., Yamaguchi, R., & Gallagher, L.P. (2007). What makes professional development effective? Strategies that foster curriculum implementation. American Educational Research Journal, 44(4), 921-958. https://doi.org/10.3102/0002831207308221
Positive Behavioral Interventions & Supports OSEP Technical Assistance Center (2019). Tier I supports. Retrieved from https://www.pbis.org/school/tier1supports
Richter, M. M., Lewis, T. J., & Hagar, J. (2012). The relationship between principal leadership skills and school-wide positive behavior support: An exploratory study. Journal of Positive Behavior Interventions, 14(2), 69-77. https://doi.org/10.1177/1098300711399097
Ross, S., Romer, N., & Horner, R. H. (2012). Teacher well-being and the implementation of school-wide positive behavior interventions and supports. Journal of Positive Behavior Interventions, 14(2), 118-128. https://doi.org/10.1177/1098300711413820
Sugai, G., & Horner, R. H. (2014). Positive behavior support, school-wide. In C. R. Reynolds, K. J. Vannest, & E. Fletcher-Janzen (Eds.), Encyclopedia of Special Education. https://doi.org/10.1002/9781118660584.ese1902
Whitworth, B. A., & Chiu, J. L. (2015). Professional development and teacher change: The missing leadership link. Journal of Science Teacher Education, 26(2), 121-137. https://doi.org/10.1007/s10972-014-9411-2
Appendix
Follow-Up Survey Items
Q1. What was the date of the PBIS training you attended? If it was a multi-session training, please put the date of the last session. If you are unsure of the exact date, please indicate the month of the training.
Q2. What is your role?
● Administrator
● Teacher
● Related Services
● Parent/Community Member
● Paraprofessional
● Other
Q3. (Item shown only to those who respond “Teacher,” “Administrator,” “Related Services,” or “Paraprofessional” to item 2) How many years of educational or teaching experience do you possess?
● 0-5 years
● 6-10 years
● 11-20 years
● 21+ years
● N/A
Q4. (Item shown only to those who respond “Teacher” to item 2) What do you teach?
● Special Education
● General Education
● Other
● N/A
Q5. Within the first four weeks after the PBIS training you attended, how frequently did you use or apply information learned at the training?
● Never
● Infrequently
● Occasionally
● Often
● Very often
Q6. In the past four weeks, how frequently are you using or applying information learned at the training?
● Never
● Infrequently
● Occasionally
● Often
● Very often
Q7. Are you using PBIS strategies more or less after attending the training?
● More
● Less
Q8. Following the PBIS training session you attended, did you receive support from administrators or other professionals to help implement PBIS? Support could mean additional instruction, technical assistance, modeling, feedback, and/or coaching.
● Yes
● No
Q9. Please describe the type of support received.
Q10. (Item shown only to those who respond “Yes” to item 8) Who provided this support? Please check all that apply.
● Building administrator
● District administrator
● State Support Team staff member
● External consultant
● Other (please describe)
Q11. (Item shown only to those who respond “Yes” to item 8) How frequently was support provided to you?
● Very infrequently
● Infrequently
● Occasionally
● Often
● Very often
Q12. Do you believe additional support following your PBIS training would increase your use of PBIS?
● Yes
● No
Q13. What additional support would be beneficial to you and your school, to further improve PBIS implementation?