Using our PLC for Teacher Growth

Written by Amber Clark, Principal, Bea Underwood Elementary School, Parachute, Colorado

We have the pleasure of working with Marzano Academies to increase our effectiveness and reliability as a high-performing school, as measured by the 16 school-level indicators. We enjoy working with our school coach, as he consistently inspires us to think differently about schoolwide practices. As we worked with him to firm up our PLCs this year, he pushed back and questioned our process a bit. He proposed that we track teacher actions over time rather than focus solely on student data. Our task:

  1. look at recent student data, 
  2. identify a problem of practice, 
  3. research and determine an instructional best practice to increase student performance,
  4. track how consistently we use the agreed-upon best practice over several weeks, and then
  5. determine the impact on the learners.  

The theory behind this approach: if teachers consistently employ a research-based best practice over time, then student data will improve.

We usually approach PLCs by identifying student learning outcomes together, discussing how we would approach teaching, test-driving a common assessment, and agreeing to bring back artifacts showing how well our students mastered the content. We would end the PLC by discussing what went well and what could have been more effective, identifying students who might need extra support, and then pushing on to the next round. The new PLC process required us to think about things a little differently. We asked each team to identify a problem of practice specific to their classrooms and bring some data to illustrate the problem.

After taking a week to reflect on the state of their classrooms, teams came back to the PLC with a wide range of problems of practice. For example, one teacher group reported they were struggling to provide sufficient writing time. Another had students who should have been scoring higher on growth assessments but were instead stagnating. A different team was providing small group instruction, but it did not seem to be working. Finally, a few teams struggled to score student work consistently, and one wanted more voice and choice for students. As each team brought their area of concern to the PLC, we were somewhat surprised by how different the problems of practice were in our building.  

To help connect the process, here is an example. One of our teams identified that each teacher had a group of struggling readers, and they wanted to focus extra instructional time on closing gaps in their learning. We had ideas on what we could do; however, we wanted to ensure our practices were research-based. So we sifted through the Marzano Folios of research-based instructional strategies tailored to personalized-competency-based education (PCBE) communities, available to partner sites. We settled on Grouping and Regrouping Students; to be more exact, Instructional Practice Vc: Providing Group Support: Designing Effective Centers and Stations for Common Problem Areas.

Teachers agreed that the common problem area for their lowest readers was phonics, so they chose the strategy of providing small-group direct phonics instruction to those students. For five weeks, they tracked the number of times they met with their lowest reading group to provide direct, explicit phonics instruction. This might seem straightforward, yet we were surprised by how often their ability to meet with that particular small group was interrupted: spring assessments, sick kids, attendance issues, and inopportune fire drills all got in the way. At the end of the initial PLC cycle, which typically lasts a month, we determined that at least one day a week, something came up that prevented small-group instruction with the identified students. Therefore we extended the PLC goal for a second cycle. By the end of the second cycle, nearly all of their students had made significant progress, as measured by being ready to move on to a higher-level skill.

Each one of our PLCs followed a similar pattern:

  • Identify a problem of practice.
  • Research an instructional strategy to meet the specific need.
  • Track teacher actions over time.
  • Determine if the instructional strategy was effective.

Each week our PLCs met for accountability, discussing whether teachers were consistent in the practice they had identified and agreed to use. If they were not, we discussed ways to increase consistency. We also looked at student data to ensure the instructional strategy we had agreed to use was working as we hoped it would. Finally, at the end of each PLC cycle, we determined whether to stay the course and continue addressing the problem of practice with the same strategy, choose a new strategy to address the problem, or move on to a new problem of practice.

There is no doubt that the new focus for our PLCs is working. Classroom data shows student gains, and normative data at the end of the semester verifies that growth. We practice action research together in a real-world setting, a skill most educators have not used since their senior thesis. Teachers report that it is refreshing to focus on shared problems of practice and that they feel an increase in their agency. Tracking their actions has brought about a new level of responsibility and empowered teachers to see how their consistent use of research-based instructional strategies affects student outcomes. Our new PLC model provides supportive accountability with space to research and try new instructional approaches as a team, increasing our individual effectiveness and combined efficacy as a school.