A Partnership Approach: Empowering Collective Efficacy through Data, Collaboration, and Alignment

Montessa Muñoz, Educator
Chad Dumas, Consultant and Author

It is no secret that educators’ belief that their actions affect student learning is tightly correlated with actual improvements in that learning. Some might call it a self-fulfilling prophecy; others, efficacy. Going back to the Pygmalion in the Classroom research (Rosenthal & Jacobson, 1968), we know that teacher expectations drive student performance. And more recent work from Hattie (2018) places collective efficacy among the highest predictors of student achievement. We as educators and schools can, do, and will improve outcomes for kids to the extent that we believe we actually can.

Knowing this is true, however, does not make it magically happen in schools; it is only the first step. Developing a strong sense of collective efficacy, built upon individual educator efficacy, happens through deliberate actions focused on instructional practices (as opposed to mere managerial issues). Our experience, as a building principal supported by a district administrator, is that the two levels of the system support and enhance each other. As such, we will highlight three major areas of our partnership: collecting and using data, collaborating on instructional improvement, and aligning a culture committed to student success.

 

Background

Prior to an in-depth look at these three areas, let us provide some background. The setting from which most of these experiences derive was a high-poverty, high-diversity, lower-performing school and district. When Montessa arrived as principal (one year following Chad’s arrival at the district level), the building had grade levels where the percentage of students proficient on various benchmarks was in the single digits. Despite having more than 90% of students living in poverty and a high proportion of English Language Learners (ELs), through the application of the ideas presented below, the school became a national model, with more than 90% of students reaching benchmark on multiple assessments.

We know that Professional Learning Communities (PLCs) are an effective route to building collective efficacy (Voelkel & Chrispeels, 2017), and our district’s first priority was doing this work. As such, from a district perspective, we homed in on the advice of DuFour, DuFour, Eaker, and Many (2010). For a more detailed discussion, please see Dumas and Kautz (2014); suffice it to say that the district emphasized three areas for leading PLC implementation: limit initiatives (i.e., focus the work), build capacity, and create systems for mutual accountability. For the purposes of this article, we will focus on building-level actions taken to build individual and collective efficacy.

 

The collection and use of data

Building individual and collective efficacy, in our experience, is grounded in the collection and use of data. Data was collected by students, teachers, the principal, and central office staff. It was posted in classrooms, along hallways, around the cafeteria, in the staff lounge, and on the desk of the principal. In central office, the main conference room displayed data from administrator learning meetings, and a spider-like radar chart was prominently displayed in the office of the director of learning. Data could not be missed.

The weekly newsletter to staff from the principal listed goals and progress. Instructionally, 80% of students were to be on-task in any given classroom, and the principal collected and reported this weekly from classroom walkthroughs. For reading and mathematics, data was charted weekly, and scores on the school-wide common writing rubric were charted monthly.

In order to display the data, it had to be collected. Progress monitoring was previously done by non-classroom teachers. This changed (with some pushback, but surprisingly little), because classroom teachers are the rightful owners of progress monitoring and the data it provides. District common assessments and benchmarks, DIBELS, easyCBM, and common team-based formative assessments (known as LtoJ in our case) formed this foundation.

Having the data collected and displayed is only the beginning, though. The more important work is in talking about it. All. The. Time.

Data was a focus in evaluations, in PLC meetings, at the start of each staff meeting, and in school-wide daily announcements about student and classroom “All-Time Bests” (ATBs). Student artifacts were shared by staff in team and school meetings, building staff members’ confidence as they talked about what they saw from students. And, believe it or not, students themselves, all the way down to kindergarten, were involved in graphing and tracking their own and the classroom’s progress.

 

Images 1 and 2: Students chart their own progress—including Kindergarten

 

 

Office referral data was reviewed once per month by the Positive Behavioral Interventions and Supports (PBIS) team. Attendance data was reviewed weekly by a team that included the principal, counselor, nurse, and social worker. Math and reading benchmark data were formally reviewed three times per year.

A partnership with the University of Nebraska helped build capacity to collect meaningful student performance data, set decision rules, configure aim lines (illustrated below), and adjust practice accordingly. Little by little, our mindset around data changed: from expecting someone else to hand us a standardized test score, to collecting, analyzing, and using our own, far more meaningful evidence of students’ learning. Needless to say, seeing the impact of our work on students built efficacy both individually and collectively.
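To make the aim-line idea concrete, here is a minimal worked sketch; the numbers are hypothetical, not drawn from the project. An aim line simply connects a student’s baseline score to the goal score, so the weekly growth a team should expect to see is its slope:

\[
\text{weekly growth needed} \;=\; \frac{\text{goal} - \text{baseline}}{\text{weeks to goal}} \;=\; \frac{60 - 20}{20} \;=\; 2 \ \text{words correct per minute, per week}
\]

A decision rule then falls out naturally: if, say, several consecutive weekly data points land below the aim line, the team adjusts instruction rather than waiting for the next benchmark window.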

 

Collaborating on instructional improvement

Just collecting and using data, by itself, does little to increase efficacy. That work must be paired with learning that improves our practice so that results improve. Enter a laser-like focus on collaborating for instructional improvement.

In combination with district-wide efforts, the school had a total of four areas of focus: 1) Engagement strategies, 2) Explicit phonics and vocabulary instruction, 3) Four Block Writing instruction, and 4) Social-Emotional Learning/Proactive behavior strategies.

Regarding engagement strategies, we sent four staff members to a national Kagan training. Not only did this experience build those staff members’ capacity to engage their own students, but they then taught other teachers how to implement the strategies. Efficacy became integrated into their identities.

The same research project from the University of Nebraska mentioned earlier helped focus attention on explicit instructional practices. These phonics and vocabulary strategies then became a focus of professional learning and data collection: for example, the frequency of choral and physical responses, the use of whiteboards for formative assessment, and the explicit use of research-based vocabulary instruction strategies to ensure students know and can apply key words.

 

Figure 1: A walkthrough observation form used to collect and use instructional data

 

 

The district had embarked on Four Block writing instruction years before, and school leadership decided to reinforce this work. Staff learning, PLC conversations, and data collection were key tools in ensuring the quality and fidelity of implementation. Further, the school used vertical teams (K-2 and 3-5) that included EL, Title, and Special Education staff to ensure consistency of language and structure for writing instruction. A school-wide rubric was developed and used on common writing prompts to track progress and inform changes to practice. Finally, writing exemplars were identified and reviewed regularly to create a standard of quality writing in the school.

Finally, the school focused on understanding and proactively addressing SEL and behavioral issues. We know that students must feel safe, and the adults in the building are responsible for making that happen. So the school implemented “families,” in which each adult led a group of six students (one from each grade level) who met throughout the year to build community and relationships with each other. The school also worked diligently to implement PBIS faithfully, ensuring that our core processes and interventions were helping all students self-regulate. Part of this involved holding Friday assemblies to celebrate team and individual accomplishments of students and staff. And finally, the counselor taught “Second Step” materials while teachers facilitated classroom meetings.

 

Aligning a culture committed to student success

Developing individual and collective efficacy, grounded in data usage and an instructional focus, manifests itself in a culture committed to student success. In our work, this culture was advanced through the alignment of three structures: high expectations, aligned practices, and teaming.

Using data and focusing on instruction will do little good without high expectations for each other and for each student and family. This begins with learning each student’s name and ensuring that we are not “dumbing down” expectations based on race, socio-economic status, gender, or any other characteristic. Further, it involves celebrating successes for students as well as staff, both individually and in teams. And it involves ensuring that instructional practices are consistent with high expectations (Marzano, 2007).

Finally, high expectations show up in how we partner with families. We found parents ready and willing to help with translating, interpreting, and providing food for myriad events. We encouraged their involvement in other ways, too, such as the structured Dads of Great Students (D.O.G.S.) program and lists of how and when parents could volunteer at the school.

Second, this culture required the alignment of multiple practices. All work aligned towards the identified school improvement goals. This included ensuring that monthly professional learning time focused on the priorities, and that teachers were the ones leading the professional learning.

Previously identified interventionists transitioned to instructional coaches. They were trained in Jim Knight’s (2007) partnership coaching processes and attended monthly support meetings and trainings with other coaches from across the district. These same coaches led instructional rounds with staff throughout the building in an effort to de-privatize practice. And the coaches met with the principal once per week to plan specific professional learning and supports.

Finally, the high expectations and alignment were reflected in how the school worked as a team. Grade-level teams met at least twice per month. Vertical teams (K-2 and 3-5) met once per month to blind-score student writing samples against the school-wide rubric. And grade-level teams from across the district met the remaining time each month to establish common district-wide expectations.

 

Figure 2: A sample action planning tool to assist with aligning all priorities

 

 

Closing

Efficacy is built on collaboration. While compliance may get short-term results, long-term gains and sustainability will only happen through working together. This means that all levels of the system must be collaborative, for we can’t expect principals to build collaborative environments when the district is top-down and compliance-oriented.

The work of building efficacy is hard, and mindsets don’t change overnight. But creating an environment of caring for kids that includes high academic standards is completely doable. The work of collecting and utilizing data, collaborating on instructional improvement, and aligning a culture focused on student success can happen in any school and district. And a partnership approach between the district and school is fundamental. The question in our minds isn’t, “Can we do what it takes to meet the needs of every child?” Rather, it’s “When and how will we start?”

 

 

Bibliography:

DuFour, R., DuFour, R., Eaker, R., & Many, T. (2010). Learning by doing: A handbook for professional learning communities (2nd ed.). Bloomington, IN: Solution Tree.

Dumas, C., & Kautz, C. (2014). Wisdom from the factory floor: For best results, limit initiatives, build capacity, and monitor progress. JSD, 35(5), 26-34.

Hattie, J. (2018). Collective teacher efficacy (CTE) according to John Hattie. https://visible-learning.org/2018/03/collective-teacher-efficacy-hattie/

Knight, J. (2007). Instructional coaching: A partnership approach to improving instruction. Thousand Oaks, CA: Corwin.

Marzano, R. J. (2007). The art and science of teaching: A comprehensive framework for effective instruction. Alexandria, VA: Association for Supervision and Curriculum Development.

Rosenthal, R., & Jacobson, L. (1968). Pygmalion in the classroom: Teacher expectation and pupils’ intellectual development. New York: Holt, Rinehart and Winston.

Voelkel, R. H., Jr., & Chrispeels, J. H. (2017). Understanding the link between professional learning communities and teacher collective efficacy. School Effectiveness and School Improvement, 28(4), 505-526. https://doi.org/10.1080/09243453.2017.1299015

 


 

Is your school assessment approach effective and efficient in promoting learning?

Jamie Scott, Evidence Based Education

Is your school assessment approach effective and efficient in promoting learning?

How do you know?

 

There are, arguably, three key pillars of education management: pedagogy, curriculum and assessment. For too long, the third of these has been both under- and misused; the potential for assessment to be a powerful learning process, as well as an accurate barometer of learning itself, is often not realized. Assessment is inextricable from teaching, and the quality of the latter is – in many ways – dependent on the quality of information derived from the former. Great assessment, the type that helps improve teaching and learning, is not a single entity, but it leads to a single point: a meaningful decision which has positive consequences for students’ learning. It is purposeful, manageable, efficient and effective. Great assessment is lean and valuable. It is used thoughtfully to achieve specific aims, ones for which it is better suited than any other tool or strategy a teacher has at their disposal.

 

Every school has its assessment policy and framework, but is it fit for purpose? Schools use assessment daily, and it can be difficult to stop, step back, and review our approach to ensure it serves its intended purpose. So, in the spirit of reflection and self-evaluation, here are five questions to ask of your school assessment framework to help determine its efficiency and effectiveness.

 

  1. Are we using assessment to measure important aspects of the curriculum?

Assessment, pedagogy and curriculum are inextricably linked and, when the best of these are brought together well, they form the backbone of effective teaching and learning. When assessment is sharply focused on the curriculum, and used as a tool of good pedagogy, teachers can maximise its value to improve the responsiveness of their teaching. After all, how can we know what to teach tomorrow, if we do not know what has been understood today? Effective assessment needs to relate to the curriculum ‘map’, strategically challenging pupils to recall and strengthen the right pieces of learning, understanding and skill.

 

  2. Do our assessments measure the things we intend them to measure? Are they fit for purpose?

Form should always follow function in assessment. We must know what we want to measure, and why, in order to select the right tool for our purpose. An assessment that is ideal for measuring progress might be a poor choice for identifying strengths and weaknesses to inform the planning of, or adaptations to, your next lesson. To assess better, we need to be explicit about purpose:

 

  • The construct: What is the specific knowledge, skill or understanding that we intend to assess?
  • The end use: What do we want to do – the interpretation, the decision or action – with the information generated by the assessment process?
  • The best tool: What is the most appropriate, effective and efficient way to assess in this instance?

 

  3. Are we assessing learning, or the performance of short-term memory?

What is learning, and does every teacher and school share the same understanding of it? Let’s define learning as both the long-term retention of knowledge, understanding, and skill, and the ability to transfer these to novel contexts. As such, teaching needs to address and promote learning that is retained and transferable, and assessment needs to be designed to gauge students’ long-term retention and transfer. Does your school’s assessment approach allow you to demonstrate reliably what students know and understand at the point of initial assessment, and whether they are able to retrieve that knowledge and understanding six weeks, six months, or a year later?

 

  4. How can we be sure that progress is, in fact, real progress and not just measurement error?

Measuring progress reliably is difficult. All forms of educational measurement contain a degree of error, so assessment is less precise than it is often perceived to be, whether in national tests, classroom quizzes, or teacher observation. It is a complex and time-consuming exercise to create an assessment sensitive enough to measure progress reliably over a relatively short space of time, so teachers need to understand the error in their assessment measurements in order to make accurate judgements about the needs and progress of pupils.
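As a brief, hypothetical illustration of that error, consider the standard error of measurement (SEM), a common psychometric index of the imprecision in a score; the figures here are invented for illustration. For a test with a score standard deviation of 10 and a reliability of 0.91:

\[
\mathrm{SEM} \;=\; \sigma\sqrt{1-\rho} \;=\; 10 \times \sqrt{1 - 0.91} \;=\; 3
\]

A 95% confidence band around an observed score of 70 is then roughly \(70 \pm 1.96 \times 3\), or about 64 to 76, so an apparent ‘gain’ of four or five points may sit entirely within measurement error.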

 

  5. Are you using assessment to create learning, not just record the residue of it?

Assessments or tests have traditionally been used to measure learning. However, a growing body of research demonstrates that high-quality tests (think recaps, quizzes, and termly tests rather than just past papers) are better learning opportunities than repeated study or revision. One example of such research is Roediger and Karpicke’s (2006) ‘Test-enhanced learning: Taking memory tests improves long-term retention’ (Psychological Science, 17(3), 249-255). The evidence indicates that the act of responding to questions thoughtfully strengthens a student’s learning; practice testing using well-crafted questions can actually promote learning, making assessment more than simply a tool for recording data about learning.

 

Evidence Based Education are the creators of the Queen’s Award-winning Assessment Lead Programme, Assessment Essentials, and the new Science of Learning Programme. Trusted by schools around the world, they provide engaging teacher professional development and school improvement services to improve learner outcomes worldwide and for good.

 


 

CONTACT THE AUTHOR

 

Jamie Scott, Director of Partnerships & External Relations
Evidence Based Education


 

Real World Motivation

Rachael Thrash, The International School of Helsinki
Ellen Heyting, The International School of Helsinki and Monash University

Real World Motivation: Harnessing Authentic Audience to Transform Assessment for All Students.

Today’s K-12 educators are diligently moving away from traditional lecture and note-taking methods of teaching towards constructivist views. In daily lessons, we strive to put the learner at the centre of educational experiences. Yet, when it comes to assessment, we are dragging our feet. If we accept the value of these constructivist pedagogies, then surely we must acknowledge that it is time to challenge our traditional notions of assessment as well?

The hallmark of traditional school work is the essay. Its ultimate audience, the teacher, assesses the level of understanding a student demonstrates. This approach puts students and teachers in a transactional relationship. And, not surprisingly, students who feel prepared to perform in this way may find the process satisfying: their efforts yield positive recognition from the authority.

But what about the other learners? How does this model strike them? And even for those ready to show their learning in this format, how does it encourage personalization, risk taking, and empowerment? By limiting the students’ audience to their teachers, we establish a primarily hierarchical relationship between teachers and students. Worse, we miss the opportunity to encourage all of our students to find their voices.

The Inspiration of an Authentic Audience 

In contrast to a student who has just completed an essay, picture Ayla, a 10th grader who faces attention- and language-based learning challenges. Recently, Ayla acted as a docent at a local restaurant where students shared their work in a resistance art installation, “Question The Narrative: Young Artists Challenge Norms That Promote Injustice.” Ayla’s piece exposed Nestlé’s abuse of child laborers and included a call to action for consumers. Watching a visitor view her work, she exclaimed, “I have goosebumps! I can’t believe someone actually cares!”

Yolanda, a highly precocious student, also grappled with the challenge of creating a meaningful resistance piece for the installation. She pursued several ideas and ultimately decided to expose society’s unhealthy version of success, symbolically repurposing a photo of Justin Bieber on the cover of Forbes to question societal values and role models. Her work was both personal and impactful: she explored her own concerns about success and shared them with an audience. The restaurant staff marveled at the customer discussions her piece sparked.

Assessment Constructed to Support Students Connecting with an Audience 

These students and their diverse classmates responded to a complex unit digging into hegemonies and resistance. They worked to expose unjust power structures to an audience beyond the teacher assessing them. Alongside this performance challenge, students developed their abilities by analyzing resistance texts and researching hegemonic structures. Assessment on this work became an important feedback mechanism. Created with intention and student growth in mind, it broke down the process of creating a meaningful resistance piece into manageable steps; students practiced disciplinary skills until they developed confidence.

Traditional Assessment vs. Authentic Assessment 

Problems with assessment arise when the work is an exercise with a foregone conclusion. The same students always succeed, while the students who don’t know the answer or feel unsure of their abilities lose motivation. If a student’s goal is only to show that they can identify symbolism in a resistance text, knowing that the teacher has a particular definition of symbolism in mind, where is their sense of agency? How can they feel empowered?

It is easy to find the shortcomings of traditional essay-based assessments. Tests are based on ‘unseen’ questions, whereas in authentic assessments as much as possible is known about the task ahead of time, and students have had a chance to practice, get feedback, and prepare. Authentic assessments are iterative by nature, involving students and teachers working together to co-construct new understandings of the world; traditional assessments simply take a ‘snapshot’ of a student’s performance at one point in time. While traditional assessments ask students to reproduce a correct answer, authentic assessments are open-ended and allow for student agency and voice.

Traditional methods of assessing offer the illusion of learning in our students. But ask students to take what they’ve learned and transfer it to a new situation, and they may lack the deeper understanding or flexibility to do so. Gardner (1993) argues that authentic assessment tasks must ask students to solve a real-world problem or create a product with someone else’s needs in mind, work that has value beyond the classroom walls.

Value of Supporting Authentic Learning Through Audience 

When we talk about real-world experiences, we are showing young thinkers that their opinions matter and have value beyond an achieved grade. Students seek teacher feedback when they know they will share work with a larger audience. The teacher’s role moves from arbiter of success to coach. Mistakes become opportunities for improvement rather than reasons to justify a lower grade. Students trust the teacher to help them find their voice.

Consider this range of opportunities for students to share their work: 

  • Class magazine
  • Interviews with community members
  • Teaching the parent community
  • Raising awareness for a local organization
  • Helping other students
  • Displaying work in a public space

 Does the Extra Effort Yield Results? 

We don’t claim to have all the answers to these complex questions. But we have seen the transformation authentic assessments can bring about in our students. Our 10th graders are visibly enthusiastic about identifying concepts they developed in the resistance unit as they read Harlem Renaissance poetry. Even better, they willingly share their voices in a poetry slam. They have taken ownership of their learning experiences and find value in the process.

Yes, authentic assessments are imperfect, risky, and messy, but we’re getting better at them. The more educators and schools move towards this type of pedagogy and assessment, the more we will all learn how to deal with the ambiguities and execute them well. One could argue that using authentic assessments is itself an authentic assessment of our own teaching. If we are willing to take risks in the real world, we model the perseverance and creativity we hope to inspire in our students.


Wiggins’ (1998) characteristics that make assessment authentic:

  • The assessment is realistic; it mirrors how the knowledge or skill would be used in the real world.
  • The assessment requires judgment and innovation; the task is open-ended and has more than one right answer.
  • The assessment incorporates skills that are required in the discipline being studied.
  • The assessment is done in contexts as close to the real world as possible.
  • The assessment involves a range of skills and deals with a complex problem that requires some degree of informed judgment or choice from the student.
  • The assessment is iterative and allows for feedback, practice, and redos.

 Works Cited 

Gardner, H. (1993). Frames of Mind (2nd ed.). New York: Basic Books.

Wiggins, G. (1998). Educative Assessment: Designing Assessments to Inform and Improve Student Performance. San Francisco: Jossey-Bass.

ABOUT THE AUTHORS

Rachael is a teacher and social justice advocate who has worked in schools in Poland, the United States, and Finland. She is passionate about inclusive education and its capacity to empower every student to make positive change.

 

Ellen is a teacher and educational researcher who has worked in schools in Melbourne, Beijing, Singapore and Helsinki and who believes education can be a force to unite people, nations and cultures for an equitable and sustainable future.
