School Performance Institute


Getting on the Same Page with Operational Definitions

In September and October, I introduced the Plan-Do-Study-Act (PDSA) cycle and gave an example of the cycle in action. A critical component of the Planning phase is the idea of operational definitions. The concept is straightforward: language must be made operational in order to perform the basic functions of an organization. To put it another way, an operational definition puts communicable meaning into a concept. Concepts that are important to schools, such as attendance, engagement, and learning, have no communicable value until they are expressed in operational terms. Or, as W. Edwards Deming put it in Out of the Crisis:

An operational definition is one that people can do business with...Misunderstandings between companies and between departments within a company about alleged defective materials, or alleged malfunctioning of apparatus, often have their roots in failure on both sides to state in advance in meaningful terms the specification of an item, or the specifications for performance, and failure to understand the problems of measurement.[1]

Edward Baker picked up this line of thinking about the importance of operational definitions in The Symphony of Profound Knowledge:

Deming warned management that misunderstandings and conflicts between people who do business together often are rooted in their failure to state in advance and in operational language how they will know when a commitment of one to the other has been fulfilled.[2]

Operational definitions are vital to doctors, vital to lawyers, vital to technologists, and in reality, vital to every sector. The education sector is no different, and thankfully, the idea of well-specified definitions for the most critical concepts was taking hold in my study of Deming’s methods just as the pandemic hit in Spring 2020. Because of the speed with which schools had to transition to remote learning, we quickly discovered that although the four schools within United Schools Network were reporting student engagement levels, there was no shared definition for this important concept. USN’s two middle schools, which share a school model, are located just five miles from each other. However, as we dug into student engagement rates and discussed the methods for calculating those rates, it became apparent that the two schools were working from very different conceptions of engagement. There wasn’t a right or wrong definition, but we did have to work through a process of deciding which definition would give us the best data for supporting students under the unique circumstances of the shutdown.

One middle school defined student remote learning engagement as follows: A CCA-Main St. student has demonstrated engagement in a remote lesson if the teacher assigns one of three engagement values for the practice component of the lesson: “Completed Online”, “Completed Paper Copy”, or “Completed but Poor Effort.” The other middle school defined engagement this way: A CCA-Dana student has demonstrated engagement in a remote lesson if the teacher assigns a score for the assignment (i.e., 5/5), if the teacher assigns a “C” to indicate that the lesson was complete, or if the teacher assigns a “*” to indicate that the student did not complete the lesson but showed some engagement.
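
To make the contrast concrete, here is a minimal sketch in Python of the two definitions treated as classification rules. The record fields ("practice_value" for CCA-Main St. and "mark" for CCA-Dana) and the sample record are hypothetical stand-ins for the schools’ actual gradebook data; the point is only to show that the same partially completed lesson counts as engaged under one definition and not the other.

    # Hypothetical sketch of the two schools' original engagement definitions.
    # Field names and sample data are illustrative, not USN's actual systems.

    # CCA-Main St.: engaged only if the practice component carries one of
    # three completion values assigned by the teacher.
    MAIN_ST_ENGAGED_VALUES = {
        "Completed Online",
        "Completed Paper Copy",
        "Completed but Poor Effort",
    }

    def engaged_main_st(record):
        return record.get("practice_value") in MAIN_ST_ENGAGED_VALUES

    # CCA-Dana: engaged if the teacher assigns a numeric score (e.g., 5/5),
    # a "C" for complete, or a "*" for incomplete work that still showed
    # some engagement.
    def engaged_dana(record):
        mark = record.get("mark")
        return mark in {"C", "*"} or isinstance(mark, (int, float))

    # A partially completed lesson: no completion value recorded, but the
    # teacher noted some engagement with a "*".
    partial_lesson = {"practice_value": None, "mark": "*"}
    print(engaged_main_st(partial_lesson))  # False
    print(engaged_dana(partial_lesson))     # True

Applied across all of a school’s lesson records, these two rules would produce noticeably different engagement rates even for identical student behavior.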

In the case of CCA-Main St., a student had to complete the lesson practice or question set in its entirety. The first part of CCA-Dana’s definition is similar in that lesson completion is required; however, the last part of the definition opened the door to counting students as engaged if they had only partially completed a lesson practice set. Without a shared operational definition across the two middle schools, it is quite clear that when the staff of CCA-Main St. talked about student engagement, they meant something altogether different from the staff at CCA-Dana. It is easy to see how comparisons between the two schools could go off the rails quickly if these fundamental differences between the definitions went unrecognized. It is not hard to imagine a scenario in which the lack of a shared definition of student engagement led to praise at CCA-Dana and admonishment at CCA-Main St., when the gap was really due to a less rigorous definition in use at CCA-Dana rather than a higher-quality remote learning program there.

Thankfully, this isn’t what happened, but my point is that this sort of apples-to-oranges comparison occurs frequently across all sectors. In our case, teams from both schools worked to develop a shared definition of remote learning engagement, and we settled on this version: “A USN middle school student demonstrates engagement in a remote lesson by completing the accompanying practice set in its entirety.” With the definition in place, we were then able to turn our attention to building a measurement system that allowed us to assess the process capability of our remote learning system.
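
For readers who want a sense of what such a measurement system might compute, here is a minimal sketch, in Python, of a daily engagement-rate calculation under the shared definition. The field names ("date", "practice_complete") and the sample records are assumptions for illustration only, not a description of USN’s actual data system.

    from collections import defaultdict

    def daily_engagement_rates(lesson_records):
        """Percent of lesson records per day where the practice set was
        completed in its entirety (the shared USN definition)."""
        totals = defaultdict(int)
        engaged = defaultdict(int)
        for rec in lesson_records:
            totals[rec["date"]] += 1
            if rec["practice_complete"]:
                engaged[rec["date"]] += 1
        return {day: 100 * engaged[day] / totals[day] for day in totals}

    records = [
        {"date": "2020-04-06", "practice_complete": True},
        {"date": "2020-04-06", "practice_complete": False},
        {"date": "2020-04-07", "practice_complete": True},
    ]
    print(daily_engagement_rates(records))
    # {'2020-04-06': 50.0, '2020-04-07': 100.0}

Rates calculated this way, day after day, are the kind of data that can then be plotted over time to judge whether the remote learning process was stable and capable.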

This example from the transition to remote learning illustrates the power of operational definitions. They make it possible to share meaning for the concepts we measure, so that we can avoid misunderstandings and conflicts, communicate clearly during improvement efforts, and have a clear yardstick for telling whether those efforts did in fact lead to improvement.

***

John A. Dues is the Chief Learning Officer for United Schools Network, a nonprofit charter-management organization that supports four public charter schools in Columbus, Ohio. Send feedback to jdues@unitedschoolsnetwork.org

Notes:

[1] W. Edwards Deming, Out of the Crisis (Cambridge, MA: MIT, Center for Advanced Engineering Study, 1986), 277–278.

[2] Edward Martin Baker, The Symphony of Profound Knowledge: W. Edwards Deming’s Score for Leading, Performing, and Living in Concert (Bloomington, IN: iUniverse, 2017), 76.