Reflection on schedule change process

Last school year, a group of teachers—Brad, Elizabeth, Bobby, Paco and I—sat down together in the library and hatched an idea. The faculty had been told to create groups, called Professional Learning Communities (PLCs), with the aim of collaboratively taking on a project to improve teaching and learning within the school. Teachers formed a variety of PLCs as a result, with goals ranging from studying the impact of library research in English classes to remaking the middle school mathematics curriculum. The five of us set ourselves a fairly ambitious goal: we would reform our school’s bell schedule.

This is a reflection on the process of our PLC, not a narrative, so I will jump to this year: we designed a new schedule that our school implemented in the 2015-16 school year. I must admit to being more than a bit relieved to see the data from our student and teacher survey, three months in, confirm what I had suspected: it was successful. I think it’s worth reflecting on why this was so, on how our project worked well when many do not. I write this now in part due to a recent conversation with Elizabeth. She is excited about a new proposal for a “makerspace”, which she sees as having the potential to innovate the way the school teaches design and entrepreneurship. While discussing this proposal, she mentioned that she’d like to emulate the process we used for schedule change in her new project, and that she was frustrated that others proposing similar projects weren’t approaching them in the same way.

Going into the project, I didn’t think our process was very special. We had focused on taking some defined steps: research, consult, ideate, consult, design, pilot, consult, tweak, implement, consult. The repeated consultation was perhaps the most important part. Since the new schedule would affect everyone in the school, it made sense to our group to consult not just on the final proposal but also on faculty and student needs, in order to create a design that fit those needs. Since I was the spokesperson for our PLC to the faculty, I ended up with a reputation as “the survey guy”, and I accepted it with pride.

However, it took a lot of focus and determination to ensure that the process went smoothly, especially since we didn’t have a clear mandate from our administration. I noticed throughout that most of our group seemed to be outcome-focused. As we were interested in school scheduling, of course we wanted to spend our meetings discussing the merits and disadvantages of particular schedule models, and so we often ended up sidetracked into interesting but unproductive conversations, running two-hour meetings, or getting into implementation details when we should have focused on stakeholder needs. Even with the first-rate team we had, it was a struggle to maintain focus on the process; yet in retrospect, as Elizabeth’s recent comment suggests, the process was one of the most important parts of the project.

I mentioned the lack of mandate, and this is worth explaining. Certainly we received support from the administration, including the school sending me, along with two administrators, to visit schools in Wisconsin and Colorado that used alternative schedule models. This research trip was invaluable to my own conceptualization of the possibilities and limits of a bell schedule, and I suspect we wouldn’t have completed the project without it. However, our PLC had neither a deadline nor a requirement to present any particular recommendation. Indeed, early on we discussed the possibility of taking two years to design and pilot, and at points it seemed like the administration would not make a decision on the schedule until the next school year. Here I learned the value of making a formal proposal: wanting to implement change more quickly, we put together an 11-page document outlining our research and consultations and asked the administration for a formal meeting to decide whether to proceed. The administration’s agreement in that meeting effectively gave us our first real mandate.

Our group was somewhat peculiar in that it was non-hierarchical; most other instances of change I have observed in the school context are quite hierarchical and seem to lack information-sharing. I suspect this occurs for different reasons: the decision-makers may not want to share information, may not realize the value of sharing it, or may not know how to disseminate it effectively. In our PLC we made a conscious decision to default to sharing everything. We made our ideas public in a teacher forum even before we knew whether they were workable, we shared survey results as detailed data visualizations with all stakeholders, and I have made all the documents relevant to the process available on my website. I believe that this openness was crucial. Whether teachers and students actually read the many survey results we put out is almost beside the point; by publishing our data, we engendered trust in the process and socialized the change effectively.

Two other change processes run recently at our school offer interesting comparisons and contrasts. First: another PLC decided to look at the several “learning management systems” our school was using and consolidate them into the best possible option. This group ran an extensive pilot program open to any teachers who wanted to be involved, at all grade levels, then consulted extensively with the pilot group and made a decision that has been a great improvement over last year’s fractured system. Their process seemed open and was built from the ground up, with the decision made by teachers. Second: our school is also running a process to change its teacher compensation structure, which has been set up with a hierarchical committee structure and a clear mandate for change. While some of the teachers involved (bravo to them!) have endeavoured to consult with faculty, this consultation has been ad hoc and the committee’s deliberations have not been made public. The school is now sharing the new compensation plan with faculty, but as far as I know, no data on how the decisions were reached has been or will be made public. There is certainly no obligation to do so, but my discussions with teachers so far lead me to suspect that the lack of transparency may become a problem, leaving the stakeholders (the teachers) feeling distant from the process.

* * *

Returning to the schedule change process, there was one point where communication broke down and where both the process and the result fell short of what we had hoped, judging from my own point of view and from teacher feedback. I’ve spent a fair amount of time thinking about this, first looking for someone to blame and then trying to learn from it.

Here’s what happened. When we proposed the pilot week to the administration, the best schedule we had at the time had an early dismissal on Tuesday. Since our then-current schedule had early dismissal on Wednesday, a switch of days would require the elementary school to change its schedule as well. We discussed this with our admin, who said they would raise it with the Elementary admin. The pilot was approved in early February to take place during a week in April. Some time between the approval and the pilot week, the PLC discovered that we could improve on our proposed schedule (for example, we could make High School lunch times the same across the week) if we moved the early dismissal to Monday!

We had a dilemma at that point. Some in our group argued for changing the pilot week schedule to this new, improved version. I pushed back. We had already announced the pilot week schedule to students, staff and parents, so I argued that we could make the changes after pilot week, when the schedule was implemented. After all, we were planning to gather a lot of detailed feedback from pilot week and might propose other changes as well. I won the argument, and we kept the schedule as originally proposed for pilot week.

After the April pilot week, our group ran a massive survey of almost all students and faculty, as well as a fair number of parents, and I put together a visualization of this data which told us what we had already suspected: the proposed schedule was very popular, but we should make a few changes, including the ones we had discovered were possible by moving early dismissal to Monday. We sat down with the administration and proposed our improved model, only to discover that changing the early dismissal day would no longer be possible. Apparently the Elementary administration had agreed to Tuesday and had been told by our administration that this would not change. Our improved schedule was dead on arrival. As a result, we were not able to respond to much of the feedback from pilot week, which the faculty noticed. Indeed, in the feedback we received about our process, the only significant critique was that we did not seem to listen to any of the pilot week feedback. (Since we published all our data, the stakeholders knew exactly what that feedback was!)

My initial reaction, being professionally and emotionally invested in this process, was immense frustration. In my view at the time, we had been promised a process that was responsive to stakeholders’ input, and this was being undermined by what seemed like a political maneuver. I wanted our admin (and the superintendent) to go to bat for us with the Elementary admin and argue for the benefits of the improved schedule. I was also never given any pedagogical reason why Elementary could not move to a Monday early dismissal; I was only told that their admin had already started planning around Tuesday.

If I had written this reflection last summer, it would have been entirely a dissection of the early-dismissal “debacle”, but time has given me a more considered view of those events. There was indeed a lack of communication between my group and our administration regarding the constraints on our plan. There were also mismatched assumptions between the two groups. Recalling the discussions we had when the pilot was approved, the administration may have assumed that our pilot schedule was the final version, whereas our group had always conceived of the pilot as a test schedule we could improve on with feedback. There were also legitimate (though not entirely compelling) reasons for Elementary not to change to a Monday early dismissal. All of this speaks to me of the need for clear communication among all decision-makers in any change process. From a strategic point of view, we should have listened to those in the group who wanted to change the proposed schedule before the pilot week, and communicated that change of plans earlier, when it was still up for negotiation.

However, ultimately this was a minor bump on a largely successful road to schedule change. We ended up with a suboptimal schedule, but I am hopeful that this can be remedied next year. Our survey data from this year shows that faculty and staff are mostly very happy with the schedule, and the process as a whole was lauded by stakeholders for its openness. I’m hopeful that the school can continue to make progress through such faculty-driven (or student-driven) change processes in the future, and I know that I will take the lessons learned into future projects.