Today marks the due date for the MNPS School Board’s evaluation of the Director of Schools. Dr. Joseph has already completed his self-evaluation (SJoseph Summative Self-Eval Evidence Companion – June 2018 to Board 06.1…) and widely shared it with the public. A more cynical man might think this is an effort to control the narrative before the board writes its own. In discussing these developments and their appropriateness, I felt it was worth checking actual board policy.
Per board policy, as posted on the MNPS web page: “Both the board and director will prepare for the evaluation; the director will conduct a self-evaluation and board members will document the evidence used in rating the director’s performance.” So based on the stated policy, the director is in compliance with the letter of the policy, though I am not so sure about the spirit. The term “self-evaluation” is not actually defined, so it is open to interpretation.
My interpretation is that a self-evaluation is done by the “self” and is not a document written and compiled by a staff member, though I understand that this “ghost writing” of the director’s self-evaluation is not without precedent. I have to question, given that the district is in a financial crunch, what the cost of compiling this self-evaluation was and whether those resources could have been better utilized elsewhere.
Board policy states that all documentation will be supported by objective evidence. “Objective,” another term that begs for definition. The director’s self-evaluation is rife with data, but I’d argue that it is not “objective” data. Rather, it is data that supports the narrative he would like to spin.
In order to have objective data, the board and the director would have to agree on what data would be used in the evaluation, and that data would be made available to all parties. Currently the only one who has access to the raw data is the director, who picks and chooses what he wants to share. Take away that control, and board members and the director would then be drawing their own interpretations of the agreed-upon data and the evaluation would be more authentic. As it is, board members are not making evaluations based on the actual data, but rather based on interpretations of data that supports a desired narrative. Spock would be pissed.
In order to effectively utilize the agreed-upon data set, an appendix would need to be added. There is no appendix included in the director’s self-evaluation, and therefore members are not privy to how the information was gathered, what the norms are, what the ranges are, nor the intended use of the data. Quick… tell me what the average RIT score of a 4th grader is? My point exactly.
We need to add further clarification on MAP testing here. MAP testing is a measurement of growth. When you see that “students in grades 2-8 exceeded the national average in reading and math proficiency,” you need to recognize that what that means is, when nationally normed, MNPS students grew at a slightly faster rate than students at the same level nationally. They are making more growth than national peers, but not necessarily performing at a higher level. That is good news, but it still needs to be kept in perspective.
Earlier in the week, I told you about the director touting progress on central office culture because the number of people answering “making progress” had risen in regard to the question, “Do you work in a trusting environment, that allows for an open exchange of ideas?” This was in spite of the fact that the number of people who answered “yes, absolutely” had dropped from 19% to 3%. The same problem potentially holds true with MAP results. By not looking at the proficiency and growth scores together, we run the risk of celebrating one while the other falls.
MAP testing is traditionally done three times a year. We’ve only done it twice a year in both the years we’ve administered it, and this year we moved the spring window to the winter window based on the supposition of “test fatigue.” Again, there is no evidence that the lower test results in the spring of 2017 were a byproduct of test fatigue, or that test fatigue actually existed. Perhaps if the test had been given multiple times with fidelity, we would have seen indications of a trend that pointed towards test fatigue, but instead we chose to create a narrative and rig the game by not offering the test at the time intended, before supporting evidence could be gathered.
One last caveat here on the performance data collected – MAP, attendance, or any other data – it needs to be broken down into snapshots reflecting individual schools. If I say all MNPS students are growing at a rate of 54% over the rate of their national peers, what does that mean? If we break it down by ethnicity, it still doesn’t tell us anything. Are you going to argue that the African-American child enrolled at Buena Vista is identical to the African-American child enrolled at Eakin? What real significance is there if the AA child at Eakin is growing at a hypothetical rate of 68% while the child at Buena Vista is growing at a hypothetical rate of 40%? That gives you a composite 54% growth rate for AA kids. The same holds true for attendance figures and other key performance indicators.
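To make the arithmetic concrete, here is a minimal sketch using the hypothetical 68% and 40% rates above. The school names and figures are the post’s illustrative examples, not actual MNPS data; the point is simply that an equally weighted composite hides the gap between schools.

```python
# Hypothetical school-level growth rates from the example above.
growth_by_school = {
    "Eakin": 0.68,        # illustrative rate, not real data
    "Buena Vista": 0.40,  # illustrative rate, not real data
}

# The district-wide composite looks tidy...
composite = sum(growth_by_school.values()) / len(growth_by_school)
print(f"Composite growth rate: {composite:.0%}")  # 54%

# ...but it conceals a 28-point gap between the two schools.
gap = max(growth_by_school.values()) - min(growth_by_school.values())
print(f"Gap hidden by the composite: {gap:.0%}")  # 28%
```

The same masking happens with any averaged indicator: the composite is only as informative as the school-level snapshots behind it.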
Interestingly enough, back in January I asked Dr. Paul Changas, who oversees MNPS’s data accumulation, if it was possible to get growth scores in individual schools for the different levels of kids. It was my feeling that if a parent knew a school was as good at producing growth for high-end kids as it was for low-end kids, more parents may invest in MNPS. I was told that MAP couldn’t really produce that measurement. Yet, here in the director’s report is MAP data supporting the growth of kids in advanced academics. See my point? The director needed support for his narrative and the data to support that narrative is made available to him, but perhaps not to others.
We could spend the next 6 months citing similar examples and sparring over the director’s interpretation. The bottom line is that the whole argument/discussion would be based on in-house interpretation. In order to have a robust discussion, everybody needs full access to ALL the data – data that has been collected with agreed-upon methods, at agreed-upon intervals, with agreed-upon tools. Not like in the director’s self-evaluation, where the data is delivered at varying intervals and in some instances, like the teacher retention data, is a year old.
The next part of my analysis of policy is going to disappoint some sitting board members who I know have worked very diligently on their evaluations. The director evaluation policy states, “A part of the evaluation may be a composite of the evaluation by individual board members, but the board, as a whole, will meet with the director to discuss the composite evaluation.”
My interpretation here is that individual board members will submit their evaluations, which will then be molded into a composite evaluation. There is no definition of who will be responsible for creating the composite evaluation, though I assume it will be the committee chairs. There is also no definition of to what extent individual board members’ concerns need be included in the composite evaluation. It could be that those individual evaluations are given minimal consideration and never see the light of day.
So back to my initial question: is Dr. Joseph following the letter but not the spirit of the law? I don’t know because the policies listed are vague at best. Further complicating things is that I’m not sure if the policy cited is actually the current policy or the soon-to-be revised policy. It is listed on the MNPS web page under board policies. However, the page does contain the caveat that “until the policy revision process is completed it will be necessary to look diligently at both sets of policies in order to assure that the most current version is referenced. The process for policy revision is scheduled to be completed during the summer of 2018.” I can’t find the old policy, so I can’t compare.
Sections 1, 2, and 3 have been passed. Sections 4 and 6 of the board policies are currently up for review, but section 5, where the director review lives, is apparently in limbo. So who knows what the actual policy is? I wonder whose definition of exceeding expectations this meets? We continue to try to fly the plane while we build it.
WHAT’S $250K AMONG FRIENDS?
At Tuesday’s board meeting, a very troubling conversation took place. As you know, this has not been a very kind budgetary season for MNPS. They wanted an increase of $45 million and they got $7 million. Then they had to ask for $3.5 million out of the rainy day fund due to some unforeseen circumstances. Now Chief of HR Deborah Story was in front of the board asking for an extra $250K to cover overages on the contract with Education Solution Services. ESS is who we hired last year to help address the district’s sub shortage. Let’s go to the video on this one. The exchange over the contract takes place at about the 15:06 mark of the video.
Story shows up with no supporting documentation, no explanation, and a caveat that there may be smaller requests forthcoming. No definition of what smaller means is given. $100K or even $200K is smaller than $250K. Nobody but board member Jill Speering takes exception to this.
Story continues throughout to assert that there was no way to predict the overage. When asked when the invoices come in, Story replies, “Monthly, sometimes, early on less frequent…” So Story is telling me that HR is incapable of managing billing cycles? That if a vendor doesn’t submit bills in a timely manner, we don’t hold them accountable?
Story was doing so poorly at her explanation that her assistant Sharon Pertiller took it upon herself to just stride up to the microphone and start talking. Is that how we do things now on the board floor? You don’t need to be recognized, you just step to the mic?
Pertiller is clearly exasperated that she has to step up and explain to the board that there was no way to predict how successful this program was going to be. How successful was the program? Very. What does that mean? Apparently that is just what it means, very. Are we using ESS next year? Nope, they are too expensive because there is a 27% up-charge on every sub we use. Was that 27% up-charge included when the proposal was brought forth in September of last year? The answer to that is yes. The whole conversation has a very “who’s on first” feel about it.
On a quick side note, this concept of “too expensive” bugs me. What does that mean? If we recognize that in order to increase our retention of teachers we must increase the fill rate at schools, then the evaluation needs to start with: was the program successful? How successful? If it was indeed successful and it costs a lot of money, the next question needs to be, is there anything out there that would give us similar results at a lower cost? If the answer to that question is no, then the program isn’t expensive; it’s just what it costs.
If the priority in question is one of our top priorities, then we need to meet that cost of doing business and look at something else to cut. It’s the same with Reading Recovery, the Universal Screener, and paying for advanced academic tests. If the program is delivering results and meeting an established top priority, then we need to pay for it. Period. Or get different top priorities.
Back to ESS. In the end the board approved the additional $250K, but nobody mentioned where the money would come from or what would have to be sacrificed in order to fund this overage. I could probably name a half a dozen other areas where similar incidents have taken place and the questions were never raised then either. Nobody seems to connect these overages to the fact that we need to go into the fund balance for $3.5 million. In other words, the budget is so tight in a $900 million budget that we can’t find $3.5 million, yet it’s no big deal to approve an additional $250K. We can’t spend $1.3 million on advanced academics tests, but with a shrug we approve a $250K overage. Does something seem, to quote the bard – that’s Shakespeare to those of you who don’t read the classics – rotten in Denmark?
For me, this whole interaction speaks to a lack of respect for the MNPS school board. To show up with no supporting documentation or real explanation for a substantial overage is a sign of disrespect. To just approach the board without being recognized by the board is a sign of disrespect. To have no systems in place to ensure that overages don’t occur is a sign of disrespect for the board. Remember, the board is an extension of the taxpayers. Disrespect one and you risk disrespecting the other.
Congratulations to everyone who worked so hard on the 2018 Music City SEL expo. It was a fantastic event and one that is heading to the Music City Center next year. I look forward to attending again. Kyla Krengl and her team should be extremely proud of their work.
Riddle me this. At the school board meeting this week, I heard Paul Changas and Doug Renfro talk about why they couldn’t do a proper RFP because of the length of time that it took to input and migrate data – 6 months. Yet the initial contract that was brought forth to the board was only for one year. Knowing how labor intensive the transition would be, why wasn’t a longer contract initially brought forth?
Charlotte Park ES has a new principal. Mrs. Julia Elmore is the new boss. She is the daughter of longtime MNPS HR associate Barry Potts. We’d wish her good luck, but we know that she’ll knock it out of the park.
The new Principal at Jere Baxter MS is former Pearl-Cohn AP Traci Sloss. Congrats are in order there as well.
Beloved McGavock ES principal Hildateri Smith resigned this week for personal reasons. She will be missed.
Yesterday I attended a wonderful event put on by the Power of 10 PAC. They announced their endorsements for the upcoming election. Not surprisingly, I didn’t win their endorsement. They endorsed Gini Pupo-Walker in District 8 and Aron McGee in District 6. They didn’t endorse anyone in District 2. I very much enjoyed the event and the conversations I engaged in while in attendance. This is a group doing good work and warrants support.
Tamika Tasby, who came from Atlanta to join Joseph’s administration with the directive to oversee professional development, is moving on. There is some question as to whether or not she led a single professional development session during her tenure. But nobody marked time for speakers at events like she did. Her official title was Executive Director of Innovation & Strategic Project Management Office. We wish her luck.
I have been remiss in announcing that Director of Visual & Performing Arts Nola Jones is retiring. Jones recently grew the Music Makes Us program. She will be missed.
According to the MNPS employee portal, there are still roughly 400 certified positions open in MNPS. Is that an accurate figure? Considering that human resources can’t keep track of substitutes, job interviews, applicants, or just about anything else… probably not. But it is the only number we have to go by. So… a month away from the start of school, there are still close to 400 openings.
In his remarks at today’s Music City SEL Conference – that have the fingerprints of MNPS’s communications department all over ’em – Dr. Joseph told attendees, “One of the things we are committed to doing this year is paying attention to our adults and our adult needs because one thing we recognize is that if we don’t feed the adults, they eat the children.” Sigh… the man never learns. I hope he attended a few of the sessions at the SEL conference. Modeling, do not forget, modeling is the most powerful form of teaching.
In all fairness, the comments referenced above were probably in reference to a book recommendation that had been making its way around the conference. But in a climate where next year, teachers are going to actually take home less money while being expected to do more, it was not taken that way by all, or even most. Doing SEL right requires a high degree of sensitivity and a willingness to empathize with others.
Congratulations are in order! Dr. Susan Kessler, principal at Hunters Lane High School, has been elected president of the Tennessee chapter of the Association for Supervision and Curriculum Development (TNASCD). TNASCD is a statewide professional organization whose mission is to provide an open forum for the analysis of educational issues. The membership collectively influences policy and serves as a catalyst for change. Through an open, diverse and expanding membership, this organization provides personal and professional renewal and the means for effective networking.
That’s another blog post in the bag. Hope y’all have an awesome weekend. Don’t forget to answer the poll questions. If you need to contact me, you can do so at Norinrad10@yahoo.com. I’m always looking for more opinions and will try to promote as many of the events that you send me as possible, but I do apologize in advance if I fall short and don’t get them all out there.