You Don’t Know Charlotte

While the rest of the nation is embracing Charlotte Danielson’s framework for evaluating teachers, much of the Garden State is just getting to know this New Jersey resident.

By Michael Yaple, Public Affairs Officer at NJSBA

ROCK STAR. That’s the kind of phrase people who work with Charlotte Danielson use to describe her. Danielson – a New Jersey-based economist turned teacher turned school administrator, and now an author and consultant whose writings are shaping teacher assessments nationwide – seems to have earned that acclaim within educational circles. At least, that’s the perception outside of New Jersey.

“What’s the phrase from the Bible? … ‘A prophet has no honor in his own country,’” said Catherine Thomas, a coordinator at The Danielson Group, the Princeton-based consulting firm founded by Danielson that trains local and state education officials about her educational philosophies.

Danielson herself jokes about any suggestion of celebrity outside of New Jersey, saying, “They say that you’re only an expert when you’re 50 miles away from home.”

Sure, Danielson’s concepts are used in scores of cities from New York to Chicago to L.A., and she’s been tapped as a principal consultant by education departments in at least eight states, ranging from Arkansas and Washington to Pennsylvania and Idaho. Sure, she’s consulted for entire nations, such as Portugal, Chile and the United Arab Emirates. And sure, Education Week magazine’s online bloggers have called Danielson a “teacher eval guru” whose framework has become “wildly popular.”

But to say she’s been entirely snubbed by her home state is a bit of a stretch. In New Jersey, she has consulted with the state Department of Education not only on improvements to the alternate route program for teachers, but also on the design and implementation of a new teacher-evaluation pilot program dubbed Excellent Educators for New Jersey, or “EE4NJ,” which involves 11 districts. Other districts, such as Cherry Hill, have been using Danielson’s teacher-evaluation concepts for years.

While there are many New Jersey education leaders still unfamiliar with Danielson, her framework for evaluating teachers is now gaining traction in the Garden State – especially with Gov. Chris Christie’s effort to make measuring teacher effectiveness a centerpiece of his administration’s education reform agenda, including tenure reform.

Genesis to breakthrough

Danielson traveled a crooked road to get where she is today. Born in West Virginia, she moved with her family to Princeton during high school. She graduated from Cornell with a degree in history – specializing in Chinese history, actually – and then went to Oxford University to earn her master’s in philosophy, politics and economics. After Oxford, she worked as a junior economist in think tanks and policy organizations. While working in Washington, D.C., she got to know some of the children living on her inner-city block – and that’s what motivated her to choose teaching over economics. She obtained her teaching credentials and began work in her neighborhood elementary school. Twelve years after Oxford, in 1978, she earned another master’s, from Rutgers, in educational administration and supervision.

She and her husband moved to New Jersey, where she worked her way up from teacher to curriculum director, then on to staff developer and program designer in several different settings, including ETS in Princeton, and later to developer and trainer for teacher observation and assessment. Those experiences shaped her vision of teacher evaluations.

The breakthrough for Danielson was her book, Enhancing Professional Practice: A Framework for Teaching, originally published in 1996. “Framework for Teaching,” as it’s often referred to, was one of several of her books published through the Association for Supervision & Curriculum Development.

“I didn’t create a ‘program,’” Danielson insists. “What I did is describe good teaching. That’s all I did.” Still, her book was seen as a tool for educators to understand and analyze the complexities of teaching, and it ultimately became the foundation for many teacher evaluations worldwide.

Framework Basics

Under Danielson’s program…er, framework…the qualities of good teachers fall into four domains: planning and preparation, the classroom environment, instruction, and professional responsibilities. Under those domains are 22 components, such as “demonstrating content knowledge” and “managing student behavior.”

Branching from those components are dozens of descriptions designed to help teachers clearly understand their level of expertise on a four-point scale from “unsatisfactory” to “distinguished.” For instance, when it comes to a teacher’s interaction with students, is the teacher sarcastic or inappropriate? (unsatisfactory) … generally good, but occasionally exhibiting inconsistent behavior like showing favoritism or disregarding student cultures? (basic) … friendly and showing care and respect? (proficient) … or reflecting genuine respect to the point where students seem to trust the teacher with sensitive information? (distinguished)

Trained observers

The concept of utilizing trained observers is key to Danielson’s framework. “I’ve never known a union activist who is opposed to the idea of evaluations. But they are opposed to bad evaluations,” she said. “It’s a fundamental principle of equity that anyone making a high-stakes judgment should actually be capable of doing it accurately and consistently.”

She added, “To suggest you can go into a classroom for five minutes with a little checklist and decide whether someone should get tenure is absurd.”

While Cherry Hill’s evaluation system was never that simplistic, it has evolved dramatically since the district adopted Danielson’s framework nine years ago.

Before that time, classroom observations consisted of a narrative report written by an observer after a lesson, said Stanley Sheckman, a Cherry Hill principal who was with the district when it changed its teacher-evaluation system in 2003. The annual performance report was generally a single page with a checklist that gave the options of “satisfactory” or “unsatisfactory.”

“There was little collaboration in this process,” Sheckman added.

Cherry Hill’s revamped system blended Danielson’s framework with learning principles from other experts such as Lauren B. Resnick, an educational psychologist from the University of Pittsburgh. The district’s updated model requires the observer to identify key aspects of the lesson, the classroom environment, and teacher-student interaction. After the observation, the administrator and teacher meet for a collaborative discussion. The teacher is also required to write a personal reflection after each observation.

That element of “personal reflection” on the part of the teacher is an important part of Danielson’s framework. “It’s not a fancy concept,” she said. “It’s where the teacher takes an active role intellectually in the evaluation process, rather than being a passive recipient in someone else’s feedback.”

That two-way dialogue seems to resonate with many teachers and administrators. “I’m hearing so many teachers say, ‘Going through this process, I’ve learned so much about myself,’” said Laura Morana, Red Bank Borough superintendent, adding that the framework “calls for a lot of reflection – and a lot of honest reflection.” Morana said teachers are asked how well they know their children, and how they see themselves as a vehicle for promoting the children’s development.

Like Cherry Hill, Red Bank instituted Danielson’s framework in a modified form several years ago. This school year, Red Bank implemented the framework full-scale and is now one of the 11 districts in the state pilot program for teacher evaluations.

Before instituting the changes, “We didn’t have the tools to go into the classroom to determine whether effective teaching was taking place,” said Morana. So Morana reached out to Danielson directly, and Danielson’s consultants were soon working in the district.

Morana has seen improvements: greater consistency in evaluations and more positive feedback from teachers. “We certainly learned of the power of establishing a common language to describe the art of teaching – and learning,” said Morana. “It’s not always about teaching; it’s about learning, too.”

Obstacles? A commitment of time is one. Morana said the more thorough evaluations are “a time-demanding effort.” And Sheckman in Cherry Hill notes that continual training and retraining of administrators and teachers is necessary to provide consistency.

Training the evaluators

Sheckman’s point is one emphasized by Danielson time and again: Evaluators need to demonstrate they know their stuff. “The minute this is high stakes – and certainly compensation would qualify as high stakes, as would dismissal or denial of tenure – it’s enormously important that anyone making the judgment demonstrate that they are capable of doing so, that they will be accurate, reliable and valid,” said Danielson.

“If the school district cannot guarantee that, they’re going to get challenged, and I think they’re going to get challenged in court,” she added. “School boards and superintendents have to be alert to this possibility of legal challenge to the evaluative judgments made by administrators, and they better have an answer to it. The best answer to this challenge is to have a process for training and certifying evaluators, so superintendents and school boards can have confidence in their judgments.”

The Big Questions

Danielson doesn’t claim to know all the answers. “Our understanding of what constitutes good teaching has evolved, and will continue to evolve,” she said. “Anyone claiming to be an expert should also recognize that.”

When it comes to teacher evaluations, it sometimes seems there are two camps. On one side are the people who say teaching is more of an art that can’t be measured accurately. On the other side are those who say every other job is measured, and the pay is tied to performance, so there should be no special mystery about teaching. Danielson sees both sides.

“Teaching is enormously complex work, and anyone who doesn’t appreciate that hasn’t tried it,” she said. “On the other hand, we do know what is good teaching. It’s not that it’s an art form and there is no answer as to what good teaching is, and that you can’t measure it. Of course you can. You use complex assessment systems, like for any complex performance.”

But one of the big debates revolves around the use of student assessments to measure teacher performance. In fact, one of the directives of the 11-district pilot program is to base part of the teacher evaluations on “learning outcomes” that include progress on statewide assessments such as the NJ ASK.

“In general, the degree of student learning is an indication of the quality of teaching,” Danielson stated. “No one, no teacher or even union activist, would disagree with that.” But the tricky part is deciding what counts as evidence, and how that evidence can be attributed to individual teachers. For instance, a third-grade teacher’s students might post higher reading scores. But what if there’s a reading specialist in the building? How much of the increase do you attribute to the teacher? Numerous other variables can likewise make it hard to base teacher assessments on students’ test scores. “Until those issues are sorted out,” Danielson said, “I don’t think high-stakes decisions should be based on student learning results.”

Then, of course, there’s merit pay. It’s the $64,000 question. In New Jersey, which spends more per student than any other state, it’s more accurately the $24.7 billion question. That’s how much New Jersey spent on public education last school year, according to the National Education Association’s most recent Rankings & Estimates report. Linking teacher evaluations to pay is a high-stakes issue, especially when one considers that easily three-fourths of a typical school district’s budget goes toward salaries and benefits. So how does a school district go about connecting pay with performance?

Again, Danielson doesn’t claim to have the answer for how to link teacher evaluations with performance pay. “I honestly don’t have an answer to that,” she said. “I understand people are looking for something. It’s nowhere close to my area of expertise.”

But there’s one area that worries her: when she hears of her methods being misapplied. “When people take my book, and use it in a very top-down punitive way as kind of a hammer to teachers, I regret that a lot,” she said. In short, she described it as swapping an old teacher-evaluation “gotcha” system for a new one.

In Demand

It’s a Thursday morning in January, and about 100 people are gathered for a daylong program at Rider University in Lawrence. The people are members of the EE4NJ Evaluation Pilot Advisory Committee, which is collaborating with representatives from the 11 pilot school districts. Their charge is to provide the state Department of Education with a report by the end of June, giving recommendations on how New Jersey should proceed with a statewide teacher evaluation system.

The morning’s speaker is Danielson, who’s accompanied by Mark Atkinson, founder of the company that created the technology and software to train the people who evaluate teachers. The two speak for well over an hour and a half, diligently answering any and all questions from the audience. But she can’t stay for the remainder of the daylong program … she needs to catch a train to Washington, D.C.

Charlotte Danielson is off to her next gig.

Michael Yaple is public affairs officer at NJSBA.

Published in NJ School Boards Association: School Leader, January/February 2012

"Danielson’s Framework for Teaching has been a revelation to me; the best analogy I can offer is that the Framework is like having voice-guided GPS to direct you to a destination, when before you might have only had a destination name and an outdated road map."

Pre-Service Teacher, May 2016

“[The consultant] gave the best PD I have seen in 15 years of teaching, and was the first to explain [the] Danielson [Framework] in a human way. Bravo.”

A teacher, June 2015

“I am so impressed with the Danielson Group consultants. They are all so real. Your trainers helped make [proficient] teaching stronger and steered [basic teaching] toward increasing effectiveness.” 

A principal, June 2015

"Due to your consultant's seamless and meaningful transitions, knowledge of content, and rapport with the audience, the room was alive with energy and it made us all feel ready to begin the year off with success."

"Never before have I seen a group of seasoned educators like your consultants master the art of communicating with an audience with varied levels of expertise and interests. The two days that I spent with your team, I walked away with a desire to use the rubric to truly enhance my own practice."

"I left with a renewed look at the rubric, thinking that the rubric is the Great Equalizer! We can ALL enhance our practice by using it as a tool and a roadmap to produce students who think and are ready for college and careers. THANK YOU!"

"Your consultants' presence and organization of the day will not only impact the new teachers that attended, but will make the year alive for a vast number of students this year."

"Our workshop focused on calibration and inter-rater agreement training, so it was directly aligned to our individual and collective work with teacher performance evaluation.  With new administrators on the team, this type of training is critical."

"We were highly impressed with our Danielson Group consultant and the workshop. We have nothing but positive things to share. Staff have been emailing us, thanking us. This is the most worthwhile presentation we've been to in a while."

"The workshop you provided was hands-on, interesting, practical, and respectful of time limits. I heard more positive feedback about this workshop from staff than I have about any other."

"We wanted to let you know how much we appreciated the flexibility and professionalism that your consultant provided in our unique context. It helped us to keep on track with our schedule at a critical time. For that we are truly grateful."

"Your consultant presented a perfectly differentiated learning experience for all our principals. They were highly engaged, as demonstrated by on-topic conversations using academic language, completion of tasks requiring evidence identification, and note taking and 'grading' during classroom videos of teaching."

"Our school principals said the Framework observation training was the best training they had ever had, including the training provided when earning their Master’s degrees."

"I have a principal who was so excited about the breakthrough work with her staff in special education. I am already getting my money back!"

"My concern about the extra time it would take to implement the Framework successfully was not accurate. It took about the same amount of time as our prior evaluation system, and the benefits in professional growth and increased student achievement were more than worth it."

"I want to truly thank you for the brilliant job that you did with our training. I got such positive feedback from the team. They feel re-energized and like they have a direction and new tools to do the job."