My school is currently trialling and evaluating different modes of professional development, so I did some reading to gain perspective on how to evaluate the effectiveness of professional development.
I read two articles about PD evaluation, both by Thomas Guskey, who seems to have made this his academic niche.
Article 1: What Works in Professional Development?
Abstract of “What Works in Professional Development?”: “A research synthesis confirms the difficulty of translating professional development into student achievement gains despite the intuitive and logical connection. Those responsible for planning and implementing professional development must learn how to critically assess and evaluate the effectiveness of what they do.” You can read the article with detailed comments by a broad group of educators here: https://kami.app/J15eQMqxFzOk.
While this article is interesting, it is also questionable: its quite sweeping conclusions seem to rest on just nine ‘valid’ studies, with all the other studies dismissed because of problems with their methodology. So is it simply too hard to do valid studies into the efficacy of professional development and its impact on student outcomes? Can we make any statements at all about the impact of PD on student outcomes? Jenny Gore would say she has evidence for the effectiveness of the Quality Teaching Rounds model.
Article 2: Does It Make a Difference? Evaluating Professional Development, by Thomas R. Guskey
Abstract: “Using five critical levels of evaluation, you can improve your school’s professional development program. But be sure to start with the desired result—improved student outcomes.”
Notes from Article 2: Does It Make a Difference? Evaluating Professional Development.
It’s an easy-to-read article, but I used Kami to highlight and distill key areas. My notes from “Does It Make a Difference? Evaluating Professional Development”, by Thomas R. Guskey, are here in Kami (you can add your own annotations).
- Evaluation is “the systematic investigation of merit or worth.”
- Good evaluations don’t have to be complicated. They simply require thoughtful planning, the ability to ask good questions, and a basic understanding of how to find valid answers. What’s more, they can provide meaningful information that you can use to make thoughtful, responsible decisions about professional development processes and effects.
- Some educators understand the importance of evaluation for event-driven professional development activities, such as workshops and seminars, but forget the wide range of less formal, ongoing, job-embedded professional development activities—study groups, action research, collaborative planning, curriculum development, structured observations, peer coaching, mentoring, and so on. But regardless of its form, professional development should be a purposeful endeavour. Through evaluation, you can determine whether these activities are achieving their purposes.
- At Level 1, you address questions focusing on whether or not participants liked the experience. Did they feel their time was well spent? Did the material make sense to them? Were the activities well planned and meaningful? Was the leader knowledgeable and helpful? Did the participants find the information useful?
- Some educators refer to these measures of participants’ reactions as “happiness quotients,” insisting that they reveal only the entertainment value of an activity, not its quality or worth. But measuring participants’ initial satisfaction with the experience can help you improve the design and delivery of programs or activities in valid ways.
- Level 2 focuses on measuring the knowledge and skills that participants gained.
- Although you can usually gather Level 2 evaluation information at the completion of a professional development activity, it requires more than a standardized form. Measures must show attainment of specific learning goals. This means that indicators of successful learning need to be outlined before activities begin. You can use this information as a basis for improving the content, format, and organization of the program or activities.
- At Level 3, the focus shifts to the organization. Lack of organization support and change can sabotage any professional development effort, even when all the individual aspects of professional development are done right.
- At Level 3, you need to focus on questions about the organization characteristics and attributes necessary for success. Did the professional development activities promote changes that were aligned with the mission of the school and district? Were changes at the individual level encouraged and supported at all levels? Were sufficient resources made available, including time for sharing and reflection? Were successes recognized and shared? Issues such as these can play a large part in determining the success of any professional development effort. Gathering information at Level 3 is generally more complicated than at previous levels. Procedures differ depending on the goals of the program or activity. They may involve analyzing district or school records, examining the minutes from follow-up meetings, administering questionnaires, and interviewing participants and school administrators. You can use this information not only to document and improve organization support but also to inform future change initiatives.
- At Level 4 we ask, Did the new knowledge and skills that participants learned make a difference in their professional practice? The key to gathering relevant information at this level rests in specifying clear indicators of both the degree and the quality of implementation. Unlike Levels 1 and 2, this information cannot be gathered at the end of a professional development session. Enough time must pass to allow participants to adapt the new ideas and practices to their settings. Because implementation is often a gradual and uneven process, you may also need to measure progress at several time intervals.
- You may gather this information through questionnaires or structured interviews with participants and their supervisors, oral or written personal reflections, or examination of participants’ journals or portfolios. The most accurate information typically comes from direct observations, either with trained observers or by reviewing video- or audiotapes. These observations, however, should be kept as unobtrusive as possible (for examples, see Hall & Hord, 1987).
- Level 5 addresses “the bottom line”: How did the professional development activity affect students? Did it benefit them in any way? The particular student learning outcomes of interest depend, of course, on the goals of that specific professional development effort.
- Measures of student learning typically include cognitive indicators of student performance and achievement, such as portfolio evaluations, grades, and scores from standardized tests. In addition, you may want to measure affective outcomes (attitudes and dispositions) and psychomotor outcomes (skills and behaviors). Examples include students’ self-concepts, study habits, school attendance, homework completion rates, and classroom behaviors. You can also consider such schoolwide indicators as enrollment in advanced classes, memberships in honor societies, participation in school-related activities, disciplinary actions, and retention or drop-out rates. Student and school records provide the majority of such information. You can also include results from questionnaires and structured interviews with students, parents, teachers, and administrators. Level 5 information about a program’s overall impact can guide improvements in all aspects of professional development, including program design, implementation, and follow-up. In some cases, information on student learning outcomes is used to estimate the cost effectiveness of professional development, sometimes referred to as “return on investment” or “ROI evaluation” (Parry, 1996; Todnem & Warner, 1993).
- Can you now demonstrate that a particular professional development program, and nothing else, is solely responsible for the school’s 10 percent increase in student achievement scores or its 50 percent reduction in discipline referrals? Of course not.
- Keep in mind, too, that good evidence isn’t hard to come by if you know what you’re looking for before you begin. Many educators find evaluation at Levels 4 and 5 difficult, expensive, and time-consuming because they are coming in after the fact to search for results (Gordon, 1991). If you don’t know where you are going, it’s very difficult to tell whether you’ve arrived. But if you clarify your goals up front, most evaluation issues fall into place.
- Each of these five levels is important. The information gathered at each level provides vital data for improving the quality of professional development programs.
- The third implication, and perhaps the most important, is this: In planning professional development to improve student learning, the order of these levels must be reversed. You must plan “backward” (Guskey, 2001), starting where you want to end and then working back. In backward planning, you first consider the student learning outcomes that you want to achieve (Level 5). For example, do you want to improve students’ reading comprehension, enhance their skills in problem solving, or develop their sense of confidence in learning situations?
The unwieldy table below is also available as a Word doc: Evaluating PD Guskey.
| Evaluation Level | What Questions Are Addressed? | How Will Information Be Gathered? | What Is Measured or Assessed? | How Will Information Be Used? |
| --- | --- | --- | --- | --- |
| 1. Participants’ Reactions | Did they like it? Was their time well spent? Did the material make sense? Will it be useful? Was the leader knowledgeable and helpful? Were the refreshments fresh and tasty? Was the room the right temperature? Were the chairs comfortable? | Questionnaires administered at the end of the session | Initial satisfaction with the experience | To improve program design and delivery |
| 2. Participants’ Learning | Did participants acquire the intended knowledge and skills? | Paper-and-pencil instruments; simulations; demonstrations; participant reflections (oral and/or written); participant portfolios | New knowledge and skills of participants | To improve program content, format, and organization |
| 3. Organization Support & Change | Was implementation advocated, facilitated, and supported? Was the support public and overt? Were problems addressed quickly and efficiently? Were sufficient resources made available? Were successes recognized and shared? What was the impact on the organization? Did it affect the organization’s climate and procedures? | District and school records; minutes from follow-up meetings; questionnaires; structured interviews with participants and district or school administrators; participant portfolios | The organization’s advocacy, support, accommodation, facilitation, and recognition | To document and improve organization support; to inform future change efforts |
| 4. Participants’ Use of New Knowledge and Skills | Did participants effectively apply the new knowledge and skills? | Questionnaires; structured interviews with participants and their supervisors; participant reflections (oral and/or written); participant portfolios; direct observations; video or audio tapes | Degree and quality of implementation | To document and improve the implementation of program content |
| 5. Student Learning Outcomes | What was the impact on students? Did it affect student performance or achievement? Did it influence students’ physical or emotional well-being? Are students more confident as learners? Is student attendance improving? Are dropouts decreasing? | Student records; school records; questionnaires; structured interviews with students, parents, teachers, and/or administrators; participant portfolios | Student learning outcomes: cognitive (performance and achievement), affective (attitudes and dispositions), psychomotor (skills and behaviors) | To focus and improve all aspects of program design, implementation, and follow-up; to demonstrate the overall impact of professional development |
- Guskey, Thomas R. & Yoon, Kwang Suk. “What Works in Professional Development?” Phi Delta Kappan, v90 n7, pp. 495–500, Mar 2009. https://tguskey.com/wp-content/uploads/Professional-Learning-5-What-Works-in-Professional-Development.pdf (accessed 29/02/20)
- Guskey, Thomas R. “Does It Make a Difference? Evaluating Professional Development.” Educational Leadership, v59 n6, pp. 45–51, Feb 2002. http://www.ascd.org/publications/educational-leadership/mar02/vol59/num06/Does-It-Make-a-Difference%C2%A2-Evaluating-Professional-Development.aspx (accessed 29/02/20)