
EDU 807 - Week 12 Spring 2018 - Framing the Assessment of Educational Technology Professional Development in a Culture of Learning Group 2




Framing the Assessment of Educational Technology Professional Development in a Culture of Learning

Paragraph 1
Dr. Troy Hicks (Mar 24 2018 2:47PM): Goals for Reading

There are four questions in the reading task this week. Please answer each of my original questions individually (4 responses) and then reply to two of your classmates' initial comments, in any of the four elements, in a substantive manner (2 responses).

4 initial posts + 2 responses to classmates = 6 total comments


Melissa Pierson
University of Houston


Arlene Borthwick
National-Louis University


2010. Journal of Digital Learning in Teacher Education, 26(4), 126–131.


Abstract


Assessing the effectiveness of educational technology professional development (ETPD) must go beyond obtaining feedback from participants about their level of satisfaction with a workshop presenter. Effective and meaningful assessment of ETPD requires that we design inservice learning activities that can be measured using methods that are consistent with what we know about teaching and learning, recognize teacher and student change as it relates to the larger teaching and learning context, and view evaluation as an inseparable component of ongoing teacher action. We therefore offer for consideration an ETPD assessment model that merges three theoretical constructs through which professional development consumers might interpret research findings: (a) technological pedagogical content knowledge (TPACK), (b) organizational learning, and (c) participant research and inquiry.


(Keywords: professional development, assessment, TPACK, action research)


We’ve all been there. As you gather your notes and belongings at the end of a long day of learning some trendy, new technology tips and tricks, someone hands you a page with a few questions to answer: What did you learn today? How likely are you to use what you learned? Was the presenter effective? You’re anxious to get home, eyes weary from peering at the computer screen all afternoon, so you do your best to circle the numbers that would be complimentary to the presenter—she tried hard to be funny and keep our attention, after all—but you skip past the large, open-ended comments box in your rush to get out the door. Or maybe you are the professional developer charged with collecting data to evaluate the outcomes of a grant-funded project. You plan ahead with teachers and other school staff. You provide an introduction during large- and small-group activities. Later you proceed to “collect data” in classrooms, but instead you find yourself helping students and teachers with new software features and collaborative student projects. While in the thick of providing support to the success of classroom implementation, you might find yourself setting aside your observation protocol. Soon the class period is over and students have no time to enter comments for the daily journal prompts you developed (those data you planned to use for the evaluation).


These scenarios are all too common, and yet these data—haphazardly solicited and carelessly offered—may represent the total information source we have for judging the effectiveness of our efforts in preparing teachers to teach with 21st-century skills and tools. Any conclusions we draw from these data are necessarily limited, skewed, and of questionable validity. In the first scenario, data are likely more representative of how personable the presenter was, or how tired the participants were, than how well facilitated, well designed, or well matched to learners’ needs the learning experience was. In the second scenario, the presenter frets over lost opportunities for meaningful data collection, acknowledging that the data set will be smaller than desirable.


Assessing the effectiveness of educational technology professional development must go far beyond obtaining feedback from participants about their level of satisfaction with a workshop presenter. The “impact the professional development activity had on pedagogical change or student learning” (Lawless & Pellegrino, 2007, p. 579) is of particular importance to our field’s efforts to prepare teachers at all levels to use technology effectively. However, basing our evaluation on incomplete data leads us to an incomplete understanding and incomplete assumptions about the role of professional developers, teacher educators, and teachers.

Paragraph 10
Dr. Troy Hicks (Mar 24 2018 2:45PM): Before getting too deep into the document...

Before delving into the article, consider the initial argument here… how well do typical end-of-session surveys do at measuring the deeper impact of professional development?

What are the other measures — both quantitative and qualitative — that we would need to see, over time, in order to draw valid conclusions about the long-term effects of a professional development program?

Robert Norman (Mar 26 2018 9:57AM): measures

I like to pair evaluation data with usage rates or other back-end data. It's one thing for participants to enjoy a session on a piece of software, but it's more helpful for me to compare that with actual adoption.

Pamela Wegener (Mar 29 2018 2:28PM): Clicks

I agree, Robert; evaluating their “click-through” process is valuable.

Robert Norman (Mar 29 2018 5:49PM): discussion

Julie and I actually talked about this a bit more last night, and she brought up the point that sometimes this back end log data can be misleading, which I would certainly grant. Students with different learning styles are going to have different “click through” speeds, which may or may not predict achievement. I think this is the reason why a predictive model based on this kind of data has seemed to be so elusive, but I still think it’s worth pursuing.

Dr. Troy Hicks (Apr 01 2018 6:52PM): Connecting multiple data sources

I agree, this is definitely interesting to think about. How might we pair quantitative data about the user experience with the quantitative/qualitative data that we might gain from the survey? Also, what might happen if we track the data over time?
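The pairing of satisfaction ratings with back-end adoption data discussed in this thread can be sketched in a few lines of Python. Everything here is a hypothetical illustration: the teacher IDs, ratings, usage counts, and the `adoption_gap` helper are all invented, not drawn from the article or any real dataset.

```python
from statistics import mean

# Hypothetical end-of-session satisfaction ratings (1-5 scale).
satisfaction = {"t01": 5, "t02": 4, "t03": 5, "t04": 3, "t05": 4}

# Hypothetical back-end log data: lessons each teacher built with the
# tool in the month after the workshop.
lessons_created = {"t01": 0, "t02": 6, "t03": 1, "t04": 4, "t05": 5}

def adoption_gap(ratings, usage, threshold=2):
    """Return teachers who rated the session highly but barely used the
    tool afterward; a mismatch suggests the survey measured likability
    rather than impact."""
    return sorted(
        t for t, r in ratings.items()
        if r >= 4 and usage.get(t, 0) < threshold
    )

print("mean satisfaction:", mean(satisfaction.values()))
print("high rating, low adoption:", adoption_gap(satisfaction, lessons_created))
```

Run at several intervals (one month, six months), the same comparison would begin to address the question of tracking impact over time rather than relying on a single end-of-session snapshot.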

Sarah Lewis (Mar 28 2018 10:06AM): Measures

First off, I think end-of-session surveys can be tough; people often don't know exactly what they are evaluating, rightfully so, but it can still make the aggregate data a bit skewed. Other measures could be long-term student success in education, long-term perception surveys on how effective the PD was, or continual follow-up surveys to see where participants needed more PD.

CEDRIC GREEN (Mar 30 2018 2:44PM): End of session surveys [Edited]

As well intended as they are, end-of-session surveys often fail because responders don't take them seriously or submit surveys that are incomplete. This limits how valid the measurements will be.
In addition, it would be quite challenging to use questionable data to make long-term determinations about professional development.

Dr. Troy Hicks (Apr 01 2018 6:53PM): Taking evaluation more seriously

I agree with you here – it is difficult to get people to take these end-of-session evaluations seriously. To what extent might we need to adapt the ways in which we invite participants to provide feedback after – as well as before or during – a session?

CEDRIC GREEN (Apr 01 2018 8:05PM): Carrot and Stick [Edited]

I'm sure this sounds a bit cliché, but perhaps we could associate the pre- and post-session feedback exercises with group activities or rewards. For example, a pre-session group activity where participants, remaining anonymous, complete the surveys, discuss, and share findings with the larger group. For the post-session survey, perhaps some form of reward for completed surveys. However, with either approach, the risk to data validity still remains.

Pamela Wegener (Mar 29 2018 2:24PM): Measures

In my view, end-of-session surveys have minimal value for measuring the success of PD outcomes. The PD content is still fresh in mind; however, once the real application of what is learned takes place, the learner's perception could change. I like the idea of pre- and post-perception surveys followed by either an evaluation or an interview once the PD material has been practiced over a reasonable amount of time.

Robert Norman (Mar 29 2018 5:50PM): surveys

Sometimes I get the sense that we rely so heavily on end of session surveys because it’s really the only thing we can do at that moment that doesn’t require a ton of follow-up for everyone involved. Are we taking the easy way out, or is it really the most logical way to start measuring effectiveness?

Dr. Troy Hicks (Apr 01 2018 6:55PM): Data points

To some extent, I think that we use the end-of-session evaluation as a mere placeholder, a way to show that we (as workshop organizers) have “done” something, anything to engage the participants. As we think about true, demonstrable impact, however, the timing and collection of data become much more difficult.

When, how, and by whom would data be collected after a month, two months, six months, or even beyond? What kind of data would need to be collected to show a genuine impact?

CEDRIC GREEN (Apr 01 2018 8:24PM): Culture [Edited]

For the purposes of discussion, let's say we've collected valid pre- and post-session PD data at some predetermined interval; we'll still need another tool/survey to determine effectiveness. In addition, we should be mindful of how these newly learned PD concepts and strategies will work within the culture/environment they are being introduced into. Typically, when change is introduced, resistance will surely follow.


The genesis of our thoughts on the assessment of professional development in educational technology was our work in co-editing the text Transforming Classroom Practice: Professional Development Strategies in Educational Technology (TCP) (Borthwick & Pierson, 2008). Through the solicitation of proposals and collaboration with chapter authors, what became clear to us in so many ways is the necessary connection between professional development models and the assessment strategies used in the evaluation of each model.


Effective Evaluation of Educational Technology Professional Development


The climate of accountability in the early 21st century has heightened the awareness of stakeholders at all levels of the education system to the need for data. These many audiences for the findings of educational technology professional development (referred to as ETPD throughout this article) demand more than descriptive studies reporting isolated anecdotal narratives, even if those narratives share compelling stories of success. Clearly, a planned evaluation strategy, in place from the inception of the project, could assist professional developers in understanding the extent to which ETPD is effective, rigorous, and systematic.


Unfortunately, the field has not yet arrived at a level we can term effective by means of rigor and system. Instead, PD efforts are deemed effective based on teacher self-report and opinion—those data so easily collected on photocopied surveys and so unrealistically depended upon as meaningful facts. These means fall considerably short of demonstrating “whether professional development has changed practice and ultimately affected student learning” (Hirsch & Killion, 2009, p. 468) and have led to a literature base of well-intentioned descriptions of promising, yet isolated, implementations and informal lessons learned from individual programs. This “show-and-tell” line of publication misses the definition of rigor with the absence of stated research questions, planned designs, and matched and multiple data collection. It hinders the flow of research into practice, leaving educators to wonder what PD workshops they should attend to improve their teaching or their students’ learning, as little or “virtually no information exists to help consumers of professional development” (Hill, 2009, p. 473). Further corroborating the troubling state of professional development literature, a 2009 review of published accounts of professional development with respect to student learning outcomes found that “only nine of the original list of 1,343 studies met the standards of credible evidence set by the What Works Clearinghouse” (Guskey & Yoon, 2009, p. 496).

Paragraph 14
Dr. Troy Hicks (Mar 24 2018 2:46PM): "Standards of credible evidence"

While I don’t expect that you would read this entire document, please do read pp. 7-8 and then return here: https://ies.ed.gov/ncee/wwc/Docs/referenceresources/wwc_procedures_v3_0_standards_handbook.pdf

Now that you have a sense of how the WWC reviews studies, why do you think it would be particularly difficult to set up a study of effective professional development that meets the standards (notably, the criteria for randomized-controlled trials or quasi-experimental designs)?

In other words, if holding to a statistically viable understanding of “what works,” why is it that we are very unlikely to ever really know what works with technology implementation based on professional development?

Robert Norman (Mar 26 2018 10:07AM): sample size

The first thing I think of is sample selection. With professional development, you often have a self-selected population, not a randomly selected one, so already you don’t have the prerequisites for a true experimental study.

Dr. Troy Hicks (Apr 01 2018 6:56PM): Random selection for PD

I think this is definitely an interesting element to consider for control in a statistical analysis of professional development and its effectiveness. To what extent – even when people are “required” to be there – do they fully engage with the professional development experience?

Sarah Lewis (Mar 28 2018 10:25AM): Standards

Based on pages 7-8, it seems like they want replicative studies only. I may be reading that wrong, but if the review only allows for the same sample alignment, similar design, aligned protocol, and other like standards, their standards seem quite narrow. I would think that with adapting technologies, many of the ideas would be newer or fresh and may need testing that pairs skills older than 20 years with new technologies and ideas. Also, the idea that “what works” will work for everyone seems a bit narrow-minded. There are so many outside factors that adopting new technologies and PD depend on; to ignore those seems dangerous.

Pamela Wegener (Mar 29 2018 2:30PM): Standards

Sarah, your thoughts align with mine: ignoring outside factors and assuming that what works fits all is detrimental.

Robert Norman (Mar 29 2018 5:53PM): experimental

I think this is one of the key areas that many of us are pushing back against when it comes to randomized experiments. They are very highly prized because, statistically, they have generalizability where other studies do not. However, one of the things that Dr. Deschryver would point out is that generalizability might not apply to ill-structured domains the same way it does to well-structured domains. It's something I think about now every time I read a research article.

Dr. Troy Hicks (Apr 01 2018 6:58PM): Standards for "what works"

I’m really glad that you brought this up, Sarah, as I think that the difficulty of measuring any educational intervention – especially one mediated by technology – is that the interfaces and opportunities are constantly changing.

Moreover, I personally consider it to be ethically suspect as an educator to offer something to one group of students who could benefit while other students remain in ways of thinking and doing that, most likely, will not. Thus, the research design – and the epistemology that undergirds it – is very difficult for me to reconcile.

CEDRIC GREEN (Mar 30 2018 4:18PM): What works [Edited]

Even if the success criteria are met from an implementation perspective, there's no accurate way to attribute that success to the professional development. As mentioned earlier, gathering data to validate its effectiveness is always difficult given the many external factors with potential influence.

Dr. Troy Hicks (Apr 01 2018 7:00PM): Measuring the effectiveness/impact of PD over time

I agree, Cedric, that any accurate accounting of the effectiveness or impact of professional development, as measured over time, would be incredibly difficult. It would demand both extensive time for the researcher/evaluator and lots of feedback from the participants involved in the PD.


Researchers have suggested frameworks to guide the assessment of ETPD. Desimone (2009) suggests that researchers seek a consistent framework to enable PD to be “based on a strong theoretical grounding” confirmed through multiple methods including case study, correlational, and experimental approaches (p. 186). Lawless and Pellegrino (2007) have recommended an evaluation schema that addresses (a) types of professional development, (b) unit of analysis, and (c) designs and methods. Much earlier, professional developers had been advised to make concerted efforts to systematically collect data on professional development in terms of the teacher and student outcomes (Guskey, 2000). The persistent challenge for professional developers as consumers of evaluation research reports is sifting out effects on teaching and learning that can be attributed to technology use from those results that are the result of other initiatives. Along these lines, Pianta & Hamre (2009) assert that the value of “observational assessment of teachers for leveraging improvements in educational outcomes is that they can be directly related to the investigation and experimentation of specific interventions aimed at improving teaching” (p. 115).


Effective and meaningful assessment of ETPD requires that we design inservice learning activities that can be measured using methods that are consistent with what we know about teaching and learning; recognize teacher and student change as it relates to the larger teaching and learning context; and view evaluation as an inseparable component of ongoing teacher action. We therefore offer for consideration an ETPD assessment model that merges three theoretical constructs currently enjoying much note and utility, through which professional development consumers might interpret research findings: (a) technological pedagogical content knowledge (TPACK); (b) organizational learning; and (c) participant research and inquiry.


Evaluating PD According to TPACK: The What


Professional development comes in all sizes and flavors, and to make an accurate assessment of the quality and impact of an activity, professional developers must consider the variety of ways teachers learn and the variety of variables that could affect teacher learning. However, simply recognizing that there are differences among professional development attributes, and recognizing just how those attributes can be interrelated for effective technology use, are two very different things. Layering any examination of ETPD findings with the TPACK model (Mishra & Koehler, 2006; Pierson, 2001) provides a helpful lens through which to view the process in light of current pedagogical thinking for 21st-century learners and teachers. The fields of educational technology and teacher education have come to agreement around the concept of TPACK to describe the meaningful use of technology in teaching and learning. Derived from Shulman’s (1986) notion of teaching as the intersection of content knowledge and pedagogical knowledge, the definition of 21st-century teaching also demands excellence in technological knowledge. True technology integration is said to be at the intersection of all three elements (Pierson, 2001). Further, the intersection of any two of the elements defines worthwhile knowledge sets: technological content knowledge, or technologies used for specific content applications; and technological pedagogical knowledge, or technology use for specific pedagogical purposes. Evaluating the effectiveness of professional development, then, must consider how well teachers are prepared to meaningfully use technologies in discipline-specific ways as well as ways that are compatible with multiple teaching and learning strategies and roles.
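The intersection logic of the TPACK model described above can be made concrete with a toy sketch. The activity names below are invented examples, not taken from the article, and Python sets are only a loose stand-in for overlapping knowledge domains.

```python
# Each (invented) set lists PD activities that develop that knowledge domain.
technological = {"screencasting basics", "graphing-tool workshop", "data-logger demo"}
pedagogical = {"inquiry cycles", "graphing-tool workshop", "formative feedback"}
content = {"linear functions", "graphing-tool workshop", "data-logger demo"}

# Pairwise intersections name the blended knowledge sets the article describes.
tpk = technological & pedagogical   # technology used for pedagogical purposes
tck = technological & content       # technology used for specific content

# True integration (TPACK) sits at the three-way intersection.
tpack = technological & pedagogical & content

print("TPK:", sorted(tpk))
print("TCK:", sorted(tck))
print("TPACK:", sorted(tpack))
```

In this sketch, only the activity sitting in all three sets counts as full integration, which mirrors the article's claim that evaluation should look for discipline-specific, pedagogically grounded technology use rather than technology skill alone.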


It stands to reason that if the elusive goal of effective technology-integrated teaching can be found at the intersection of content, pedagogy, and technology, then at this same center will be effective assessment (see Figure 1, p. 128). Assessment is an integral and inseparable part of the curriculum development and teaching process—one leading to the next and cycling back again. And if effective technology integration—and assessment of such—is the goal of educational technology professional development, then these elements should be prominent in any evaluation model. Lawless and Pellegrino hint at this importance when they say that “the most important impact a professional development activity can have on a teacher is that of pedagogical practice change ostensibly reflecting a deeper change in pedagogical content knowledge” (p. 597). Likewise, Guskey and Yoon (2009) found that PD projects are effective when they focus on enhancing teachers’ content knowledge and pedagogical content knowledge.

Paragraph 19, Sentence 3
Robert Norman (Mar 26 2018 10:13AM): assessment and TPACK

The authors make a good case here for TPACK being as relevant to assessment of education as it is to education itself. This is something that has been on my mind since I’ve been developing PD for educational tools. It’s helpful to keep in mind that the same TPACK framework is equally applicable to the assessment of that PD as it is to the development and execution of the PD.

profile_photo
Apr 1
Dr. Troy Hicks Dr. Troy Hicks (Apr 01 2018 7:01PM) : Tech frameworks more

I think that you make a good point here, Robert, about the fact that all of our evaluation should be informed by – and continue to re-inform – the ways in which our educational frameworks, especially those for technology, affect our thinking and action related to professional development.


So, as the field continues to explore the usefulness of the TPACK construct to define teacher knowledge with technology, it must also push that exploration into how TPACK can shape evaluation of ETPD efforts.


Evaluating Professional Development within the Context of Organizational Learning: The Where


The problem with those descriptive studies of isolated pockets of successful professional development is that the authors are not always sufficiently clear about the surrounding context of school, student, and administrative factors, and readers fill the gaps with incomplete assumptions. Of course, a highly successful effort in one instance may not work as well, even if implemented to the letter, in a less supported, less funded, or less engaged context. In short, context matters, for both ETPD implementation and assessment. To assess the effectiveness of professional development in leading teachers to long-lasting gains in knowledge, attitudes, and instructional behaviors, we must examine supporting factors within the teaching and learning context.

Paragraph 22, Sentence 1

Sarah Lewis (Mar 28 2018 10:26AM): What works

This goes back to my other point: what works in one area does not necessarily work in a district 50 miles away or on the other side of the country. A BIG factor to consider.

Dr. Troy Hicks (Apr 01 2018 7:02PM): Extend this to technology

To what extent, then, do you think that there are different uses of technology for various schools, classrooms, and even individual students? How must we prepare educators (from K-college and into the workforce) to think critically and creatively about technology in these different contexts?


Professional development has been characterized in the literature as a variety of leveled developmental models with fixed sequences of stages and levels of knowledge and skills acquisition (Dall’Alba & Sandberg, 2006). Early frameworks for understanding how teachers learned to use technology aimed to shift the focus from the technology tools themselves to the teachers and their developmental needs, and in doing so, they addressed the uniqueness of learners who participate in any professional development session. The role of the school organization, then, was to address these individual needs, moving teachers to more advanced stages.


However, although assessing teachers’ needs, as well as understanding the types of assistance teachers might require and concerns they might have based on developmental levels, are indeed necessary steps, the focus on a single teacher’s needs and progress cannot reveal all requirements for success. In fact, the implied linear nature of these staged models may conceal “more fundamental aspects of professional skill development” (Dall’Alba & Sandberg, 2006, p. 383). Rather, we need to examine the learning of an individual within the learning of the system as a whole; this implies a shift toward thinking about how individuals fit into the larger organization and the additional learning that must take place on the organizational levels. Successful educational technology PD initiatives are characterized by an expanded, informed, and connected view of learning on both the individual and the organizational level.


A systems view suggests a focus not only on individual teacher and student growth, but also on changes in organization policies and procedures, infrastructure, curriculum and instruction, expectations for stakeholders, and organizational climate (Newman, 2008). Teacher growth and change will not be sustained without organizational support and, in fact, may be sabotaged (Borthwick & Risberg, 2008). Harris (2007) recommends that each school or district make its own choices about particular professional development methods and strategies, selecting, combining, and sequencing aspects to craft an overall approach based on its unique needs as a learning organization. Trying out new technologies requires teachers to assume a level of risk-taking; to allow for success, teachers will need to work in “a climate of trust, collaboration, and professionalism” where administrators “promote technology-related risk taking among teachers on behalf of students” (Borthwick & Risberg, 2008, p. 39). Desimone (2009) sees context as “an important mediator and moderator” for implementation of professional development models (p. 185), and even further, Zhao, Frank, and Ellefson (2006) suggest, in correlating classroom technology use with professional development activities, that “schools should develop a culture instead of a program of professional development” (p. 173). Evaluation of ETPD effectiveness must, then, include an assessment of the context in which that development is occurring. In that way, consumers of ETPD findings can determine how they might scale a successful project, locally situated within the context of a school whose staff are committed to continuous learning, to different contexts.

Paragraph 25, Sentence 2

Robert Norman (Mar 27 2018 3:35PM): organizational support

One recurring theme I’ve heard from educators, specifically in K12, is that administration has trouble committing to programs (either tech or otherwise) for various reasons. Perhaps it’s the local school board, perhaps the state department of education, perhaps national trends. In looking at educational tools and what makes them successful, I keep coming back to this idea of administrative/organizational support and how vital it is.

Sarah Lewis (Mar 28 2018 11:44AM): support

Completely agree, Robert! Or they tend to support certain techs without buy-in from those who are going to have to use them. Making sure everyone is on the same page and being as transparent as possible is key to successful integration, from the top of the administration down to the students who will be required to use it in their education.

Robert Norman (Mar 29 2018 5:56PM): certain tech

Sarah, I hear you. When you say that admin might support certain tech without buy-in from teachers, more often than we would like, those contracts are signed before any teacher input is given.


Evaluating Professional Development through Practitioner Research: The How


Change in pedagogical practice is the ultimate goal of professional development. However, few studies use data on teacher use to inform their practice (Lawless & Pellegrino, 2007). So, even if professional developers ensure that assessment of ETPD focuses on teacher knowledge (TPACK) and occurs in a supportive context, how can they feed the results back into practice and allow that practice to inform continued research? The simple answer is that the two—research and practice—must in fact be one and the same.


The evaluation of such a complex educational endeavor as ETPD must employ multiple, flexible, systematic, and rigorous methods. It is the latter two on this list that are infrequently found in reports of ETPD. Yet, if approached from a rigorous stance, even such research methods as action research and case study can meet the “platinum standard” of research. Editors of journals in technology and teacher education propose: “The platinum standard requires rigorous research in authentic school settings that approaches idealized designs as nearly as possible, given the constraints of schools and real-world learning environments” (Schrum et al., 2005, p. 204). These editors recommended undertaking research based in authentic classroom settings as long as it is grounded in theory and builds upon the existing knowledge base. Operating within a platinum research standard opens up as “acceptable” those forms of experimental research that do not require a control group. This approach is a more reasonable methodological goal for classroom-based research than a true experimental model, which is often referred to as the “gold standard.”

Paragraph 28, Sentence 4

Robert Norman (Mar 26 2018 10:22AM): constraints

I kind of chuckle at the phrase “given the constraints of schools and real-world learning environments.” I’m glad they acknowledge this, but for me, those constraints are so wildly varied that they seem impossible to control for.

Dr. Troy Hicks (Apr 01 2018 7:04PM): Given these constraints

As you think about this from the perspective of being a researcher/evaluator as well as a designer/planner, what is it that you think we might be able to do in order to gather statistically accurate, timely, and useful data from teachers and students in order to make effective decisions about educational technology?


In particular, action research, with its practical and indigenous classroom applications (Nolen & Putten, 2007, p. 403), can provide a framework to explicitly connect professional development to evaluation methodology. As explained by Ham, Wenmoth, and Davey (2008), a self-study approach might then not only be the professional development method itself, but also serve as a form of assessment, leading to larger, common answers about student outcomes. Such a spiraled sequence of inquiry, data collection and analysis, and reflection uses the results of systematic inquiry to inform and lead into the next phase of questioning. The role of teachers as participant researchers is critical to the diagnosis of learning outcomes, identification of subsequent instructional strategies, and input to policy development (Borthwick, 2007–2008). Participant research as a 21st-century professional development assessment model allows teachers as professionals to look at their practice in new ways (Linn, 2006); respects teacher knowledge and experience; and provides a long-term as well as immediate evaluation and feedback loop, with small findings continuously driving the next steps of instruction (Fullan, Hill, & Crévola, 2006).


In other words, instead of just outside evaluators coming in to assess teaching practice at the end of a prescribed period and some time later feeding those findings back into another professional development workshop, teachers as researchers constantly ask questions of their teaching, collect and analyze multiple forms of data, collaborate with one another, and feed ongoing findings into tomorrow’s teaching plans. This metacognitive approach to the evaluation of professional development enables teachers’ lifelong learning, thus extending the reach of every formal professional development effort.


Unfortunately, in actual practice, classroom teachers are rarely supported or encouraged to engage in research or scholarly writing; thus, this metacognitive approach may never come to be. Ideal partners to facilitate this process are those in higher education: researchers in educational technology and related fields who are regularly engaged in such scholarly activity (Cunningham et al., 2008). Teachers who have had the opportunity to collaborate and co-teach with such outside technology partners were afforded chances to experiment with teaching with new technologies (Zhao, Frank, & Ellefson, 2006). Such school–university partnerships can create the framework for ongoing co-research habits that will continually inform classroom practice and research alike.


A Model for Effective, Rigorous, and Systematic Evaluation of Educational Technology Professional Development


Assessment of ETPD must recognize the interaction among the TPACK variables of content, pedagogy, and technology to understand the extent of meaningful technology-supported teaching. Assessment of ETPD must consider the teaching and learning context, both locally and broadly. And assessment of ETPD must close an action research loop joining research with practice. We propose extending the TPACK model, which defines effective technology-enhanced teaching, to guide the assessment of ETPD when supported by the surrounding “frame” of context and participant research, specifically phases of Reflection, Inquiry, Collaboration, and Sharing (see Figure 2, p. 130). In this model, individual and organizational learning occur over and over again as educators and their research partners engage in the action research process in light of the organizational context, thus positioning assessment of ETPD as a culture of learning.


This contextually situated and inquiry-framed model of ETPD positions TPACK not only as the center of a conceptual model for effective professional development, but also as the center of a comprehensive plan for assessment embedded in the work and culture of teaching. Further, it implies that successful assessment of PD efforts must have at its center the basic definitional components of the TPACK elements. Such assessment must also resemble the iterative action research cycle: encouraging PD facilitators to begin any staff development session by asking participants to think about what they can learn from it and how that learning is situated in the work they already do, by posing questions about how teaching and learning can improve, by collaborating with peers and more experienced colleagues to solve problems of practice, and by evaluating and sharing findings with one another as part of an ongoing effort at collective improvement. Such a cycle of inquiry exemplifies Fullan, Hill, and Crévola’s (2006) Breakthrough model, in which “teachers will operate as interactive expert learners all the time” (p. 95). Observing and documenting teacher behaviors “in the thick” of classroom practice, whether done by teachers as participant researchers or in collaboration with research partners, suggests the most direct route to evaluating the effects of both formally supported and personally initiated professional development.

Paragraph 34

Dr. Troy Hicks (Mar 24 2018 2:48PM): Consider the WWC concerns about qualitative research

“Why aren’t qualitative studies reviewed by the WWC? The goal of the WWC is to assess evidence of program effectiveness. Therefore, studies included in WWC reviews are based on research designs (randomized controlled trials, quasi-experimental designs, regression discontinuity designs, or single-case designs) that are widely believed to assess the impacts of programs on outcomes of interest through the identification of credible counterfactuals (what would have been observed in the absence of the intervention). Qualitative studies are useful for many things, such as understanding context, assessing the fidelity of implementation, or identifying relationships for further study. However, qualitative studies do not allow for an assessment of effectiveness resulting from a credible comparison of outcomes between program and counterfactual conditions. For this reason, they are not included in WWC reviews.”

What could be done – in this model of ETPD – to mitigate some of the (potentially negative) effects of qualitative elements of the research? In other words, how could qualitative data be used in a productive way to complement the quantitative data being gathered?

Robert Norman (Mar 26 2018 10:29AM): interviews

I always appreciate a mixed methods study that uses actual quotes from the participants to support the data analysis of the qualitative findings. Of course, it’s not possible with all quantitative studies, but even the large-scale studies that involve thousands of students could include a few interviews from the sample population. This sort of study might be mostly quantitative in nature with a qualitative sheen, but if the quantitative requirements would first satisfy the WWC standards, the qualitative aspect could make the findings accessible to a variety of readers.

Sarah Lewis (Mar 28 2018 10:54AM): Students

I agree with Robert’s point about interviews; getting actual verbal feedback to tie in with grades or ratings is helpful. I also think that you can use PD to show teachers how to teach with new technology, but if the students don’t know “how” to learn with the technology, then it is a waste of time. For example, I can show my mom, over and over, how to use an iPhone, but it will just never stick. She has zero tech background and has no need for it. Learners in K-12 often have a hard time understanding how this will affect their learning or how the skills they are learning can be carried on long term. Tying both the educator and the student into the PD, along with the grades the students are earning and the success of the educator, would be successful in my opinion.

Robert Norman (Mar 29 2018 5:58PM): discussion

It’s interesting for me to revisit this discussion after our meeting last night, because when Julie and I were designing our short research project, I had the assumption in my mind that we would be measuring student achievement, and she had the assumption that we would be measuring teacher achievement. It was a surprising moment that led to some reflection on my part, so I really like what you put forth here: tying both students and teachers into the PD evaluation process.

CEDRIC GREEN (Apr 01 2018 8:59PM): Mixed [Edited]

Yes, a mixed methods approach is certainly warranted, in hopes that both methods will confirm the same findings within the study, which further validates them. In doing so, the results are deemed useful to others and to the population the study was designed to investigate.
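The mixed-methods pairing the thread describes can be sketched concretely. A minimal illustration with invented teacher IDs, scores, and quotes (none of this data comes from the article or any study it cites), joining each participant’s quantitative survey score to a qualitative interview excerpt so the two can be reported side by side:

```python
# Hypothetical mixed-methods records for three PD participants:
# a post-PD survey score (quantitative, 1-5 scale) and an
# interview excerpt (qualitative). All data is invented.
scores = {"T01": 4.2, "T02": 2.8, "T03": 4.7}
quotes = {
    "T01": "The coaching sessions changed how I plan labs.",
    "T02": "I still lack the time to try the new tools.",
    "T03": "My students now lead the tech troubleshooting.",
}

# Join the two data sources on participant ID so each quantitative
# result can be triangulated against the teacher's own words.
paired = [
    {"teacher": t, "score": scores[t], "excerpt": quotes[t]}
    for t in sorted(scores)
]

# e.g., pull the excerpts that contextualize high satisfaction scores
high = [p for p in paired if p["score"] >= 4.0]
```

A real study would use validated instruments and systematically coded transcripts; the point of the sketch is only that the join on participant ID is what lets a qualitative excerpt explain a quantitative result.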


The Utility of this Model


The potential power of ETPD to enhance teacher knowledge and skills, and thus improve student learning, means it is worth our time to understand what works and in what contexts. Although there are some good, reliable tools available, “they have not been subject to repeated use and validation and are not widely available” (Desimone, 2009, p. 192). Collaborative research efforts such as the Distributed Collaborative Research Model (DCRM) (Pierson, Shepherd, & Leneway, 2009) show the promise of providing guidance about how assessment of ETPD might meet the “platinum standard” of educational technology research. The intentionally collaborative nature of DCRM embeds the capacity for rigorous and valid practice into the research design from the outset. Partnerships are at the heart of this planned collaborative research, both between higher education partners and school partners and among partners collaborating across distant locations to build a larger database from which to speak more broadly about joint findings. The DCRM model further recommends consistent methods and data collection tools to facilitate interinstitutional collaborations. Assessment tools with which we are familiar, including observations, surveys, interviews, and text and video analysis, all have value.


The quest for rigor will not be easy. Chapter authors in our TCP text spent much more time describing the activities undertaken as part of the implementation process of their model than providing evidence of student achievement. There are several logical explanations for this, including minimal budgets for program evaluation and minimal time for participant observers to collect data while they were in the thick of preparing for and implementing professional development activities. But our expectations as consumers of professional development reports need to change. We must expect a reporting of evaluation measures, something that is now rarely done (Desimone, 2009). We must expect to see detail about “what works, for whom, and under what conditions” (Lawless & Pellegrino, 2007, p. 599). We must expect that the ETPD community will share findings among themselves, such as in a clearinghouse (Nolen & Putten, 2007) or in database format, such as the Action Research for Technology Integration (ARTI) database (http://etc.usf.edu/fde/arti.php) at the University of Florida (Dawson, Cavanaugh, & Ritzhaupt, 2008).
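The clearinghouse idea implies some shared record format for findings. As a sketch only, with field names I have invented (the article does not describe the actual ARTI schema), here is how one shareable ETPD finding covering “what works, for whom, and under what conditions” might be structured:

```python
from dataclasses import dataclass, field

@dataclass
class ETPDFinding:
    """One shareable evaluation record: what worked, for whom, and where."""
    intervention: str                 # what works: the PD strategy evaluated
    population: str                   # for whom: the participating teachers
    context: str                      # under what conditions: the setting
    measures: list = field(default_factory=list)  # evaluation instruments used
    outcome: str = ""                 # summary of the observed effect

# A hypothetical entry, illustrative only.
record = ETPDFinding(
    intervention="peer coaching on technology-integrated lessons",
    population="grade 6-8 science teachers",
    context="one-to-one laptop district with weekly release time",
    measures=["classroom observation", "teacher interview"],
    outcome="more frequent student-centered technology use",
)
```

Agreeing on even a small common structure like this is what would let findings from different sites be compared and aggregated across contexts.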


Instead of viewing the complexity of classroom implementation of new instructional approaches as a detriment, we must learn to base ETPD in the heart of where the action takes place. And in fact our proposal of a contextually situated and inquiry-framed TPACK model as a template for the design of ETPD capitalizes on the work of teachers and research partners who embrace an inquiry approach to teaching and learning, connecting systematic evaluation of their participation in professional development activities directly to their effectiveness in the classroom. No matter what format PD takes, through all types of partnered approaches—including mentoring, peer coaching, students as professional developers, and professional learning circles—a contextually situated and inquiry-framed model of ETPD assessment can scaffold the assessment process. If we seek validity in our work and reporting, we must commit ourselves to systematic study of our work and documentation of related outcomes. In this way, our expectations as consumers of—and participants in—professional development evaluation can change.

Dr. Troy Hicks (Mar 24 2018 2:49PM): Please review the “Ohio ABLE Professional Development Evaluation Framework”

http://uso.edu/network/workforce/able/reference/development/PD_Eval_Framework_Report.pdf

I encourage you to look through the entire booklet, but focus in on the “Description” section for each level:
- Level 1: Satisfaction (p. 6)
- Level 2: Learning (p. 8)
- Level 3: Behavior (p. 10)
- Level 4: Impact (p. 12)

In order to measure ETPD at levels 2, 3, and/or 4, what would you — as a professional development leader and researcher — need to be able to do with/for participants? In what ways would you need to collect data from them, over time, and across contexts? How would this align with the what (TPACK), where, and how noted above?

Robert Norman (Mar 26 2018 2:47PM): Data

In thinking about how you would collect this data: for Level 2 (Learning), you could collect test data; pre- and post-test scores would be valuable in measuring learning. For Level 3 (Behavior), data collection would probably take more time and would likely involve some qualitative data. I am picturing observations, and perhaps some interviews. For Level 4 (Impact), you might need longer-range data collection. This might be a multi-stage research project where data is collected year upon year in a single school system, for example.
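The pre/post-test idea for Level 2 can be made concrete with a small sketch. Nothing below comes from the article or the Ohio ABLE framework; the cohort scores are invented, and the use of Hake's normalized gain (a common pre/post metric) is an illustrative assumption.

```python
# Hypothetical sketch (not from the article): summarizing pre/post-test
# scores for a Level 2 (Learning) evaluation using normalized gain.

def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    """Fraction of the possible improvement a participant achieved."""
    room = max_score - pre
    if room == 0:
        return 0.0  # participant started at the ceiling
    return (post - pre) / room

# Illustrative (pre, post) scores for a small PD cohort.
scores = [(50.0, 75.0), (60.0, 80.0), (40.0, 55.0)]

gains = [normalized_gain(pre, post) for pre, post in scores]
mean_gain = sum(gains) / len(gains)
print(f"Mean normalized gain: {mean_gain:.2f}")  # prints 0.42
```

Reporting gains rather than raw post-test scores keeps the Level 2 evidence comparable across participants who start at different baselines.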

Sarah Lewis (Mar 28 2018 11:39AM): Time

I think the key component of Level 2 is collecting data over time at multiple intervals. The educator receiving the PD cannot truly know right away whether the PD was useful and may need time to assess it, so I like that they include possible instruments and time frames for doing so. Levels 3 and 4 seem more long-term and more driven by student success. These levels can rely on student outcomes and on how students perceive their teachers to be using the new technology or skill being evaluated. The teacher and the student can almost work together on this: the teacher can go back to the PD provider, report what did and didn't work, and even pass along students' ideas for how to better teach what the PD session was supposed to cover.

CEDRIC GREEN (Apr 01 2018 9:40PM): Levels

Level 1 data collection would be fairly straightforward with a combination of pre- and post-session feedback. It's important to design the questions to elicit valuable responses. Levels 2-4 all depend on the time allowed to learn the PD concepts, observe PD learners applying what was learned, and document successful impacts and results. Each of these three levels requires time.


See references and additional info in the original article


DMU Timestamp: February 21, 2017 15:38
