In the document below, I have selected key passages out of the following three articles, purposefully picked from three different decades and describing three different populations of students:

Canelos, J., Dwyer, F., Taylor, W., Belland, J., & Baker, P. (1989). The effect of embedded learning strategies in microcomputer-based instruction. The Journal of Experimental Education, 57(4), 301–318.

Frear, V., & Hirschbuhl, J. J. (1999). Does interactive multimedia promote achievement and higher level thinking skills for today's science students? British Journal of Educational Technology, 30(4), 323–329.

Shapley, K., Sheehan, D., Maloney, C., & Caranikas-Walker, F. (2011). Effects of technology immersion on middle school students' learning opportunities and achievement. The Journal of Educational Research, 104(5), 299–315.

As you read, I want you to provide at least two comments on each article (six total) as well as two replies to classmates on each article (six total). For your two initial comments on each article, please address the following:

With the first of your comments, address some aspect of the research design. In what ways was the study framed (experimental, quasi-experimental, case study, etc.)? What are the advantages and disadvantages of framing a research study about educational technology in this manner?

With the second comment, address some aspect of the discourse surrounding educational technology. How is the technology described? What is its purpose (as a component of teaching and learning)? In what ways does this use of technology position teachers? In what ways does it position students? In short, who has power and agency as you look at the way technology is described?

Then, of course, please reply to two of your classmates in a substantive manner.

In short, your goal for discussion this week is to analyze the research designs of each study as well as to engage in substantive dialogue about what "counts" as technology, teaching, and learning in each example, drawn across multiple decades.
ABSTRACT. Past research on cognition has demonstrated that cognitive learning strategies used to complement instruction can have beneficial effects on memory and subsequent achievement. The utilization of microcomputer technology to deliver instructional content to students provides an optimum environment to examine the instructional effectiveness of embedded instructional strategies. The purpose of this study was to examine the effect of an imagery cue and an attention directing strategy within a context of a microcomputer learning environment that provided both self-paced and externally paced instruction. Achievement was measured on five different tests designed to measure different educational objectives. One hundred eighty freshman students were randomly assigned to one of nine treatment groups. The results of the study indicate that embedding an imagery cue and an attention directing strategy in an instructional sequence increases student achievement. A combination of the two embedded strategies was also effective in improving students' achievement; however, the combining of the two strategies did not have a cumulative effect. It was also determined that the effectiveness of the embedded strategies was dependent on whether the instruction was self-paced or externally paced.
THE INTEGRATION of the microcomputer into the instructional environment has led to new interest in cognitive-oriented learning strategies. Consequently, considerable research is being conducted on microcomputer-based instruction (MCBI) to determine the relative merits of self-paced versus externally paced delivery strategies (Belland et al., 1985). Although much of the early research on basic programmed instruction and computer-assisted instruction attempted to adjust for learner individual differences by using self-paced instructional models, there is indication that these self-paced models may not be appropriate for all learning conditions. For example, some researchers have noted problems with learner procrastination with completely self-paced instructional models (Reiser, 1985). Self-paced instruction has proved to be effective for some learners (Keller, 1974; Postlethwait, 1974); however, Carrier (1984) has questioned the validity of allowing students to exercise their own judgments about how much instruction they need and in what order.
Wittrock (1979) and Travers (1972), citing attentional and instructional research models, contend that self-pacing may not be the most effective delivery strategy for all instructional and learning conditions because it may reduce attention and motivation below the levels necessary for effective interaction with the content material. This conclusion was supported by Belland et al. (1985), whose study found that moderate levels of external pacing of microcomputer-based instruction were significantly more effective than completely self-paced microcomputer-based instruction in facilitating student achievement of complex concept learning and free recall of spatial problems.
Materials and Procedure
The instructional materials used in this study were developed originally by Dwyer (1972) and consisted of an 1,800-word instructional unit on the human heart describing its parts, part locations, and the internal functions during the diastolic and systolic phases.
This content was subsequently revised for this study.
This content was selected because it permitted evaluation of several types of learning objectives that are directly generalizable to those commonly taught in the classroom.
Students participating in the study were 180 first-term freshmen enrolled at Ohio State University.
Participation in the study was one of the available options for receiving extra credit in their psychology course.
After students signed up for the study, the names on the sign-up sheet were randomly assigned to the instructional treatment conditions; each instructional treatment group and the control group had 20 students.
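As a minimal sketch of this assignment procedure (Python is used purely for illustration; the names and grouping structure are hypothetical, not taken from the study):

import random

signups = ["student_{}".format(i) for i in range(1, 181)]  # the 180 names on the sign-up sheet
random.shuffle(signups)                                    # randomize the order
# nine conditions of 20 students each, filled from the shuffled list
groups = {condition: signups[condition * 20:(condition + 1) * 20] for condition in range(9)}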
Treatments
Each of the nine microcomputer-based treatment groups was designed to teach students about the parts and operation of the human heart during systolic and diastolic functioning.
The instructional content was an adaptation of the Dwyer (1972, 1978, 1987) instructional stimulus materials.
The instructional content and the sequence of content in each of the three instructional programs were identical.
During the MCBI instructional programs, students viewed and interacted with 57 different instructional segments that consisted of a visual with a verbal description and arrow, or arrows, pointing out the important information in that display.
The instructional display, in its basic form without strategies, consisted of visual information, verbal labels, and a verbal description.
The information in each instructional display was presented sequentially: the visual appeared first, then the part or operation names, and finally three to eight lines of verbal text under the visual.
There were three types of visuals used in the instructional programs.
Each instructional display consisted of some combination of one of the visuals and a verbal explanation (Figure 1).
...
Discussion
The results of this study indicate that an imagery cue strategy embedded in the instructional content increases the amount of information acquired and the students' ability to use that information.
Similar findings also resulted when the attention directing strategy was embedded into the instruction.
A combination of the two embedded strategies was also effective in improving the students' information acquisition; however, the combination of the two strategies did not have a cumulative effect. It was also determined that the effectiveness of the embedded strategies was dependent on the basic format design of the microcomputer-based instruction, e.g., self-paced versus externally paced.
The findings of this study may be explained in part by the fact that the imagery cue and attention directing strategies are different forms of rehearsal. Focusing attention allowed time for incoming information to remain in short-term memory long enough to be elaborated on and encoded for long-term memory (Anderson, 1980; Atkinson & Shiffrin, 1968; Dwyer, 1987; Lindsay & Norman, 1972; Murray & Mosberg, 1982).
...
Conclusion
The program-embedded learning strategies of imagery cue and attention directing can be used individually or in combination.
Both strategies tend to increase learning, both in the amount of information acquired and in the ability to use that information.
If the information to be learned is mostly spatial, combining the two strategies would be helpful for most learners.
However, if the basic MCBI program design is self-paced, the program-embedded learning strategies may not be as effective.
The results of this study fit past research results on cognitive learning strategies using imagery cue and attention directing. Further work needs to be done on cognitive learning strategies that are practical and can be easily used by students. Such learning strategies should be examined within the context of innovative instructional methods using the new electronic technologies. In addition, learning strategies should be evaluated in terms of the amount of information processing and the effect that the different levels of information processing have on student achievement of different types of educational objectives.
Abstract
This study examines the effects of interactive multimedia instruction on the achievement and problem-solving skills of non-science majors in an Environmental Science course at a mid-western university.
The findings indicate that the interactive multimedia had a significant effect on both variables.
The findings are discussed in terms of the impact on self-study when students are learning outside of the classroom in a distance learning environment.
...
Introduction
During the 1990s critics and supporters alike have been questioning the effectiveness of science education within the American Education System.
Pearlman (Gandolfo, 1993) states, "In the United States alone, education costs $450 billion a year.
It is a huge burden, yet almost everybody agrees that schools are failing."
Pearlman believes schools have not taken advantage of the teaching and learning enrichment that technology tools provide, and that this is one of the major reasons they are failing.
Perhaps schools are having some difficulty catching on to many of the new opportunities that technology tools offer, but computers are widespread in the schools and some good things are happening.
This article examines how schools are now using interactive computer-based multimedia as a tool to develop the thinking skills needed to assimilate and transform massive quantities of information into solutions for today's fast-paced, changing society.
Case study
One of the problem areas for schools today revolves around the lack of student interest in science.
Enrollment is down and performance is low.
However, today's generation shows an interest in things pertaining to the environment.
In this study, a package entitled Environmental Science: Computer Lab Simulations (Hirschbuhl, Bishop and Jackson, 1996), an interactive multimedia (IMM) program based on actual geological field studies, was used by undergraduate non-science majors at a mid-western university.
For this study these simulations were added to one section of an environmental studies course and compared to a section that used only the traditional method of instruction (classroom lecture).
According to Trollip and Alessi (1988) one of the purposes of adding computers to classroom instruction is to facilitate learning for students by improving the quality and quantity of what they know. Schwier and Misanchuk (1993) believe an advantage of interactive multimedia instruction is the creation of meaning developed by the learner's interaction with the new information in the program.
At the time of this investigation there was very little empirical evidence regarding the use of interactive multimedia instructional technologies in higher education (Sports and Bowman, 1995). Most articles were anecdotal, describing outstanding professors using the latest technological invention. The problem stems from the lack of research (Heller, 1990; Park, 1991; Park and Hannafin, 1993; Preece, 1993; Zachariah, 1995) regarding the effects of a self-paced interactive multimedia computer simulation on students' learning, motivation, and attitude. Reeves (1993) stated that, for a worthwhile study of interactive multimedia, a diverse population spending several hours in purposeful study should be examined.
Purpose of the study
This research examined the impact on students' grades and higher level thinking skills when computers were added to the classroom.
Interactive multimedia simulations of "real world situations" (actual field trips of a geology professor with 22 years' experience) were incorporated into one section of an environmental geology course.
The interactive multimedia modules, which promoted participation and interaction, were designed for students to gain scientific knowledge and concepts, and develop problem-solving skills without the heavy use of math.
Research design
The research design was quasi-experimental because it combined "naturally assembled" intact groups (Campbell and Stanley, 1963), a pre-test and post-test, and a control group.
The control group research design used is shown in Table 1.
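Table 1 is not reproduced in this excerpt, but in Campbell and Stanley's notation a pre-test/post-test design with intact (nonequivalent) groups is conventionally diagrammed roughly as follows, where O is an observation (test) and X is the treatment:

Treatment group:   O   X   O
Control group:     O       O

The dashed line Campbell and Stanley draw between the two rows (omitted here) signals that the groups were not formed by random assignment.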
Sample
One hundred and fifty-two students were involved in the study in the spring of 1996.
The control group (113 students) received the traditional lecture method of instruction, while interactive multimedia replaced part of the traditional instruction for the treatment group (39 students).
GALT instrument
The Group Assessment of Logical Thinking (GALT) (Roadrangka et al., 1982, 1983) was designed to measure student cognitive development.
This instrument has been used by many as a predictor of students' math and science achievement (Bitner, 1986, 1988, 1991).
Roadrangka et al. (1983) reported a coefficient alpha of 0.85 for the total test, and validity was supported by a strong correlation (0.80) between the GALT and Piagetian interview results.
The GALT consisted of 21 questions, 18 of which required the student not only to pick the most appropriate answer but also to identify the reason for choosing that answer. For a question to be counted as correct, the student had to choose both the correct answer and the correct reason. Students with a GALT score of 0 to 8 are classified as concrete thinkers, those with a score of 9 to 15 as transitional, and those with scores of 16 to 21 as formal thinkers.
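A minimal sketch of this scoring and classification rule (the item representation below is an assumption for illustration, not the instrument's actual format):

def score_galt(responses, key):
    # responses and key are parallel lists of (answer, reason) pairs;
    # for the three single-tier items the reason slot is None.
    score = 0
    for (answer, reason), (correct_answer, correct_reason) in zip(responses, key):
        if correct_reason is None:
            # single-tier item: only the answer is checked
            score += int(answer == correct_answer)
        else:
            # two-tier item: both the answer and the reason must be correct
            score += int(answer == correct_answer and reason == correct_reason)
    return score

def classify_thinker(score):
    # classification bands as described above
    if score <= 8:
        return "concrete"
    if score <= 15:
        return "transitional"
    return "formal"  # scores of 16 to 21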
Interactive multimedia instruction
Multimedia can be loosely defined as computer-based technology integrating some, but not necessarily all, of the following: text, graphics, animation, sound, and video (Barron and Orwig, 1995).
There are several definitions of interactive multimedia instruction (IMI; Galbreath, 1992) and, as the multimedia environment changes rapidly, so may the meaning of the term.
According to Schwier and Misanchuk (1993) IMI is "instructional, multiple-sourced (ie, multiple media sources are involved) intentionally designed, and coherent" (p. 4).
Interactive multimedia modules for environmental geology
The topics of the eight units of interactive multimedia are: 1) Introduction to Environmental Science, 2) Energy from Coal, 3) Geology of Homesite Selection, 4) Minerals for Society, 5) Legal Control of the Environment, 6) Stream Pollution, 7) Streams and Floods, and 8) Radiation in the Environment.
All of the modules are based on actual field studies, and the student assumes the role of investigator, with each module presenting a different environmental problem.
All modules have a consistent screen format, and each consists of an introduction to the problem to be addressed, the role that the student must play in the investigation, a means for the student to collect data relevant to the problem, a modeling book where the student enters, reviews, and draws conclusions based on the collected data, a multiple choice test, and an essay test. The essay test puts the responsibility of solving the problem on the shoulders of the student, who takes on the identity of an official in charge and must make critical decisions based on the problem and the data collected. The student is always in control and may access additional data, recheck data collected, access a glossary of appropriate terms, listen to audio associated with the problem, and elect to take or repeat the multiple choice test. From the menu page a student can access any one of the eight modules or the progress report, or send the instructor email. Students must make some data collections before they are allowed to take the test or write their essays, thus forcing active participation and preventing students from rapidly paging through the modules and declaring themselves finished.
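A minimal sketch of that gating rule (class and attribute names are hypothetical, not taken from the actual software):

class ModuleSession:
    def __init__(self):
        self.data_collected = []  # entries the student has recorded in the modeling book

    def collect(self, observation):
        self.data_collected.append(observation)

    def assessments_unlocked(self):
        # the multiple choice test and essay open only after some data have been
        # collected, forcing active participation rather than rapid paging
        return len(self.data_collected) > 0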
...
Summary of Conclusions
The following conclusions were drawn from this study:
Discussion
Overall, this study validates the effectiveness of the IMM treatment in significantly increasing student achievement and problem solving skills in environmental science. The following statements support this claim:
First, this study appears to validate the use of the GALT as a predictor of student performance, because the probability of students with a GALT score of 11 or above receiving a passing grade (B or better) was significantly greater than that of students with GALT scores below 11.
Next, both groups had post-test GALT score gains over the pre-test. The treatment group's reasoning gain was significant, while the control group's was not. When the gains in GALT scores were compared between groups, the difference was not significant. Part of the increase in the GALT scores might be attributed to students taking the same GALT test for both the pre- and the post-test.
Finally, the proportion of students with a passing grade (B or better) was significantly higher for students in the treatment group than for those in the control group. This increase was so significant that it is hard to suggest that the uncontrollable variables normally invoked to explain such differences, such as classes meeting at different times of the day or on different days of the week, being taught by different instructors, differing grading standards, or student attendance, could account for all of the variance. This result is supported by the findings of Massaro's (1995) study.
According to Wills and McNaughton (1996), educational software using interactive multimedia must actively engage the student, which is exactly what we witnessed one morning when not a sound could be heard as students sat intensely engrossed in the computer program in the Multimedia Lab.
This study validates the effectiveness of the use of interactive multimedia as field trip simulations for an environmental geology course. However, future studies should be conducted using different research designs, methodologies, disciplines, and quality software to determine the long-term consequences of the use of interactive multimedia.
Abstract
An experimental study of the Technology Immersion model involved comparisons between 21 middle schools that received laptops for each teacher and student, instructional and learning resources, professional development, and technical and pedagogical support, and 21 control schools. Using hierarchical linear modeling to analyze longitudinal survey and achievement data, the authors found that Technology Immersion had a positive effect on students’ technology proficiency and the frequency of their technology-based class activities and small-group interactions. Disciplinary actions declined, but treatment students attended school somewhat less regularly than control students. There was no statistically significant immersion effect on students’ reading or mathematics achievement, but the direction of predicted effects was consistently positive and was replicated across student cohorts.
...
Introduction
The present vision for educational technology imagines technology's infusion into all aspects of the educational system. Many educators, policymakers, and business leaders recognize technology's pervasive presence in individuals’ daily lives and its ties to future opportunities for students who must compete in a global, knowledge-based economy (Friedman, 2005). Providing the technological, informational, and communication skills needed by 21st century learners, however, challenges schools to move beyond conventional modes of teaching and learning as well as the traditional boundaries of the school day and school walls.
Some researchers believe widespread technology use in society is moving schools inevitably toward more extensive and innovative applications of technology in curriculum and instruction (Dede, 2007; Smith & Broom, 2003). This view acknowledges that students who attend schools today are different from those of previous years because using technology in nonschool settings is altering their “learning styles, strengths, and preferences” (Dede, 2007). New technologies are reshaping how students access information, communicate, and learn within and outside of classrooms (Smolin & Lawless, 2007). Schools, accordingly, must capitalize on students’ natural inclinations as learners.
Emerging technologies are also supporting more innovative forms of teaching and learning. For example, lessons supported by technology can involve real-world problems, current and authentic informational resources, virtual tours of remote locations, simulations of concepts, or interactions with practicing experts and global communities. These kinds of experiences are important because research shows that students learn more when they are engaged in meaningful, relevant, and intellectually stimulating work (Bransford, Brown, & Cocking, 2003; Newmann, Bryk, & Nagaoka, 2001). Technology-enhanced learning experiences also can help students develop 21st century competencies, such as thinking and problem solving, interpersonal and self-directional skills, and digital literacy (Partnership for 21st Century Skills, 2006).
Texas, similar to other states, recognizes that students’ long-term success is tied to their preparation as lifelong learners, world-class communicators, competitive and creative knowledge workers, and contributing members of a global society. Yet, despite high aspirations for technology, the piecemeal way in which most schools have introduced technology into the educational process has been an obstacle to the effective use of technology for teaching and learning (Texas Education Agency [TEA], 2006).
Recognizing this limitation, the Texas Legislature in 2003 set forth a different vision for technology in Texas public schools. Senate Bill 396 called for the TEA to establish a Technology Immersion Pilot (TIP) that would immerse schools in technology by providing individual wireless mobile computing devices and technology-based learning resources along with teacher training and support for effective technology use. In response, the TEA has used more than $20 million in federal Title II, Part D monies to fund Technology Immersion projects for high-need middle schools. Concurrently, a research study, partially funded by a federal Evaluating State Educational Technology Programs grant, has investigated whether exposure to Technology Immersion improves student learning and achievement.
The Present Study
The present article reports third-year findings for students involved in a comprehensive experimental study of the effects of Technology Immersion on schools, teachers, and students. Specifically, we contrast outcomes for two cohorts of middle school students who attended Technology Immersion schools with students in control schools on measures of technology-related learning experiences and competencies and measures of academic achievement (reading and mathematics test scores). We present longitudinal outcomes for Cohort 1 students who attended schools across three project implementation years (Grades 6–8) and Cohort 2 students who attended schools during two implementation years (Grades 6–7).
Research Questions
The overarching purpose of the study was to investigate the effects of Technology Immersion on students’ academic achievement—however, we also examined the relationships among Technology Immersion and intervening factors at the school, teacher, and student levels. The research involved 42 middle schools assigned to either treatment or control conditions (21 schools in each group). In the present study we addressed two research questions:
Research Question 1: What is the effect of Technology Immersion on students’ learning opportunities (i.e., classroom activities, engagement)?
Research Question 2: Does Technology Immersion affect student achievement?
...
Effects of Technology Immersion on Academic Achievement
Given that changes in students and their learning experiences were expected to mediate academic performance, we next estimated treatment effects on students’ TAKS T scores. Our analyses concentrated on reading and mathematics scores because students completed TAKS tests for those subjects annually, whereas they completed TAKS tests for writing, science, and social studies at intermittent grade levels. We used three-level HLM growth models to examine how students’ TAKS reading and mathematics achievement varied across time (the point at which students completed TAKS assessments each spring), students, and schools. As Table 5 shows, we estimated school mean rates of change as well as the separate effects of student economic disadvantage and the school poverty concentration on TAKS reading and mathematics performance. Each HLM analysis included approximately 3,000–3,330 students divided nearly equally between the 21 treatment and 21 control schools. Comparable proportions of students were retained in analyses across years (58%–59% of treatment students, 58%–61% of control students).
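The excerpt does not reproduce the model equations; as a rough illustrative specification (notation assumed here, not taken from the article), a three-level growth model of this general form can be written as

Level 1 (testing occasion t within student i within school j):
Y_{tij} = \pi_{0ij} + \pi_{1ij}(\mathrm{Time}_{tij}) + e_{tij}

Level 2 (student):
\pi_{0ij} = \beta_{00j} + \beta_{01j}(\mathrm{EconDisadvantage}_{ij}) + r_{0ij}, \qquad \pi_{1ij} = \beta_{10j} + r_{1ij}

Level 3 (school):
\beta_{00j} = \gamma_{000} + \gamma_{001}(\mathrm{Immersion}_{j}) + \gamma_{002}(\mathrm{SchoolPoverty}_{j}) + u_{00j}, \qquad \beta_{10j} = \gamma_{100} + \gamma_{101}(\mathrm{Immersion}_{j}) + u_{10j}

Under a specification like this, \gamma_{101} would capture the immersion effect on the school mean rate of change in TAKS T scores.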
Discussion
The study of Technology Immersion is distinguished from previous research on one-to-one computing environments by its experimental design and use of a theoretical framework to investigate causal mechanisms. The theory of change assumes that treatment students experience technology-rich school and classroom environments that foster more active and meaningful schoolwork, which in turn, enhance students’ personal competencies and engagement and ultimately increase academic achievement. Before discussing results, it is important to note that teachers and students in control schools typically had access to computers and digital resources in computer labs or media centers, as classroom stations (usually 1–3 computers), or on checkout laptop carts. Thus, control schools continued the traditional approach with technology integration resting largely on the motivation of individual teachers, whereas Technology Immersion schools committed to whole-school integration. In sections to follow, we discuss key findings relative to the study's research questions and the implications for one-to-one laptop programs at other schools.
Summary of Effects
Implications for Technology in Schools
The relationship between technology and student achievement continues to be an important topic and the focus of considerable research. Some recent and influential studies have raised concerns about the viability of financial investments in educational technology (e.g., Cuban, 2001; Dynarski et al., 2007). Likewise, if improved standardized test scores are the primary justification for investments in one-to-one laptop programs, then results probably will be disappointing. Evidence from this study suggests that large-scale one-to-one laptop programs are difficult to implement, and, as a result, programs may produce either very small or no improvements in test scores. Nonetheless, as the costs of laptops decline and the uses of wireless computers expand (e.g., digital textbooks and resources, online testing, school-to-home communication), interest in laptop programs is increasing (Zucker & Light, 2009; Zhao & Frank, 2003). This pilot study of the Technology Immersion model offers lessons for school leaders as well as policymakers who are considering laptop programs for their schools.
Foremost, effective technology use clearly involves more than just buying computers and software. This study and others suggest that laptop programs may be more effective when technology is part of comprehensive school reform initiatives (Ringstaff & Kelley, 2002; Zhao & Frank, 2003; Woodul, Vitale, & Scott, 2000). Successful Technology Immersion schools had highly committed administrative leaders who secured teacher buy-in for student laptops and provided the support components specified by the model. Particularly important were investments in technical support for school networks and timely laptop repairs, and the provision of ongoing professional development for teachers (Shapley, Maloney, Caranikas-Walker, & Sheehan, 2008). Consistent with other research, schools that served mainly economically disadvantaged student populations encountered numerous obstacles in trying to implement a complex school reform model (Desimone, 2002; Vernaz, Karam, Mariano, & DeMartini, 2006). Thus, those schools needed additional planning time to build capacity and secure adequate supports prior to implementing an immersion project.
Additionally, one-to-one laptop programs were more likely to be well implemented and sustained if laptops advanced overall goals for student learning and achievement. District and school leaders who embraced Technology Immersion believed that individual student laptops had benefits above and beyond simply raising standardized test scores. Financial investments in laptops were part of an overall migration toward digital school environments, including electronic textbooks, online assessments, and virtual coursework. These leaders believed laptops helped prepare their students for the 21st century, exposed them to worldwide cultures, expanded learning outside of school, and moved students toward product creation and away from drill and practice for tests. Technology Immersion supported their vision for learning opportunities that intellectually challenged and motivationally engaged students, inspired students to learn on their own, and prepared students for life, further education, and careers.
Combining the instructional strategy of redirection from the teacher with the MCBI program makes it difficult to attribute any increased learning to the MCBI program. Could it have been more effective to do this research in multiple steps?
The researchers used a quasi-experimental design with “intact groups.” This could affect the internal validity of the research because participants were not randomly assigned to the experimental group or the control group. In addition, I’m not sure they can disregard outside factors, as it has since been shown that the teacher accounts for about 30% of the variance in learner achievement (Hattie, 2003).
Reference:
Hattie, J. (2003). Teachers Make a Difference, What is the research evidence? Australian Council for Educational Research (ACER), 18. Retrieved from https://research.acer.edu.au/cgi/viewcontent.cgi?article=1003&context=research_conference_2003
I find it interesting that in 10 years’ time, the technology went from merely being a visual aid to including “interactivity.” In 1989, the visual aid was simply graphics, text and sound…in 1999 the technology added animation and video (and I’m sure better graphics).
With each new or innovative (for the time) technology, engagement increases. Should we be looking more at the effects educational technology has on engagement and motivation, rather than its effects on student achievement? Wouldn’t an increase in engagement and motivation naturally produce an increase in achievement? I’m curious to hear others’ thoughts. Technology is not a magic potion.
…technology is available, does not mean it is being used effectively or in an “innovative” manner. Our classrooms are fully one-to-one and vastly different in their technology use. We have found that to be highly dependent on the teacher and their comfort level with technology and their mindset.
This claim made by the researchers confuses me. How can the effect be statistically insignificant, but the predicted effects be positive? Don’t we use the results of a study to make our predictions? Am I misreading?
Again, I have to go back to how we, as a nation, measure student achievement. Technology can allow students to create and innovate, but that probably won’t show up in a standardized test. How can we change the way we assess students to reflect the positive impact technology provides?
The researchers’ approach was an experimental design, and their methodology was pretests and posttests. In educational research, a pretest measures attributes before the subjects receive the treatment. Posttests, on the other hand, measure attributes after the subjects receive the treatment. The pretest and posttest have advantages and disadvantages. The disadvantages are as follows: the process is time-consuming; subjects may inflate their test answers; and subjects may ask pre-questions about the test. An advantage, on the other hand, is that the researchers control the treatment and the measurement of its outcome.
In terms of teaching and learning, the researchers examined the effects of MCBI within the context “of an imagery cue and an attention directing strategy…that provided both self-paced and externally paced instruction” (p. 11). In the context of the study, the teacher was positioned outside the study as an observer or facilitator. The study, for the most part, focused on student achievement in a self-paced or externally paced learning environment. That is, MCBI was an embedded technology designed to teach students cognitively through computer-based instruction at their own pace. Thus, it puts the power of a computer-based technology at the center of the learning environment. In the outcome, MCBI alone without instruction was not effective, according to the researchers.
This study did focus on student achievement with regard to the five tests following the instructional treatments. Keeping in mind this was a study of imagery, MCBI appeared to be void of spoken instruction; although the phrase “verbal instruction” appeared below the images, it was text, and the program did not seem to provide the learners with audio for listening. Knowing student learning styles, especially when learning within a traditional environment (which contains quite a bit of verbal instruction), I wonder if the students would have increased their scores on the tests if audio had simply been added. It might have been too advanced and not as easy to do at that time, but I am curious whether it would have impacted the outcome.
The research design was quasi-experimental, and the methodology was grouping. A quasi-experimental design was useful for this study because the researchers used intact groups (Creswell, 2015), and the assignments were not random. That is, the groups were naturally assembled. However, according to the text, the disadvantage of using quasi-experimental or naturally assembled groups is that it poses a threat to internal validity. Moreover, a threat to internal validity lessens the reliability of the outcomes.
The purpose of the study was to investigate the “impact on students’ grades and higher-level thinking skills when computers were added to the classroom” (p. 30). The researchers used the Group Assessment of Logical Thinking (GALT) to measure higher-level and critical thinking skills. The researchers examined the effects of computers in the classroom alongside instruction from a teacher. The control group received instruction from the teacher, while the interactive multimedia group received a combination of instruction and media training. The student was at the center of the instruction and media training. In the outcome, the students who received the interactive multimedia treatment had significantly higher grades than the control group.
This is where I struggle to attribute success (or lack of success) to the interactive media. There is no controlling for the experience level, characteristics or nature of the teacher of the control group. We all have had teachers who are extremely engaging and motivating, and those who were not so much. I find this to be a limitation of the study. Thoughts?
The research design was experimental, and the approach was group comparison. The advantage of using experimental research in education was that researchers could compare the relationship between the independent variable (the integration of technology) and the dependent variables (student achievement). The disadvantage of using experimental research or grouping was that random error may keep the outcome from reflecting a true comparison between groups.
The purpose of immersing the technology in the classroom environment was to investigate the effects on schools, teachers, and student achievement. Both the teacher and the student were at the center of the study. In the outcome, immersing technology in the classroom had no statistically significant effect on student achievement in math or reading. The researchers also discovered that implementing computers in a classroom environment is a difficult task and a logistical nightmare.
Thanks, John, for your thorough discussion of the research designs in each of these segments from the articles. You’re definitely noting the relative strengths and weaknesses of different research designs, and as you continue to think about how you might explore educational technology in your own work, I would also encourage you to think about the ways that the technologies themselves are described.
What is it that the researchers in these different decades focused on when describing the different technologies? How is it that they attributed value to what the technology was and what it could do?
John, although the study indicated small or no improvement on test scores, state standardized assessments are “one size fits all” and do not adequately capture achievement in the true sense of student knowledge and understanding. Possibly using internal benchmarking throughout the year in both math and ELA may have proved more beneficial. I would be interested to find out more about overall course grades, attendance (both teacher and student), student academic behaviors noted, and teacher perceptions of the laptop use.
It can be a “logistical nightmare” at times, but when properly planned and phased in appropriately, student and teacher technology access 1:1 is quite powerful. Think of no longer having snow days :)
In this study, an experimental design was used with treatment groups and a control group. While this design is effective in determining the effects of a treatment, it is difficult to do in the field of education. Many times, this is due to the fact that there are so many other factors to consider when exploring student achievement.
Corinne, I did find it interesting that the students all participated in the same computer lab, not knowing what type of treatment they were experiencing, and were tested in a separate room. The researchers attempted to make the environment consistent, yet with online instruction such as MCBI, that is precisely the beauty of it: access anytime or anywhere.
The study was quantitative experimental research conducted using MCBI, with nine different instructional treatments; the control group (the ninth) simply took the five tests without the MCBI experience in the computer lab.
Advantages relate to technology use and its potential use for instruction, identifying key characteristics within the MCBI system that allow visual imagery to “improve memory of spatial information.”
Disadvantages might be the MCBI system itself as an instructional tool (keeping in mind this was 1989), with limitations in interaction and adaptation as students click through the instructional segments. Feedback was present for the activity questions, but the program lacked extensions or hyperlinks for students to go deeper into the human heart and its operations.
This is a good example of TPACK application, looking at learning strategies made possible through technology tools and systems. A self-directed system such as MCBI positions students to control their pace of learning and processing. Planning and preparation are so important when using educational technology, and the teacher's role in customizing an experience for each student is critical.
The research conducted was quasi-experimental, using pre- and post-test student data for the quantitative study, along with student grades.
Advantages may be identified through the use of interactive multimedia for instruction, exploring different real world problems with engagement required prior to assessing for understanding. Focusing on student grade data is a simple measure that most understand and can be easily related to student achievement.
Disadvantages might be identified with the student computer access; was it limited to class time or did students have continued access beyond class hours (assuming it was during class based on the number of students in the treatment group and access to a computer lab)? The addition of extra credit for participation may have narrowed the test group to the higher achieving students as well.
This one was interesting to me: by using the GALT instrument for pre/post testing, the study seemed to validate an element of educational technology's impact on higher-level thinking skills. The description of the technology used in this study did appear more engaging, especially with the email feature to the teacher. I wonder about adding one additional layer to include peer-to-peer engagement within the system and how that might relate to the collaborator ISTE standard for the students who participated.
The research conducted was experimental and longitudinal, spanning three years. The authors examined a two-year cohort as well as a three-year cohort following the implementation of 1:1 laptops for both teachers and students. It appeared to be a large group, over 40 middle schools, with 21 of them in the experimental group…depending on the size of the schools, that is a lot of people.
An advantage can be the size of the treatment group; it allows for greater generalization at that scale. The research questions focused not only on student achievement but also on the effect technology has on learning opportunities, highlighting accessibility for students and teachers.
Disadvantages might be the definition of student achievement, which was limited to state assessment scores, and the challenge of successfully implementing the program and training teachers to use the laptops effectively for instruction within the treatment schools. Creating those opportunities takes time and planning.
With this more recent study, the purpose is related to opportunity and accessibility to technology. Providing hardware is a terrific first step, but, as noted by the authors, training is critical, and school leaders need to “embrace” the immersion, creating time for learning the software and tools as well.