

Computers & Education 183 (2022) 104496

Improving student engagement during in-person classes by using functionalities of a digital learning environment

Jérôme Hutain, Nicolas Michinov *

Univ Rennes, LP3C (Psychology of Cognition, Behavior, Communication Laboratory), Rennes, France

* Corresponding author: LP3C, Université Rennes 2, Place du Recteur Henri Le Moal, 35043 Rennes Cedex, France. E-mail address: [email protected] (N. Michinov).

https://doi.org/10.1016/j.compedu.2022.104496
Received 23 August 2021; Received in revised form 3 March 2022; Accepted 7 March 2022; Available online 10 March 2022
0360-1315/© 2022 Elsevier Ltd. All rights reserved.

A R T I C L E I N F O

Keywords: Digital learning environment; Quizzes; Lecture; Student engagement; Wooclap

A B S T R A C T

The introduction of digital learning environments in higher education requires teachers to be able to optimize their use to improve student engagement in the learning process during in-person classes. In a quasi-experiment (N = 303), an increasing number of functionalities of a digital learning environment was used to examine the impact on changes in cognitive, affective, and behavioral student engagement between the beginning and the end of a series of lectures. The three conditions were: 'low number of functionalities', in which students had only to answer quizzes during the lectures; 'moderate number of functionalities', in which, in addition to quizzes, students could ask the teacher written questions at different moments during the lectures; and 'high number of functionalities', which added a functionality compared to the previous two, enabling students to visualize the teacher's slideshow for the course on their own device in real time during the lectures. Results revealed that visualizing the teacher's slideshow on their own device, in addition to quizzing and questioning, increased students' affective engagement between the beginning and end of the lectures. Furthermore, when only quizzing activities were provided, a greater proportion of students engaged behaviorally by performing additional quizzes administered one week after the end of the last lecture to prepare for the exam. The discussion addresses both the prevention of multitasking activities and students' need to self-assess by performing additional quizzes, depending on the functionalities used by the teacher during the lectures.

"Learning begins with student engagement" (Shulman, 2002, p. 38).

1. Introduction

For many years, engagement has been recognized as being at the core of the learning process (Mosher & McGowan, 1985), and extensive research has investigated how to improve student engagement in higher education (Zepke & Leach, 2010). Although a number of studies have examined student engagement in online and blended learning settings (Akkoyunlu & Soylu, 2006; Galikyan & Admiraal, 2019; Manwaring et al., 2017), less research has examined the improvement of student engagement between the beginning and end of a series of lectures using functionalities of a digital learning environment available on the students' devices during in-person classes.

Digital learning environments can be used in higher education to help teachers promote engagement in learning among students, and students are encouraged to engage in learning mediated in this way (Bergdahl et al., 2018, 2020). A number of studies have shown an increase in student engagement when different tools were used, such as web 2.0 technologies (Cakir, 2013; Schindler et al., 2017), clickers (Han & Finkelstein, 2013), virtual worlds (Pellas, 2014), and gamification (da Rocha Seixas et al., 2016). Overall, these studies demonstrated that it is not the digital learning technology itself, but how it is used by teachers, that influences student learning (Giesbers et al., 2013). Although a lively debate has recently emerged among practitioners about banning or integrating student devices during in-person classes in higher education (Vahedi et al., 2019), growing evidence suggests that teachers would gain by using them to engage their students in learning and, potentially, improve academic performance (Derounian, 2020; Elliott-Dorans, 2018). Thus, integrating technologies in lectures is a new challenge for teachers in higher education, which requires determining how to enhance student engagement during in-person classes using certain functionalities available in digital learning environments, and identifying how many of them should be used to improve learner engagement. Indeed, using some of these functionalities may help focus students on learning, not only through overt, engaging activities (e.g., answering quizzes and asking the teacher questions), but also by promoting deep and active processing of information (e.g., focusing attention on the course and preventing multitasking activities).

1.1. Digital learning environments (and their functionalities) during in-person classes

For a long time, audience response systems (ARS or clickers) have been used to engage students during lectures (Chien et al., 2016; Hunsu et al., 2016; Kay & LeSage, 2009; Wood & Shirazi, 2020), with a positive impact on academic performance and other outcomes (Buil et al., 2016; Caldwell, 2007). These systems have evolved to be available on the web and have been transformed into digital learning environments or web-based applications, providing new ways to stimulate engagement and learning during in-person classes (Papadopoulos et al., 2021; J. Schell et al., 2013). Many of the environments proposed by several EdTech companies (e.g., Kahoot!, PollEverywhere, Mentimeter, Socrative) are essentially used for administering quizzes (Wang & Tahir, 2020), notably because of the recognized positive effect of testing on learning and academic performance (McDaniel et al., 2007; Roediger et al., 2011). Although other functionalities beyond quizzes are possible using digital learning environments, they have not been systematically explored. For example, some of them may provide additional functionalities such as allowing students to ask the teacher written questions during a lecture by posting messages and/or synchronizing the teacher's slides with the students' devices to visualize slideshows in real time. Such environments offer both students and teachers new learning and teaching opportunities and novel ways of apprehending course contents. Indeed, by offering students the possibility to post messages in a written format, anxious students (and any other student) may overcome their apprehension about asking the teacher questions in large lecture halls (Beekes, 2006; Stowell et al., 2010). Similarly, because students may not be able to see the screen easily in a large lecture hall (Cunningham, 2011; Maclaren et al., 2017), offering them the possibility to visualize the teacher's slideshow on their own device should contribute to improving focus on the course contents. It may also be a useful solution to prevent distractions due to multitasking activities on their own device. Beyond quizzes, which are one of the main activities, it is important to know how teachers can use and exploit other functionalities of digital learning environments (and how many of them should be used) to improve the engagement of their students in learning.

1.2. Student engagement in learning

Definitions and ways to measure engagement have proliferated (Appleton et al., 2008; Christenson et al., 2012), and much research has also examined how engagement may contribute to learning in a variety of educational settings from primary to higher education (Lee, 2014; Siddiq et al., 2020; Xie et al., 2020). To illustrate the diversity in engagement research, a plethora of variables has been examined under the term 'engagement', including motivation, self-efficacy, self-regulation, involvement, participation and belonging (Fredricks et al., 2004; Libbey, 2004; Reschly & Christenson, 2012). A variety of terms have also been used to characterize engagement, including, for example, student engagement in studies, academic engagement, school engagement, learner engagement, and engagement in learning (Finn & Zimmer, 2012; Reschly & Christenson, 2012). Student engagement in learning is generally considered a multidimensional construct, differentiated according to a number of dimensions varying from two to four (Christenson et al., 2012). A three-dimensional approach has been proposed distinguishing cognitive, affective/emotional, and behavioral engagement (Fredricks et al., 2004; Renninger & Hidi, 2015; Schreiner & Louis, 2011). Cognitive engagement refers to investment in learning by students using deep learning strategies to integrate new information with prior existing knowledge (Greene, 2015; Greene et al., 2004; Kahu, 2013; Lam et al., 2012; Richardson & Newby, 2006). Affective/emotional engagement refers to the feelings learners have about their learning experience, both in terms of attention and interest in the course and social connection with peers (Fredricks et al., 2004; Kahu, 2013; Wimpenny & Savin-Baden, 2013). Behavioral engagement focuses on actions taken by the learner and is related to student behaviors such as attendance, time and effort spent participating in activities, involvement in activities, raising one's hand to ask or answer questions, etc. (Fredricks et al., 2016; Kahu, 2013; Mundelsee & Jurkowski, 2021; Zepke, 2014). A definition supporting the three-dimensional approach to engagement in learning describes it as "a positive energy invested in one's own learning, evidenced by meaningful processing of information, attention to what is happening in the moment, and involvement in learning activities" (Schreiner & Louis, 2006, p. 6).

Among the different ways of measuring engagement, either observation of behaviors (Appleton et al., 2008) or self-report measures (Finn & Rock, 1997; Reeve & Tseng, 2011) can be used. A self-report measure has been developed by Schreiner and Louis (2011): the Engaged Learning Index (ELI). In contrast to behavioral measures of engagement, often confounded with involvement, the Engaged Learning Index takes into account the psychological dimensions of engagement in learning: meaningful processing of information (cognitive engagement), focused attention on the course (affective engagement), and active participation during the course (behavioral engagement). As suggested by Schreiner and Louis (2011), "faculty developers could also use the ELI as a pretest and posttest measure of the effectiveness of interventions they design for the improvement of learning-centered teaching" (p. 16). In this perspective, the present study aimed to use the ELI to measure changes in the different facets of student engagement in learning between pre- and post-test according to an educational intervention.

1.3. The current study

The aim of this study was to examine an educational intervention consisting of increasing the number of functionalities of a digital learning environment, and to measure how this impacted the different aspects of engagement (cognitive, affective and behavioral) between the beginning and the end of a series of lectures. To this end, three conditions were compared, varying from a low to a high number of functionalities of a digital learning environment offered to the students by the teacher. These three conditions applied during the educational intervention were: 1) answering quizzes only ('low number of functionalities'), 2) answering quizzes and asking the teacher written questions ('moderate number of functionalities'), and 3) answering quizzes, asking the teacher questions and visualizing the teacher's slideshow for the course on their own device in real time ('high number of functionalities'). Although this latter functionality may suggest relative passivity during the lectures, it is important to consider that being engaged in learning is not simply about completing tasks (answering quizzes, asking questions, etc.), but also about thinking cognitively about what one is doing, for example by concentrating attentively on the course (see Bonwell & Eison, 1991). This additional functionality was added to prevent students from multitasking on their own devices, as has frequently been observed during lectures (Jamet et al., 2020).

Although a digital learning environment may offer various functionalities to teachers to engage their students in learning, we do not know what type and number of functionalities are necessary to increase student engagement, or which dimensions of engagement benefit most from them. An increasing number of functionalities used by the teacher should contribute to fostering student engagement during lectures, and we expected that this engagement would improve more as the number of functionalities used increased. Due to the lack of studies in the field, we were unable to formulate differentiated hypotheses for the dimensions of engagement (cognitive, affective and behavioral). In addition to self-report measures of engagement, a measure based on observation of student behaviors was used. It consisted of counting the number of students who performed additional quizzes after the lectures to self-assess their knowledge and prepare for the exams.

2. Method

2.1. Participants

The study sample consisted of 360 students in their first year of a psychology degree, of whom 57 were removed because they did not respond to the questionnaire at the beginning of the course, or because they were retakers who had already taken the course the previous year. Thus, 303 students (259 females, 39 males, 5 undetermined), aged between 17 and 36 years (M = 18.6, SD = 1.62), followed a social psychology course. During the lectures, 66.7% of students used a laptop, 27.2% a smartphone, 2% a tablet, and 4.1% used both a laptop and a smartphone.

2.2. Materials and instruments

The digital learning environment Wooclap was used in this study (https://www.wooclap.com). It is a web-based audience response system specifically designed for higher education, aiming to transform student devices into learning tools. It provides a palette of functionalities such as: (1) completing various types of quizzes (multiple-choice questions, fill in the blanks, find on an image, rating, open questions, word cloud, sorting, brainstorming, etc.); (2) asking the teacher written questions that are then displayed on a message wall when the teacher decides to show it; (3) visualizing the teacher's slideshow on their own device in real time.

2.3. Procedure

The study was conducted in accordance with the Declaration of Helsinki and was approved by the Ethics Committee of the University (No. 2020-006). It took place during the first semester, from November to December 2019, i.e., before the pandemic, when in-person courses were the norm at the university. The students were divided alphabetically into three lecture halls to take the same 6-h social psychology course on 'emotional facial expressions in nonverbal communication'. Each of the three lecture halls corresponded to one of the three experimental conditions, and in each a 2-h lecture was given by the same teacher in social psychology each week over a three-week period. The lectures were delivered in the morning on two different days of the week, and the condition attributed to each of the lecture halls had been randomly selected before the beginning of the course. None of the other lectures given by other teachers over the semester in different disciplines of psychology used a digital learning environment (Wooclap or another application). At the beginning of the first lecture, and after completing a consent form, the students filled in a web questionnaire containing measures of engagement in learning (used as pretest) and sociodemographic measures. In order to match the students' answers on the two questionnaires administered at the beginning and at the end of the lectures, they were each allocated a unique anonymous code.

In the 'low number of functionalities' condition (n = 110), students had the opportunity to answer nine different quizzes administered by the teacher at different moments in the lectures, distributed evenly over the three lectures (three quizzes per lecture). Each quiz appeared on the students' devices and they selected the correct answer to a question by clicking on it. An example of a quiz question, for which one of the four proposed answers was correct, is: "An adult expresses fear on his or her face when seeing a child wearing a scary mask during a Halloween party. This is a process of: a) amplification, b) simulation (correct answer), c) masking, d) deamplification." These quizzes aimed to check that the students had understood some theoretical notions of the course previously presented by the teacher. After the students had answered a quiz, feedback was provided by the teacher, allowing students in the lecture hall to visualize the proportion of answers for each answer option, with the correct answer displayed in green (see Appendix A). The feedback could be visualized by all the students on a screen in the lecture hall by means of a video projector, but their personal responses (correct or incorrect) only appeared on their own device in a private mode.

In the 'moderate number of functionalities' condition (n = 75), in addition to answering the same quizzes, the students could ask the teacher written questions at any moment during the lecture by clicking on a 'message' button. Moreover, approximately every 40 min during each lecture (twice per lecture), the teacher initiated a 'minute of questions' when the students could ask questions about the previous part of the lecture. The teacher instigated this by displaying the following message on a slide in the digital learning environment: "It's time to break for 'a minute of questions': ask your questions about the part of the course you have just followed to clear up any doubts, misunderstandings, or ambiguities. You have 1 min to post any questions you like." After each of the 1-min sessions, the teacher displayed the 'message wall' where the questions could be visualized by all the students in the lecture hall without revealing the identity of the people who had posted them (see Appendix B). Some of them were answered immediately by the teacher for five to 10 min, depending on the number of questions. The teacher chose the questions requiring an immediate answer according to whether they were a prerequisite to understanding the rest of the lecture. After the last lecture, the remaining questions requiring less urgent responses were recorded in a 'Frequently Asked Questions' section and the answers were made available to the students in the Moodle Learning Management System.

In the 'high number of functionalities' condition (n = 118), in addition to the other two functionalities of completing a series of quizzes and asking the teacher questions, the students could visualize the teacher's slideshow for the course on their own device in real time. The teacher's slides were synchronized with the students' devices in real time, and the students could not move forward or backward in the slideshow on their own device.

Under all conditions, the students could visualize the teachers slideshow projected on a screen in the lecture hall. All the students also had the opportunity to download the course slideshow at the end of each lecture. In addition to the nine quizzes for which feedback was displayed with the correct answers, all the students watched nine short videos of around 2 min each during the lectures (three per lecture) which illustrated some notions of the course. They also completed 12 tests illustrating some parts of the course without receiving any feedback (e.g., a facial expression was presented on a photograph and the students had to identify what emotion this face expressed by selecting the right emotion on a list).

At the end of the third and last lecture, and before leaving the lecture hall, the students filled in a web questionnaire containing measures of engagement in learning (used as post-test) and questions to verify the efficacy of the experimental manipulation. One week after the lecture, the students received a message in the discussion forum of the Learning Management System, notified by e-mail, inviting them to answer a series of 20 additional quizzes freely and at their own pace by clicking on a web link. These quizzes were administered on the same digital learning environment for self-assessment before the exam, the latter taking the form of a series of similar multiple-choice questions.

2.4. Measures

2.4.1. Manipulation check. To verify the efficacy of the experimental manipulation based on the use of the functionalities of the digital learning environment, students had to answer a series of three questions on a six-point Likert-type scale ranging from 1 (strongly disagree) to 6 (strongly agree). They were asked to what extent, when using the digital learning environment Wooclap, it was possible: (1) to answer the quizzes administered by the teacher, (2) to ask the teacher questions, and (3) to follow the course slideshow on their own device.

2.4.2. Engagement in learning. The Engaged Learning Index (ELI) was used to measure student engagement in learning (Schreiner & Louis, 2011). Its short format enabled it to be administered rapidly during a lecture. The ELI measures the cognitive, affective and behavioral dimensions of an individual student's level of engagement in the learning process. It consists of 10 items, each expressing a positive or negative statement to which the student responds on a six-point Likert-type scale ranging from 1 (strongly disagree) to 6 (strongly agree). The items were translated into French, and some of them were adapted to the present educational context in two ways (see supplementary materials): (1) removing the plural to describe the current class instead of classes in general (items 2, 3, 4 and 9), and (2) converting into the present tense instead of referring to the past (item 7). It was administered twice, at the beginning of the first lecture (pretest), and three weeks later, at the end of the last lecture (post-test).

Preliminary analyses of the data collected immediately prior to the first lecture (i.e., before dividing the students into one of the three experimental conditions) were performed to check whether the three-factor structure of the ELI was found using an adaptation of the scale (back-)translated into French. The fit of the three-factor model was assessed with classical indices: the Comparative Fit Index (CFI), the Tucker-Lewis Index (TLI), the Root Mean Squared Error of Approximation (RMSEA), and the Standardized Root Mean Squared Residual (SRMR). The fit indices were interpreted using Hu and Bentler's (1999) suggested values, which should be close to 0.95 for CFI and TLI, close to 0.06 for RMSEA, and close to 0.08 for SRMR. Firstly, an Exploratory Factor Analysis using the 'maximum likelihood' extraction method in combination with a 'varimax' rotation confirmed the three-factor structure of the Engaged Learning Index, χ2(18) = 42.6, p < .001, TLI = 0.92, RMSEA = 0.06. Secondly, a Confirmatory Factor Analysis ('maximum likelihood' method) indicated satisfactory fit indices for a three-factor structure, χ2(32) = 105.0, p < .001, TLI = 0.87, CFI = 0.91, SRMR = 0.06, RMSEA = 0.08, AIC = 8911, BIC = 9034.
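For readers who wish to reproduce this kind of factor-structure check outside Jamovi, the sketch below runs a maximum-likelihood EFA with varimax rotation in Python. The factor_analyzer package, the DataFrame of item responses and the generic factor labels are assumptions for illustration; mapping the three factors onto the cognitive, affective and behavioral dimensions is left to inspection of the loadings.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

def explore_eli_structure(eli_items: pd.DataFrame) -> pd.DataFrame:
    """Maximum-likelihood EFA with varimax rotation on the 10 ELI items.

    eli_items: one row per student, one column per item (pretest responses).
    Returns the rotated loading matrix with generic factor labels.
    """
    fa = FactorAnalyzer(n_factors=3, rotation="varimax", method="ml")
    fa.fit(eli_items)
    loadings = pd.DataFrame(
        fa.loadings_,
        index=eli_items.columns,
        columns=["factor_1", "factor_2", "factor_3"],
    )
    return loadings.round(2)
```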

A reliability analysis for each dimension of the scale gave acceptable values (>0.70): cognitive engagement or meaningful processing of information (Cronbach's alpha = .75), affective engagement or focused attention on the course (Cronbach's alpha = .80), and the behavioral dimension or active participation in class (Cronbach's alpha = .67). As active participation in class contains only two items, it is not surprising that a relatively low alpha coefficient was obtained (i.e., just below the conventional threshold of 0.70). Indeed, Cronbach's alpha is based on a set of restrictive assumptions such as unidimensionality, uncorrelated errors and tau-equivalence, and with only two items it is impossible to test these assumptions (Eisinga et al., 2013). After averaging the items of each dimension for the two measurement times, an index of engagement was computed by subtracting the pretest from the post-test scores on each of the three ELI dimensions. A positive index reveals that engagement increased over the course, whereas a negative index indicates that engagement decreased.
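A minimal sketch of this scoring procedure is given below, assuming hypothetical item column names; it simply illustrates the Cronbach's alpha formula and the post-minus-pre change index described above, and is not the authors' own script.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for one ELI dimension (rows = students, columns = items)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

def change_index(df: pd.DataFrame, pre_items, post_items) -> pd.Series:
    """Post-test minus pre-test mean score for one ELI dimension.

    Positive values mean engagement increased between the first and the
    last lecture; negative values mean it decreased.
    """
    return df[post_items].mean(axis=1) - df[pre_items].mean(axis=1)

# Example with hypothetical column names:
# df["affective_change"] = change_index(
#     df,
#     pre_items=["aff1_pre", "aff2_pre", "aff3_pre"],
#     post_items=["aff1_post", "aff2_post", "aff3_post"],
# )
```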

2.4.3. Additional measure of behavioral engagement. Twenty additional quizzes were proposed to the students one week after the last lecture (and three weeks before the exams). They were available to all the students on the digital learning environment from a web link. After performing the quizzes, students obtained the correct answer to each one in order to self-assess their knowledge. The number of students performing the additional quizzes was counted for each condition as a measure of behavioral engagement.

3. Results

All the data analyses were performed using Jamovi (The Jamovi Project, 2020).

3.1. Manipulation check

A MANOVA was conducted on the three measures to check the efficacy of the experimental manipulation. No significant effect was found on answering the quizzes administered by the teacher between the three conditions, F(2, 144) = 0.14, p = .87, η² = 0.002, whereas significant effects were observed on asking the teacher questions and on following the course slideshow on the students' own device. A series of ANOVAs was then conducted on each dependent variable corresponding to the manipulation check.

Results revealed a difference between the experimental conditions on asking the teacher written questions using the digital learning environment, F(2, 144) = 21.9, p < .001, η² = 0.23: students considered that it was more possible to ask questions in the 'moderate number of functionalities' and 'high number of functionalities' conditions than in the 'low number of functionalities' condition. A difference was also found on the visualization of the course slideshow, F(2, 144) = 3.20, p = .043, η² = 0.04: students considered that it was more possible to follow the slides on their own device in the 'high number of functionalities' condition than in the other two conditions (see Table 1). Taken together, these results demonstrate the efficacy of the experimental manipulation based on the cumulative use of three different functionalities of the digital learning environment.
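As an illustration of how one of these follow-up ANOVAs could be reproduced outside Jamovi, the short Python sketch below runs a one-way ANOVA on simulated 6-point ratings for a single manipulation-check item; the group sizes and ratings are placeholders, not the study data.

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)

# Placeholder 6-point ratings for the item "it was possible to ask the
# teacher questions", one array per lecture hall (condition).
low      = rng.integers(1, 7, size=50)   # 'low number of functionalities'
moderate = rng.integers(3, 7, size=50)   # 'moderate number of functionalities'
high     = rng.integers(3, 7, size=50)   # 'high number of functionalities'

F, p = f_oneway(low, moderate, high)
df_within = len(low) + len(moderate) + len(high) - 3
print(f"F(2, {df_within}) = {F:.2f}, p = {p:.3f}")
```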

3.2. Engagement in learning

A first MANOVA considering the experimental condition as the between-subject factor was performed on the three dimensions of engagement at the beginning of the first lecture. It revealed no significant difference between the conditions on any of the measures, suggesting a random distribution of students across the three conditions. No significant difference between the conditions was observed on cognitive engagement (meaningful processing of information), F(2, 300) = 0.877, p = .42, affective engagement (focused attention on learning contents), F(2, 300) = 2.49, p = .08, or behavioral engagement (active participation), F(2, 300) = 1.91, p = .14.
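A minimal sketch of such a baseline check is given below, assuming a pandas DataFrame with hypothetical column names for the averaged pretest scores and the assigned condition; the paper's analyses were run in Jamovi, so this is only an illustrative Python equivalent using statsmodels.

```python
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

def baseline_equivalence_check(df: pd.DataFrame):
    """One-way MANOVA on the three pretest engagement scores.

    Expects hypothetical columns 'cognitive_pre', 'affective_pre',
    'behavioral_pre' (averaged ELI scores at pretest) and 'condition'
    (the lecture hall each student was assigned to).
    """
    manova = MANOVA.from_formula(
        "cognitive_pre + affective_pre + behavioral_pre ~ C(condition)",
        data=df,
    )
    # mv_test() reports Wilks' lambda, Pillai's trace, etc. for the factor.
    return manova.mv_test()
```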

A second MANOVA was performed on the index measuring the difference between pre- and post-test on each dimension of engagement. A significant difference was observed only on affective engagement, measured by focused attention on the lectures. An ANOVA performed on the index of affective engagement demonstrated a significant difference between the three conditions, F(2, 144) = 3.06, p = .05, η² = 0.04. As predicted, an a priori contrast revealed that the improvement in affective engagement was greater in the 'high number of functionalities' condition than in the other two conditions, t(144) = 2.441, p = .016 (see Fig. 1 and Table 2). No difference between the conditions was observed on cognitive engagement, F(2, 144) = 0.317, p = .32, or behavioral engagement, F(2, 144) = 0.327, p = .33. To explore further, we verified whether the number of questions posted by students and displayed on the 'message wall' varied between the 'moderate number of functionalities' and 'high number of functionalities' conditions, but no difference was found (n = 32 and n = 30, respectively).
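The a priori contrast reported here pits the 'high number of functionalities' condition against the mean of the other two. A minimal numpy/SciPy sketch of such a pooled-error contrast is shown below; the function and variable names are ours, and it only illustrates the computation, not the authors' script.

```python
import numpy as np
from scipy import stats

def planned_contrast(groups, weights=(-1, -1, 2)):
    """A priori contrast between independent group means using the pooled error term.

    groups:  one 1-D array of change scores (post-test minus pre-test) per
             condition, e.g. [low, moderate, high].
    weights: contrast coefficients summing to zero; (-1, -1, 2) compares the
             'high' condition with the mean of the two other conditions.
    """
    n = np.array([len(g) for g in groups])
    means = np.array([np.mean(g) for g in groups])
    ss_within = sum(np.sum((g - m) ** 2) for g, m in zip(groups, means))
    df_error = int(n.sum()) - len(groups)
    ms_within = ss_within / df_error            # pooled within-group variance
    c = np.asarray(weights, dtype=float)
    t = c @ means / np.sqrt(ms_within * np.sum(c ** 2 / n))
    p = 2 * stats.t.sf(abs(t), df_error)        # two-tailed p-value
    return t, df_error, p
```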

Table 1
Mean differences (and standard deviations in parentheses) between the experimental conditions on each item of the manipulation check ("Using the digital learning environment, it was possible...").

To answer the quizzes administered by the teacher: Low (Quizzing) 5.33 a (1.05); Moderate (Quizzing + Questioning) 5.41 a (0.68); High (Quizzing + Questioning + Slideshow) 5.30 a (0.94); F(2, 144) = 0.14, p = .87
To ask the teacher questions: Low 3.52 a (1.52); Moderate 4.93 b (1.10); High 5.03 b (1.22); F(2, 144) = 21.9***
To follow the course slideshow on one's own device: Low 4.10 a (1.68); Moderate 4.28 a (1.03); High 4.75 b (1.30); F(2, 144) = 3.20*

Note. Values with differing subscripts within rows are significantly different at p < .05.
*p < .05. ***p < .001.

Fig. 1. Improvement in affective engagement from pre- to post-test according to the condition.

Table 2
Means (and standard deviations) of difference scores (post-test minus pre-test) on the cognitive, affective, and behavioral dimensions of student engagement in the experimental conditions.

3.3. Additional measure of behavioral engagement

The number of students who completed the additional self-assessment quizzes represented only 28.38% of the sample (n = 86). A chi-square analysis was performed to compare the proportion of students who did or did not carry out the quizzes in each condition. Contrary to our predictions, a significantly higher proportion of students from the 'low number of functionalities' condition completed the additional quizzes (40.9%) compared to the other two conditions in which a 'moderate' and a 'high' number of functionalities were used (25.4% and 14.7%, respectively), χ2(2, N = 303) = 15.9, p < .001, θ = 0.23.
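The sketch below shows how such a chi-square test of independence can be run with SciPy. The cell counts are approximations reconstructed from the reported percentages and group sizes (they are not given explicitly in the paper), so the resulting statistic will not exactly match the value reported above.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Approximate counts of students who completed the additional quizzes,
# reconstructed from 40.9% of 110, 25.4% of 75 and 14.7% of 118
# (low, moderate, high conditions); replace with the actual counts if known.
completed     = np.array([45, 19, 17])
group_sizes   = np.array([110, 75, 118])
not_completed = group_sizes - completed

table = np.vstack([completed, not_completed])   # 2 x 3 contingency table
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2({dof}, N = {group_sizes.sum()}) = {chi2:.1f}, p = {p:.4f}")
```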

4. Discussion

The question of how to promote student engagement during in-person courses using functionalities of digital learning environments is a new challenge in modern higher education, and teaching would be enhanced by understanding how engagement in learning can be increased and which facets are involved (cognitive, affective, or behavioral). Nowadays, various functionalities of digital learning environments can be used beyond quizzes, and thus different solutions can be proposed to promote student engagement during in-person classes. In a pre- and post-test study design in which engagement in learning was measured at the beginning and end of a series of lectures delivered by the same teacher, we expected that student engagement from the first to the last lecture would improve as the number of functionalities of the digital learning environment used by the teacher increased. Results partly supported this prediction, revealing only an improvement in affective student engagement in learning, with focused attention on the course being greater under the 'high number of functionalities' condition than under the other two conditions. The 'high number of functionalities' condition differed from the other two conditions by providing students with the additional possibility of visualizing the teacher's course slideshow in real time during the lectures. In addition to quizzing and questioning, this additional functionality contributed to increasing focused attention on the course from the first to the last lecture: students paid more attention to the lecture, and were less bored and less distracted than under the other conditions. A possible explanation for improved affective student engagement in learning could be the reduction in multitasking activities (Patterson, 2017; Tassone et al., 2020) and/or mind wandering during the lectures (Smallwood & Schooler, 2006). Indeed, seeing the slideshow on their own device could help students focus on the lectures, preventing external distractions such as multitasking (May & Elder, 2018) and/or internal distractions such as mind wandering (Wammes et al., 2019). These findings should nevertheless be considered with caution, because visualizing the slideshow both on their own device and on the screen in the lecture hall may have led to attentional conflict. A priori, however, this is not what occurred, probably because having the teacher's slideshow on their own device made the slides more readily accessible during lectures, reducing the cognitive load caused by switching attention between the central screen displaying the course content and their note-taking. Although we did not measure how students took notes, it seems that most students used a 'copy and paste' procedure, copying and pasting each slide into a text file and taking notes alongside it. Of course, this procedure was impossible in the first two conditions in which the teacher's slideshow was not available on the students' devices. Observational measurements would have been useful to capture note-taking and multitasking activities, but in this 'naturalistic' situation it was not possible to do this in a systematic and rigorous way. Although adding the functionality that allowed students to have the teacher's synchronized slideshow on their own device could have made note-taking easier, it also prevented students from multitasking, as they could only follow the lecture from the slides and from what the teacher said. Other studies should be conducted in the future to examine whether this functionality contributes to increasing note-taking and/or to decreasing multitasking.

A number of studies have demonstrated that affective/emotional engagement refers to the feelings learners have about their learning experience, both regarding attention to the course contents and social connection with peers (Fredricks et al., 2004; Kahu, 2013; Wimpenny & Savin-Baden, 2013). Only attention to the lectures was taken into consideration in the present study, and future research should also consider social connection with peers as a measure of affective engagement. The affective/emotional dimension may include students' positive responses, for example expressions of interest in and attention to the course, but also negative responses such as boredom (Bergdahl et al., 2020). In this perspective, it would be fruitful to use a more integrative scale for measuring student engagement beyond its three classic dimensions (Dierendonck et al., 2020), and/or to add a social dimension of engagement (Bergdahl et al., 2020). It would also be useful in future research to use more interactive learning situations (Chi & Wylie, 2014), for example by setting up peer instruction sessions during lectures (Mazur, 1997; Michinov et al., 2015; Michinov et al., 2020; Morice et al., 2015; Schell & Butler, 2018).

In addition to the main result on the improvement of affective engagement from the first to the last lecture (i.e., greater attention to the course), it also appeared that when the students had the opportunity to perform additional self-assessment quizzes, only about 30% completed them. More importantly, among the students who did the additional quizzes after the last lecture, nearly 41% were in the condition where only quizzes were administered during the lectures. In other words, they had not been given the possibility to ask the teacher written questions via the digital learning environment. As these students had not asked any questions during the lectures, they did not have the opportunity to reduce uncertainty about their understanding of some notions. Thus, it is possible that they felt a greater need to self-assess before the exams by doing additional quizzes than those in the other conditions. In other words, students who were unable to question the teacher during lectures were more engaged behaviorally after the lectures, as more of them completed the additional quizzes to prepare for the exams. However, this result should be treated with caution because of the small number of students who did the additional quizzes (about 30% of the sample).

Among the various ways to measure student engagement through observable behaviors, class participation or interactions with peers and teachers are often privileged (Fredricks & McColskey, 2012). In contrast, in the present study, behavioral engagement was measured by the completion of additional quizzes proposed to students after the lectures in order to prepare for the exams. This measure also contrasts with other measures used to assess student engagement in digital environments, which are generally based on log data (e.g., time spent in the system), despite their limitations, as they do not capture the psychological aspects of engagement (Henrie et al., 2018). Although differences in behavioral engagement measured by completion of additional quizzes after the lectures were found, no difference between the conditions was found on the (self-reported) behavioral dimension of the Engaged Learning Index. This absence of effect on self-reported behavioral engagement may be due to the number and type of items used in the ELI scale. Indeed, only two items were used in the ELI to measure behavioral engagement, and they focused only on participation in class, either with peers or with the teacher.

Among the main limitations of the present study, the type of engagement may be questioned. Engagement can be considered either as a process or as an outcome (Appleton et al., 2008; Skinner et al., 2008). In this study it was essentially considered as an outcome, potentially affected by the functionalities of the digital environment used by the teacher. It would be fruitful in future studies to consider engagement as a mediating process between educational intervention and academic performance. Another limitation concerns the absence of a control condition in which traditional lectures were delivered without any use of the digital learning environment, and consequently with zero functionality. Although this condition would have been useful, we decided not to introduce it in the present study for ethical reasons. Indeed, several meta-analyses have revealed that student learning improves in active lectures compared to traditional lectures (Balta et al., 2017; Freeman et al., 2014; Hake, 1998). Consequently, we thought that it would not be appropriate to incorporate a control condition in our design at the risk of student failure, even though our study did not measure academic performance but only engagement in learning.

5. Conclusion

This study provides teachers with solutions to improve student engagement using some functionalities of a digital learning environment. As previously suggested in the literature on engagement (Handelsman et al., 2005), helping students to become engaged in a course may be as important as teaching knowledge and developing skills. From this perspective, the present findings suggest that using certain functionalities of a digital learning environment may contribute to achieving this objective. They also confirm that it is not the integration of technologies in lectures per se that improves student engagement, but rather the way teachers use them in implementing a set of functionalities which leads to students being more or less engaged in learning.

Funding

The authors received no financial support for the research, authorship, and/or publication of this article. Data and supplemental materials are available on our OSF project page:

https://osf.io/nmj8p/

Credit author statement

Jérôme Hutain and Nicolas Michinov worked in a collaborative fashion on this study. Together, they conceptualized the study, developed methods, wrote the paper, prepared registrations and study materials, and analysed the data.

Declaration of competing interest

The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Appendix C. Supplementary data

Supplementary data to this article can be found online at https://doi.org/10.1016/j.compedu.2022.104496.

Appendix A. Illustration of feedback delivered to students after a quiz with the correct answer given by the teacher

Appendix B. Illustration of a message wall on which student questions are displayed


References

Akkoyunlu, B., & Soylu, M. Y. (2006). A study on students' views on blended learning environment. The Turkish Online Journal of Distance Education, 7(3), 43–56.
Appleton, J. J., Christenson, S. L., & Furlong, M. J. (2008). Student engagement with school: Critical conceptual and methodological issues of the construct. Psychology in the Schools, 45(5), 369–386. https://doi.org/10.1002/pits.20303
Balta, N., Michinov, N., Balyimez, S., & Ayaz, M. F. (2017). A meta-analysis of the effect of Peer Instruction on learning gain: Identification of informational and cultural moderators. International Journal of Educational Research, 86, 66–77. https://doi.org/10.1016/j.ijer.2017.08.009
Beekes, W. (2006). The 'Millionaire' method for encouraging participation. Active Learning in Higher Education, 7(1), 25–36. https://doi.org/10.1177/1469787406061143
Bergdahl, N., Fors, U., Hernwall, P., & Knutsson, O. (2018). The use of learning technologies and student engagement in learning activities. Nordic Journal of Digital Literacy, 13(2), 113–130. https://doi.org/10.18261/issn.1891-943x-2018-02-04
Bergdahl, N., Nouri, J., Fors, U., & Knutsson, O. (2020). Engagement, disengagement and performance when learning with technologies in upper secondary school. Computers and Education, 149, 103783. https://doi.org/10.1016/j.compedu.2019.103783
Bonwell, C. C., & Eison, J. A. (1991). Active learning: Creating excitement in the classroom (ASHE-ERIC Higher Education Report No. 1). George Washington University, School of Education and Human Development.
Buil, I., Catalán, S., & Martínez, E. (2016). Do clickers enhance learning? A control-value theory approach. Computers and Education, 103, 170–182. https://doi.org/10.1016/j.compedu.2016.10.009
Cakir, H. (2013). Use of blogs in pre-service teacher education to improve student engagement. Computers and Education, 68, 244–252. https://doi.org/10.1016/j.compedu.2013.05.013
Caldwell, J. E. (2007). Clickers in the large classroom: Current research and best-practice tips. CBE-Life Sciences Education, 6(1), 9–20. https://doi.org/10.1187/cbe.06-12-0205
Chien, Y.-T., Chang, Y.-H., & Chang, C.-Y. (2016). Do we click in the right way? A meta-analytic review of clicker-integrated instruction. Educational Research Review, 17, 1–18. https://doi.org/10.1016/j.edurev.2015.10.003
Chi, M. T. H., & Wylie, R. (2014). The ICAP framework: Linking cognitive engagement to active learning outcomes. Educational Psychologist, 49(4), 219–243. https://doi.org/10.1080/00461520.2014.965823
Christenson, S. L., Reschly, A. L., & Wylie, C. (Eds.). (2012). Handbook of research on student engagement. Springer US. https://doi.org/10.1007/978-1-4614-2018-7
Cunningham, B. M. (2011). Introductory accounting as theater: A look behind the scenes of large-lecture production. Issues in Accounting Education, 26(4), 815–833. https://doi.org/10.2308/iace-50056
Derounian, J. G. (2020). Mobiles in class? Active Learning in Higher Education, 21(2), 142–153. https://doi.org/10.1177/1469787417745214
Dierendonck, C., Milmeister, P., Kerger, S., & Poncelet, D. (2020). Examining the measure of student engagement in the classroom using the bifactor model: Increased validity when predicting misconduct at school. International Journal of Behavioral Development, 44(3), 279–286. https://doi.org/10.1177/0165025419876360
Eisinga, R., Grotenhuis, M. te, & Pelzer, B. (2013). The reliability of a two-item scale: Pearson, Cronbach, or Spearman-Brown? International Journal of Public Health, 58(4), 637–642. https://doi.org/10.1007/s00038-012-0416-3

Elliott-Dorans, L. R. (2018). To ban or not to ban? The effect of permissive versus restrictive laptop policies on student outcomes and teaching evaluations. Computers and Education, 126, 183–200. https://doi.org/10.1016/j.compedu.2018.07.008
Finn, J. D., & Rock, D. A. (1997). Academic success among students at risk for school failure. Journal of Applied Psychology, 82(2), 221–234. https://doi.org/10.1037/0021-9010.82.2.221
Finn, J. D., & Zimmer, K. S. (2012). Student engagement: What is it? Why does it matter? In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. 97–131). Springer US. https://doi.org/10.1007/978-1-4614-2018-7_5
Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research, 74(1), 59–109. https://doi.org/10.3102/00346543074001059
Fredricks, J. A., & McColskey, W. (2012). The measurement of student engagement: A comparative analysis of various methods and student self-report instruments. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. 763–782). Springer US. https://doi.org/10.1007/978-1-4614-2018-7_37
Fredricks, J. A., Wang, M.-T., Schall Linn, J., Hofkens, T. L., Sung, H., Parr, A., & Allerton, J. (2016). Using qualitative methods to develop a survey measure of math and science engagement. Learning and Instruction, 43, 5–15. https://doi.org/10.1016/j.learninstruc.2016.01.009
Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410–8415. https://doi.org/10.1073/pnas.1319030111
Galikyan, I., & Admiraal, W. (2019). Students' engagement in asynchronous online discussion: The relationship between cognitive presence, learner prominence, and academic performance. The Internet and Higher Education, 43, 100692. https://doi.org/10.1016/j.iheduc.2019.100692
Giesbers, B., Rienties, B., Tempelaar, D., & Gijselaers, W. (2013). Investigating the relations between motivation, tool use, participation, and performance in an e-learning course using web-videoconferencing. Computers in Human Behavior, 29(1), 285–292. https://doi.org/10.1016/j.chb.2012.09.005
Greene, B. A. (2015). Measuring cognitive engagement with self-report scales: Reflections from over 20 years of research. Educational Psychologist, 50(1), 14–30. https://doi.org/10.1080/00461520.2014.989230
Greene, B. A., Miller, R. B., Crowson, H. M., Duke, B. L., & Akey, K. L. (2004). Predicting high school students' cognitive engagement and achievement: Contributions of classroom perceptions and motivation. Contemporary Educational Psychology, 29(4), 462–482. https://doi.org/10.1016/j.cedpsych.2004.01.006
Hake, R. R. (1998). Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses. American Journal of Physics, 66(1), 64–74. https://doi.org/10.1119/1.18809
Handelsman, M. M., Briggs, W. L., Sullivan, N., & Towler, A. (2005). A measure of college student course engagement. The Journal of Educational Research, 98(3), 184–192. https://doi.org/10.3200/JOER.98.3.184-192
Han, J. H., & Finkelstein, A. (2013). Understanding the effects of professors' pedagogical development with clicker assessment and feedback technologies and the impact on students' engagement and learning in higher education. Computers and Education, 65, 64–76. https://doi.org/10.1016/j.compedu.2013.02.002
Henrie, C. R., Bodily, R., Larsen, R., & Graham, C. R. (2018). Exploring the potential of LMS log data as a proxy measure of student engagement. Journal of Computing in Higher Education, 30(2), 344–362. https://doi.org/10.1007/s12528-017-9161-1
Hu, L., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6(1), 1–55. https://doi.org/10.1080/10705519909540118
Hunsu, N. J., Adesope, O., & Bayly, D. J. (2016). A meta-analysis of the effects of audience response systems (clicker-based technologies) on cognition and affect. Computers and Education, 94, 102–119. https://doi.org/10.1016/j.compedu.2015.11.013
Jamet, E., Gonthier, C., Cojean, S., Colliot, T., & Erhel, S. (2020). Does multitasking in the classroom affect learning outcomes? A naturalistic study. Computers in Human Behavior, 106, 106264. https://doi.org/10.1016/j.chb.2020.106264
Kahu, E. R. (2013). Framing student engagement in higher education. Studies in Higher Education, 38(5), 758–773. https://doi.org/10.1080/03075079.2011.598505
Kay, R. H., & LeSage, A. (2009). Examining the benefits and challenges of using audience response systems: A review of the literature. Computers and Education, 53(3), 819–827. https://doi.org/10.1016/j.compedu.2009.05.001

Lam, S., Wong, B. P. H., Yang, H., & Liu, Y. (2012). Understanding student engagement with a contextual model. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. 403–419). Springer US. https://doi.org/10.1007/978-1-4614-2018-7_19
Lee, J.-S. (2014). The relationship between student engagement and academic performance: Is it a myth or reality? The Journal of Educational Research, 107(3), 177–185. https://doi.org/10.1080/00220671.2013.807491
Libbey, H. P. (2004). Measuring student relationships to school: Attachment, bonding, connectedness, and engagement. Journal of School Health, 74(7), 274–283. https://doi.org/10.1111/j.1746-1561.2004.tb08284.x
Maclaren, P., Wilson, D., & Klymchuk, S. (2017). I see what you are doing: Student views on lecturer use of tablet PCs in the engineering mathematics classroom. Australasian Journal of Educational Technology. https://doi.org/10.14742/ajet.3257
Manwaring, K. C., Larsen, R., Graham, C. R., Henrie, C. R., & Halverson, L. R. (2017). Investigating student engagement in blended learning settings using experience sampling and structural equation modeling. The Internet and Higher Education, 35, 21–33. https://doi.org/10.1016/j.iheduc.2017.06.002
May, K. E., & Elder, A. D. (2018). Efficient, helpful, or distracting? A literature review of media multitasking in relation to academic performance. International Journal of Educational Technology in Higher Education, 15(1), 13. https://doi.org/10.1186/s41239-018-0096-z
Mazur, E. (1997). Peer instruction: A user's manual. Series in Educational Innovation. Prentice Hall.
McDaniel, M. A., Anderson, J. L., Derbish, M. H., & Morrisette, N. (2007). Testing the testing effect in the classroom. European Journal of Cognitive Psychology, 19(4–5), 494–513. https://doi.org/10.1080/09541440701326154
Michinov, N., Anquetil, E., & Michinov, E. (2020). Guiding the use of collective feedback displayed on heatmaps to reduce group conformity and improve learning in Peer Instruction. Journal of Computer Assisted Learning, 36(6), 1026–1037. https://doi.org/10.1111/jcal.12457
Michinov, N., Morice, J., & Ferrières, V. (2015). A step further in peer instruction: Using the stepladder technique to improve learning. Computers and Education, 91, 1–13. https://doi.org/10.1016/j.compedu.2015.09.007
Morice, J., Michinov, N., Delaval, M., Sideridou, A., & Ferrières, V. (2015). Comparing the effectiveness of peer instruction to individual learning during a chromatography course. Journal of Computer Assisted Learning, 31(6), 722–733. https://doi.org/10.1111/jcal.12116
Mosher, R., & McGowan, B. (1985). Assessing student engagement in secondary schools: Alternative conceptions, strategies of assessing, and instruments (ERIC Document Reproduction Service No. ED 272812). University of Wisconsin, Research and Development Center. https://eric.ed.gov/?id=ED272812
Mundelsee, L., & Jurkowski, S. (2021). Think and pair before share: Effects of collaboration on students' in-class participation. Learning and Individual Differences, 88, 102015. https://doi.org/10.1016/j.lindif.2021.102015
Papadopoulos, P. M., Obwegeser, N., & Weinberger, A. (2021). Let me explain! The effects of writing and reading short justifications on students' performance, confidence and opinions in audience response systems. Journal of Computer Assisted Learning. https://doi.org/10.1111/jcal.12608

Patterson, M. C. (2017). A naturalistic investigation of media multitasking while studying and the effects on exam performance. Teaching of Psychology, 44(1), 51–57. https://doi.org/10.1177/0098628316677913
Pellas, N. (2014). The influence of computer self-efficacy, metacognitive self-regulation and self-esteem on student engagement in online learning programs: Evidence from the virtual world of Second Life. Computers in Human Behavior, 35, 157–170. https://doi.org/10.1016/j.chb.2014.02.048
Reeve, J., & Tseng, C.-M. (2011). Agency as a fourth aspect of students' engagement during learning activities. Contemporary Educational Psychology, 36(4), 257–267. https://doi.org/10.1016/j.cedpsych.2011.05.002
Renninger, K. A., & Hidi, S. E. (2015). The power of interest for motivation and engagement (1st ed.). Routledge. https://doi.org/10.4324/9781315771045
Reschly, A. L., & Christenson, S. L. (2012). Jingle, jangle, and conceptual haziness: Evolution and future directions of the engagement construct. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. 3–19). Springer US. https://doi.org/10.1007/978-1-4614-2018-7_1
Richardson, J. C., & Newby, T. (2006). The role of students' cognitive engagement in online learning. American Journal of Distance Education, 20(1), 23–37. https://doi.org/10.1207/s15389286ajde2001_3
da Rocha Seixas, L., Gomes, A. S., & de Melo Filho, I. J. (2016). Effectiveness of gamification in the engagement of students. Computers in Human Behavior, 58, 48–63. https://doi.org/10.1016/j.chb.2015.11.021
Roediger, H. L., III, Putnam, A. L., & Smith, M. A. (2011). Ten benefits of testing and their applications to educational practice. In J. P. Mestre & B. H. Ross (Eds.), Psychology of learning and motivation (Vol. 55, pp. 1–36). Elsevier. https://doi.org/10.1016/B978-0-12-387691-1.00001-6
Schell, J. A., & Butler, A. C. (2018). Insights from the science of learning can inform evidence-based implementation of Peer Instruction. Frontiers in Education, 3, 33. https://doi.org/10.3389/feduc.2018.00033
Schell, J., Lukoff, B., & Mazur, E. (2013). Catalyzing learner engagement using cutting-edge classroom response systems in higher education. In C. Wankel & P. Blessinger (Eds.), Cutting-edge technologies in higher education (Vol. 6, pp. 233–261). Emerald Group Publishing Limited. https://doi.org/10.1108/S2044-9968(2013)000006E011
Schindler, L. A., Burkholder, G. J., Morad, O. A., & Marsh, C. (2017). Computer-based technology and student engagement: A critical review of the literature. International Journal of Educational Technology in Higher Education, 14(1), 25. https://doi.org/10.1186/s41239-017-0063-0
Schreiner, L. A., & Louis, M. (2006, November). Measuring engaged learning in college students: Beyond the borders of NSSE. Paper presented at the annual meeting of the Association for the Study of Higher Education.
Schreiner, L. A., & Louis, M. C. (2011). The Engaged Learning Index: Implications for faculty development. Journal on Excellence in College Teaching, 22(1), 5–28.
Shulman, L. S. (2002). Making differences: A table of learning. Change: The Magazine of Higher Learning, 34(6), 36–44. https://doi.org/10.1080/00091380209605567
Siddiq, F., Gochyyev, P., & Valls, O. (2020). The role of engagement and academic behavioral skills on young students' academic performance – a validation across four countries. Studies in Educational Evaluation, 66, 100880. https://doi.org/10.1016/j.stueduc.2020.100880

Skinner, E., Furrer, C., Marchand, G., & Kindermann, T. (2008). Engagement and disaffection in the classroom: Part of a larger motivational dynamic? Journal of Educational Psychology, 100(4), 765–781. https://doi.org/10.1037/a0012840
Smallwood, J., & Schooler, J. W. (2006). The restless mind. Psychological Bulletin, 132(6), 946–958. https://doi.org/10.1037/0033-2909.132.6.946
Stowell, J. R., Oldham, T., & Bennett, D. (2010). Using student response systems (clickers) to combat conformity and shyness. Teaching of Psychology, 37(2), 135–140. https://doi.org/10.1080/00986281003626631
Tassone, A., Liu, J. J., Reed, M. J., & Vickers, K. (2020). Multitasking in the classroom: Testing an educational intervention as a method of reducing multitasking. Active Learning in Higher Education, 21(2), 128–141. https://doi.org/10.1177/1469787417740772
The Jamovi Project. (2020). Jamovi (Version 1.1.9) [Computer software]. https://www.jamovi.org
Vahedi, Z., Zannella, L., & Want, S. C. (2019). Students' use of information and communication technologies in the classroom: Uses, restriction, and integration. Active Learning in Higher Education. https://doi.org/10.1177/1469787419861926
Wammes, J. D., Ralph, B. C. W., Mills, C., Bosch, N., Duncan, T. L., & Smilek, D. (2019). Disengagement during lectures: Media multitasking and mind wandering in university classrooms. Computers and Education, 132, 76–89. https://doi.org/10.1016/j.compedu.2018.12.007
Wang, A. I., & Tahir, R. (2020). The effect of using Kahoot! for learning – a literature review. Computers and Education, 149, 103818. https://doi.org/10.1016/j.compedu.2020.103818
Wimpenny, K., & Savin-Baden, M. (2013). Alienation, agency and authenticity: A synthesis of the literature on student engagement. Teaching in Higher Education, 18(3), 311–326. https://doi.org/10.1080/13562517.2012.725223
Wood, R., & Shirazi, S. (2020). A systematic review of audience response systems for teaching and learning in higher education: The student experience. Computers and Education, 153, 103896. https://doi.org/10.1016/j.compedu.2020.103896
Xie, K., Vongkulluksn, V. W., Lu, L., & Cheng, S.-L. (2020). A person-centered approach to examining high-school students' motivation, engagement and academic performance. Contemporary Educational Psychology, 62, 101877. https://doi.org/10.1016/j.cedpsych.2020.101877
Zepke, N. (2014). Student engagement research in higher education: Questioning an academic orthodoxy. Teaching in Higher Education, 19(6), 697–708. https://doi.org/10.1080/13562517.2014.901956
Zepke, N., & Leach, L. (2010). Improving student engagement: Ten proposals for action. Active Learning in Higher Education, 11(3), 167–177. https://doi.org/10.1177/1469787410379680
