
Leveraging Student Engagement through MS Teams at an Open and Distance E-learning Institution

Chaka Chaka1*; Tlatso Nkhobo2 ; Mirriam Lephalala3

1,2,3Department of English Studies, College of Human Sciences, University of South Africa, Pretoria, South Africa.

Abstract

The current paper reports on a study conducted at the University of South Africa (UNISA) in 2021. The study involved three cohorts of undergraduate students (n = 20, n = 12 and n = 18), each of which participated in one of the virtual sessions offered on MS Teams as part of their modules’ virtual classes. Employing a case study research design, the study used the messages students exchanged on MS Teams in each session to determine how such messages served as indicators of student engagement. Four student engagement dimensions, namely emotional, behavioral, cognitive and academic engagement, were the focus of this study. Two of the findings of this study are: (a) only a few students dominated the messages posted during the three live virtual sessions; and (b) cognitive and emotional engagement were the two predominant dimensions of student engagement. The paper ends with implications and recommendations.

Keywords: Student engagement, MS Teams, Virtual sessions, Cohorts, ENG1515, Messages.

Contribution of this paper to the literature
This study contributes to the existing literature on online student engagement in the higher education sector, with special reference to the open and distance e-learning (ODeL) arena. It does so by re-theorizing the notion of student engagement to accommodate students participating in a collaborative communications tool such as MS Teams, particularly when it is utilized for teaching and learning purposes in lieu of a learning management system such as myUnisa. The findings indicated that only a few students dominated the online engagement process, and that of the four student engagement dimensions investigated, the cognitive and emotional engagement dimensions were more dominant than the academic and behavioral engagement dimensions.

1. Introduction

Against this background, this study sought to investigate how features of MS Teams such as posts, replies and reactions served as indicators of student engagement, and what could be learnt from using such features for student engagement purposes.

2. Contextualizing Issues

2.1. MS Teams

Originally intended for business collaboration purposes, MS Teams is a software program that functions as an online hub for collaboration and communication. It is part of Microsoft’s Office 365 suite of integrated tools, even though it is available as a standalone app for different devices such as desktop and laptop computers and smart phones. It comprises channels that offer the following benefits:

MS Teams also has tools such as OneNote Class Notebook, Edu Class Note, and Edu Staff Note integrated into it (Microsoft, 2022). Additionally, it has inbuilt features like channels and tabs and functions such as mentions and threads. Some of the purposes served by these features and functions are as characterized below:

Looking at the above-mentioned tools and capabilities, it is plausible to argue that MS Teams is a device-neutral digital collaborative communications hub with an inbuilt suite of cognate tools that complement one another to varying degrees. This, in a way, makes it an all-in-one collaboration platform that lends itself well to educational purposes.

2.2. Open and Distance e-Learning (ODeL)

In its traditional sense, open and distance e-learning (ODeL) is a learning environment in which students enrol for their courses and register for their qualifications through the open and distance education system, leveraging e-learning as the primary means of teaching and learning. Its bedrock delivery platform is a learning management system (LMS). In this ODeL environment, students receive technical, institutional, administrative, academic, cognitive and emotional support from a distance (Maboe, 2019). The idea of open in this case refers to the fact that distance e-learning is open to students by affording them access and offering them flexibility in learning and in enrolling for their degree courses. In this context, openness has more to do with providing students ample opportunities to learn and study remotely without the constraints of space and time. For its part, flexibility refers to the several options students have regarding learning, instructional approaches, access modalities, time and space (Bozkurt, 2019; Caliskan, 2012; Lee, 2021; Mejía-Madrid, Llorens-Largo, & Molina-Carmona, 2020). However, it is worth noting that Lee (2021) contends that, in certain instances, the notion of openness is more rhetorical than it is made out to be.

In its overall configuration, ODeL has elements of technology-enhanced learning, online learning and flexible learning built into both its openness and its distance-learning orientation. With the outbreak of the COVID-19 pandemic in 2020, the ODeL model in higher education has seen a rapid pivot to other types of technologies that enhance and augment legacy LMSs and traditional e-learning. Some of these technologies include MS Teams and Zoom, which are digital collaborative communication tools. One of the key drivers of the uptake of these digital collaborative communication tools is their videoconferencing platforms or capabilities (Chaka, 2020a; Ngoc & Phung, 2021; Sobaih, Salem, Hasanein, & Elnasr, 2021). These digital collaborative communication tools, in particular, have added value to the online technology mix used by most ODeL higher education institutions such as UNISA, which has embraced MS Teams as a teaching and learning tool institution-wide. It is within this configuration, and against the backdrop outlined in the preceding paragraph, that, this paper contends, UNISA as an ODeL institution finds itself. This also applies to the Department of English Studies together with the modules it offers, such as the two whose registered students are under investigation in this paper.

3. Theorizing Student Engagement to Make it Resonate with Collaborative Communication Tools

Student engagement, also known as learner engagement (Bodily, Graham, & Bush, 2017; Viberg, Khalil, & Bergman, 2021), as it relates to both online and blended learning, has been extensively studied in the higher education (HE) sector (Bodily et al., 2017; Boulton, Kent, & Williams, 2018; Chaka & Nkhobo, 2019; Chen, Chang, Ouyang, & Zhou, 2018; Disho, Nchindo, & Dortea, 2022; Henrie, Bodily, Larsen, & Graham, 2018; Hussain, Zhu, Zhang, & Abidi, 2018; Rugube & Govender, 2022; Vogt, 2016). Even though there appears to be no universal consensus regarding what student engagement is (Chaka & Nkhobo, 2019), the idea that it comprises aspects such as the quality of time, investment and interest students have in given learning tasks, and students’ increasing persistence, satisfaction and performance in those tasks (Robinson, 2012; Trowler & Trowler, 2010), highlights its multidimensional nature.

In some instances, it also includes student voice and student participation (Robinson, 2012) in the learning task and in the mode and medium through which this task is presented. As a construct, student engagement has multiple framings (Chaka & Nkhobo, 2019). These include a two-dimensional framing (Finn, 1989; Reschly & Christenson, 2012), a three-dimensional framing (Hu & Li, 2017; Reschly & Christenson, 2012; Yazzie-Mintz, 2007) and a four-dimensional framing (Chaka & Nkhobo, 2019; Reschly & Christenson, 2012; Trowler & Trowler, 2010; Viberg et al., 2021). In its four-dimensional framing, it encapsulates four engagement variables: emotional, behavioral, cognitive and academic engagement (Chaka & Nkhobo, 2019; Reschly & Christenson, 2012; Robinson, 2012). This progressive conceptualization of student engagement emphasizes how the construct has evolved over time.

While the progressive theorizing of student engagement provided above is a positive development, there is, nonetheless, a need to theorize student engagement not only in a manner that reflects its evolutionary nature, but also in a way that encapsulates the different contexts and the various technologies through which it is used. This relates, especially, to theorizing it with respect to both digital technologies and online collaborative communication tools such as MS Teams, which are currently used for teaching and learning by ODeL higher education institutions.  In relation to this paper and in view of the new normal brought about by the COVID-19 pandemic to the teaching and learning ecosystem, student engagement needs to be conceptualized in line with the manner in which some of the repurposed online collaborative communication tools are applied to the emerging teaching and learning value chain. As pointed out above, one of these tools is MS Teams.

In this context, and in response to teaching and learning demands under the COVID-19 pandemic, this paper proposes that student engagement be further theorized to include the following variables: downloads, posts, mentions, reads and threads. These variables can be used, alongside the conventional student engagement dimensions mentioned earlier, to measure or to map student engagement and digital student presence. The latter, also known as virtual or online student presence, is, in this case, part of presence teaching and presence learning as theorized mainly by Chaka (2015, 2019). Presence teaching is technology-mediated teaching that facilitates an instructor’s digital presence, while presence learning is technology-mediated learning that enables a student’s digital presence. In the case of MS Teams, this is MS Teams-mediated presence teaching and presence learning. The five proposed student engagement variables should be seen as deictic in nature. Deictic student engagement variables, much like deictic literacy (Forzani & Leu, 2017), foreground and emphasize the context-embeddedness and the MS Teams-mediation of student engagement as a construct.
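To illustrate how the five proposed deictic variables could be operationalized alongside the conventional engagement dimensions, the following Python sketch tallies such variables per student from a generic activity log. It is a minimal illustration only: the class, field names and event labels are hypothetical and do not correspond to any MS Teams data export.

from collections import Counter
from dataclasses import dataclass, field

# Hypothetical per-student tally of the five proposed deictic engagement
# variables (downloads, posts, mentions, reads and threads). The class and
# event labels are illustrative and do not mirror any MS Teams export format.
@dataclass
class DeicticEngagement:
    student_id: str
    counts: Counter = field(default_factory=Counter)

    def record(self, event_type: str) -> None:
        # event_type is expected to be one of:
        # "download", "post", "mention", "read", "thread"
        self.counts[event_type] += 1

# Made-up activity events for one student.
events = ["post", "read", "read", "mention", "thread", "download", "post"]
student = DeicticEngagement(student_id="S01")
for event in events:
    student.record(event)

print(dict(student.counts))
# {'post': 2, 'read': 2, 'mention': 1, 'thread': 1, 'download': 1}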

In the light of the points highlighted above, the current paper focuses on the built-in user communication features, such as posts, replies and reactions, which are available in MS Teams, and presents them as variables that serve as indicators of student engagement. The paper espouses a deictic view of student engagement, which it regards as flexible and dynamic. It also argues that, when conceptualized deictically, student engagement varies with, and is determined, enhanced and constrained by, a given teaching and learning ecosystem within which it applies, together with the attendant technologies and contexts of that ecosystem. For instance, in certain learning contexts mediated by social media technologies (also known as over-the-top technologies) like WhatsApp, Facebook and Twitter, student engagement might involve further variables such as likes, dislikes, favourites and memes or gifs. This is the evolving nature of student engagement, as necessitated by the changing teaching and learning landscape in the midst of the COVID-19 pandemic, which this paper proposes. This particular evolutionary and deictic theorization of student engagement is relevant and appropriate for ODeL institutions such as UNISA and for the three cohorts of students enrolled for the two undergraduate modules investigated in this paper.

Based on the points highlighted in the preceding paragraphs, this study had the following research questions:

(a) In what ways do features such as posts, replies and reactions as offered by MS Teams serve as indicators of student engagement?
(b) What can be learnt from the use of such features in terms of student engagement?

4. Literature Review

Even before the COVID-19 pandemic, numerous studies had been conducted on student engagement as it related to online learning environments in the HE sector. Some of these studies include those conducted by Boulton et al. (2018), Henrie et al. (2018) and Hussain et al. (2018). Others were conducted within blended learning contexts, of which the studies by Almoslamani (2018), Conijn, Snijders, Kleingeld, and Matzat (2016), Sahni (2019) and Williams and Whiting (2016) are examples. In these two sets of student engagement studies, the common denominator in all but one was the use of a learning management system (LMS) to gauge student engagement (see also Chaka and Nkhobo (2019)). During the current period of the COVID-19 pandemic, studies have already been conducted on student engagement in the HE sector. Examples of these studies are those by Chen, Kaczmarek, and Ohyama (2020); Dahleez et al. (2021); Hewson and Chung (2021) and Reguera and Lopez (2021). For instance, the study by Chen et al. (2020) investigated student engagement together with the learning outcomes and students’ perceptions of an online course that incorporated quizzes, tasks and tests as formative assessment. It involved 60 first-year college students, who received their activities virtually on both iCourse (a nationwide learning platform) and MuClass (an interaction tool affiliated with iCourse), and through in-person meetings. The study concluded that, based on their learning scores and records, students managed to engage actively in all the learning activities and gained high scores in all quizzes, tasks and tests. However, it did not single out or focus on any of the student engagement dimensions in its investigation.

For its part, the study by Dahleez et al. (2021) explored the effects of both LMS (Moodle) usability and teacher behavior on the agentic, behavioral, emotional and cognitive dimensions of student engagement. The participants were 418 students enrolled for different specializations at five private tertiary institutions in Oman. With data collected through a self-administered questionnaire, the study found that e-learning (Moodle) usability impacted agentic, behavioral and cognitive engagement significantly and positively. Nonetheless, the impact on emotional engagement was not significant. The study also showed that teacher behavior influenced the link between e-learning system usability and the four student engagement dimensions.

In this context, the study by Hewson and Chung (2021) examined the way in which MS Teams was piloted at Leeds Beckett University in 2017 with a view to replacing the institutional LMS platforms then used by the Distance Learning Unit. The MS Teams pilot project was also intended to enhance distance learning students’ engagement. In one module, 16% more students posted responses on MS Teams than in the previous year. Almost two-thirds of them said they were more likely to participate in MS Teams discussions than in institutional LMS platform discussions. In another module, academic staff realized that MS Teams was capable of reducing transactional distance through asynchronous conversations that resembled synchronous conversations.

In another context, the study by Yildiz (2021), which employed MS Teams and was carried out at Kafkas University’s Kazım Karabekir Vocational School of Technical Sciences, Turkey, set out to determine the use of MS Teams and to establish students’ views about its use during the COVID-19 pandemic. The study had 15 students as its participants and collected data through an online questionnaire. Some of its results were as follows: most of the students (86%) believed that MS Teams offered them two-way learning interaction; 76% of the students indicated that MS Teams increased their motivation to learn; and 58% of the students were of the view that MS Teams was effective as a distance learning tool.

Lastly, the study by Reguera and Lopez (2021) investigated the impact of introducing and using a digital whiteboard (Zoom) on student engagement. Its participants were 39 fourth-semester medical students enrolled for an immunology course at a university in Mexico. The student engagement factors assessed were emotion and behavior; interaction with peers and faculty; and structure and educational environment. One of the findings of this study was that students perceived the digital whiteboard to have contributed to their class engagement.

5. Research Methodology

Studies investigating online student module engagement often utilize quantitative surveys such as questionnaires and self-reports, and quantitative metrics like module login data and performance scores, to measure student engagement (Chaka & Nkhobo, 2019; Dixson, 2015; Henrie et al., 2018; Vogt, 2016), while others employ mixed-methods approaches (Hu & Li, 2017). The current study adopted a case study research design, which entails examining real-world phenomena and contexts within a single case or within multiple cases (Chaka, Nkhobo, & Lephalala, 2020; Harrison, Birks, Franklin, & Mills, 2017; Yin, 2014).

5.1. Participants and Sampling Techniques

Participants in this study consisted of 50 undergraduate students enrolled for two first-level English Studies modules, ENG1515 and ENG1503, at UNISA. The first module is a year-long module that forms part of a Bachelor of Education (B.Ed.) Foundation Phase degree program. The second module is a two-semester module whose full name is English for Academic Purposes. Thirty-two of these participants belonged to ENG1515; of these, eighteen were females and fourteen were males, and their average age, as reflected in their module enrollment information, was 35 (SD = 2.5). These participants belonged to two cohorts. The first cohort comprised twenty students who attended a virtual class session, presented through MS Teams on 05 July 2021, intended to prepare them for Assignment 03. The second cohort was made up of twelve students who attended an MS Teams feedback session for Assignment 02; this virtual class session took place on 02 August 2021. The two cohorts comprised different sets of ENG1515 students, and each virtual class session was scheduled for one hour.

The third cohort consisted of eighteen participants from ENG1503. The participants were part of an Early Completion Program (ECP), and had their one-hour virtual class session on 04 June 2021. This session was intended to prepare students for Assignment 02. The average age of the participants was 33 (SD = 2.8); twelve participants were females and six were males.

Volunteer sampling was used to select the fifty participants. This type of sampling technique involves requesting individuals to volunteer to participate in a study through invitations or announcements (Chaka et al., 2020; Omair, 2015; Sharma, 2017). For the present study, an announcement requesting permission to use participants’ MS Teams data for research purposes was posted on 26 August 2021 on the sites of the two modules, which are hosted on myUnisa (UNISA’s legacy learning management system). In addition, this study is part of a Department of English Studies research project that has been granted two ethical clearance certificates by UNISA’s College of Human Sciences Research Ethics Review Committee and by UNISA’s Research Permission Sub-Committee (RPSC) of the Senate Research, Innovation, Postgraduate Degrees and Commercialization Committee (SRIPDCC). The certificate numbers are 90258495_CREC_CHS_2021 and 2021_RPSC_050, respectively.

5.2. Data Collection

The data for this study were collected from MS Teams. They consisted exclusively of the student participation trails related to the three MS Teams class sessions attended by the three cohorts of students, as specified above. As such, the data came in three sets, one from each of the three virtual class sessions. Each data set comprised the messages (posts, replies and reactions) that a cohort of students posted on MS Teams’ chat facility during its virtual class session on one of the three days on which these sessions took place: 04 June 2021, 05 July 2021 and 02 August 2021. All these messages were recorded in MS Teams’ General channel, which functions as a log of participants’ conversations in MS Teams, as shown in Figure 1.


Figure 1. Screenshot of MS Teams’ General channel or tab.
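Although the message trails in this study were read directly from the General channel shown in Figure 1, such trails could, in principle, also be retrieved programmatically. The sketch below is a hedged illustration using Microsoft Graph’s channel-message endpoints; the team ID, channel ID and access token are placeholders, and an application would need appropriate permissions (for example, ChannelMessage.Read.All) for such calls to succeed. It is not the procedure used in this study.

import requests

# Illustrative sketch only: TEAM_ID, CHANNEL_ID and ACCESS_TOKEN are
# placeholders and must be supplied by a suitably permissioned application.
TEAM_ID = "<team-id>"
CHANNEL_ID = "<general-channel-id>"
ACCESS_TOKEN = "<token-obtained-beforehand>"

BASE = "https://graph.microsoft.com/v1.0"
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Top-level messages (posts) in the channel.
posts = requests.get(
    f"{BASE}/teams/{TEAM_ID}/channels/{CHANNEL_ID}/messages",
    headers=HEADERS,
).json().get("value", [])

for post in posts:
    # Replies to each post are listed per message.
    replies = requests.get(
        f"{BASE}/teams/{TEAM_ID}/channels/{CHANNEL_ID}/messages/{post['id']}/replies",
        headers=HEADERS,
    ).json().get("value", [])
    # Each message object carries a 'reactions' collection.
    reactions = post.get("reactions", [])
    print(post["id"], "replies:", len(replies), "reactions:", len(reactions))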

5.3. Data Analysis

After the three data sets had been collected from MS Teams, they were analyzed through thematic analysis (Castleberry & Nolen, 2018; Vaismoradi, Jones, Turunen, & Snelgrove, 2016). Two coding schemes were created for each data set in three MS Word files. The first coding scheme consisted of five columns labeled participants, channel, posts, replies and reactions, respectively. These five labels corresponded to those used in MS Teams to represent the participants taking part in a virtual session, the types of messages they wrote during a session and the channel (the General tab) under which they posted the messages. In this case, the labels constituted the themes and the units of analysis for the first coding scheme in each data set.
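A minimal sketch of how the first coding scheme could be laid out and tallied is given below, assuming pandas is used; the participant labels and counts are invented purely for illustration and are not the study’s data.

import pandas as pd

# Sketch of the first coding scheme: one row per participant, the channel under
# which messages were posted, and counts of each message type. The labels and
# counts are invented for illustration; they are not the study's data.
scheme_one = pd.DataFrame(
    {
        "participants": ["P01", "P02", "P03"],
        "channel": ["General", "General", "General"],
        "posts": [2, 1, 0],
        "replies": [5, 3, 1],
        "reactions": [1, 0, 2],
    }
)

# Total messages per participant and for the session as a whole.
scheme_one["total"] = scheme_one[["posts", "replies", "reactions"]].sum(axis=1)
print(scheme_one)
print("Session total:", scheme_one["total"].sum())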

The second coding scheme for the three data sets comprised six columns labeled participants, emotional dimension, behavioral dimension, cognitive dimension, academic dimension and user platform commentary, respectively. These labels were the themes that represented student engagement and constituted the units of analysis in each data set. In each case, two researchers coded and rated the data sets, and disagreements were resolved through consensus. The two researchers’ coding reliability was .90, which, according to Cohen’s κ value weightings, is excellent (Chaka, 2020b; Cohen, 1960).
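For reference, Cohen’s (1960) kappa expresses inter-coder agreement as the observed agreement corrected for the agreement expected by chance:

\kappa = \frac{p_o - p_e}{1 - p_e}

where p_o is the proportion of units on which the two coders agreed and p_e is the proportion of agreement expected by chance. The value of .90 reported above falls within the range conventionally interpreted as excellent agreement.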

6. Findings

The findings of this study are presented per cohort; the corresponding discussion, likewise organized per cohort, follows in the next section.

6.1. First Cohort

Figure 2 displays the messages that the 20 students posted as participants during the first MS Teams session, the assignment preparation session described above.

Figure 2. Participants’ message types as posted on MS Teams’ General channel.

Participants’ message types comprised posts, replies and reactions. In all, there were 9 posts (original messages), 53 replies and 15 reactions, bringing the total number of messages posted during this session to 77. In relation to the posts, only four of the 20 students posted original messages: one student posted 4 posts, two students posted 2 posts each, and one student posted 1 post. Except for two students, all the other students posted replies to the 9 original posts. One student had the most replies (n = 21) and, incidentally, was the same student who had the highest number of original posts (n = 4); this student posted 29 messages in all. The student with the second highest number of replies (n = 8) was one of the two students who posted 2 original messages; overall, this student posted 14 messages. The rest of the replies were distributed as follows: 3 replies each (n = 2 students), 2 replies each (n = 4 students) and 1 reply each (n = 10 students). In terms of reactions, only six students posted them: three posted single reactions, while the remaining three posted 2 reactions each. Of the latter, one was the student with the highest number of posts and replies, while another was the student with the second highest number of replies.
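For clarity, the totals reported above aggregate as follows:

9_{\text{posts}} + 53_{\text{replies}} + 15_{\text{reactions}} = 77, \qquad 53 = 21 + 8 + (2 \times 3) + (4 \times 2) + (10 \times 1).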

Figure 3. Student engagement dimensions as determined by message types on MS Teams.

Figure 3 illustrates the types of student engagement dimensions as reflected by, and as worked out from, participants’ overall messages posted on MS Teams during the first virtual session. Of the 77 messages, the highest number (n = 25; 32%) related to the cognitive engagement dimension, and the second highest number (n = 20; 26%) related to the emotional engagement dimension. The two lowest numbers of messages (n = 14; 18% and n = 5; 5%) related to the academic and behavioral engagement dimensions, respectively. The remaining messages (n = 14; 18%) concerned platform usage (user platform commentary). Only one student had messages covering all four student engagement dimensions: the student with the second highest number of messages (n = 14). This student was followed by two students whose messages covered three of the four dimensions, one of whom was the student with the most messages (n = 29). The remaining students’ messages related to one or two dimensions.

6.2. Second Cohort

Figure 4 displays participants’ overall messages, which consisted of three categories: posts (n = 1), replies (n = 42) and reactions (n = 8). These message types correspond to how user messages are classified on MS Teams’ General tab. On the whole, there was one post (original message), which generated 42 replies and 8 reactions. One student posted the most replies (n = 11), followed by one student with 6 replies and two students who posted 4 replies each.

Figure 4. Participants’ message types as posted on MS Teams’ General channel.

The rest of the replies were distributed as follows: 3 replies each (n = 3 students), 2 replies each (n = 3 students) and 1 reply each (n = 2 students). The 8 reactions came from five students, while all twelve students posted one or more replies. The student who posted the most replies also had the highest number of messages overall in this session. This student was followed by three students who posted 8, 6 and 5 messages, respectively, then by two students with 4 messages each, and by three students with 3 messages each. The remaining messages comprised 2 messages by one student and 1 message each by two students.

Figure 4 shows that the twelve students generated 51 messages from just one original post. In addition, the figure displays a pattern in which three students dominated the virtual interaction, as their collective messages constituted 49%, almost half, of the messages posted. However, of these three students, one dominated the messages posted, making this session, as with the first session, a three-person show skewed heavily towards a one-person show.

Figure 5. Student engagement dimensions as determined by message types on MS Teams.

Figure 5 displays the types of student engagement dimensions, as determined by the 51 messages posted on MS Teams during the second virtual session. Here, two dimensions, cognitive engagement and emotional engagement, attracted the most messages: 22 (43%) and 20 (39%), respectively. Academic and behavioral engagement drew fewer messages. The student with the highest number of messages, as described in relation to Figure 4, had the highest number of posts for the cognitive engagement dimension, which also attracted the highest number of messages; only two messages were posted for platform usage purposes.
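Assuming, as the figures suggest, that these percentages are calculated over the 51 messages posted in this session, they work out as:

\frac{22}{51} \approx 43\% \ \text{(cognitive engagement)}, \qquad \frac{20}{51} \approx 39\% \ \text{(emotional engagement)}.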

6.3. Third Cohort

The message types of the 18 participants in this cohort are displayed in Figure 6. Overall, there were 11 original messages, from which 38 replies and 3 reactions emerged. These original posts came from 8 students, with three students posting 2 posts each. Of the 38 replies, 8 came from one student who had not posted any original message, and 4 were posted by another student. By contrast, six students posted 3 replies each, and two students posted 2 replies each. The remaining four students posted 1 reply each, while the other students did not post any replies. There were 3 reactions from two students. In all, 52 messages were posted during this virtual class session.

Figure 6. Participants’ message types as posted on MS Teams’ General channel.

As shown in Figure 7, this cohort posted 34 messages relating to either the student engagement dimensions or user platform commentary on MS Teams during this virtual session. Of these 34 messages, 10 (29%) were directly related to the student engagement dimensions, while the rest (n = 24; 71%) had to do with user platform commentary. Within the student engagement dimensions, 4 messages (12%) were on the academic engagement dimension, followed by the cognitive engagement dimension (n = 3; 9%), the behavioral engagement dimension (n = 2; 6%) and the emotional engagement dimension (n = 1; 3%), respectively.

Figure 7. Student engagement dimensions as determined by message types on MS Teams.

In relation to the points raised in the preceding paragraphs, only 8 students (44%) posted original messages, meaning that the original messages were concentrated among fewer than half of the cohort. These messages attracted 38 replies from fourteen students and 3 reactions from two students. The highest number of replies (n = 8; 21%) came from only one student, followed by 4 replies (10%) from another student.

7. Discussion

7.1. First Cohort Student Engagement

Based on the findings presented above, there were few original posts (original messages), made by an equally small number of students (n = 4) among the 20 students. Even though these posts attracted 53 replies and 15 reactions, the fact remains that they were initiated by only a few students. Additionally, even the 53 replies were dominated by the two students who had the two highest numbers of messages. Of these two, one had double the number of messages posted by the other, making her the most dominant student overall in this virtual session. The same two students were among the four that dominated the reactions. Given this, it is plausible to suggest that the messages posted in this particular virtual session were, on the one hand, a two-person show and, on the other, a one-person show. This is especially so when considering that the messages produced by these two students constituted 56% of the overall messages for this session.

With respect to the student engagement dimensions, the cognitive and emotional engagement dimensions attracted the highest numbers of posts, with the cognitive engagement dimension emerging as the predominant dimension. This means that posts, replies and reactions serve, in this case, as instances of online presence learning (see Chaka, 2015, 2019) for students who use them. Elsewhere, a study by Caton et al. (2021) that compared the engagement of preclinical students, through question-asking behaviors, in an online videoconference environment versus an in-person environment found that students posed more questions in videoconference sessions than they did in in-person sessions. This emphasizes the fact that virtual videoconference platforms, of which MS Teams is but one example, can facilitate student engagement. In another instance, a systematic evidence map study by Bond, Buntins, Bedenlier, Zawacki-Richter, and Kerres (2020) on student engagement and educational technology in higher education, covering academic disciplines such as education, arts and humanities, and natural sciences, mathematics and statistics, found that behavioral engagement was the most frequently cited dimension, followed by emotional and cognitive engagement. All of this was in relation to blended learning and online text-based tools (e.g., online discussion forums) in which undergraduate students were the primary participants. In the current study, though, cognitive engagement was the dominant engagement dimension for the first cohort, followed by emotional engagement.

7.2. Second Cohort Student Engagement

For this cohort, the pattern that emerged was one in which three students dominated the online interaction, with one student being the most dominant overall. Here, too, cognitive engagement and emotional engagement, realizable through students’ posts, replies and reactions as instances of online presence learning (see Chaka, 2015, 2019), were the two predominant engagement dimensions, with the former being the more dominant of the two. Academic and behavioral engagement were the least dominant. In this regard, as pointed out earlier, a study by Dahleez et al. (2021) that investigated the effects of Moodle usability and teacher behavior on student engagement found that both variables impacted students’ agentic, behavioral, emotional and cognitive engagement positively. However, in the present study, as was the case with the first cohort, the second cohort’s student engagement differed from the findings of Bond et al. (2020) in that cognitive engagement, and not behavioral engagement, was the dominant engagement dimension, followed by emotional engagement as the second most dominant dimension.

7.3. Third Cohort Student Engagement

As mentioned earlier, the student engagement pattern in this cohort was a two-person show. As regards the student engagement dimensions, the academic engagement dimension was the most dominant, with the emotional engagement dimension the least dominant. Overall, the four student engagement dimensions, encoded through students’ posts, replies and reactions as instances of online presence learning (see Chaka, 2015, 2019), attracted lower student participation than platform usage as a variable. This means that student messages in this MS Teams session were more about platform usage than about the four student engagement dimensions put together. In this cohort, unlike in the first two cohorts, academic engagement trumped the other three student engagement dimensions. Moreover, this finding differs from that of Bond et al. (2020), in whose study behavioral engagement was the most frequently cited dimension.

8. Implications, Conclusions and Limitations

This study has attempted to utilize posts, replies and reactions, all of which are messages on MS Teams, as indicators of student engagement. Overall, in this study, as in traditional in-person classes, only a few students dominated the posts, replies and reactions. This resulted in student interactions in the three virtual sessions being dominated by one or two students. This one- or two-person dominance meant that the four student engagement dimensions, especially in the first two virtual sessions, were also dominated by the same one or two students’ posts. For instance, the cognitive engagement dimension was dominated by a single student in each of the first two virtual sessions, and these two students individually dominated the emotional engagement dimension in their respective virtual sessions. In relation to the third virtual session, all four student engagement dimensions attracted very few messages. This implies that students did not leverage the benefits MS Teams offers for tapping into student engagement. Another implication of the current study is that instructors should always encourage all students to participate in whatever way possible on virtual platforms such as MS Teams. This is all the more critical during the current COVID-19 pandemic, when most classes have pivoted to virtual or online instruction at most ODeL institutions and most in-person HEIs. A further implication of this study is that variables such as posts, replies and reactions can be employed as indicators or proxies of student engagement on MS Teams, or as indicators to gauge student engagement on virtual platforms similar to MS Teams.

The present study was a case study involving three cases comprising three cohorts of undergraduate students. The cohorts were small, which means the sample size used in each case was limited. As a result, the findings of this study are contextual and non-generalizable; nonetheless, they have contextual transferability and applicability (Chaka & Nkhobo, 2019; Chaka et al., 2020). Future studies that intend to investigate student engagement in an online collaborative communications tool such as MS Teams therefore need to consider using larger samples. One other limitation is that not all posts or messages may translate into student engagement; caution should thus be exercised when employing posts or messages as indicators of student engagement. However, the need to tap into variables such as posts, replies, reactions, mentions, likes and favorites, as some of the benefits offered by new and emerging digital tools for teaching and learning, cannot be over-emphasized.

References

Almoslamani, Y. (2018). Effectiveness of student engagement using learning management system in the blended learning environment at Saudi Electronic University. Retrieved from: https://digscholarship.unco.edu/cgi/viewcontent.cgi?article=1478&context=dissertations.

Bodily, R., Graham, C. R., & Bush, M. D. (2017). Online learner engagement: Opportunities and challenges with using data analytics. Educational Technology, 57(1), 10-18.

Bond, M., Buntins, K., Bedenlier, S., Zawacki-Richter, O., & Kerres, M. (2020). Mapping research in student engagement and educational technology in higher education: A systematic evidence map. International Journal of Educational Technology in Higher Education, 17(1), 1-30.Available at: https://doi.org/10.1186/s41239-019-0176-8.

Boulton, C. A., Kent, C., & Williams, H. T. (2018). Virtual learning environment engagement and learning outcomes at a ‘bricks-and-mortar’ university. Computers & Education, 126, 129-142.Available at: https://doi.org/10.1016/j.compedu.2018.06.031.

Bozkurt, A. (2019). From distance education to open and distance learning: A holistic evaluation of history, definitions, and theories. In S. Sisman-Ugur, & G. Kurubacak (Eds.), Handbook of research on learning in the age of transhumanism (pp. 252–273). Hershey, PA: IGI Global.

Caliskan, H. (2012). Open learning. In N. M. Seel (Eds.), Encyclopedia of the sciences of learning. Boston, MA: Springer.

Castleberry, A., & Nolen, A. (2018). Thematic analysis of qualitative research data: Is it as easy as it sounds? Curriculum in Pharmacy Teaching and Learning, 10, 807–815.Available at: https://doi.org/10.1016/j.cptl.2018.03.019.

Caton, J. B., Chung, S., Adeniji, N., Hom, J., Brar, K., Gallant, A., . . . Hosamani, P. (2021). Student engagement in the online classroom: comparing preclinical medical student question-asking behaviors in a videoconference versus in‐person learning environment. FASEB BioAdvances, 3(2), 110-117.Available at: https://doi.org/10.1096/fba.2020-00089.

Chaka, C. (2015). Digital identity, social presence technologies, and presence learning. In R. D. Wright (Ed.), Student-teacher interaction in online learning environments (pp. 183-203). Hershey, PA: IGI Global.

Chaka, C. (2019). Re-imagining literacies and literacies pedagogy in the context of semio-technologies. Nordic Journal of Digital Literacy, 14(1-02), 54-69.Available at: https://doi.org/10.18261/issn.1891-943x-2019-01-02-05.

Chaka, C., & Nkhobo, T. (2019). Online module login data as a proxy measure of student engagement: The case of myUnisa, MoyaMA, Flipgrid, and Gephi at an ODeL institution in South Africa. International Journal of Educational Technology in Higher Education, 16(1), 1-22.Available at: https://doi.org/10.1186/s41239-019-0167-9.

Chaka, C., Nkhobo, T., & Lephalala, M. (2020). Leveraging MoyaMA, WhatsApp and online discussion forum to support students at an open and distance e-learning university. Electronic Journal of E-learning, 18(6), 494-515.Available at: https://doi.org/10.34190/jel.18.6.003.

Chaka, C. (2020a). Higher education institutions and the use of online instruction and online tools and resources during the COVID-19 outbreak - An online review of selected U.S. and SA's universities. Research Square, 1-46.Available at: https://doi.org/10.21203/rs.3.rs-61482/v1.

Chaka, C. (2020b). Online polylogues and the speech acts of online discussion forums. Journal of Educators Online, 17(2), n2.

Chen, B., Chang, Y.-H., Ouyang, F., & Zhou, W. (2018). Fostering student engagement in online discussion through social learning analytics. The Internet and Higher Education, 37, 21-30.Available at: https://doi.org/10.1016/j.iheduc.2017.12.002.

Chen, E., Kaczmarek, K., & Ohyama, H. (2020). Student perceptions of distance learning strategies during COVID-19. Journal of Dental Education, 1–2.Available at: https://doi.org/10.1002/jdd.12339.

Cohen, J. (1960). A coefficient of agreement for nominal scales. Educational and Psychological Measurement, 20(1), 37-46.Available at: https://doi.org/10.1177/001316446002000104.

Conijn, R., Snijders, C., Kleingeld, A., & Matzat, U. (2016). Predicting student performance from LMS data: A comparison of 17 blended courses using Moodle LMS. IEEE Transactions on Learning Technologies, 10(1), 17-29.Available at: https://doi.org/10.1109/tlt.2016.2616312.

D’Orville, H. (2020). COVID-19 causes unprecedented educational disruption: Is there a road towards a new normal? Prospects, 49, 11–15.Available at: https://doi.org/10.1007/s11125-020-09475-0.

Dahleez, E.-S., Abed, K., Ayman, A., Al, A. M., Alawi, A., & Abdelmuniem, F. (2021). Higher education student engagement in times of pandemic: The role of e-learning system usability and teacher behavior. The International Journal of Educational Management, 35(6), 1312-1329.Available at: https://doi.org/10.1108/ijem-04-2021-0120.

Disho, M. R., Nchindo, M., & Dortea, S. (2022). Profiling students for online teaching and learning environments: A case for the University of Namibia, Rundu Campus. American Journal of Education and Learning, 7(1), 24–34.Available at: https://doi.org/10.55284/ajel.v7i1.625.

Dixson, M. D. (2015). Measuring student engagement in the online course: The Online Student Engagement scale (OSE). Online Learning, 19(4), n4.Available at: https://doi.org/10.24059/olj.v19i4.561.

Finn, J. D. (1989). Withdrawing from school. Review of Educational Research, 59, 117–142.Available at: https://doi.org/10.3102/00346543059002117.

Forzani, E., & Leu, D. J. (2017). Multiple perspectives on literacy as it continuously changes: Reflections on opportunities and challenges when literacy is deictic. Journal of Education, 197(2), 19-24.Available at: https://doi.org/10.1177/002205741719700203.

Harrison, H., Birks, M., Franklin, R., & Mills, J. (2017). Case study research: Foundations and methodological orientations. Forum: Qualitative Social Research, 18(1), 19.Available at: https://doi.org/10.17169/fqs-18.1.2655.

Henrie, C. R., Bodily, R., Larsen, R., & Graham, C. R. (2018). Exploring the potential of LMS log data as a proxy measure of student engagement. Journal of Computing in Higher Education, 30(2), 344-362.Available at: https://doi.org/10.1007/s12528-017-9161-1.

Hewson, E., & Chung, G. W. (2021). Beyond the VLE: Transforming online discussion and collaboration through Microsoft Teams. International Journal of Management Science and Business Administration, 7(3), 37–45.Available at: http://dx.doi.org/10.18775/ijmsba.1849-5664-5419.2014.73.1004.

Hu, M., & Li, H. (2017). Student engagement in online learning: A review. Paper presented at the 2017 International Symposium on Educational Technology (ISET), IEEE.

Hussain, M., Zhu, W., Zhang, W., & Abidi, S. M. R. (2018). Student engagement predictions in an e-learning system and their impact on student course assessment scores. Computational Intelligence and Neuroscience, 2018, 634718621.Available at: https://doi.org/10.1155/2018/6347186.

Lee, K. (2021). Openness and innovation in online higher education: A historical review of the two discourses. Open Learning: The Journal of Open, Distance and e-Learning, 36(2), 112-132.Available at: https://doi.org/10.1080/02680513.2020.1713737.

Maboe, K. A. (2019). Students’ support in an ODeL context: students in ODeL. In L. Darinskaia, & G. Molodtsova (Eds.), Modern technologies for teaching and learning in socio-humanitarian disciplines (pp. 114–137). Hershey, PA: IGI Global.

Mejía-Madrid, G., Llorens-Largo, F., & Molina-Carmona, R. (2020). Dashboard for evaluating the quality of open learning courses. Sustainability, 12(9), 3941.Available at: https://doi.org/10.3390/su12093941.

Microsoft. (2022). Get started with Microsoft Teams for remote learning. Retrieved from: https://docs.microsoft.com/en-us/microsoftteams/remote-learning-edu.

Ngoc, T. P., & Phung, L. T. K. (2021). Online language learning via moodle and microsoft teams: Students’ challenges and suggestions for improvement. Advances in Social Science, Education and Humanities Research, 533, 106–113.Available at: https://doi.org/10.2991/assehr.k.210226.013.

O’Neill, L. (2021). What is microsoft teams? Everything you need to know. Retrieved from: https://searchunifiedcommunications.techtarget.com/definition/Microsoft-Teams.

Omair, A. (2015). Selecting the appropriate study design for your research: Descriptive study designs. Journal of Health Specialties, 3(3), 153–156.Available at: https://doi.org/10.4103/1658-600x.159892.

Reguera, E. A. M., & Lopez, M. (2021). Using a digital whiteboard for student engagement in distance education. Computers & Electrical Engineering, 93, 107268.Available at: https://doi.org/10.1016/j.compeleceng.2021.107268.

Reschly, A. L., & Christenson, S. L. (2012). Jingle, jangle, and conceptual haziness: evolution and future directions of the engagement construct. In Christenson, S. L., Reschly, A. L, & Wylie, C. (Eds.), Handbook of research on student engagement (pp. 3–20). New York: Springer.

Robinson, C. (2012). Student engagement: What does this mean in practice in the context of higher education institutions? Journal of Applied Research in Higher Education, 4(2), 94–108.

Rugube, T. T., & Govender, D. (2022). Evaluation of a software model for integrating learning management systems and massive open online courses. International Journal of Innovative Research and Scientific Studies, 5(3), 170–183.Available at: https://doi.org/10.53894/ijirss.v5i3.493.

Sahni, J. (2019). Does blended learning enhance student engagement? Evidence from higher education. Journal of E-learning and Higher Education, 1-14.Available at: https://doi.org/10.5171/2019.121518.

Sharma, G. (2017). Pros and cons of different sampling techniques. International Journal of Applied Research, 3(7), 749-752.

Sobaih, A. E. E., Salem, A. E., Hasanein, A. M., & Elnasr, A. E. A. (2021). Responses to COVID-19 in higher education: Students’ learning experience using Microsoft Teams versus social network sites. Sustainability, 13, 10036.Available at: https://doi.org/10.3390/su131810036.

Tria, J. Z. (2020). The COVID-19 pandemic through the lens of education in the Philippines: The new normal. International Journal of Pedagogical Development and Lifelong Learning, 1(1), 2-4.Available at: https://doi.org/10.30935/ijpdll/8311.

Trowler, V., & Trowler, P. (2010). Student engagement literature review. Retrieved from: https://www.heacademy.ac.uk/system/files/studentengagementliteraturereview_1.pdf.

University of Reading. (2020). Introduction to using Microsoft Teams in teaching and learning. Retrieved from: https://sites.reading.ac.uk/tel-support/2020/04/03/using-microsoft-teams-in-teaching-and-learning.

Vaismoradi, M., Jones, J., Turunen, H., & Snelgrove, S. (2016). Theme development in qualitative content analysis and thematic analysis. Journal of Nursing Education and Practice, 6(5), 100-110.Available at: https://doi.org/10.5430/jnep.v6n5p100.

Viberg, O., Khalil, M., & Bergman, G. (2021). TimeTracker App: Facilitating migrants’ engagement in their second language learning. In: Auer M.E., Tsiatsos T. (Eds.), Internet of Things, Infrastructures and mobile applications, IMCL 2019. Advances in Intelligent Systems and Computing, 1192 (pp. 983–994). Cham: Springer.

Vogt, K. L. (2016). Measuring student engagement using learning management systems. Doctoral Dissertation, University of Toronto (Canada).  

Williams, D., & Whiting, A. (2016). Exploring the relationship between student engagement, Twitter, and a learning management system: A study of undergraduate marketing students. International Journal of Teaching and Learning in Higher Education, 28(3), 302-313.

Yazzie-Mintz, E. (2007). Voices of students on engagement: A report on the 2006 high school survey of student engagement. Retrieved from: http://www.indiana.edu/~ceep/hssse/images/HSSSE%20Overview%20Report%20-%202006.pdf.

Yildiz, E. P. (2021). Student opinions regarding Microsoft Teams app used as an educational online learning environment during pandemic. International Journal of Emerging Technology and Advanced Engineering, 2(5), 53-62.

Yin, R. K. (2014). Case study research: Design and methods. Thousand Oaks, CA: Sage.
