Open Access

Teaching efficacy of U.S. Air Force enlisted professional military educators during the COVID-19 pandemic

02 May 2021


Introduction

In Spring 2020, the COVID-19 pandemic forced schools and universities across the U.S. to shift traditionally in-person instruction to online learning environments, and U.S. Air Force Enlisted Professional Military Education (EPME) was not spared from this mandated change in pedagogic practice (Culbert 2020). Hundreds of instructors teaching leadership development interventions at multiple levels of professional military education found themselves forced to teach online for the first time in the history of EPME (Culbert 2020).

In-residence and facilitated online EPME courses were similar in that students attended both for the same length of time, completed the same classroom discussions and assessments aligned to similar learning outcomes and received the same number of college credits for attendance (Smith 2020). However, online EPME delivery lacked the tactile games and activities used to reinforce lesson concepts within the in-residence course, and it required mastery of videoconferencing and online collaboration tools not used during in-person instruction, forcing instructors to quickly develop alternative methods of delivering classroom material (Kingery 2020). This study aimed to examine the interrelationship between teachers’ self-efficacy and this shift in pedagogic practice.

U.S. Air Force guidance stated, “…distance learning instructors must complete the same qualification process as a traditional class room instructor” (Community College of the Air Force [CCAF] 2017, p. 25) and did not mandate pre-service or in-service training specific to teaching in online learning environments. Because no data had previously been collected on the efficacy of U.S. Air Force EPME instructors teaching online, this study examined the self-efficacy levels reported by instructors after they had begun to teach in online environments. The results of this study can be used to better inform enlisted and officer professional military education decision-makers across the Department of Defense (DOD) on where to surgically allocate faculty development resources for online instructors.

Background and literature review
Online EPME as a necessity

U.S. Air Force EPME prepares Airmen for leadership roles as they increase in both rank and responsibility (O’Neil 2020; Rivera & Shufelt 2016). Students are noncommissioned officers serving in roles which range from immediate supervisors to leaders responsible for managing, inspiring and motivating thousands of Airmen worldwide (O’Neil 2020; Bangari 2014; Rivera & Shufelt 2016). Air Force EPME consists of four levels of training, with Airmen attending each course as they advance in rank (AF/A1D 2018). There are over 500 instructors teaching the various levels of EPME across 80 schools globally (Thomas N. Barnes Center for Enlisted Education 2020a, 2020b, 2020c).

Tens of thousands of Airmen who are annually selected for promotion require EPME completion before assuming the next rank (AF/A1D 2018). Therefore, there was a need to continue courses during the COVID-19 pandemic. Due to the necessity of continued EPME, resistance to change was not an option, and EPME instructors and schoolhouses were forced to adopt online delivery methods for courses that had traditionally been taught in face-to-face environments (Culbert 2020). Air Force guidance stated that “…distance learning instructors must complete the same qualification process as a traditional class room instructor” (CCAF 2017, p. 25) and did not mandate pre-service or in-service training specific to teaching online. Therefore, this study aimed to glean insight into the perceived self-efficacy of U.S. Air Force EPME instructors to better inform future decisions regarding training and development for online faculty.

Synchronous and asynchronous online teaching and learning

Rapid technological developments have enabled the rise of effective and efficient online learning systems (Dhawan 2020). Online learning uses computer networks and internet connections to allow students and teachers to interact anywhere, at any time and in a rhythm and manner most beneficial to learning (Cojocariu et al. 2014). Synchronous online learning allows the learning environment to mimic in-person instruction with live lectures, discussions and interactions between students and teachers, whereas in asynchronous environments, learning can occur at different times and in different spaces particular to each learner (Guo 2020; Dhawan 2020). During the COVID-19 pandemic, U.S. Air Force EPME leveraged both delivery methods (Culbert 2020).

Teachers’ self-efficacy

Bernhardt (2018) noted that there are various types of data that one can analyse to examine the efficacy of learning environments, such as student learning data (e.g. success in reaching learning outcomes), process data (e.g. cost, organisational processes) and demographic data (e.g. attendance, graduation rates). This study examined teachers’ self-efficacy by obtaining perceptions data directly from faculty members, since perceptions data are ideal for analysing the thoughts, feelings and opinions of participants (Bernhardt 2018).

While Bandura (1977) defined the concept of self-efficacy as an individual's belief that they are capable of executing specific performance attainments, teachers’ self-efficacy refers to the self-efficacy of professional educators, and can be defined as the belief in one's abilities to “plan, organise, and carry out activities required to attain given educational goals” (Skaalvik & Skaalvik 2007, p. 612). Tschannen-Moran et al. (1998), in their comprehensive literature review, noted that higher senses of teachers’ self-efficacy bring forth feelings of perseverance, curiosity and drive to continuously improve pedagogic practices. Bandura (1977) noted that teachers with high levels of self-efficacy could maintain a positive outlook and succeed in situations where they feel unprepared or uncomfortable.

Teachers’ self-efficacy impacts teachers, students and schools (Stephanou et al. 2013). Skaalvik and Skaalvik (2019) noted a correlation between higher senses of self-efficacy and increased levels of student engagement. Zhang and Liu (2019) explained that teachers’ sense of self-efficacy plays a moderating role when examining motivational regulation, perceived task value and learner engagement, with the moderating effect increasing as the teachers’ sense of self-efficacy increases. Teachers’ self-efficacy has been examined from a myriad of angles, such as the difference in efficacy between male and female college professors (Chang et al. 2011), the difference in efficacy based on years of experience (Mehdinezhad 2012) and the difference in perceived self-efficacy of online teachers based on the age, education level, gender and number of classes taught online (Robinia and Anderson 2010).

Building upon foundational practices from eight previous methods used to measure teachers’ efficacy, Tschannen-Moran and Hoy (2001) crafted and validated an instrument known as the Ohio State teacher efficacy scale (OSTES), measuring the teachers’ sense of efficacy of K-12 educators. A factor analysis of pre-service teachers’ responses to the 24 items on the instrument identified three main factors: efficacy in instructional strategies (8 items; alpha = 0.91), efficacy in student engagement (8 items; alpha = 0.87) and efficacy for classroom management (8 items; alpha = 0.90), with an overall combined OSTES score also computed (24 items; alpha = 0.94).

The OSTES was further refined and validated when Robinia and Anderson (2010) modified the instrument (with permission) for use by nursing educators teaching online at the collegiate level. In addition to examining the three dimensions of teachers’ sense of efficacy within the OSTES, this revised instrument, known as the Michigan Nurse Educators Sense of Efficacy for Online Teaching (MNESEOT; see Appendix), brought the OSTES into the digital age by incorporating and examining an additional fourth dimension of teachers’ self-efficacy: computer use. The researchers validated the 32-item instrument by conducting a factor analysis, resulting in four factors: efficacy in online student engagement (0.93), efficacy in online instructional strategies (0.94), efficacy for online classroom management (0.93) and efficacy in the use of computers (0.86). A total score for the entire instrument (0.93) was also derived.

Robinia and Anderson (2010) administered the MNESEOT to 327 participants, receiving a 43% usable response rate, or 140 participants, from college-level nursing educators across the state of Michigan. Results indicated a positive and significant correlation between teachers who had met with an instructional support expert (such as an expert in educational technology or online delivery methods) and sense of teaching self-efficacy, as well as a positive and significant correlation between educators who had taken a course in online teaching and sense of teaching self-efficacy. Robinia and Anderson (2010) also noted a positive and significant correlation between the number of online teaching experiences and higher senses of teachers’ self-efficacy. Participants felt higher senses of teachers’ self-efficacy in the subscales of classroom management and instructional strategies, whereas student engagement was noted as the category where educators felt least confident when teaching online.

Horvitz et al. (2014) used a modified web-based MNESEOT instrument to ascertain the sense of efficacy of 91 college-level educators within a variety of organisations specialising in liberal arts and humanities, STEM and professional skills development. Results mirrored those of Robinia and Anderson (2010): the highest efficacy scores were observed in the instructional strategies and classroom management categories, whereas the lowest efficacy scores were observed in the student engagement category. Results indicated that there was a significant and positive correlation between the number of semesters taught online and the teachers’ sense of self-efficacy, indicating that as an educator has more experience teaching online, their sense of efficacy rises. Unique to the study was data indicating that educators affiliated with professional schools, as opposed to traditional colleges, displayed a significantly higher sense of efficacy in the subscale of computer use.

Aside from the MNESEOT, other instruments have been used to measure teachers’ sense of efficacy. Chang et al. (2011) examined the teachers’ self-efficacy of 513 university-level educators in Taiwan using the Faculty Teaching Efficacy Questionnaire, an instrument comprising 28 four-point Likert items. The items were affiliated with six factors: efficacy for course design, technology usage, instructional strategy, class management, interpersonal relation and learning assessment. Factor loadings for each factor were consistently large, ranging from 0.58 to 0.88, and the six factors together accounted for 73.59% of the total variance. Internal consistency reliabilities for course design, instructional strategy, technology usage, class management, interpersonal relation, learning assessment and the total scale were 0.91, 0.88, 0.93, 0.90, 0.86, 0.87 and 0.95, respectively (Chang et al. 2011). Results indicated that educators’ sense of teaching efficacy was lower for teachers with five or fewer years of experience than for their counterparts who had taught for 6 years or longer. Results also indicated that institutional investment in educators’ professional development, such as assigning coaches, mentors or specialists to hone the skill-sets of educators, positively correlated with teachers’ sense of self-efficacy.

Although teachers’ self-efficacy has been thoroughly studied within K-12 and postsecondary education (Norton 2013; Walsh et al. 2020; Avalos 2011; Loughland and Nguyen 2020), far less is known about the self-efficacy of U.S. Air Force EPME instructors teaching leadership interventions at the collegiate level. This study aimed to continue the work of Robinia and Anderson (2010) by utilising their self-efficacy measurement instrument (with permission) to examine the variables that affect the self-efficacy of EPME instructors who were forced to teach online during the COVID-19 pandemic without specialised training to instruct in online learning environments.

Research questions

Online EPME instructors were not required to be trained to teach in ways specific to online learning environments (CCAF 2017). With hundreds of instructors teaching online with no formalised training specific to online instruction, this study began with the following research questions:

1. How confident do EPME instructors feel in preparing, conducting and evaluating online courses?

2. Is there a difference in online teaching efficacy in relation to the variables: (a) military rank, (b) years of teaching EPME, (c) number of online classes taught, (d) level of EPME at which the instructor is teaching, (e) instructor education level and (f) the qualification level of the instructor?

3. In what ways do professional development, experience with online teaching and perceived support from colleagues or instructional support specialists influence EPME instructors’ reported self-efficacy when teaching online?

Hypotheses

This study examined instructors who were not formally trained to utilise online instructional technologies. As Robinia and Anderson's (2010) results indicated a positive relationship between the number of online teaching experiences and increased self-efficacy levels for online teaching, as well as a positive relationship between working with instructional experts and increased self-efficacy, the hypotheses for this study were as follows:

Hypothesis 1: Of the four dimensions of self-efficacy examined, efficacy in technology use will be the lowest amongst EPME instructors.

Hypothesis 2: There will be a positive relationship between the number of online teaching experiences and increased self-efficacy levels for online teaching.

Hypothesis 3: EPME instructors who report experiences of working with instructional support specialists will have significantly higher levels of online teaching efficacy.

Methods

An online survey was distributed to EPME schoolhouses in September 2020 using a cross-sectional design to examine the sense of self-efficacy of instructors across all four levels of EPME. Instructors were asked to participate if they were currently teaching or had previously taught EPME online during the COVID-19 pandemic. A review of the EPME faculty training database and an examination of the number of schools teaching courses online revealed a population estimate of 500. The useable response rate was 26%, or 129 participants. The typical survey respondent was serving in the non-commissioned officer pay grades of E-5 or E-6 (66%; see Table 1), was assigned to the lowest level of U.S. Air Force EPME, Airman Leadership School (58%; see Table 2), was a fully qualified instructor who had completed all on-the-job training requirements (89%), held an associate's degree as the highest level of education (52%) and had almost 2 years of teaching experience (M = 1.7, SD = 1). This study was granted exempt status by the institutional review board at Air University.

Table 1. Participants by rank/grade

Rank n %
CMSgt/E-9 1 0.8%
SMSgt/E-8 12 9.3%
MSgt/E-7 33 25.6%
TSgt/E-6 57 44.2%
SSgt/E-5 26 20.2%

Table 2. Respondents teaching at each level of EPME

EPME level n %
Airman Leadership School 75 58.1%
Non-commissioned Officer Academy 49 37.9%
Senior Non-commissioned Officer Academy 4 3.1%
Chief Leadership Course 1 0.8%

Instrument

The instrument for this study replicated Robinia and Anderson's (2010) sense of efficacy survey. Since they were not tailored to any specific institution or profession, all 32 questions pertaining to the four dimensions of teachers’ self-efficacy (classroom management, student engagement, technology use and instructional strategies) used the exact wording from Robinia and Anderson's (2010) instrument. Modifications were only made to demographic questions at the end of the survey, as this study examined military professionals as opposed to nurse educators.

This study was undertaken to provide actionable data to U.S. Air Force decision-makers. Therefore, the question of gender was intentionally omitted from the background questionnaire, because the U.S. Air Force does not consider gender in the recruitment, hiring, training, evaluation or retention of EPME faculty (Military Assignments Programs Branch 2020). This study treated the EPME instructor role as gender-neutral.

Cronbach's alpha for the entire instrument was established as 0.97. Based on a factor analysis of the 32 items in the instrument, four factors were derived coinciding with Robinia and Anderson's (2010) four dimensions of teachers’ self-efficacy. Subscale Cronbach's alphas were computed for student engagement (0.92), instructional strategies (0.90), classroom management (0.84) and computer skills (0.85).
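For readers who wish to reproduce this type of reliability check, the following sketch computes Cronbach's alpha for a block of Likert items; the item names and randomly generated responses are hypothetical and are not the study's data.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of Likert items (rows = respondents, columns = items)."""
    items = items.dropna()
    k = items.shape[1]                              # number of items in the (sub)scale
    item_variances = items.var(axis=0, ddof=1)      # variance of each individual item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical responses: 129 respondents answering 8 student-engagement items on a 1-9 scale.
# Random data will produce an alpha near zero; the real survey items yielded 0.92.
rng = np.random.default_rng(0)
engagement = pd.DataFrame(rng.integers(1, 10, size=(129, 8)),
                          columns=[f"se_{i}" for i in range(1, 9)])
print(f"Student engagement alpha: {cronbach_alpha(engagement):.2f}")
```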

Process for data collection

These data were collected using an online survey, and all questionnaires were stored and administered using commercial survey software. After official e-mail addresses for all instructors were obtained from EPME headquarters, instructors were contacted through official U.S. Air Force e-mail requesting their participation. The e-mail contained a link providing directions and access to the survey. The survey was closed 2 weeks after the initial solicitation for participants.

Data analysis

This study employed descriptive statistics to describe the sample through frequencies and cross-tab comparisons with population statistics. Means and standard deviations were calculated to answer the questions posed within this research. Pearson's product–moment correlation coefficient was used to determine correlations between interval-level variables. Analysis of variance was used to test differences between the means of the online teaching efficacy scores. An alpha of 0.05 was used throughout all tests.
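As a rough illustration of this analytic approach (not the study's actual analysis code), the sketch below computes descriptive statistics, a Pearson product–moment correlation and a one-way ANOVA on a small, entirely hypothetical data frame.

```python
import pandas as pd
from scipy import stats

# Entirely hypothetical respondent-level data.
df = pd.DataFrame({
    "total_efficacy": [6.1, 7.3, 5.8, 8.0, 6.9, 7.5, 6.2, 7.0],
    "years_teaching": [0.5, 1.2, 1.8, 2.5, 3.0, 3.5, 0.8, 2.2],
    "rank":           ["E-5", "E-5", "E-6", "E-6", "E-6", "E-7", "E-7", "E-7"],
})

# Descriptive statistics: means and standard deviations.
print(df[["total_efficacy", "years_teaching"]].agg(["mean", "std"]).round(2))

# Pearson's product-moment correlation between two interval-level variables.
r, p = stats.pearsonr(df["years_teaching"], df["total_efficacy"])
print(f"r = {r:.2f}, p = {p:.3f}")

# One-way ANOVA testing for differences in mean efficacy across groups (here, rank).
groups = [g["total_efficacy"].values for _, g in df.groupby("rank")]
f, p = stats.f_oneway(*groups)
print(f"F = {f:.2f}, p = {p:.3f} (alpha = 0.05)")
```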

Factor analysis

Factor analysis was conducted to ensure items within the questionnaire measured the appropriate category of teachers’ self-efficacy. With a sample size of 129 participants, principal axis factor analysis with varimax rotation was conducted in SPSS version 27 to assess the underlying structure of the 32 items of the instrument. Four factors were requested, based on the presumption that the questions were designed to index four constructs: classroom management, technology use, learner engagement and instructional strategies. After rotation, four components emerged with eigenvalues greater than one, explaining 19%, 17%, 15% and 8% of the variance, respectively. As recommended by Morgan et al. (2019), factor loadings below 0.3 were suppressed to improve clarity, since the correlation matrix revealed a majority of coefficients of 0.3 and greater. The Kaiser–Meyer–Olkin measure of sampling adequacy was > 0.70, exceeding the recommended value of 0.6 (Pallant 2020). Bartlett's test of sphericity produced a value < 0.05, substantiating the factorability of the correlation matrix (Morgan et al. 2019).

Varimax rotation indicated that 11 items loaded strongly on one component consisting of questions related to classroom management. Twelve items loaded strongly on two components consisting mainly of questions related to technology use. Nine items loaded strongly on a third component consisting of questions related to instructional strategies. Questions related to student engagement were represented across all four components.
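An analysis of this kind can be approximated in Python using the third-party factor_analyzer package, as sketched below. The data here are randomly generated stand-ins for the 129 x 32 response matrix, and the "principal" extraction option only approximates the principal axis factoring performed in SPSS.

```python
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

# Stand-in data: 129 respondents x 32 items on a 1-9 scale (replace with real responses).
rng = np.random.default_rng(42)
items = pd.DataFrame(rng.integers(1, 10, size=(129, 32)),
                     columns=[f"item_{i}" for i in range(1, 33)])

# Factorability checks: Bartlett's test of sphericity and Kaiser-Meyer-Olkin adequacy.
chi_sq, bartlett_p = calculate_bartlett_sphericity(items)
_, kmo_total = calculate_kmo(items)
print(f"Bartlett p = {bartlett_p:.4f} (want < 0.05), KMO = {kmo_total:.2f} (want > 0.6)")

# Four-factor extraction with varimax rotation.
fa = FactorAnalyzer(n_factors=4, rotation="varimax", method="principal")
fa.fit(items)

# Rotated loadings, suppressing values below 0.3 as in the reporting convention above.
loadings = pd.DataFrame(fa.loadings_, index=items.columns,
                        columns=[f"F{i}" for i in range(1, 5)])
print(loadings.where(loadings.abs() >= 0.3).round(2))

# Proportion of total variance explained by each rotated factor.
print(fa.get_factor_variance()[1].round(2))
```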

Results
Efficacy levels of EPME instructors related to online teaching

The 32-item instrument directed participants to respond to items using a Likert-scale format ranging from nothing (1) to a great deal (9) in relation to questions pertaining to online teaching.

Mean subscale scores within the domains of classroom management, instructional strategies, student engagement and technology use were calculated (Table 3). These scores were then combined to create a total sense of efficacy score, ranging from 4 to 36. Generally, participants indicated they felt they could do between “some” and “quite a bit” when tasked with preparing, conducting and evaluating online courses. No participant felt they could do “nothing” in any category. The mean score for the instrument was 26.92 with a standard deviation of 5.25.

Table 3. Online teaching efficacy ratings

Efficacy dimension M SD Range
Student engagement 6.34 1.42 2.38–9
Instructional strategies 6.67 1.32 3.38–9
Classroom management 7.04 1.25 3.86–9
Technology use 6.89 1.26 3.0–9
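The sketch below illustrates how the subscale scores and the 4–36 total sense of efficacy score described above can be derived from the 32 item responses; the column prefixes and randomly generated responses are hypothetical.

```python
import numpy as np
import pandas as pd

# Hypothetical 1-9 Likert responses: 129 respondents x 32 items, eight per subscale.
rng = np.random.default_rng(1)
prefixes = ["se", "is", "cm", "tu"]  # student engagement, instructional strategies,
                                     # classroom management, technology use
cols = [f"{p}_{i}" for p in prefixes for i in range(1, 9)]
items = pd.DataFrame(rng.integers(1, 10, size=(129, 32)), columns=cols)

# Each subscale score is the mean of its eight items (possible range 1-9).
scores = pd.DataFrame({p: items[[c for c in cols if c.startswith(p)]].mean(axis=1)
                       for p in prefixes})

# The total sense-of-efficacy score is the sum of the four subscale means (range 4-36).
scores["total"] = scores.sum(axis=1)

print(scores.agg(["mean", "std", "min", "max"]).round(2))
```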

One-way between-groups analysis of variance showed no significant difference in sense of efficacy scores for rank [F(4,123) = 0.957, p = 0.434], level of EPME [F(6,122) = 1.85, p = 0.95], instructor's education level [F(2,126) = 1.09, p = 0.34] or number of online courses taught [F(5,121) = 1.73, p = 0.13].

An independent samples t-test found no significant difference between instructor trainees (n = 9, M = 7, SD = 0.76) and fully qualified instructors (n = 116, M = 6.69, SD = 1.26), t(123) = 0.713, p = 0.48, or respondents who had a degree in education (n = 19, M = 6.57, SD = 1.24) and respondents without a degree in education (n = 107, M = 6.71, SD = 1.22), t(124) = −0.465, p = 0.64.

One-way between-groups analysis of variance showed a significant difference in sense of efficacy scores based on the number of years having taught EPME [F(3,124) = 4.148, p = 0.008]. Responses regarding years of teaching EPME were further categorised as (a) less than 1 year, (b) between 1 and 2 years, (c) between 2 and 3 years and (d) 3 or more years, as a rise or fall in teachers’ sense of self-efficacy was noted at each of those points (see Figure 1). The effect size was medium to large at 0.09 according to Cohen (2013). Post hoc comparisons using Tukey's honestly significant difference test indicated a significant difference between instructors who had taught between 1 and 2 years and instructors who had taught between 2 and 3 years (p < 0.05).

Fig. 1: Efficacy of EPME instructors based on years of experience.
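The ANOVA, eta-squared effect size and Tukey post hoc comparisons reported above can be reproduced along the lines of the following sketch, which uses randomly generated stand-in data rather than the study's survey responses.

```python
import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Simulated stand-in data: one efficacy score and one experience band per respondent.
rng = np.random.default_rng(2)
bands = ["<1 yr", "1-2 yrs", "2-3 yrs", "3+ yrs"]
df = pd.DataFrame({
    "years_band": rng.choice(bands, size=128),
    "efficacy":   rng.normal(6.7, 1.2, size=128),
})

# One-way between-groups ANOVA across the four experience bands.
groups = [g["efficacy"].to_numpy() for _, g in df.groupby("years_band")]
f_stat, p_val = stats.f_oneway(*groups)

# Eta-squared effect size: between-group sum of squares over total sum of squares.
grand_mean = df["efficacy"].mean()
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_total = ((df["efficacy"] - grand_mean) ** 2).sum()
print(f"F = {f_stat:.3f}, p = {p_val:.3f}, eta-squared = {ss_between / ss_total:.2f}")

# Tukey's honestly significant difference post hoc comparisons between bands.
print(pairwise_tukeyhsd(df["efficacy"], df["years_band"], alpha=0.05))
```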

Preparatory experiences for online EPME instructors

Participants were asked to identify whether they had any of five possible preparatory experiences for online teaching: (a) having a teaching degree, (b) taking a seminar on online teaching, (c) taking a course on online teaching, (d) meeting with a peer mentor on a regular basis during an online teaching experience and (e) meeting with an instructional support expert during an online teaching experience. Participants who responded affirmatively were directed to additional questions asking them to rate their agreement that the experience had prepared them with the skills necessary to teach online. A Likert-type scale was used with ratings of 1 = strongly disagree, 2 = slightly disagree, 3 = neutral, 4 = agree and 5 = strongly agree.

An independent samples t-test found a significant difference in total sense of efficacy scores between instructors who had met with an instructional support expert during an online teaching experience (M = 7.09, SD = 1.07) and instructors who had not (M = 6.56, SD = 1.21), t(122) = −2.02, p = 0.04. Pearson's product–moment coefficient revealed a positive and significant correlation between meeting with an instructional support expert during an online teaching experience and overall sense of efficacy, with a medium or typical effect size (0.47) according to Cohen (2013).

Independent samples t-tests found no significant differences in the efficacy levels between instructors who had a teaching degree (M = 6.57, SD = 1.24) and those who did not have a teaching degree (M = 6.71, SD = 1.21), t(124) = −0.465, p = 0.66, those who had taken a course on online teaching (M = 6.57, SD = 1.30) and those who had not (M = 6.74, SD = 1.19), t(124) = −0.672, p = 0.5, those who had attended a seminar on online teaching (M = 6.7, SD = 1.30) and those who had not (M = 6.7, SD = 1.19), t(124) = 0.032, p = 0.97 or those who had met with peer mentors on a regular basis during an online teaching experience (M = 6.83, SD = 1.11) and those who had not (M = 6.57, SD =1.29), t(124) = 1.18, p = 0.24.
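The group-comparison logic of the two preceding paragraphs is illustrated in the sketch below, which uses simulated scores centred on the reported group means; the point-biserial correlation is the analogue of the Pearson coefficient for a dichotomous grouping variable.

```python
import numpy as np
from scipy import stats

# Simulated scores centred on the reported group means: instructors who met with an
# instructional support expert during an online teaching experience vs. those who did not.
rng = np.random.default_rng(3)
met_expert = rng.normal(7.09, 1.07, size=40)
no_expert = rng.normal(6.56, 1.21, size=84)

# Independent samples t-test (equal variances assumed, the scipy default).
t_stat, p_val = stats.ttest_ind(met_expert, no_expert)
print(f"t = {t_stat:.2f}, p = {p_val:.3f}")

# Point-biserial correlation: Pearson's r for a dichotomous grouping variable
# (1 = met with an expert, 0 = did not) against the continuous efficacy score.
group = np.concatenate([np.ones_like(met_expert), np.zeros_like(no_expert)])
score = np.concatenate([met_expert, no_expert])
r, p_r = stats.pointbiserialr(group, score)
print(f"r = {r:.2f}, p = {p_r:.3f}")
```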

Hypothesis findings

Hypothesis 1 was not supported, as the self-efficacy dimension of technology use was not the lowest reported among EPME instructors. Hypothesis 2 was also not supported, as there was no positive relationship between the number of online teaching experiences and increased self-efficacy levels for online teaching. Hypothesis 3 was supported, as EPME instructors who reported working with instructional support specialists also reported significantly higher levels of online teaching efficacy.

Discussion

Results from this study of 129 U.S. Air Force EPME instructors indicate that they are relatively confident in their abilities to prepare for, conduct and evaluate online courses. Across all ranks and EPME levels, instructors expressed that they felt they could do between “some” and “quite a bit” across the four dimensions of teachers’ self-efficacy when teaching online. This is the same range of confidence levels reported by Robinia and Anderson (2010). These data indicate that current pre-service and in-service faculty development efforts, although they do not focus on online teaching strategies, are sufficient for developing faculty to teach online.

Results indicate that there is no significant difference in perceived levels of teachers’ self-efficacy between military ranks, levels of EPME at which instructors teach, education levels of instructors, or instructors who have and have not completed all instructor qualification training requirements. Therefore, when managing talent within EPME, DOD leaders should consider that instructors across the above-mentioned categories feel equally confident and competent in their abilities to accomplish tasks and meet goals. Talent management decisions could potentially be made without regard to rank, level at which an instructor is teaching or the instructor's education level, instead focussing on overall experience in the EPME enterprise.

Analysis indicates a significant positive relationship between the amount of time an instructor has been teaching and their sense of self-efficacy when teaching online courses, results similar to those reported by Horvitz et al. (2014) and Robinia and Anderson (2010). Results reveal that instructors with minimal experience feel relatively high levels of confidence, with that confidence decreasing over the first year of teaching before rising once more as experience is gained. This drop and subsequent rise in teachers’ sense of efficacy has been noted by Tollerud (1990) and Tschannen-Moran et al. (1998). Chang et al. (2011), Horvitz et al. (2014) and Robinia and Anderson (2010) noted that teachers with higher senses of self-efficacy are more confident, display more agility and engage better with students in online learning environments. Thus, DOD professional military education leadership should consider choosing more seasoned instructors to teach online, since they would potentially have higher senses of self-efficacy and create more engaging and effective online learning environments.

This study also highlighted the importance of preparatory experiences for online instructors, showing a significant positive correlation between higher efficacy scores and collaboration with instructional support specialists when teaching online. Instructional support specialists provide coaching and mentorship to teachers, model effective teaching strategies and can specialise in areas such as educational psychology or educational technology (Lee 2001). These results coincide with prior studies on online instruction, which reiterated the need for external support to facilitate online instruction (Johnson 2008; Kim & Bonk 2006; Maguire 2005; Robinia & Anderson 2010). DOD professional military education leadership who wish to increase the efficacy of online instructors should consider integrating instructional support specialists into faculty training and development efforts.

Student engagement was the efficacy sub-category where instructors felt the least confident and competent when teaching online, similar to the findings of Robinia and Anderson (2010) and Horvitz et al. (2014). While pre-service and in-service instructor training appears to sufficiently prepare instructors to teach online, future modifications to training should focus on student engagement techniques within online learning environments.

Limitations

This study relied solely on perceptions data. Perceptions data refer to data which have been collected to examine peoples’ beliefs, sentiments and feelings about an educational experience (Bernhardt 2018). To gain a more holistic understanding of the efficacy of EPME instructors teaching online, future studies should incorporate the remaining three facets of data-driven decision-making as per Bernhardt (2018): student learning data (e.g. success in reaching learning outcomes), process data (e.g. cost, organisational processes) and demographic data (e.g. attendance, graduation rates).

Additionally, this study only collected perceptions data from EPME instructors. Future studies on EPME instructor efficacy should triangulate perceptions data from multiple sources, including students, administrators and EPME leadership teams. This would broaden the pool of participants and allow disparate viewpoints of the same instructors to be used to strengthen the validity of findings (Carter et al. 2014).

During this study, the Airman Leadership School level of EPME delivered the same curriculum as they had previously taught in-residence for the preceding 12 months, while the Non-Commissioned and Senior Non-Commissioned Officer Academy schoolhouses fielded new curricula in conjunction with their shift to online teaching and learning and participated in this study after teaching two courses lasting 25 days each (Smith 2020; Stephens 2020). Future studies should examine the efficacy of the Non-Commissioned and Senior Non-Commissioned Officer Academy instructors after they have become more familiar and comfortable with course content, since the lack of familiarity with the curriculum could have affected the instructors’ sense of teachers’ self-efficacy.
