Editor’s Note: This is a study to determine the reasons for low completion rates in faculty training courses in a corporate university. It identifies forces external to the course that affect completion rates and dispels the assumption that online courses are inferior in content or pedagogy.

Factors Affecting Completion Rates
in Asynchronous Online Facilitated
Faculty Professional Development Courses

John Sener and Robert L. Hawkins
United States

Abstract

Course or program completion has long attracted great interest and some controversy, whether expressed in positive terms (e.g. retention) or negative ones (e.g. attrition). One corporate university conducted a study to determine possible causes for a large disparity in completion rates between classroom and online courses in its Faculty Professional Development (FPD) program. The study focused on factors identified from previous course completion studies and additional factors derived from observational analysis of FPD courses. Study results indicated that time conflicts with work commitments, level of organizational support, and learners’ early course experience were important factors affecting completion rates in online facilitated courses, corroborating the findings of earlier studies. Learners’ technology proficiency and comfort level did not affect course completion in this study, however. The study also identified possible contributory factors to consider when analyzing course completion rates or comparing them across delivery modes, including course completion requirements, enrollment and withdrawal policies, learners’ delivery mode preferences, and course quality. Study results strongly suggest that achieving comparable completion rates for online facilitated courses relative to classroom courses requires the development of support structures comparable to existing structures for classroom and web-based training courses.

Keywords: course completion, retention, student success, attrition, dropout, faculty professional development, online education, corporate university, asynchronous courses, online facilitated courses.

Introduction

Course or program completion has long been an issue that attracts interest and controversy, whether expressed in positive terms (retention, completion, student success) or negative (attrition, dropout, non-completion). One corporate university noticed a large disparity in completion rates after introducing online facilitated courses to its faculty professional development program. It decided to conduct a study to determine possible causes for this disparity.

This corporate university is a consortium of education and training institutions and organizations with a headquarters and five regional campuses scattered across the U.S. Its stated mission is to provide practitioner training, career management, and services to enable its target community “to make smart business decisions and deliver timely and affordable capabilities.” It serves a worldwide constituency and has consistently attempted to be on the leading edge of integrating training, education, and emerging technologies.

In late 2004, it embarked upon an effort to expand its learning options for students to include online facilitated courses. (These courses were distinct from the corporate university’s web-based training (WBT), which relies on self-paced, asynchronous, content-led delivery; the online facilitated courses were instructor-facilitated, using discussion forums, e-mail, and other communication tools.)

The first courses developed were for the university’s internal faculty professional development program (FPDP); offerings were later expanded to include “assignment-specific” courses for the university’s primary constituency. In total, the FPD program offers eleven education-related courses for its faculty, four of which have been converted to online facilitated format.

After several offerings of these online facilitated faculty development courses, it became apparent that many participants who started courses did not complete them. Analysis of initial results indicated a huge disparity in completion rates between classroom and online facilitated courses (44% online vs. 98% classroom). Consequently, a study was initiated to determine the possible reasons for this disparity in completion rates between delivery modes.

The Issue of Completion Rates in Online Courses

Some studies suggest that corporations are finding no significant difference in learning and performance between distance education and face-to-face courses, comparable to the “no significant difference” phenomenon reported in higher education institutions (Russell, 1998; Zolkos, 1999). Distance education in the corporate sector is frequently linked to specific job requirements, unlike in higher education where students have more flexibility in course or program choices (Tyler-Smith, 2006; Henke & Russum, 2002). As a self-contained curriculum, the FPDP courses are selectively mandatory as dictated by an internal policy directive on faculty certification. FPD courses can be required for job performance and may affect employees’ job ratings and future career opportunities. Application of these requirements varies by campus and course, so online facilitated FPD courses are often not mandatory; this has implications for course completion rates.

Many research studies have found that online courses exhibit higher attrition rates than on-campus courses (Diaz, 2002; Royer, 2006; Diaz & Cartnal, 2006). This has contributed to a belief that higher attrition rates are a major weakness of online education (Carr, 2000; O'Brien & Renner, 2002), despite the fact that online students often outperform traditional students (Diaz, 2002; Royer, 2006).

Higher drop rates do not necessarily indicate lack of success or even the presence of a problem. There are many factors besides the actual delivery mode that can account for the difference in retention rates between classroom-based and online courses:

  • Online students are more likely to be employed. One study, for instance, found that five out of six online students were employed and thus could not attend traditional classes. The study noted that “employment responsibilities may also contribute to the attrition rate” in online courses relative to campus-based courses with lower student employment rates (Bocchi, Eastman, and Swift 2004).

  • Online students are more mature. Many online students drop classes “because it is the right thing to do” for them, i.e., as a “mature, well-informed decision that is consistent with a learner with significant academic and life experience” (Diaz 2002).

  • Job performance is a better measure. In one study, fully 25 percent of survey respondents (n=375) dropped out of an online course because they learned what they needed to know in order to do the job before the course ended (Wang et al. 2003). Although this finding may overstate the case, it highlights the fact that dropout rates alone are often not a reliable method of evaluating effectiveness of e-learning courses.

  • Statistics lie. As one researcher noted, “statistics on retention and drop outs are, at best, fragmented, do not compare like with like, and are either unreliable and/or misleading” (Tyler-Smith 2006). Thus it is very important to examine completion rates within the context of individual programs to ascertain their meaning and implications.

  • Other possible factors include student characteristics (demographics), course and instructional quality, subject matter, “socioeconomic factors, disabilities, or apathy” (Diaz 2002); “cumulative grade point average, class rank, number of previous courses completed online, searching the Internet training, operating systems and file management training, and Internet applications training” (Dupin-Bryant 2004).

Another study notes “there has been very little research on dropouts in online education.” Available research suggests that the reasons for dropping out of distance education programs are “complex, multiple, and interrelated” in the aggregate and “varied and unique to each individual” (Willging and Johnson 2004). This observation applies to online courses (Tyler-Smith 2006).

Reasons for Non-Completion of Online Courses

In the corporate university environment there are many potential reasons for non-completion of courses:

Time conflicts with work commitments: This issue has several dimensions, as reported in previous research findings:

  1. Increased work hours. Over one-quarter (27.7%) of respondents in one study reported that their work hours increased while doing e-learning courses.

  2. Infringement on non-work hours. Almost one-third (30.4%) of respondents in the same study reported doing their e-learning from home (Wang et al, 2003), while e-learners in another study reported that about 60% of their time spent on the course utilized their personal time rather than work time (Thalheimer 2004).

  3. Reduced capacity to perform work and e-learning duties. Work tasks and non-supportive policies (e.g., office “open door” policies) increased dropout rates (Wang et al. 2003).

Asking employees to learn on their own time or to juggle e-learning with full-time work responsibilities may compromise results. Assigning e-learning as an additional responsibility requires employees to sacrifice something, usually personal time at home. Assigning e-learning on top of regular work duties gives it a lower priority, which also decreases the likelihood of completion (Takiya et al. 2005).

Lack of organizational support creates problems even for motivated learners. One study found that programs with “top-level visibility” organizational support had higher completion rates (Wang et al. 2003).

Cognitive load factors include “technical access, asynchronicity, text-based discussions, multiple conversations, information overload and [physical] isolation” (Tyler-Smith 2006; Whipp and Chiarelli 2004). Unfamiliarity with these elements of the learning environment increases the cognitive load and can make the initial stages of e-learning a daunting task, particularly for first-time learners.

User proficiency and comfort with technology: Even in a technology-rich workplace, not all employees are highly proficient or comfortable with using technology. The “technology hurdle” can cause large enrollment drops early in a course (Kleinman & Entin 2002). Comfort level is also a key factor: one study reported a 26% dropout rate, and 13.6% of respondents reported feelings of “discomfort” or “high discomfort” with the technologies; this alone could account for up to half of the reported dropout rate (Wang et al. 2003).

Lack of motivation: Upon closer inspection, this seemingly obvious factor is more complex than it appears. Many researchers cited lack of motivation as a cause of higher attrition (Diaz and Cartnal 2006; Moore, Sener, and Fetzner 2006; Wang et al. 2003; Diaz 2002). There are a host of external and internal factors, such as the quality of the course introduction and overview (Conrad 2002), course and instructor quality (Diaz 2002), or locus of control (Parker 1999), that can adversely affect motivation. Other causes of lack of motivation are ones previously noted: time conflicts with work commitments, cognitive load factors, lack of top-level organizational support, and lack of user proficiency or comfort with technology. Thus it makes more sense to look at the causal factors that lead to “lack of motivation” than at the symptom itself.

Early attrition in e-Learning: The British Open University found that 35% or more of e-Learners withdraw before submitting their first assignment (Simpson 2004, p. 83), which suggests that a learner’s initial experience with e-Learning may well have a significant impact on a decision to drop out (Tyler-Smith, 2006). Another study found higher course dropout rates for online courses prior to instruction but almost identical course completion rates for classroom and online courses once instruction started, indicating that non-instructional factors contributed to early attrition and accounted for all of the difference in dropout rates (Frydenberg, 2007).

Data Collection Methodology

Course completion results were compiled and analyzed for online facilitated courses, using course enrollments, course headcount, and faculty headcount as measures.

An online survey was developed to obtain quantitative and qualitative information from faculty who had enrolled in online facilitated FPD course offerings. The survey questions for this study were derived primarily from a literature review of existing studies and focused on such factors as time-related issues, organizational support, technology proficiency, and learners’ initial course experience. Survey questions were also derived from observational analysis of perceived variables in the FPD courses, for example the level of “robustness” designed into the courses and the resulting effect on learning quality and time requirements.

The online survey was sent out via e-mail to 59 of the 76 faculty who had enrolled in one or more online facilitated FPD courses. Faculty who had left the university and for whom no current contact information was available were not included. The target population included some faculty who had completed the classroom version of a course and were subsequently invited to participate in piloting the initial online course offerings. A total of 44 responses were received, for a 75% response rate (58% of the total target population).

Based on the preliminary results of the online survey, an additional ‘survey’ request was sent out by e-mail to all faculty, asking for responses from faculty who had “thought about and/or wanted to sign up for an online facilitated course but did not.” Respondents were asked to explain why they did not sign up and to identify the specific obstacle(s) which may have stopped them. A total of 29 narrative responses were received from this second survey.

Findings

Course Completion Results

A total of 44 faculty completed one or more FPD courses. Of these, nine successfully completed two courses and two completed three courses, for a total of 57 successful completions. Course completion rates ranged from 57% to 62% depending on the measure used (course enrollments, course headcount, or faculty headcount; Table 1). Eight faculty took a course more than once, and five of these completed the course successfully on the second try.

Table 1

Course Completion Results for Online Facilitated Courses

Key Variable                                          Results
Headcount for all course offerings                         76
# of enrollments for all course offerings                 100
% of completions by course enrollment                      57%
Headcount by course                                         92
% of completions by course headcount                        62%
% of faculty who completed any course [headcount]           59%
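
The three measures in Table 1 differ only in their denominators: every enrollment, every distinct faculty-course pairing, or every distinct faculty member. A minimal sketch of that arithmetic is shown below; the enrollment records, field names, and course codes are hypothetical illustrations, not the study's data.

    # Hypothetical records: (faculty_id, course_id, completed) -- illustrative only.
    enrollments = [
        ("f01", "FPD-101", False),  # first attempt, not completed
        ("f01", "FPD-101", True),   # second attempt, completed
        ("f02", "FPD-102", True),
        ("f03", "FPD-101", False),
    ]

    # 1. Completion rate by course enrollment: every sign-up counts in the denominator.
    by_enrollment = sum(done for _, _, done in enrollments) / len(enrollments)

    # 2. Completion rate by course headcount: each (faculty, course) pair counts once,
    #    and counts as completed if any attempt at that course succeeded.
    pairs = {}
    for fid, cid, done in enrollments:
        pairs[(fid, cid)] = pairs.get((fid, cid), False) or done
    by_course_headcount = sum(pairs.values()) / len(pairs)

    # 3. Completion rate by faculty headcount: each person counts once, and counts as a
    #    completer if they completed any course.
    faculty = {}
    for fid, _, done in enrollments:
        faculty[fid] = faculty.get(fid, False) or done
    by_faculty_headcount = sum(faculty.values()) / len(faculty)

    print(by_enrollment, by_course_headcount, by_faculty_headcount)

Under definitions like these, repeat attempts at the same course are what make the enrollment-based rate (57%) lower than the headcount-based rates (62% and 59%) reported above.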

Online Survey Results

Almost two-thirds of the respondents were male. Over 60% were aged 50 and over, and more than 90% were age 40 and over, reflecting the university’s practice of hiring faculty with substantial prior experience in their areas of expertise (Table 2).

Table 2

Respondent Demographic Characteristics

Gender                   %         #
Male                     65.91%    29
Female                   34.09%    15

Age by Gender

Age       %          #     Male    Female
20-29     0.00%      0     0       0
30-39     9.09%      4     2       2
40-49     29.55%     13    9       4
50-59     50.00%     22    13      9
60+       11.36%     5     5       0

Respondents reported a variety of reasons for enrolling in a facilitated online course. Almost one-half cited fit with career goals and/or other professional development reasons. Over one-third cited convenience, and almost one-third cited flexibility of schedule, as reasons for enrolling. Over one-third of respondents also reported other reasons for enrolling; the most commonly cited were to meet faculty certification requirements (seven responses) and because the course was only offered in an online format (four responses; Table 3).

Table 3

Reasons for Enrolling in a Facilitated Online Course

Reasons for Enrolling                               %         #
Flexibility of schedule                             31.82%    14
Convenience of taking the course online             36.36%    16
Fit well with career goals                          47.73%    21
Other professional development reasons              45.45%    20
Other (please specify)                              36.36%    16
    Meet training/certification requirements                   8
    Only offered in online facilitated format                  4
    Preparing to develop a similar course                      1
    Support a curriculum development project                   1
    Meet rank advancement requirements                         1
    I am an online instructor                                  1
    [I was] encouraged to take the course                      1
    Wanted to become familiar with this tool                   1

Of the 44 respondents, 27 were “completers” and 17 were non- or partial completers (Table 4). Compared against actual course completion data, the reported numbers indicate that 61% of completers responded versus 46% of non- or partial completers. However, the survey allowed respondents to self-identify as course completers based on the course rather than the course offering, which could account for most or all of this variance, since several faculty successfully completed a course on the second try after not completing it on the first. Because the survey was anonymous, it is impossible to tell how many of these respondents self-identified as completers rather than partial completers. Thus course completers appear to have responded to the survey in roughly the same proportion as non- or partial completers, and there is a high likelihood that respondents were a representative sample of the larger population.

Table 4

Respondent Completion Rates for Facilitated Online Courses

Completion Status                                                                                              %         #
Yes, I completed all of the courses I took.                                                                    61.36%    27
I completed one or more of the courses I took, but I did not complete one or more of the other courses I took.   20.45%     9
No, I did not complete any of the courses I took.                                                              18.18%     8

Respondents who reported not completing one or more facilitated courses were asked additional questions about their reasons for non-completion. These questions were organized into five categories: job, personal, course, technology, and learning environment (Table 5).

Table 5

Reasons for Non-Completion of Facilitated Online Courses (N=16)

Personal Reasons                                                              %         #
Lack of time to complete the assignments                                      43.75%    7
Schedule conflicts with other personal activities                             37.50%    6
Family issues                                                                 12.50%    2
None of these reasons applies to me.                                          31.25%    5
Other personal reasons (please specify)                                                 1

Job-Related Reasons                                                           %         #
Job responsibilities changed during the course                                56.25%    9
My supervisor did not support my taking the course.                            0.00%    0
My work hours increased while taking the course                               37.50%    6
I had to use too much of my personal time to complete the course.             25.00%    4
There were too many distractions at work for me to complete the course.       56.25%    9
I asked for but did not receive comp time to do evening or weekend study.      0.00%    0
I learned what I needed to know for my job before the course ended.            0.00%    0
None of these reasons applies to me.                                          18.75%    3
Other job-related reasons (please specify)                                              4

Course-Related Reasons                                                        %         #
The course was too difficult/demanding                                        25.00%    4
The course was not demanding enough                                            0.00%    0
The course was more difficult than a comparable web-based or classroom course  12.50%    2
The group assignments were too difficult                                       6.25%    1
Lack of one-to-one interaction with the instructor(s)                          6.25%    1
Lack of one-to-one interaction with other students                             6.25%    1
The course didn’t meet my expectations                                         0.00%    0
None of these reasons applies to me.                                          75.00%   12
Other course-related reasons (please specify)                                           2

Technology-Related Reasons                                                    %         #
The Blackboard learning management system was too complicated                  0.00%    0
The online learning environment was too de-personalized                        0.00%    0
There was not enough technical support from [university] staff                 0.00%    0
There were too many technical problems.                                        0.00%    0
My technical skills were inadequate to do the program                          0.00%    0
None of these reasons applies to me.                                          100.00%  16
Other technology-related reasons (please specify)                                       2

Learning Environment-Related Reasons                                          %         #
Assignment scheduling                                                         12.50%    2
Figuring out how to find my way in an unfamiliar learning environment          0.00%    0
Having to do the course asynchronously instead of in real time                 6.25%    1
Having to do text-based discussions instead of oral discussions                6.25%    1
Presentation of information was too unordered and non-linear for me            6.25%    1
Physical isolation from the instructor and/or other students                  25.00%    4
Lack of support when I encountered difficulty                                   0.00%    0
Longer turnaround time for answering questions                                12.50%    2
I did not care for the online instructor                                        6.25%    1
None of these reasons applies to me.                                          68.75%   11
Other learning environment-related reasons (please specify)                             3


Job-related reasons were cited most frequently by far (32 responses). Changing job responsibilities and work distractions were both cited by more than half of the respondents. Six respondents said that their work hours increased while taking the course, and one-quarter of the respondents said that they had to use too much of their personal time to complete the course. Other responses noted conflicts with other time demands; one respondent noted an increase in work hours because of the course.

Lack of time to complete the assignments (seven responses) and schedule conflicts with other personal activities (six responses) were frequently cited as personal reasons for course non-completion.

Course-related reasons were a less important factor for most non- or partial completers. One-quarter of these respondents said the course was too difficult or demanding, and two respondents cited difficulty relative to a comparable web-based or classroom course. One of the “other” responses noted that the assignments were more “time-consuming” than difficult or demanding.

Learning environment-related reasons were also a less important factor for most respondents. One-quarter of respondents cited “physical isolation from the instructor and/or other students” as a factor, while assignment scheduling and longer turnaround time for answering questions were also cited by more than one respondent. Other responses cited a sense of distraction, a lack of time to devote proper attention, and a dislike for technology-mediated learning as reasons.

Technology-related reasons were not a factor at all for non-completions. The two “Other” comments indicated that technology was not a problem.

Survey respondents were also asked to estimate what percentage of the time they worked on the course at work, at home, or somewhere else. The purpose of this question was to determine whether or not faculty spent a significant amount of time studying at home or in other venues. The findings indicate that this was indeed the case (Figure 1):

  • Many respondents spent a lot of the time working on the course from home.

  • A significant proportion of respondents spent most of the time working on the course from home.

  • Relatively few respondents spent all of their time doing their course while at work.

  • A significant proportion of respondents spent some of their time working on the course from somewhere else.

Figure 1. Locations for working on online courses.

All respondents were also asked to respond to a series of statements about facilitated online courses in general. The purpose of these statements was to obtain more information related to course content, course difficulty, delivery setting, pre-course transparency, and time-related issues. All but one respondent completed this section of the survey (Table 6).

Table 6

Responses to Other Key Statements about Facilitated Online Courses (N=43)


Statement                                                                                                  SA      A       N       D       SD
The [online] course was more difficult than anticipated.                                                  16.3%   32.6%   18.6%   27.9%    4.7%
The course took more time to complete than anticipated.                                                   23.3%   41.9%   14.0%   18.6%    2.3%
Course content was relevant to my job.                                                                    48.8%   46.5%    4.7%    0.0%    0.0%
I would prefer to take the course in a classroom setting.                                                 27.9%   27.9%   16.3%   20.9%    7.0%
I had sufficient time during the workday to complete the online coursework.                                2.3%   30.2%    9.3%   41.9%   16.3%
I asked for and received comp time during the workday to complete the course.                              2.3%    4.7%   32.6%   34.9%   25.6%
I asked for and received comp time evenings and/or weekends to study.                                      0.0%    4.7%   37.2%   34.9%   23.3%
I was given access to the course before it started so I could figure out how it worked before I began.     9.3%   37.2%   32.6%   16.3%    4.7%
I had a good sense of how the course was structured before I began.                                       11.6%   39.5%   20.9%   23.3%    4.7%

(SA = Strongly Agree, A = Agree, N = Neutral, D = Disagree, SD = Strongly Disagree)

As expected, time was an issue for faculty. Almost two-thirds (65.2%) of respondents reported that the course took more time to complete than anticipated. A majority (58.2%) reported that they did not have sufficient time during the workday to complete the online coursework, while less than one-third (32.5%) agreed that they had sufficient time. Very few respondents asked for and received comp time during the workday (7.0%) or on evenings and/or weekends (4.7%) to study.

Course difficulty and learners’ initial experience with courses were other issues which may have contributed to non-completion. Almost half (48.9%) of all respondents thought that the facilitated course was more difficult than anticipated, while about one-quarter reported that they did not have access to the course before it started (21%) or did not have a good sense of how the course was structured before it began (28%).

Other responses indicated that relevance of course content was not an issue, but online course delivery might be one. A majority of respondents (55.8%) said that they would prefer to take a course in a classroom setting, and only about one-quarter (27.9%) disagreed with this statement.
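
The combined figures quoted in the preceding paragraphs are simple sums of the Table 6 response categories (Strongly Agree plus Agree, or Disagree plus Strongly Disagree). A minimal sketch of that arithmetic follows; the shorthand item labels in the dictionary are illustrative, not the survey's actual wording.

    # Response distributions copied from Table 6, as
    # (Strongly Agree, Agree, Neutral, Disagree, Strongly Disagree) percentages.
    table6 = {
        "more difficult than anticipated": (16.3, 32.6, 18.6, 27.9, 4.7),
        "took more time than anticipated": (23.3, 41.9, 14.0, 18.6, 2.3),
        "sufficient time during workday":  (2.3, 30.2, 9.3, 41.9, 16.3),
        "prefer classroom setting":        (27.9, 27.9, 16.3, 20.9, 7.0),
    }

    for item, (sa, a, n, d, sd) in table6.items():
        agreed, disagreed = sa + a, d + sd
        print(f"{item}: {agreed:.1f}% agreed, {disagreed:.1f}% disagreed")
        # e.g. "took more time than anticipated: 65.2% agreed, 20.9% disagreed"
        #      "sufficient time during workday: 32.5% agreed, 58.2% disagreed"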

In anticipation that this might be an issue, respondents were also asked to indicate whether or not they had a strong preference for classroom courses or asynchronous online courses. A substantial majority (60.5%) expressed a strong preference for classroom courses, only a few respondents (7.0%) strongly preferred asynchronous online courses, and about one-third (32.6%) had no strong preference for either delivery mode (Table 7). Although there were some differences in delivery mode preference by age range and gender, the number of responses is too small to discern any definitive patterns (Table 8).

Table 7

Respondents’ Delivery Mode Preference

Delivery Mode Preference (n=43)                     %         #
Strongly prefer asynchronous online course           6.98%     3
Strongly prefer face-to-face classroom course       60.47%    26
No strong preference either way                     32.56%    14

Table 8

Respondents’ Delivery Mode Preference by Age and Gender

Delivery Mode Preference (n=43)                     30-39   40-49   50-59   60+    Male   Female
Strongly prefer asynchronous online course            0       1       2      0       2       1
Strongly prefer face-to-face classroom course         3       5      14      4      16      10
No strong preference either way                       1       6       6      1      10       4
N=                                                    4      12      22      5      28      15

Non-Participant Faculty Survey Results

Of the 29 responses received, 27 were from non-participants and two were from previous participants in online facilitated courses. Responses from the non-participant faculty were based on prior perceptions rather than actual experience with online facilitated courses; however, many responses were clearly based on previous experience with technology-enabled courses, in particular web-based training courses. Their comments yielded useful information about why faculty did not sign up for online facilitated courses, as described in the “Reasons for Avoidance of Online Facilitated Courses” section below.

Discussion

Age was a factor in this study in that the average age of survey respondents is relatively high. However, there were too few responses in this study to draw conclusions based on age range. Gender did not appear to be a major factor in this study.

Respondents enrolled in these courses for practical reasons such as career goals and professional development, as would be expected in a faculty development program. Convenience and flexibility of schedule were important but secondary reasons for enrollment. Some faculty noted that they enrolled in these courses because they were required, although actual requirements vary as noted previously. Several responses also reflected respondents’ clear preference for classroom delivery: four respondents noted that they were taking the course online because it was only available in that delivery format, and another explicitly noted a preference for “classroom exposure.”

Reasons for Non-Completion

This study asked respondents to differentiate time spent on course work by location rather than by type of time (“work” vs. “personal”) because the organization’s policies allow faculty to request compensatory time during the workday or on evenings and weekends for activities such as taking professional development courses. Only four out of 43 survey respondents indicated that they requested and received comp time to work on their course either during the workday or on evenings or weekends, and only one respondent requested and received comp time both during the workday and on evenings or weekends. Although it is possible that faculty routinely ignore existing university policies, the pattern of survey responses suggests that the time faculty spent taking these courses was added on to their existing workload rather than compensated for by simply counting time spent doing a course as regular work duty time.

Survey results appear to corroborate other studies’ findings that time conflicts with work commitments result in increased course dropout rates.

Infringement on non-work hours. Survey findings indicate that taking these courses infringed on non-work hours for most faculty. About one-quarter (26.1%) of respondents reported that all of their course work time was spent at work. Over two-thirds (69.1%) of respondents reported spending at least some time working on the course from home (Figure 1), far higher than the 30.4% reported in the Wang et al. study. Of these, almost one-half (47.6%) reported spending at least 20% of their course work time at home, and almost one-quarter (21.4%) reported that at least 50% of their course work time was spent at home. Similarly, about two-fifths (40.5%) of respondents reported spending at least some time working on the course somewhere else, in most cases while on work travel or other assigned “temporary duty.” Less than one-third (32.6%) reported that they had sufficient time during the workday to complete their coursework.

In terms of relative proportion of time spent by location, respondents reported that they spent over one-third of their time working on the course at home (25%) or somewhere else (9%). While this is less than the 60% reported by Thalheimer (2004), it still represents a relatively large infringement on non-work hours.
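
Two different kinds of figures are being reported here: the proportion of respondents whose time split crossed a given threshold, and the average share of course-work time spent at each location. A minimal sketch of both calculations is shown below; the per-respondent records are made-up illustrations under the assumption that each respondent's estimates sum to 100%, not the study's data.

    # Hypothetical per-respondent splits of course-work time by location
    # (percentages summing to 100); illustrative only, not the study's data.
    respondents = [
        {"work": 100, "home": 0,  "other": 0},
        {"work": 60,  "home": 30, "other": 10},
        {"work": 50,  "home": 50, "other": 0},
        {"work": 70,  "home": 20, "other": 10},
    ]
    n = len(respondents)

    # Proportion of respondents spending at least some time at home (cf. the 69.1% above),
    # and, among those, the share spending at least 20% or 50% of their course-work time
    # at home (cf. the 47.6% and 21.4% figures).
    home_users = [r for r in respondents if r["home"] > 0]
    some_home = len(home_users) / n
    home_20 = sum(r["home"] >= 20 for r in home_users) / len(home_users)
    home_50 = sum(r["home"] >= 50 for r in home_users) / len(home_users)

    # Average share of course-work time by location across all respondents
    # (cf. the 25% at home and 9% somewhere else reported above).
    avg_share = {loc: sum(r[loc] for r in respondents) / n for loc in ("work", "home", "other")}

    print(some_home, home_20, home_50, avg_share)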

Increased work hours. Almost two-fifths (six out of 16, or 37.5%) of non-completers reported that their work hours increased while taking their course, and one narrative response reported that work hours increased because of the course. Completers were not asked about increased work hours per se. However, since taking these courses was part of faculty job duties, survey findings clearly indicate that completing these courses required an increase in job hours. Almost two-thirds (65.2%) of all respondents reported that their course took more time to complete than anticipated, and almost half of the non-completers (43.8%) reported that they lacked time to complete course assignments. These results suggest that many faculty chose not to complete courses rather than spend the additional time required to complete them.

Survey results also suggest that these courses were frequently assigned as an additional responsibility on top of regular work duties, which decreases the likelihood of completion (Takiya et al. 2005). Survey responses also strongly suggest that faculty workplaces were not effectively set up as supportive learning environments for taking online facilitated courses, which is another indicator for increased dropout rates.

Learners’ initial experience with the course may have been an additional contributing factor to dropout rates, given that about one-quarter of respondents reported issues with prior course access and transparency of course structure. The proportion of respondents who expressed a strong preference for classroom courses, combined with the number who reported that they were required to take these courses, also raises the question of whether delivery mode preference was also a contributing factor, although the participant survey did not explicitly address this question.

The importance of the above factors is further magnified by the fact that user proficiency and comfort with technology were non-factors for this population, in contrast to other studies in which these factors were major contributors to dropout rates.

Reasons for Avoidance of Online Facilitated Courses

The non-participant faculty survey asked respondents to identify reasons for not taking online facilitated courses and specific obstacle(s) which may have stopped them from doing so. These open-ended questions allowed respondents to offer responses which were not explicitly solicited. Nevertheless, several themes emerged from their responses which are consistent with the existing literature on attrition rates and with the findings of the faculty participant survey:

Lack of organizational support appeared to be a factor in faculty decisions not to sign up for online facilitated courses. One related issue was inconsistent policies about course requirements; in principle, the university mandates faculty professional development, but in practice the application of requirements varies among the regional campuses. As a result, while many online survey respondents reported taking the courses because they were required, several other faculty did not sign up for FPD courses because they were not required. Lack of marketing and available information about online facilitated courses was also noted in some responses.

Delivery mode preference was another important factor for non-participants, as about one-third of respondents stated or implied a strong preference for traditional classroom instruction. Some of these comments also suggested a broader distaste for all forms of technology-enabled learning, including the web-based training courses which the organization had instituted some years ago to replace classroom offerings.

However, the most important factor, cited by over two-thirds (69%) of respondents, was time-related issues: lack of time, time management, schedule conflicts, and so on. Some respondents cited simple time conflicts with work commitments, while other responses highlighted an unexpected relationship between delivery mode preference and time issues. In particular, many comments indicated a perception that taking online facilitated courses would require more faculty time rather than less. Even more interesting were comments in which a dislike of online (or, more broadly, technology-enabled) courses was coupled with the time issue. These comments, which indicated that online facilitated courses required more time to complete, directly contradict the prevailing wisdom that online courses save time by increasing convenience and flexibility, and they raise the question of why these respondents’ perception of time requirements in online courses is so different.

The most striking aspect of these comments was the concern with the amount of time required to complete online courses. These comments indicate that respondents experienced the prospect of taking an online facilitated course as an added responsibility, which parallels the findings reported by actual course participants. None of the online courses actually required four weeks’ worth of participants’ time; in fact, the course instructor estimated that completing an online facilitated course (20-40 hours) would take no more time than the equivalent classroom course (35-40 hours plus travel time). However, online courses required some participant time on a daily basis over a four-week period, which was apparently perceived as additional time. By contrast, the time required to complete classroom courses was factored into participants’ existing workload, so it was already accounted for and occurred within a shorter time frame.

The reason for this perception may be due in large part to the particular (and to some extent peculiar) organizational environment. The corporate university in this study has a well-structured system in which faculty allocate their work time into specific work categories several months in advance for the coming year. In almost all cases, faculty time for online facilitated courses was not allocated in advance, so taking an online facilitated course became an added responsibility. In addition, the different processes for creating faculty workload schedules and FPDP course schedules often result in schedule conflicts which reduce the ability of faculty to schedule FPD courses.

Other Factors Affecting Completion Rates

There are also other systemic factors that emerged from this study which may help account for the difference in completion rates between classroom and online facilitated courses.

Course completion requirements -- The baseline course completion rate for the FPD classroom courses is very high (~98%). This suggests that, for whatever reasons, being in attendance is the key course completion requirement for these FPD classroom courses, an observation corroborated by anecdotal comments from course instructors. By contrast, in the absence of physical attendance as a criterion, online facilitated course offerings were more likely to use assignment completion (e.g., submission of work products, discussion board participation, completion of other assignments) as a criterion for course completion.

Course entry/exit access -- Classroom students are to a large extent captive participants, especially if a course is offered off-site away from work. In this university’s case, both enrollment and withdrawal are easier for online facilitated courses than for classroom courses. Signing up for a classroom course required (re)arranging one’s schedule to make the time slot available, whereas signing up for an online facilitated course usually did not involve schedule changes, since most participants tried to fit the course into their existing schedules. This in turn made it easier for faculty to drop an online course if other time commitments intruded on their schedule while they were taking it. The absence of negative consequences for dropping an online course also abetted this situation, whereas lost work time and travel expenses were potential negative consequences of failing to complete a classroom course.

Respondents to both the participant and non-participant surveys did not mention the presence of visible top-level administrative support, implying that its absence was another possible factor. Responses also indicated that supportive learning environments and policies were likely absent, including logistical and cognitive elements such as time allocation, designated “learning space,” and management of office-related interruptions.

Two other possible factors are worth noting. The course instructor used the re-design process as an opportunity to improve the design of several of the online facilitated courses. Based on an assessment of student work products, the instructor believes that the resulting courses were more robust, with improved quality that produced more reflective and therefore deeper learning. It is not clear whether this factor contributed to the findings that almost half of the survey respondents found the online facilitated courses more difficult than anticipated, or that several non- or partial completers found the courses too difficult. However, other anecdotal evidence suggests that some course participants did not consider the improved learning worth the extra time and effort.

The other factor is delivery mode preference, which involves several issues: the belief that role modeling is the best or only appropriate way to learn how to be a teacher, an unwillingness to change or venture out of one’s comfort zone, or a lack of understanding of how online facilitated courses work. The common thread is a strongly held belief that the face-to-face interaction of classroom instruction is preferable to technology-enabled interaction. Many of the reasons for this belief have legitimate components, some of which have little to do with learning. Attending a classroom course can be a perk: an opportunity to network with colleagues, travel to a new city, or get a much-needed break from the office routine. Other reasons have more connection with learning, such as the belief in modeling as an effective learning strategy or a preference for the affordances of face-to-face interaction. Some comments also suggested an inability to distinguish between online facilitated courses and the web-based training courses with which most respondents were familiar. These experiences may have predisposed some faculty to have negative expectations about online facilitated learning.

Despite an ever-growing body of research literature which indicates that online facilitated courses offer all of these affordances – content retention, peer interaction, enhanced interaction with professional educators – non-participant comments illustrate that prior perceptions about online vs. classroom education constitute an obstacle which needs to be overcome in order to attract faculty to take online facilitated courses. Likewise, prior bias against technology-enabled course delivery may also contribute to increased dropout rates in conjunction with other factors.

Conclusion

Comparing completion rates across delivery modes is notoriously difficult. Yet the huge difference between reported completion rates in FPD classroom and online facilitated courses merited a closer examination. In this case, it appears that the disparity in completion rates between classroom and online facilitated courses occurred for systemic reasons. The most likely factor in decreasing course completion rates for online facilitated courses was the creation of time conflicts with work commitments. Most faculty took online facilitated courses as an added responsibility instead of having designated learning time comparable to what typically occurs for classroom courses. The resulting infringement on non-work hours and increase in learners’ job hours produced results which corroborate other studies’ findings that time conflicts with work commitments result in increased course attrition rates.

Issues with the learners’ initial experience with the online courses in terms of prior course access and transparency of course structure may also have contributed to dropout rates. Other organizational support issues such as inconsistent policies about course requirements and lack of appropriate “readiness marketing” (for example, informing prospective learners about time estimates to complete online courses) are other possible factors. In addition, less stringent course enrollment and withdrawal policies and more stringent completion requirements for online facilitated courses may also be factors. Finally, prior delivery mode preferences for some learners may have worked in conjunction with other factors to increase the likelihood of attrition.

Achieving comparable completion rates for online facilitated courses requires the development of support structures which are comparable to those already in place for classroom and web-based training courses. If such structures are not in place, the results of this study and other research findings indicate that a decrease in course completion rates is predictable. Corporate universities and other organizations which are contemplating an initiative which utilizes online facilitated courses should make sure that they provide adequate organizational support in terms of time allocation, learning space allocation, clear policies, learner readiness, and course design which provides learners with an appropriate initial course experience.

References

Bocchi, J., Eastman, J. K., & Swift, C. O. (2004). Retaining the online learner: Profile of students in an online MBA program and implications for teaching them. Journal of Education for Business, 79(4), 245-253.

Carr, S. (2000). As distance education comes of age, the challenge is keeping the students. The Chronicle of Higher Education, February 11, 2000, pp. A39-A41.

Diaz, D.P. (2002). Online drop rates revisited. The Technology Source, May/June 2002. Retrieved October 2, 2007 from: http://technologysource.org/article/online_drop_rates_revisited/

Diaz, D., & Cartnal, R. (2006). Term length as an indicator of attrition in online learning. Innovate, 2(5), May/June 2006. Retrieved October 2, 2007 from: http://www.innovateonline.info/index.php?view=login&id=196&next=index.php%3Fview%3Darticle%7Cid%3D196%7Caction%3Darticle

Dupin-Bryant, P. A. (2004). Pre-entry variables related to retention in online distance education. American Journal of Distance Education, 18(4), p. 199.

Frydenberg, J. (2007). Persistence in University Continuing Education Online Classes. Accepted for publication in the International Review of Research in Open and Distance Learning, 8 (3), www.irrodl.org.

Henke, H., and Russum, J. (2002). Factors Influencing Attrition Rates in a Corporate Distance Education Program. Education at a Distance 14 (11). Retrieved October 2, 2007 from: http://www.usdla.org/html/journal/NOV00_Issue/story03.htm

Kleinman, J., and Entin, E. B. (2002). Comparison of in-class and distance-learning students' performance and attitudes in an introductory computer science course. The Journal of Computing in Small Colleges, 17(6). Retrieved on October 2, 2007 from: http://www.middlesex.mass.edu/carnegie/Carnegie_02/Attitude_Comparison.pdf

Moore, J., Sener, J., & Fetzner, M. (2006). Getting better: ALN and student success. Journal of Asynchronous Learning Networks, 10(3), July 2006. Retrieved on October 2, 2007 from: http://www.sloan-c.org/publications/jaln/v10n3/v10n3_6moore.asp

O'Brien, B. & Renner, A. (2002). Online student retention: Can it be done? Paper presented at the ED-MEDIA 2002 World Conference on Educational Multimedia, Hypermedia & Telecommunications, Denver, CO. Retrieved October 2, 2007 from: http://www.eric.ed.gov/ERICDocs/data/ericdocs2sql/content_storage_01/0000019b/80/1b/19/90.pdf

Parker, A. (2003). Identifying predictors of academic persistence in distance education. USDLA Journal, 17 (1). Retrieved October 2, 2007 from: http://www.usdla.org/html/journal/JAN03_Issue/article06.html

Russell, T. L. (1998). No Significant Difference Phenomenon: As Reported in 248 Research Reports, Summaries, and Papers (4th ed.). Raleigh, NC: North Carolina State University. Cited in: Henke, H., & Russum, J. (2002) (see complete citation above).

Royer, M. (2006). Student Success and Retention in Online Courses, Bellevue Community College, November 2006. Retrieved July 2, 2007 from: http://www.sbctc.ctc.edu/docs/data/stdt_success_retention_in_online_courses_bcc.pdf

Simpson, V. (2004). The effect on staff perceptions of online learning when using a non-traditional approach to staff development. Proceedings Networked Learning Conference 2002, Sheffield University, UK. Retrieved October 2, 2007 from: http://www.networkedlearningconference.org.uk/past/nlc2002/proceedings/papers/37.htm

Takiya, S., Archbold, J., Berge, Z. (2005). Flexible Training’s Intrusion on Work/Life Balance. Turkish Online Journal of Distance Education, 6(2), April 2005. Retrieved October 2, 2007 from: http://tojde.anadolu.edu.tr/tojde18/articles/article5.htm

Thalheimer, W. (2004). E-Learning’s Burden on Work-Life Balance: What we can do. Retrieved July 2, 2007 from http://www.work-learning.com/Catalog/index.htm

Tyler-Smith, K. (2006). Early Attrition among First Time e-Learners: A Review of Factors that Contribute to Drop-out, Withdrawal and Non-completion Rates of Adult Learners undertaking eLearning Programs. Journal of Online Learning and Teaching, 2(2), June 2006. Retrieved October 2, 2007 from: http://jolt.merlot.org/Vol2_No2_TylerSmith.htm

Wang, G., Foucar-Szocki, D., Griffen, O., O’Connor, C., & Sceiford, E. (2003). Departure, Abandonment, and Dropout of E-learning: Dilemma and Solutions. James Madison University, October 2003. Retrieved October 2, 2007 from: http://www.masie.com/researchgrants/2003/JMU_Final_Report.pdf

Whipp, J. L., & Chiarelli, S. (2004) Self-regulation in a Web-based course: A case study. Educational Technology Research and Development, 52(4), 5-22.

Willging, P. & Johnson, S. (2004). Factors that influence students' decision to dropout of online courses. Journal of Asynchronous Learning Networks 8(4), October 2004. Retrieved October 2, 2007, from: http://www.aln.org/publications/jaln/v8n4/v8n4_willging.asp

Zolkos, R. (1999). Online education getting good grades: Despite high attrition, online courses seen as a possible alternative to the classroom. Business Insurance, Oct 1999, p. 40B.

About the Authors

John Sener heads Sener Learning Services, a leader in supporting evolution of technology-enabled learning through knowledge development and dissemination, evaluation, strategic planning, and learning design. As a pioneer in online education, he provides a unique mixture of broad practical experience and academic expertise. As Director of Special Initiatives for the Sloan Consortium, Sener has been an Effective Practices editor since 2002 and has served on the Journal of Asynchronous Learning Networks editorial board since its inception in 1996. He also served as a member of the Council on Academic Management for eArmyU. Sener's 25+ year career in education and training encompasses a unique mélange of learning experiences. He holds degrees from Johns Hopkins University and Oberlin College.

email:   jsener@senerlearning.com  www.senerlearning.com

Robert L. Hawkins, Ed.D. (robert.hawkins12@verizon.net) is a retired instructional design specialist from the federal government. He is a professional speaker at many conferences and provides educational consulting and staff development programs for a variety of organizations. Dr. Hawkins has headed several project management teams for infusing product- and performance-based asynchronous learning into selected college and university courses in both the public and private sectors. Dr. Hawkins is also an Assistant Professor Adjunct in the Adult Studies Program, Virginia Wesleyan College, where he teaches courses in Speech Communication and Organizational Communication.
