Editor’s Note: This is a study to determine the reasons for low completion rates in faculty training courses at a corporate university. It identifies forces external to the course that affect completion rates and dispels the assumption that online courses are inferior in content or pedagogy.
Table 1
Factors Affecting Completion Rates
Key Variable | Results |
Headcount for all course offerings | 76 |
# of enrollments for all course offerings | 100 |
% of completions by course enrollment | 57% |
Headcount by course | 92 |
% of completions by course headcount | 62% |
% of faculty who completed any course [headcount] | 59% |
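The distinction between enrollment-based and headcount-based completion rates above can be illustrated with a small calculation. This is an illustrative sketch only: the absolute count of 57 completions is inferred from "57% of 100 enrollments" and is not stated directly in the table.

```python
# Illustrative check of the completion rates in Table 1.
# The number of completions (57) is inferred from 57% of 100 enrollments;
# it is not stated directly in the table.
enrollments = 100       # enrollments across all course offerings
course_headcount = 92   # headcount summed by course
completions = 57        # inferred: 57% of 100 enrollments

pct_by_enrollment = completions / enrollments * 100
pct_by_headcount = completions / course_headcount * 100

print(f"Completions by enrollment: {pct_by_enrollment:.0f}%")
print(f"Completions by headcount:  {pct_by_headcount:.0f}%")
```

The same 57 completions yield 57% against enrollments but 62% against course headcount, since faculty who enrolled in a course more than once inflate the enrollment denominator.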
Almost two-thirds of the respondents were male. Over 60% were aged 50 and over, and more than 90% were age 40 and over, reflecting the university’s practice of hiring faculty with substantial prior experience in their areas of expertise (Table 2).
Table 2
Respondent Demographic Characteristics
Gender | % | # |
Male | 65.91% | 29 |
Female | 34.09% | 15 |
Age by Gender:
Age | % | # | Male | Female |
20-29 | 0.00% | 0 | 0 | 0 |
30-39 | 9.09% | 4 | 2 | 2 |
40-49 | 29.55% | 13 | 9 | 4 |
50-59 | 50.00% | 22 | 13 | 9 |
60+ | 11.36% | 5 | 5 | 0 |
Respondents reported a variety of reasons for enrolling in a facilitated online course. Almost one-half cited fit with career goals and/or other professional development reasons. Over one-third cited convenience, and almost one-third cited flexibility of schedule. Over one-third of respondents also reported other reasons for enrolling; the most commonly cited were to meet faculty certification requirements (eight responses) and because the course was only offered in an online format (four responses; Table 3).
Table 3
Reasons for Enrolling in a Facilitated Online Course
Reasons for Enrolling | % | # |
Flexibility of schedule | 31.82% | 14 |
Convenience of taking the course online | 36.36% | 16 |
Fit well with career goals | 47.73% | 21 |
Other professional development reasons | 45.45% | 20 |
Other (please specify) | 36.36% | 16 |
Meet training/certification requirements | 8 | |
Only offered in online facilitated format | 4 | |
Preparing to develop a similar course | 1 | |
Support a curriculum development project | 1 | |
Meet rank advancement requirements | 1 | |
I am an online instructor | 1 | |
[I was] encouraged to take the course | 1 | |
Wanted to become familiar with this tool | 1 |
Of the 44 respondents, 27 were “completers” and 17 were non- or partial completers (Table 4). The reported numbers indicate that 61% of completers responded to the survey, versus 46% of non- or partial completers. However, the survey allowed respondents to self-identify as course completers based on the course rather than the course offering, which could account for most or all of this variance, since several faculty successfully completed a course on a second attempt after not completing it on the first. Because the survey was anonymous, it is impossible to tell how many of these respondents self-identified as completers rather than partial completers. Compared to actual course completion data, then, course completers appear to have responded in about the same proportion as non- or partial completers, so there is a high likelihood that respondents were a representative sample of the larger population.
Table 4
Respondent Completion Rates for Facilitated Online Courses
Completion Status | % | # |
Yes, I completed all of the courses I took. | 61.36% | 27 |
I completed one or more of the courses I took, but I did not complete one or more of the other courses I took. | 20.45% | 9 |
No, I did not complete any of the courses I took. | 18.18% | 8 |
Respondents who reported not completing one or more facilitated courses were asked additional questions about their reasons for non-completion. These questions were organized into five categories: job, personal, course, technology, and learning environment (Table 5).
Table 5
Reasons for Non-Completion of Facilitated Online Courses (N=16)
Personal Reasons | % | # |
Lack of time to complete the assignments | 43.75% | 7 |
Schedule conflicts with other personal activities | 37.50% | 6 |
Family issues | 12.50% | 2 |
None of these reasons applies to me. | 31.25% | 5 |
Other personal reasons (please specify) | 1 | |
Job-Related Reasons | % | # |
Job responsibilities changed during the course | 56.25% | 9 |
My supervisor did not support my taking the course. | 0.00% | 0 |
My work hours increased while taking the course | 37.50% | 6 |
I had to use too much of my personal time to complete the course. | 25.00% | 4 |
There were too many distractions at work for me to complete the course. | 56.25% | 9 |
I asked for but did not receive comp time to do evening or weekend study. | 0.00% | 0 |
I learned what I needed to know for my job before the course ended. | 0.00% | 0 |
None of these reasons applies to me. | 18.75% | 3 |
Other job-related reasons (please specify) | 4 | |
Course-Related Reasons | % | # |
The course was too difficult/demanding | 25.00% | 4 |
The course was not demanding enough | 0.00% | 0 |
The course was more difficult than a comparable web-based or classroom course | 12.50% | 2 |
The group assignments were too difficult | 6.25% | 1 |
Lack of one-to-one interaction with the instructor(s) | 6.25% | 1 |
Lack of one-to-one interaction with other students | 6.25% | 1 |
The course didn’t meet my expectations | 0.00% | 0 |
None of these reasons applies to me. | 75.00% | 12 |
Other course-related reasons (please specify) | 2 | |
Technology-Related Reasons | % | # |
The Blackboard learning management system was too complicated | 0.00% | 0 |
The online learning environment was too de-personalized | 0.00% | 0 |
There was not enough technical support from [university] staff | 0.00% | 0 |
There were too many technical problems. | 0.00% | 0 |
My technical skills were inadequate to do the program | 0.00% | 0 |
None of these reasons applies to me. | 100.00% | 16 |
Other technology-related reasons (please specify) | 2 | |
Learning Environment-Related Reasons | % | # |
Assignment scheduling | 12.50% | 2 |
Figuring out how to find my way in an unfamiliar learning environment | 0.00% | 0 |
Having to do the course asynchronously instead of in real time | 6.25% | 1 |
Having to do text-based discussions instead of oral discussions | 6.25% | 1 |
Presentation of information was too unordered and non-linear for me | 6.25% | 1 |
Physical isolation from the instructor and/or other students | 25.00% | 4 |
Lack of support when I encountered difficulty | 0.00% | 0 |
Longer turnaround time for answering questions | 12.50% | 2 |
I did not care for the online instructor | 6.25% | 1 |
None of these reasons applies to me. | 68.75% | 11 |
Other learning environment-related reasons (please specify) | 3 |
Job-related reasons were cited most frequently by far (32 responses). Changing job responsibilities and work distractions were both cited by more than half of the respondents. Six respondents said that their work hours increased while taking the course, and one-quarter of the respondents said that they had to use too much of their personal time to complete the course. Other responses noted conflicts with other time demands; one respondent noted an increase in work hours because of the course.
Lack of time to complete the assignments (seven responses) and schedule conflicts with other personal activities (six responses) were frequently cited as personal reasons for course non-completion.
Course-related reasons were a less important factor for most non- or partial completers. One-quarter of these respondents said the course was too difficult or demanding, and two respondents cited difficulty relative to a comparable web-based or classroom course. One of the “other” responses noted that the assignments were more “time-consuming” than difficult or demanding.
Learning environment-related reasons were also a less important factor for most respondents. One-quarter of respondents cited “physical isolation from the instructor and/or other students” as a factor, while assignment scheduling and longer turnaround time for answering questions were also cited by more than one respondent. Other responses cited a sense of distraction, a lack of time to devote proper attention, and a dislike for technology-mediated learning as reasons.
Technology-related reasons were not a factor at all for non-completions. The two “Other” comments indicated that technology was not a problem.
Survey respondents were also asked to estimate what percentage of the time they worked on the course at work, at home, or somewhere else. The purpose of this question was to determine whether or not faculty spent a significant amount of time studying at home or other venues. The findings indicate that this was indeed the case (Chart 1):
Many respondents spent much of their time working on the course at home.
A significant proportion of respondents spent most of their time working on the course at home.
Relatively few respondents spent all of their course work time at work.
A significant proportion of respondents spent some of their time working on the course somewhere else.
All respondents were also asked to respond to a series of statements about facilitated courses in general. The purpose of these statements was to obtain more information about course content, course difficulty, delivery setting, pre-course transparency, and time-related issues. All but one respondent completed this section of the survey (Table 6).
Table 6
Responses to Other Key Statements about Facilitated Online Courses (N=43)
| Strongly Agree | Agree | Neutral | Disagree | Strongly Disagree |
The [online] course was more difficult than anticipated. | 16.3% | 32.6% | 18.6% | 27.9% | 4.7% |
The course took more time to complete than anticipated. | 23.3% | 41.9% | 14.0% | 18.6% | 2.3% |
Course content was relevant to my job. | 48.8% | 46.5% | 4.7% | 0.0% | 0.0% |
I would prefer to take the course in a classroom setting. | 27.9% | 27.9% | 16.3% | 20.9% | 7.0% |
I had sufficient time during the workday to complete the online coursework. | 2.3% | 30.2% | 9.3% | 41.9% | 16.3% |
I asked for and received comp time during the workday to complete the course. | 2.3% | 4.7% | 32.6% | 34.9% | 25.6% |
I asked for and received comp time evenings and/or weekends to study. | 0.0% | 4.7% | 37.2% | 34.9% | 23.3% |
I was given access to the course before it started so I could figure out how it worked before I began. | 9.3% | 37.2% | 32.6% | 16.3% | 4.7% |
I had a good sense of how the course was structured before I began. | 11.6% | 39.5% | 20.9% | 23.3% | 4.7% |
As expected, time was an issue for faculty. Almost two-thirds (65.2%) of respondents reported that the course took more time to complete than anticipated. A majority (58.2%) reported that they did not have sufficient time during the workday to complete the online coursework, while less than one-third (32.5%) agreed that they had sufficient time. Very few respondents asked for and received comp time during the workday (7.0%) or on evenings and/or weekends (4.7%) to study.
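The aggregate figures in this discussion can be reproduced by collapsing Table 6’s five-point scale into agree/disagree totals. A minimal sketch, with percentages copied directly from the table:

```python
# Collapsing Table 6's five-point Likert scale into agree/disagree totals.
# Each tuple holds (Strongly Agree, Agree, Neutral, Disagree, Strongly Disagree)
# percentages copied from the table.
rows = {
    "Course took more time than anticipated": (23.3, 41.9, 14.0, 18.6, 2.3),
    "Sufficient time during workday":         (2.3, 30.2, 9.3, 41.9, 16.3),
}
for statement, (sa, a, n, d, sd) in rows.items():
    # "Agree" = Strongly Agree + Agree; "Disagree" = Disagree + Strongly Disagree
    print(f"{statement}: agree {sa + a:.1f}%, disagree {d + sd:.1f}%")
```

This yields the 65.2% (more time than anticipated) and 32.5%/58.2% (sufficient workday time) figures cited in the text.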
Course difficulty and learners’ initial experience with the courses were other issues which may have contributed to non-completion. Almost half (48.9%) of all respondents thought that the facilitated course was more difficult than anticipated, about one-fifth (21%) reported that they did not have access to the course before it started, and over one-quarter (28%) reported that they did not have a good sense of how the course was structured before it began.
Other responses indicated that relevance of course content was not an issue, but online course delivery might be one. A majority of respondents (55.8%) said that they would prefer to take a course in a classroom setting, and only about one-quarter (27.9%) disagreed with this statement.
In anticipation that this might be an issue, respondents were also asked to indicate whether or not they had a strong preference for classroom courses or asynchronous online courses. A substantial majority (60.5%) expressed a strong preference for classroom courses, while only a few respondents (7.0%) strongly preferred asynchronous online courses, and relatively few (32.6%) did not have a strong preference for either delivery mode (Table 7). Although there were some differences in delivery mode preference by age range and gender, the number of responses is too small to discern any definitive patterns (Table 8).
Table 7
Respondents’ Delivery Mode Preference
Delivery Mode Preference (n=43) | % | # |
Strongly prefer asynchronous online course | 6.98% | 3 |
Strongly prefer face-to-face classroom course | 60.47% | 26 |
No strong preference either way | 32.56% | 14 |
Table 8
Respondents’ Delivery Mode Preference by Age and Gender
Delivery Mode Preference by Age (n=43) | 30-39 | 40-49 | 50-59 | 60+ | Male | Female |
Strongly prefer asynchronous online course | 0 | 1 | 2 | 0 | 2 | 1 |
Strongly prefer face-to-face classroom course | 3 | 5 | 14 | 4 | 16 | 10 |
No strong preference either way | 1 | 6 | 6 | 1 | 10 | 4 |
N= | 4 | 12 | 22 | 5 | 28 | 15 |
Non-Participant Faculty Survey Results
Of the 29 responses received, 27 were from non-participants and two were previous participants in online facilitated courses. Responses from the non-participant faculty were based on prior perceptions but not actual experience with online facilitated courses. However, many responses were clearly based on previous experience with technology-enabled courses, in particular web-based training courses. Their comments yielded more useful information about reasons why faculty did not sign up for online facilitated courses, as described in the “Responses from Non-Participant Faculty” section below.
Age was a factor in this study in that the average age of survey respondents is relatively high. However, there were too few responses in this study to draw conclusions based on age range. Gender did not appear to be a major factor in this study.
Respondents enrolled in these courses for practical reasons such as career goals and professional development, as would be expected for a faculty development program. Convenience and flexibility of schedule were important but secondary reasons for enrollment. Some faculty noted that they enrolled in these courses because they were required, although actual requirements vary as noted previously. Several responses also reflected respondents’ clear preference for classroom delivery: four respondents noted that they were taking the course online because it was only available in that delivery format, and another explicitly noted a preference for “classroom exposure.”
This study asked respondents to differentiate time spent on course work by location rather than by type of time (“work” vs. “personal”) because the organization’s policies allow faculty to request compensatory time during the workday or on evenings and weekends for activities such as taking professional development courses. Only four out of 43 survey respondents indicated that they requested and received comp time to work on their course either during the workday or on evenings or weekends, and only one respondent requested and received comp time both during the workday and on evenings or weekends. Although it is possible that faculty routinely ignore existing university policies, the pattern of survey responses suggests that the time faculty spent taking these courses was added on to their existing workload rather than compensated for by simply counting course time as regular work duty time.
Survey results appear to corroborate other studies’ findings that time conflicts with work commitments result in increased course dropout rates.
Infringement on non-work hours. Survey findings indicate that taking these courses infringed on non-work hours for most faculty. About one-quarter (26.1%) of respondents reported that all of their course work time was spent at work. Over two-thirds (69.1%) of respondents reported spending at least some time working on the course from home (Chart 1), which is far higher than the 30.4% reported in the Wang et al. study. Of these, almost one-half (47.6%) reported spending at least 20% of their course work time at home, and almost one-quarter (21.4%) reported that at least 50% of their course work time was spent at home. Similarly, about two-fifths (40.5%) of respondents reported spending at least some time working on the course somewhere else, in most cases while on work travel or other assigned “temporary duty.” Less than one-third (32.6%) reported that they had sufficient time during the workday to complete their coursework.
In terms of relative proportion of time spent by location, respondents reported that they spent over one-third of their time working on the course at home (25%) or somewhere else (9%). While this is less than the 60% reported by Thalheimer (2004), it still represents a relatively large infringement on non-work hours.
Increased work hours. Almost two-fifths (six out of 16, or 37.5%) of non-completers reported that their work hours increased while taking their course, and one narrative response reported that work hours increased because of the course. Completers were not asked about increased work hours per se. However, since taking these courses was part of faculty job duties, survey findings clearly indicate that completing these courses entailed an increase in job hours. Almost two-thirds (65.1%) of all respondents reported that their course took more time to complete than anticipated, and almost half of the non-completers (43.8%) reported that they lacked time to complete course assignments. These results suggest that many faculty chose not to complete courses rather than spend the additional time required to complete them.
Survey results also suggest that these courses were frequently assigned as an additional responsibility on top of regular work duties, which decreases the likelihood of completion (Takiya et al. 2005). Survey responses also strongly suggest that faculty workplaces were not effectively set up as supportive learning environments for taking online facilitated courses, which is another indicator for increased dropout rates.
Learners’ initial experience with the course may have been an additional contributing factor to dropout rates, given that about one-quarter of respondents reported issues with prior course access and transparency of course structure. The proportion of respondents who expressed a strong preference for classroom courses, combined with the number who reported that they were required to take these courses, also raises the question of whether delivery mode preference was also a contributing factor, although the participant survey did not explicitly address this question.
The importance of the above factors is further magnified by the fact that user proficiency and comfort with technology were non-factors for this population, in contrast to other studies in which these factors were major contributors to dropout rates.
The non-participant faculty survey asked respondents to identify reasons for not taking online facilitated courses and specific obstacle(s) which may have stopped them from doing so. These open-ended questions allowed respondents to offer responses which were not explicitly solicited. Nevertheless, several themes emerged from their responses which are consistent with the existing literature on attrition rates and with the findings of the faculty participant survey:
Lack of organizational support appeared to be a factor in faculty decisions not to sign up for online facilitated courses. One related issue was inconsistent policies about course requirements; in principle, the university mandates faculty professional development, but in practice the application of requirements varies among the regional campuses. As a result, while many online survey respondents reported taking the courses because they were required, several other faculty did not sign up for FPD courses because they were not required. Lack of marketing and available information about online facilitated courses was also noted in some responses.
Delivery mode preference was another important factor for non-participants, as about one-third of respondents stated or implied a strong preference for traditional classroom instruction. Some of these comments also suggested a broader distaste for all forms of technology-enabled learning, including the web-based training courses which the organization had instituted some years ago to replace classroom offerings.
However, the most important factor, cited by over two-thirds (69%) of respondents, was time-related issues – lack of time, time management, schedule conflicts, etc. Some respondents cited simple time conflicts with work commitments, while other responses highlighted an unexpected relationship between delivery mode preference and time issues. In particular, many comments indicated a perception that taking online facilitated courses would require more faculty time rather than less. Even more interesting were comments in which a dislike of online (or more broadly technology-enabled) courses was coupled with the time issue. These comments, which held that online facilitated courses required more time to complete, directly contradict the prevailing wisdom that online courses save time by increasing convenience and flexibility, which raises the question of why these respondents’ perception of time requirements in online courses is so different.
The most striking aspect of these comments was the concern with the amount of time required to complete online courses. These comments indicate that respondents experience the prospect of taking online facilitated courses as an added responsibility, which parallels the findings reported by actual course participants. None of the online courses actually required four weeks’ worth of participants’ time; in fact, the course instructor estimated that it would take as much or less time to complete an online facilitated course (20-40 hours) relative to the equivalent classroom course (35-40 hours + travel time). However, online courses required some participant time on a daily basis for a four-week time period, which was apparently perceived as additional time. By contrast, the time required for completion of classroom courses was factored into participants’ existing workload, so it was already accounted for and occurred within a shorter time frame.
The reason for this perception may be in large part due to the particular (and to some extent peculiar) organizational environment. The corporate university in this study has a well-structured system in which faculty allocate their work time into specific work categories several months in advance for the coming year. In almost all cases, faculty time for online facilitated courses was not allocated in advance, so taking an online facilitated course became an added responsibility in most cases. In addition, different processes for creating faculty workload schedules and FPDP course schedules often result in schedule conflicts which reduce the ability of faculty to schedule FPD courses.
There are also other systemic factors that emerged from this study which may help account for the difference in completion rates between classroom and online facilitated courses.
Course completion requirements -- The baseline course completion rate for the FPD classroom courses is very high (~98%). This suggests that, for whatever reasons, being in attendance is the key course completion requirement for these FPD classroom courses, an observation corroborated by anecdotal comments from course instructors. By contrast, in the absence of physical attendance as a criterion, online facilitated course offerings were more likely to use assignment completion (e.g., submission of work products, discussion board participation, completion of other assignments) as a criterion for course completion.
Course entry/exit access -- Classroom students are to a large extent captive participants, especially if a course is offered off-site away from work. In this university’s case, both enrollment and withdrawal are easier for online facilitated courses than for classroom courses. Signing up for a classroom course required schedule (re-)arrangement to make the time slot available, whereas signing up for an online facilitated course usually did not involve schedule changes, since most participants tried to fit the course into their existing schedules. This in turn made it easier for faculty to drop an online course if other time commitments intruded on their schedule while they were taking the course. The absence of negative consequences for dropping an online course also abetted this situation, whereas lost work time and travel expenses were potential negative consequences for failing to complete a classroom course.
Respondents from both participant and non-participant surveys did not mention the presence of visible top-level administrative support, implying that its absence was another possible factor. Responses also indicated that supportive learning environments and policies were likely absent, including logistical and cognitive elements such as time allocation, designated “learning space,” management of office-related interruptions, etc.
Two other possible factors are worth noting. The course instructor used the re-design process as an opportunity to improve the design of several of the online facilitated courses. Based on an assessment of student work products, the instructor believes that the resulting courses were more robust, with improved quality that produced more reflective and therefore deeper learning. It is not clear whether this factor contributed to the findings that almost half of the survey respondents found the online facilitated courses more difficult than anticipated, or that several non-/partial completers reported finding the courses to be too difficult. However, other anecdotal evidence suggests that some course participants disagreed that improved learning was worth extra time and effort.
The other factor relates to delivery mode preference, which involves several issues: the belief that role modeling is the best or only appropriate way to learn how to be a teacher, unwillingness to change or venture out of one’s comfort zone, or a lack of understanding of how online facilitated courses work. The common factor is a strongly held belief that the face-to-face interaction of classroom instruction is preferable to technology-enabled interaction. Many of the reasons for this have legitimate components, some of which have little to do with learning. Attending a classroom course can be a perk: an opportunity to network with colleagues, travel to a new city, or get a much-needed break from the office routine. Other reasons may have more connection with learning, such as the belief in modeling as an effective learning strategy or a preference for the affordances of face-to-face interaction. Some comments also suggested an inability to distinguish between online facilitated courses and the web-based training courses with which most respondents were familiar. These experiences may have predisposed some faculty to have negative expectations about online facilitated learning.
Despite an ever-growing body of research literature which indicates that online facilitated courses offer all of these affordances – content retention, peer interaction, enhanced interaction with professional educators – non-participant comments illustrate that prior perceptions about online vs. classroom education constitute an obstacle which needs to be overcome in order to attract faculty to take online facilitated courses. Likewise, prior bias against technology-enabled course delivery may also contribute to increased dropout rates in conjunction with other factors.
Comparing completion rates across delivery modes is notoriously difficult. Yet the huge difference between reported completion rates in FPD classroom and online facilitated courses merited a closer examination. In this case, it appears that the disparity in completion rates between classroom and online facilitated courses occurred for systemic reasons. The most likely factor in decreasing course completion rates for online facilitated courses was the creation of time conflicts with work commitments. Most faculty took online facilitated courses as an added responsibility instead of having designated learning time comparable to what typically occurs for classroom courses. The resulting infringement on non-work hours and increase in learners’ job hours produced results which corroborate other studies’ findings that time conflicts with work commitments result in increased course attrition rates.
Issues with the learners’ initial experience with the online courses in terms of prior course access and transparency of course structure may also have contributed to dropout rates. Other organizational support issues such as inconsistent policies about course requirements and lack of appropriate “readiness marketing” (for example, informing prospective learners about time estimates to complete online courses) are other possible factors. In addition, less stringent course enrollment and withdrawal policies and more stringent completion requirements for online facilitated courses may also be factors. Finally, prior delivery mode preferences for some learners may have worked in conjunction with other factors to increase the likelihood of attrition.
Achieving comparable completion rates for online facilitated courses requires the development of support structures which are comparable to those already in place for classroom and web-based training courses. If such structures are not in place, the results of this study and other research findings indicate that a decrease in course completion rates is predictable. Corporate universities and other organizations which are contemplating an initiative which utilizes online facilitated courses should make sure that they provide adequate organizational support in terms of time allocation, learning space allocation, clear policies, learner readiness, and course design which provides learners with an appropriate initial course experience.
Bocchi, J., Eastman, J. K., & Swift, C. O. (2004). Retaining the online learner: Profile of students in an online MBA program and implications for teaching them. Journal of Education for Business, 79(4), 245-253.
Carr, S. (2000). As distance education comes of age, the challenge is keeping the students. The Chronicle of Higher Education, February 11, 2000, pp. A39-41.
Diaz, D. P. (2002). Online drop rates revisited. The Technology Source, May/June 2002. Retrieved October 2, 2007 from: http://technologysource.org/article/online_drop_rates_revisited/
Diaz, D., & Cartnal, R. (2006). Term length as an indicator of attrition in online learning. Innovate, 2(5), May/June 2006. Retrieved October 2, 2007 from: http://www.innovateonline.info/index.php?view=login&id=196&next=index.php%3Fview%3Darticle%7Cid%3D196%7Caction%3Darticle
Dupin-Byrant, P. A. (2004). Pre-entry variables related to retention in online distance education. American Journal of Distance Education, 18 (4), p. 199.
Frydenberg, J. (2007). Persistence in University Continuing Education Online Classes. Accepted for publication in the International Review of Research in Open and Distance Learning, 8 (3), www.irrodl.org.
Henke, H., & Russum, J. (2000). Factors influencing attrition rates in a corporate distance education program. Education at a Distance, 14(11). Retrieved October 2, 2007 from: http://www.usdla.org/html/journal/NOV00_Issue/story03.htm
Kleinman, J., & Entin, E. B. (2002). Comparison of in-class and distance-learning students' performance and attitudes in an introductory computer science course. The Journal of Computing in Small Colleges, 17(6). Retrieved October 2, 2007 from: http://www.middlesex.mass.edu/carnegie/Carnegie_02/Attitude_Comparison.pdf
Moore, J., Sener, J., & Fetzner, M. (2006). Getting better: ALN and student success. Journal of Asynchronous Learning Networks, 10(3), July 2006. Retrieved October 2, 2007 from: http://www.sloan-c.org/publications/jaln/v10n3/v10n3_6moore.asp
O'Brien, B. & Renner, A. (2002). Online student retention: Can it be done? Paper presented at the ED-MEDIA 2002 World Conference on Educational Multimedia, Hypermedia & Telecommunications, Denver, CO. Retrieved October 2, 2007 from: http://www.eric.ed.gov/ERICDocs/data/ericdocs2sql/content_storage_01/0000019b/80/1b/19/90.pdf
Parker, A. (2003). Identifying predictors of academic persistence in distance education. USDLA Journal, 17 (1). Retrieved October 2, 2007 from: http://www.usdla.org/html/journal/JAN03_Issue/article06.html
Russell, T. L. (1998). The No Significant Difference Phenomenon: As Reported in 248 Research Reports, Summaries, and Papers (4th ed.). Raleigh, NC: North Carolina State University.
Royer, M. (2006). Student Success and Retention in Online Courses,
Simpson, V. (2004). The effect on staff perceptions of online learning when using a non-traditional approach to staff development. Proceedings Networked Learning Conference 2002,
Takiya, S., Archbold, J., & Berge, Z. (2005). Flexible training’s intrusion on work/life balance. Turkish Online Journal of Distance Education, 6(2), April 2005. Retrieved October 2, 2007 from: http://tojde.anadolu.edu.tr/tojde18/articles/article5.htm
Thalheimer, W. (2004). E-Learning’s Burden on Work-Life Balance: What we can do. Retrieved July 2, 2007 from http://www.work-learning.com/Catalog/index.htm
Tyler-Smith, K. (2006). Early Attrition among First Time e-Learners: A Review of Factors that Contribute to Drop-out, Withdrawal and Non-completion Rates of Adult Learners undertaking eLearning Programs. Journal of Online Learning and Teaching, 2(2), June 2006. Retrieved October 2, 2007 from: http://jolt.merlot.org/Vol2_No2_TylerSmith.htm
Wang, G., Foucar-Szocki, D., Griffen, O., O’Connor, C., & Sceiford, E. (2003). Departure, Abandonment, and Dropout of E-learning: Dilemma and Solutions. James Madison University.
Whipp, J. L., & Chiarelli, S. (2004) Self-regulation in a Web-based course: A case study. Educational Technology Research and Development, 52(4), 5-22.
Willging, P., & Johnson, S. (2004). Factors that influence students' decision to dropout of online courses. Journal of Asynchronous Learning Networks, 8(4), October 2004. Retrieved October 2, 2007 from: http://www.aln.org/publications/jaln/v8n4/v8n4_willging.asp
Zolkos, R. (1999). Online education getting good grades: Despite high attrition, online courses seen as a possible alternative to the classroom. Business Insurance, Oct 1999, p. 40B.
About the Authors
John Sener heads Sener Learning Services, a leader in supporting evolution of technology-enabled learning through knowledge development and dissemination, evaluation, strategic planning, and learning design. As a pioneer in online education, he provides a unique mixture of broad practical experience and academic expertise. As Director of Special Initiatives for the Sloan Consortium, Sener has been an Effective Practices editor since 2002 and has served on the Journal of Asynchronous Learning Networks editorial board since its inception in 1996. He also served as a member of the Council on Academic Management for eArmyU. Sener's 25+ year career in education and training encompasses a unique mélange of learning experiences. He holds degrees from
email: jsener@senerlearning.com www.senerlearning.com