February 2010 Index

Editor’s Note: In a global society we must be aware of national trends and develop higher standards for teaching and learning. This study embraces instructional design, trends in distance education, and the increase in national and global standards. It adds data to support quality improvement in academic distance learning.


Evaluating and Improving an Online Program for
Graduate Students Enrolled in a Research Methods Course
in Physical Education and Health

Jong-Hoon Yu, Jwa K. Kim


This research studied the learning experience satisfaction of forty-three students enrolled in an online Research Methods course by establishing the importance students acknowledged for five major variables, their learning experience satisfaction in each area, and the lessons revealed by the differences between the two sets of ratings. An online survey was used to determine these ratings, and the resulting descriptive statistics, including mean scores and standard deviations, were calculated to lead us to appropriate conclusions. Furthermore, statistically significant differences between acknowledged importance and learning experience satisfaction ratings for each variable were analyzed using a paired-samples t-test at the .05 level of significance. The study demonstrated that students were, in general, satisfied with this online approach, but the differences in ratings indicated areas for program improvement.

Keywords: distance education; students’ perspective; asynchronous learning; traditional courses; perceived satisfaction; higher education; virtual classroom; online course design; online learning; perceived importance


The United States Census Bureau has carefully examined the use of computers and the Internet in American households. According to its report, the percentage of households with personal computers increased dramatically from 8% in 1984 to 62% by October of 2003 (U.S. Census Bureau, 2005). The percentage of American households with Internet access has also expanded tremendously, growing from 18% in 1997 to 50% in 2001 and to 55% in 2003 (U.S. Census Bureau, 2005). Because of this increasing access to personal computers and the Internet, online distance learning enrollment at universities and colleges across the country has grown tremendously. The Sloan Consortium surveyed more than 2,500 colleges and universities nationwide and in its final report stated:

Over 3.9 million students were taking at least one online course during the fall 2007 term; a 12 percent increase over the number reported the previous year. The 12.9 percent growth rate for online enrollments far exceeds the 1.2 percent growth of the overall higher education student population. Over twenty percent of all U.S. higher education students were taking at least one online course in the fall of 2007. (Allen & Seaman, 2008, p. 1)

As this trend toward online education in higher education continues, many universities and colleges nationwide now offer online courses in a variety of areas (Pulichino, 2006). In recent years, for example, online master’s degree programs in Physical Education began appearing at a number of colleges and universities across the United States. Ball State University, Boston University, Canisius College, Emporia State University, Florida State University, Georgia Southern University, Ohio University, the University of Houston, and the University of South Florida all offer evolving distance learning programs of this nature.

With this “exponential growth” of “online learning” in higher education, thoughtful and important questions continue to appear about the quality level of such “long distance” methodologies (Muirhead, 2000; Spellings, 2006). It is important to examine the quality of online courses from the student’s perspective. The “perceived satisfaction” level of students - how they sincerely felt about their learning experience - is an integral component of any approach to the measurement of the success of an online learning curriculum (Lin & Overbaugh, 2007; Roach & Lemasters, 2006). Students must be respected as individuals who have definite expectations about what they want and need from an online learning experience. It is important to understand that a student’s “perceived satisfaction” about an online learning experience exists when the expectations any student brings to the distance learning course are matched or exceeded. When an online learning program fails to understand and meet these expectations, the “perceived dissatisfaction” of students will eventually become clear in “overt” ways.

Within this context of expanding distance learning programs, measuring both the levels of acknowledged curriculum importance and learning experience satisfaction has become necessary in order to effectively examine the student’s mastery of any online approach to core content. In addition, any online course assessment that evaluates both the level of content importance and the level of students’ “perceived satisfaction” provides a means to improve those areas of core content that are rated highly in importance but show lower indicators of student satisfaction. With information of this nature, instructors can work at reducing the gaps between content importance and student satisfaction by magnifying course strengths and revising evident course weaknesses.

Many researchers have also examined the effectiveness of technology-based instruction (Bennett & Green, 2001; Brown, 2003; Fallah & Ubell, 2000; Johnson et al., 2000; Larson, 2009; Russell, 1999; Schulman & Sims, 1999). These studies compared test scores and the general performance levels of students enrolled in technology-based courses to those of students enrolled in traditional face-to-face courses. In general, these studies demonstrated that there was not a significant difference in student achievement between these two instructional methods. As a result, it has been shown that students learn as well through technology-based instruction as they do in traditional “face-to-face” instructional frameworks. The “virtual classroom” is working well.

In addition, there have been significant research studies that have focused on a variety of factors that influence the students’ “sense of satisfaction” in online distance learning situations. The factors include “frequent interaction between student and instructor” (Dahl, 2004; Pontz, 2006; Richardson & Swan, 2003); “active interaction between student and student” (Cannon et al., 2001; Swan et al., 2000; Wanstreet, 2006); “timely feedback from the instructor” (Benbunan-Fich, Hiltz, & Harasim, 2005; Howland & Moore, 2002; Pontz, 2006); and “readily available technical support” (Jonassen et al., 1999; Moody, 2004; Qureshi, 2004). All are identified as having critical importance in establishing the likelihood of high student satisfaction levels with online course learning situations.

These studies reported that there was an important relationship between the nature of “interaction” in an online course and student satisfaction levels. The studies showed quite clearly that students who had a higher level of “perceived interaction” between the instructor and themselves, as well as that among the students enrolled with them in the course, usually reported a high level of personal satisfaction as well as satisfaction with the various educational outcomes. These studies also suggested that immediate feedback from the instructor to the students’ questions, assignment concerns, and ongoing discussion topics contributed significantly to the students’ perceived satisfaction with the online learning experience.

In addition, the studies that related to the variable of technical support as a factor in satisfaction levels showed that a significant correlation existed between the students’ satisfaction and “readily available” technical support for the online course itself. These studies documented the fact that some students working “on-line” lack the necessary technical skills to do so easily, and therefore experience some frustration. Consequently, students expect “readily available” technical assistance that is efficient, clear, and regularly updated.

There is a lack of literature on students’ own perceptions of course importance and their satisfaction levels for an online course. This study explored the perceptions of curriculum importance and learning satisfaction that graduate students enrolled in an online Research Methods course assigned to five key variables. Furthermore, the differences the study revealed between the “perceived importance” of the course and the resulting “student satisfaction levels” were quantified and analyzed. The data came from courses taught in three different semesters, and we have since updated our technological platforms with these results in mind.



Of the sixty-one graduate students enrolled in this online Research Methods course in the summer and fall semesters of 2008 and the spring semester of 2009, forty-three students voluntarily participated in this study. The study design and procedures were reviewed by the Institutional Review Board to ensure that all appropriate professional protocols were observed.


The “Online Student Perceptions Survey,” developed by Sheila Hendry (2005), was used to determine students’ ratings of both the importance of the course material and their “perceived satisfaction” with key variables associated with online courses. This instrument was selected because of its simplicity and relevance to online course work.

The original Online Student Perceptions Survey consisted of three parts. The first part contained seven demographic questions, including those relating to age, gender, the number of online course hours completed, the number of online hours currently being taken, the number of traditional courses being taken (including those with some kind of online component), the number of weekly work hours in addition to course work, and the distance the student generally travels to and from campus. These data were used to reach a better understanding of the background and personal characteristics of students enrolled in the online Research Methods course.

The second part of this survey included thirty-six statements based on the five main variables usually associated with online courses. These five variables, with their respective Cronbach’s alpha values, are “convenience” (importance .86; satisfaction .89); “emotional health” (importance .60; satisfaction .86); “communication” (importance .72; satisfaction .91); “student support” (importance .80; satisfaction .89); and “grade earned/knowledge learned” (importance .80; satisfaction .88). Survey results reflected an acceptable level of internal consistency (Hendry, 2005). Students were asked to rate the degree of importance of each of the thirty-six statements using a 5-point Likert scale (1 = not important at all, 2 = slightly important, 3 = neutral, 4 = somewhat important, 5 = very important, and 0 = does not apply) and their level of satisfaction with these same statements using a 5-point Likert scale (1 = not satisfied at all, 2 = slightly satisfied, 3 = neutral, 4 = somewhat satisfied, 5 = very satisfied, and 0 = does not apply).
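The Cronbach’s alpha values reported above can be reproduced with a short calculation. The sketch below is a minimal illustration rather than the instrument author’s own code; the `scores` matrix is entirely hypothetical, standing in for one subscale’s Likert responses:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the subscale
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of respondents' totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses: rows = students, columns = items
scores = [
    [5, 4, 5, 4],
    [4, 4, 4, 5],
    [3, 2, 3, 3],
    [5, 5, 4, 4],
    [2, 3, 2, 2],
]
alpha = cronbach_alpha(scores)
print(round(alpha, 2))
```

Alphas of .70 and above are conventionally treated as acceptable internal consistency, though thresholds vary by purpose.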

Section three of the survey included twenty statements which evaluate the online learning profile of students enrolled in the online Research Methods course. A 5-point Likert scale (1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly agree) was used.


To collect data on students’ perceptions of the importance of the course core content and their learning experience satisfaction with our online program design and implementation, the Online Student Perceptions Survey developed by Hendry (2005) was packaged as a web-based survey using CHECKBOX Survey Software v4.5 (Web Survey Software).

Next, the survey and informed consent form were posted on the online course site operating in the ANGEL Learning Management System for the summer and fall semesters of 2008 and the spring semester of 2009. At the beginning of the process, an email describing the study was sent to all students who had been enrolled during these three semesters; those who were interested then responded to all three parts of the survey. The CHECKBOX Survey Software (v4.5) presented the frequencies of responses for each item, which were then applied to further descriptive and inferential statistical analyses.

Data Analysis

Descriptive statistics, such as mean scores and standard deviations, were calculated to establish the students’ perceptions of the importance of each of the major variables and their “learning satisfaction levels” when the course work was completed. The mean difference for these two indicators was then calculated by subtracting the “importance grand mean” from the “satisfaction grand mean.” Then, the statistically significant differences in each of the five areas between the importance scale ratings and the satisfaction scale ratings were analyzed using a paired-samples t-test, at the .05 level of significance. The statistical computation was completed using the Statistical Package for the Social Sciences (SPSS) 15.0 for Windows.
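As a concrete illustration of this procedure, the sketch below computes the mean difference and a paired-samples t-test for one variable; SciPy here stands in for the SPSS package actually used, and all of the ratings are invented for the example:

```python
import numpy as np
from scipy import stats

# Hypothetical per-student grand means for one variable (invented data)
importance   = np.array([4.8, 5.0, 4.6, 4.9, 4.7, 5.0, 4.5, 4.8])
satisfaction = np.array([4.2, 4.5, 4.0, 4.6, 4.1, 4.4, 3.9, 4.3])

# Satisfaction grand mean minus importance grand mean: a negative value
# means satisfaction fell short of perceived importance.
mean_diff = (satisfaction - importance).mean()

# Paired-samples t-test at the .05 significance level, df = n - 1
t_stat, p_value = stats.ttest_rel(satisfaction, importance)
print(f"mean difference = {mean_diff:.2f}")
print(f"t({len(importance) - 1}) = {t_stat:.3f}, p = {p_value:.4f}")
print("significant at .05" if p_value < .05 else "not significant at .05")
```

With forty-three paired responses per variable, the same call would yield t statistics on 42 degrees of freedom, as reported in the Results section.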


Demographic data

The demographic data demonstrated that the participant group was fairly typical of graduate programs, and that students were quite comfortable with the processes and procedures of online distance learning. Of the forty-three students who participated in this study, 56% were male and 44% were female. Only 2% of these students were younger than twenty-three years of age, 67% were 23-28 years of age, 21% were 29-35 years of age, and 10% were 36-45 years of age. All of the students had prior online course experience: 23% had completed two online courses, and 77% had completed three or more. In addition, nearly half of the students had completed traditional courses that included a significant online component - 19% had taken one combined course, 14% had taken two combined courses, and 16% had taken more than three such courses.

The data demonstrated as well that our sample students were employed for substantial hours while they carried relatively demanding academic loads of graduate work. One might even infer that distance learning has become the major pathway to their degrees. With respect to “online hours” in their current course work, 23% were taking three credit hours in this manner, 47% were taking six credit hours, 23% were taking nine credit hours, and 7% were taking twelve credit hours. In response to the question about their current hours of paid employment, 40% worked forty-one or more hours per week, 30% worked 31-40 hours per week, 9% worked 21-30 hours per week, and 7% worked 11-20 hours per week. Only 12% of the sample worked 1-10 hours per week, and 2% were full-time graduate students. Lastly, the demographic data revealed that 23% lived in the city area, 23% lived within twenty miles of the campus, 9% lived 21-75 miles from the campus, 14% lived over 75 miles from the campus, and 30% lived out of state.

Profile Data

Our survey supported the thesis that students have a sincere respect for online learning as an instructional tool, that they are very comfortable with it, and that they do not believe that the core content is weakened in any way, or the material “dumbed down” through distance learning applications. The online learning profile of the students who were included in this study is demonstrated in Table 1.

Almost three-quarters (74%) of these students disagreed somewhat or strongly with the statement that they would not have taken an online course if some other means of receiving credit had been available to them. A great majority (86%) agreed with the statement, “I am comfortable working with computers,” while only 9% disagreed and 5% voiced no opinion. More than half of this group (58%) agreed strongly, and another 33% agreed somewhat, with the idea that they were “highly motivated”; only 5% characterized themselves as academically lethargic in some way. In addition, more than three-quarters (77%) disagreed with the idea that they learned less than they expected when taking an online course. A clear majority (86%) disagreed with the idea that they felt isolated and alone while taking this course; only 9% agreed. Meanwhile, in response to the statement, “Getting a good grade is easy in an online course,” a strong majority of students (70%) disagreed with that notion; 16% agreed and 14% were uncertain.

Table 1

Online learning profile of the study sample

Variables Data

The differences between the students’ perception of the importance of the course work and their level of satisfaction with each of the variables associated with online course experiences were analyzed using a paired-samples t-test, at the .05 level of significance.

Table 2
Results of paired-samples t-test evaluating differences
between perceived importance and satisfaction with selected variables

As demonstrated in Table 2, the results showed statistically significant differences between the acknowledged importance of the core content and learning satisfaction levels for four of the five variables: “emotional health,” t(42) = 2.619, p < .05; “communication,” t(42) = 2.412, p < .05; “student support,” t(42) = 6.775, p < .05; and “grade earned/knowledge learned,” t(42) = 7.622, p < .05. The single exception to this pattern was “convenience,” t(42) = .540, p > .05.

These descriptive statistics showed that “grade earned/knowledge learned” was rated highest on the importance scale (M = 4.83, SD = .46) but received the lowest rating on the satisfaction scale (M = 4.26, SD = .90). This large differential (M = -.56, SD = .02) suggests that students expected, to some degree, grades higher than those received after they completed the course work. Meanwhile, “convenience” was rated moderately high on the importance scale (M = 4.59, SD = .72) and received the highest rating on the satisfaction scale (M = 4.56, SD = .64), yielding the smallest discrepancy observed (M = -.03, SD = .76). In that area, the course lived up to student expectations.

Remember that student responses in the “4” range indicated some real degree of satisfaction with the core content and the final learning experience. But, as Table 2 clearly demonstrates, all five major variables carried a negative difference between the “perceived importance” and “perceived satisfaction” ratings: none of the variables had satisfaction mean values that met or exceeded the corresponding mean values of importance. Further analysis was subsequently performed to determine what factors were involved in these differentials. Table 3 demonstrates the differences between students’ perception of the importance of the course and their level of satisfaction in the always sensitive area of “knowledge learned/grade earned” - the matter of congruency between the work the students do and the formal assessment of it.

Table 3
Differences between perceived importance and satisfaction
with each statement of grade earned / knowledge learned

A close scrutiny of the various questions asked in the variable “knowledge learned/grade earned” provided meaningful direction in improving that area of concern for students. The largest mean differential between importance and satisfaction ratings occurred with the statement that related to “passing the class” (M = -.81, SD = .98), as one might expect. The statement was rated highest on the importance scale (M = 4.98, SD = .15) but somewhat lower on the satisfaction scale (M = 4.16, SD = .97), meaning perhaps that once students received the actual credit for the course, this matter shrank somewhat in vitality. In contrast, the statement “gaining knowledge of the content from taking this course” was rated second in importance (M = 4.88, SD = .32) but highest in satisfaction (M = 4.51, SD = .88), with the smallest discrepancy (M = -.37, SD = .87). We concluded that, in general, the students felt that they had successfully mastered a core content that proved to be as important as they originally thought it might be. “Being prepared academically for future classes” was the statement rated as having the least importance to this group (M = 4.60, SD = .69) and it also had the smallest value on the satisfaction scale (M = 4.12, SD = .91).
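The per-statement breakdowns in Tables 3 through 7 follow the same recipe: compute satisfaction minus importance for each statement and sort, so the most negative differentials surface first as revision targets. A minimal sketch of that recipe, with shortened statement labels and entirely invented ratings:

```python
import pandas as pd

# Invented per-student ratings (rows = students) for three statements
# within one variable; the real analysis covers all thirty-six statements.
importance = pd.DataFrame({
    "passing the class":            [5, 5, 5, 5, 5],
    "gaining knowledge of content": [5, 5, 4, 5, 5],
    "preparation for future work":  [5, 4, 4, 5, 5],
})
satisfaction = pd.DataFrame({
    "passing the class":            [4, 4, 5, 4, 4],
    "gaining knowledge of content": [5, 4, 4, 5, 5],
    "preparation for future work":  [4, 4, 4, 4, 5],
})

# Differential = satisfaction - importance; negative values flag statements
# where the course fell short of its perceived importance.
diff = (satisfaction - importance).mean().sort_values()
print(diff)  # most negative first: the prime targets for course revision
```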

The measurements in Table 4 address issues related to the “technical” student support structures, an area to which we pay great attention. With regard to this variable, “readily available and appropriate technical assistance” was rated as having fairly high importance (M = 4.91, SD = .37), but results indicated relatively low satisfaction levels (M = 4.09, SD = 1.06). Here we see the largest “gap” in the survey between importance and satisfaction levels (M = -.81, SD = 1.07).

Table 4
Difference between perceived importance and satisfaction
with each statement of student support

Meanwhile, “ease of learning new software” had the smallest discrepancy between importance and satisfaction ratings (M = -.19, SD = 1.10), meaning, we think, that graduate students find mastery of new applications rather easy. “Encountering few or no technical difficulties” was the statement rated with the highest mean value on the importance scale (M = 4.95, SD = .21); we conclude that students simply want everything to work as advertised. “Reliability of the server and/or equipment” was the statement rated with the highest mean value on the satisfaction scale (M = 4.56, SD = .63) - a result that reflects well on the normal daily operations of modern educational institutions. “Access to readily available tutorials” was rated with the lowest mean scores on both the importance and satisfaction scales (M = 4.23, SD = .75; M = 3.81, SD = 1.10). We conclude our students are comfortable about finding the technical expertise they might need.

Table 5
Differences between perceived importance and satisfaction
with each statement of emotional health

The data in Table 5 reveal information about the “comfort levels” of the students within the formal contexts of online learning. We conclude, in general, that students experience some frustration with the “interactive” aspects of distance learning. “Being mentally prepared for taking tests online” had the largest discrepancy between importance and satisfaction ratings (M = -.70, SD = 1.17). Perhaps students experience some frustration about how to prepare for the tests administered periodically in the program “modules.”

Meanwhile, “being able to send blanket e-mails for help to other students at my convenience” was perceived to be of lowest importance (M = 4.21, SD = .91) and was rated low on the satisfaction scale as well (M = 4.19, SD = .96), resulting in the smallest discrepancy (M = -.02, SD = 1.14). Students may be uncertain about how the “interactive methodology” in the program can best be utilized. “Being able to complete my work alone” was the statement with the highest satisfaction rating (M = 4.51, SD = .67), while “being able to receive e-mail help from other students at our mutual convenience” received the lowest satisfaction rating (M = 4.16, SD = 1.02). Most students find working alone through the program easier than the various group activities that are available and required.

The results in Table 6 reveal that a vital “by-product” often emerges in online learning. With regard to the “communication” variable, we learned that “improvement in my written communication skills” had the lowest importance rating (M = 4.21, SD = 1.04) but a relatively high satisfaction rating (M = 4.53, SD = .67), yielding the largest positive differential in the study (M = .33, SD = .99). Students perceived themselves to have improved as writers because of this online course work - an unintended consequence, but a very positive and important one.

Table 6
Differences between perceived importance and satisfaction
with each statement of communication

The statement “quantity of student to student interactions” was rated quite low on the importance scale (M = 4.28, SD = .98) and lowest on the satisfaction scale (M = 4.16, SD = .75), resulting in the smallest discrepancy score (M = -.07, SD = 1.08). Students apparently feel that online learning is not meant to help them improve their personal interaction skills. The statement “receiving timely feedback about my progress from the instructor” was rated with the highest mean score on both scales (M = 4.91, SD = .37; M = 4.63, SD = .58). We learned that students want a timely and appropriate response concerning the quality of their work.

As demonstrated in Table 7, the variable of “convenience” makes online learning attractive to all parties in our 24/7 world. The statement “easily accessing the syllabus and written instructions for assignments as needed” was ranked high on both importance and satisfaction scales (M = 4.79, SD = .60; M = 4.53, SD = .67), but it also had the largest discrepancy (M = -.26, SD = .95). Because the rankings were in general quite high, we concluded the website was accessible and the modular unit structure clear, but some room for improvement existed. The statement “saving money on automotive expenses, including gas” was rated with the smallest discrepancy between importance and satisfaction ratings (M = .05, SD = .82). We concluded that working at home, or on a laptop anywhere, is important to the students and that we had succeeded for the most part in facilitating that practice.

Table 7
Differences between perceived importance and satisfaction
with each statement of convenience

Some might argue that the only variable that might be considered “statistically non-significant” in our approach was “convenience.” Some positive mean differences were indeed revealed. For example, the statements “saving money on babysitting fees” (M = .12, SD = 1.07), “saving money on food” (M = .09, SD = .84), “saving time from commuting” (M = .05, SD = .65), and “saving money on automotive expenses, including gas” (M = .05, SD = .82) received positive mean differences between importance and satisfaction ratings, indicating that students’ expectations about good distance learning programs are high but usually well met. “Being able to work on assignments at any time, day or night” was the statement with the highest mean scores both in terms of importance and satisfaction (M = 4.88, SD = .39; M = 4.79, SD = .41). Other statements followed: “being able to complete classwork at home, office, etc.” (M = 4.81, SD = .45; M = 4.77, SD = .48); “saving money on babysitting fees” (M = 4.23, SD = 1.02; M = 4.35, SD = .57); and “saving money on food” (M = 4.21, SD = .64; M = 4.30, SD = .83), the last two with the lowest mean scores in both categories. These data show that we do well what so many others have mastered recently, and that the students have “life-style” concerns which we are meeting in helpful and productive ways.


When we compared the grand mean scores of “perceived importance” with those relating to “learning experience satisfaction,” we were able to target topics and areas that helped us modify our online course design. The research revealed that the majority of students surveyed for this study were satisfied with online learning as the delivery vehicle for a Research Methods course. In other words, students appreciate the values and techniques associated with online learning and consider them appropriate to graduate work. Indeed, a real majority of students surveyed for this study preferred online course work to conventional “face-to-face” classroom instruction; with a strong voice, the students disagreed with the profile statement “I would not have taken an online course if there had been some other means of receiving credit.” Students were most satisfied with the convenience component of our program. Many students choose online distance education programs because of the personal flexibility such courses offer. Online education’s main advantage is its ability to liberate students from time constraints and geographical distance problems (Caverly & MacDonald, 1999; Fisher, 2003). Another advantage of online learning is that it allows students to progress at their own pace (Nguyen & Kira, 2000). The participants agreed with these benefits of this modern technology.

The demographic information about students in this study reflected typical characteristics of online learners who are full-time or part-time students living at a significant distance from their school campus. Since the majority of students in this course were employed for many hours each week, and more than half also lived a considerable distance from campus, the statements “being able to work on assignments at any time, day or night” and “being able to complete classwork at home, office, etc.” were vitally significant to them and, as we might expect, received the highest student satisfaction ratings.

According to the information obtained from the profile data, most of the students in this study had positive experiences with online learning. Some reasons for this attitude appeared to be that they were highly motivated, they were confident about their ability with computers, and they did not feel isolated when working online - many even prefer working alone on projects. In general, the success of online courses does require a high degree of self-motivation and self-direction (Bocchi, Eastman & Swift, 2004; O’Lawrence, 2006; Palloff & Pratt, 2003). Ng (2005) stated that “Online instruction requires the students to be very motivated to get onto their computers and do the required work at the appropriate times. Students who are not self-motivated will not do well in the online course setting” (p. 67).

Despite having quite positive attitudes about online learning, the vast majority of students participating in this study agreed that getting a good grade is not easy in an online course and rated “passing the class” quite low on the satisfaction scale, yielding the largest differential between importance and satisfaction ratings. This “gap” indicated that uncertainty existed about the quality of their work, our feedback methodology, and our approach to assessment. Students did feel very satisfied about “gaining knowledge of the content,” and this sense of achieving content mastery was supported by the responses to the profile statement “I learned less than I expected from online courses,” with which seventy-seven percent of students disagreed. It is clear, then, that even though the majority of students expected to pass the class, they believed that getting a good grade was not easy; they felt quite satisfied with the core content they learned, but they were somewhat dissatisfied with the methods we employed to articulate their levels of achievement. Consequently, we have been redesigning the structures of the course modules and our methods of communication with the students.

With regard to “student support,” the majority of students in this study saw “encountering few or no technical difficulties” as well as the “reliability of the server and/or equipment” as the most important components in the success of any online learning process. The study suggested that technical problems continue to be a real issue for some students, although most students claim to be quite comfortable when working with computers. It is obvious that students want to participate in online learning but do not want to waste time because of serious technical difficulties and related problems. “Readily available and appropriate technical assistance” was the one factor in our study where some signs of student dissatisfaction were evident. These data indicated a high differential between the importance assigned and the satisfaction experienced by students. Online learning, by its nature, requires students to master rather basic applications of technology prior to enrolling in the course. But some students do lack the necessary technical skills - emailing, downloading and opening files, viewing video clips using Flash and QuickTime, and downloading and listening to a podcast using iTunes. Qureshi (2004) suggested that “Using the computer as a learning mode requires new strategies and skills that cannot be taken for granted. Therefore, technical advice and support needs to be provided not only initially, but as an ongoing measure” (p. 157). To make technical assistance readily available, the office hours of the ITS Help Desk need to be extended into evenings and weekends, and online course orientation video clips and brief tutorial video clips can be posted in the online course management system.

Most of the students reported they did continue to develop their own written communication skills throughout this course. One positive aspect of a typical online format is that it generally does require substantial amounts of formal writing, often on a daily basis. The statement expressing “improvement in my written communication skills” in the variable of communication had the lowest importance rating, but a relatively high satisfaction rating. The students experienced an important advancement in their writing skills that was clearly beyond their expectations. We have kept this significant benefit in mind as we have designed new writing prompts and thought-provoking questions each new semester.

In addition, most of the students in the study stated that they had a meaningful and satisfactory interaction with their course instructors. In particular, with regard to the statements associated with feedback from faculty to students - “receiving timely feedback about my progress from the instructor” and “getting personal feedback from the instructor” - both were rated with the highest mean score in terms of importance and learning experience satisfaction. Students wanted, and expected, to receive feedback from their instructors on their questions, assignments, and discussion postings, and to receive it in a timely manner. According to Howland and Moore (2002), “Some students expect immediate feedback in online courses because they have the perception that the instructor is readily available, regardless of the day and time” (p. 191). This result was congruent with the findings that showed the students had a high level of agreement with the statement made in the profile data survey: “I do not like having to wait for the instructor to respond to my e-mail.” Consequently, we are working to improve the online procedures to facilitate improved communication with the students.

The statements regarding student-to-student contact, including “quantity of student-student interactions” and “quality of student-student interactions,” were rated the lowest on the satisfaction scale. Weekly discussions were conducted on the discussion board within the online course, and their purpose was to provide more opportunities for students to develop collaborative partnerships within the online learning course structure. But our results showed that students found little satisfaction in these interactions. Accordingly, the instructor/researcher was challenged to structure the course in new ways that might promote intensive and fruitful interactions among students. Some components of the program - small group discussion, an increase in the weight/value of the discussion board, open-ended questions designed to elicit discussion rather than simply to earn points, and even simple “chat sessions” - have all been considered as appropriate means to stimulate students’ interactions with one another. Swan et al. (2000) found that active interaction between students during online courses often results in collaborative learning efforts that enhance understanding and add to a more relevant and meaningful learning experience.

With regard to the “emotional health” variable, while students ranked “being mentally prepared for taking tests online” as being of great importance to them, they also implied they experienced only a moderate degree of satisfaction in this regard. They accepted the idea that a student’s success in the tests is made possible through effective instruction by the faculty. But students also acknowledged that they have a responsibility to be well-prepared to take online tests and to engage carefully in the online discussions and various activities designed by the course instructor. Topics and assignments for weekly discussions were directly related to the material on mid-term and final exams. If students had diligently completed these weekly assignments by referencing the textbook and course lecture material, they might well have been better prepared for formal examinations. “Having confidence in submitting my work online” was rated highly on both the importance and satisfaction scales within this emotional health variable. The most plausible explanation for this finding might be that most students had indeed become accustomed to online learning but still needed to be thorough and careful as they worked their way through the various modules.

In conclusion, because of the unique nature of an online learning experience, and the unique personality of an online learner, it is important that we continue to study the “online experience” from the students’ perspective. Validated information about the levels of importance assigned by the students themselves and their levels of satisfaction with the course learning experience can provide any instructor/researcher with quite useful insights that will help him/her refine his/her course design and instructional methodology in very practical ways. Certainly, further research on diverse online course designs and implementations, using a similar survey to measure the congruency of student expectations and satisfaction levels, may well prove to be a necessary tool for understanding the evolution of distance learning.

References
Allen, I. E., & Seaman, J. (2008). Staying the course: Online education in the United States, 2008. Needham, MA: Sloan Consortium.

Benbunan-Fich, R., Hiltz, S. R., & Harasim, L. (2005). The online interaction learning model: An integrated theoretical framework for learning networks. In S. R. Hiltz & R. Goldman (Eds.), Learning together online: Research on asynchronous learning networks (pp. 18-37). Mahwah, NJ: Lawrence Erlbaum Associates.

Bennett, G., & Green, F. P. (2001). Student learning in the online environment: No significant difference? Quest, 53, 1-13.

Bocchi, J., Eastman, J., & Swift, K. (2004). Retaining the online learner: Profile of students in an online MBA program and implications for teaching them. Journal of Education for Business, 79(4), 245-253.

Brown, P. L. (2003). A comparison of online instruction versus traditional classroom instruction in a Fitness for Life course. Unpublished doctoral dissertation, University of North Carolina, Greensboro.

Cannon, M. M., Umble, K.E., Steckler, A., & Shay, S. (2001). We’re living what we’re learning: Student perspectives in distance learning degree and certificate program in public health. Journal of Public Health Management and Practice, 7(1), 49-59.

Caverly, D. C., & MacDonald, L. (1999). Techtalk: Asynchronous distance developmental education [Electronic version]. Journal of Developmental Education, 25(2), 36-37.

Dahl, J. (2004). Strategies for 100 percent retention: Feedback, interaction. Distance Education Report, 5(16), 1-7.

Fallah, M., & Ubell, R. (2000). Blind scores in a graduate test: Conventional compared with web-based outcomes. Asynchronous Learning Networks Magazine, 4(2). Retrieved June 12, 2009, from http://www.aln.org/publications/magazine/v4n2/fallah.asp

Fisher, M. M. (2003). Designing courses and teaching on the Web: A “how to” guide to proven, innovative strategies. Lanham, MD: Scarecrow Press.

Hendry, S. R. (2005). Student perceptions: Importance of and satisfaction with aspects of an online biology course. Unpublished doctoral dissertation, University of Southern Mississippi, Hattiesburg.

Howland, J. L., & Moore, J. L. (2002). Student perceptions as distance learners in internet-based courses. Distance Education, 23(2), 183-195.

Johnson, S. D., Aragon, S. R., Shaik, N., & Palma-Rivas, N. (2000). Comparative analysis of learner satisfaction and learning outcomes in online and face-to-face learning environments. Journal of Interactive Learning Research, 11(1), 29-49.

Jonassen, D. H., Peck, K. L., & Wilson, B. G. (1999). Learning with technology: A constructivist perspective. Upper Saddle River, NJ: Prentice Hall.

Larson, D. K. (2009). Comparing student performance: Online versus blended versus face-to-face. Journal of Asynchronous Learning Networks, 13(1), 31-41.

Lin, S. Y., & Overbaugh, R. C. (2007). The effect of student choice of online discussion format on tiered achievement and student satisfaction. Journal of Research on Technology in Education, 39(4), 399-415.

Moody, J. (2004). Distance education: Why are the attrition rates so high? The Quarterly Review of Distance Education, 5(3), 205-210.

Muirhead, B. (2000). Interactivity in a graduate distance education school. Educational Technology and Society, 3(1), 93-96.

Ng, K. E. (2005). Instructor satisfaction and attitude toward online instruction. Unpublished doctoral dissertation, University of Southern Mississippi, Hattiesburg.

Nguyen, D., & Kira, D. (2000). Summative and formative evaluations of internet-based teaching. In L. K. Lau (Ed.), Distance learning technologies: Issues, trends and opportunities (pp. 22-38). Hershey, PA: Idea Group Publishing.

O’Lawrence, H. (2006). The influences of distance learning on adult learners [Electronic version]. Techniques: Connecting Education & Careers 81(5), 47-49.

Palloff, R.M., & Pratt, K. (2003). The virtual student: A profile and guide to working with online learners. San Francisco, CA: Jossey-Bass.

Pontz, S. (2006). The effects of interaction in an online class on student satisfaction. Unpublished doctoral dissertation, Wayne State University, Detroit.

Pulichino, J. (2006). Future directions in e-learning research report 2006. The e-Learning Guild Research. Retrieved May 4, 2009, from The eLearning Guild Web site: http://www.elearningguild.com/pdf/1/apr06-futuredirections.pdf

Qureshi, E. (2004). Investigation of factors affecting students’ satisfaction with online course components. Unpublished doctoral dissertation, University of Windsor, Ontario, Canada.

Richardson, J. C., & Swan, K. (2003). Examining social presence in online courses in relation to students’ perceived learning and satisfaction. Journal of Asynchronous Learning Networks, 7(1), 68-88.

Roach, V., & Lemasters, L. (2006). Satisfaction with online learning: A comparative descriptive study. Journal of Interactive Online Learning, 5(3), 317-332.

Russell, T. (1999). The no significant difference phenomenon: As reported in 355 research reports, summaries and papers. Raleigh, NC: North Carolina State University.

Schulman, A. H., & Sims, R. (1999). Learning in an online format versus an in-class format: An experimental study. T.H.E. Journal, 26(11), 54-56.

Spellings, M. (2006). A test of leadership: Charting the future of U. S. higher education. Retrieved April 5, 2009, from U.S. Department of Education Web site: http://www.ed.gov/about/bdscomm/list/hiedfuture/reports/final-report.pdf

Swan, K., Shea, P., Fredericksen, E., Pickett, A., Pelz, W., & Maher, G. (2000). Building knowledge building communities: Consistency, contact and communication in the virtual classroom. Journal of Educational Computing Research, 23(4), 389-413.

U.S. Census Bureau. (2005). Computer and internet use in the United States: 2003. Retrieved June 8, 2009, from http://www.census.gov/prod/2005pubs/p23-208.pdf

Wanstreet, C. E. (2006). Interactions in online learning environments: A review of the literature [Electronic version]. The Quarterly Review of Distance Education, 7(4), 399-411.

About the Authors

Jong-Hoon Yu is an assistant professor in the department of physical education, health & sport studies at Canisius College, where he has taught since 2005. He earned his bachelor’s degree from Kyung Hee University (Seoul, Korea), his master’s degree from Indiana University, and his doctoral degree from Boston University. His research interests are focused on descriptive-analytic studies using systematic observation instruments and constructivist online course design. He recently presented “Online course production and implementation” at the NYS AHPERD 72nd Annual Conference.
He currently teaches graduate online courses, including a research methods course and a capstone course, as well as an undergraduate self-defense course. He is a first-dan black belt in Taekwondo and developed the self-defense unit in the P.E. curriculum. He also serves as the faculty advisor of the Canisius College table tennis club.

He can be reached at yuj@canisius.edu

Jwa K. Kim is a full professor in the department of psychology at Middle Tennessee State University. He has a bachelor’s degree in education and a master’s degree in educational psychology from Kyungpook National University (Taegu, Korea), and a doctoral degree in psychometrics & quantitative psychology from the University of Oklahoma. His research areas include item response theory, multivariate analysis, measurement and scaling, nonparametric statistics, computer applications, and quantitative analysis. He has published many scientific papers in journals and given presentations at many international conferences.

He can be reached at pykim@mtsu.edu
