Editor’s Note:
Distance learning assigns responsibility for time scheduling and management to the student. Student self-regulation is key to successful adoption of distance learning technologies.

 

Relationships between Course Management System (CMS) Use and Teacher Technology Adoption

Robert Hancock, Gerald Knezek
USA

Abstract

This study examined differences in student self-regulation, as defined by technology adoption, among participants in the free Hotchalk course management system (CMS), referred to as the Hotchalk Learning Environment (HLE). A total of 1,078 respondents completed a battery of instruments to determine whether there was a relationship between self-regulation and CMS use. The authors had previously cross-validated these instruments against each other in a study that won the best research paper award at the 2007 National Educational Computing Conference. Results of this study indicate a strong positive relationship, significant at the .01 level across all three instruments of assessment, between self-regulation as defined by level of technology adoption and use of the Hotchalk Learning Environment.

Introduction

A course management system is hardly a new concept in the field of education, and online instruction is now an integral part of higher education across institutions (Terry, 2007). The question has long shifted from online versus traditional instruction, and for most professors is a matter of hybrid or online-only forms of delivery (Willett, 2002). Course management systems have also made dramatic inroads into K-12 school systems, with many school districts using one of the many paid or free systems currently on the market to facilitate online instruction and parent communication (Barge & Loges, 2003). Indeed, it is this parent-expressed desire (Barge & Loges, 2003) for a different kind of communication pattern with teachers that has driven implementation in many school districts. Parents are no longer satisfied with the traditional patterns of infrequent teacher–parent conferences and notes home. For example, a parent commented on the benefits of more frequent contact: “When I saw the good information about my child instead of only bad things, it made me feel good. More positive, less negative. [sic] We do need to hear the good and not just the bad” (Barge & Loges, 2003, p. 146). There is strong evidence that course management systems are effective in accomplishing this goal as well as their more traditional roles of organizing the class environment (Munoz & Duzer, 2005). There is also strong evidence (Munoz & Duzer, 2005) that free systems such as Hotchalk, Moodle, and Sakai manage course information and facilitate communication as well as or better than paid systems such as Blackboard. There remains, however, the question of the relationship between the user and the management system.

The concept of academic self-regulation and its role in the learning environment has been around for some time (Ruban, McCoach, McGuire, & Reis, 2003). According to Zimmerman (1989), self-regulation refers to “the degree that individuals are metacognitively, motivationally, and behaviorally active participants in their own learning process” (p. 329). Zimmerman (1989) identifies the hallmarks of academic self-regulation to include academic time management, practice, mastery of learning methods, goal-directedness, and a sense of self-efficacy.

Recently, researchers have begun to look at issues of self-regulation and its relationship with success in online learning environments (Zerr, 2007; Whipp & Chiarelli, 2004). Yet studies in this area remain few and far between (Whipp & Chiarelli, 2004), and a call for more research has been issued (Hodges, 2005). It is in response to that call that this study was undertaken.

Although primarily conceived of as measures of technology adoption, the Concerns-Based Adoption Model-Levels of Use (CBAM-LoU) (Hall, Loucks, Rutherford, & Newlove, 1975), the Apple Classrooms of Tomorrow Teacher Stages (ACOT) (Dwyer, 1994), and the Stages of Adoption of Technology (Christensen, 1997; Christensen & Knezek, 1999) roughly conform to Zimmerman’s hallmarks of practice, mastery of learning methods, goal-directedness, and self-efficacy, and they provide a good starting point for measuring self-efficacy for teachers in the use of course management systems. Time management, while not a component measured by these instruments, is measured separately in this study in terms of frequency of use of the Hotchalk CMS. The authors believe that these instruments, combined with frequency-of-use (time) measurements, provide a vital first step in assessing the relationship between self-regulation and online learning and in designing a practical instrument for assessing self-regulation in the online learning environment.

Conceptual Rationale

It is the authors’ belief that the Concerns-Based Adoption Model-Levels of Use (CBAM-LoU), the Apple Classrooms of Tomorrow Teacher Stages (ACOT), and the Stages of Adoption of Technology (Stages) can serve as a starting point for assessing what Hodges (2005) referred to as the behavioral, personal, and environmental components of self-regulation by establishing a self-judgment regarding the individual’s will and skill with the technology tool (in this case, the Hotchalk course management system).

According to Zimmerman (1989), this self-judgment is essential: "self-judgment refers to students' responses that involve systematically comparing their performance with a standard or goal" (p. 333). The format of the instruments provides this framework of self-judgment to the user.

By administering this battery of instruments to active educators who are either users or non-users of course management systems, it is possible to determine if there exists a relationship between self-regulation in terms of technology adoption and course management system use. Further, by comparing the scores of active users with their frequency of use, it becomes possible to examine more closely the role of time management (in terms of usage) in the adoption relationship.

Together these measurements give us the starting point for measuring self-regulation in terms of technology adoption for users of course management systems.

Review of Literature

The use of online course management systems is now commonplace (Caruso, 2004). In a national study of thirteen universities with a sample of 18,400 students, of whom 4,374 responded, Caruso (2004) found that 96.4% of students used the Internet for classroom activities. Students also responded that they used technology primarily for classroom activities, with an average of four hours of study per week using electronic means and an average of 2.48 hours per week interacting with a course management system (Caruso, 2004). These figures exceeded the amount of time spent playing computer games and shopping online.

Communication and convenience stand out as major benefits offered by the CMS to both students and instructors (Caruso, 2004; Morgan, 2003).

Caruso (2004) indicates that the number one impact of CMS technology in the classroom, as stated by students, was that it “helped me to better communicate with the instructor” (Caruso, 2004, p. 5). The second was that it “resulted in prompt feedback from the instructor,” and the third was that it “helped me communicate and collaborate with my classmates” (Caruso, 2004, p. 5). Close behind these was easy access to materials online. In a nutshell, the findings indicated that for these students the main benefits of the CMS were communication and convenience.

Morgan (2003), in a study of 740 faculty members, indicates that 80% are using the CMS in a hybrid manner, with some face-to-face and some online instruction. Morgan asserts that 59% of professors surveyed indicated an increase in communication with students as a result of using the CMS. Perhaps best of all, Morgan reports that “in the process of using these tools, many faculty members begin to rethink and restructure their courses and ultimately their teaching…resulting in an accidental pedagogy … faculty teaching is improved as a result” (Morgan, 2003, p. 4).

These findings were largely replicated in the pilot study for this research conducted by Devaney and Hancock in 2006. In that study of 655 student respondents from a sample of 1,900, the majority had access to high-speed Internet service (76.7%), were enrolled in an online course (87.2%), and accessed course material primarily from home (94.6%). Respondents also reported low anxiety in utilizing online services, as indicated on the anxiety subscale of the Teachers’ Attitudes Toward Computers questionnaire (M = 1.568, SD = .631).

Morgan (2003) indicates that the high cost of CMS platforms will force high usage rates among faculty and students. While usage is definitely high, there has been a response that was partially unforeseen by Morgan: the rise of the free CMS. As costs for systems such as Blackboard continue to spiral upward, many universities and public school systems have turned to free systems such as Hotchalk, Moodle, and Sakai as alternatives for CMS services.

In 2005, Munoz and Duzer conducted a comparison study of Blackboard and Moodle CMS implementations. Blackboard is a commercially available product partially owned by Microsoft, whereas Moodle is a freely available open-source alternative. The results were intriguing: 0% of Blackboard users strongly agreed that Blackboard enhanced instruction, while 7.1% of Moodle users strongly agreed that Moodle enhanced instruction (Munoz & Duzer, 2005). The proportions who somewhat agreed were roughly equal, with 23.1% of Blackboard users agreeing that Blackboard enhanced instruction and 21.4% of Moodle users saying the same for Moodle. Fewer Moodle users than Blackboard users believed their system failed to enhance instruction: 53.9% of Blackboard users felt Blackboard was not useful in enhancing instruction, compared with 42.9% of Moodle users who felt that way about Moodle (Munoz & Duzer, 2005).

Similar results were found for communication with the instructor: 46.2% of Blackboard users agreed or strongly agreed that Blackboard enhanced communication with the instructor, while 61.4% of Moodle users agreed or strongly agreed that Moodle enhanced communication (Munoz & Duzer, 2005). Fewer Moodle users than Blackboard users believed their system impeded instruction, with 38.5% of Blackboard users indicating that Blackboard somewhat or significantly impeded instruction compared with 7.1% of Moodle users saying the same for Moodle (Munoz & Duzer, 2005).

Moodle also did better than Blackboard in enhancing student communication. More than 71.4% of students who used Moodle felt it enhanced student-to-student communication, compared with 53.9% of students who used Blackboard (Munoz & Duzer, 2005).

This study by Munoz and Duzer is consistent with other literature indicating that communication is the key benefit of the CMS; it also indicates that free CMS software has equivalent or even superior potential to benefit both student and instructor. This leaves us with the question of how to measure that benefit.

The question of achievement is part of this study; however, the primary concern is the “accidental pedagogy” associated with self-regulation referred to by Morgan (2003). The focus is the relationship between self-regulation and use, with self-regulation measured as technology adoption on the part of participants.

The Concerns-Based Adoption Model-Levels of Use (CBAM-LoU) questionnaire is designed to describe the behaviors of innovation users through various stages, from orienting, to managing, and finally to integrating use of the technology. It is designed to be more quickly administered than the Stages of Concern Questionnaire (SoCQ), and CBAM-LoU does not focus on attitudinal, motivational, or other affective aspects of the user. The instrument is based on eight levels of use defined in the Levels of Use Chart (Loucks, Newlove, & Hall, 1975): (0) Non-Use, (I) Orientation, (II) Preparation, (III) Mechanical Use, (IVA) Routine, (IVB) Refinement, (V) Integration, and (VI) Renewal. The levels-of-use concept also applies to groups and entire institutions. Because the CBAM-LoU is a single-item survey, measures of internal consistency cannot be calculated for the data it gathers. However, test-retest reliability estimates generally fall in the range of .84 to .87 for elementary and secondary school teachers (Christensen, Parker, & Knezek, 2005, p. 189).
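To make the scoring concrete, the sketch below (Python; purely illustrative, with list and function names chosen by us rather than taken from the instrument or from this study) shows one way the eight published levels could be coded as an ordinal 0-7 scale for later statistical analysis:

    # Illustrative sketch only: coding self-reported CBAM-LoU levels as an
    # ordinal 0-7 scale for analysis. The level names follow the published chart;
    # the list and function names are hypothetical.
    CBAM_LOU_LEVELS = [
        "Non-Use",         # 0
        "Orientation",     # I
        "Preparation",     # II
        "Mechanical Use",  # III
        "Routine",         # IVA
        "Refinement",      # IVB
        "Integration",     # V
        "Renewal",         # VI
    ]

    def cbam_score(level_name: str) -> int:
        """Return the ordinal position (0-7) of a self-reported CBAM-LoU level."""
        return CBAM_LOU_LEVELS.index(level_name)

    print(cbam_score("Routine"))  # prints 4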

The Stages of Adoption of Technology instrument is a single-item survey used in both pre-service and in-service education to measure the impact of information technology training as well as trends over time. It was derived from the work of Russell (1995) assessing adults learning to use electronic mail. Russell's stages included: (1) awareness, (2) learning the process, (3) understanding the application of the process, (4) familiarity and confidence, (5) adaptation to other contexts, and (6) creative applications to new contexts. In the Stages of Adoption of Technology instrument (Christensen, 1997; Christensen & Knezek, 1999), the stage descriptions are generalized to make them appropriate for any information technology.

Because the Stages of Adoption of Technology instrument, like the CBAM-LoU, is a single-item survey, internal consistency reliability measures cannot be calculated for data gathered through it. However, high test-retest reliability estimates (.91 - .96) have been obtained from validation studies on Stages of Adoption (Christensen, Parker, & Knezek, 2005, p. 189).
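As an illustration of how a test-retest estimate of this kind can be obtained for a single-item measure, the sketch below (Python; the scores shown are invented for illustration and are not data from this study or the cited validation studies) correlates two administrations of the same instrument given to the same respondents:

    # Illustrative sketch only: test-retest reliability for a single-item measure,
    # estimated as the Pearson correlation between two administrations.
    # The scores below are hypothetical.
    from scipy.stats import pearsonr

    time1 = [3, 5, 4, 2, 6, 4, 5, 3, 4, 6]  # first administration
    time2 = [3, 5, 5, 2, 6, 4, 5, 3, 4, 6]  # same respondents, second administration

    r, p = pearsonr(time1, time2)
    print(f"test-retest reliability estimate: r = {r:.2f} (p = {p:.3f})")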

By the time ACOT research ended in 1998, Apple Classrooms of Tomorrow had been working with the National Science Foundation for more than eight years on professional development for teachers in different environments. In its entirety, the ACOT project was one of the largest and longest continuing educational studies of its kind.

Drs. Eva Baker, Joan Herman, and Maryl Gearhart of the Center for the Study of Evaluation at UCLA designed and implemented a three-year, cross-site study of ACOT. Student demographic and psychometric data were collected annually from participating districts, using subsets of the Iowa Tests of Basic Skills and other measures.

The project analysis (Baker, Herman, & Gearhart, 1988) indicated a progression of attitudinal change that the authors believed could be viewed as an evolutionary process similar to other models of educational change (Berman & McLaughlin, 1976; Giacquinta, 1973; Gross & Herriott, 1979), such as CBAM. This served as the basis for what would become labeled the stages of instructional evolution in the ACOT classrooms: Entry, Adoption, Adaptation, Appropriation, and Invention. In this model, text-based curriculum delivered in a lecture-recitation-seatwork mode is first strengthened through the use of technology and then gradually replaced by far more dynamic learning experiences for students (Dwyer, Ringstaff, & Sandholtz, 1991).

These instruments have been cross-validated against each other and found, when used in combination, to form a reliable, valid measure of the construct of technology integration that is stable across geographic location and time (Hancock, Knezek, & Christensen, 2007). Stages of Adoption of Technology, CBAM Levels of Use, and the ACOT stages of instructional evolution form a consistent self-report measure that has stable construct validity and aligns well with anticipated changes in educator attitudes as technology integration progresses.

It is the authors’ belief that these measures, when combined with frequency-of-use information, provide an accurate picture of self-regulation in terms of technology adoption that is useful for analyzing participants in course management systems.

Methods

This study examined differences in self-regulation as defined by technology adoption. The sample consisted of 20,000 teachers drawn from across the United States, of whom 1,078 responded. Of these 1,078 respondents, 307 were active participants in the free Hotchalk course management system, referred to as the Hotchalk Learning Environment (HLE). This response comprised 5.6% of the accessible population. Respondents represented 47 of the 50 US states, plus the District of Columbia.

The HLE is free community software for teachers, students, and parents. It includes curriculum management, lesson plan development, and automated assignment distribution, collection, and grading in a web-based environment. Teachers find, create, and share standards-aligned resources and best practices through the HLE interface. The HLE is accessed through an Internet browser and is available anywhere in the world there is an Internet connection. The HLE is funded through advertising, with individual schools able to control the type of advertisement displayed as well as the time of display. Currently, 100,000 schools are in the HLE database.

Participants were given a questionnaire consisting of the three instruments and several demographic questions including frequency of use of the Hotchalk CMS.

Analysis

Analysis consisted of examining levels of adoption for users versus non-users, adoption versus frequency of use, and technology adoption as a function of these factors in combination.

Technology Adoption and Use/Non-Use

The first step was to determine whether there was a difference in technology adoption between users and non-users of the Hotchalk Learning Environment (HLE). Analysis shows that Stages, CBAM, and ACOT are all significantly different at the p < .01 level for users versus non-users. Factor scores on technology adoption were produced for each teacher and included in the ANOVA (Table 1). The resulting means were -.08 for the group that did not use Hotchalk and .22 for the group that reported using Hotchalk. The magnitude of the difference between the two means is .22 – (-.08) = .30, relative to a standard deviation of 1.00 for the factor scores. An effect size of .30 would be considered educationally meaningful in most circles (Bialo & Sivin-Kachala, 1996).
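For readers who want to see the form of this analysis, the sketch below (Python; the file name and the column names factor_score and uses_hotchalk are hypothetical, and this is not the authors' analysis script) runs a one-way ANOVA of factor scores by use versus non-use and expresses the mean difference in standard deviation units:

    # Illustrative sketch of the analysis form only; the data file and the
    # column names (factor_score, uses_hotchalk) are hypothetical.
    import pandas as pd
    from scipy.stats import f_oneway

    df = pd.read_csv("survey.csv")
    users = df.loc[df["uses_hotchalk"] == 1, "factor_score"]
    non_users = df.loc[df["uses_hotchalk"] == 0, "factor_score"]

    f_stat, p_value = f_oneway(users, non_users)  # one-way ANOVA, two groups
    effect_size = (users.mean() - non_users.mean()) / df["factor_score"].std()
    print(f"F = {f_stat:.3f}, p = {p_value:.3f}, effect size = {effect_size:.2f}")

Because the factor scores are standardized to a standard deviation of roughly 1.00, the raw mean difference and the effect size are essentially the same quantity here.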

Table 1
Analysis of Variance in Technology Adoption among Users and Non-Users of HLE

                             Sum of Squares    df   Mean Square        F   Signif.
STAGES       Between Groups          8.375      1         8.375    6.366     0.012
             Within Groups        1086.693    826         1.316
             Total                1095.068    827
CBAM         Between Groups         27.412      1        27.412   17.392     0.000
             Within Groups        1303.458    827         1.576
             Total                1330.870    828
ACOT         Between Groups         11.276      1        11.276    9.542     0.002
             Within Groups         977.276    827         1.182
             Total                 988.552    828
REGR factor  Between Groups         13.716      1        13.716   13.916     0.000
score 1      Within Groups         814.145    816         0.986
             Total                 827.861    817

Technology Adoption and Frequency of Use

The second step was to determine whether there was an association between technology adoption scores and frequency of use of the Hotchalk Learning Environment (HLE) among those who reported using it. Among the 358 teachers who reported using the HLE, there was a significant positive correlation (p < .01) between extent of Hotchalk use and ratings on CBAM Levels of Use, ACOT teacher stages, and the composite factor scores of technology adoption (Table 2). There was not a significant association between Stages of Adoption and HLE use, although the association between Stages and level of Hotchalk technology integration implementation was also positive (r = .047, p = .34).
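The form of this correlational step can be sketched as follows (Python; the column names freq_hotchalk, stages, cbam, acot, and factor_score are hypothetical stand-ins for the survey variables, not the authors' actual variable names):

    # Illustrative sketch only; the data file and column names are hypothetical.
    import pandas as pd

    df = pd.read_csv("survey.csv")
    users = df[df["uses_hotchalk"] == 1]  # restrict to teachers who report using the HLE

    # Pearson correlations between frequency of HLE use and each adoption measure
    corr = users[["freq_hotchalk", "stages", "cbam", "acot", "factor_score"]].corr()
    print(corr.round(3))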

Table 2
Correlations between Technology Adoption Measures and Frequency of Use

                               FREQHOTU   STAGES     CBAM     ACOT   REGR factor score 1
FREQHOTU   Pearson Correlation    1        0.059    0.160    0.163    0.157
           Sig. (2-tailed)        -        0.280    0.003    0.003    0.004
           N                    358        340      340      339      339
STAGES     Pearson Correlation    0.059    1        0.700    0.578    0.876
           Sig. (2-tailed)        0.280    -        0.000    0.000    0.000
           N                    340        340      340      339      339
CBAM       Pearson Correlation    0.160    0.700    1        0.605    0.889
           Sig. (2-tailed)        0.003    0.000    -        0.000    0.000
           N                    340        340      340      339      339
ACOT       Pearson Correlation    0.163    0.578    0.605    1        0.836
           Sig. (2-tailed)        0.003    0.000    0.000    -        0.000
           N                    339        339      339      339      339
REGR       Pearson Correlation    0.157    0.876    0.889    0.836    1
factor     Sig. (2-tailed)        0.004    0.000    0.000    0.000    -
score 1    N                    339        339      339      339      339

** Correlation is significant at the 0.01 level (2-tailed)

 

Strength of Predictors

When technology integration implementation is analyzed as a function of (a) whether or not one reports using the HLE and (b) how much the HLE is used, if it is used, then most of the explained variance is attributable to whether or not use takes place (beta = .145; see Table 3), while frequency of use is not a strong predictor (beta = .067).
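A sketch of this two-predictor regression (Python with statsmodels; the predictor labels USEHOT2 and FREQHOTU follow Table 3, but the data loading and the outcome column name are hypothetical) is shown below:

    # Illustrative sketch only; the data file and the factor_score column are hypothetical.
    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("survey.csv")
    X = sm.add_constant(df[["USEHOT2", "FREQHOTU"]])  # use/non-use dummy plus frequency of use
    y = df["factor_score"]                            # composite technology adoption score

    model = sm.OLS(y, X, missing="drop").fit()
    print(model.summary())  # R-square, unstandardized coefficients, t values
    # To obtain standardized (beta) coefficients comparable to Table 3,
    # z-score the predictors and the outcome before fitting.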

 
Table 3
Strength of Predictors

Model Summary
Model       R    R Square   Adjusted R Square   Std. Error of the Estimate
1       0.189      0.036               0.031                   1.00681524
a. Predictors: (Constant), FREQHOTU, USEHOT2

Coefficients
                    Unstandardized    Std.     Standardized
Model               Coefficients B    Error    Coefficients Beta         t      Sig.
1   (Constant)             -0.573     0.168                         -3.404     0.001
    USEHOT2                 0.297     0.117               0.145      2.545     0.011
    FREQHOTU                0.0675    0.058               0.067      1.168     0.243
a. Dependent Variable: REGR factor score 1

 

Conclusions

Results of this study indicate a strong positive relationship between level of technology adoption and use of the Hotchalk Learning Environment, significant at the .01 level across all three instruments of assessment. There is also a strong positive relationship, significant at the .01 level, between frequency of use of the Hotchalk Learning Environment and level of technology adoption for two of the instruments (CBAM-LoU and ACOT). When modeled together, reported use of the Hotchalk Learning Environment accounted for roughly 15% of the variance in total technology adoption scores, significant at the .01 level.

These results indicate a strong relationship between use of the Hotchalk Learning Environment and level of technology adoption. This raises the question of whether the impact of the learning environment itself is the basis of the relationship, whether stronger adopters are simply more likely to embrace a system like the HLE, or perhaps both.

While it is tempting to leap toward the second explanation, there is intriguing evidence for the last. Figure 1 compares teacher responses regarding CMS impact on instruction in this study with responses from the Munoz and Duzer (2005) study and shows a dramatic difference in teacher attitudes toward a particular CMS and its ability to impact instruction.

Figure 1. Teacher responses to the question “Did the learning environment enhance instruction?” from this study and from the Munoz and Duzer (2005) study.
 

The authors are aware of the dangers inherent in drawing conclusions from comparisons of results across different studies, and such conclusions are beyond the scope of this document.

The current study establishes a strong relationship between use of the Hotchalk Learning Environment and high levels of teacher technology adoption. It also indicates that whether a teacher uses the system at all is a stronger predictor of a CMS user’s ability to self-regulate instruction than how frequently the system is used.

Further research is needed to validate these findings with other populations and CMS environments and to determine which factors lead to significant differences in CMS results and under what conditions. The implications of these data for guiding the design and implementation of CMS-based learning systems would be far-reaching.

References

Barge, J., & Loges, W. (2003). Parent, student, and teacher perceptions of parental involvement. Journal of Applied Communication Research, 31(2), 140-160.

Bialo, E. R., & Sivin-Kachala, J. (1996). The effectiveness of technology in schools: A summary of recent research. School Library Media Quarterly, 25(1).

Caruso, J. (2004). ECAR study of students and information technology, 2004: Convenience, connection, and control. Retrieved April 21, 2007, from http://www.educause.edu/ir/library/pdf/ERS0405/ekf0405.pdf

Christensen, R. (1997). Effect of technology integration education on the attitudes of teachers and their students. Doctoral dissertation, University of North Texas. [Online]. Available: http://courseweb.tac.unt.edu/rhondac

Christensen, R., & Knezek, G. (1999). Stages of adoption for technology in education. Computers in New Zealand Schools, 11(3), 25-29.

Devaney, T., & Hancock, R. (2006). Technology skills, availability, and anxiety of graduate students enrolled in online programs. Paper presented at the Mid-South Educational Research Association Annual Conference, Birmingham, Alabama, November 8, 2006.

Dwyer, D. (1994). Apple Classrooms of Tomorrow: What we’ve learned. Educational Leadership, April 1994.

Farmer, L. (2003). Facilitating faculty incorporation of information literacy skills into the curriculum through the use of online instruction. Reference Services Review, 31(4), 307-302.

Hall, G. E., Loucks, S. F., Rutherford, W. L., & Newlove, B. W. (1975). Levels of use of the innovation: A framework for analyzing innovation adoption. Journal of Teacher Education, 26(1).

Hodges, C. (2005). Self-regulation in web-based courses: A review and the need for research. Quarterly Review of Distance Education, 6(4), 375-384.

Loucks, S. F., Newlove, B. W., & Hall, G. E. (1975). Measuring levels of use of the innovation: A manual for trainers, interviewers, and raters. Austin, TX: SEDL.

Morgan, G. (2003). Faculty use of course management systems. Retrieved April 21, 2007, from http://www.educause.edu/ir/library/pdf/ERS0302/ekf0302.pdf

Munoz, K., & Duzer, J. (2005). Blackboard versus Moodle: A comparison of satisfaction with online teaching and learning tools. Retrieved April 20, 2007, from http://www.humboldt.edu/~jdv1/moodle/all.htm

Ruban, L., McCoach, B., McGuire, J., & Reis, S. (2003). The differential impact of academic self-regulatory methods on academic achievement among university students with and without learning disabilities. Journal of Learning Disabilities, 36(3), 268-284.

Terry, N. (2007). Assessing instruction modes for master of business administration (MBA) courses. Journal of Education for Business, 82(4), 220-226.

Whipp, J., & Chiarelli, S. (2004). Self-regulation in a web-based course: A case study. Educational Technology Research and Development, 52(4), 5-23.

Willett, H. (2002). Not one or the other but both: Hybrid course delivery using WebCT. The Electronic Library, 20(5), 413-419.

Zerr, R. (2007). A quantitative and qualitative analysis of the effectiveness of online homework in first-semester calculus. The Journal of Computers in Mathematics and Science Teaching, 26(1), 55-74.

Zimmerman, B. J. (1989). A social cognitive view of self-regulated academic learning. Journal of Educational Psychology, 81, 329-339.

About the Authors

Dr. Robert Jason Hancock is editor of Connexions for Educational Technology Leadership, the online technology in administration journal of the National Council of Professors of Educational Administration. Dr. Hancock is the first author of Cross-Validating Measures of Technology Integration: A First Step Toward Examining Potential Relationships Between Technology Integration and Student Achievement, which won the award for best research paper at the 2007 National Educational Computing Conference sponsored by the International Society for Technology in Education. Dr. Hancock is a regular author and presenter at research conferences worldwide and is a reviewer for ITDL.

Robert Hancock, Ph.D.
Department of Educational Leadership and Technology
Southeastern Louisiana University
robert.hancock@selu.edu
 

Dr. Gerald Knezek is President-Elect of the Society for Information Technology and Teacher Education (SITE). He is the founder of the American Educational Research Association (AERA) Special Interest Group TACTL. He has won four separate research paper awards from AERA and is second author on the above-mentioned work with Dr. Hancock. He is active in multiple worldwide initiatives and is currently working with UNESCO on ubiquitous computing. He is currently President of the Institute for the Integration of Technology into Teaching and Learning, an international organization aimed at improving the integration of technology in teaching and learning worldwide.

Gerald Knezek, Ph.D.
President-Elect: Society for Information Technology and Teacher Education
Department of Computer Education and Cognitive Systems
University of North Texas

gknezek@gmail.com

 