
Editor’s Note: Research studies are needed that investigate the quality of discussions in distance education colleges and universities. Today’s online instructors use discussion forums as a vital tool to meet course learning objectives, promote student collaboration on assignments, and enhance individual critical thinking skills. The authors have provided valuable insights into online dialogues that will be helpful to instructors and individuals involved in training and mentoring activities.

 

Assessing Discussion Forum Participation:
In Search of Quality

Stephen Corich, Kinshuk, and Lynn M. Hunt

 

Abstract

The flexibility that e-learning offers and the growing maturity of e-learning management systems have led to rapid growth in the acceptance of e-learning as a method of delivering educational and vocational training. The use of computer-mediated communication (CMC) tools, and in particular asynchronous discussion forums, as a means of promoting communication and collaboration between e-learning participants has led to growing interest among the academic and training communities in the pedagogical value of such tools.

This paper looks at the role of asynchronous discussion forums in e-learning and attempts to address the issue of the quality of interaction among discussion forum participants. A number of measurement models are investigated, and two of them are used to assess the quality of forum contributions for students participating in a first year undergraduate degree course. The paper concludes by attempting to identify areas where the models could be improved and discusses areas for future study.

The paper will be of interest to those who are involved in delivering e-learning courses and who would like to use discussion forums as a possible assessment tool. It will also be of value to learners who choose to enroll in distance learning courses and who are asked to participate in assessed discussion forum debate.

Key words: e-learning, collaboration, discussion forums, content analysis, cognitive communication.
 

Introduction

Online discussion forums are now regularly used as a component of distance education courses in tertiary education as a means of promoting interaction between course participants (Spatariu, Hartley & Bendixen, 2004). Discussion forums create an environment similar to the face-to-face classroom, where knowledge can be critically constructed, validated and shared (Kanuka & Anderson, 1998). As the use of discussion forums has grown, an increasing number of researchers have attempted to produce models that measure and analyse the networked conversations produced (Campos, 2004).

This paper presents the results of using two popular discourse analysis methods for evaluating higher order learning and knowledge building in an assessed discussion forum. The forum was used as a teaching tool in a traditionally presented course that was conducted as part of a first year undergraduate computing programme.

The paper looks briefly at the growth in the use of discussion forums in the academic environment and refers to literature supporting the use of computer-mediated communication (CMC). The paper then investigates a number of models used to measure activity within discussion forums. Two of the models are then the focus of extensive scrutiny. The first is based on the model proposed by Henri (1992) and modified by Hara, Bonk & Angeli (2000). The second was developed by Garrison, Anderson and Archer (2001). These two models were then used to analyse the activity of a discussion forum. The results are presented and discussed, and finally recommendations are made on how the models could be modified to better measure levels of critical thinking in an assessed discussion forum environment.

Background

The increasing popularity of the Internet and its ability to provide seemingly transparent communication between different computing platforms has simplified the process of providing learning opportunities to remotely located learners. The rapid expansion in the use of distance education in the postsecondary education setting is well documented (Spatariu, Hartley & Bendixen, 2004; Green, 2000). The growing maturity of learning management systems (LMS) and the increased sophistication of the communication tools within these systems have made academic and vocational training practitioners aware that many of the teaching practices available in face-to-face delivery can be duplicated online (Kang, 1998; Rice, 1989).

Computer-mediated communication (CMC) is now used by almost everyone in distance education training (Garrison, 2000) and comprises various forms of electronic communication, including synchronous chat, audio and video, and asynchronous conferencing, email, and file exchange.

Support for the use of discussion forums in distance education is widespread. Discussion forums are said to allow students to see different perspectives, which can help to foster new meaning construction (Heller & Kearsley, 1996; Ruberg et al., 1996). Discussion forums encourage student ownership of learning and collaborative problem-solving skills (Becker, 1992). They encourage participants to put their thoughts into writing in a way that others can understand, promoting self-reflective dialogue and dialogue with others (Valacich, Dennis, & Connolly, 1994). Discussion forums have the potential to expose students to a broader range of views than face-to-face talk, and hence enable them to develop more complex perspectives on a topic (Prain & Lyons, 2000).

A number of different approaches have been attempted to identify quality in online discussions. Spatariu, Hartley & Bendixen (2004), having reviewed current literature, suggest that the majority of studies can be loosely categorized into one of four categories, according to the construct being measured: levels of disagreement; argument structure analysis; interaction-based; and content analysis.

Studies belonging to the level of disagreement category adopt the approach of coding messages according to the level of disagreement exhibited in relation to previous postings. Researchers who have used this method include Marttunen (1998) and Nussbaum, Hartley, Sinatra, Reynolds & Bendixen (2002). Marttunen (1998) looked at the relationship between personality variables, such as anxiety and extraversion, and argumentative interaction in email messages. Nussbaum et al. (2002) adopted a similar approach when looking at students’ postings to an online discussion forum. The coding scheme used in both studies was based upon the observed willingness of students to disagree with their peers.

The argument structure analysis category codes messages according to the argument quality demonstrated by participants. Researchers that have adopted this approach include Inch & Warnick (2002) and Veerman, Andriessen & Kanselaar (1999). Inch and Warnick (2002) coded arguments into four categories according to the degree of complexity in the argument structure while Veerman et al. (1999) used a combination of argument and content analysis, classifying messages in terms of information exchange.

Interaction-based coding methods place an emphasis on the message as part of a larger discussion. Schaeffer, McGrady, Bhargava & Engel (2002), Järvelä & Häkkinen (2002) and Nurmela, Lehtinen & Palonen (1999) have adopted this approach. Schaeffer et al. (2002) developed five exchange categories and coded postings according to their level of relatedness and agreement. Järvelä and Häkkinen (2002) used two different classifications of messages to analyse multiple perspectives. Nurmela et al. (1999) used a three-dimensional social network analysis to study the structure of documents and the connections between them.

The content analysis approach codes messages according to the message type. A review of the literature suggests that content analysis is the most popular approach used by researchers to evaluate quality in discussion forum postings. The more commonly cited researchers include Henri (1992), Gunawardena, Lowe & Anderson (1997), Newman, Webb & Cochrane (1995), Garrison, Anderson & Archer (2000) and Hara, Bonk & Angeli (2000). Henri (1992) developed an analytical model that highlights five dimensions of the learning process that can be found in messages. Gunawardena et al. (1997) introduced a model of analysis to assess the social construction of knowledge and collaborative learning. Newman et al. (1995) developed an analytical method for the study of critical thinking, which presented a list of indicators of critical thinking. Hara et al. (2000) used a content analysis approach, based largely on Henri’s (1992) cognitive and metacognitive dimensions, to support the investigation of quality online discussions. Garrison et al. (2000) assessed inquiry capabilities as well as critical thinking through a three-dimensional model which measured cognitive presence, teaching presence, and social presence.

A review of current literature indicates that the methodologies adopted by Henri (1992) and Garrison, Anderson & Archer (2000) are two of the most popular content analysis approaches. These two methodologies have been either duplicated or incorporated into models developed by other researchers.

Henri (1992) identified the following five dimensions which can be used to evaluate CMC: participative, social, interactive, cognitive and metacognitive. The cognitive and metacognitive dimensions measure reasoning, critical thought and self-awareness, and as such are more likely to be of interest when attempting to reward participants for assessed discussion forum contributions. The coding system used by Henri was not clearly defined in her research, but it was used as the basis of subsequent research conducted by Hara, Bonk & Angeli (2000). The cognitive and metacognitive components of the Hara, Bonk & Angeli analysis framework were well defined, and they were chosen for this research.

Garrison et al. (2000) developed a ‘community of inquiry’ model which assumes that learning occurs through the interaction of three core components: cognitive presence, teaching presence, and social presence. Cognitive presence is defined by Garrison et al. (2000) as “the extent to which the participants in any particular configuration of a community are able to construct meaning through sustained communication”. Social presence covers those contributions from students or tutors that promote the creation of a dynamic group, including social relationships, expressions of emotion, and affirmation messages. Teaching presence considers the interactions of teachers and students as they formulate questions, present ideas and answer questions. The cognitive presence concept was expanded by Garrison, Anderson & Archer (2001) into a four-stage cognitive-processing model, which was used to assess critical thinking skills in on-line discussions. The model classified student responses into triggering, exploration, integration and resolution categories. The framework for the model was well documented, and it was chosen as the second model for the research.

Methodology

The research in this study is ethnographic in nature, given its small sample size and lack of statistical testing. It was designed as a preliminary exercise for a larger research project that will use a larger sample across a variety of institutes, utilizing intelligent software to perform the content analysis coding. The research was conducted to allow the researcher to become familiar with two of the most popular quantitative content analysis models and to attempt to identify whether the models could be applied to determine the level of critical thinking of individual students. The research also aims to investigate the attitudes of students toward using online discussions as an assessment tool in an environment which combines online learning elements with face-to-face learning elements. Such environments are commonly referred to as blended learning environments.

The transcripts for the discussion forum were compiled into a single document, and the document was surveyed in an attempt to identify what to use as the unit of analysis. Having established how the majority of postings were structured, and following the advice of Campos (2004), it was decided to use the sentence as the human cognitive unit of analysis. The compiled document was then split into sentences, which were hand-coded against the two models by the course instructor and another instructor who had delivered the course on a previous occasion.
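
To make the segmentation step concrete, the following Python sketch illustrates one way a compiled transcript might be split into sentence-level units and prepared for hand-coding. The splitting rule and the record layout are illustrative assumptions, not the authors’ actual procedure.

```python
import re

def split_into_sentences(transcript):
    """Split a compiled forum transcript into sentence-level units of analysis."""
    # Naive split on sentence-ending punctuation followed by whitespace;
    # a real transcript would need handling for abbreviations, URLs, etc.
    return [s for s in re.split(r'(?<=[.!?])\s+', transcript.strip()) if s]

# Each unit is held as a record awaiting a category under each model
# (None until a coder assigns one).
transcript = "Bandwidth demand keeps rising. Will wireless replace cable? I doubt it."
units = [
    {"sentence": s, "garrison_code": None, "hara_code": None}
    for s in split_into_sentences(transcript)
]
print(len(units))  # 3 sentences ready for hand-coding
```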

At the completion of the discussion in the forum, access statistics were generated by the Blackboard learning management system, and students were interviewed to establish their reactions to the exercise.

The Course

The research was conducted during the second semester of a first year undergraduate degree course. All the students were enrolled in a computing systems degree and as such were familiar with using information technology. The course was an introductory data communications and networking class that was delivered using a blended learning environment, combining traditional face-to-face activities with web publishing, on-line review and discussion forum activities. On-line activity, which included publishing the results of a research project, evaluating the work of peers and participating in a discussion forum, formed a significant part of the course. The use of the discussion forum was seen as a way to encourage participation, as well as to provide a tool to promote extended discussion of a topic that was a key component of the course curriculum. Previous offerings of the course covered the same topic, the future of data communications, in a normal classroom setting, using face-to-face discussion lasting at most two hours. Using the discussion forum approach, students were allowed three weeks to participate in on-line discussion.

The software used to support the discussion forum was an integral part of the Blackboard learning management system. All students had previously used Blackboard to retrieve course materials and to participate in on-line tests in their earlier courses; however, none of the students had participated in discussion forums during their previous academic study.

The class consisted of fifteen students, three females and twelve males, aged between 18 and 38 and of varying academic abilities. Students were given the topic for the discussion early in the course, and instructions were provided as to what was expected in the discussion forum. The instructions were given as a guide to encourage higher-level critical thinking. The student postings were monitored by an instructor who provided encouragement, added pedagogical comments and offered reinforcement and expert advice.

Findings and Analysis

During the three weeks that the discussion forum was operational, a total of 104 posts were made, 30 of them by the course instructor. Once the instructor postings were removed, the remaining 74 posts generated 484 sentences for coding.

Participation in the forum varied: almost 35% of postings were made by the three female class members, and the six class members over the age of twenty-five accounted for 63% of postings. In the under twenty-five age group, one class member took no part in the discussion forum activities, and the Blackboard software indicated no activity in the discussion forum area, while another, who made no postings, evidently read them, as Blackboard indicated significant activity in the forum area. Two students under twenty-five made only a single posting each; however, monitoring software indicated significant activity for both.

Before coding the entire transcript, both instructors looked at the first 100 sentences and agreed on how the sentences should be coded against the two models. Once the entire transcript had been coded, the results of the transcript analysis for the two instructors were evaluated to establish the level of agreement that existed, using the coefficient of reliability developed by Holsti (1969). The coefficient is a percentage agreement measure, calculated by totaling the number of agreements between coders and dividing by the total number of coding decisions. The coefficient of agreement was 87% using the Garrison et al. model and 81% using the Hara et al. model. Once the coding was completed, tables were produced showing the average score for the two coders for each category of the two models.
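
When both coders classify the same set of sentences, the agreement measure reduces to a simple ratio. The following Python sketch, using invented coder output, illustrates the percentage-agreement calculation as described above; it follows this paper’s definition rather than any published implementation.

```python
def holsti_agreement(codes_a, codes_b):
    """Percentage agreement: matching decisions divided by total decisions."""
    assert len(codes_a) == len(codes_b), "coders must rate the same units"
    agreements = sum(a == b for a, b in zip(codes_a, codes_b))
    return agreements / len(codes_a)

# Hypothetical categories assigned by two coders to six sentences
coder1 = ["exploration", "integration", "triggering", "integration", "resolution", "exploration"]
coder2 = ["exploration", "integration", "exploration", "integration", "resolution", "exploration"]
print(f"{holsti_agreement(coder1, coder2):.0%}")  # 83% (5 of 6 decisions match)
```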

Tables 1 and 2 present summary information on how the 484 sentences were classified using the Garrison et al. and Hara et al. models respectively.

Table 1: Classification of the 484 sentences using the Garrison et al. (2001, pp. 15-16) model

| Category | Indicators | Number of Sentences | Percent of Total Sentences |
|---|---|---|---|
| 1. Triggering | Recognizing the problem; sense of puzzlement | 73 | 15.08% |
| 2. Exploration | Divergence within online community; divergence within single message; information exchange; suggestions for consideration; brainstorming; leaps to conclusions | 124 | 25.62% |
| 3. Integration | Convergence among group members; convergence within a single message; connecting ideas, synthesis; creating solutions | 209 | 43.18% |
| 4. Resolution | Vicarious application to real world; testing solutions; defending solutions | 58 | 11.98% |
| Not categorised | | 20 | 4.13% |
| Total | | 484 | 100.00% |

Coders found the Garrison et al. model the easier of the two to code, with a higher coefficient of agreement and a lower number of uncategorised sentences. The model indicated a small number of triggering questions, since students were encouraged to discuss and build on the ideas of others. More than 68% of the contributions were in the exploration and integration categories, which probably reflects the fact that this was a first year undergraduate course. About 12% of sentences were classified in the resolution category, and most of the contributions in this category were from a mature student who had recent industry experience in the topic subject.

Table 2: Classification of the 484 sentences using the Hara et al. (2000) model

| Reasoning Skills | Indicators | Number of Sentences | Percent of Total Sentences |
|---|---|---|---|
| Elementary clarification | Identifying relevant elements; reformulating the problem; asking a relevant question; identifying previously stated hypotheses; simply describing the subject matter | 63 | 13.02% |
| In-depth clarification | Defining the terms; identifying assumptions; establishing referential criteria; seeking out specialized information; summarizing | 121 | 25.00% |
| Inferencing | Drawing conclusions; making generalizations; formulating a proposition which proceeds from previous statements | 145 | 29.96% |
| Judgment | Judging the relevance of solutions; making value judgments; judging inferences; "I agree, disagree..." | 63 | 13.02% |
| Application of strategies | Making decisions, statements, appreciations, evaluations and criticisms; sizing up | 53 | 10.95% |
| Not categorized | | 39 | 8.06% |
| Total | | 484 | 100.00% |

The Hara et al. model, with five categories compared to the Garrison et al. model’s four, was the harder to code: the coefficient of agreement was lower and the number of uncategorized sentences was higher. The model indicates that the majority of responses to the forum (54.96%) were in the in-depth clarification and inferencing categories, indicative of students responding to and building on ideas identified by others.

It was interesting to note that both models indicated similar levels of evidence of knowledge construction and similarly low levels of synthesis and real world application. Both models reflected the preference of students to build and expand on ideas suggested by others.

Even though the research was conducted with first year undergraduate students, the results indicate reasonable levels of knowledge construction and evidence of critical thinking that are comparable to research involving graduate students conducted by Meyer (2004). This higher than expected level of cognitive ability appeared to be a consequence of the exceptional quality of the first posting, which set the tone for the remainder of the discussion.

When questioned about the experience of using discussion forums as a way of discussing topics relevant to the course prescription, thirteen of the fifteen students stated that they believed it was a worthwhile exercise. Several indicated that the process was very time consuming, but also said that the three week period gave them time to think about the topic and conduct research to assist with their postings. Students also stated that they found the discussion forum to be “addictive”, creating a desire to continually check whether their postings had induced a response from the instructor or their fellow students. All those who participated suggested that they had increased their knowledge as a result of the exercise and would be happy to participate in a similar exercise in the future.

Results and Recommendations

The research indicates that both models provide a useful tool for measuring the quality of student participation within an online discussion forum. Both gave a measure of the level of critical thinking and knowledge construction. When applied to contributions from individual students, both models were able to distinguish between those who were contributing at higher levels and those who displayed little evidence of critical thinking.

Attempting to evaluate the relative merits of each model was not really part of this research, and the fact that both models indicated similar patterns of critical thinking would suggest that both have their place in the field of discourse analysis. The coding exercise for both models was time consuming, and the coefficients of agreement between coders suggest that the classification of messages is open to individual interpretation. If the models were to be applied to a larger population with a significantly larger number of postings, then some form of automatic coding system would need to be considered. Such a tool would need to be efficient, reliable, valid and practical, as illustrated in the sketch below.
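
As a purely illustrative sketch of what the simplest form of automatic coding might look like, the following Python fragment matches each sentence against keyword lists for the Garrison et al. categories. The keyword lists are invented for illustration; a tool meeting the efficiency, reliability and validity requirements above would need a properly trained classifier validated against human coders.

```python
# Invented keyword lists standing in for the Garrison et al. categories;
# these are illustrative only, not a validated coding instrument.
KEYWORDS = {
    "triggering": ["why", "how", "problem", "?"],
    "exploration": ["perhaps", "maybe", "what if", "another idea"],
    "integration": ["therefore", "agree", "building on", "in summary"],
    "resolution": ["apply", "test", "in practice", "solution"],
}

def auto_code(sentence):
    """Assign the category whose keywords best match the sentence."""
    text = sentence.lower()
    scores = {cat: sum(kw in text for kw in kws) for cat, kws in KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "not categorised"

print(auto_code("Perhaps wireless is another idea worth exploring."))  # exploration
```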

While the use of content analysis as a mechanism for measuring quality was shown to have merits, neither of the models tested gave any indication of how the critical thinking had been applied to the subject domain that the forum was discussing. As such, the models as they have been presented would require modification to be used as a tool to grade student performance in an assessed discussion forum relating to a particular subject domain area.

Student reaction to using discussion forums as a component within the traditional face-to-face teaching environment would suggest that, for students who are familiar with technology, the exercise can enhance the learning process. The evidence of this research would also suggest that the discussion forum mechanism better suits relatively mature learners who have a desire to learn and take responsibility for knowledge construction. Discussion forums may also disadvantage students who have poor written communication skills and students participating in a forum where the language used is not their first language.

There are several areas that were not investigated as part of this research which have been identified as worthy of further investigation. These include the impact that familiarity with technology has on the use of discussion forums; the role of assessment in the use of discussion forums; the impact of the instructor in leading or encouraging discussion; the use of triggers to promote discussion; and the use of students in moderating discussion content. Other areas worthy of study are the impact on learning for those who read but do not post, commonly referred to as “ropers” or “lurkers”; the impact of negative or zero responses to first postings; and the effect of time on the quantity and quality of responses as a forum progresses.

References

Becker, H. (1992). A model for improving the performance of integrated learning systems: Mixed individualized/group/whole class lessons, cooperative learning, and organizing time for teacher-led remediation of small groups. Educational Technology, 32, 6-15.

Campos, M. (2004, April). A constructivist method for the analysis of networked cognitive communication and the assessment of collaborative learning and knowledge building. Journal of Asynchronous Learning Networks, 8(2), 1-29.

Garrison, D. R. (2000). Theoretical challenges for distance education in the twenty-first century: A shift from structural to transactional issues. International Review of Research in Open and Distance Learning, 1(1). Available: http://www.icap.org/iuicode?149.1.1.2

Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2), 87-105.

Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence, and computer conferencing in distance education. The American Journal of Distance Education, 15(1), 7-23.

Green, J. (2000). The online education bubble. The American Prospect, 11(22), 32-35.

Gunawardena, C. N., Lowe, C. A., & Anderson, T. (1997). Analysis of a global online debate and the development of an interaction analysis model for examining social construction of knowledge in computer conferencing. Journal of Educational Computing Research, 17(4), 397-431.

Hara, N., Bonk, C. J., & Angeli, C. (2000). Content analysis of online discussion in an applied educational psychology course. Instructional Science, 28, 115-152.

Heller, H., & Kearsley, G. (1996). Using a computer BBS for graduate education: Issues and outcomes. In Z. Berge & M. Collins (Eds.), Computer-mediated communication and the online classroom (Vol. III: Distance learning, pp. 129-137). NJ: Hampton Press.

Henri, F. (1992). Computer conferencing and content analysis. In A. R. Kaye (Ed.), Collaborative learning through computer conferencing: The Najaden Papers (pp. 116-136). Berlin: Springer-Verlag.

Holsti, O. (1969). Content analysis for the social sciences and humanities. Don Mills, ON: Addison-Wesley.

Inch, E. S., & Warnick, B. (2002). Critical thinking and communication. Boston: Allyn & Bacon.

Järvelä, S., & Häkkinen, P. (2002). The levels of Web-based discussions—Using perspective-taking theory as an analysis tool. In H. Van Oostendorp (Ed.), Cognition in a digital world (pp. 77-95). Mahwah, NJ: Erlbaum.

Kang, I. (1998). The use of computer-mediated communication: Electronic collaboration and interactivity. In C. J. Bonk & K. S. King (Eds.), Electronic collaborators: Learner-centered technologies for literacy, apprenticeship, and discourse (pp. 315-337). Mahwah, NJ: Erlbaum.

Kanuka, H., & Anderson, T. (1998). Online social interchange, discord, and knowledge construction. Journal of Distance Education, 13(1), 57-74.

Marttunen, M. (1998). Electronic mail as a forum for argumentative interaction in higher education studies. Journal of Educational Computing Research, 18, 387-405.

Meyer, K. A. (2004, April). Evaluating online discussions: Four different frames of analysis. Journal of Asynchronous Learning Networks, 8(2), 101-114.

Newman, D. R., Webb, B., & Cochrane, C. (1995). A content analysis method to measure critical thinking in face-to-face and computer supported group learning. Interpersonal Computing and Technology, 3(2), 56-77.

Nurmela, K., Lehtinen, E., & Palonen, T. (1999). Evaluating CSCL log files by social network analysis. In C. Hoadley & J. Roschelle (Eds.), Proceedings of the Third Conference on Computer Supported Collaborative Learning (pp. 443-444). Stanford, CA: Stanford University.

Nussbaum, E. M., Hartley, K., Sinatra, G. M., Reynolds, R. E., & Bendixen, L. (2002, April). Enhancing the quality of on-line discussions. Paper presented at the annual meeting of the American Educational Research Association, New Orleans, LA.

Palloff, R. M., & Pratt, K. (2001). Lessons from the cyberspace classroom: The realities of online teaching. San Francisco: Jossey Bass.

Prain, V., & Lyons, L. (2000). Using information and communication technologies in English: An Australian perspective. In A. Goodwyn (Ed.), English in the digital age. London: Cassell Education.

Rice, R. E. (1989). Issues and concepts in research on computer-mediated communication systems. In J. Anderson (Ed.), Communication Yearbook 12 (pp. 436-476). Beverly Hills, CA: Sage Publications.

Ruberg, L., Moore, D., & Taylor, D. (1996). Student participation, interaction, and regulation in a computer-mediated communication environment: A qualitative study. Journal of Educational Computing Research, 14(3), 243-268.

Schaeffer, E. L., McGrady, J. A., Bhargava, T., & Engel, C. (2002, April). Online debate to encourage peer interactions in the large lecture setting: Coding and analysis of forum activity. Paper presented at the annual meeting of the American Educational Research Association, New Orleans, LA.

Spatariu, A., Hartley, K., & Bendixen, L. D. (2004, Spring). Defining and measuring quality in on-line discussion. Journal of Interactive Online Learning, 2(4).

Valacich, J., Dennis, A., & Connolly, T. (1994, March). Idea generation in computer-based groups: A new ending to an old story. Organizational Behavior and Human Decision Processes, 57, 448-467.

Veerman, A. L., Andriessen, J. E. B., & Kanselaar, G. (1999). Collaborative learning through computer-mediated argumentation. In C. Hoadley & J. Roschelle (Eds.), Proceedings of the Third Conference on Computer Supported Collaborative Learning (pp. 640-650). Stanford, CA: Stanford University.

 

Steve Corich

 

Steve Corich is Principal Lecturer in Information Technology at the Eastern Institute of Technology, Hawke's Bay, New Zealand. He is currently pursuing a PhD in the area of models of assessment in asynchronous interaction environments.

 

Kinshuk

 

Kinshuk is Director of the Advanced Learning Technology Research Centre and Associate Professor in the Information Systems Department at Massey University, New Zealand. He has published over 130 research papers in international refereed journals, conference proceedings and book chapters. He is currently chairing the IEEE Technical Committee on Learning Technology and the International Forum of Educational Technology & Society. He is also editor of the SSCI-indexed Journal of Educational Technology & Society (ISSN 1436-4522).

 


Lynn Hunt
 

Lynn Hunt is a Senior Lecturer at Massey University, New Zealand. She has a PhD in learning psychology and has been working and researching in the area of e-learning since 1984. She was the Director of the PC-120 project to upgrade the government-owned flight training school (Curug) in Indonesia, the largest curriculum design and development project ever undertaken by Massey University.

 

 