January 2006

Editor’s Note: Threaded discussions are one of the principal dynamic tools used in eLearning. This exploratory study uses tools for formative evaluation and analysis to examine theoretical constructs and mechanisms to enhance the design, application, evaluation, and optimization of threaded discussions for library science students. Some findings, such as assigning a moderator or using subject lines as advance organizers, have immediate value for all disciplines. Other findings require additional research to determine applicability to other subject matter and learner populations.

Mapping MLIS Asynchronous Discussions

Barbara A. Frey, Millie S. Sass, Susan W. Alman


Asynchronous discussions offer a convenient, flexible communication forum to actively engage students and enhance learning. This paper describes a formative assessment process of mapping discussions to analyze group interaction and critical thinking skills in a graduate-level Library and Information Science course. Discussions of various depth, breadth, and complexity were mapped, beginning with the initial or parent posting and branching to include all student responses within a thread. Postings were analyzed with a content analysis tool to identify statements according to the level of cognitive skill, questions, reflections, and affirmations. In general, open-ended questions solicited contributions at higher cognitive levels. The depth of ongoing responses in a discussion did not necessarily lead to higher levels of thinking. Recommendations include training students to manage the discussion threads they begin, to write clear, concise contributions, and to use subject lines as advance organizers for each posting or reply. In addition, facilitators should establish clear expectations, summarize forums, and model effective online communication strategies.

Keywords: asynchronous, discussion, online, distance education, Bloom’s taxonomy, content analysis, formative assessment, formative evaluation, critical thinking


Many Web-based courses rely on asynchronous discussions as a form of computer-mediated communication to enhance learning. Online asynchronous discussions support high levels of thinking and interaction in a convenient, flexible environment (Berge, 2002; DeArment, 2002). Instructors often structure their distance education courses into weekly lessons or modules and discuss the course concepts in the discussion forums. Weekly discussions may include responses to instructor questions, case studies, guest speakers, debates, or Web sites. There are learning benefits when students actively engage in online discussions. In postings, students have the opportunity to reflect on their understanding of the course content and develop substantive comments that contribute to the community learning environment (Allan, 2004; Flynn & Polin, 2003; Tu & Corry, 2003). Many instructors report that online discussions especially benefit shy or international students by allowing them time to clarify and develop their remarks (Chickering & Ehrmann, 1996; Warschauer, 1996). However, “simply requiring students to post messages to address the instructor’s questions may not result in effective learning” (Tu & Corry, 2003, p. 304). Effective discussions require thoughtful design, facilitation, and assessment. A detailed analysis of course discussions can help faculty to identify best practice strategies and refine areas that need improvement.


The purpose of this study was to analyze group interaction and higher order thinking in online asynchronous discussions in a graduate-level Library and Information Science course. The study population comprised all students who contributed to the threads made available to us by the course instructor. The threads we selected for analysis were triggered by open-ended, discussion-type questions; they were not of the type that required an explicit response or submission of assignments. For purposes of this study we used the terms higher order thinking and critical thinking interchangeably. In general, higher order thinking skills can be defined as the top four levels of Bloom’s Taxonomy (1956).

Specific discussions were analyzed with maps that provided a visual diagram to show the depth and breadth of the online discussion threads. Each map began with the assignment (or the “parent” posting that initiated the discussion) and linked the network of postings. Maps allowed both the instructor and students to see the richness of the discussions through the number of postings linked to each comment or question. For our study, we reviewed discussion threads for the assignments or questions that began the dialogue, and the levels of thinking according to Bloom’s Taxonomy (Bloom, 1956). The course instructor began this analysis as a “snapshot” or formative assessment to review the effectiveness of her asynchronous discussions; however, the analysis model and recommendations may be of interest to other faculty.

In particular, our goal was to respond to the following questions:

  • Do higher levels of cognitive skill emerge in deeper, broader, or more complex threads?

  • What changes in contributors’ actions will make a thread easier to follow and will contribute to critical thinking?

We used the mapping approach to begin our analysis of discussion threads. To map a thread or portion of a thread, we converted a text list (the subject lines) into a flow chart showing the relationship between initiating and reply postings. It is a labor-intensive approach; we do not suggest that it be used for every analysis, but it met our needs for formative assessment. Note that our purpose was to investigate contributions to discussion threads; we focused on threads triggered by open-ended questions, but we did not delve into the deep and complex genre of question construction.
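The conversion step can be sketched in a few lines of code. The following Python fragment is illustrative only: the posting IDs, subject lines, and reply relation are invented, not taken from our course data. It turns a flat list of postings into the kind of indented tree we drew by hand.

```python
from collections import defaultdict

# Each posting records its own id, the id of the posting it replies
# to (None marks the initiating "parent" posting), and a subject line.
postings = [
    ("p1", None, "Week 6 question"),
    ("p2", "p1", "Re: Week 6 question"),
    ("p3", "p1", "A public-library perspective"),
    ("p4", "p2", "Re: Re: Week 6 question"),
]

# Invert the reply-to relation into a parent -> children table.
children = defaultdict(list)
for pid, parent, subject in postings:
    children[parent].append((pid, subject))

def thread_lines(parent=None, level=0):
    """Walk the tree depth-first, indenting one step per reply level."""
    lines = []
    for pid, subject in children[parent]:
        lines.append("    " * level + f"{pid}: {subject}")
        lines.extend(thread_lines(pid, level + 1))
    return lines

print("\n".join(thread_lines()))
```

The resulting outline makes the initiator-and-reply relationships explicit, which is exactly the information we found hard to read off the discussion board’s own view.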

Literature Review

The benefits of online asynchronous discussions are well documented in the literature. Web-based discussions are convenient because they are not time or place dependent. Not only can students respond at their own pace, they all have an equal opportunity to express themselves (Palloff and Pratt, 2002; Peters, 2000). Discussion boards provide a permanent record (Meyer, 2004; Clouse, 2003) of interaction that is easy to archive and search. In addition, asynchronous discussions are collaborative, which allows for a social construction of knowledge (DeArment, 2003; Thomas, 1999). Positive learning outcomes are also attributed to the thoughtful reflection required in composition of postings (Allan, 2004; Flynn & Polin, 2003; Tu and Corry, 2003). Writing-to-learn literature describes writing as a way to reflect, analyze, and communicate important ideas and concepts to others. Elbow (1994) wrote, “Students understand and retain course material much better when they write copiously about it” (p. 4).

Student reflection is prompted by questions that serve one of two functions—they are either centering (questions that promote convergent thought) or expanding (questions that promote divergent thought) (Hunkins, 1972). In an asynchronous learning environment, Blanchette (2001) found that higher-level questions elicited responses at higher cognitive levels and more interaction among learners than lower-level questions did. Muilenburg and Berge (2002) agreed and wrote, “The level of student thinking is directly proportional to the level of questions asked” (n.p.). The following six levels of Bloom’s Taxonomy (Bloom, 1956) are commonly used to categorize discussion questions from lower to higher cognitive skills: (1) knowledge, (2) comprehension, (3) application, (4) analysis, (5) synthesis, and (6) evaluation. The later categories represent higher levels of thinking (Davis, 1993; Hyman, 1979).

The benefits of online discussions have led several researchers to further explore student interaction and develop models and tools for online discussion analysis. There are several types of analysis, including frequency of postings and content analysis. Content analysis studies have been qualitative and explore issues such as problem solving or critical thinking (Rourke and Anderson, 2004; Meyer, 2004; Garrison et al., 2001; Angeli, Bonk, and Hara, 1998). Quantitative studies focus on measures such as frequency of postings, which may include the number of threads per forum, the number of postings per thread, or the number of instructor postings per thread (Mazzolini and Maddison, 2003; Monroe, 2003; Bullen, 1998; Ahern, Peck and Laycock, 1992). Allan (2004) noted the need for an analysis tool that reviewed the process of knowledge development as a whole. This paper includes a description of three content analysis models which were synthesized to create the instrument used in our analysis.

Archer et al. (2001) analyzed critical thinking, or cognitive presence, in online discussions in higher education courses. They explained that “cognitive presence is grounded in literature on critical thinking” (n.p.) and is defined as the extent to which learners are able to construct and confirm meaning through reflection and discourse. The following four phases were used to analyze each discussion posting: (1) a triggering event in which an issue or problem emerges; (2) an exploration phase of brainstorming, questioning, or exchanging information; (3) an integration phase in which shared meaning is constructed within the community of inquiry; and (4) a resolution phase of closure or action taken to resolve the issue or problem posed.

Fahy (2002) noted that the four-phase analysis tool developed by Archer et al. (2001) operates at the level of the whole message. He and his colleagues “offer a different approach, focusing on content and interaction patterns at the component level of the transcript: the sentences of which it was composed” (n.p.). His sentence-level analysis tool is called the Transcript Analysis Tool (TAT). The TAT categories are: (1) vertical and horizontal questions, (2) referential and non-referential statements, (3) reflections, (4) scaffolding and engaging comments, and (5) quotations, paraphrases, and citations. In the first category, vertical questions have a “correct” answer and horizontal questions do not. In the second category, referential statements make reference to the preceding statement, whereas non-referential statements do not invite response; their main intent is to give information. Scaffolding and engaging comments, the fourth category, are intended to initiate, continue, or acknowledge interaction. Quotations, paraphrases, and citations, the fifth category, draw on outside texts and give credit to those sources.

Jeong (2003) used a content analysis coding system to analyze online discussions. The coding system consisted of the following twelve categories: (1) position statements, (2) disagreement statements, (3) agreement statements, (4) arguments, (5) personal experiences, (6) literature, (7) formal data, (8) personal or hypothetical actions, (9) evaluation or critiquing of arguments, (10) summary, (11) negotiation or conclusions, and (12) process comments. Jeong noted that position statements most often elicited supporting or opposing arguments, and arguments were likely to generate additional arguments in subsequent responses. Disagreements were rarely posted in response to position statements, whereas agreements were ten times more likely to be posted.

Course Background

The Department of Library and Information Science (LIS) launched the first online degree completion program at the University of Pittsburgh. In May 2001, the inaugural class logged in to this FastTrack program leading to a Master’s Degree in Library and Information Science (MLIS). The initiative was designed to support Pitt’s emphasis on flexible course delivery to distance audiences and to respond to the growing need for library and information science education.

The online FastTrack program is a two-year, cohort-based curriculum consisting of 36 graduate credits. The community of learners completes the program together by taking six credits per term. Students bring a variety of undergraduate degrees to the MLIS program—elementary education, art history, zoology, and social work, among others. Most members of the cohort have some library experience. Students interact with one another, their instructors, and the course material through Blackboard®, a system commonly used in higher education for presenting online courses.

Major strengths of this program are the consistency of the instructors and the continuing relationships of the student cohort members. The cohort concept enhances the learning experience by providing supportive peer relationships for academic and social situations. We found the cohort model increases retention rates and establishes an ongoing community of learners. In addition, the same instructors who teach the MLIS face-to-face courses develop and teach the online courses.

In the first term of study, FastTrack faculty set guidelines for student participation in the asynchronous discussions. Initial student postings are limited in frequency and length (word count) in order to promote concise writing and the opportunity for all learners to participate in meaningful discussion. Small group discussions involve 12 to 14 participants because this size maximizes active involvement. These guidelines provide structure for students new to distance and graduate education by setting expectations. FastTrack instructors believe the skills leading to focused discussion transfer to other courses.

The 21 students (16 women and 5 men) in this study were in their sixth term of the FastTrack program. Therefore, they were already familiar with the cohort members and with instructors’ expectations for discussion participation. Learners were no longer held to the participation requirements of the first term, but were ready to contribute to a meaningful discussion of the topics. Critical to the success of this course is the knowledge gained from the active discussion of cohort members. At this point in the program, students have the confidence, knowledge, and familiarity with the cohort members to generate rich discussions. This learning environment provides multiple opportunities for students and instructors to interact.


We examined the structure and content of Discussion Forums that were presented within Blackboard. For this formative evaluation, we wanted to examine a variety of types of threads. In order to do that, we scrutinized the structure of the threads, as provided by Blackboard, and identified one thread that was deep, one that was broad, and one that was complex.

Focusing on Discussion Threads

Originally, we thought it would be possible to determine the complexity of threads by working directly from the visual representation provided by the Blackboard® application. Although initiating postings and responses to those postings are all represented in the structure shown in Figure 1, we struggled to determine “who” responded to “what.”


Figure 1: Structure of expanded discussion as provided by Blackboard®.

Others have also found the structure of online discussions challenging and limiting. For example, Wijekumar (2005) described discussion boards as “unwieldy” (n.p.) and Xin (2002) noted that asynchronous communication systems offer fairly “primitive discourse structures” (p. 22). We found it easy to get lost in the mechanics of expanding branches, reading postings, and closing postings. It was difficult to visually determine the “level” of responses in the Blackboard® view. That was important to us since one of our suspicions was that the type of cognitive skill exemplified might vary between postings in broad, deep, or more complex threads; moreover, we wanted to be able to draw some conclusions about postings at the same “level” within a thread. By “level” we meant how many steps away a contributed posting was from the initiating posting.

We needed a mechanism that would let us store our comments and observations about the Blackboard Discussion Forum entries and, at the same time, see where each entry fit within the entire thread.

Mapping the Threads

We selected three discussion threads to examine: one thread was deep, with many “levels” of responses; a second thread was broad, with many responses at the same “level”; and the final thread was extremely complex, with many branches at many levels.
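The notions of “deep” and “broad” can be made concrete with two simple measures on the reply tree. The sketch below is hypothetical (the thread data are invented, not our actual postings): depth counts the reply levels below the parent posting, and breadth finds the largest number of postings at any single level.

```python
from collections import defaultdict

# (posting_id, id of the posting it replies to); None marks the
# initiating "parent" posting. The data are invented for illustration.
replies = [("a", None), ("b", "a"), ("c", "a"), ("d", "b"), ("e", "d")]

children = defaultdict(list)
for pid, parent in replies:
    children[parent].append(pid)

def depth(pid):
    """Number of reply levels hanging below a posting (a 'deep' thread)."""
    kids = children[pid]
    return 0 if not kids else 1 + max(depth(k) for k in kids)

def breadth(root):
    """Largest number of postings at any single level (a 'broad' thread)."""
    level, widest = [root], 1
    while level:
        level = [kid for pid in level for kid in children[pid]]
        widest = max(widest, len(level))
    return widest

print(depth("a"), breadth("a"))
```

A thread we would label complex scores high on both measures at once, with branches at many levels.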

Looking for ways to visually represent the threads and their content, we found that some researchers (Thomas, 1999; Ahern et al., 1992) had mapped discussion threads into tree structures to help them analyze communication activities. Using a similar technique, we employed PowerPoint® to visualize the threads and Microsoft® Word to capture our evaluative comments. Each node (or circle) represents a posting, identified by the initials of the student. Links in the PowerPoint file led to each posting. We used a PowerPoint file saved as HTML, but other applications could have been used as effectively.

To code each of the postings, we examined the content, made coding decisions, and edited the appropriate Word document. When that exercise was complete, we could easily view the structure of the examined threads and link to the coded postings. This PowerPoint view allowed us to quickly see how many contributions were made by a student and the complexities of the communication thread in general.

Figure 2 shows the hyperlinked PowerPoint file used to examine a “deep” thread (7 levels). This mapped thread is the same as that in the shaded area of Figure 1. We found the structure as shown in Figure 2 was easier to decipher than that in Figure 1.

Figure 2: PowerPoint file illustrating “levels” within a discussion thread.


Each of the nodes is identified with the initials of the contributor and each is hyperlinked to a file containing the coded posting (see Figure 3 for an example).

Figure 3 shows a sample of a Word file representing a specific posting. The file is accessed by clicking the hyperlink on the PowerPoint file.

Figure 3: Click on one of the nodes within the structure to display the posting.

Coding the Postings

Our original intent was to code each entry as to the cognitive skill displayed, but on our first examination of entries, we found that an indication of cognitive skill was not enough. Some postings could not be categorized according to cognitive skill. Instead of rejecting those entries, we looked to the literature for ways to categorize them. We devised a Discussion Board Analysis Tool to use as we examined each student posting, whether or not it exhibited a codable cognitive skill. The tool was considered a “prototype” but was based on the work of a number of learning theorists and researchers (Jeong, 2003; Fahy, 2002; Archer et al., 2001). For analysis purposes we used the entire message as the unit of examination (Archer et al., 2001).

The following steps describe our coding process:

1. We coded each posting as to the type of cognitive skill represented. This enabled us to evaluate whether the level of the posting was correlated with the type of cognitive skill in Bloom’s Taxonomy. We coded the entry with the highest (according to Bloom) cognitive level exhibited. Figure 4 shows the types of verbs that were used to operationalize the Taxonomy levels.

2. We determined whether a question was posed. This strategy was adapted from the Transcript Analysis Tool (TAT) set out by Fahy (2002). This enabled us to determine whether student questions prompted richer and more diverse threads.

3. We coded each posting as to whether a reflection (supporting, dissenting, independent, or personal experience/opinion) was present. This was also adapted from the TAT (Fahy, 2002).

4. We indicated whether a posting was an affirmation or social comment, similar to that proposed by Jeong (2003).
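The four coding steps amount to a small record per posting. As a hypothetical sketch (the field names and sample values are ours for illustration, not taken from the actual coding files), the scheme could be represented as:

```python
from dataclasses import dataclass
from typing import List, Optional

# Bloom's (1956) levels, lowest to highest, used in step 1.
BLOOM_LEVELS = ["knowledge", "comprehension", "application",
                "analysis", "synthesis", "evaluation"]

@dataclass
class CodedPosting:
    posting_id: str
    coder_initials: str
    date_coded: str
    cognitive_level: Optional[str] = None  # step 1: highest Bloom level shown
    question_posed: bool = False           # step 2: did the posting ask a question?
    reflection: Optional[str] = None       # step 3: supporting, dissenting,
                                           #         independent, or personal
    affirmation: bool = False              # step 4: affirmation or social comment

    def categories(self) -> List[str]:
        """Every Category of Contribution present; a posting may show several."""
        cats = []
        if self.cognitive_level:
            assert self.cognitive_level in BLOOM_LEVELS
            cats.append(f"cognitive:{self.cognitive_level}")
        if self.question_posed:
            cats.append("question")
        if self.reflection:
            cats.append(f"reflection:{self.reflection}")
        if self.affirmation:
            cats.append("affirmation/social")
        return cats

# A posting coded at the analysis level that also offers a supporting reflection.
sample = CodedPosting("p4", "XY", "2005-06-01",
                      cognitive_level="analysis", reflection="supporting")
print(sample.categories())
```

Listing every category present mirrors our rule that a message with elements of more than one Category of Contribution is recorded under each.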

Source: www.umuc.edu/ugp/ewp/bloomtax.html

Figure 4: Operational Verbs

Our final Analysis Tool is shown in Figure 5. If a message had elements representing more than one Category of Contribution, we listed each.

Category of Contribution

  • Level of cognitive skill

  • Question

  • Reflection (supporting, dissenting, independent, or personal)

  • Affirmation or social comment

Figure 5: Discussion Board Analysis Tool

We examined each posting in the discussion thread and entered our Analysis Tool evaluative remarks directly into the Word files, adding the date and the coder’s initials. Both coders independently coded the entries and, in the few instances where there were differences of opinion, met to agree on a categorization. Instructor comments and instructions were not coded since our aim was to investigate the contributions made by students in the online class. Figure 6 shows an example.

Figure 6: Posting Annotated using the Discussion Board Analysis Tool.



The literature shows that analysis of asynchronous communication focuses on either quantitative or qualitative data, with many ways to examine and evaluate the interaction of participants. There is no one set of analysis tools to fit every need or situation. Xin (2002) offers a good summary of measurement instruments used in studies of computer-mediated communication (CMC). Since online discussion technology is upgraded at a rapid rate and researchers often approach a topic from different viewpoints, this myriad of available methodologies and tools is not surprising. In our study we initially sought existing tools to use in our analysis; we found none that we could use unmodified. From the models we examined, we pulled elements to create a tool that gave us a broad perspective on student interaction and critical thinking. Using our customized Discussion Board Analysis Tool, we were able to examine postings and to verify characteristics of asynchronous communications that we had uncovered in our literature review. We found our analysis methodology and the resulting flow diagrams successful for visualizing data and for aiding in the coding of postings. This process met our goal of developing a usable approach for formative evaluation, rather than a rigorous statistical analysis.

The results enabled the instructor to modify her questions and facilitation techniques to engage her students in higher-level thinking skills. While we do not suggest that this process be used for every discussion thread, occasional use can highlight areas an instructor might improve. Not surprisingly, our analysis showed that open-ended questions or comments solicited answers at higher cognitive levels, according to Bloom’s Taxonomy, than direct questions or comments did. On the other hand, we had expected that postings appearing “deeper” in a discussion forum would illustrate more thoughtful and higher levels of cognitive skill. This did not happen; high-level cognitive skill was demonstrated very early in respondent postings in some threads. Note that although we coded categories other than cognitive skill (i.e., question, reflection, affirmation), when we examined postings illustrating these other categories we found no trends that could lead to recommendations for the instructor.

Part of the problem we initially experienced in our analysis was determining exactly where a posting occurred within the structure of a large communication thread. While online discussion systems enable postings to be displayed by date, by structure (i.e., where the posting falls in the dialogue), or by contributor, it can be confusing to understand the details of any thread. This confusion is an unfortunate result of the cryptic views presented by the bulletin board software, but could have been ameliorated somewhat if students had modified the subject line of postings to provide clues about the content of their contributions. More precise subject lines could have aided students as they navigated through the discussion forum. Others have also made this suggestion (Hara et al., 2000; Pelz, 2004).

Any online forum is shaped by the directions and requirements given by the instructor. The literature shows that some discussions are highly managed and regulated, with precise directions for the number and quality of contributions expected from students (Tu and Corry, 2003; Hara, 2000), whereas other studies look at free-style, spontaneous threads, focusing more on the social aspects of online forums while also examining the quality and quantity of content (Knupfer et al., 1997; Savicki, Lingenfelter and Kelley, 1996). Since student behavior is driven by instructor requirements and expectations, those requirements are strong forces in the actual content of any thread. For example, if students know that they are expected to exhibit higher levels of cognitive skill in their responses, they will aim to do that, and any examination of the level of cognitive skill would be rather meaningless. In our case, students were directed to participate in the discussion forums and to be alert to new forums as the instructor began them, but they were free to respond in any way they saw fit.


This research adds to the body of literature analyzing online asynchronous discussions, particularly that dealing with formative evaluation and analysis tools. The results are not generalizable to a larger teaching population.

As Allan (2004) noted, content analysis is time consuming. Content analysis requires coding, and that coding cannot be automated. We found the effort of mapping the discussion structure and coding the postings necessary. The resulting coded and linked files were critical to our analysis of the threads and our ability to see trends in the interactions. Additionally, we expect that, as course management systems mature, application programming interfaces will make discussion maps easier to build and understand.

In this formative evaluation we did not have the opportunity to design a controlled study; rather, we were looking for recommendations to improve interactions within the forums. We did not examine inter-rater or intra-rater reliability, or the validity of our categorization of student postings.

Further research is needed to determine how student interaction and critical thinking in asynchronous communication affect performance (i.e., online course satisfaction and achievement measures).


As we designed and implemented this formative evaluation, we saw the following as being important issues:

Students need some basic training and guidelines to participate effectively in asynchronous discussions: how to write concise postings, how and when to start new threads of discussion, how to use subject lines, how to post timely responses, how to format text for readability, and how to adhere to online etiquette guidelines.

To avoid student confusion and misunderstanding, the instructor must set clear expectations for the frequency of contributions, the etiquette and tone of postings, and the weight of participation toward the course grade. Learners can quickly take control of a discussion but need guidance (Peters, 2004) to be sure they are meeting course requirements and instructor expectations. If the goal of a forum is to encourage higher levels of thinking, more structure from the instructor is advisable in order to enhance the cognitive quality of postings (Gilbert and Dabbagh, 2005).

To expedite classmates’ understanding of the structure of a discussion forum, subject lines should be used as advance organizers. Ausubel (1963) proposed advance organizers as a strategy to help students learn large amounts of material. As Allan (2004) noted, the more intensive the discussion, the more postings will be generated. Subject-line organizers, introduced in advance of learning, can bridge new material and existing terms and concepts. Each posting (including a reply) should have a new subject line (Monroe, 2003).

Unless a student is assigned as moderator, those who start a thread should be expected to provide some type of moderation, wrap-up, or summary. This agrees with recommendations made by others (Hara et al., 2000).

We recommend the use of analysis tools similar to those used in this study for focused, formative evaluations, especially studies aimed at improving the use of discussions. The pairing of the mapping process and the coding tool effectively supported our formative evaluation of student interaction and critical thinking. The mapping illustrated connections between postings at various levels, and the tool standardized our analysis of postings.


When the learning context changes from the traditional classroom to the online asynchronous textual context of computer-mediated communication, strategies for teaching and learning also change. Knowing how to design and facilitate effective threaded discussions is critical for faculty teaching online courses. Asynchronous discussions take Web pages from static information to dynamic instruction. As formative evaluation, the discussion mapping described in this study provides an effective means of viewing a large number of interconnected messages. The map provides an instant overview of the discussion process, including interaction patterns and levels of thinking.


Ahern, T.C., Peck, K. & Laycock, M. (1992). The effects of teacher discourse in computer-mediated discussion. Journal of Educational Computing Research, 8(3), 291-309.

Allan, M. (2004). A peek into the life of online learning discussion forums: Implications for Web-based distance learning. International Review of Research in Open and Distance Learning, 5(2). Retrieved April 19, 2005, at http://www.irrodl.org/content/v5.2/allan.html

Angeli, C., Bonk, C.J., & Hara, N. (1998). Content analysis of online discussion in an applied educational psychology course. Center for Research on Learning and Technology, Indiana University, Bloomington, IN. Retrieved March 30, 2004, at http://crlt.indiana.edu/publications/journals/techreport.pdf

Archer, W., Garrison, D. R., Anderson, T., & Rourke, L. (2001). A framework for analyzing critical thinking in computer conferences. Paper presented at EURO-CSCL 2001, Maastricht. Retrieved March 22, 2004, at http://www.mmi.unimaas.nl/euro-cscl/Papers/6.doc

Ausubel, D. (1963). The psychology of meaningful verbal learning. New York: Grune and Stratton.

Berge, Z. (2002). Active, interactive, and reflective e-learning. Quarterly Review of Distance Education, 3(2), 181-190.

Blanchette, J. (2001). Questions in the online learning environment. Journal of Distance Education, 16(2). Retrieved June 1, 2005, at http://cade.icaap.org/vol16.2/blanchette.html

Bloom, B. (1956). Taxonomy of educational objectives. New York, NY: David Mckay Company, Inc.

Bullen, M. (1998). Participation and critical thinking in online university distance education. Journal of Distance Education, 13(2), 1-32.

Chickering, A.W. & Ehrmann, S.C. (1996). Implementing the seven principles: Technology as lever. AAHE Bulletin, 49(2), 3-6.

Clouse (November 2003). Tips and tricks for online chats, threaded discussions, and PowerPoint lectures. Educause Annual Conference, Anaheim, CA.

Davis, B.G. (1993). Tools for teaching. San Francisco, CA: Jossey-Bass Publishers.

DeArment, C. (2002). Instructional uses of computer-mediated text-based messaging tools: A case study of faculty and student experiences and perceptions. Doctoral dissertation, University of Pittsburgh.

Elbow, P. (1994). Writing for learning – not just for demonstrating learning. University of Massachusetts, Amherst.

Fahy, P. J. (2002). Assessing critical thinking processes in a computer conference. Viewed April 26, 2005, at http://cde.athabascau.ca/softeval/reports/mag4.pdf

Fahy, P.J. (2002). Epistolary and expository interaction patterns in a computer conference transcript. Journal of Distance Education, 17(1). Retrieved March 24, 2004, from http://cade.athabascau.ca/vol17.1/fahy.html

Flynn, T. & Polin, L. (2003). Making sense of online learning: Frames, rubrics, and coding systems for analyzing asynchronous online discourse. Paper presented at the AERA 2003 Conference, April 25, 2003, Chicago, IL. Retrieved July 2, 2005, from http://communitiesofinquiry.com/documents/COI%20Reference%20List.pdf

Garrison, D.R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence, and computer conferencing in distance education. The American Journal of Distance Education, 15(1), 7-23.

Gilbert, P.K. & Dabbagh, N. (2005). How to structure discussions for meaningful discourse. British Journal of Educational Technology, 36(1), 5-18.

Hara, N., Bonk, C.J. & Angeli, C. (2000). Content analysis of online discussion in an applied educational psychology course. Instructional Science, 28(2), 115-152.

Hara, N. (2000). Visualizing tools to analyze online conferences. National Convention of the Association for Educational Communications and Technology, Albuquerque, NM. (ERIC Document Reproduction Service No. ED442845)

Henri, F. (1992). Computer conferencing and content analysis. In A.R. Kaye (Ed.), Collaborative learning through computer conferencing: The Najaden papers (pp. 115-136). New York: Springer.

Hunkins, F.P. (1972). Questioning strategies and techniques. Boston, MA: Allyn & Bacon, Inc.

Hyman, R.T. (1979). Strategic questioning. Englewood Cliffs, NJ: Prentice-Hall, Inc.

Jeong, A.C. (2003). The sequential analysis of group interaction and critical thinking in online threaded discussions. The American Journal of Distance Education, 17(1), 25-43.

Knupfer, N.N. et al. (1997). Participant analysis of a multi-class, multi-state, on-line discussion list. National Convention of the Association for Educational Communications and Technology, Albuquerque, NM. (ERIC Document Reproduction Service No. ED 409845)

Mazzolini, M. & Maddison, S. (2003). Sage, guide or ghost? The effect of instructor intervention on student participation in online discussion forums. Computers & Education, 40(3), 237-253.

Meyer, K.A. (2004). Evaluating online discussions: Four different frames of analysis. Journal of Asynchronous Learning Networks, 8(2). Retrieved July 2, 2005, from http://www.sloan-c.org/publications/jaln/v8n2/v8n2_meyer.asp

Monroe, B. (2003). Fostering critical engagement in online discussion: Washington State University Study (newsletter). Pullman, WA: Washington Center for Improving the Quality of Undergraduate Education.

Muilenburg, L. & Berge, Z. (2002). A framework for designing questions for online learning. Retrieved September 22, 2003, from http://www.emoderators.com/moderators/muilenburg.html

Palloff, R.M. & Pratt, K. (2002). What we know about electronic learning (Eds. Lenoar Foster, Beverly L. Bower & Lemuel W. Watson) ASHE reader distance education: Teaching and learning in higher education. Boston, MA: Pearson Custom Publishing.

Pelz, B. (2004). My three principles of effective online pedagogy. Journal of Asynchronous Learning Networks, 8(3). Retrieved July 2, 2005.


Peters, K.M. (2000). Creative use of threaded discussion areas, Part 1. WebCT Online Teaching and Learning (newsletter). Retrieved July 21, 2005, from http://www.webct.com/OTL/ViewContent?contentID=898084

Peters, K.M. (2000). Key issues in online discussion, Part 2. WebCT Online Teaching and Learning (newsletter). Retrieved July 21, 2005, from http://www.webct.com/OTL/ViewContent?contentID=2711014

Rourke, L. & Anderson, T. (2004). Validity in quantitative content analysis. Educational Technology Research and Development, 52(1), 5-18.

Rourke, L., Anderson, T., Garrison, D.R., & Archer, W. (2001). Assessing social presence in asynchronous text-based computer conferencing. Journal of Distance Education, 14(2). Retrieved March 22, 2004, from http://cade.athabascau.ca/vol14.2/rourke_et_al.html

Savicki, V., Lingenfelter, D., & Kelley, M. (1996). Gender language style and group composition in Internet discussion groups. Journal of Computer Mediated Communication, 2(3). Retrieved July 7, 2005, from http://www.ascusc.org/jcmc/vol2/issue3/savicki.html

Thomas, M. (1999). The impacts of technology on communication – Mapping the limits of online discussion forums. The University of Adelaide Intranet Project. Retrieved July 2, 2004, from http://online.adelaide.edu.au/LearnIT.nsf/URLs/technology_and_communication

Tu, C.H. & Corry, M. (2003). Designs, management tactics, and strategies in asynchronous learning discussions. The Quarterly Review of Distance Education, 4(3), 315.

Using Bloom’s Taxonomy in assignment design. (n.d.). Retrieved December 1, 2005, from http://www.umuc.edu/ugp/ewp/bloomtax.html

Warschauer, M. (1996). Comparing face-to-face and electronic discussion in the second language classroom. CALICO Journal (Computer Assisted Language Instruction Consortium), 13(2), 7-26.

Wijekumar, K. (2005). Creating effective web-based learning environments: Relevant research and practice. Innovate, 1(5). Retrieved July 20, 2005, from http://www.innovateonline.info/index.php?view=article&id=26

Xin, C. (2002). Validity centered design for the domain of engaged collaborative discourse in computer conferencing. Unpublished doctoral dissertation, Brigham Young University, Provo, Utah.

About the Authors

Barbara A. Frey


Barbara A. Frey, D.Ed., is a Senior Instructional Designer in the Center for Instructional Development and Distance Education at the University of Pittsburgh, where she provides support and training to faculty on a variety of teaching and learning projects. In addition, she teaches as an Adjunct Assistant Professor in the Learning and Performance Systems Department of the Pennsylvania State University World Campus. Her research interests include Web-based distance education, program evaluation, instructional design and technology, and human resource development. Dr. Frey received her D.Ed. from Pennsylvania State University and her M.Ed. from the University of Pittsburgh.

Barbara A. Frey, D.Ed.
Center for Instructional Development & Distance Education
University of Pittsburgh
4227 Fifth Avenue, Pittsburgh, PA 15260

Telephone: 412-624-1330 Fax: 412-624-7220
Email: frey@cidde.pitt.edu


Millie S. Sass


Millie S. Sass is a research assistant in the Center for Instructional Development and Distance Education at the University of Pittsburgh. Dr. Sass received degrees from Slippery Rock State College, now Slippery Rock University (BS, Secondary Mathematics), the University of Pittsburgh (MS, Educational Research), and Massey University (PhD, Education). Dr. Sass has spearheaded the social and cognitive mapping project.

Millie S. Sass, Ph.D.
Center for Instructional Development & Distance Education
University of Pittsburgh
4227 Fifth Avenue, Pittsburgh, PA 15260

Telephone: 412-624-1330 Fax: 412-624-7220
Email: sass@cidde.pitt.edu


Susan W. Alman

Susan W. Alman directs distance education in the School of Information Sciences at the University of Pittsburgh. She holds degrees from Washington and Jefferson College (BA) and the University of Pittsburgh (PhD, MLS).

Dr. Alman’s areas of interest include asynchronous learning, marketing libraries, and interpersonal communication.

Susan W. Alman, Ph.D.
Department of Library and Information Science
University of Pittsburgh
135 N. Bellefield Avenue, Pittsburgh, PA 15260

Telephone: 412-624-5142 Fax: 412-624-5231
Email: salman@mail.sis.pitt.edu

