

Editor’s Note: Problem solving requires analysis, synthesis, and evaluation, the highest cognitive skills in Bloom’s Taxonomy of Educational Objectives. Electronic bulletin boards focus expertise on solving problems for which there is not yet a solution, or on finding a solution if one already exists. The bulletin board provides a display space for requests and responses, supporting asynchronous dialog and brainstorming. Dave Knowlton proposes bulletin boards as a way of transitioning from classroom learning to solving real-world problems.

Electronic Bulletin Boards as Medium for Asynchronous Problem Solving in Field Experiences

Dave S. Knowlton

Higher education should provide students with a liberal arts education, heightened self-awareness, and preparation for the workforce. None of these purposes can be achieved apart from a problem-based learning curriculum (Knowlton, 2003a). Given this view of higher education’s purposes and the need for a problem-based learning curriculum, computer-mediated communication (CMC) may be a valuable tool for students: it has been shown to support two of these purposes - problem solving in the liberal arts tradition (e.g., DeVries, Lund, & Baker, 2002; Knowlton, 2002) and solving problems that likely will create heightened self-awareness (e.g., Merryfield, 2001). The purpose of this paper is to offer connections between CMC and the third purpose of higher education - problem solving within the labor market.

One way that college students can prepare for the workforce is to participate in field experiences, and CMC can serve as an efficient problem-solving tool for these students. Some existing literature supports this contention. CMC can support field experiences (Admiraal, Lockhorst, Wubbels, Korthagen, & Veen, 1998), and more specifically, bulletin boards can serve as an asynchronous communication tool for supporting field experiences (Doering, Johnson, & Dexter, 2003). CMC also can support certain types of problem solving (Uribe, Klein, & Sullivan, 2003; Jonassen, 2002; Jonassen & Kwon, 2001). Beckett and Grant (2003) have made cursory connections among CMC (specifically, bulletin boards to support asynchronous communication), problem solving, and field experiences. Scarce within the literature, though, are firm theoretical frameworks and pedagogical models supporting a connection among all three.

This paper begins with a theoretical framework that discusses the nature of problem solving within field experiences and describes the potential role of electronic bulletin boards in problem solving activities. After these theoretical connections, a model for using asynchronous communication to support collaborative, field-based problem solving is described. The last section discusses implications and recommendations.

Theoretical Framework

The theoretical framework offered in this section serves as an argument for using asynchronous discussion in the context of field-based problem solving. In sum, the argument has two premises: (1) Students must solve problems to be successful in field experiences and (2) asynchronous bulletin boards can serve as efficient tools for solving many types of problems. If these two premises are true, then a conclusion follows: asynchronous discussion may be useful as a tool for solving problems that students encounter as they participate in field experiences.

Problem Solving and Field Experiences

When students move from classroom settings to field settings, they will experience a shift in what it means “to learn.” In classrooms, learning is often the result of memorizing and regurgitating a database of information. In field experience settings, learning occurs through the act of solving problems. This view is consistent with Jonassen (2002), who notes differences between classroom learning and learning in the “real world.”

Even when professors try to incorporate problem solving into traditional classrooms, they must acknowledge a difference between classroom problems and problems encountered in field experiences. In classrooms, problems are often artificial and well structured. They are artificial because professors, in essence, manufacture problems for the express purpose of creating a problem for students to solve. Little - if any - “real” utility exists, then, in solving the problem. The problems are well structured because professors sometimes want students to find one pre-determined answer, or answers that fall within a narrow range of possibilities.

In field experiences, problems are neither artificial nor well structured. Instead, problems are authentic because they emerge naturally from the context of the field experience and students - who now have been recast as human capital - must analyze and solve the problem in order to be considered a “successful employee.” So, one way to describe the authenticity of the problem is to note the “stakes” of not successfully solving the problem. Furthermore, problems are ill structured because of their variety, frequency, and complexity. In addition, problems may be ill structured because they involve confluent and competing factors. Solving one element of a problem might exacerbate other elements.

The authenticity and ill-structured nature of problems that students will encounter in field experiences reinforce the importance of faculty members who supervise field experiences. Because field experiences are meant to provide students with a smooth transition from the classroom to the world of work, faculty members must adopt a sound pedagogy for helping students solve problems. In formulating and implementing this pedagogy, faculty members must maintain a careful balance. The pedagogy must scaffold students’ abilities to analyze and solve problems encountered in the field, but professors must not usurp students’ authority by solving the problems for them, or even providing a close-ended structure that leads students to a finite range of solutions (Beckett & Grant, 2003).

Asynchronous Discussion and Problem Solving

Jonassen’s (2002, 2003) point that not all online learning environments lend themselves to all forms of problem solving is clear. Still, asynchronous discussion among students is a useful strategy for promoting solution finding for many types of problems. This paper supports this perspective first by pointing to the ways that asynchronous bulletin boards can serve as an efficient cognitive tool for problem solving. Second, it points to the role of asynchronous communication as an agent for increasing social learning among students in field experiences.

First, asynchronous bulletin boards can support many problem-solving processes, such as representing problems in written form and manipulating the problem space. Writing contributes to problem solving in that students better understand their own perspectives, views, and beliefs as they write (Lindemann, 1995). Even though a contribution to an asynchronous discussion is not a formal type of writing, it still serves as an opportunity for students to make their ideas concrete. “Seeing” their problem and their own ideas about it can help students develop a more useful perspective on the problem. In essence, asynchronous discussion becomes a cognitive tool for representing problems; and as Jonassen (2003) notes, problem representation is an essential part of the problem-solving process, particularly if students are expected to transfer their problem-solving skills beyond the immediate problem. These representations serve two functions. They are an attempt by students to represent the problem for themselves - to make the ineffable effable. At the same time, asynchronous discussion provides the opportunity for students to represent the problem for a real audience to understand. Depending on a student’s comfort and skill with asynchronous discussion, this distinction between internal problem representation and representation for an external audience may blur (Jonassen, 2003). Conceptually, though, treating these procedures as separate may be useful in delineating a theoretical framework for using asynchronous bulletin boards as a problem-solving tool.

Second, asynchronous discussion increases opportunities for social learning and collaborative thinking - for cognition to be distributed across a community of learners who are engaged in field experiences but in different contexts. Students in field experiences are separated by distance and sometimes by time. In isolation, these students are situated in a specific context, but particularly for students who are accustomed to traditional classrooms, the isolation can be disconcerting. When students are embedded within a specific situation or context, their thinking becomes both bound by and free within that context - their cognition is situated. Students recognize themselves as situated because they accept “the mutual relation of content and context, of individual and environment, and of knowing and doing” (Barab, MaKinster, Moore, Cunningham, & the ILF Design Team, 2001, p. 73).

Perhaps there are varying degrees of being bound within a context as one’s participation moves from the periphery of that context to the center of contextual activity (Lave & Wenger, 1991) and as one comes to understand the fluid and iterative relationship between plans and actions (cf. Suchman, 1987). This is true for students in field experiences. Being situated benefits students by getting them beyond the walls of an artificial classroom. It also is a hindrance because some benefits of classroom learning - discussion and collaboration among students, for example - are lost. In short, students in field experiences can benefit from thinking within a context, but they need to participate in a type of distributed cognition in order to fully appreciate the uniqueness of their situation. Asynchronous discussion can support this shift from situated cognition to distributed cognition.

The above two points - bulletin boards as a cognitive tool that supports internal and external problem representations and asynchronous discussion as a socializing medium that allows cognition to be distributed - intersect at the point where students engage in dialogue to analyze and solve real problems that they encounter in the field. As students use asynchronous bulletin boards to solve problems collaboratively, they develop a sense of productive community - a sense of distributed cognition that ultimately can transform students (Palloff & Pratt, 1999). No longer individually situated to think about their own contexts, students come to distribute their ideas across the community. Students in field experiences recognize that asynchronous discussion allows them access to new perspectives through dialogue; dialogue contributes to knowledge construction. Students begin to recognize that other participants’ contributions, interpretations, and constructions are situated within specific and unique contexts, as well. The power of the individual resides in the community, yet the power of the community is dependent on the contributions of the individual.
 

A Model for Analyzing and Solving Field-based Problems through Asynchronous Discussion

This section presents an overview of the context in which the CMC model was used and a description of the discussion participants. It then describes the assignment guidelines and summarizes two problems that were shared and analyzed by the community of participants.

Context of the Field Experience

The participants in this asynchronous discussion were pre-service teachers in a midwestern university’s two-year, field-based teacher-certification program. These pre-service teachers were assigned to K-12 classrooms in partnership schools. During the first year of the program, the pre-service teachers often assumed a peripheral role, assisting the full-time classroom teacher as a paraprofessional or aide. Over time, however, the pre-service teachers moved toward the center of teaching and learning activities within the classroom. This movement culminated in a formal “student-teaching” experience during the last semester of the two-year program.

Throughout the two years, a team of university faculty delivered weekly content seminars where the pre-service teachers were given opportunities to discuss their classroom experiences and learn various theories and methods that may serve them as future teachers. During each of the first three semesters of their field experience, the pre-service teachers were enrolled for one credit hour of educational psychology - the content that I was responsible for overseeing. “Courses,” though, were non-existent. Instead, each course’s content was completely integrated within the seminars.

Because of time constraints, educational psychology was given short shrift within the weekly seminars during the third semester of the two-year program. CMC was used to engage the pre-service teachers fully in considering the ways that key principles of educational psychology manifest themselves in their day-to-day problems within the partnership schools.

Assignment Guidelines for the Computer-Mediated Discussion

This CMC project was designed to (a) help participants analyze problems that they were experiencing in field experiences, (b) allow participants to share those problems with a community of peers, and (c) promote collaborative problem solving and inquiry among participants. The assignment guidelines were posted on my website so that the discussion participants could easily access them and refer back to them regularly. The guidelines were similar to those proposed by Knowlton (2002) in that participants were divided into two groups and the discussion was based on a three-week cycle of sharing and response. At the end of each cycle, roles were reversed so that students in one group performed the responsibilities of the students in the other group.

During the first week of the cycle, each participant in group one was responsible for posting the details of a problem that she or he was experiencing in the classroom. The participants were instructed to provide enough detail and background so that a reader could understand the problem fully. The participants also were instructed to describe any strategies that already had been implemented in an effort to overcome the problem. As the assignment guidelines noted, “We should all feel like we experienced this problem firsthand. But, please make sure that you are discussing the problem professionally - avoid personal attacks on those involved, for example.” Since the main purpose of the field experience was to improve pre-service teachers’ skills in designing lessons, teaching, assessing students, and evaluating their own lesson design and delivery, the assignment guidelines dictated that the problems shared must involve instructional issues. To scaffold the pre-service teachers’ understanding of how they might describe a problem, the assignment guidelines included a link to an example problem description. Participants in group two had no formal responsibilities during week one of the cycle, but the assignment guidelines did encourage them to “reply with questions and comments that might help [their] classmates in group one clarify their individual problems.”

In week two of the cycle, each group two participant was responsible for replying to two problems that were posted during week one. Importantly, the replies were to serve two purposes. First, the reply should frame the problem theoretically by pulling ideas and concepts from an Educational Psychology textbook or other resources, like academic journal articles or credible web-based resources. Specifically, the assignment guidelines informed group two participants that their responsibility was “to do more than offer a one-liner - ‘Maybe a role play would help.’” Instead, their responsibility was “to summarize [a] theory, instructional strategy, or [pedagogical or design] model thoroughly enough for [others] to understand the connection to the problem.” By meeting this responsibility, participants were theoretically framing the problem offered by a classmate. This approach supported a type of learning-on-demand, where readings were not assigned in advance of the discussion. Rather, participants found readings based on their own view of how the problem might be theoretically framed.

A second purpose of week two replies was to offer potential solutions to the problem. That is, once group two participants had theoretically framed the problems being discussed, they were responsible for offering solutions. These solutions could come from participants’ own experiences in the field, but they were encouraged strongly to “discuss the role of Educational Psychology theory in solving the problem.” That is, they were urged to address a question that connects theory to practice: “What do you understand about theory - from your own independent reading - that you now can apply to the problem at hand in an effort to help your classmate solve the problem?”

During the third week of the cycle, participants in both groups were responsible for two additional contributions to a discussion thread. They could add ideas to a thread in which they had already participated, or they could respond to other threads containing different problems. Regardless of which thread participants chose, their week three responsibilities obligated them to reply to a discussion contribution from week two, not to an original problem posted during week one. This criterion continued the discussion on the basis of theories and solutions that already had been described. In other words, participants were not replying to the original problem; they were replying to the various analyses and suggested solutions. The result was a richer discussion with deeper analysis of related theory and more careful critique of proposed solutions. Because week three contributions were the most nebulous in terms of purpose and scope, the assignment guidelines included a bulleted list of possible (but not required) approaches for contributing during week three (a brief sketch following this list summarizes the full cycle):

  1. Pick two replies to the same problem and discuss why you think one would work better than the other.

  2. Pick a reply to a problem and discuss the strengths and/or weaknesses of the proposed solution.

  3. Pick a theory that someone wrote about during week #2 and apply that theory differently (or more thoroughly) to the problem offered during week #1.

  4. Describe innovative exercises, assignments, or activities based on the ideas in the week #2 contribution.

  5. Discuss your experiences with how a solution has/has not worked in the classroom.

  6. Write a summary of replies to your own problem and describe the biggest insights that you have gained by considering the advice of your classmates.
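
Taken together, these guidelines define a simple rotation protocol. The following minimal Python sketch is offered only as an illustration of that protocol; the group labels, duty strings, and function name are my own assumptions rather than part of the original assignment guidelines.

    # Sketch of the two-group, three-week cycle: duties by week for each
    # role, with roles reversing at the end of every cycle.
    WEEKLY_DUTIES = {
        "sharers": [
            "post a detailed problem from the field",                # week 1
            "no formal responsibility",                              # week 2
            "two replies to week-two contributions",                 # week 3
        ],
        "responders": [
            "optional clarifying questions and comments",            # week 1
            "reply to two problems: frame theory, offer solutions",  # week 2
            "two replies to week-two contributions",                 # week 3
        ],
    }

    def roles_for_cycle(cycle):
        """Roles reverse each cycle: on odd cycles, group one shares."""
        if cycle % 2 == 1:
            return {"group one": "sharers", "group two": "responders"}
        return {"group one": "responders", "group two": "sharers"}

    for cycle in (1, 2):
        for group, role in roles_for_cycle(cycle).items():
            for week, duty in enumerate(WEEKLY_DUTIES[role], start=1):
                print(f"cycle {cycle}, week {week}, {group}: {duty}")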

Examples: Problems and Corresponding Discussion

Since over sixty pre-service teachers participated in the implementation of this CMC model, it is not practical to describe each problem and solution that students posted. An overview of two discussion threads illustrates the types of problems and responses that were offered.

Example #1: Teaching Reading. One participant shared frustrations about her students’ struggles to comprehend readings. This participant recounted several examples where she had read a story to her students, checking for understanding throughout the reading of the story. She noted that if students had questions, she would answer them and check again for students’ understanding. After hearing the story, students routinely completed a worksheet about the story and took a quiz to check their comprehension of the story. “My difficulty,” this participant wrote, “is that the students have not been doing well on their worksheets and quizzes. I ask them if they understand, and they all nod their heads that they do. I don't know that they don't understand unless they let me know. I can't read their minds! What do I do? Please help!”

Over the next two weeks, participants offered sixteen responses. One participant suggested that students fear failure and that admitting they do not understand a reading is, in the students’ minds, tantamount to failure. Other participants, though, suggested direct strategies for solving the problem. For example, one participant rhetorically asked how the reading lessons might be structured so that students could depend on each other to clear up their own confusion. Another participant raised an issue about the specific stories and readings being selected, noting that students would understand stories better if the stories interested them. Yet another participant extended the idea of selecting stories that interest students but offered a more academic slant by referring to an Educational Psychology text. This student noted three components of reading comprehension: broad background knowledge, the ability to apply comprehension strategies like summarizing, and an understanding of how to monitor one’s own reading through metacognitive strategies. The participant suggested the need to incorporate these components into the reading lessons.

Example #2: Questioning as a Teaching Strategy. One participant shared a problem that dealt with her use of questioning in the classroom. Specifically, she had concerns about students’ abilities to understand her questions and reply to them in a short amount of time:

“Sometimes it seems that the students are really thinking about the answer [to a question that I have asked, but] it is just taking them a long time. Other students seem to be wasting time, [or perhaps they] know what they want to say but cannot verbally express it. Currently, if a [student] does not give an answer fairly quickly, [then] I ask another student. Should I give the students more wait-time and cut down the lesson? Should I just skip the student? Or are there other options?”

Over the next two weeks, participants contributed fourteen replies to this discussion thread. One participant suggested that the K-12 students could be asked to jot down their thoughts in writing before answering the question. This type of informal writing, the participant argued, might help students clarify their thoughts so that they could “see what they are thinking” prior to participating orally. Another participant agreed, but noted that the strategies for helping students clarify their thoughts might vary with the level of thinking being required. If students were simply being asked to report factual information, then a large amount of time to formulate thoughts may not be necessary. If, however, the questions were more open-ended, then more thorough strategies might be useful. Another participant suggested that providing clues or rewording questions might be useful strategies for helping students process the question and arrive at an answer.

Importantly, not all contributions to the thread were ideas for immediate solutions. Some participants, for example, described the ways that various theories would inform the use of questioning in the classroom (e.g., how would a behaviorist use questioning differently from a cognitivist or constructivist?). Other participants raised issues tangential to the problem. One participant, for example, raised questions about students’ self-esteem and the long-run effects on a student’s self-efficacy of not knowing the answers. Other participants raised questions about classroom management and urged the participant who posted this problem to consider how classroom rules and processes may influence students’ willingness to answer questions during lessons. Still others declined to answer in academic terms, instead turning their reactions into an opportunity for answers based on their own experiences: “One of the worst feelings in the world is getting asked a question [and] either having no idea what the answer is or not having time to think about [the answer].”
 

Conclusions and Implications

I have described the use of CMC as a tool to support problem solving among students who are participating in field experiences. The underlying assumption is that helping students solve problems in field experiences will better position them to be transformed by CMC (cf. Palloff & Pratt, 1999). This idea of transformation is consistent with views about the purposes of educational psychology courses (cf. Dembo, 2001).

Admittedly, I am not the first to use CMC as a problem-solving tool to support the learning of educational psychology in field experiences (cf. Bonk, Malikowski, Angeli, & East, 1998). Whereas that previous article focused on empirical rigor in analyzing the use of CMC in field experiences, this paper focuses on pedagogical rigor by offering a theoretically grounded model of CMC. Furthermore, the connections drawn here among problem solving, CMC, and field experiences are new.

Importantly, the model discussed in this paper is not inherently tied to a specific discipline. It could be used by participants in field experiences across a variety of disciplines, from clinical experiences in the medical professions to internships in business to archaeological expeditions in the far reaches of the earth. Faculty members who consider implementing this model, though, must weigh numerous questions. The remainder of this paper raises some of the salient questions that have emerged from my experiences developing and implementing the model.

Is broad analysis of a problem better than deep analysis?

Notably, within the model of CMC described here, the probability that participants will get to discuss the solution that actually was implemented seems slim. While a three-week cycle of discussion allows broad participation in analyzing a problem and offering potential solutions, it does not give the participant who owns the problem enough time to implement a proposed solution, evaluate it, and report back to the participants in the CMC discussion. From a problem-based learning (PBL) perspective, this emphasis on breadth over depth is atypical, as a hallmark of PBL is usually an in-depth analysis of the example problem (Dods, 1997).

Faculty members who implement this model might overcome this problem in two ways. One way would be to leave some flextime at the end of the semester to allow students to revisit the problems that they contributed to the computer-mediated discussion. This flextime might give students the opportunity to share the ways that they used input from other participants to solve the problem. A second way to overcome the breadth-versus-depth issue might be to require the completion of a separate but related assignment - such as a more formal reflection paper that discusses field-based problems and corresponding solutions. A formal writing of this type might even serve as a capstone of the field experience.

How does student independence need to be balanced with faculty guidance?

As noted in the description of this model, numerous devices were used to scaffold participants’ potential for success within the discussion. For example, links to example problem descriptions were offered, and questions to guide participants’ week three contributions were included in the assignment guidelines. The degree to which CMC participants need such scaffolding depends on many factors, not the least of which might be their prior experiences using CMC to solve problems. If participants are novice users of CMC, faculty members might consider providing additional scaffolding. For example, a link was provided from the assignment guidelines to a job aid that described strategies for making CMC more productive. These strategies proved useful to many of my students, but the job aid included items that may be little more than statements of the obvious to expert CMC users - for example, sign contributions to the discussion, double space between paragraphs, and adopt a tone of respect toward opposing opinions.

How will assessment of students’ efforts be considered?

Numerous articles and resources can guide faculty members as they consider student assessment (e.g., Anderson, Bauer, & Speck, 2002; Bauer & Anderson, 2000; Anderson & Puckett, 2003; Knowlton, 2003b). Often in online discussions, students receive most of their credit simply by participating, yet participation is not in itself a sufficiently rigorous basis for assessing students’ contributions to CMC. Conversely, though, my experiences suggest that professors can get bogged down in “grading” students. This can be counterproductive because grading is not necessarily congruent with assessing, providing feedback, and other activities that are denotatively quite different, not to mention more educationally salient. (For a discussion of the language of grading, see Speck, 1998.)

One way to overcome this problem was to use a form that students completed and e-mailed to the instructor at the end of each three-week cycle. The form asked students to report factual information about their participation for that cycle - for example, to list the subject lines of the threads in which they participated and to list briefly the resources that they used in theoretically framing a problem. This form was not a self-assessment so much as a productivity report. When students submitted their forms, the forms were matched against the list of threads to confirm each student’s contributions, which made the process of “grading” less time consuming.
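
As an illustration only, the sketch below shows one way such a productivity report might be modeled in Python; the class name, field names, and verification helper are hypothetical assumptions rather than artifacts from the course.

    from dataclasses import dataclass, field
    from typing import Dict, List, Set

    @dataclass
    class ProductivityReport:
        """One student's self-reported participation for a three-week cycle."""
        student: str
        cycle: int
        thread_subjects: List[str] = field(default_factory=list)  # threads joined
        resources_used: List[str] = field(default_factory=list)   # readings cited

    def matches_thread_log(report, thread_log):
        """Check a report against an instructor's log that maps each thread
        subject line to the set of students who contributed to it."""
        return all(report.student in thread_log.get(subject, set())
                   for subject in report.thread_subjects)

    log: Dict[str, Set[str]] = {"Struggling readers": {"pat", "chris"},
                                "Wait-time": {"chris"}}
    report = ProductivityReport("chris", cycle=2,
                                thread_subjects=["Struggling readers", "Wait-time"],
                                resources_used=["Ed psych text, metacognition chapter"])
    print(matches_thread_log(report, log))  # True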

Beyond using a productivity report to track students’ contributions, faculty members might consider implementing self-assessments that go beyond reporting participation. Such self-assessments could obligate students to assess their own work against the “purposes” of the discussion itself. Similarly, faculty members might implement formal procedures for peer assessment. Self- and peer-assessments can come in forms as simple as dichotomous or Likert-scale checklists, but they also might involve processes indicative of a more careful analysis of students’ contributions, such as qualitative and open-ended assessments (Knowlton, 2003b).
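
As a minimal sketch of the simplest of these forms, assuming an invented five-point scale and prompts of my own devising (none of this is drawn from the original guidelines):

    # A hypothetical Likert-scale peer-assessment checklist.
    LIKERT_SCALE = {1: "strongly disagree", 2: "disagree", 3: "neutral",
                    4: "agree", 5: "strongly agree"}

    PROMPTS = [
        "The reply framed the problem theoretically, not with a one-liner.",
        "The proposed solution connected theory to classroom practice.",
        "The tone was professional and respectful of opposing opinions.",
    ]

    def blank_checklist(prompts):
        """Pair each prompt with an unfilled rating on the 1-5 scale."""
        return [{"prompt": p, "rating": None} for p in prompts]

    for item in blank_checklist(PROMPTS):
        print(item)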

What mechanism will allow for systematic revision of assignment guidelines?

Whereas assessment focuses on student learning, evaluation focuses on the success of the model’s design and implementation. While the model described in this paper is sound, problem-solving processes vary among disciplines, and exact assignment guidelines may be idiosyncratic to different professors even within the same discipline. Therefore, a solid plan for refining the model is needed prior to its implementation. (The prototyping of this model from an Alpha version to a Beta-B version is the subject of a forthcoming article.) In general, student input should be considered as one data source for justifying changes to the assignment guidelines. Further, as a professor comes to better understand participants’ roles in their various field experiences, changes to the assignment guidelines may be needed.
 

References

Admiraal, W. F., Lockhorst, D., Wubbels, T., Korthagen, F. A. J., & Veen, W. (1998). Computer-mediated communication environments in teacher education: Computer conferencing and the supervision of student teachers. Learning Environments Research, 1(1), 59-74.

Anderson, R. S., Bauer, J. F., & Speck, B. W. (Eds.). (2002). Assessment strategies for the online teacher: From theory to practice. San Francisco: Jossey-Bass.

Anderson, R. S., & Puckett, J. (2003). Assessing students’ problem-solving assignments. In D. S. Knowlton & D. C. Sharp (Eds.), Problem-based learning in the information age (pp. 81-88). San Francisco: Jossey-Bass.

Barab, S. A., MaKinster, J. G., Moore, J. A., Cunningham, D. J., & the ILF Design Team (2001). Designing and building an online community: The struggle to support sociability in the inquiry learning forum. Educational Technology Research and Development, 49(4), 71-96.

Bauer, J. F., & Anderson, R. S. (2000). Evaluating students’ written performance in the online classroom. In R. E. Weiss, D. S. Knowlton, & B. W. Speck (Eds.), Principles of effective teaching in the online classroom (pp. 65-72). San Francisco: Jossey-Bass.

Beckett, J., & Grant, N. K. (2003). Guiding students toward solutions in field experiences. In D. S. Knowlton & D. C. Sharp (Eds.), Problem-based learning in the information age (pp. 67-72). San Francisco: Jossey-Bass.

Bonk, C. J., Malikowski, S., Angeli, C., & East, J. (1998). Web-based case conferencing for pre-service teacher education: Electronic discourse from the field. Journal of Educational Computing Research, 19(3), 269-306.

Dembo, M. H. (2001). Learning to teach is not enough - future teachers also need to learn how to learn. Teacher Education Quarterly, 28(4), 23-35.

Dods, R. F. (1997). An action research study of the effectiveness of problem-based learning in promoting the acquisition and retention of knowledge. Journal for the Education of the Gifted, 20, 423-437.

DeVries, E., Lund, K., & Baker, M. (2002). Computer-mediated epistemic dialogue: Explanation and argumentation as vehicles for understanding scientific notions. The Journal of the Learning Sciences, 11(1), 63-103.

Doering, A., Johnson, M., & Dexter, S. (2003). Using asynchronous discussion to support pre-service teachers’ practicum experiences. TechTrends, 47(1), 52-55.

Jonassen, D. H. (2002). Engaging and supporting problem solving in online learning. Quarterly Review of Distance Education, 3(1), 1-13.

Jonassen, D. H. (2003). Using cognitive tools to represent problems. Journal of Research on Technology in Education, 35(3), 362-381.

Jonassen, D. H. & Kwon, H. I. (2001). Communication patterns in computer mediated versus face-to-face problem solving. Educational Technology Research and Development, 49(1), 35-52.

Knowlton, D. S. (2002). Promoting liberal arts thinking through online discussion: A practical application and its theoretical basis. Educational Technology & Society, 5(3), 189-194.

Knowlton, D. S. (2003a). Preparing students for educated living: The virtues of problem-based learning across the higher education curriculum. In D. S. Knowlton & D. C. Sharp (Eds.), Problem-based learning in the information age (pp. 5-12). San Francisco: Jossey-Bass.

Knowlton, D. S. (2003b). Evaluating college students' efforts in asynchronous discussion: A systematic process. Quarterly Review of Distance Education, 4(1), 31-41.

Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge, UK: Cambridge University Press.

Lindemann, E. (1995). A rhetoric for writing teachers (3rd ed.). New York: Oxford University Press.

Merryfield, M. M. (2001). The paradoxes of teaching a multicultural education course online. Journal of Teacher Education, 52(4), 283-299.

Palloff, R. M., & Pratt, K. (1999). Building learning communities in cyberspace: Effective strategies for the online classroom. San Francisco: Jossey-Bass.

Speck, B. W. (1998). Unveiling some of the mystery of professional judgment in classroom assessment. In R. S. Anderson & B. W. Speck (Eds.), Changing the way we grade student performance: Classroom assessment and the new learning paradigm (pp. 17-31). San Francisco: Jossey-Bass.

Suchman, L. (1987). Plans and situated actions: The problem of human machine communication. New York: Cambridge University Press.

Uribe, D., Klein, J. D., & Sullivan, H. (2003). The effect of computer-mediated collaborative learning on solving ill-defined problems. Educational Technology Research and Development, 51(1), 5-19.

Weiss, R. E. (2000). Humanizing the online classroom. In R. E. Weiss, D. S. Knowlton, & B. W. Speck (Eds.), Principles of effective teaching in the online classroom (pp. 47-51). San Francisco: Jossey-Bass.
 

About the Author

Dr. Dave S. Knowlton is an Assistant Professor of Instructional Design & Learning Technologies in the Department of Educational Leadership at Southern Illinois University Edwardsville. He is coeditor of Principles of Effective Teaching in the Online Classroom (Jossey-Bass, 2000) and of Problem-Based Learning in the Information Age (Jossey-Bass, 2003). For more about his professional interests, please visit www.siue.edu/~dknowlt. He can be contacted at dknowlt@siue.edu.

 
