March 2007
 

Editor’s Note: Multi-faceted and reusable learning objects are moving from the laboratory into daily teaching. Earlier research shows instructional designers produce more effective learning objects. This study trains teachers to prepare learning objects and compares the products and results. Teachers produce a different kind of product, with less media and more verbal explanation, to achieve similar results. This research demonstrates a step forward and opportunities for further study.

Should K-12 Teachers Develop Learning Objects? Evidence from the Field with K-12 Students

Yavuz Akpinar, Huseyin Simsek
Turkey

Abstract

The emergence of learning objects as a focus of educational concentration for teachers is relatively new, and much of the discussion has been based not on the actual development of objects but on competing definitions, learning theories, properties, standards, or decorative packagings of learning objects (LOs). In many teacher education programs, prospective teachers take a computer literacy class separate from content methods classes and rarely engage in producing authentic teaching/learning experiences. This research addresses prospective K-12 teachers’ development of learning objects. In this study, learning objects created by a group of prospective K-12 science teachers were examined, evaluated and compared with LOs developed by instructional designers (IDs). A total of forty learning objects were closely investigated, and the effectiveness of eight of them was tested with 180 target students in classrooms. Detailed analysis of the LOs demonstrated that while both the preservice teachers and the IDs used a similar number of instructional elements in their LOs, the IDs represented concepts and procedures with screen objects other than text and used text to support the graphical objects. Both groups developed LOs of similar quality as measured with the LORI 1.5. Statistical tests on data obtained from classroom usage of the LOs showed marked improvements in the students’ learning.

Keywords: learning object, prospective teachers, development, evaluation.

Introduction

Teachers are responsible for tailoring instructional activities to meet curriculum standards and the unique interests and educational needs of their students. Teachers decide on the "conditions, time, and strategies" of using technology in the classroom. Those decisions may include selecting learning objects that enlarge and enrich their repertoire of instructional techniques for the content to teach (Bratina, Hayes & Blumsack, 2002). When e-learning systems replace the teacher as the center for learning, the teacher’s role shifts from lecturer to that of course developer and, once a course is in session, learning facilitator (Cohen & Nycz, 2006). Teachers can now engage their students in computer-based processes that help them build a personal knowledge base by manipulating aspects of simulated worlds and by analyzing and visualizing data. Computer-based modeling tools also allow students to express their theories in models that can be simulated, confronting students with the consequences of their ideas (Van Joolingen, Jong & Dimitrakopoulou, 2007).

To realize these potentials, the development of learning objects (LOs) for and their use in K-12 environments has become popular in related yet varied projects across the globe. In the CELEBRATE project organized by European SchoolNet (2002-2004), many such objects were produced for use in classrooms. In 2003, the UK government initiated a web portal to give teachers easy online access to a range of digital learning resources to support their teaching across the curriculum. One year later, the ARIADNE Foundation of Europe started to create tools and methodologies for producing, managing and reusing computer-based pedagogical elements and ICT-supported training curricula. In the USA, Apple’s Learning Exchange is one of the first repositories; the NSF-funded SMETE Digital Library project was developed as a learning object repository and is used as a resource and knowledge base by both K-12 and higher education instructors (McGreal, 2004). A third project in the USA, the MERLOT consortium, maintains a repository and uses peer review of learning objects as the basis for inclusion. Australia, Canada and New Zealand have also made efforts to engage K-12 schools in design and development initiatives (Bennett & McGee, 2005).

As the use of learning objects for teaching via technology has become more widespread in educational settings (Conceição & Lehman, 2003), most of the research literature on learning objects has focused on their specifications and potential designs. Even the National Educational Technology Standards for Teachers do not require teachers to develop their own technology-based learning resources, but only ask them to use such facilities. Interdisciplinarity in LO development teams may be necessary for high quality, as highlighted by Kay and Knaack (2005, p. 231), who stated: “Developing high quality learning objects is a daunting task involving collaboration among subject specialists, programmers, multimedia designers, and evaluators”.

It is often said that we don’t expect teachers to write their own textbooks, so why should we expect them to design their own technology-based materials (Bratina et al., 2002)? Ainsworth and Fleming (2006) reply to this question and argue that teachers do customize their textbooks for classroom use by suggesting an order in which to read chapters, explaining difficult terms, and providing exercises and worksheets. They propose that much can be gained by providing teachers with simple authoring tools.

Other researchers (Bell, 1999; Boyle, 2003; Merriënboer & Martens, 2002) suggest that instructional software templates may positively affect the efficiency of the development process and compensate for developers’ lack of experience. This can be beneficial for the authoring of instructional software because more people with limited instructional design and software production skills are becoming involved. Further, teacher involvement in the development of online learning resources has received attention only recently (Akpinar & Simsek, 2006; Kay & Knaack, 2005; Muirhead & Haughey, 2005; Lajoie, 2003; Oliver, Harper, Hedberg, Wills, & Agostinho, 2002; Recker et al., 2005), and researchers (Dunning et al., 2004; Jones, 2004) have suggested that with the addition of simple templates, teachers will be able to make their own objects. Haughey and Muirhead (2005) stress that teachers are likely to be able to develop objects requiring activities such as drag and drop, or putting items in a sequence.

In developing learning objects, different types of information might be created using traditional tools such as scanner software, spreadsheets, word processors, painting tools, HTML editors, GIF makers, video editors/capturers, and other general and special-purpose software. In LO terms, pictures, animations, simulations, sound files, hyperlinks, games, videos, and downloadable files are called assets. Assets can be combined to form larger files and sharable content objects (SCOs). The number, quality and orientation of screen elements loaded into a lesson are an issue in the development of LOs, even though Learning Content Management Systems (LCMSs) and authoring environments provide many facilities for creating and editing screen components. Using those facilities should not require experience and expertise, but it does demand great care, because research (Hannafin & Hooper, 1989; Li, 2006; Stemler, 1997) on the possible components of a computer-based lesson suggests that for effective learning, screen design decisions should balance learner attributes, content factors, and the processing requirements of the learning task.

Teachers with ready access to learning objects become designers who adapt and customize learning objects to fit their local needs and context (Dede, 2003; Littlejohn, 2004). In this context, learning objects become catalysts for creating locally relevant instructional solutions to support learning (Recker et al., 2005). Unfortunately, most LO repositories are in English, which creates a “language divide”. Teachers instructing in other languages have to do more than adapt and customize; they must develop their own LOs in the light of instructional theories and available LO repositories. Perhaps LO repositories in languages other than English, especially in developing countries, can be constructed and enriched by such efforts. Other researchers (e.g., Figg & Burson, 1999; Oliver et al., 2002; Waddoups & Wentworth, 2002) have also pointed out the importance of including teachers in the development process. While a number of design features have been incorporated by developers of learning objects in the literature, only a few studies have conducted a formal descriptive evaluation of the final learning object product (e.g., Cochrane, 2005; Krauss & Ally, 2005), and there are not enough studies examining the impact of learning objects developed by teachers on students’ achievement.

Problems of the Study

This research studied preservice science teachers’ development of learning objects in an LCMS and compared it with instructional designers’ LO development. The study provides a preliminary quantitative measure and evaluation of different authors’ use of assets, organization of assets, and instructional directions in the learning objects they create. This study aimed to:

(1) compare preservice science teachers’ and IDs’ development of K-12 science learning objects in terms of (a) number of assets (pictures, animations, simulations, sound files, hyperlinks, games, videos, downloadable files), (b) text density of each learning object (small, moderate or large amount of text), (c) number of instructional elements (advance organizers, questions and didactical directions), (d) number of screen orientations (templates, picture orientation, font types and sizes, colors, main topics, sub-topics and Sharable Content Objects (SCOs)), and (e) the quality of the LOs using the Learning Object Review Instrument (LORI, version 1.5 by Nesbit and Li, 2004), and

(2) investigate the effect of LOs with the targeted students in real classroom environments.

Method

Subjects

To investigate preservice science teachers’ development of learning objects in an LCMS, a series of studies was conducted with 40 subjects (20 preservice science teachers and 20 newly graduated instructional designers). During the study, in the 2006 spring and fall semesters, the preservice science teachers, the experimental group, were in their final year at a school of education, preparing to teach fields such as primary and secondary school science and mathematics. They complete their degrees in four or five years after a one-year English language preparation: those who will teach in secondary schools study for five years, the others for four. Subjects were selected on the basis of their availability during the research activities. Before the study, they had all taken at least one ICT-related course (e.g., Introduction to Computing), and they were familiar with and users of information and communication technologies.

The instructional designers (IDs), the control group, were new graduates of the same faculty who had studied in the Department of Computer Education and Educational Technology and completed courses covering instruction, learning, analyzing performance problems, and the design, development, implementation and evaluation of instructional strategies and products. All participants contributed to the study on a voluntary basis.

Procedure and Materials

The materials of this study included the Instructional Materials Development course, the LCMS environment BU-LeCoMaS, the learning objects for K-12 developed by the subjects using BU-LeCoMaS, the LO Review Instrument, and the achievement tests used in pre- and post-testing of K-12 students’ achievement. The preservice science teachers followed a thirteen-week Instructional Materials Development course focusing on the development, implementation and evaluation of ICT-based instructional materials. Special emphasis was given to the properties of learning objects. In this four-hour-per-week course (two hours of theoretical and two hours of practical activities), subjects were given opportunities for intensive experience with web-based learning materials; some learning activities in the course were based on developing online support materials and web sites with a commercial web editor to help K-12 students learn content. The course included practical sessions on how to create learning resources in the MM Flash and DreamWeaver environments.

When the preservice science teachers completed their course, both the instructional designers and the preservice science teachers were provided with a username and a password for the BU-LeCoMaS server and received one hour of training in the use of the BU-LeCoMaS learning content management system. The training was carried out in two sessions. The lab was equipped with one server and 20 PCs arranged in a U shape, all connected to the Internet. After training, participants were instructed to select a K-12 science learning task, then prepare and bring their materials (assets of learning objects) to the lab within a week to aggregate those materials into learning objects for K-12 science students. They were encouraged to use any sort of learning materials and assets, from text to animations and from static graphics to video segments. They were allowed to re-use graphics borrowed from the Internet; they were free to use anything they found that was appropriate for their instruction.

In the following week, subjects were asked to use the system facilities to develop a set of web-based materials as learning objects for part of their chosen K-12 science learning unit. They were required to prepare enough material for one lesson hour of study. While they used the system, one of the researchers was present in the lab to resolve technical problems but did not intervene in the participants’ work. Each participant developed one learning object, a total of forty, in K-12 science.

BU-LeCoMaS, a learning content development and management system (see Figure 1), is an easy-to-use LCMS that requires little or no technology expertise from content authors developing learning objects. It offers online material developers time-, place- and platform-independent content authoring. The architecture of BU-LeCoMaS can handle and execute any content input. It facilitates the integration of textual content, sound, movies and animations into software packages and enables multimedia platform creation. It has lesson templates, layout templates, and information creation and editing tools. Multiple users can easily and collaboratively construct, share and re-use content within the LCMS, as well as re-use it after development. Further, it supports the SCORM standards, allowing developed content to be used in different learning management systems, based on the idea of reusable learning content as sharable content objects.

To create a small set of learning content, an authorized author can use BU-LeCoMaS to sequence and group learning materials into a learning unit, the size of which varies and depends on its author. A list of available learning units and learning topics is displayed in the root window of the system. A learning topic is a subset of a learning unit. BU-LeCoMaS supports both constructing a learning unit and constructing an asset, a granular piece of learning content. The author specifies the title and description of the material he or she is creating, and selects a template, object type, tree-view type, background and foreground colors, and style sheet. The author chooses “create as template” or shares it as a “public template”, and decides whether or not to include it in the subject index of BU-LeCoMaS.

The subject index is used to search the object repository of BU-LeCoMaS, and authors use it to manage associations of their materials with learning units. Once the author enters the relevant information and selects, for example, LO Template Tutorial-1, the learning unit frame is provided. The author receives a screen where the name of his or her material appears with four sub-sections: Objectives, Introduction, Read & Study, and Images. The author can select any sections and designs. Finally, the author publishes his or her project and sets viewer permissions. Viewing options include user-only, the author’s students, or anyone browsing BU-LeCoMaS. The project can be downloaded as a SCORM-compliant zip archive for use outside BU-LeCoMaS.
 

Figure 1. BU-LeCoMaS learning object development platform.
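The SCORM-compliant zip archive described above is organized around a package manifest. The following is an illustrative sketch of what a minimal SCORM 1.2 imsmanifest.xml for a single-SCO learning object might look like; all identifiers, titles and file names here are hypothetical, and BU-LeCoMaS’s actual manifests may differ in detail:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical minimal SCORM 1.2 manifest for a one-SCO learning object -->
<manifest identifier="MANIFEST-EXAMPLE-LO" version="1.2"
          xmlns="http://www.imsproject.org/xsd/imscp_rootv1p1p2"
          xmlns:adlcp="http://www.adlnet.org/xsd/adlcp_rootv1p2">
  <organizations default="ORG-1">
    <organization identifier="ORG-1">
      <title>Example Learning Unit</title>
      <item identifier="ITEM-1" identifierref="RES-1">
        <title>Objectives</title>
      </item>
    </organization>
  </organizations>
  <resources>
    <!-- adlcp:scormtype="sco" marks launchable, trackable content -->
    <resource identifier="RES-1" type="webcontent"
              adlcp:scormtype="sco" href="objectives.html">
      <file href="objectives.html"/>
    </resource>
  </resources>
</manifest>
```

An LMS that imports the zip reads this manifest to discover the organization of items and which resource to launch, which is what makes the exported content reusable outside BU-LeCoMaS.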

On two consecutive days of developing learning objects for K-12 with BU-LeCoMaS, one day for the preservice teachers and one for the IDs, participants were given a usability questionnaire with 44 five-point Likert-type items and two essay items to measure the usability of the content development system, BU-LeCoMaS (five additional questions collected personal information). The scale had been developed and used previously (Akpinar & Simsek, 2006) to test the usability of a similar tool. The Cronbach’s alpha reliability coefficient of the questionnaire was estimated in this study as 0.91. Each participant’s total usability score was computed; the mean of those scores was 165.25 out of a possible 220.00.

Review of the Learning Objects and Data

The forty learning objects were analyzed by the two researchers. They studied the LOs to identify patterns and counted elements including (1) the number of assets (pictures, animations, simulations, sound files, hyperlinks, games, videos, downloadable files), (2) the text density (small, moderate or large amount of text) of each learning object, (3) the number of instructional elements (advance organizers, questions and didactical directions), and (4) the number of screen orientations (sub-topics as Sharable Content Objects (SCOs), templates, picture orientation, font types, font sizes, colors, and main topics) in each LO.

Table 1
Data on the two groups’ LOs

LORI items                                        Group         Mean   Std.Dev.   Rank

1. Content Quality: Veracity, accuracy, balanced presentation of ideas, and appropriate level of detail
                                                  P. Teacher    3.00     .95     19.73
                                                  I. Designer   3.30     .43     21.28

2. Learning Goal Alignment: Alignment among learning goals, activities, assessments, and learner characteristics
                                                  P. Teacher    2.67     .92     20.70
                                                  I. Designer   2.85     .45     20.30

3. Feedback and Adaptation: Adaptive content or feedback driven by differential learner input or learner modeling
                                                  P. Teacher    2.00     .85     16.13
                                                  I. Designer   2.47     .53     24.88

4. Motivation: Ability to motivate and interest an identified population of learners
                                                  P. Teacher    2.75     .89     20.30
                                                  I. Designer   2.65     .54     20.70

5. Presentation Design: Design of visual and auditory information for enhanced learning and efficient mental processing
                                                  P. Teacher    2.87    1.08     17.23
                                                  I. Designer   3.16     .48     23.78

6. Interaction Usability: Ease of navigation, predictability of the user interface, and quality of the interface help features
                                                  P. Teacher    3.22     .83     23.73
                                                  I. Designer   2.90     .44     17.28

7. Accessibility: Design of controls and presentation formats to accommodate disabled and mobile learners
                                                  P. Teacher    2.97     .63     22.93
                                                  I. Designer   2.74     .34     18.08

8. Reusability: Ability to use in varying learning contexts and with learners from differing backgrounds
                                                  P. Teacher    3.00     .76     23.03
                                                  I. Designer   2.82     .36     17.98

9. Standards Compliance: Adherence to international standards and specifications
                                                  P. Teacher    5.00     .00     20.50
                                                  I. Designer   5.00     .00     20.50

LORI Total                                        P. Teacher   27.50    5.37     20.20
                                                  I. Designer  27.91    3.37     20.80

# of assets                                       P. Teacher   11.90   10.02     16.15
                                                  I. Designer  18.65   11.29     24.85

Amount of text                                    P. Teacher    2.15     .58     25.10
                                                  I. Designer   1.55     .51     15.90

# of instructional elements                       P. Teacher    4.90    4.73     21.18
                                                  I. Designer   4.05    4.03     19.83

# of screen orientations                          P. Teacher    3.50    1.35     13.55
                                                  I. Designer   6.30    2.36     27.45

Usability score                                   P. Teacher  172.45   18.89     23.78
                                                  I. Designer 158.05   12.14     14.75

Establishing a learning object repository for various levels requires criteria to help teachers develop, submit and assess LOs (Akpinar & Simsek, 2006). These criteria are crucial to ensure the quality and accessibility of resources in the repository. For that purpose, Nesbit and Li (2004) developed the Learning Object Review Instrument (LORI 1.5). This study used it based on evidence that the LORI can reliably assess some aspects of LOs. LORI 1.5 uses nine items with brief descriptive rubrics associated with each item and a Likert-style five-point response scale scored from low (1) to high (5). If an item is judged not relevant to the LO, or if the reviewer does not feel qualified to judge that criterion, the reviewer may opt out of that item by selecting “not applicable”. The items of LORI 1.5 are given in the first column of Table 1.
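The LORI scoring scheme just described, nine items rated 1 to 5 with a “not applicable” opt-out, can be sketched as follows. The item keys and the example ratings are illustrative, not the study’s data; opted-out items are represented as None and excluded from the overall rating:

```python
# Sketch of LORI 1.5 scoring: nine items rated 1-5; items the reviewer
# opted out of (None) are excluded from the overall rating.

def lori_overall(ratings):
    """Sum the rated items, skipping any marked not-applicable (None)."""
    return sum(r for r in ratings.values() if r is not None)

# Hypothetical single review of one LO (not data from the study).
review = {
    "content_quality": 3, "goal_alignment": 3, "feedback_adaptation": 2,
    "motivation": 3, "presentation_design": 3, "interaction_usability": 3,
    "accessibility": None,  # reviewer opted out of this item
    "reusability": 3, "standards_compliance": 5,
}

print(lori_overall(review))  # 25 = sum of the eight rated items
```

With all nine items rated, the maximum overall score is 45, which is the ceiling the study later uses when selecting LOs for classroom trials.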

In order to evaluate the LOs developed by the preservice science teachers (see Figure 2 for an example), two researchers reviewed and rated the LOs individually using LORI scoring sheets. Following the reviewing and rating of the 20 LOs, the researchers combined the ratings and computed average ratings on each of the nine items for each LO.

The reviewers’ overall rating for an LO was obtained by summing the points given to each of the nine items. Next, the LOs developed by the control group, the IDs (see Figure 3 for an example), were made available on a web server to the IDs, who independently reviewed and rated the twenty developed LOs using the LORI; the twenty IDs’ ratings were averaged. The two researchers who rated the preservicers’ LOs also rated the IDs’ LOs.

The correlation between the researchers’ ratings and the IDs’ ratings was high (0.96), so the LO ratings of the preservice and control groups were combined. A formative reliability analysis of the LORI 1.5 data revealed an overall internal-consistency reliability (Cronbach’s alpha) of 0.94.
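The internal-consistency coefficient reported here can be computed from a ratings matrix as follows. This is a minimal sketch of the standard Cronbach’s alpha formula; the rating matrix passed in is hypothetical (rows would be learning objects, columns the nine LORI items), not the study’s data:

```python
# Minimal sketch of the Cronbach's alpha computation used to check the
# internal consistency of rating data: alpha = k/(k-1) * (1 - sum(item
# variances) / variance of total scores).
from statistics import variance  # sample variance (n-1 denominator)

def cronbach_alpha(rows):
    """rows: list of score rows, one per rated object; columns are items."""
    k = len(rows[0])                                   # number of items
    item_vars = [variance(col) for col in zip(*rows)]  # per-item variance
    total_var = variance([sum(row) for row in rows])   # variance of totals
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Perfectly consistent items yield alpha = 1.0
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))  # 1.0
```

Values near 0.9 and above, such as the 0.91 and 0.94 reported in this study, are conventionally read as high internal consistency.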
 

Figure 2. A learning object developed by a preservice teacher.

 
Figure 3. A learning object developed by an ID.
 

To test whether the preservice science teachers’ and the IDs’ LOs were significantly different in terms of number of assets, amount of text, number of instructional elements, number of screen orientations and quality, Mann-Whitney U tests were conducted on the groups’ data (Table 1); SPSS-estimated skewness and kurtosis measures showed that the data were not normally distributed. The tests revealed that:

(a) The preservice science teachers used significantly fewer assets in their LOs than the IDs (U=116.50; p=0.024);

(b) The preservice science teachers used significantly more text in their LOs than the IDs (U=102.50; p=0.003);

(c) The preservice science teachers and the IDs did not differ significantly in the number of instructional elements in their LOs (U=190.00; p=0.317);

(d) The preservice science teachers used significantly fewer screen orientations in their LOs than the IDs (U=60.00; p=0.000);

(e) The quality of the LOs developed by the participants was rated using the LORI and compared using the Mann-Whitney U test (Table 1). As the rating of LOs was carried out for the nine individual LORI items as well as for the overall rating, the statistical test was conducted for all of them. The preservice science teachers included significantly fewer Feedback and Adaptation features in their LOs than the IDs (U=120.50; p=0.031). The groups’ LOs differed neither in the other properties nor in the overall quality that the LORI measures;

(f) Although the groups did not differ on most of the LORI items, the usability questionnaire data were examined to see whether the preservice science teachers as well as the IDs found the LO development platform usable. The results indicate that the groups’ average perception of the BU-LeCoMaS facilities was positive in general: the current state of most facilities was confirmed. However, the preservice science teachers found the BU-LeCoMaS facilities more usable than the IDs did (U=112.50; p=0.018).
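The Mann-Whitney U statistic used in the comparisons above can be sketched as follows. The two samples here are hypothetical per-LO asset counts, not the study’s raw data, and only the U statistic is computed; in practice, the associated p-value would come from the sampling distribution of U (e.g., via scipy.stats.mannwhitneyu):

```python
# Sketch of the Mann-Whitney U statistic: rank the pooled observations
# (midranks for ties), take the rank sum of one group, and convert it to U.

def mann_whitney_u(a, b):
    combined = sorted(a + b)

    def rank(v):
        # midrank: first position of v plus half the remaining tied span
        first = combined.index(v) + 1
        count = combined.count(v)
        return first + (count - 1) / 2

    r_a = sum(rank(v) for v in a)                 # rank sum of group a
    u_a = r_a - len(a) * (len(a) + 1) / 2
    u_b = len(a) * len(b) - u_a                   # complementary statistic
    return min(u_a, u_b)                          # report the smaller U

teachers = [5, 7, 8, 12, 20]      # hypothetical asset counts per LO
designers = [9, 15, 18, 22, 30]
print(mann_whitney_u(teachers, designers))  # 4.0
```

A small U relative to the maximum n1*n2 indicates that one group’s values tend to sit below the other’s, which is the pattern reported for assets and screen orientations.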


Students’ Evaluation of the Developed Learning Objects

Evaluation rubrics such as the LORI give a preliminary idea of the instructional quality of learning objects. Studies carried out with the actual users of LOs, students, may provide data about the effects of LOs on student achievement in the relevant content area. Kay and Knaack (2005) stressed that a set of pre- and post-test content questions is important for assessing whether any learning actually occurred. Hence, to investigate the effect of the K-12 science LOs with the targeted students in real classroom environments, all forty LOs were ordered according to their overall LORI scores and those that received an overall rating of 30 or more were selected: as the maximum overall LORI score for an LO is 45, two-thirds of the top score was defined as the threshold. Eight LOs received an overall LORI rating between 30 and 36. The first three LOs given in Table 2 were developed by the preservice science teachers and the other five by the IDs. The selected LOs were then taken to classrooms, where students of the target grades studied them for one lesson hour.

The samples for the evaluation studies given in Table 2 were obtained from five local schools where the preservicers do their practice teaching. Before and after the students’ work with the LOs, a pretest and a post-test containing parallel multiple-choice items measuring students’ achievement were administered. While the students were working with the LOs, their classroom teachers and a researcher were present, but the class teachers only explained how the students would work with the LO facilities and helped them use those facilities. All students studied the LOs independently.

Table 2
Statistics on the students’ evaluation of selected learning objects

LO Subject            Grade   Sample   Pre-Post   Pretest   Pretest   Posttest   Posttest   Paired    df   Sig.
                              size     items      mean      St.Dev.   mean       St.Dev.    t              (2-tailed)*
Mirrors                 4      20        10        3.20      1.936      5.10      1.447      6.371    19   0.000
Color formation         4      20        10        2.80      0.833      4.53      1.375      4.989    19   0.000
Atoms                   7      18        10        5.77      2.414      7.27      2.539      3.319    17   0.004
Motion                  7      17        10        4.82      1.976      5.76      1.348      2.791    16   0.013
Electric Circuits       8      18        10        3.44      1.099      6.22      1.003      8.444    17   0.000
Solubility              9      24        10        3.00      1.685      4.67      1.351      4.097    23   0.000
H. Projectile Motion    9      47        10        3.32      1.353      2.74      1.276      2.230    46   0.031
Frictional force        9      16        10        4.00      1.549      4.88      1.996      2.573    15   0.021

*p < 0.05

The answers to the pre- and post-tests were scored and analyzed. In all eight applications, skewness and kurtosis measures showed that the data were normally distributed; hence, paired-sample t tests were conducted to compare the pre- and post-test data. The analysis (Table 2) revealed that seven of the LOs helped the sample students improve on their pretest scores in the learning tasks of the LOs, while one, the LO about Horizontal Projectile Motion (HRM) for the ninth grade, did not; instead, it lowered the students’ pretest scores.
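The paired-sample t test applied to each pre/post pair in Table 2 can be sketched as follows. The score lists below are hypothetical, not the study’s raw data; the study’s actual statistics appear in Table 2:

```python
# Sketch of the paired-sample t test: compute each student's gain,
# then t = mean(gain) / (stdev(gain) / sqrt(n)), with df = n - 1.
from math import sqrt
from statistics import mean, stdev  # stdev uses the n-1 denominator

def paired_t(pre, post):
    d = [b - a for a, b in zip(pre, post)]     # per-student gain
    n = len(d)
    t = mean(d) / (stdev(d) / sqrt(n))
    return t, n - 1                            # t statistic and df

pre = [2, 3, 4, 3, 5]     # hypothetical pretest scores
post = [4, 4, 6, 5, 6]    # hypothetical posttest scores
t, df = paired_t(pre, post)
print(round(t, 3), df)    # 6.532 4
```

Note that t is signed: a negative t (post-test mean below pretest mean) is the pattern the Horizontal Projectile Motion LO produced.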

Discussions and Conclusions

In the design of K-12 science LOs, according to the data analysis,

  1. the preservice science teachers embedded fewer assets (pictures, animations, simulations, sound files, hyperlinks, games, videos, downloadable files) and fewer screen orientations (sub-topics as Sharable Content Objects (SCOs), templates, picture orientation, font types and sizes, colors, main topics) in their LOs than did the IDs;

  2. the preservice science teachers authored more text in their LOs than the IDs;

  3. the preservice science teachers used a similar number of instructional elements (advance organizers, questions and didactical directions) in their LOs as the IDs. The preservicers embedded more text and fewer assets and screen orientations in their LOs. This may be because the preservicers wanted to explain concepts and procedures directly with text and to support the text with other representations through screen objects.

The overall quality of the LOs the groups developed was similar; the quality of the preservicers’ LOs differed from the IDs’ LOs only on the feedback and adaptation item. While 50% of the preservice science teachers’ LOs and 20% of the instructional designers’ LOs received low ratings from the reviewers on the adaptivity of content to learner needs that LORI item 3 measures, 75% of the instructional designers’ LOs received moderate ratings and 20% of the preservice science teachers’ LOs received high ratings. Low-rated LOs are unable to tailor instructional activities to the specific needs of learners: no model of the learner is maintained in those LOs, which reduces the effectiveness of the learning objects. Those LOs mainly present content and do not use learner responses to adapt subsequent presentations or deliver rich feedback. In almost one-half of those learning objects, interactivity for navigation or selection of information is supported, but the delivered feedback is poor.

The reviewers’ ratings of the LOs developed by the preservicers demonstrated that the preservicers are able to develop LOs of “moderate” quality. The LOs are in “tutorial” mode, and most participants sequenced a few SCOs to form an LO. The contents are mostly presented in a didactic manner, and student-centered activities are not common in the LOs.

Analyses of the developed LOs showed that participants preferred to use granular resources, which agrees with the findings of a study by Recker et al. (2005). Participants seemed to create simple projects with somewhat directed activities. This may be because the participants were novice developers. Comparison of the preservicers’ and the instructional designers’ LOs on the basis of the reviewers’ LORI 1.5 ratings reveals that the groups’ LOs did not differ in overall ratings (except for LORI item 3, the groups did not differ on the eight other individual LORI 1.5 items). The quality of the preservice science teachers’ LOs is similar to that of the IDs’ LOs.

The preservicers’ chosen material type, the tutorial, is somewhat different from the materials the teachers developed in the study of McCormick et al. (2004), where many of the activities were designed to reinforce information. Recker and her colleagues stated that teachers with little teaching experience are less likely to adapt resources and more likely to use them unchanged. This study had a different outcome, because formative evaluation showed a need for preservicers to receive training in the development of learning materials.

The current study did not investigate whether students developed any misconceptions due to the LOs; different learning/teaching strategies should be studied further. The effect of the HRM LO may demonstrate that evaluation rubrics such as the LORI do not always provide enough information about the quality of LOs, and additional evaluation strategies may be needed. However, this study did validate the LORI to a certain extent.

Limitations in the design and development of these experimental studies prevent generalizing the findings to the larger preservice-teacher population. However, they do provide preliminary insights into the role of teachers as developers. The results are encouraging and show that preservice science teachers and IDs are able to design and develop LOs which help students to learn. This finding supports the aggregation of content objects into learning objects by preservice science teachers; it should encourage LO projects to ask teachers to evaluate and use LOs and to involve science teachers in developing LO repositories. Pilot LO evaluation studies with students show that the LORI may be used to predict the quality of LOs; however, this tool should be used with caution.

In order to avoid undesired effects, LO repositories should contain detailed usage information. Information patterns should include a suggested sequence of activities based on past success; each activity should be linked to additional information regarding the purpose of the activity, what the activity entails, and guidelines for teacher intervention, including when to intervene. The authors’ work on learning resource development environments focuses on (1) a learners’ record repository containing information about students’ learning difficulties, teachers’ experiences in overcoming those difficulties, and information about student reactions that will help teachers; (2) a global task pool with critiques and suggestions about each task or regime, enabling teachers to access quality authentic tasks validated by colleagues; and (3) an experience repository with information about students’ task manipulation and learning styles, actions to follow an activity, and the type of additional help and intervention that may be needed.
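The three repositories proposed above can be illustrated as simple data records. The following is only a minimal sketch of such a data model; all class and field names are hypothetical and are not taken from the authors’ actual system:

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical record types illustrating the three proposed repositories.

@dataclass
class LearnerRecord:
    """(1) Learners' record repository: difficulties and what helped."""
    difficulty: str                                      # observed learning difficulty
    remedies: List[str] = field(default_factory=list)    # teachers' strategies that overcame it
    reactions: List[str] = field(default_factory=list)   # noted student reactions

@dataclass
class Task:
    """(2) Global task pool: authentic tasks validated by colleagues."""
    description: str
    critiques: List[str] = field(default_factory=list)   # colleagues' critiques and suggestions

@dataclass
class UsageExperience:
    """(3) Experience repository: how an activity was used in practice."""
    activity: str
    purpose: str              # what the activity is for
    follow_up: str            # suggested action to follow the activity
    intervention_hint: str    # when and how a teacher should intervene

# Example: logging a difficulty and a remedy that worked.
record = LearnerRecord(
    "confuses heart valves with chambers",
    remedies=["animate blood flow step by step"],
)
print(record.difficulty)
```

Linking each `UsageExperience` to a `Task` and its related `LearnerRecord` entries would give teachers the kind of contextual usage information the paragraph above calls for.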

Further work in this area should consider (1) collaborative LO design and how teachers and prospective teachers can be enabled to develop LOs that meet at least the criteria that the LORI measures; (2) how teachers with LOs can play a meta-cognitive role for students by probing their knowledge and reasoning and monitoring participation and engagement; (3) expanding the framework for supporting students through teachers’ LO authoring by considering the different backgrounds of students and the preferred teaching/learning styles of teachers and students; and (4) robust methods for evaluating students and teachers using different task regimes.

References

Ainsworth, S. E. & Fleming, P. F. (2006). Teachers as instructional designers: Does involving a classroom teacher in the design of computer-based learning environments improve their effectiveness? Computers in Human Behavior, 22, 131-148.

Akpinar, Y. & Simsek, H. (2006). Learning object organization behaviors in a home-made learning content management. Turkish Online Journal of Distance Education, 7(4), Article 3. [Retrieved at 12 January 2007 from http://tojde.anadolu.edu.tr/tojde24/pdf/article_3.pdf].

Bell, B. (1999). Supporting educational software design with knowledge-rich tools. International Journal of Artificial Intelligence in Education, 10(1), 46-74.

Bennett, K. & McGee, P. (2005). Transformative power of the learning object debate. Open Learning, 20(1), 15–30.

Boyle, T. (2003). Design principles for authoring dynamic, reusable learning objects. Australian Journal of Educational Technology, 19(1), 46-58.

Bratina, T. A., Hayes, D. & Blumsack, S. L. (2002). Preparing teachers to use learning objects. The Technology Source, November/December 2002.

Littlejohn, A. (2004). Reusing online resources: A substantial approach to e-learning. Routledge Falmer. London.

Cochrane, T. (2005). Interactive QuickTime: Developing and evaluating multimedia learning objects to enhance both face-to-face and distance e-learning environments. Interdisciplinary Journal of Knowledge and Learning Objects, 1, 33-54.

Cohen, E. B. & Nycz, M. (2006). Learning objects and e-learning: An informing science perspective. Interdisciplinary Journal of Knowledge and Learning Objects, 2, 23-34.

Conceição, S. & Lehman, R. (2003). An evaluation of the use of learning objects as an instructional aid in teaching adults. Paper presented at the 2003 Midwest Research to Practice Conference in Adult, Continuing, and Community Education. Ohio State University, Columbus, Ohio.

Dede, C. (2003). The role of emerging technologies for knowledge mobilization, dissemination, and use in education. [Retrieved at 12 January 2007 from http://www.virtual.gmu.edu/edit895/knowlmob.html].

Figg, C. & Burson, J. (1999). Student teachers as instructional designers: A first experience. In Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications (p. 1671). Chesapeake, VA: AACE.

Hannafin, M. J. & Hooper, S. (1989). An integrated framework for CBI screen design and layout. Computers in Human Behavior, 5(3), 155-165.

Haughey, M. & Muirhead, B. (2005). The pedagogical and multimedia designs of learning objects for schools. Australasian Journal of Educational Technology, 21(4), 470-490.

Kay, R. & Knaack, L. (2005). Developing learning objects for secondary school students: A multi-component model. Interdisciplinary Journal of Knowledge and Learning Objects, 1, 229-254.

Krauss, F. & Ally, M. (2005). A study of the design and evaluation of a learning object and implications for content development. Interdisciplinary Journal of Knowledge and Learning Objects, 1, 1-22.

Lajoie, S. P. (2003). Enhancing learning and teaching with emergent technologies. Keynote presentation, ED-MEDIA Conference, June 26. Honolulu, HI. [Retrieved at 13 October 2006 from http://www.aace.org/conf/edmedia/speakers/lajoie.htm].

Li, Z. (2006). Effectively incorporating instructional media into web-based information literacy. The Electronic Library, 24(3), 294-306.

McCormick, R., Scrimshaw, P., Li, N. & Clifford, C. (2004). CELEBRATE Evaluation report (version 2). [Retrieved at 13 October 2006 from http://celebrate.eun.org/].

McGreal, R. (2004). Online education using learning objects. Routledge Falmer, New York.

Van Merriënboer, J. J. G. & Martens, R. (2002). Computer-based tools for instructional design. Educational Technology, Research and Development, 50, 5-9.

Muirhead, B. & Haughey, M. (2005). An Assessment of the Learning Objects, Models and Frameworks. Report Developed by The Le@rning Federation Schools Online Initiatives. [Retrieved at 13 October 2006 from http://www.thelearningfederation.edu.au].

Nesbit, J. C. & Li, J. (2004). Web-based tools for learning object evaluation. Proceedings of the International Conference on Education and Information Systems: Technologies and Applications, 2, 334-339.

Oliver, R., Harper, B., Hedberg, J., Wills, S. & Agostinho, S. (2002). Formalising the description of learning designs. In A. Goody, J. Herrington & M. Northcote (Eds), Quality Conversations: Research and Development in Higher Education, V. 25,  496-504. Jamison, ACT: HERDSA.

Recker, M., Dorward, J., Dawson, D., Mao, X., Liu, Y., Palmer, B., Halioris, S. & Jaeyang, P. (2005). Teaching, designing, and sharing: A context for learning objects. Interdisciplinary Journal of Knowledge and Learning Objects, 1, 197-216.

Stemler, L. K. (1997). Educational characteristics of multimedia: A literature review. Journal of Educational Multimedia and Hypermedia, 6(3/4), 339-359.

Van Joolingen, W. R., de Jong, T. & Dimitracopoulou, A. (2007). Issues in computer supported inquiry learning in science. Journal of Computer Assisted Learning. [Retrieved at 12 January 2007 from http://www.blackwell-synergy.com/toc/jca/0/0].

Waddoups, G. & Wentworth, N. (2002). Restructuring teacher education: Lessons from evaluating preservice teacher products using NETS. In C. Crawford et al. (Eds.), Proceedings of Society for Information Technology and Teacher Education International Conference (pp. 1821-1825). Chesapeake, VA: AACE.
 

About the Authors

Yavuz Akpinar is an associate professor at Bogaziçi University, Department of Computer Education and Educational Technology, in Istanbul, Turkey. His research interests include the design of interactive learning environments, human-computer interaction, graphical user interfaces, simulations in learning, authoring systems for software design, educational testing, designing and evaluating multimedia and hypermedia in education and training, interactive video, distance education, learning object and e-learning design, and learning management systems.

Akpinar@boun.edu.tr

Huseyin Simsek is an instructor at Bogaziçi University, Department of Computer Education and Educational Technology. His research interests include teaching programming and scripting, the design of interactive learning environments, authoring systems for software design, computer-mediated communication, web-based learning design, and learning management systems.

Huseyin.simsek@boun.edu.tr

 
