Editor’s Note: Instructional design continues to be explored as a means of optimizing the learning experience. It requires integrating what we know about the learner, what needs to be learned, and how proficiency is measured. In the end we test performance and gather data on learner perceptions. Performance data tells us how effective the learning was and how well the performance criteria were achieved. Learner perceptions can help in interpreting results and identifying areas for improvement. When students evaluate their own learning, this is perceived learning, not performance evaluation; such data are subjective, and additional research is needed for confirmation.
Figure 4a: Practice simulation. Figure 4b: Activity simulation.
If completion of one activity leads to another, place both on the same internal navigation structure.
Give clear and appropriate instructions – Students should be able to understand clearly what is expected of them (what they should do as a whole and what they should post to the LMS). If students have questions regarding the activities, they can discuss them with others in the forums.
Add guided or help text where appropriate – When a learning activity was designed around a simulation, we added guided or help text to motivate the learners to complete the activity. Activities that might need further clarification for individual students were built on forums (Figure 5a), where students could ask questions and get help from other students and the teacher.
Figure 5a: Activity forum. Figure 5b: Discussion forum.
Add relevant feedback for students’ interactions – Students in an OLE need to receive feedback on their responses. Therefore, we added automatic feedback to most of the activities. Activities that led to learner-learner and learner-teacher interactions, where students could receive feedback from the teacher and other students, were built on forums in the LMS.
Besides activity forums, we had discussion forums (Figure 5b), each providing a discussion topic. Students learned by expressing ideas, commenting on others’ ideas, asking questions and replying to others’ questions on the topic of the discussion forum.
Evaluations (quizzes):
We attempted to constructively align the interactive learning content of the course units, activities and other components of the OLE with the relevant learning objectives. To determine whether students achieved the desired learning objectives, we created quizzes with sets of multiple-choice questions. These quizzes gave the students an opportunity to evaluate their learning achievements by themselves.
Our instructional designers had little to do in designing the quizzes. The subject matter expert (the course coordinator or the person responsible for providing teaching materials to the instructional designers) provided the questions for quizzes and assignments, and the instructional designers added them to the LMS. However, in designing a quiz we need to:
Use only the questions given or accepted by the subject matter expert
Place the quiz at the end of each course section
Add relevant questions - Base the questions in a quiz only on the course section where it is placed.
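A minimal sketch of such a multiple-choice quiz item with automatic feedback might look as follows; the question, options and feedback strings are invented for illustration and are not taken from the BIT course materials:

```python
# One multiple-choice question with per-answer automatic feedback.
# (Hypothetical content; a real quiz item would come from the subject
# matter expert, per the first guideline above.)
QUESTION = {
    "text": "Which HTML tag creates a hyperlink?",
    "options": ["<a>", "<p>", "<img>", "<ul>"],
    "answer": 0,  # index of the correct option
    "feedback": {
        True: "Correct: the <a> (anchor) tag creates hyperlinks.",
        False: "Not quite - revisit the section on links.",
    },
}

def grade(question, chosen):
    """Return (is_correct, feedback) for the learner's chosen option."""
    correct = chosen == question["answer"]
    return correct, question["feedback"][correct]

ok, message = grade(QUESTION, 0)
print(ok)       # True
print(message)  # the "Correct: ..." feedback string
```

Automatic feedback of this kind is what lets students evaluate their learning achievements by themselves, without waiting for the teacher.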
In a previous paper we reported that students were satisfied with the OLE and managed to learn more efficiently using the OLE once they were used to it (Weerasinghe et al., 2008). The results reported in that paper also implied that the OLE could support learners with different learning style preferences. In this paper we report (1) which design components led to student satisfaction with the OLE and its content, (2) which design features and strategies led to learning effectiveness as perceived by the students, and (3) whether there is a relationship between students’ learning styles and their learning design preferences.
The students’ experiences were gathered using debriefings and four types of questionnaires:
LSQ: the 40-item Learning Style Questionnaire obtained from Peter Honey and Mumford publications. Students rated a set of 40 statements, indicating whether they agreed or disagreed with each. (This is the same questionnaire, and the same result set, reported in our previous paper.)
LEEQ (Learning Environment Evaluation Questionnaire): A questionnaire addressing specific attitudes of students towards facilities and features in the OLE and the LMS.
LCEQ (Learning Content Evaluation Questionnaire): A questionnaire targeting students’ experience and attitudes towards the elements of interactive learning content such as graphics, animations, simulations and activities.
The LEEQ and LCEQ were developed by the authors of this paper. They consisted of Likert-scale questions, dichotomous questions, filter (contingency) questions and unstructured, open-ended questions that let the students write comments freely.
The students’ learning experiences reported on in this paper were gathered at three face-to-face meetings: two during the semester and one after the final examination. The LSQ was distributed among the students at the 1st meeting, as reported in a previous paper (Weerasinghe et al., 2008). The students who expressed their willingness to participate in future meetings were invited to the 2nd and 3rd meetings. Only 27 students participated in all three meetings: 9 females and 18 males, the majority in the age group 20-25.
The students answered the LEEQ and participated in the debriefing session at all three meetings. At the 3rd meeting students answered the LCEQ as well.
The 1st author of this paper played multiple roles in the design experiment reported on here. She worked as the instructional designer, content developer, author of the student manual and teacher of the course. Another instructional designer at the UCSC, who was officially responsible for the course development work, helped the 1st author package and upload the learning content to the LMS.
The students’ reports on the LCEQ and the debriefings were used to identify the design components and features that led to student satisfaction with the OLE.
At the 1st meeting, the majority of the students appreciated the OLE for delivering downloadable student manuals and providing quizzes. However, from the 1st meeting to the 3rd, the students came to appreciate the interactive learning content and the forums as much as the quizzes. At the 3rd meeting the students no longer even mentioned the student manuals; instead, they expressed their satisfaction with components such as the interactive learning content, private messaging, chat room, forums and quizzes of the OLE. They reported that those components were quite useful in their studies. However, they added that they would have liked more scheduled chat sessions and audio-video content.
We drew a graph (Graph 1) based on student satisfaction with the components of the OLE as reported in the LCEQ. According to their responses, the students appreciated the interactive learning content more than the other components of the OLE (Graph 1). Its features were appreciated especially for their helpfulness and usefulness for learning.
Structure of the learning content:
At the 3rd meeting the students reported in the LEEQ that the navigation structure for accessing the learning content was appropriate and user-friendly. More than 70% of the students commented that the contents of the OLE were properly organized on the Topic Outline (menu) page.
Design of the learning content:
About 60% of the students who participated in the 3rd meeting reported in the LCEQ that the online learning material had been very useful, and altogether more than 96% of the students replied that it had been useful in their studies (Table 1). One student reported, “Studying material gave a big help that I never expected. When there was a problem we received so many related answers from our colleagues. I would like this LMS to help us in our future studies too.” Another student noted that she could apply the knowledge she obtained from the OLE in her other studies: “The LMS content encouraged us to do the BIT exam well. The LMS content was very useful for us. We could learn a lot from them. I could use the knowledge I obtained from BIT online learning content in my other exams in IT.”
Table 1: Students’ ratings of the online learning content
| Online learning content was useful for the studies | 96.30% |
| Learning objectives were clear and students could achieve them | 83.95% |
| Animations clarified the text content | 92.59% |
| Animations with guided text explained the steps or procedures | 85.19% |
| Simulations helped to study the lesson | 92.59% |
| Graphics clarified the meaning of text | 92.59% |
The students’ ratings of the design features of the learning content were very high (Table 1). A student who did not attend any formal teaching sessions for the BIT degree courses reported, “This was the first time I experienced such a learning method. As a student who totally depended on the LMS content, I regard that everything in it is good, specially the interactive learning content. It was easy to memorize facts when they were presented in lists and with interactive animations”.
The students found the simulations, other animations and graphics very helpful in learning the lessons (Table 1). The following are three quotes taken from the LCEQ.
“Slides were very interesting to see. So, we could study without getting bored.”
“Animated lessons were very good and easier to remember than studying them through notes.”
“Interactive learning contents were very useful to understand the theories.”
The text of the online learning content was appreciated for its simple language, font size and font type (Table 2). The students also appreciated the presentation of text content in lists. For example, one student reported in the LCEQ, “OLE presented all the lessons in summaries. Therefore, we could finish the lesson quickly having knowledge about what we saw and read in the content”, and another student reported, “Lesson content was presented in bullets and it is useful to learn without wasting time.”
Table 2: Students’ ratings of the text design
| Simple language | 81.48% |
| Adequate amount of text on one page | 44.44% |
| Enough white space between the blocks of text | 48.15% |
| Size of the text is appropriate | 81.48% |
| Font type is good for reading the text for a long time | 81.48% |
About 56% of the students found that there was not an adequate amount of text on a page, and 52% found that there was not enough white space between the blocks of text (Table 2). In the debriefing session the students said that some pages had too much text. Their comments on this problem referred to another course in the LMS, not to our online learning content. Nevertheless, we appreciated the comments because they helped us improve the set of instructional design guidelines presented in the next section of this paper.
Design of the Learning Activities:
The students of our OLE found the learning activities quite helpful in their studies (Table 3). Also, according to their reports, they could stay online and study in the OLE for an average of 2.5 hours per visit. This can be interpreted to mean that the students found learning in the OLE interesting, and that when given autonomy over their own learning they could study for longer periods of time.
At the first meeting we found that only 26% of our sample were self-studying students who reported not receiving any formal teaching for their BIT degree studies. However, at the 3rd meeting more than 85% replied that they already were, or could be, self-studying students in the OLE. Also, about 96% of the students reported that they could learn actively in the OLE. Elaborating on their questionnaire replies, the students reported that the OLE involved them actively in learning through different types of learning activities, and that they could study collaboratively with other students through forums and private messages in the LMS.
Table 3: Students’ ratings of the learning activities
| Activities were helpful for learning | 85.00% |
| Activities could be completed after studying the learning content | 77.78% |
| Forums helped to discuss the learning activities | 59.26% |
| Forums helped to discuss other learning problems | 60.12% |
| Practice quiz helped to evaluate learning achievements | 92.59% |
| Could actively learn in the OLE | 96.30% |
| Was or can be a self-learner in the OLE | 85.19% |
| Maximum duration of learning time per visit | 2.5 hours |
Even though we designed discussion and activity forums, we did not design any group learning activities, owing to administration and online facilitation problems in the BIT degree programme. Surprisingly, however, more than half of our students found the forums helpful for discussing the learning activities with others in the LMS. The students also found the forums useful for discussing their learning-related problems. The following are some of the students’ comments on the helpfulness of the forums.
“Subject Discussions were helpful to share our knowledge with others and to get more opinions from them.”
“I could ask questions from the teacher and the students.”
“When there was a problem, we received so many related answers from our colleagues.”
The students could evaluate their learning achievements by themselves using the practice quizzes. They reported that the quizzes helped them study the important areas of the lessons and face the exam confidently. One student commented, “Almost all the LMS questions were based on the syllabus. When I completed a section, I could go to the particular LMS quiz and evaluate my knowledge. That was a huge benefit to me”.
The LSQ (Learning Style Questionnaire) showed that there were 8 Activists, 12 Reflectors, 7 Theorists and 5 Pragmatists in our sample (Weerasinghe et al., 2008). The students’ preferences for design components of the OLE (their learning design preferences), as reported in the LCEQ, were analysed against their learning style preferences (LSPs). The results revealed that the students appreciated the features of the learning content that supported their own learning styles. For example, while Activists were happy with the online learning content because they could do the activities and discuss them in forums with the other students and the teacher, Reflectors were happy with the animated lessons, which helped them remember the lessons more easily than text-based notes (shaded area A in Table 4). Further, the students requested more features or facilities that would again support their own learning styles. For example, Pragmatists wanted more support for the practical activities, while Theorists asked the UCSC to provide a search facility for finding text in the learning content (shaded area B in Table 4).
Table 4: Students’ comments by learning style preference
| | Activist | Pragmatist | Theorist | Reflector |
| I like the online learning material | Content presented in point form; could learn without wasting time | Could discuss subject problems with the teacher and other students | Could study lessons with pictures, animations and activities interestingly | Lessons in summaries; could study quickly |
| | Interactive learning content; very useful in our studies | Received hands-on experience in using software without having it running in the computer | Interactive learning content; could solve our problems | (A) Animated lessons; easier to remember than going through notes |
| | (A) Could do the activities and discuss in forums | Practice quiz and activities; very useful | The learning content; very clear and easy to understand | Could ask questions from the teacher and students |
| | Interesting and could complete lessons without getting bored | Encouraged us to do the examination well | Had all learning content | Practice quizzes; helped to study the key areas of the lessons and evaluate learning achievements |
| | Simulations explaining how to do the tasks | Could use that knowledge in other activities/examinations | Subject discussions; useful to share my knowledge and get more ideas from others | Had all learning content we need to study |
| I would like to have more features/facilities | Add more challenging activities, activities that lead to experiments, and group activities | Provide more activities and quizzes | Upload all content earlier so that we can go through them several times before the exam | Upload all content earlier so that we can go through them several times before the exam |
| | Provide more quizzes | (B) Give more support for practical activities | (B) Add a search facility to find text in the learning content | Add a help page and a guide to use the LMS |
The student experiences reported on in this paper indicate that we used appropriate instructional design guidelines to design our online learning content. Further, they provided information for adding more guidelines to enhance learner satisfaction and learning effectiveness in future designs of the OLE.
The results shown in Graph 1 and the students’ experiences reported in the debriefings show that student satisfaction with the OLE was driven by (1) interactive learning content, (2) practice quizzes and (3) learning activities. Learner-content interactions were a major factor in those components. Therefore, our results can be interpreted to mean that learner-content interactions lead to student satisfaction in the OLE, which may agree with a result reported by Rovai and Barnum (2003).
The students’ experiences reported in the LEEQ confirmed that the contents of our OLE were well organized and placed on an appropriate navigational system. The students also reported that the OLE helped them learn without wasting time. Therefore, the students’ comments imply that structuring and organizing the learning content on an appropriate navigational system enabled students to quickly select what they wanted to learn. However, during the debriefings the students requested the addition of:
student guides for using the OLE, and
contact information for student support services, such as technical guidance, on the Topic Outline page.
Selecting media for learning:
The students’ comments on the overall functionality of the interactive learning content focused mainly on features such as learner engagement, interactivity and accessibility. Therefore, in deciding which medium is suitable for a piece of learning content, instructional designers should consider whether it:
can motivate the learners
can entice the learners
is constructively aligned with the learning objectives
can handle or support interactivity
will not exceed the weight limit that the network can handle.
Text:
Text in blocks with short paragraphs, or lists with bold keywords, helped the students go through the learning content easily and quickly. Further, our students’ reports implied that they were motivated to go through the detailed information and do the activities placed behind hyperlinks. However, about half of the students in this study reported finding too much text and too little white space on the online learning content pages in the LMS (Table 2). Therefore, we added two more guidelines to the set of text design guidelines.
Keep one line of white space between blocks of text to increase readability
If the main page has more text than can easily be accommodated in the template’s available space, redesign the text: identify the key text and place it on the main page, and move the remaining text to links from the main page or to two or more consecutive main pages.
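The second guideline above can be sketched as a simple pagination routine; the character budget and the paragraph data below are hypothetical stand-ins for the template’s actual space, not values from our LMS:

```python
def paginate(paragraphs, page_budget=900):
    """Split a list of paragraphs into consecutive main pages.

    Each page holds as many whole paragraphs as fit within
    page_budget characters (a stand-in for the template's space).
    """
    pages, current, used = [], [], 0
    for para in paragraphs:
        # Start a new page when the next paragraph would overflow it.
        if current and used + len(para) > page_budget:
            pages.append(current)
            current, used = [], 0
        current.append(para)
        used += len(para)
    if current:
        pages.append(current)
    return pages

# Three 400-character paragraphs do not fit on one 900-character page,
# so they flow onto two consecutive pages.
paras = ["A" * 400, "B" * 400, "C" * 400]
print(len(paginate(paras, page_budget=900)))  # 2
```

In practice the key text would be kept on the first page and the remainder would flow onto the following pages or behind links, as the guideline describes.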
Graphics:
More than 90% of our students found the graphics in the learning content useful for understanding the meaning of the text. This implies that we designed the graphics in the online learning material to support learning, which may agree with Carney and Levin (2002), who concluded that carefully constructed graphics can enhance learning from text. However, a few students reported problems accessing some of the graphics. This comment helped us improve one of our design guidelines.
Check the weight of a graphic before adding it to the online learning content. If it exceeds 500 KB, split it in two using graphic-editing software and place the parts close to one another in the online learning content.
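The weight check in this guideline can be sketched as follows; the 500 KB limit comes from the guideline itself, while the function names are our own illustration:

```python
import os

WEIGHT_LIMIT_KB = 500  # limit stated in the guideline above

def exceeds_weight_limit(size_bytes, limit_kb=WEIGHT_LIMIT_KB):
    """Return True when a graphic of this size should be split in two."""
    return size_bytes > limit_kb * 1024

def graphic_needs_splitting(path):
    """Check an image file on disk against the weight limit."""
    return exceeds_weight_limit(os.path.getsize(path))

print(exceeds_weight_limit(600 * 1024))  # a 600 KB graphic is over the limit
```

The actual splitting would still be done with graphic-editing software; the check only flags which files need it before they are uploaded.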
Animations:
Almost all the students who participated in this study replied that the animations in the online learning material helped them understand the concepts. Also, according to the students’ comments, the simulations enabled them to get hands-on experience with the Web-design application without even having it installed on their computers. Therefore, the students’ experiences reported on in this paper strengthen the reasoning of Syrjakov, Berdux & Szczerbicka (2000), who noted that not only the quality but also the efficiency of e-learning material can be enhanced by using animations. However, the debriefings revealed that the students needed more time to read the text in the animations. Our students also suggested that if an animation plays text, it is important to have control buttons that allow students to control the pace. Therefore, we added two more guidelines to our set of guidelines for designing animations.
Play the text more slowly in animations that contain both text and graphics
Design animations in steps and add control buttons so that learners can control the pace.
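The second guideline can be sketched as a small step controller; the class and the step texts below are illustrative only, not the actual animation implementation:

```python
class SteppedAnimation:
    """A text-and-graphics animation broken into discrete steps,
    with next/back controls so that learners set their own pace."""

    def __init__(self, steps):
        self.steps = list(steps)
        self.position = 0

    def current(self):
        return self.steps[self.position]

    def next(self):
        # Advance only while there is a further step to show.
        if self.position < len(self.steps) - 1:
            self.position += 1
        return self.current()

    def back(self):
        # Let the learner replay an earlier step.
        if self.position > 0:
            self.position -= 1
        return self.current()

demo = SteppedAnimation(["Open the editor", "Insert a table", "Save the page"])
demo.next()
print(demo.current())  # Insert a table
```

The control buttons in the animation would simply call `next` and `back`, so a learner who needs more time on a text-heavy step can stay there or step backwards.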
Audio:
There were only a few audio files in our learning content, in order to avoid exceeding the weight limits of the animation files. However, at the debriefing the students replied that they would have liked audio playing with the animations. Therefore, we improved the last guideline in our list for designing animations for online learning content as follows.
Add audio where necessary, if it does not exceed the weight limit of the file
Add audio control buttons
The results we reported in a previous paper implied that our students could use the OLE and its content efficiently in their studies (Weerasinghe et al., 2008). Based on the analysis of student experiences, we can conclude that most of the students found the online learning activities (Table 3) and learning content (Table 1) useful in their studies and that they could learn actively in the OLE. This implies that our students could actively construct knowledge using the OLE. Even though forum participation was not compulsory for doing the activities, more than half of our students reported that discussions with other students and the teacher via the forums were useful in their studies. However, the students’ reports in the debriefings revealed that they preferred to have links to the relevant forums from the interface of the interactive activity or the learning content. This led us to add the following guideline to our list of guidelines for designing the learning activities.
§ If an activity based on a lesson page or a sub-section of a lesson leads to a forum discussion, give the link to access that forum from within the activity or the learning content itself.
Students’ learning styles and their learning design preferences:
Consider learning style preferences when designing online learning content. Analysing the students’ experiences made it clear that our students appreciated the features of the OLE with respect to their own learning styles. However, according to Honey (2007), a student can have more than one learning style preference, and learning style preferences can change over time. Therefore, it is important to consider the requirements of learners with different learning style preferences when designing distance OLEs.
An OLE that had been reported as successful in achieving learner satisfaction and learning effectiveness was studied further to determine which of its design components led to learner satisfaction, and which design strategies and features of the learning content led to learning effectiveness. Student experiences of learning in the OLE were gathered using questionnaires and debriefings. We analysed the data to find whether there was a relationship between students’ learning style preferences and their learning design preferences. We found that our students were satisfied with the design of the interactive learning content, the learning activities and the evaluations. The students’ learning effectiveness was driven by the structure of the learning content and the design of the interactive learning content, activities and quizzes. We also found a relationship between the students’ learning styles and their learning design preferences. These findings helped us improve our set of instructional design guidelines for designing online learning content for novice online learners, especially in distance learning programmes in computer applications and information technology.
We received help from the staff of the e-Learning Centre of the University of Colombo School of Computing in carrying out the work reported on in this paper. The work presented here was funded by the Swedish International Development Cooperation Agency.
Allinson, C. W., & Hayes, J. (1988). The learning styles questionnaire: An alternative to Kolb’s inventory?. Journal of Management Studies, 25(3), pp. 269-281.
Berge, Z. L. (1998). Guiding principles in Web-based instructional design. Educational Media International, 35(2) p. 72.
Bork, A., & Britton Jr., D. R. (1998). The Web is not yet suitable for learning. Computer, 31(6), pp. 115-116.
Bostrom, R. P., Olfman, L., & Sein, M. K. (1990). The Importance of learning style in end-user training. MIS Quarterly, 14(1), pp. 101-119.
Brown, D. J., Powell, H. M., Battersby, S., Lewis, J., Shopland, N., & Yazdanparast, M. (2002). Design guidelines for interactive multimedia learning environments to promote social inclusion. Disability and Rehabilitation, 24(11-12), pp. 587 – 597.
Carney, R., & Levin, J. (2002). Pictorial illustrations still improve students' learning from text. Educational Psychology Review, 14(1), pp. 5-26. Retrieved February 4, 2009, from Academic Search Premier database.
Cassidy, S., & Eachus, P. (2000). Learning style, academic belief systems, self-report student proficiency and academic achievement in higher education. Educational Psychology, 20(3), pp. 307-320.
Chapman, A. (2003). Kolb learning style: Honey and Mumford's variation on the Kolb system. Retrieved September 2, 2008, from http://www.businessballs.com/kolblearningstyles.htm.
Dewald, N., Scholz-Crane, A., Booth, A. & Levine, C. (2000). Information Literacy at a distance: Instructional design issues, Journal of Academic Librarianship, 26(1), pp. 33-44.
Eom, S. B., Wen, H. J., & Ashill, N. (2006). The determinants of students’ perceived learning outcomes and satisfaction in university online education: An empirical investigation. Decision Sciences Journal of Innovative Education, 4(2), pp. 215-235.
Fung, Y. H., Ho, A. S. P. & Kwan, K. P. (1993). Reliability and validity of the learning styles questionnaire. British Journal of Educational Technology, 24(1), pp. 12-21.
Galletta, D. F., Henry, R., McCoy, S., & Polak, P. (2004). Web site delays: How tolerant are users?, Journal of the Association for Information Systems, 5(1), pp. 1-28.
Goodyear, P. (2001). Networked Learning in Higher Education: Notes and guidelines, In The Final Report to JCALT, 3.
Grabinger, R.S. (1993). Computer screen designs: Viewer judgements, Journal of Educational Technology Research and Development, 41(2), pp. 35-73.
Gunawardana, K. D. (2005). An empirical study of potential challenges and benefits of implementing e-learning in Sri Lanka. Proceedings of the Second International Conference on eLearning for Knowledge-Based Society, August 4-7, Bangkok, Thailand.
Harasim, L. (1989). On-line education: A new domain. In R. Mason & A. Kaye (Eds.), Mindweave: Communication, Computers, and Distance Education, pp. 50-62.
Honey, P. (2007). Learning Style Questionnaire: 40-item version, [PDF], Peter Honey Publications Ltd.
Ismail, J. (2002). The design of an e-learning system beyond the hype, Internet and Higher Education, 4, pp. 329-336.
Jonassen, D. H., Davidson, M., Collins, M., Campbell, J. & Haag, B. B. (1995). Constructivism and computer-mediated communication in distance education, American Journal of Distance Education, 9(2), pp. 7-26.
Kim, D., & Gilman, D. A. (2008). Effects of text, audio, and graphic aids in multimedia instruction for vocabulary learning, Educational Technology & Society, 11(3), pp. 114-126.
Kim, S. & Sonnenwald, D.H. (2002). Investigating the relationship between learning style preferences and teaching collaboration skills and technology: An exploratory study. Proceedings of the ASIS&T Conference, pp. 64-73.
Leflore, D. (2000). Chapter VI: Theory Supporting design guidelines for web-based instruction, In: B. Abbey (Ed.), Instructional and cognitive impacts of web-based education, (pp. 102–117). Idea Group Publishing.
Levy, Y. (2007). Comparing dropouts and persistence in e-learning courses. Computers and Education, 48(2), pp.185-204.
Mergel, B. (1998). Instructional Design & Learning Theory. Retrieved February 2, 2009, from http://www2.yk.psu.edu/~jlg18/506/Instructional%20Design%20by%20Mengal.pdf
Merrill, M.D. (2002). First principles of instruction. Educational Technology Research and Development, 50(3), pp.43-59.
Nah, F. (2003). A study on tolerable waiting time: How long are web users willing to wait?. Proceedings of the American Conference on Information Systems, no. 285. Retrieved February 22, 2009, from http://aisel.aisnet.org/amcis2003/285
Owens, L. & Barnes, J. (1992). Learning preference scales: Handbook and test master set, Australian Council for Education Research Ltd, Melbourne.
Peng, L. L. (2002). Applying learning style in instructional strategies. CDTL Brief, 5(7), pp.1-3.
Phillips, R. (1998). What research says about learning on the Internet?. Proceedings of EdTech'98. Retrieved January 20, 2009, from http://www.ascilite.org.au/aset-archives/confs/edtech98/pubs/articles/phillips.html
Piccoli, G., Ahmad, R., & Ives, B. (2001). Web-based virtual learning environments: A research framework and a preliminary assessment of effectiveness in basic it skills training. MIS Quarterly, 25(4), pp. 401-426.
Rieber, L.P. (1990). Animation in computer-based instruction. Educational Technology Research and Development, 38(1), pp. 77-86.
Reigeluth, C. M., Merrill, M. D., Wilson, B. G., & Spiller, R.T. (1980). The elaboration theory of instruction: A model for sequencing and synthesizing instruction. Instructional Science, 9, pp. 195-219.
Rovai, A.P., & Barnum, K.T. (2003). On-line course effectiveness: An analysis of student interactions and perceptions of learning. Journal of Distance Education, 18(1), p. 57.
Smith, S.M., & Woody, P.C. (2000). Interactive effect of multimedia instruction and learning styles. Teaching of Psychology, 27(3), pp. 220-223.
Stefanov, K., Stoyanov, S. & Nikolov, R. (1998). Design issues of a distance learning course on business on the Internet. Journal of Computer Assisted Learning, 14(2), pp. 83-90.
Stemler, L.K. (1997). Educational characteristics of multimedia: A literature review. Journal of Educational Multimedia and Hypermedia, 6(3/4), pp. 339-359.
Syrjakov, M., Berdux, J. & Szczerbicka, H. (2000). Interactive web-based animations for teaching and learning, Proceedings of the 2000 Winter Simulation Conference, J. A. Joines, R. R. Barton, K. Kang & P. A. Fishwick (Eds.), pp. 1651-1659.
Tessmer, M. & Richey, R. C. (1997). The role of context in learning and instructional design, Educational Technology Research and Development, 45(2), pp. 85-115.
Weerasinghe, T. A., Nishakumari, K. M. G. B., & Hewagamage, K. P. (2007). Gap between theory and practice: Human factors in designing and developing effective e-learning materials for a structured syllabus. Proceedings of the Fourth International Conference on eLearning for Knowledge-Based Society. [online] Retrieved January 20, 2009 from http://www.ijcim.th.org/v15nSP3/P19eLearningAP_GepBetweenTheory.pdf
Weerasinghe, T. A., Ramberg, R. & Hewagamage, K. P. (2008). Learners' satisfaction, learning style preferences and effective use of an OLE, International Journal of Emerging Technologies in Learning (iJET), 3, pp. 77-85. [Abstract available from http://online-journals.org/i-jet/article/view/760]
Young, L.D. Bridging theory and practice: Developing guidelines to facilitate the design of computer-based learning environments. Canadian Journal of Learning and Technology, 29(3).
Thushani Alwis Weerasinghe is a Ph.D. student at the Department of Computer and Systems Sciences, Stockholm University and the Royal Institute of Technology (KTH), Sweden. Her research area is online learning environments, with a special interest in designing online learning environments for distance learning programmes. She is on study leave from the University of Colombo School of Computing (UCSC), Sri Lanka, where she worked as a teacher and as Team Leader of the e-Learning Centre.
e-mail: thushani@dsv.su.se. Tel: +468161668
Robert Ramberg (Ph.D. in cognitive psychology) is a Professor of Computer and Systems Sciences at Stockholm University. He is the research director of the Knowledge and Communication Laboratory at the Department of Computer and Systems Sciences, Stockholm University and the Royal Institute of Technology (KTH), Sweden. His research focuses on theories of learning (socio-cultural perspectives on learning and cognition), pedagogy, and how these theories must be adapted when designing and evaluating computer-based learning and training environments. Of particular interest is how artefacts of various kinds (IT and other tools) mediate human action, collaboration and learning.
e-mail: robban@dsv.su.se. Tel: +468164914
Dr. K. P. Hewagamage obtained his B.Sc. special degree in Computer Science (First Class Honors) from the University of Colombo and his Doctor of Information Engineering from Hiroshima University in Japan. Among the awards he has received for his academic activities are the Professor Mohan Award for the outstanding computer science graduate in 1994, the best paper award at the IEEE International Conference on Visual Languages in 1999, and awards for excellence in research from the University of Colombo in 2004 and 2006. He has more than 60 publications in international peer-reviewed journals and conference proceedings. He is a senior member of IEEE, an academic advocate of ISACA, a member of ACM, and the current chair of the IEEE Computer Society Chapter in Sri Lanka. Dr. Hewagamage is a senior lecturer in computer science and the head of the e-Learning Centre at the University of Colombo School of Computing (UCSC).
e-mail: kph@ucsc.cmb.ac.lk Tel: +94112581245