Editor’s Note: Digital technologies have revolutionized distance learning and impacted traditional classrooms. Interactive multimedia, learning management systems, and computer-managed diagnostic-prescriptive learning with learning objects have provided alternative solutions for quality education. Government, corporate, and academic administrators continue to question the cost-benefits. Are these the tools that will revolutionize teaching and learning in the twenty-first century? Where is the break-even point for current investments, and what profits can we expect in terms of quality, accelerated learning, and cost? Here is a view from Finland to stimulate the dialog.
Do Investments in Digital Learning Resources Pay Back?
Table 1. Comparison of estimated marginal post-test means between learning conditions (post-test scores adjusted by pre-test scores)

Study I, Mathematics (N = 35)
  All students: Traditional classroom condition (n = 16) vs. Learning object condition (n = 19), F(1, 34) = 3.777
  High prior knowledge: Traditional classroom condition (n = 8) vs. Learning object condition (n = 11), F(1, 18) = 1.394
  Low prior knowledge: Traditional classroom condition (n = 8) vs. Learning object condition (n = 8), F(1, 15) = 2.165

Study II, Language (N = 37)
  All students: Traditional classroom condition (n = 18) vs. Learning object condition (n = 19), F(1, 36) = .894
  High prior knowledge: Traditional classroom condition (n = 8) vs. Learning object condition (n = 8), F(1, 15) = 2.385
  Low prior knowledge: Traditional classroom condition (n = 10) vs. Learning object condition (n = 11), F(1, 20) = .522

Note. S.E. = standard error of the mean.
Although students working in the traditional classroom environment slightly outperformed students in the learning object environment in both studies, the differences were not statistically significant (p > .05). In Study I (mathematics), students in the traditional classroom environment scored better than students in the learning object environment within both the low and the high prior knowledge groups. In Study II (language), however, the learning object environment was more beneficial than the traditional classroom environment for students with a low level of prior knowledge, while among the high prior knowledge students the traditional classroom environment was more effective. Nevertheless, the differences between the compared learning environments within the prior knowledge groups were not significant.
As the studies were similar in design and required the same kinds of learning skills, it is possible to combine their results. Instead of focusing only on the results of the individual studies, it is more informative to investigate the impact of identical parameters across the studies simultaneously. Combining the results of the individual studies increases the sample size, which allows firmer conclusions about the effectiveness of the compared learning environments and makes statistical differences easier to detect. The Stouffer method allows p-values from multiple studies to be combined into an average p-value (a p-value is a direct function of sample size). The combined results are presented in Table 2.
Table 2. Average impact of the learning conditions across studies

Learning object vs. Traditional classroom (N = 72)

                        Study I               Study II              AVERAGE
All students            p = .06, ES = -.64    p = .34, ES = -.31    p = .04, ES = -.47 ± .47
Low prior knowledge     p = .16, ES = -.70    p = .48, ES = .30     p = .63, ES = -.12 ± .66
High prior knowledge    p = .25, ES = -.53    p = .15, ES = -.73    p = .07, ES = -.62 ± .69
Learning object condition = condition in which students worked with drill-and-practice LOs.
Traditional classroom condition = condition in which students used traditional learning methods and paper-and-pencil tasks.
The low/high prior knowledge division was based on a median split of students’ pre-test scores.
ES = standardized mean difference effect size with Hedges’ (1981) bias correction; in other words, the mean difference expressed in standard deviation units. The basic formula for ES is to subtract the mean of group Y from the mean of group X and divide this difference by the square root of the pooled variance of the two groups (see Rosenthal, 1984, for details and formulas).
AVERAGE = averaged results from the individual studies with identical parameters. Average p-values were calculated via the Stouffer method (Mosteller & Bush, 1954; see Rosenthal, 1984). The average ES is the mean of the individual-study effect sizes, each weighted by the degrees of freedom (N - 2) of its comparison (see Rosenthal, 1984, for details).
± = 95% confidence interval for the ES.
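The ES computation described in the table note can be sketched in a few lines of Python. This is only an illustration of the formula; the numbers in the example call are hypothetical, not data from these studies:

```python
import math

def hedges_g(mean_x, mean_y, sd_x, sd_y, n_x, n_y):
    """Standardized mean difference (mean of group X minus mean of group Y,
    divided by the pooled standard deviation), with Hedges' (1981)
    small-sample bias correction applied."""
    pooled_var = ((n_x - 1) * sd_x**2 + (n_y - 1) * sd_y**2) / (n_x + n_y - 2)
    d = (mean_x - mean_y) / math.sqrt(pooled_var)
    df = n_x + n_y - 2
    correction = 1 - 3 / (4 * df - 1)  # Hedges' bias-correction factor
    return d * correction

# Hypothetical example: two groups of 20 with means 10 and 8, both SD = 2
print(round(hedges_g(10, 8, 2, 2, 20, 20), 2))  # prints 0.98, a large effect
```

With the sign convention of Table 2, a negative result would indicate an effect favouring the traditional classroom condition.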
As can be seen, overall there is a significant difference in learning outcomes between the LO and the traditional classroom conditions. The combined average results reveal that the students using paper-and-pencil tasks in the traditional classroom environment outperformed the students using drill-and-practice LOs (p < .05). But how much more effective is the classroom environment? The mean difference, expressed in standard deviation units, is called the standardized mean difference effect size and is reported in the table. As a general rule of thumb, a standardized mean difference effect size (ES) of .20 should be interpreted as small, .50 as medium, and .80 as large (Cohen, 1988). Interpreted in this way, the average difference between the means of the traditional classroom condition and the LO condition is of medium size (ES = .47) in favour of the traditional classroom group. Another useful, and perhaps more concrete, way to interpret the effect magnitude is to consider the percentage of overlap between the score distributions of the two conditions. Using this logic, an ES of .47 means that 68% of the students in the traditional classroom environment did better than the average student in the LO environment.
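The 68% figure follows directly from the standard normal distribution, under the assumption of approximately normal scores with equal variances in both groups (this quantity is sometimes called Cohen's U3):

```python
from statistics import NormalDist

def proportion_above_other_mean(es):
    """Proportion of the higher-scoring group expected to score above the
    mean of the other group, given a standardized mean difference `es` and
    assuming normal distributions of equal variance (Cohen's U3)."""
    return NormalDist().cdf(abs(es))

print(round(proportion_above_other_mean(0.47), 2))  # prints 0.68
```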
A more detailed investigation of the impact on students’ learning outcomes within the prior knowledge groups shows that, overall, traditional classroom teaching was more effective than LOs in both the low and the high prior knowledge groups. However, the differences between the conditions are not significant, partly because the sample sizes of the subgroups remained small.
Recently, considerable investments have been made in building up ICT infrastructure and developing sharable digital learning resources for education all around the world. Although these new instructional technologies inspire great optimism and dazzle us with their promises (e.g. Parrish, 2008), such eLearning investments can only be considered justifiable if they succeed in improving teaching practices and enhancing students’ learning outcomes in comparison to normal classroom teaching activities. To date, however, there has been only sparse empirical evidence on the effectiveness of learning objects on learning performance. Therefore the main aim of this article was to investigate the effectiveness of LOs in comparison to traditional classroom teaching.
Although the individual studies did not highlight significant differences, the pooled results from both studies showed that students using traditional paper-and-pencil tasks outperformed the students working with drill-and-practice LOs. The results demonstrated that traditional classroom teaching is at least as effective as LOs in implementing expository teaching activities and fact-oriented learning behaviour. Therefore using LOs to replicate traditional teaching activities which rely on presentation, transmission, exercising, rehearsal and reproduction of knowledge does not seem appropriate.
Why were traditional classroom activities more effective than drill-and-practice LOs in these two studies? Firstly, based on the researchers’ general observations during the interventions, the studied environments seemed to differ in the level of student engagement. Students working with LOs had difficulties concentrating on the content to be learned, and the atmosphere was somewhat restless. Students seemed to hurry through the LOs and even competed over who could complete all the LO exercises fastest. They were also more interested in working out how the LOs functioned, that is, the logic behind a given LO, than in the content itself. In the traditional classroom environments, by contrast, there were no such difficulties with concentration.
Secondly, the available instructional support and control may have affected students’ learning behaviour. In the classroom conditions the teacher led the class, so the teaching-learning activities were rather strictly controlled, whereas the less-controlled LO environments placed more demands on students’ self-regulation and self-discipline. The instructional support also differed: in the classroom contexts students were allowed to seek help from the teacher during the working phase, and the tasks were checked collectively at the end of the lessons, whereas in the LO contexts students received no support from the teacher and only elementary feedback from the LO itself. These differences in level of control and instructional support may be a critical factor in explaining the differences in the success of the learning environments.
A third possible reason may be associated with students’ learning habits. Students were clearly more accustomed to typical classroom activities with paper-and-pencil assignments. Taking full advantage of technology is likely to require time, and very short eLearning interventions have been found to be predominantly ineffective, as shown in a classic review by Khaili and Shashaani (1994). According to their findings, the impact of ICT increased markedly when the intervention duration grew from a couple of days to four to seven weeks. It may be that the duration of the LO interventions in our studies was too short to reveal the real effectiveness of such environments. Furthermore, there may be inherent problems in the mechanistic learning behaviour that both studies required. Fact-oriented learning and rehearsal activities do not always motivate students sufficiently, and students may not understand the purpose or objective of what they are learning, for example grammatical rules. As a result of low motivation and a lack of perceived meaningfulness, students’ focus may have drifted away from the actual content to be learned.
A fourth explanation may relate to the game-like features of the LOs used in these studies. Although game-like features are designed to raise learners’ motivation, they can also bring their own challenges and limitations. Students are accustomed to playing computer games in their free time, where gaming means relaxation and entertainment. Consequently, when computer games are used in education, discrepancies easily arise between the expectations of educators and students. Instead of using games for learning purposes, students often seek the entertainment they would in their free-time gaming and consequently do not regard educational games as important learning situations. Students’ and educators’ aims can also conflict: sometimes students do not pursue the actual objectives of an educational game but instead aim for loss or negative feedback if they find it somehow more rewarding. For example, in our studies students were observed making mistakes deliberately because they wanted to see the negative feedback within an LO, which they regarded as funny or entertaining. Based on our results on the effectiveness of game-like drill-and-practice LOs, it can be argued that learning resources aiming at ‘edutainment’ are not effective in terms of content learning when compared to the academic performance achieved in normal classroom contexts.
However, the whole question of the effectiveness of eLearning is problematic, because research has shown that technology as such does not have any particular impact on learning; the impact always depends on how ICT is used as part of a given learning environment. The focus of research should therefore be placed on the effectiveness of whole learning environments, not just on the type of eLearning technology used. LOs are just a new chapter in the story of educational technology innovations that do not necessarily lead students to improved academic achievement (cf. Clark, 1983). It remains evident that LOs, like all educational technology applications, require sound instructional design strategies founded on contemporary learning theories and research-based evidence if they are to be effective. These findings again highlight the crucial significance of context. As we have found, the available instructional support is a critical factor in explaining the success of learning environments. Our students in the LO conditions were required to work in self-directed ways; with more structured instructional guidance in using LOs, their learning performance might have been better. As Wang et al. (2008) argued, it is important to help and support learners to adapt to and cope with open, self-directed learning environments.
No technology is inherently good or bad, but its applications can be judged good or bad. LOs hold many promises and possibilities in various learning contexts when used according to appropriate instructional strategies, but they should not be seen as the primary or only solution to the challenges of learning (Parrish, 2008). This point applies to any eLearning innovation, since taken to its extremes any technology ends up reversing its original benefits (McLuhan and McLuhan, 1988).
Although these results did not support the promise of LOs to enhance students’ learning outcomes compared to traditional instruction, there are other important aims that can be accomplished by using LOs. For example, LOs can enrich and diversify daily instructional practices; develop students’ technical skills and, more generally, improve their attitudes towards technology; increase interaction among students and/or between students and teacher; and offer possibilities for creating a positive learning atmosphere in which students are motivated to work towards the desired learning objectives. However, more research is needed on the interaction between the various LO types, the ways LOs are implemented, and learning outcomes.
In addition to these learning perspectives, LOs also provide a means to reuse learning materials once they have been produced, as the LO literature frequently notes. However, it should be borne in mind that besides the promised benefits (at least at a rhetorical level) of cost savings and speed in lesson and material preparation through content reusability and ease of updating (e.g. Weller, 2004), developing and implementing new learning contexts with existing LOs will always be difficult, costly, time-consuming and technically demanding (Tompsett, 2005; Wilhelm & Wilde, 2005).
Agostinho, S., Bennett, S., Lockyer, L. & Harper, B. (2004). Developing a learning object metadata application profile based on LOM suitable for the Australian higher education market. Australasian Journal of Educational Technology, 20(2), 191-208.
Balanskat, A., Blamire, R. & Kefala, S. (2006) The ICT impact report. A review of studies of ICT impact on schools in
Bennett, K. & McGee, P. (2005) Transformative power of the learning object debate. Open Learning, 20(1), 15-30.
Bernard, R. M., Abrami, P. C. & Wade, C. A. (2007) A summary of review of e-learning in
Butson, R. (2003) Colloquium. Learning objects: weapons of mass instruction. British Journal of Educational Technology, 34(5), 667–669.
Cohen, J. (1988) Statistical power analysis for the behavioural sciences (2nd edition). (
Collis, B., & Strijker, A. (2004) Technology and human issues in reusing learning objects. Journal of Interactive Media in Education, 4. Available:
Condie, R. & Munro, B. (2007) The impact of ICT in schools – A landscape review. Available: http://publications.becta.org.uk/display.cfm?resID=28221&page=1835
Kalz, M., Drachsler, J., van Bruggen, J., Hummel, H. & Koper, R. (2008) Wayfinding services for open educational practices. International Journal of Emerging Technologies in Learning, 3(2). Available:
Kay, R. (2007) A systematic evaluation of learning objects for secondary school students. Journal of Educational Technology Systems, 35(4), 411-448.
Kay, R. H. & Knaack, L. (2007) Evaluating the learning in learning objects. Open Learning, 22(1), 5–28.
Khaili, A. & Shashaani, L. (1994) The effectiveness of computer applications: A meta-analysis. Journal of Research on Computing in Education, 27(1), 48-61.
Lambe, P. (2002) The autism of knowledge management. Available: http://www.straitsknowledge.com
McCormick, R. (2008) Evaluation of a Large-scale European Learning Object production, distribution and use. In L. Lockyer, S. Bennett,
McCormick, R. & Li, N. (2006) An evaluation of European learning objects in use. Learning, Media and Technology, 31(3), 213–231.
McLuhan, M. & McLuhan, E. (1988) Laws of media: The new science (
Nurmi, S. & Jaakkola, T. (2005) Problems underlying the learning object approach. International Journal of Instructional Technology & Distance Learning, Nov. 2005, 2(11), 61-66. Available: http://www.itdl.org/Journal/Nov_05/
Nurmi, S. & Jaakkola, T. (2006) Promises and pitfalls of LOs. Learning, Media and Technology, 31(3), 269–285.
Parrish, P. E. (2004) The trouble with learning objects. Educational Technology Research & Development, 52(1), 49-67.
Parrish, P. E. (2008) Learning with objects. In S. Carliner & P. Shank (Eds.) Handbook of E-Learning. Past Promises, Present Challenges (
Rehak, D. (2006) Challenges for ubiquitous learning and learning technology. Educational Technology, January-February, 43–49.
Rehak, D. & Mason, R. (2003) Keeping the learning in learning objects, in: A. Littlejohn (Ed) Reusing online resources: A sustainable approach to e-learning (
Richards, G. (2002). Editorial: The challenges of the learning object paradigm. Canadian Journal of Learning and Technology, 28(3). Available: http://www.cjlt.ca/content/vol28.3/editorial.html
Rosenthal, R. (1984) Meta-analytic procedures for social research. Applied Social Research Methods Series. Volume 6 (
Sclater, J., Sicoly, F., Abrami, P. C. & Wade, C. A. (2006) Ubiquitous technology integration in Canadian public schools: Year one study. Canadian Journal of Learning and Technology, 32(1) Winter. Available: http://www.cjlt.ca/content/vol32.1/sclater.html
Stipek, D. J. (1993) Motivation to learn. From theory to practice. Second edition (Needham Heights, MA, Allyn and Bacon).
Tompsett, C. (2005) Reconfigurability: Creating new courses from existing learning objects will always be difficult! Journal of Computer Assisted Learning, 21(6), 440-448.
Triona, L. M. & Klahr, D. (2003) Point and click or grab and heft: Comparing the influence of physical and virtual instructional materials on elementary school students’ ability to design experiments. Cognition and Instruction, 21(2), 149-173.
Wang, Y., Peng, H., Huang, R., Hou, Y. & Wang, J. (2008) Characteristics of distance learners: Research on relationship of learning motivation, learning strategy, self-efficacy, attribution and learning results. Open Learning, 23(1), 17-28.
Waxman, H. C., Lin, M. & Michko, G. M. (2003) A meta-analysis of the effects of teaching and learning with technology on students outcomes. Available: http://www.ncrel.org/tech/effects2/waxman.pdf
Weller, M. (2004) Learning objects and the e-learning cost dilemma. Open Learning, 19(3), 293-302.
Wilhelm, P. & Wilde, R. (2005) Developing a university course for online delivery based on learning objects: From ideals to compromises. Open Learning, 20(1), 65-81.
Sami Nurmi, M.Ed. was formerly a researcher at the Centre for Learning Research of the University of Turku, Finland. Currently Nurmi is studying to be an airline pilot and at the same time preparing his educational sciences PhD dissertation on learning with simulation learning objects. Correspondence should be sent via email to email@example.com
Tomi Jaakkola, M.Ed. is a researcher at the Centre for Learning Research of the University of Turku, Finland. Jaakkola is currently preparing his educational sciences PhD dissertation on the use of learning objects (LOs) and computer simulations to promote conceptual change and students’ understanding of complex scientific concepts.
 Both studies were conducted as a part of Context eLearning with Broadband Technologies (CELEBRATE, http://celebrate.eun.org), a large-scale European R&D project that developed, shared and used a large number of LOs in schools across six European countries.
 In the Stouffer method, the average p (i.e. statistical probability) is calculated by a) transforming each two-tailed p into a one-tailed p, b) transforming each one-tailed p into a standard normal deviate Z-score (the sign of each Z-score should indicate the direction of the effect), c) adding the Z-scores together, d) dividing the sum of Zs by the square root of the number of studies, e) transforming the new Z statistic back into a one-tailed probability, and f) finally into a two-tailed probability (see Rosenthal, 1984).
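These steps can be sketched in Python as follows. This is a minimal illustration; negative direction signs mark effects favouring the traditional classroom, following the sign convention of Table 2:

```python
import math
from statistics import NormalDist

def stouffer_combined_p(two_tailed_ps, directions):
    """Combine two-tailed p-values from several studies via the Stouffer
    Z method; `directions` holds +1/-1 for the direction of each effect."""
    nd = NormalDist()
    z_scores = []
    for p, sign in zip(two_tailed_ps, directions):
        one_tailed = p / 2                        # (a) two-tailed -> one-tailed
        z = nd.inv_cdf(1 - one_tailed) * sign     # (b) one-tailed p -> signed Z
        z_scores.append(z)
    z_combined = sum(z_scores) / math.sqrt(len(z_scores))  # (c)-(d)
    one_tailed = 1 - nd.cdf(abs(z_combined))      # (e) back to one-tailed p
    return 2 * one_tailed                         # (f) and to two-tailed p

# The two all-students p-values from Table 2, both favouring the classroom
print(round(stouffer_combined_p([0.06, 0.34], [-1, -1]), 3))  # prints 0.045
```

Applied to the two all-students p-values, the method yields p ≈ .045, in line with the average of .04 reported in Table 2.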