Comparison of different approaches on example-based learning for novice and proficient learners

Abstract

This study examined the effectiveness of two example-based instructional procedures. In the first procedure, a worked example and a similar problem were presented to the participant simultaneously. In the second, participants studied an example, which was then removed before they solved a similar problem (example-problem pairs). Participants were randomly assigned to one of the two procedures, and cognitive load and learning outcomes were assessed. The results indicate that, in the low prior knowledge group, learners who were given examples and problems simultaneously performed more effectively and efficiently and reported significantly lower cognitive load than those who learned from example-problem pairs. In the high prior knowledge group, no significant difference was observed between the two procedures. This study provides empirical support for learning from examples and problems simultaneously and suggests that the effectiveness of a given instructional procedure in example-based learning depends on the learner's prior knowledge.

Background

Cognitive load theory (CLT) originated in the 1980s, when Sweller [1] argued that instructional design should be used to reduce the cognitive load on learners. Researchers continued developing and expanding the idea in the 1990s [2–4]. Several randomized controlled experiments generated a series of effects related to CLT [5]. The worked-example effect is a typical example of a learning effect derived from CLT [6]. After learning basic theoretical knowledge, it is more effective for novices to study examples than to practice solving problems [7]. However, the worked-example effect applies only to novices. As expertise increases, the worked-example effect fades, such that actually practicing problem solving becomes more effective. This is known as the "expertise reversal effect" [8].

Cognitive load theory and worked-example effect

CLT is a theory of instructional design. It holds that the limited cognitive resources of humans mean that instructional methods must be designed carefully to ensure that learners can use their cognitive resources efficiently. There are three types of cognitive load: intrinsic, extraneous, and germane. Intrinsic cognitive load is created by the complexity of the materials [3, 4]: the more complex the information, the more intrinsic cognitive load a learner experiences. Extraneous cognitive load is generated by unrelated activities or by the inappropriate presentation of materials, which hinder learning. As long as the materials and learning goals remain constant, the intrinsic cognitive load remains fixed, whereas changing the instructional procedure can alter the extraneous cognitive load. Germane cognitive load is defined as the amount of cognitive resources devoted to dealing with the intrinsic cognitive load; it is highly related to the learner's motivation and level of engagement, and is associated only with instructional designs that increase the working-memory resources devoted to intrinsic cognitive load during the learning process [9].

In 2010, Sweller revised the formulation of CLT such that element interactivity is considered the main source of intrinsic, extraneous, and germane cognitive load [9]. Reducing the number of interacting elements in the learning materials alleviates intrinsic cognitive load, whereas extraneous load can be decreased by reducing the number of interacting elements introduced by the instructional procedure [10]. Overall cognitive load is a combination of intrinsic and extraneous cognitive load. Therefore, the amount of working memory required when a learner must deal with a number of interacting elements determines the overall cognitive load during learning.

In the initial learning phase, without the aid of a proper problem-solving schema, a learner may try to solve problems by randomly searching through existing schemas in long-term memory. This causes the learner to focus attention on specific characteristics of the problem and on eliminating differences between the current problem state and the goal state, rather than on schema-relevant principles. Eliminating differences between problem states requires the learner to maintain multiple sub-goals and consider different solution options, thereby generating extraneous cognitive load [11]. On the other hand, if the learner has already developed an efficient problem-solving schema, studying worked examples may become a redundant activity that contributes little or nothing to further development.

Recent research regarding example-based learning strategies

Worked-examples do help if they are well designed and presented to the novice in an appropriate fashion. Studying worked-examples can be more effective and efficient than actively solving problems for novice learners. Thus, many example-based learning strategies have been implemented in order to observe the learning effects on learners of various skill levels.

Three approaches are commonly used to sequence examples and problems when teaching a new learner. The first approach has the student study several worked examples before practicing on several problems [12]. The second approach is to study one worked example, then practice on a similar problem, and continue alternating between the two for several cycles (example-problem pairs). Sweller and Cooper [13] found that example-problem pairs may provide more help than studying examples only, and other studies have shown that studying example-problem pairs is more effective than solving problems only [13–17]. The third approach involves solving a problem first, then studying a similar worked example (same structure but different data), and repeating this process several times (problem-example pairs). A few studies have argued that if learners attempt to solve problems first and experience frustration, they may be more willing to study the worked example and to focus on the steps they could not solve [18, 19].

Reisslein et al. [19] showed that there is an interaction between instructional procedure and prior knowledge: learners with low prior knowledge appear to benefit most from example-problem pairs, whereas learners with high prior knowledge appear to benefit most from problem-example pairs. Van Gog et al. [20] examined four conditions: worked examples only, example-problem pairs, problem-example pairs, and problem solving only. Their results indicate no differences between worked examples only and example-problem pairs, nor between problem-example pairs and problem solving only. However, the problem-example-pairs and problem-solving-only conditions were less effective than the worked-examples-only and example-problem-pairs conditions.

To summarize, example-problem pairs have been shown to be more effective than examples or problems only, and the effectiveness of each method may depend on the prior knowledge of the learners.

A CLT perspective: effectiveness of learning from examples and problems simultaneously

Despite substantial research into example-based learning, no study has examined the effectiveness of learning from examples and problems simultaneously.

Suppose a learner is doing homework and cannot find a way to solve a problem. What will the learner do? A common response is to find a similar worked example in the textbook and imitate its steps to solve the current problem. This practice of learning from examples and problems simultaneously already exists in real learning scenarios, and it may be a useful method for novices learning how to solve a problem.

Based on the preceding analysis, maintaining sub-goals and considering different solution options can lead to extraneous cognitive load. Excessive extraneous load can explain why problem solving is not effective for novice learners. However, problem solving requires more active thinking than studying examples does, which may be why it can be beneficial to those with more expertise. In other words, problem solving allows proficient learners to gain more experience than they would from simply studying worked examples. Learning from examples and problems simultaneously frees novice learners from the need to consider different solution options; they can then construct an effective problem-solving schema and immediately put it into practice in the process of problem solving.

According to the contiguity principle [21], using examples and problems simultaneously should be more effective than using them separately, particularly when learners have to imitate worked examples in order to solve the current problem. When an example and a problem are separated in time, people must use their limited cognitive resources to search their past experiences for a satisfactory match to the current problem. This generates extraneous cognitive load that is unrelated to the instructional goal. In contrast, when an example and a problem are presented simultaneously, the learner can hold both in working memory and thereby make meaningful connections between them.

Learners with a high level of prior skill can solve the problem independently, using the worked example as a reference; they do not need to devote their limited cognitive resources to searching among schemas they have already developed. Therefore, from a CLT perspective, learning from examples and problems simultaneously may be an effective method for novices as well as proficient learners.

However, from a traditional perspective, there is a concern that learners who use examples and problems simultaneously may simply imitate the examples rather than developing a deeper understanding of them, which may lead to failure on far-transfer tests.

The present study

This study investigated the effectiveness of learning from examples and problems simultaneously; the control group learned from examples and problems separately (example-problem pairs). Both approaches were examined in learners with different levels of prior knowledge, and a near-transfer test and a far-transfer test were used to evaluate learning effectiveness.

Several researchers have used learner-controlled instruction, in which learners decide how long they spend on learning, and other studies have used learning time to measure cognitive load [22]. However, according to the revised formulation of CLT [9], both intrinsic and extraneous cognitive load are defined in terms of the number of interacting elements that must be dealt with simultaneously. The number of tasks presented to a given learner remains constant, so a learner who is given more time has more time to deal with the interacting elements, and his or her self-reported cognitive load score may therefore be lower [22]. For this reason, we fixed the learning time in order to obtain an accurate measurement of cognitive load.

To summarize, this study assessed the effectiveness and efficiency of learning from examples and problems simultaneously, as opposed to learning from example-problem pairs. The question was addressed for novices as well as proficient learners, and we also sought to determine whether learning remains effective on a far-transfer test as well as a near-transfer test.

Methods

Participants and design

Participants were 93 college freshmen in Taiwan: 49 mathematics majors (30 male) and 44 business majors (15 male). In Taiwan, senior high school students are placed in either the science track or the social science track, and the two tracks diverge greatly in their mathematics and science curricula; advanced mathematics and science subjects are taught only in the science track. In this study, the mathematics majors came from the science track and the business majors from the social science track, so there was a substantial difference in mathematical knowledge between the two majors. Both groups had studied the basic concepts and methods of differentiation but had no experience applying them to optimization problems in calculus. The participants were randomly assigned to one of two conditions: (1) simultaneous example and problem or (2) example-problem pairs. The group assignment is presented in Table 1.

Table 1 Group assignment by major

Materials

Prior-knowledge test

The prior-knowledge test consisted of three questions with six items regarding basic concepts and methods for solving optimization problems. Examples of items are as follows: “What is a critical number of a function?” and “How do you set up a function from a word problem?” One of the problems in the pretest is presented in Fig. 1.

Fig. 1 Sample pretest problem

Training tasks

Examples and problems were selected from the "Discussion" sections of the calculus textbook Essential Calculus [23]. Each worked example consisted of a problem formulation, solution steps, and the final solution. Each worked example and exercise problem contained four solution steps: "set up the function," "obtain the function of one variable," "find the domain of the function," and "obtain the critical number and check the answer." Each step was clearly labeled and visually distinguished from the other steps. The worked example presented the correct solution for each step, and participants wrote their own solution for each step of the exercise problem. The exercise problem in a given pair was highly similar to the worked example in structure, but its surface features were different. An example and a problem from the training task are presented in Figs. 2 and 3, respectively.
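To illustrate the four-step structure of the training items, the following is a hypothetical worked example written in that format. It is provided for illustration only; the scenario and numbers are not taken from the study materials or from Essential Calculus.

```latex
% Hypothetical worked example in the four-step format described above
% (illustrative only; not one of the study's actual training items).
\textbf{Problem.} A rectangular field is to be enclosed with 100 m of fencing.
Find the dimensions that maximize the enclosed area.

\textbf{Step 1: Set up the function.}
Let the side lengths be $x$ and $y$. The area is $A = xy$,
subject to the constraint $2x + 2y = 100$.

\textbf{Step 2: Obtain the function of one variable.}
From the constraint, $y = 50 - x$, so $A(x) = x(50 - x) = 50x - x^2$.

\textbf{Step 3: Find the domain of the function.}
Both side lengths must be positive, so $0 < x < 50$.

\textbf{Step 4: Obtain the critical number and check the answer.}
$A'(x) = 50 - 2x = 0$ gives $x = 25$, which lies in the domain.
Since $A''(x) = -2 < 0$, the area is maximized at $x = y = 25$,
giving $A = 625\,\mathrm{m}^2$.
```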

Fig. 2 Example training task

Fig. 3 One problem used in the training task

Test tasks

The test consisted of six problems. Four of the problems had the same structure as the training tasks but different surface features; these were used for the near-transfer test. The other two problems had structural features that had not appeared in the training tasks; these were used for the far-transfer test. Examples of near-transfer and far-transfer problems are presented in Figs. 4 and 5, respectively.

Fig. 4 Example of a near-transfer problem

Fig. 5 Example of a far-transfer problem

Self-reported mental effort and difficulty rating scale

We employed two commonly used techniques for the measurement of cognitive load: the Mental Effort Rating Scale and the Difficulty Rating Scale.

The nine-point Mental Effort Rating Scale was developed by Paas [24]. In it, participants are asked to rate how much mental effort they had to invest in studying the preceding example or solving the preceding problem. Answer options range from (1) “very, very low mental effort” to (9) “very, very high mental effort.” This scale is widely used in educational research [2, 25, 26].

The Difficulty Rating Scale asks learners to make a retrospective judgment concerning every example or problem with regard to difficulty. It also uses a 9-point scale ranging from 1 (extremely easy) to 9 (extremely difficult). The Difficulty Rating Scale has been used somewhat less frequently in multimedia research [27, 28]. The questionnaire for the measurement of cognitive load is presented at the bottom of Fig. 2.

Procedure

The experiment was conducted in two 45-min sessions at the participants' schools. Participants first received general information about the experimental procedure, including the general sequence and time-keeping rules. They then completed the 15-min prior-knowledge test before beginning work on the training tasks under their assigned condition.

In the example-problem-pairs condition, participants studied an example for 4 min, after which the experimenter removed the paper with the example and gave participants a problem similar to the example they had just studied. Participants were asked to solve the problem within 7 min, after which the experimenter collected the papers.

In the simultaneous-example-and-problem condition, the same example and problem used in the example-problem-pairs condition were presented to participants at the same time. Participants were asked to consult the example and solve the problem within 11 min, after which the experimenter collected the papers.

After two pairs of examples and problems had been completed as outlined above, participants were asked to solve two near-transfer-test problems (7 min each). They were then given a 7-min rest and presented with another two pairs of examples and problems, followed by two more near-transfer-test problems, completed as before. Finally, participants were asked to solve two far-transfer-test problems (within 7 min each).

Every example, problem, and test was printed on A4 paper, as were the questions used to measure mental effort and difficulty. Participants were asked to complete a mental effort and difficulty rating sheet after each example or problem.

Data analysis

The maximum score on the prior-knowledge test was six points. The test contained six items: "set up the function," "obtain the function of one variable," "identify the concept of the critical numbers," "apply the critical number," "obtain the first derivative," and "check the answer." Each correctly answered item earned 1 point. The maximum score on each test problem was four points. Each problem contained four items: "set up the function," "obtain the first derivative," "obtain the critical number," and "check the answer in the domain." Each correctly answered item earned 1 point. Both exercise and test problems were scored by one of the authors, who was blind to the experimental group to which each participant belonged.
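As a concrete illustration of this scoring rule, the following is a minimal sketch (not the authors' code); the item names follow the list above, and the function and data layout are assumptions made for illustration only.

```python
# Minimal sketch of the test-problem scoring rule described above
# (not the authors' code). Each problem has four items; a correct item
# earns 1 point, for a maximum of 4 points per problem.

TEST_ITEMS = [
    "set up the function",
    "obtain the first derivative",
    "obtain the critical number",
    "check the answer in the domain",
]

def score_test_problem(item_correct: dict) -> int:
    """Return the number of correctly answered items (0-4)."""
    return sum(1 for item in TEST_ITEMS if item_correct.get(item, False))

# A participant who completed the first three steps but did not check the
# answer in the domain scores 3 of 4 points.
example = {
    "set up the function": True,
    "obtain the first derivative": True,
    "obtain the critical number": True,
    "check the answer in the domain": False,
}
print(score_test_problem(example))  # -> 3
```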

Results

Six participants had missing values on the test tasks and were therefore excluded from analysis. Five of the excluded participants were business majors and one was a mathematics major. The mean performance, mental effort, and difficulty rating are shown by learning condition in Table 2.

Table 2 Means (SD) for prior knowledge, mental effort, difficulty, exercise, and test performance per condition

Near-transfer and far-transfer performance

A one-way analysis of variance on the far-transfer score, with the near-transfer score as a covariate, revealed no difference between the two conditions for students of either major: F(2,93) = 0.002, MSE = 0.018, P = 0.960 > 0.05. Learning from examples and problems simultaneously did not lead to failure on far-transfer tests; we therefore used the total test score as the measure of learning achievement.
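An analysis of this kind (far-transfer score as the dependent variable, near-transfer score as a covariate, and condition as the factor) could be run along the lines of the following sketch using statsmodels. The file name and column names (far_transfer, near_transfer, condition) are assumptions for illustration, not the authors' actual variables.

```python
# Sketch of a one-way analysis with a covariate, as reported above;
# the data file and column names are assumed for illustration.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("scores.csv")  # hypothetical file, one row per participant

# Far-transfer score predicted by condition, controlling for near-transfer score.
model = smf.ols("far_transfer ~ near_transfer + C(condition)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # F and P values per term
```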

Cognitive-load measurements

The total scores and the mental effort and difficulty ratings in the learning phase and test phase were analyzed using Pearson correlations. The correlation between mental effort in the learning and test phases was 0.904 (P < 0.001), and the correlation between difficulty ratings in the learning and test phases was 0.888 (P < 0.001); thus, the cognitive load measured in the learning phase was highly correlated with that measured in the test phase. Based on prior research, cognitive load measured in the learning phase is considered to reflect the quality of instruction, whereas cognitive load measured in the test phase is considered to reflect the quality of the acquired schema [29]. Because our aim was to investigate the effectiveness of the learning methods, we considered only the cognitive load measured in the learning phase.
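The correlational check described above could be carried out as in the following sketch; the file and column names are assumptions and are taken to hold each participant's mean rating per phase.

```python
# Sketch of the learning-phase vs. test-phase correlation check described
# above; column names are assumed (mean rating per participant and phase).
import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("ratings.csv")  # hypothetical per-participant means

r, p = pearsonr(df["effort_learning"], df["effort_test"])
print(f"mental effort, learning vs. test: r = {r:.3f}, p = {p:.4g}")

r, p = pearsonr(df["difficulty_learning"], df["difficulty_test"])
print(f"difficulty rating, learning vs. test: r = {r:.3f}, p = {p:.4g}")
```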

Both mental effort and difficulty ratings were negatively correlated with the total score, and the correlation between total score and difficulty rating was stronger than that between total score and mental effort (see Table 3). Because the learning materials were kept constant across conditions, from a CLT perspective the intrinsic cognitive load was the same under the two conditions; a better score in one condition was therefore taken to indicate that the extraneous cognitive load of that condition was lower. The measure with the stronger negative correlation with total score should thus be the better gauge of cognitive load, which suggests that difficulty ratings may be a better index than mental effort ratings. This study therefore used difficulty rating scores in the learning phase to measure cognitive load.

Table 3 Correlation matrix of dependent measures and total score

Prior knowledge test

In the mathematics group, a multivariate analysis of variance on total score and difficulty rating, with prior knowledge as a covariate, showed no difference between the two conditions: Pillai's trace = 0.009, F = 0.215, P = 0.807 > 0.05. Results in the business group likewise showed no significant difference between conditions: Pillai's trace = 0.080, F = 1.736, P = 0.189 > 0.05. These results indicate that, when analyzed by major, presenting examples and problems simultaneously was only marginally more effective and efficient than example-problem pairs in the case of business majors.

Not all of the mathematics majors were proficient in calculus, and not all of the business majors were novices. To investigate the effect of learning from examples and problems simultaneously for novice as well as proficient learners, we therefore divided learners into high- and low-level prior-knowledge groups. The median score on the prior-knowledge test was 3 (maximum 6); accordingly, mathematics majors who scored 3–6 were assigned to the high-level group, and business majors who scored 0–2.5 were assigned to the low-level group. The means for prior knowledge, performance, and difficulty rating per condition in the high-level and low-level groups are presented in Tables 4 and 5, respectively.
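The grouping described above amounts to a simple filter on major and pretest score; the following sketch shows one way to express it (the data file and column names are assumptions for illustration).

```python
# Sketch of the prior-knowledge grouping described above; the data file and
# column names (major, prior_knowledge) are assumed for illustration.
import pandas as pd

df = pd.read_csv("scores.csv")

# Mathematics majors scoring 3-6 on the pretest form the high-level group;
# business majors scoring 0-2.5 form the low-level group.
high_level = df[(df["major"] == "mathematics") & (df["prior_knowledge"] >= 3)]
low_level = df[(df["major"] == "business") & (df["prior_knowledge"] <= 2.5)]
print(len(high_level), len(low_level))
```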

Table 4 Means (SD) for prior knowledge, difficulty, exercise, and test performance scores in the high-level group
Table 5 Means (SD) for prior knowledge, difficulty, exercise and test performance scores in the low-level group

Effectiveness of presenting examples and problems simultaneously

In the high-level group, a multivariate analysis of variance on mean total score and difficulty rating, with prior-knowledge test scores as a covariate, showed no difference between conditions: Pillai's trace = 0.005, F = 0.077, P = 0.926 > 0.05. Among proficient learners, learning from examples and problems simultaneously does not appear to be significantly more effective than using example-problem pairs.

In the low-level group, a multivariate analysis of variance on mean total score and difficulty rating, with prior-knowledge test scores as a covariate, showed a significant difference between conditions: Pillai's trace = 0.210, F = 3.461, P = 0.046 < 0.05. Subsequent univariate analyses showed a significant difference in the difficulty rating, F(2,30) = 5.955, MSE = 10.6, P = 0.022 < 0.05, but not in the total score, F(2,30) = 2.910, MSE = 3.12, P = 0.099 > 0.05. In the low-level group, the cognitive load associated with learning from examples and problems simultaneously was therefore significantly lower than that associated with example-problem pairs, whereas learning achievement was only marginally better. For students in the low-level group, learning from examples and problems simultaneously proved more effective and efficient than using example-problem pairs.
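A multivariate analysis of this kind (total score and difficulty rating as dependent variables, with prior knowledge as a covariate and condition as the factor) can be run with statsmodels, as in the following sketch; the file and variable names are assumptions for illustration.

```python
# Sketch of the multivariate analysis reported above (Pillai's trace for the
# condition effect); file and column names are assumed for illustration.
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

low = pd.read_csv("low_level_group.csv")  # hypothetical subset of the data

mv = MANOVA.from_formula(
    "total_score + difficulty ~ prior_knowledge + C(condition)", data=low
)
print(mv.mv_test())  # includes Pillai's trace, F, and P for each term
```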

Exercise and test performance

These results indicate that for novice learners, learning from examples and problems simultaneously produces better test performance and lower cognitive load than does learning using example-problem pairs. Analysis of exercise scores showed a similar outcome. One-way analysis of variance on exercise scores with prior knowledge as a covariate showed a significant difference between conditions in the low-level group: F(2,30) = 8.762, MSE = 3.419, P = 0.006 < 0.05, but not in the high-level group, F(2,37) = 3.416, MSE = 1.118, P = 0.073 > 0.05. Pearson correlation analysis of exercise scores and test performance scores showed a high correlation: 0.722, P < 0.001.

Discussion

On the basis of previous research [12, 13, 16, 20], we proposed that learning from examples and problems simultaneously might be more effective and efficient than learning from example-problem pairs in the initial learning phase, and that the strategy would not lose its effectiveness for proficient learners. Our results indicate that, for novice learners, learning from examples and problems simultaneously requires a significantly lower cognitive-load investment than learning from example-problem pairs. One possible explanation is that when learning from example-problem pairs, novice learners do not necessarily understand the problem-solving schema they have just studied. When they then solve the problems, they must still use search strategies such as means-ends analysis to deal with the interacting elements they did not fully comprehend, which can impose a high cognitive load. In contrast, students possessing prior skills may be able to reconstruct the problem-solving schema while studying the examples; for them, learning from examples and problems simultaneously may be little different from learning from example-problem pairs. This explanation does not conflict with previous studies, and it is consistent with our finding that learners with better exercise scores in problem solving also obtained better test scores.

Our results also indicate that learning from examples and problems simultaneously does not lead to failure in far-transfer tests. This challenges the traditional assumption that learning from examples and problems simultaneously will lead learners to imitate rather than understand. Our results are also consistent with CLT, which proposes that learning tasks that impose a lower cognitive load enable learners to obtain better scores in near-transfer problems as well as in far-transfer problems.

Our results demonstrate that difficulty ratings may provide a better index than do mental effort ratings as a measure of cognitive load. In fact, we found that many learners failed to understand the meaning of the mental effort scale, and that they treated it as an assessment of attitude. When they learned how to solve a simple problem, they gave a high rating for mental effort because they thought they had learned it very well. A difficulty rating scale is able to avoid this type of error.

To summarize, this study demonstrated that learning from examples and problems simultaneously can be effective and efficient for novices as well as proficient learners, but that the effect is more pronounced in novice learners.

Conclusion

This study compared the effectiveness of learning from examples and problems simultaneously with that of learning from example-problem pairs, for novices as well as proficient learners. Our results demonstrate that for novice learners, learning from examples and problems simultaneously is more effective and efficient than learning from example-problem pairs: using this strategy, learners earned better scores and reported significantly lower cognitive load. Among proficient learners, the difference between the strategies was not significant.

Abbreviations

CLT: cognitive load theory

References

1. Sweller J (1988) Cognitive load during problem solving: effects on learning. Cogn Sci 12:257–285

2. Paas F, van Merriënboer JJG, Adam JJ (1994) Measurement of cognitive load in instructional research. Percept Mot Skills 79:419–430

3. Sweller J (1994) Cognitive load theory, learning difficulty and instructional design. Learn Instr 4:295–312

4. Sweller J, Chandler P (1994) Why some material is difficult to learn. Cogn Instr 12:185–233

5. Clark R, Nguyen F, Sweller J (2006) Efficiency in learning: evidence-based guidelines to manage cognitive load. Pfeiffer, San Francisco

6. Paas F, van Gog T (2006) Optimising worked example instruction: different ways to increase germane cognitive load. Learn Instr 16:87–91

7. Renkl A (2005) The worked out example principle in multimedia learning. In: Mayer RE (ed) The Cambridge handbook of multimedia learning. Cambridge University Press, New York, pp 229–245

8. Kalyuga S (2005) Prior knowledge principle in multimedia learning. In: Mayer RE (ed) The Cambridge handbook of multimedia learning. Cambridge University Press, New York, pp 325–337

9. Sweller J (2010) Element interactivity and intrinsic, extraneous, and germane cognitive load. Educ Psychol Rev 22:123–138

10. Beckmann J (2010) Taming a beast of burden—on some issues with the conceptualisation and operationalisation of cognitive load. Learn Instr 20:250–264

11. Renkl A, Atkinson RK (2010) Learning from worked-out examples and problem solving. In: Plass JL (ed) Cognitive load theory. Cambridge University Press, New York, pp 91–108

12. Trafton JG, Reiser BJ (1993) The contribution of studying examples and solving problems to skill acquisition. In: Proceedings of the 15th annual conference of the Cognitive Science Society. Lawrence Erlbaum Associates, Hillsdale, pp 1017–1022

13. Sweller J, Cooper GA (1985) The use of worked examples as a substitute for problem solving in learning algebra. Cogn Instr 2:59–89

14. Carroll WM (1994) Using worked out examples as an instructional support in the algebra classroom. J Educ Psychol 86:360–367

15. Cooper G, Sweller J (1987) The effects of schema acquisition and rule automation on mathematical problem-solving transfer. J Educ Psychol 79:347–362

16. Kalyuga S, Chandler P, Tuovinen J, Sweller J (2001) When problem solving is superior to studying worked examples. J Educ Psychol 93:579–588

17. Rourke A, Sweller J (2009) The worked-example effect using ill-defined problems: learning to recognize designers' styles. Learn Instr 19:185–199

18. Hausmann RGM, van de Sande B, VanLehn K (2008) Are self-explaining and coached problem solving more effective when done by pairs of students than alone? In: Love BC, McRae K, Sloutsky VM (eds) Proceedings of the 30th annual conference of the Cognitive Science Society. Cognitive Science Society, Austin, pp 2369–2374

19. Reisslein J, Atkinson RK, Seeling P, Reisslein M (2006) Encountering the expertise reversal effect with a computer-based environment on electrical circuit analysis. Learn Instr 16:92–103

20. van Gog T, Kester L, Paas F (2011) Effects of worked examples, example-problem, and problem-example pairs on novices' learning. Contemp Educ Psychol 36:212–218

21. Mayer RE (2005) Principles for reducing extraneous processing in multimedia learning: coherence, signaling, redundancy, spatial contiguity, and temporal contiguity principles. In: Mayer RE (ed) The Cambridge handbook of multimedia learning. Cambridge University Press, New York, pp 183–200

22. Paas F, Tuovinen JE, Tabbers H, Van Gerven PWM (2003) Cognitive load measurement as a means to advance cognitive load theory. Educ Psychol 38(1):63–71

23. Larson R, Hostetler R, Edwards BH (2008) Essential calculus. Houghton Mifflin Company, Boston

24. Paas F (1992) Training strategies for attaining transfer of problem-solving skill in statistics: a cognitive-load approach. J Educ Psychol 86:429–434

25. Paas F, van Merriënboer JJG (1994) Variability of worked examples and transfer of geometrical problem solving skills: a cognitive-load approach. J Educ Psychol 86:122–133

26. van Merriënboer JJG, Schuurman JG, de Croock MBM, Paas F (2002) Redirecting learners' attention during training: effects on cognitive load, transfer test performance and training efficiency. Learn Instr 12:11–39

27. Kalyuga S, Chandler P, Sweller J (1999) Managing split-attention and redundancy in multimedia instruction. Appl Cogn Psychol 13:351–371

28. Mayer RE, Chandler P (2001) When learning is just a click away: does simple user interaction foster deeper understanding of multimedia messages? J Educ Psychol 93:390–397

29. van Gog T, Paas F, van Merriënboer JJG (2008) Effects of studying sequences of process-oriented and product-oriented worked examples on troubleshooting transfer efficiency. Learn Instr 18:211–222

Authors’ contributions

YHH conceived of the study and discussed the related theory, participated in its design, conducted the experiment, discussed the experimental data and helped to draft the manuscript. KCL discussed the related theory, participated in the design of the study, discussed the experimental data and helped to draft the manuscript. XY participated in the design of the study, conducted the experiment, performed the statistical analysis and discussed the experimental data. JCH participated in the design of the study and discussed the experimental data. All authors read and approved the final manuscript.

Acknowledgements

No other persons have made substantial contributions to the manuscript.

Compliance with ethical guidelines

Competing interests The authors declare that they have no competing interests.

Author information

Corresponding author

Correspondence to Kuan-Cheng Lin.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

About this article

Cite this article

Huang, YH., Lin, KC., Yu, X. et al. Comparison of different approaches on example-based learning for novice and proficient learners. Hum. Cent. Comput. Inf. Sci. 5, 29 (2015). https://doi.org/10.1186/s13673-015-0048-8
