Saturday, January 10, 2009

Reducing Extraneous Cognitive Load by Accounting for Individual Differences

“I built a computer in my Electrical Engineering classes…” or “Of course I can use MS Word; I’m an English major…” or “I took a computer class in high school...do I really have to take the CIL tests?” What is our response every time? “Just show me.”

Computer and Information Literacy (CIL) is a series of six tests designed for freshmen at Utah State University to demonstrate basic skills in using a computer and in finding and ethically using information on the Internet, in the library, or from other sources. By gaining these skills early in their college careers, students acquire tools that allow them to work more efficiently than they otherwise might. Of course, many of the students who try hardest to get out of the tests, or who simply postpone them as long as possible, are the very ones who have difficulty passing one or more of them.

Cognitive Load and Cognitive Information Processing

Students come to college for many reasons. Ask 10 random students on a university campus why they’re here, and you may not be surprised to hear 11 different answers. Of course, that is because everyone is different. We come from different backgrounds, with different interests. But deep down, we’re all computers, at least according to Cognitive Information Processing (CIP) Theory, so why should learning to use computers be so hard?

CIP theorists postulate that information flows through the brain much as data flows through a computer. They describe the brain as a system that receives input in various forms; recognizes patterns and loads the relevant ones into short-term memory; processes that information further to produce a response; and finally may or may not encode the experience into long-term memory for later retrieval (Driscoll, 2004).
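To make those stages concrete, here is a toy rendering of that pipeline in Python. It is purely illustrative; the relevance filter and encode flag are my own simplification, not a claim about how memory actually works.

    # Toy CIP pipeline: input -> pattern recognition -> short-term memory
    # (if relevant) -> response -> possible long-term storage.
    long_term_memory = []

    def process(stimulus, relevant=True, encode=False):
        recognized = stimulus.lower().strip()    # sensory input, pattern recognition
        if not relevant:
            return None                          # filtered out before short-term memory
        response = "acted on: " + recognized     # further processing produces a response
        if encode:
            long_term_memory.append(recognized)  # may or may not reach long-term memory
        return response

    print(process("Save the file as a PDF", relevant=True, encode=True))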

Tools like chunking, sequencing, imagery, and mnemonics can be used by an instructor to help a student process information and build on existing knowledge. Some claim that if no previous knowledge exists, CIP does not apply, since new knowledge cannot be related to any existing schema (Gee). Regardless of whether we actually learn the way CIP theorists propose, it is important for instructors to be familiar with the tools available to them, whether technical or theoretical, and to use whichever best serves their learners.

Some of the most interesting research going on in the field of Instructional Technology involves Cognitive Load Theory. The relationship between intrinsic, extraneous, and germane cognitive load has some of its roots in CIP, which intuitively theorizes that the brain can only think about or encode a small amount of information at any given time. Intrinsic load refers to the processing the brain must do to encode practically anything; it is generally thought of as difficult or impossible to manipulate and is therefore set aside. The goal should be to minimize extraneous cognitive load, which is anything that distracts a learner, and increase germane load, the processing that allows the learner to actively participate in encoding. As van Merriënboer and Sweller (2005) point out, however, “instructional manipulations to improve learning by diminishing extraneous cognitive load and by freeing up cognitive resources is only effective if students are motivated and actually invest mental effort in learning processes that use the freed resources.”

So let’s say that we follow Gagné’s Nine Events of Instruction (Driscoll, 2004), and we know that we must motivate our learners by Gaining their Attention and Informing them of the Objectives of our instruction. For CIL, we gain their attention pretty easily because the tests stand in the way of graduation, and our objectives are the six tests they must pass. Of course, this is an area where CIL could use some improvement. Yes, it is a graduation requirement, but it is more than that: these are important skills that allow students to work more efficiently. If we can help them understand that, they will be more willing to work to increase germane load and learn the material. Motivating students to take the tests sooner is a topic that needs further attention, but the fact is that we have a captive audience, so I will focus here on how to help students once they come to us.

Learning Styles and Individual Differences

According to Shute and Towle (2003), “the challenge of improving learning and performance largely depends on correctly identifying characteristics of a particular learner.” Those characteristics include existing knowledge, cognitive ability, personality, learning styles, and interest in the subject matter. A variety of supports are available to help students pass the CIL tests: online tutorials they can review, practice tests, review sessions twice a week, and a breakdown on the actual tests of the topics with which they need help. The problem with each of these supports is that they require students to track for themselves what they need to work on and to decide which resources to use.
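As a rough sketch of what automated tracking could look like, a small script could take the topic breakdown a student gets back from a test and map the weak topics to matching study resources, instead of leaving that bookkeeping to the student. The topic names, resources, and passing threshold below are invented for illustration.

    # Map weak topics from a test breakdown to suggested study resources.
    RESOURCES = {
        "citing-sources": ["Ethics tutorial, section 3", "Practice quiz B"],
        "boolean-search": ["Library search tutorial", "Practice quiz A"],
        "file-management": ["Intro to computers tutorial, section 2"],
    }

    def recommend(breakdown, passing=0.8):
        """Return study resources for every topic scored below the passing line."""
        weak = [topic for topic, score in breakdown.items() if score < passing]
        return {topic: RESOURCES.get(topic, ["See a CIL assistant"]) for topic in weak}

    print(recommend({"citing-sources": 0.55, "boolean-search": 0.9, "file-management": 0.7}))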

Some of the skills we test lend themselves to simple memorization and regurgitation through a multiple-choice test, while others require critical thinking and a demonstration of skills. Given multiple types of content and varying skill levels among test-takers, where do we start when developing instruction? Although it is necessary to dynamically adjust the number and type of examples and the guidance for any given learner, this accommodation of individual differences, according to Reiser and Dempsey (2002), “is secondary to the fundamental content-by-strategy consistency required for effective instruction.” The type of content and the goals of instruction are primary; learning styles should be reserved as a fine-tuning adjustment to the content-based strategies. As such, we provide straightforward text and still images in the tutorials for the concept tests, but animated screencasts alongside a few text explanations in the tutorials for the performance tests. A thorough review of all the CIL tutorials needs to be done to ensure that the content matches the presentation method used.
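In code terms, the content-by-strategy idea might look like the following sketch: the content type picks the base presentation, and a learner preference only reorders it. The content types and formats here are illustrative assumptions.

    # Content type drives the base presentation; learner style only fine-tunes it.
    BASE_FORMAT = {
        "concept": ["text", "still images"],
        "performance": ["animated screencast", "text explanation"],
    }

    def presentation(content_type, prefers_text=False):
        formats = list(BASE_FORMAT[content_type])
        if prefers_text:
            # preference adjusts ordering but never overrides the content-based choice
            formats.sort(key=lambda f: 0 if "text" in f else 1)
        return formats

    print(presentation("performance", prefers_text=True))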

A review of our instructional materials would also help us avoid issues like the Expertise Reversal Effect (Kalyuga et al., 2003). Physically integrating illustrations and text reduces extraneous cognitive load for novices, but as expertise increases that benefit shrinks and eventually reverses, with experts becoming distracted by extra information they don’t need. This is a perfect place to bring in software that can track learner knowledge and personal preferences and suppress unnecessary, distracting information for the advanced learner who no longer needs it.
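A minimal sketch of how such software might apply the expertise reversal finding, assuming some tracked estimate of learner knowledge is available; the 0.75 cutoff is an illustrative assumption, not a value from Kalyuga et al.

    # Integrate text with the illustration for novices; suppress it for experts.
    def build_page(illustration, explanation, p_known):
        if p_known < 0.75:
            # novice: physically integrate picture and caption to cut extraneous load
            return {"image": illustration, "caption": explanation}
        # advanced learner: drop the redundant explanation they no longer need
        return {"image": illustration}

    print(build_page("mail-merge.png", "Step 2: choose the data source.", p_known=0.4))
    print(build_page("mail-merge.png", "Step 2: choose the data source.", p_known=0.9))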

Technology to the Rescue

In their recent book, van Merriënboer and Kirschner (2007) extend their previously published Four-Component Instructional Design (4C/ID) model. A central piece of both 4C/ID and the extended Ten Steps to Complex Learning is providing Just-in-Time (JIT) information to the student right when it is needed. They give the example of a coach who observes her players from the side of the playing field and shouts directions like “remember to bend your knees...” or “no, keep your eye on the ball...” It is difficult to explain just why a certain piece of information is exactly what a learner needs at a given moment, but a master teacher can predict what the learner needs next. They point out that the more complex and open-ended the learning task, the more difficult it is to create intelligent help and tutoring systems to provide that JIT information.

A great number of online courses are simply copies of old print materials converted to PDF or HTML and placed on the web. “Instead of the page-turners of yesterday, we now have scrolling pages, which is really no improvement at all. Adaptive e-learning provides the opportunity to dynamically order the pages so that the learner sees the right material at the right time,” say Shute and Towle (2003). They go on to list the essential components of a truly effective adaptive learning tool:

1. An independent and robust delivery system and a predictably structured independent content system that can be adapted to each learner.
2. Embedded assessments, delivered to the student during the course of learning, which trigger the presentation of more of the same topic or a new topic (see the sketch after this list). Assessment should be integrated throughout to guide the instruction rather than simply tacked on at the end.
3. Genetic programming, which can take an initial set of human-designed rules, perhaps set with data from a pilot study, and evolve as the system is used to increase its accuracy.
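
To make the second component a little more concrete, here is a minimal sketch of an embedded check deciding whether to stay on a topic or advance. The topic list and the two-thirds cutoff are illustrative assumptions, not anything prescribed by Shute and Towle.

    # Embedded assessment drives sequencing: repeat the topic or move on.
    TOPICS = ["searching", "evaluating-sources", "citing-sources"]

    def next_topic(current, embedded_score, items=3):
        if embedded_score / items < 2 / 3:
            return current                       # present more of the same topic
        i = TOPICS.index(current)
        return TOPICS[i + 1] if i + 1 < len(TOPICS) else None   # None = course complete

    print(next_topic("searching", embedded_score=1))   # stays on "searching"
    print(next_topic("searching", embedded_score=3))   # advances to "evaluating-sources"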


Koedinger and Anderson (1997) describe a study in urban Pittsburgh high schools that helped at-risk students learn algebra skills in the context of real-life situations, such as comparing the prices of two moving companies or rental car agencies. The learning environment included a grapher, a calculator, a spreadsheet, and an organized curriculum of problem situations. The system used psychological modeling techniques they call model tracing and knowledge tracing: model tracing monitors student progress through a problem solution, while knowledge tracing monitors student learning from problem to problem. Together, these let the software individualize problem selection, pace students through the curriculum, and give immediate feedback as they worked on a problem. Stress from making errors was also reduced because others in the class didn’t see the mistakes. The program was so successful that all students were sent through the more advanced algebra class rather than the basic math class, standardized test scores at the end of the year increased by 15%, and additional high schools were brought on board.

An important lesson from this study for CIL is that our examples and test questions should reflect real situations students may actually find themselves in. Rather than using a spreadsheet to manipulate yearly sales figures for a store, for example, students should be calculating grades in a class, budgeting for school and living expenses on a part-time job, or tracking basketball box scores.
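For a feel of what knowledge tracing involves, here is a minimal sketch of the standard Bayesian update that approaches like theirs build on: after each attempt, the estimate that the student has mastered a skill is revised, and that estimate can drive problem selection. The parameter values are illustrative assumptions, not figures from the study or from the actual Cognitive Tutor code.

    # Sketch of Bayesian knowledge tracing for a single skill.
    P_LEARN = 0.2   # chance of learning the skill on any given opportunity
    P_SLIP = 0.1    # chance a student who knows the skill still answers wrong
    P_GUESS = 0.2   # chance a student who doesn't know the skill answers right

    def update_mastery(p_known, correct):
        """Revise P(skill known) after one observed attempt."""
        if correct:
            evidence = p_known * (1 - P_SLIP) + (1 - p_known) * P_GUESS
            posterior = p_known * (1 - P_SLIP) / evidence
        else:
            evidence = p_known * P_SLIP + (1 - p_known) * (1 - P_GUESS)
            posterior = p_known * P_SLIP / evidence
        # allow for the chance the student learned the skill during this step
        return posterior + (1 - posterior) * P_LEARN

    # Example: a student starts at 0.3 and answers right, wrong, right.
    p = 0.3
    for outcome in [True, False, True]:
        p = update_mastery(p, outcome)
        print(round(p, 2))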

What next?

Given the large number of students we see and the large variance in individual learning styles, there is a lot we could do at CIL to improve both our testing and our teaching methods. We already track performance on the actual CIL tests, but by implementing a tutoring system that also tracks student behavior as they prepare, we could compare that preparation data with test performance data and figure out whether students are simply practicing Selective Attention (Driscoll, 2004) or whether there is a problem with the tutorials. By tracking students through the tutorials, we could also provide additional assistance to the students who barely slip through with the minimum score after expending far more effort than necessary. Currently, the only way to collect that kind of data would be to “ask test takers to reflect aloud on the cognitive and evaluative processes” used to study and take tests (Gall et al., 2007). Of course, asking students to track their own actions and thoughts adds extraneous cognitive load, which decreases their performance. The next step is to load our content and practice items, after whatever updates need to be made, into an adaptive tutoring system and find out what is really going on.
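Even a very small amount of behavioral logging would get us started. The sketch below shows the kind of comparison I have in mind, lining time spent in the tutorials up against test scores; every field name and number in it is hypothetical.

    # Join hypothetical tutorial activity logs with test scores per student.
    from collections import defaultdict

    tutorial_log = [
        # (student_id, tutorial_page, seconds_spent)
        ("a123", "word-formatting", 340),
        ("a123", "word-mail-merge", 1250),
        ("b456", "word-formatting", 95),
    ]
    test_scores = {"a123": 71, "b456": 88}   # illustrative scores

    time_spent = defaultdict(int)
    for student, page, seconds in tutorial_log:
        time_spent[student] += seconds

    for student, seconds in time_spent.items():
        print(student, seconds // 60, "min in tutorials ->", test_scores[student], "on the test")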

REFERENCES

Driscoll, M. (2004). Psychology of learning for instruction (3rd ed.). New York: Allyn & Bacon.

Gall, M., Gall, J., & Borg, W. (2007). Educational research (8th ed.). Boston: Allyn & Bacon.

Gee, D. (n.d.). Learning theories. eLearning Source. Accessed February 2007.

Kalyuga, S., Ayres, P., Chandler, P., & Sweller, J. (2003). The expertise reversal effect. Educational Psychologist, 38(1).

Koedinger, K., & Anderson, J. (1997). Intelligent tutoring goes to school in the big city. International Journal of Artificial Intelligence in Education, 8.

Reiser, R., & Dempsey, J. (Eds.). (2002). Trends and issues in instructional design and technology. Upper Saddle River, NJ: Merrill/Prentice Hall.

Shute, V., & Towle, B. (2003). Adaptive e-learning. Educational Psychologist, 38(2).

van Merriënboer, J., & Kirschner, P. (2007). Ten steps to complex learning: A systematic approach to four-component instructional design. Mahwah, NJ: Lawrence Erlbaum Associates.

van Merriënboer, J., & Sweller, J. (2005). Cognitive load theory and complex learning: Recent developments and future directions. Educational Psychology Review, 17(2).

2 comments:

Nathan Toone said...

I'm so glad to hear that even at the "big" schools you have to deal with people like that.

We give a preassessment test that people can use in order to get out of taking different parts of the class. It's really easy - pass the test, pass that part of the class. Fail the test, no big deal, it doesn't affect your grade in any way - you just come to class and then take the final at the end. Some of my favorite (actual) exchanges that I've had:

Student: "I don't need to take the test - I already know how to use <INSERT PROGRAM HERE>"
Me: "OK - then you shouldn't have any problem taking the test"

Student (while taking the test): "Can't I have just a few more minutes? I only got to question 10 [out of 16] - but the ones that I've done so far, I've done right"
Me: "Part of taking this test is doing it within the desired time limit"

(I wanted to add on that last one "My 8-year-old daughter could pass this test given an unlimited time...that's the whole purpose of the help system")

Student (*SAME* one as the other two quotes above): "Well, I didn't do so well on the test - do you think I could take it again?"
Me: "Yes - that is the purpose for that test...come to class, do your homework, and take the final. This one doesn't affect your grade."
Student: "But can I just take the preassessment again - I *know* I can pass it next time"

-- TIME PASSES...Student doesn't attend a single class session between the preassessment and the final - but manages to drag himself out of bed on the day of the final --

Student (when I give the "5-minutes remaining" notice for the test): "Can I have just a little more time? I'm SOOOO close" (he's on question 10 of 16 again...the test is identical to the preassessment)
Me (in my mind - not out loud): "Sure - why don't you take the class again next semester...and I'd try one that's not at 7:30am - because you obviously can't get up early enough to come to this one."

(not to mention the fact that I only teach the 7:30 class - and I don't want you in my class again...you're a pain)

robmba said...

Yeah, we currently don't have a time limit on our tests, but we've been talking about putting one in place. We have people who have never used PowerPoint before, and they come in and spend 3 hours putting together a 6-slide presentation, learning as they go. I pulled some stats, and on all the MS Office tests, people who spend over an hour average 10 points lower.

The important thing, I think, is tracking student behavior so we can use that information in making sure we're doing all we can to help them learn.

But I've had people forget they were enrolled in my class or claim they dropped it and show up at the end of the semester looking for mercy. So, there's only so much we can do.