A company gets excited about an eLearning project. Funding is secured. The two most difficult challenges – getting support and getting money – are met. The rest is downhill. Simple steps, really.
Perhaps not so simple.
The analysis begins…
How should people behave when the training is concluded? What are the work tasks that need to be accomplished? What are the instructional goals that will help someone execute those tasks? What kinds of goals do we have? Do they relate to the employees’ or students’ attitude, their knowledge base, procedures to follow, problems to solve – what? What activities should the learner complete that will increase the likelihood of learning? How do we know that learning has indeed taken place? And so on. Pretty straightforward; pretty textbook instructional design.
So what, too often, goes terribly wrong? (And it does, so often, go terribly wrong.) In the end, why are learners not engaged, training outcomes not realized, money and time wasted?
In my experience, any number of things can stand in the way of effective analysis, design and deployment. Here are just some of the highwaymen that tend to ambush projects:
Tug of War – Conflicting goals
The proposed eLearning project is like a blank slate or naked canvas. All of the stakeholders project onto the canvas their own aspirations for what the training should do – what it should accomplish. After all, this is their one shot to get it done. The project funds have been secured. There may not be another opportunity.
When there are conflicting expectations among stakeholders, the debate seldom begins over the stated objectives of the instructional design document. It begins during development, or over a nearly finished product.
The solution, practiced by only a few companies in my experience, is rapid and iterative design. If the ISD paper isn’t real, then let’s see it on screen, quickly and cheaply. Then we can ‘duke’ it out – early in the game, before it’s too late. Dr. Michael Allen, in his Guide to eLearning, promotes and supports the concept of rapid and iterative design. Dr. Greg Sales, at Seward Incorporated, includes a rapid prototype in his company’s instructional design process. I can only say that, from my experience, their wisdom holds weight.
For whatever reason, it isn’t real until it’s on the screen.
One of my projects got caught up in a titanic struggle between the director of operations and the director of training. One individual wanted the training to be more conceptual; the other, more practical and grounded in everyday concerns like passing certification tests. One person wanted simulations; the other, drills and practices.
Despite what the instructional design document stated, each individual expected the final project to fulfill his personal wishes for the training product. A rapidly produced prototype enabled each person to visualize where the project was really headed. It became an opportunity to work out their differences and adjust their expectations.
This may be an extreme case – but certainly, a rapid prototype helps stakeholders to visualize the strategies and get a sense of functionality and scope. If the prototype raises concerns, not too much has been invested in production. Changes can be made.
Content, content – too much content.
In my youth, one of our pastimes was packing people into a Volkswagen for fun. Yes, we actually did this. The challenge was to see how many people could squeeze in. Of course, the simple function of a vehicle was lost: no one could actually drive. I often think of eLearning projects in that way – packed with so much content that no learner can function.
Subject matter experts love the currency of their expertise – all those facts, figures, policies, rules … procedures. If only they could do a brain dump on the newbies – the correct behaviors would be achieved. And so there it is. The blank eLearning page – with no limit to its scrollability, pageability, hyperlinkability, and no regard for its learnability.
My son hesitantly confided in me on a recent visit home. He hated to say it. His first online learning class at college was boring. He imagined that I was the flag bearer for the whole eLearning industry – and he was apologetic.
He was in a literature class. The eLearning was poorly formatted text on screen, followed by quiz questions. Text and questions. Text and questions. The joy of literature was squeezed out of this course. Resolved to complete the course, my son called up his instructor to offer some suggestions. In response, the instructor suggested that he withdraw from the course.
Learner, learner – too little learner
If Django Reinhardt, one of the jazz greats of the 30s and 40s, came to my house one day to instruct me in guitar, he would listen to me play once and then re-evaluate his entire teaching approach. Nobody who teaches skills would factor the learner out of the equation – and yet we do it in eLearning all of the time.
For years, we did this in elementary mathematics education. We created engines and pumped content into those engines. We threw in a pre-test, presented a concept, provided a practice and then post-tested. We mass-produced lessons across all strands, across all grades – with every learner in mind, and no learner in mind. The whole approach was content centric but didn’t account for the different ways that the learner processed the information. The end product was almost cheap enough for districts to purchase. Some, of course, did purchase – but when standardized tests came along, they realized that they wasted their money. Student performance was not commensurate with the investment.
eLearning can’t factor out the learner. Instructional designers need the opportunity to meet learners in their work context. Designers need to understand the work setting, the training setting, the learners’ typical experiences, and their limitations. Designers need to learn about the learner firsthand, not filtered through the perceptions of others.
Years ago, I designed some simulations for a utility. The workers were taught the principles underlying processes so that they could tweak some controls and save the utility some money. The training was designed for one city. Decision-makers saw the training and thought that it could be applied nicely to their utility. When I first met with the workers in the second city, I realized that no one trusted them to change the controls. The processes were running on auto pilot. The workers played cards at a table in the control room. The system ran itself. The training was useless in this second context.
Conclusion:
For eLearning to be engaging, its design needs to be free of the conflicts and compromises that come from too many stakeholders with opposing views and unaddressed expectations. If it can’t be free of conflict, it should at least expose the conflict at an early and inexpensive stage. eLearning should also stick to what it does best: simulate, stimulate and individualize. It shouldn’t push learners through pages and pages of content with limited interactivity. And finally, it should be designed not for the needs of a phantom learner but for those of the real, ‘live’ one who will actually use the training.