LodeStar 7 Lite Coming This Fall

LodeStar 7 Lite is the first complete redesign of LodeStar since LodeStar 1.0 appeared in the fall of 2001. At that time, we envisioned a system that could adapt easily to Java applets, Flash, eBooks, and whatever new technology came along. The initial design lived up to its promise. When HTML5 became all the rage in 2011, LodeStar moved easily in the direction of full HTML5 support.

[Screenshot: the new interface for LodeStar 7]

But we also added many features that were not well integrated into the initial design. Vector graphics editing and the interactive graphic are two examples. Vector graphics editing requires launching a specialized editor. Making a graphic interactive requires an author to know which menus to select and which parameters to change. The features are there, but they are not intuitive. They feel like add-ons to the initial design.

Compounding our dissatisfaction with the original design is the need to integrate new feature requests. Normandale College, for example, asked us to display pages and their branches in an interactive visual.

We also received requests for accessibility on the authoring side, as well as for mobile and web-based authoring. These were difficult to achieve with the original design.

The design of LodeStar 7 stems from twelve years of experience and from learning what faculty really care about. We have insight into which aspects of LodeStar faculty truly love and which matter less. We have a laser focus on the features that really make a difference and, as a result, have integrated them more fully into the product. Features that became important and were added after the original design will now be first-class citizens in the product.

When LodeStar 7 Lite is released in late fall of 2013, it will not have all of the rich features present in LodeStar 6.7. The HTML editor, for example, will be simpler. Vector graphics drawing and spell checking will be missing.

In future releases, LodeStar 7 will gradually grow from a lite version into a full-featured version. The missing features will be more tightly integrated with the product, resulting in a better authoring experience.

Early releases of LodeStar 7 will be made available, free of charge, to our Facebook community in early to mid fall. Please follow us at https://www.facebook.com/LodeStarLearn.

LodeStar 7 Lite relies on your good word of mouth. Rather than expending energy and resources on product marketing, we will continue to build upon the new design and count on you to announce its availability. The full-featured version of LodeStar 7 will be marketed primarily by RiverTown Communications when it becomes available.

Please follow our developments on Facebook and help us spread the message that LodeStar 7 Lite is coming this fall.  We will be noticeably absent from the Minnesota eLearning Summit next week – but please assure your colleagues that a new release is on the horizon.

As always, I am deeply committed to this work.  My guiding principle has been to make it easy for faculty to employ a variety of strategies that support learning outcomes.

We hope that LodeStar 7 makes it easy for faculty to expand their tool chest of effective online learning strategies. I look forward to your feedback on Facebook.

eLearning Projects — What can go so terribly wrong!

A company gets excited about an eLearning project.  Funding is secured.  The two most difficult challenges – getting support and getting money – are met.  The rest is downhill.  Simple steps, really.

Perhaps not so simple.

The analysis begins…

How should people behave when the training is concluded?  What are the work tasks that need to be accomplished?  What are the instructional goals that will help someone execute those tasks? What kinds of goals do we have?  Do they relate to the employees’ or students’ attitude, their knowledge base, procedures to follow, problems to solve – what?  What activities should the learner complete that will increase the likelihood of learning?  How do we know that learning has indeed taken place?  And so on.  Pretty straightforward; pretty textbook instructional design.

So what, too often, goes terribly wrong? (And it does, so often, go terribly wrong.) In the end, why are learners not engaged, training outcomes not realized, money and time wasted?

In my experience, any number of things can stand in the way of effective analysis, design and deployment.  Here are just some of the highwaymen that tend to ambush projects:

Tug of War – Conflicting goals

The proposed eLearning project is like a blank slate or naked canvas.  All of the stakeholders project onto the canvas their own aspirations for what the training should do – what it should accomplish.  After all, this is their one shot to get it done.  The project funds have been secured.  There may not be another opportunity.

When stakeholders hold conflicting expectations, the debate seldom starts over the stated objectives in the instructional design document. It starts during development, or over a nearly finished product.

The solution, practiced by only a few companies in my experience, is rapid and iterative design. If the ISD document isn’t real, then let’s see it on screen, quickly and cheaply. Then we can ‘duke it out’ – early in the game, before it’s too late. Dr. Michael Allen, in his Guide to eLearning, promotes and supports the concept of rapid and iterative design. Dr. Greg Sales, at Seward Incorporated, includes a rapid prototype in his company’s instructional design process. I can only say that, in my experience, their wisdom holds weight.

For whatever reason, it isn’t real until it’s on the screen. 

One of my projects got caught up in a titanic struggle between the director of operations and the director of training. One individual wanted the training to be more conceptual; the other, more practical and grounded in everyday concerns like passing certification tests. One person wanted simulations; the other, drill and practice.

Despite what the instructional design document stated, each individual expected the final project to fulfill his personal wishes for the training product.   A rapidly produced prototype enabled each person to visualize where the project was really headed.  It became an opportunity to work out their differences and adjust their expectations.

This may be an extreme case – but certainly, a rapid prototype helps stakeholders to visualize the strategies and get a sense of functionality and scope.  If the prototype raises concerns, not too much has been invested in production.  Changes can be made. 

Content, content – too much content

In my youth, one of our pastimes was packing people into a Volkswagen for fun. Yes, we actually did this. The challenge was to see how many people could squeeze in. Of course, the simple function of the vehicle was lost: no one could actually drive. I often think of eLearning projects in that way: packed with so much content that no learner can function.

Subject matter experts love the currency of their expertise – all those facts, figures, policies, rules … procedures. If only they could do a brain dump on the newbies, the correct behaviors would be achieved. And so there it is: the blank eLearning page – with no limit to its scrollability, pageability, and hyperlinkability, and no regard for its learnability.

My son hesitantly confided in me on a recent visit home. He hated to say it. His first online learning class at college was boring. He imagined that I was the flag bearer for the whole eLearning industry – and he was apologetic.

He was in a literature class.  The eLearning was poorly formatted text on screen, followed by quiz questions.  Text and questions.  Text and questions.  The joy of literature was squeezed out of this course.  Resolved to complete the course, my son called up his instructor to offer some suggestions.  In response, the instructor suggested that he withdraw from the course.

Learner, learner – too little learner

If Django Reinhardt, one of the jazz greats of the ’30s and ’40s, came to my house one day to instruct me in guitar, he would listen to me play once and then re-evaluate his entire teaching approach. Nobody who teaches skills would factor the learner out of the equation – and yet we do it in eLearning all the time.

For years, we did this in elementary mathematics education. We created engines and pumped content into those engines. We threw in a pre-test, presented a concept, provided practice, and then post-tested. We mass-produced lessons across all strands and all grades – with every learner in mind, and no learner in mind. The whole approach was content-centric and didn’t account for the different ways that learners process information. The end product was almost cheap enough for districts to purchase. Some, of course, did purchase – but when standardized tests came along, they realized that they had wasted their money. Student performance was not commensurate with the investment.

eLearning can’t factor out the learner. Instructional designers need the opportunity to meet learners in their work context. Designers need to understand the work setting, the training setting, and learners’ typical experiences and limitations. Designers need to learn about the learner firsthand, not filtered through the perceptions of others.

Years ago, I designed some simulations for a utility.  The workers were taught the principles underlying processes so that they could tweak some controls and save the utility some money.  The training was designed for one city.   Decision-makers saw the training and thought that it could be applied nicely to their utility.  When I first met with the workers in the second city, I realized that no one trusted them to change the controls.  The processes were running on auto pilot.  The workers played cards at a table in the control room.  The system ran itself.  The training was useless in this second context.

Conclusion

For eLearning to be engaging, its design needs to be free of the conflicts and compromises that come from too many stakeholders with opposing views and unaddressed expectations. If it can’t be free of conflict, it should at least expose the conflict at an early and inexpensive stage. eLearning should also stick to what it does best: simulate, stimulate, and individualize. It shouldn’t push learners through pages and pages of content with limited interactivity. Finally, it should be designed not for the needs of a phantom learner but for the real, ‘live’ one who will actually use the training.

Climbing Bloom’s Taxonomy

In support of online learning, we often write about climbing Bloom’s taxonomy with the help of learning objects created from templates. Bloom’s taxonomy refers to the work of Dr. Benjamin Bloom, who wrote his Taxonomy of Educational Objectives in 1956. Since then, the taxonomy has been widely used in curriculum and instructional design to classify the types of educational activities that require students to think. Those activities engage students in remembering, understanding, applying, analyzing, evaluating, and creating.

Instructors understand that a curriculum should not only engage students in recalling facts, but should involve students in understanding principles and concepts, applying their knowledge in novel situations, analyzing information with their new understanding, making critical choices and creating something. ‘Creating something’ requires knowledge of the facts, understanding of the concepts and, possibly, an analysis of a problem situation and judgment about what solutions might best apply. ‘Creating something’ is a synthesis of all these things and an observable outcome of what the student has learned.

Climbing Bloom’s taxonomy means helping students progress through the recall of information to higher orders of thinking such as understanding, applying, analyzing, etc. As online learning instructors, we look for opportunities to help students ‘climb the ladder’. In pathophysiology, we might create activities that help students recall the various toxins produced by bacteria or the normal ranges expected from blood tests, but that wouldn’t be enough unless students understood how that information should be applied.

In Biology in Bloom: Implementing Bloom’s Taxonomy to Enhance Student Learning in Biology, the authors provide the Blooming Biology Tool, which lists the levels of Bloom’s taxonomy, along with examples of biology exam questions aligned to the taxonomy. What follows is a snippet of their work. Their table aligns with Bloom’s original levels: knowledge, comprehension, application, analysis, synthesis, and evaluation. Since Bloom wrote his original taxonomy, a revision has been offered that labels the categories with student-centric action verbs rather than nouns and places the act of creating at the top of the taxonomy. I have added labels in parentheses to show the alignment with the revised taxonomy.

Knowledge (Remembering)
Identify the parts of a eukaryotic cell; identify the correct definition of osmosis. 

Comprehension (Understanding)
Describe nuclear transport to a lay person; provide an example of a cell signaling pathway.

Application (Applying)
Predict what happens to X if Y increases.

Analysis (Analyzing)
Interpret data, graphs, or figures; make a diagnosis or analyze a case study; compare/contrast information.

Synthesis (Creating)
Develop a hypothesis, design an experiment, create a model.

Evaluation (Evaluating)
Critique an experimental design or a research proposal; appraise data in support of a hypothesis.

Instructors should analyze their courses in much the same way. They should understand what cognitive level each course element, from its objectives and content to its activities and assessments, requires of the student.

Biology in Bloom makes several critical points about the use of Bloom’s taxonomy in higher education. Here is the first:

“Most faculty would agree that academic success should be measured not just in terms of what students can remember, but what students are able to do with their knowledge. It is commonly accepted that memorization and recall are lower-order cognitive skills (LOCS) that require only a minimum level of understanding, whereas the application of knowledge and critical thinking are higher-order cognitive skills (HOCS) that require deep conceptual understanding (Zoller, 1993).”

A second critical point:

“If classroom activities focus on concepts requiring higher order cognitive skills but faculty test only on factual recall, students quickly learn that they do not need to put forth the effort to learn the material at a high level. Similarly, if faculty primarily discuss facts and details in class but test at a higher cognitive level, students often perform poorly on examinations because they have not been given enough practice developing a deep conceptual understanding of the material. Either case of misalignment of teaching and testing leads to considerable frustration on the part of both instructor and student.”

We see effective climbing of Bloom’s taxonomy in some of the best materials produced by book publishers. Lippincott, for example, accompanies its nursing texts with multimedia lessons that present factual information and then require students to apply the facts and their understanding of concepts to case studies. One strategy that Lippincott uses is to move students through a case study in which an emergency room patient presents with a history of complaints that are ultimately related to the topic of study.

Instructors don’t always have access to publisher lessons in their field of study, or a license to use them. That raises the question: can an instructor use learning objects to create what the publishers have created, a presentation of information followed by a case study in which students apply what they have learned? More specifically, can instructors use LodeStar Learning’s templates to help students climb Bloom’s taxonomy?

The answer is yes. As this Web Journal develops and new articles are published, we’ll explore many examples of lessons at each level of Bloom’s taxonomy. So stay tuned to this journal.

References:
Crowe, A., Dirks, C., & Wenderoth, M. (2008). Biology in Bloom: Implementing Bloom’s Taxonomy to Enhance Student Learning in Biology. CBE – Life Sciences Education, 7(4), 368–381.
Zoller, U. (1993). Are lecture and learning compatible? Maybe for LOCS: unlikely for HOCS (SYM). Journal of Chemical Education, 70, 195–197.