**Introduction**

As online instructors, we recognize that students benefit from interacting with content in a manner that truly makes them think. And yet we find the task of creating interactive, meaningful content to be extremely challenging and time-consuming.

For some subject matter, interactive content that lets students manipulate the data and see different outcomes can be highly effective. Marketing students can test the principles of the marketing mix by adjusting the amount invested in the quality of the product versus its advertising. Civil engineering students might control the amount of ammonia in a wastewater treatment pond or the food to microorganism ratio. Sociology students might explore the consequences of unequal distribution of wealth. Health care students might explore the implementation variables of chronic care management.

To tease out the benefits of interactive content, let’s start with a good example: the principles of composting. That may seem like an odd place to start, but we all understand composting at some level. How would an online instructor design an interactive lesson on composting that is effective and teaches the underlying principles?

Composting is bug farming. Effective composting results from the right combination of carbon-rich and nitrogen-rich material, water, and heat. Students can learn composting by *doing*, but that might take weeks, and without careful measurements and some guidance they may never come to understand the underlying relationships and their effects. They can learn from a handbook that teaches procedures, or from a science text that teaches principles. In either case, their readings may or may not lead to real understanding.

In contrast, in an online environment the principles of composting can be taught through interactive models. Students could be presented with an interactive model and challenged to generate the most compost in the shortest period of time. In response, students might add more carbon-rich material, such as dry leaves, to the compost. Or change the moisture content. Or change the ambient temperature. Once students have tweaked and played with the parameters, their instructor could assess their understanding: do they truly grasp the relationships, the principles, the cause and effect? The instructor could then invite students to apply their knowledge to building a compost pile of their own.

As mentioned, students could follow the procedures of composting without understanding the underlying principles. Students could recite textbook statements without really thinking about them. Online instructors must constantly ask the question: how much thinking are my students actually doing in my course? Not reading. Not quizzing. Not reciting. But thinking.

When we write about time-worn concepts such as interactivity and engagement, that is what we are driving at. Interactive engagement affords us the opportunity to get students to *think*. Discussions, projects, group work, and online examinations can certainly challenge students to think, but how can we, without computer programming knowledge, facilitate interactive engagement between students and the content in the manner alluded to above: one that fosters curiosity, promotes genuine interest in the content, and *puzzles* students?

**The Explore – Validate Design Pattern**

The Explore – Validate Design Pattern gets students to think. It is a form of interactive engagement that has, as one element, intense student-to-content interaction.

Interaction is a key word in online learning. Successful, effective online learning happens through students interacting with each other, with their instructor, and with the course content. Each type of interaction demands special skills and intention from the instructor. With respect to student-to-student and student-to-instructor interaction, instructors can draw from their ability to foster interpersonal communication. Good teachers know how to facilitate group discussions and engage students in Socratic dialog. Although instructors must learn how to adapt their strategies to an online environment, many of them have a good starting place. The third type of interaction, student-to-content, may arguably be the most challenging for instructors new to online learning.

Not all student-to-content interactions are equal. At the lowest level, passive eLearning involves very little interaction. Clicking buttons to page through content does not constitute interaction. Clicking through a presentation on composting, for example, constitutes a very low level of interaction. A higher level of student-to-content interaction might involve multimedia in the form of animations and video, drag-and-drop exercises, and other basic forms of interaction. A moderate level of interaction might involve scenarios, branched instruction, personalized learning, case studies, decision making, and the instructional design patterns that have been the basis of our past web journal articles. The highest and most technical level of interaction might involve virtual reality, immersive games, simulations, augmented reality, and more.

That said, the highest level of interactivity is not necessarily the best level for students. Interaction is essential insofar as it helps students achieve a cognitive goal, whether that relates to remembering, understanding, or applying. Interactions are useful only if they help students remember better, understand a concept or a principle, or apply their learning. One can’t categorically say that fully immersive interactive games are better than animated videos or drag-and-drop interactions. If the objective is that students will remember essential medical terms, then a fully immersive environment may hinder that accomplishment. Richard Mayer calls this extraneous processing: the attention that the learner must give to features of the learning environment that do not contribute to achieving the learning goal. If extraneous processing is too high, it impedes the student’s ability to focus on relevant information.

**How it works**

Considering the type of learning that students must activate is critical in determining whether instructors should plan on higher levels of interaction. In our second example, students are introduced to Isle Royale. Students examine data related to the wolf and moose populations. They must draw inferences about how the rise and decline of one population affects the other. If this were a declarative knowledge lesson, students would simply need to recite the critical facts. How many moose were introduced to Isle Royale? How many wolves? What are the population numbers today? What were they at any given point? Students can recite those numbers without understanding the true nature of the interaction between the wolf and moose populations on the island. The real objective of the lesson is to understand feedback loops in ecological systems. Students arrive at this understanding not by reading facts and figures, but by asking what-if questions and manipulating the inputs of a simple simulation.
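The feedback loop students are meant to infer can be sketched with a simple discrete-time predator-prey model. This is a rough illustration only: the coefficients below are invented for the sketch, not Isle Royale data, and the logistic growth term is an assumption added to keep the toy model bounded.

```python
# Minimal discrete-time predator-prey sketch of a wolf-moose feedback loop.
# All coefficients are illustrative assumptions, NOT Isle Royale data.

def simulate(moose=500.0, wolves=20.0, years=50, birth=0.3,
             capacity=2000.0, predation=0.002,
             wolf_gain=0.0005, wolf_death=0.25):
    """Step both populations year by year; return the full history."""
    history = [(moose, wolves)]
    for _ in range(years):
        eaten = predation * moose * wolves               # predation couples the two stocks
        growth = birth * moose * (1 - moose / capacity)  # logistic moose growth
        new_moose = max(moose + growth - eaten, 0.0)
        new_wolves = max(wolves + wolf_gain * moose * wolves
                         - wolf_death * wolves, 0.0)
        moose, wolves = new_moose, new_wolves
        history.append((moose, wolves))
    return history

history = simulate()
```

Re-running with a higher `predation` or a lower `capacity` mirrors the what-if manipulation the lesson asks for: each population’s change feeds back into the other’s, which is the relationship students are supposed to notice.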

Asking what-if questions is an inductive approach. Rather than being given a description of a law, for example, or a principle or concept, students infer the needed information from a simulation or a set of examples.

The deductive approach is the opposite. Perhaps an overly negative view is that instructors who use a deductive approach simply state a principle or concept. All of the students’ cognitive work is in listening and, perhaps, taking good notes.

Faculty may be skeptical or wary of inductive learning. It takes considerable time to set up; it seems less efficient. By contrast, in my experience, faculty commonly engage students in deductive learning. The instructor presents and explains a concept. Students take notes. Lectures are often characterized by the deductive approach.

The inductive method makes use of student inferences. Instead of explaining concepts, the instructor presents students with a model or examples that embody the concept. The student manipulates inputs and ‘infers’ what the underlying rules are.

Instructors who are critical of inductive approaches fear that students will make incorrect inferences. In my experience, inductive learning is more challenging to facilitate. It is easier to state facts than to set up examples from which students infer facts, especially given the hazard that students could infer the wrong facts.

In recognition of this, the instructional design pattern called Explore and Validate features a check-for-understanding activity. Explore and Validate is one form of interactive engagement.

**An example**

Explore and Validate offers an environment in which students manipulate models or examine examples, draw inferences and check their understanding in some manner in order to validate their conclusions.

For example, students may read cases in which victims express feelings toward their oppressors. In a deductive approach, the instructor can simply define Stockholm syndrome. The instructor may explain that hostages afflicted with this syndrome express feelings of empathy toward their captors. An assessment might ask students to define Stockholm syndrome. An inductive approach might have students read brief summaries of cases in which they “notice” that the victims become empathetic or sympathetic toward their oppressors. Students can describe the syndrome, offer explanations, and even label the syndrome. The instructor would then contrast the students’ descriptions with a more formalized, clinical description. The first part of the activity is the explore phase. The second part is the validate phase.

In our example below, students are told about Isle Royale. In the early 1900s, moose swam to Isle Royale from Minnesota. Fifty years later, a pair of wolves crossed an ice bridge to the island from Canada. In a lesson designed with the Explore-Validate instructional design pattern, an optional strategy is to ask students to think about and predict the outcome of a given scenario: in this example, what happens when a pair of wolves is introduced to an island with a finite number of moose. Students might conclude that the moose population would eventually be annihilated, but that is not what happened historically. As students contrast their original predictions with the simulation results, they may be struck by the difference. As I’ve written many times before, this is cognitive dissonance, and when applied correctly it may stimulate learning. When applied correctly, students will say, “I didn’t know that,” and want to probe more. When applied incorrectly, students will simply be overwhelmed and shut down.

The key exploration in the moose-wolf example is with a model. The model was generated by Scott Fortmann-Roe with a tool called InsightMaker. InsightMaker is a free, cloud-based simulation and modeling tool that is easy to use and yet powerful. It works with the LodeStar authoring tool as either embedded content or linked content. Models created with InsightMaker can be used to promote critical thinking in students. A model can expose input parameters as sliders; students change the value of an input and see the change in the output after they click the ‘Simulate’ button.

InsightMaker models are made up of stocks, variables, flows, converters, and more. Stocks are simply containers for values such as population. Variables hold values such as birth rate, death rate, and interest rate. Flows are rules that can perform arithmetic operations on variables and change the values in stocks. Students can click on the flow affecting the value of a stock and see its rules. They can explore all of the relationships. In the case of a feedback loop, where the output is combined with the input to affect a new output, students can study the relationships and gain insight into dynamic systems.

Instructors can also simulate the spread of disease through a population. They can control the probability of infection and the degree to which the population can migrate away from the infected. They can control the length of infection and the transition to a recovered state. The instructor can model one person and then generate a population of such persons.
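The stock-and-flow mechanics described above can be sketched in a few lines of code. Below is a rough, hand-rolled SIR-style disease model, not an InsightMaker export: the stocks (susceptible, infected, recovered) are plain variables, the flows are the arithmetic that moves value between them, and all rate values are assumptions made up for the sketch.

```python
# Rough stock-and-flow sketch of disease spread (SIR-style).
# Stocks: susceptible, infected, recovered. Flows: infection, recovery.
# Parameter values are illustrative assumptions, not epidemiological data.

def simulate_sir(susceptible=990.0, infected=10.0, recovered=0.0,
                 infection_rate=0.3, recovery_rate=0.1, steps=100):
    """Advance the three stocks one time step at a time."""
    population = susceptible + infected + recovered
    history = []
    for _ in range(steps):
        # Flows: the rules that move value between stocks each step.
        new_infections = infection_rate * susceptible * infected / population
        new_recoveries = recovery_rate * infected
        susceptible -= new_infections
        infected += new_infections - new_recoveries
        recovered += new_recoveries
        history.append((susceptible, infected, recovered))
    return history

history = simulate_sir()
# The three stocks always sum to the starting population: a conservation
# check students can make on any stock-and-flow model.
```

Because the flows only move value between stocks, the total population is conserved, which is exactly the kind of relationship a student can discover by clicking through the flows of a real InsightMaker model.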

Models are an excellent way to engage students: to get them to explore, to ask what-if questions, and to notice patterns. In public health, students can change the parameters of a specific disease, like the Zika virus. In economics, students can increase supply or demand. In engineering, students can work with wind resistance models.
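To make the economics case concrete, a supply-and-demand what-if can be as simple as two linear curves. The function name and coefficients below are invented for this sketch; the point is only that shifting one input and re-solving lets students notice the pattern themselves.

```python
# Tiny supply-and-demand sketch: two linear curves, illustrative coefficients.
# Demand: quantity = demand_intercept + demand_slope * price (slope negative).
# Supply: quantity = supply_intercept + supply_slope * price (slope positive).

def equilibrium(demand_intercept=100.0, demand_slope=-2.0,
                supply_intercept=10.0, supply_slope=1.0):
    """Solve demand = supply for the market-clearing price and quantity."""
    price = (demand_intercept - supply_intercept) / (supply_slope - demand_slope)
    quantity = supply_intercept + supply_slope * price
    return price, quantity

p0, q0 = equilibrium()                             # p0 == 30.0, q0 == 40.0
p1, q1 = equilibrium(demand_intercept=120.0)       # demand shifts outward
# With these curves, the outward demand shift raises both the equilibrium
# price and the equilibrium quantity.
```

A student who reruns this with different shifts can infer the law the lecture would otherwise simply state.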

With the LodeStar authoring tool, instructors can link to or embed an InsightMaker model. They can then insert a series of questions to check students’ understanding and provide feedback. The link below shows a simple example of the Isle Royale model and the Explore-Validate pattern.

www.lodestarlearning.com/samples/Isle_Royale_Mobile/index.htm

**Conclusion**

We have been listening to students. The way they describe their online learning experience seems pretty humdrum. Instructors don’t need to rely on publishers to create stimulating interactive lessons. They can take matters into their own hands with tools like InsightMaker. InsightMaker fulfills the *Explore* part of the activity. LodeStar fulfills the *Validate* phase.