The Problem with Simulations

Introduction:

As a student, I can be told about central tendency in statistics and the properties of a normal distribution.  I can memorize the difference between mean, mode, and median.  I might even do well on an exam if asked to calculate the standard deviation.  I could, by rote, follow the four steps and produce an accurate number – and yet have no real concept of variance, the significance of samples and sample sizes, or the complete picture of central tendency.  Or I could play with a simple simulation, as illustrated on the following website:

http://onlinestatbook.com/stat_sim/sampling_dist/index.html

In this simulation, I see the real population distribution and I see the output of mean, median, mode – and standard deviation.   I can see that I need a lot of samples before the low-frequency values far from the mean begin to appear.   Rather than being told something or memorizing a formula, I get to manipulate numbers and see the story of central tendency and variation play out.
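The same idea can be sketched in a few lines of code. The snippet below is a minimal, hypothetical stand-in for the online simulation (the population, sample sizes, and seed are arbitrary choices of mine, not taken from the site): it draws repeated samples from a skewed population and summarizes the sample means, so you can watch the spread of the sampling distribution shrink as the sample size grows.

```python
import random
import statistics

# A hypothetical skewed population of 10,000 values (illustrative only).
random.seed(42)
population = [random.expovariate(1 / 50) for _ in range(10_000)]

def sampling_distribution(sample_size, num_samples=1_000):
    """Draw repeated samples and return the center and spread of the sample means."""
    sample_means = [
        statistics.mean(random.sample(population, sample_size))
        for _ in range(num_samples)
    ]
    return statistics.mean(sample_means), statistics.stdev(sample_means)

# Larger samples -> the sample means cluster more tightly around the population mean.
for n in (5, 25, 100):
    center, spread = sampling_distribution(n)
    print(f"sample size {n:>3}: mean of sample means = {center:6.1f}, "
          f"spread (std. error) = {spread:5.1f}")
```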

I’ve moved from the lower level memorization of a definition and the lower level performance of a procedure to the higher level conceptual understanding that is so important in the field of statistics.

The problem with the term simulation

Perhaps my example doesn’t quite measure up to your idea of a simulation. That, indeed, is one problem associated with the concept of ‘simulation’.  We have one word that describes a wide range of things.  A simulation can be any number of things, ranging from this ‘smart’ animation of samples to an immersive virtual world, complete with body suits and head gear.  The Inuit reportedly had different words for snow, including “matsaaruti” for wet snow and “pukka” for powdery snow. Instructional designers have one inadequate term for a full range of activities:  the simulation.

But that’s not the only problem.

I’ll set out in this post to outline categories of simulations, champion their value, and help clear away some of the obstacles to their adoption.

The need for higher order strategies

Despite their value, simulations represent a very small percentage of online learning activities.  Many business, medical and engineering programs engage their students in simulations, but the ratio of simulation-based activities to all online learning is small.

More than ten years ago, researchers at Cornell University (Bell, 2008) cited a study that placed simulations at a ‘relatively small percentage (approximately 2-3%) of the total e-learning industry’.  The study states that the cost of producing simulations is high and that the effectiveness of simulations has received mixed reviews.  That’s the heart of the problem. The authors of the study suggest that “instructional designers are left with little guidance on how to develop an effective system because the factors that influence the effectiveness of simulation-based training remain unclear.”  Not a very promising start.

But, in my view, simulations are an important strategy for online instructors.  For online learning to have any significant impact on learning performance, we need instructors to be skilled at selecting strategies that promote higher order thinking.  Too much of online learning replicates the worst of the classroom experience, in which students passively receive a lecture and the interactive portion is relegated to a quiz.  There are significant alternatives – but they are not easy to implement.  The simulation, as a strategy, is the most challenging.

Simulations are effective because students enjoy engaging in simulations and being challenged to think.  Instructional designers often prescribe or design simulations to promote higher order thinking that helps students synthesize facts, concepts, principles, rules and procedures.

Educational psychologists recognize the value of simulations to promote cognitive complexity – which is the student’s ability to detect nuances and subtle differences that might impact their decisions or judgement.

In a meta-analysis conducted by Dr. Traci Sitzmann at the University of Colorado, computer-based simulation games promoted students’ retention of content, belief in their own capacity to complete the tasks, recall of facts, and procedural knowledge (Sitzmann, 2011).

But what is a simulation – and how can busy, online learning instructors leverage this strategy?

As mentioned, the term ‘simulation’ covers a broad range of activities – from the very simple, to the very sophisticated.  We’ve all seen the complex training simulators used in space and flight training.  A commercial aircraft simulator can run from ½ million to several million dollars.  Clearly outside of our budget.  We’re also familiar with high fidelity simulations in nursing.  They range from virtual reality systems to high fidelity mannequins.  These are two categories of simulations that require significant investment.  There are other types of simulations, however, that are simpler and affordable.  And they can positively impact every discipline.

A Range of Types

Under this umbrella of simple and affordable, we can include a range of simulation types.  In past articles, I’ve written about interactive case studies.  In interactive case studies, students are presented with a case and some resources. They have to do something as a result, such as create a business plan, solve a problem, uncover underlying issues…whatever.  In the past, I contributed to a team working on an interactive case study that involved assessing a student’s eligibility for credit for prior learning.

In decision-making scenarios (a type of interactive case study), a student is placed in a situation, must collect information, make a decision, and then evaluate that decision against the expert answer, which may come in the form of feedback from a coach or from the revealed consequence of the decision.  I’ve written about a decision-making scenario that placed the student in Abraham Lincoln’s shoes when southern states were threatening to secede.  As a student, you consult the same advisers whom Lincoln consulted.  You make a decision and then contrast it with what Lincoln actually did.  The whole idea behind this decision-making activity came from a professor of history at Tulane University.

Kognito  (https://kognito.com/) produces a wide variety of simulations for different audiences, including mental health professionals and school personnel.

One of their products educates faculty, staff, and students about mental health and suicide prevention.  In their simulations, the company employs a variety of strategies:  users interact in an environment made up of virtual characters and virtual settings.  The learners role-play by selecting the most appropriate thing to say in a simulated conversation.  Learners get immediate, personalized feedback as they engage in decision making within an interactive case study.

Another type of simulation involves students tweaking the values of parameters and seeing the result graphed.  For example, an Isle Royale simulation has students tweaking the initial number of wolves and moose on an island.  After the simulation is started, students watch the wolf and moose populations rise and fall until they settle into a pattern.  The InsightMaker site hosts thousands of this type of simulation.
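InsightMaker builds these models visually from stocks and flows, but the underlying loop can be sketched in a few lines. The following is a minimal discrete-time predator-prey sketch, not the actual Isle Royale model; the coefficients and starting populations are arbitrary values I chose only to produce the characteristic rise-and-fall pattern.

```python
# A minimal discrete-time predator-prey sketch (not the actual Isle Royale model).
# All coefficients are arbitrary, chosen only to produce the familiar oscillation.
def simulate(wolves=20.0, moose=500.0, years=50, dt=0.05):
    birth, predation, efficiency, death = 0.3, 0.01, 0.001, 0.2
    steps_per_year = round(1 / dt)
    history = []
    for step in range(years * steps_per_year):
        moose_change = (birth * moose - predation * moose * wolves) * dt
        wolf_change = (efficiency * moose * wolves - death * wolves) * dt
        moose += moose_change
        wolves += wolf_change
        if step % steps_per_year == 0:        # record once per simulated year
            history.append((round(moose), round(wolves)))
    return history

# Tweak the starting populations, as a student would, and watch the cycle repeat.
for year, (m, w) in enumerate(simulate(wolves=20, moose=500)):
    print(f"year {year:2}: moose = {m:4}, wolves = {w:3}")
```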

Another popular type of simulation is found in the interactions featured at the University of Colorado’s PhET site: https://phet.colorado.edu/.  Students learn concepts by changing parameters.  In learning about Ohm’s law, students can increase or decrease voltage, increase or decrease resistance, and then see the resulting amperage.  The display is highly visual, with all parts of Ohm’s Law graphically illustrated. Even the equation is illustrated, with parts that grow and shrink in size.
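The ‘change a parameter, watch the result’ idea behind these interactions is easy to picture in code. The fragment below is only an illustrative stand-in for the PhET interaction, with voltage and resistance values I picked arbitrarily; it simply recomputes the current from Ohm’s law each time a parameter changes.

```python
# Illustrative stand-in for a 'change a parameter, see the result' interaction.
def current_amps(voltage_volts: float, resistance_ohms: float) -> float:
    """Ohm's law: I = V / R."""
    return voltage_volts / resistance_ohms

# A student increases voltage, then increases resistance, and watches the amperage change.
for volts, ohms in [(4.5, 500), (9.0, 500), (9.0, 1000)]:
    milliamps = current_amps(volts, ohms) * 1000
    print(f"V = {volts:4.1f} V, R = {ohms:5} ohms  ->  I = {milliamps:5.1f} mA")
```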

 

Screenshot of a LodeStar activity with embedded InsightMaker SIR (infectious disease) model

A Range of Purpose

Simulations fulfill a range of purposes or functions.  The purposes aren’t mutually exclusive.  Simulations may involve one, several or all of these.

Functional or procedural simulations help learners perform a function in a given situation.   Software simulations, for example, require learners to perform tasks in the software environment.  Vehicle simulators and high fidelity mannequins require learners to do the right thing at the right time.

Conceptual simulations help learners view a concept in isolation and, in some cases, change the parameters, see the effect, and recognize the concept in action.  For example, in a simulation of predator-prey relationships, students see a unique pattern that always develops regardless of the initial number of predators or the initial number of prey.

Process Oriented simulations often include underlying mathematical models – mathematical representations of a real-world system. ‘What-if’ process simulations ask students to make a change to a process and see its outcome.  Students change inputs and immediately view outputs.

Synthesis Oriented simulations involve learners in gathering information, making observations, recalling key principles, concepts and facts and then putting it all together to make the appropriate choices.   Decision-making and interactive case studies are examples.

Behavior Oriented simulations engage students in the affective domain and require students to choose the appropriate behaviors and demonstrate the right attitude given a situation.  Choosing to recycle garbage or choosing to manage time are examples.

In short, types of simulations align nicely with types of knowledge.  Less important is the technology – virtual world versus two-dimensional animation versus text narrative – and more important is the behavioral and cognitive change.

By focusing on what is important and eliminating what is not important, we can pare away cost and remove one of the obstacles to using simulations in our curriculum.

Definition

It is difficult to sum up simulations in a single definition and so I offer these attributes.

An educational simulation:

  • Loosely or closely represents reality (low versus high fidelity)
  • Represents or models the behavior or characteristics of a system
  • Mimics the outcomes that happen in the natural world
  • Pares away unnecessary detail
  • Stimulates a response in the learner
  • Presents learners with a situation that causes them to think – that is, draw upon their knowledge and procedural and analytical skills to make decisions, to form hypotheses, to draw conclusions, to state rules or act in some way
  • Provides feedback

Under this broader definition, a disease model that shows a population that is susceptible to, infected by, and recovered from a disease is a simulation.  It is a particularly useful simulation if its underlying math and logic represent a real-world phenomenon – even if it is an over-simplification. It is also useful if it allows the student to change parameters of the model, such as population size, the number who are initially infected, and the proximity of members of the population, and then make inferences about the outcome.  In this way, the simulation invites learners to ask ‘what if’ questions.   The results of student input cause learners to think and, perhaps, draw their own conclusions about general rules and principles.  Changing parameters and running the simulation provides immediate feedback.
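For readers who want to see what sits under the hood of such a model, here is a minimal SIR (susceptible-infected-recovered) sketch. The contact rate, recovery rate, and population figures are illustrative assumptions, not values fitted to any real disease. Changing the population size, the number initially infected, or the contact rate and re-running it is exactly the kind of ‘what if’ experiment described above.

```python
# A minimal SIR (susceptible-infected-recovered) sketch with illustrative parameters.
def run_sir(population=1_000, initially_infected=1,
            contact_rate=0.3, recovery_rate=0.1, days=160):
    s = population - initially_infected
    i = float(initially_infected)
    r = 0.0
    results = []
    for day in range(days):
        new_infections = contact_rate * s * i / population
        new_recoveries = recovery_rate * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        if day % 20 == 0:                      # sample the curve every 20 days
            results.append((day, round(s), round(i), round(r)))
    return results

# 'What if' experiment: change the parameters, re-run, and compare the outcomes.
for day, s, i, r in run_sir(population=1_000, initially_infected=1, contact_rate=0.3):
    print(f"day {day:3}: susceptible = {s:4}, infected = {i:4}, recovered = {r:4}")
```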

General attributes that make simulations an effective learning strategy

From a meta-analysis (Cook, 2013)  focused on simulations involving virtual worlds, high fidelity mannequins, and even human cadavers, we learn about the positive effects of key learning strategies including:

  • range of difficulty
  • repetitive practice
  • multiple learning strategies
  • individualized learning
  • feedback
  • longer time

In short, students benefit from interactions that vary in difficulty, present opportunities for repeated practice, engage them in different ways, adapt to student performance and confidence level, give them time, and, importantly, provide meaningful feedback.  Those are useful characteristics of any eLearning.

Much of eLearning doesn’t include any of these characteristics — not one!   A lot of eLearning is built on voice over PowerPoints that have been imported into an eLearning authoring tool.  The feedback is limited to a score on a final quiz.  More finessed eLearning comes in the form of talking head videos with chapter quizzes.  Many of the learning platforms that allow instructors to market their courses don’t even bother with the import of interactive learning objects.  They support video and audio files and PDFs – that is, presentation formats, not interaction formats.

By necessity, the corporate world relies on voice-over PowerPoints.  High-end eLearning development shops bristle at the prospect of creating a voice-over PowerPoint.  They are often engaged in making highly creative learning objects that impact a lot of employees and yield a high return on the investment.  When I worked for these companies, we developed six figure learning objects that would reduce service calls, for example, and save a company tens of thousands of dollars or cut down on the use of natural gas, to cite another example,  and save a utility tens of thousands of dollars.  But the economics don’t always support such high-cost investments.  The continuing education industry for medical and accounting professionals, for example, is characterized by literally thousands of voice-over PowerPoints.  These industries change so fast.  The demand far outpaces our ability to create quality learning experiences.

Instructors may recognize or accept that simulations are important, but don’t know where to begin. Obviously, building a half-million dollar simulator is out-of-reach, but there is something that instructors can do to make use of this strategy.  The next section is dedicated to some practical suggestions.

Simulation tools

There are a number of web sites that provide free authoring, hosting, and viewing of simulations. One of my favorite cloud-based simulation tools is InsightMaker. (https://insightmaker.com) InsightMaker supports a variety of different simulation types.  Instructors can build their own simulations and models or use one of thousands that have been created across many disciplines. I want to emphasize that last point.  You will be able to find a simulation that you can use – but it may take a little patience and perseverance.

In biology, an instructor can find simulations on food chains, predator/prey population dynamics, and much more.  In business, one might find sales forecasting or marketing simulations.

In ecology, an instructor can simulate the tipping effect of climate change, in which shrinking ice caps accelerate warming because open water absorbs radiation rather than reflecting it.   Students can change the values of parameters and see the change accelerate.
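Models like this are built in InsightMaker from stocks and flows, but the reinforcing loop itself is small enough to sketch directly. Every number below is invented for illustration; the point is only the feedback structure: less ice means more absorbed radiation, which means more warming, which melts more ice.

```python
# A toy ice-albedo feedback sketch. All numbers are invented for illustration;
# the point is the reinforcing loop: less ice -> more absorbed radiation -> warmer -> less ice.
def run_feedback(ice_cover=0.6, warming_per_absorption=0.05, melt_per_degree=0.02, years=40):
    temperature_anomaly = 0.0
    for year in range(years):
        absorbed = 1.0 - ice_cover            # open water absorbs, ice reflects
        temperature_anomaly += warming_per_absorption * absorbed
        ice_cover = max(0.0, ice_cover - melt_per_degree * temperature_anomaly)
        if year % 5 == 0:
            print(f"year {year:2}: ice cover = {ice_cover:4.2f}, "
                  f"temperature anomaly = {temperature_anomaly:4.2f}")

# Students could lower the starting ice cover and watch the change accelerate.
run_feedback(ice_cover=0.6)
```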

Here are other sites and examples worth investigating:

https://blog.cathy-moore.com/resources/elearning-samples/

https://phet.colorado.edu/en/simulations/category/chemistry

https://elearning.cpp.edu/learning-objects/organic-chemistry/tlc/

 

And for the engineer:

https://www.youtube.com/watch?v=iOmqgewj5XI

https://www.youtube.com/watch?v=CFwrfoyRE6c

 

Conclusion

There are a number of ways to get started using simulations.  Finding simulation websites is one; finding cloud-based modelling tools is another.

There are a lot of elements to a simulation.   The authors of the Cornell study suggest that all too often we focus on the technology of simulation rather than on the critical educational elements that are found in the content, the level of immersion (fidelity related to the real world), the interaction, and communication.  The cost is strongly associated with the design and the production of content – the imagery, music, the interface, etc.  The interaction, however, may be accomplished relatively inexpensively with text narratives and decision-making (supported by authoring tools).  The last element, communication, can certainly be facilitated through the learning management system discussion board or group discussion in the classroom.  If we can study these elements discretely and evaluate their impact on learning, as instructional designers, we can separate high cost artwork and media production (that may have little instructional value) from low-cost instructional strategies that provide great value in terms of learning outcomes.

References

Bell, B. S., Kanar, A. M. & Kozlowski, S. W. J. (2008). Current issues and future directions in simulation-based training (CAHRS Working Paper #08-13). Ithaca, NY: Cornell University, School of Industrial and Labor Relations, Center for Advanced Human Resource Studies. http://digitalcommons.ilr.cornell.edu/cahrswp/492

Sitzmann, T. (2011). A meta-analytic examination of the instructional effectiveness of computer-based simulation games. Personnel Psychology.

Cook, D. A., Hamstra, S. J., Brydges, R., Zendejas, B., Szostek, J. H., Wang, A. T., Erwin, P. J., & Hatala, R. (2013). Comparative effectiveness of instructional design features in simulation-based education: Systematic review and meta-analysis. Medical Teacher, 35(1), e867–e898. doi: 10.3109/0142159X.2012.714886. PMID: 22938677.

 

Postscript: A Proposed Low-Fidelity, Low-Cost Simulation

On a personal note, for the last several years I’ve been thinking about low-cost simulations that pay high dividends in terms of student outcomes.  As mentioned, I’ve written about decision-making scenarios and interactive case studies.

My latest experiment has been with a model that I call a State Response Engine (SRE).  In the future I hope to write extensively about it.  Briefly, SRE presents the learner with a randomized state and requires the appropriate response.

To better understand SRE, let’s imagine this eLearning activity.  The learner is an online instructor.  The situation is that the college dean has presented the instructor with a set of learning goals.  The online instructor must follow the appropriate process in order to select, develop and evaluate activities and assessments that will align to the goal and help students achieve that goal.

The random state comes in the form of a specific student audience and learning goal.  The engine (the computer program) selects an audience (e.g., non-majors versus majors, or freshmen versus capstone students).  From that point forward, all of the learner’s responses and any future random states relate to that first choice.  If the computer chooses senior students completing their capstone, all of the future states relate to senior students.  All of the resources that appear relate to senior students.  The learners can then investigate the resources for key situational factors.  The engine then randomly selects a learning goal.  The goal might involve the capstone students in promoting conceptual knowledge or putting it all together – but a goal of recalling some basic facts and figures would not be in the selection pool.

The engine then displays resources connected to the state and options in the form of learner responses.  Some of the options or choices would be valid regardless of the goal and student audience.  Others would be valid only for a specific type of knowledge or a class of learner.

The learner progresses through phases or categories.  The phases might be specific stages in a process or something else.  In this case, the phases relate to recognizing situational factors, developing objectives, designing assessments, and designing activities.  In short, a backward design process.  Some of the response options will be correct; others will be incorrect based on the randomly chosen state.   At every stage, learners will be shown links to resources that will help them make the right decisions.  After learners have chosen what they judge to be the right responses, they submit for evaluation.  They then receive a response-by-response critique and an overall score.
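Because the engine is easier to show than to describe, here is a highly simplified, hypothetical sketch of the idea. The audiences, goals, and response options are placeholders of my own, not actual SRE content; the sketch captures only the core loop: pick a random state, present options, and score the responses against that state.

```python
import random

# A highly simplified, hypothetical sketch of a State Response Engine (SRE).
# Audiences, goals, and options are placeholders, not real course content.
AUDIENCES = ["non-majors", "majors", "freshmen", "capstone seniors"]
GOALS_BY_AUDIENCE = {
    "capstone seniors": ["promote conceptual knowledge", "synthesize prior learning"],
    "freshmen": ["recall foundational facts", "apply basic procedures"],
    "non-majors": ["recall foundational facts", "promote conceptual knowledge"],
    "majors": ["apply basic procedures", "promote conceptual knowledge"],
}

# Each option records the goals for which it is a valid response ("*" = always valid).
OPTIONS = {
    "Review situational factors for this audience": {"*"},
    "Design a synthesis project with peer review": {"synthesize prior learning",
                                                    "promote conceptual knowledge"},
    "Build an auto-graded recall quiz": {"recall foundational facts"},
    "Write step-by-step practice exercises": {"apply basic procedures"},
}

def new_state():
    """Randomly select the state: a student audience and a goal that fits that audience."""
    audience = random.choice(AUDIENCES)
    goal = random.choice(GOALS_BY_AUDIENCE[audience])
    return audience, goal

def evaluate(goal, chosen):
    """Return a response-by-response critique and an overall score for the chosen options."""
    critique = {option: ("valid" if "*" in OPTIONS[option] or goal in OPTIONS[option]
                         else "not aligned to this goal")
                for option in chosen}
    score = sum(verdict == "valid" for verdict in critique.values()) / len(chosen)
    return critique, score

audience, goal = new_state()
print(f"State: audience = {audience}, goal = {goal}")
chosen = ["Review situational factors for this audience",
          "Build an auto-graded recall quiz"]
critique, score = evaluate(goal, chosen)
for option, verdict in critique.items():
    print(f" - {option}: {verdict}")
print(f"Overall score: {score:.0%}")
```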

That’s it in a nutshell.  It may or may not be a useful arrow in the instructor’s quiver, but we must continue to search for low-cost high-yield strategies that promote higher-order thinking.  I’ll continue in this pursuit and celebrate other attempts to create effective strategies.


Online Learning Trends: Risks and Opportunities

Introduction:

Our web journal focuses on specific instructional design strategies for online learning.  But in this post, I step back and address something much more fundamental – and at risk.

Online learning has tremendous potential.  I am encouraged by faculty who really want to do a great job in their online courses and continuously strive to do better.  Chances are very good that you are in that group.  You are taking the time to read this blog and explore new ways of engaging students.

Next month I’m retiring from my position as Director of the Center for Online Learning at a state university. This gives me occasion to reflect on the eight years I’ve served in this role and on current trends.  As the trends indicate, the immediate future presents faculty with both risks and opportunities.  Faculty who are invested in quality online learning should think about the immediate future very carefully and help direct policy and best practices at their institutions that advance the state of teaching and learning in this relatively new medium.

Online learning can be an instrument of good.  But because of its technological nature, it is susceptible to scale, mechanization, and bad practice. At risk, at the very least, are the autonomy and self-determination of faculty.

In our university, faculty make the critical decisions related to their courses.  They are free to make choices related to activities, assessments, instructional materials, teaching methods, and course support.  When faculty are free to decide and exercise that freedom, individually and collectively, they exercise self-determination.  With self-determination come the leveraging of faculty strengths and the recognition of their own limitations, responsibility for decisions, and substantial personal reward for success.  Self-determination means faculty can apply their competency and effect positive change in their students.

Risks to self-determination may appear in many forms.  Today, a few of the potential sources of risk include:

  • Highly competitive and large-scale online programs that discourage or eliminate fledgling entrants
  • A billion-dollar Online Program Management industry that can dictate the design of courses from entrance requirements to curriculum and course design
  • Turn-key publisher platforms that demote the decision-making of instructors

Competition, Online Program Management (OPM), and publisher resources are not inherently bad things.  I view them as risks only when they subvert faculty control. OPMs, for example,  have successfully ramped up online programs and built university enrollment.  Publisher platforms have provided course content and resources where, perhaps,  none existed.  Each of these trends, however, does impact faculty self-determination and needs to be carefully considered.

 


Photo Credit: Christina Morillo,  Creative Commons CC0 1.0 Universal Public Domain

 

The Nature of Change

The nature of change in online learning can be misleading.

Many changes in this space get hyped and then disregarded when they don’t achieve immediate, high impact.  But, then, over time they have profound, long-lasting impact.  The MOOC is a good example.  2012 was the hype year.  2013 was the year of disillusionment.   Today, MOOCs are a vital enrollment strategy for many universities.

(See https://en.wikipedia.org/wiki/Hype_cycle for a definition of this phenomenon.)

In a somewhat related manner, many of the changes in the last decade happened incrementally without cataclysmic impact and disruption.  And yet eLearning is in a very different place today because of them.

The Recent Past

It is eye-opening to consider just a few things that the past decade has brought us.  I’ve intentionally omitted a deeper discussion of many things such as Virtual Reality, Augmented Reality, eBooks, artificial intelligence, and so much more.  I’m sticking to a few basic things that have had a profound impact on just about everyone.

Online enrollments have steadily increased

The Babson Survey Research Group showed us year after year that distance education enrollments continued to grow, even as overall higher education enrollments declined. Today, nationally, nearly a third of all higher education enrollments are online. (Seaman, Allen & Seaman, 2018).

(For more on the Babson reports see: https://www.onlinelearningsurvey.com/highered.html)

At our state university, nearly a third of our credits are earned by students in fully online classes.  More than forty percent of the credits are earned in either online or hybrid classes.  Most of our students take at least one online class each year.

Over the past eight years, online enrollments kept climbing as did the perception of faculty that online courses were qualitatively on par with face-to-face courses.  As more faculty became engaged in online learning, perceptions changed in favor of online learning.

Today, imagine the negative impact on your university if online enrollments were removed overnight.

Tools have become cloud-based

In addition to online enrollment increases, most of our tools today have become cloud-based.  Our IT department, in a metaphoric sense, is spread across many for-profit companies that host our learning management system, media system, collaboration tools, office applications, remote proctoring, and more.  What you won’t easily find is a cloud-based service for improving teaching and learning experiences for your own students.  Universities will need to keep online pedagogy/andragogy in their wheelhouse of expertise.

(See article that recognizes shift away from technology-focused professional development to pedagogical-focused:  https://www.insidehighered.com/digital-learning/article/2018/02/28/centers-teaching-and-learning-serve-hub-improving-teaching)

Accessibility, Mobility and Interoperability have become critical

In the past decade, legislation and compassion have demanded that we pay greater attention to accessibility for all students, including those who are visually and hearing impaired.  Our courses play on mobile devices and are adaptable to smart phones, tablets, and desktop computers.  Cloud-based services talk to one another.  The learning management system survived obsolescence by partnering with other service providers.   Our university learning management system, because of integrations with other providers, can display media from a library, check originality of student papers, remotely proctor, engage students in a discussion over a PowerPoint, and perform other services that are not innate to the platform.

It is a different world – and yet it didn’t seem to change overnight or particularly startle anyone with its abruptness.  It didn’t feel like an eruption or disruption.

The Near Future

Current trends suggest that the future won’t be any different.  It will change incrementally, but one day instructors will wonder what happened!  Related to faculty autonomy and self-determination, specifically, here are some of the critical market forces faculty should observe:

Market dominance

The annual Babson report tells us that nearly half of online students are served by five percent of higher education institutions.   Only 47 universities enroll almost one-quarter of fully online students.  Those universities will presumably have the resources to reinvest in curriculum development, instructional design, enrollment management and aggressive digital marketing.  Smaller institutions and new entrants to the marketplace may be forced out or forced to partner with each other and with external organizations in order to compete.  The challenge to faculty comes with a perceived gap between well-resourced and under-resourced programs, unnatural alliances and forced partnerships.

On a side note, the encouraging news for smaller public universities is that the majority of online students take at least one course on campus.  Most online students come from within 50 miles of campus.  Distance education is local, which means that the university can cultivate relationships with partnering two-year colleges, local employers, and community groups, and market through both traditional and digital methods.

In short there is hope for smaller institutions – but only if the following are diligently and vigorously supported:

  • Strong faculty support for online development, both pedagogically and technically (instructional designers, instructional technologists, learning management specialists)
  • Strong student support (orientations, mentoring, advising, tutoring, high impact practices like first year seminar and electronic portfolio)
  • Integrated, team-based approaches to enrollment management, marketing, advising, online program development and professional development.
  • Communities of practice that encourage faculty to share best practices with one another and especially with other members of their discipline

In my opinion, the days of working in silos are numbered.  If programs are developed without market analysis and attention to enrollment/communication strategies from the start, they will not compete and will not be available to faculty and students in the future.

Instructional Design Support

In the past, the tide of instructional design has ebbed and flowed.  Today and toward the future, it is cresting.  A quick scan of Indeed.com will convince you of that. The best programs now have a phalanx of instructional designers.  My chats with educational leaders have underscored the fact that instructional designers provide university programs with a competitive advantage.

The Online Learning Consortium (OLC) reports that as online learning has grown there has been an equivalent increase in demand for instructional designers in higher education institutions (Barrett, 2016).

(To learn more about OLC and the evolving field of instructional design, visit https://olc-wordpress-assets.s3.amazonaws.com/uploads/2018/07/Instructional-Design-in-Higher-Education-Defining-an-Evolving-Field.pdf)

Fulfilling that demand has not been consistent across universities.   In a recent survey, fewer than half of those who taught online said they had worked with an instructional designer.  The following article provides one interesting approach to sizing the number of designers to the institution.

https://www.insidehighered.com/blogs/technology-and-learning/many-instructional-designers-librarians

In my opinion, we typically don’t have enough instructional designers. Designers play a critical role in helping faculty match instructional strategies to the level and type of learning, and can draw from a tool chest of techniques, applications, methods, and evidence-based practices.  A recent survey of instructional designers, cited by OLC, showed that 87% of respondents have master’s degrees and 32% have doctoral degrees.  Most higher education instructional designers provide faculty with direct support in design and professional development (Intentional Futures, 2016).  The result is increased student performance and satisfaction, as evidenced by research studies on specific practices.

At our university, through extensive professional development we saw a growing body of faculty adopt the skill set of instructional designers.  We saw faculty who could critically evaluate online courses and discuss issues of course alignment, integrated course design, accessibility, student engagement and many of the issues that concern instructional designers and make a difference to students.

In the past, in instructional design and other areas of online learning, higher ed institutions failed to build their core competence.  Several sources identify the number of instructional designers employed by colleges and universities as 13,000. But, as the report from the Online Learning Consortium states, “There is still a certain mystery surrounding who instructional designers are.”

In short, instructional designers in a good relationship with faculty will strengthen the faculty’s ability to make good decisions and produce a good, impactful course.  Over time, faculty who design and develop online courses should acquire many of the skills of an instructional designer.  That can happen through seminars and workshops and communities of practice, learning circles, brown bag lunch sessions – all of it sponsored by faculty groups and the centers focused on faculty development and online learning.

Online Program Management

Wherever we have failed to build our core competence, external providers are ready to flood in and assist us at great cost to the university.

One category of external provider is the online program management company.  Online Program Management companies (OPMs) provide expertise and services in instructional design, enrollment management, digital marketing and other areas in support of online learning.  They provide the support through a number of revenue-sharing mechanisms.  An online program manager, for example, might help plan a program, design courses, produce courses and manage enrollment and marketing.  In exchange for these services, the Online Program Management company might receive revenue equivalent to 40 to 60 percent of the tuition dollars earned from the program for a contracted number of years.  A typical number is 10 years.
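To make the economics concrete, here is a back-of-the-envelope calculation. The program size and tuition figures are purely hypothetical; only the share percentage and contract length echo the ranges mentioned above.

```python
# Back-of-the-envelope OPM revenue-share calculation. All numbers are hypothetical.
students_per_year = 200
tuition_per_student = 15_000          # dollars
revenue_share = 0.50                  # within the 40-60 percent range cited above
contract_years = 10

annual_tuition = students_per_year * tuition_per_student
annual_to_opm = annual_tuition * revenue_share
total_to_opm = annual_to_opm * contract_years

print(f"Annual tuition revenue:       ${annual_tuition:,.0f}")
print(f"Annual share paid to OPM:     ${annual_to_opm:,.0f}")
print(f"Paid to OPM over {contract_years} years:   ${total_to_opm:,.0f}")
```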

The following Eliterate article estimates that 27 companies currently provide Online Program Management.

https://eliterate.us/online-program-management-market-landscape-s2018/

The alternative is the external provider that offers a needed service for a fee.  For example, if the university is weak in digital marketing, an external fee-for-service organization can help. In this arrangement, the university pays the fee up front but keeps the tuition revenue.  A growing number of companies provide services and then recover the fees through tuition revenue sharing – but only until the initial costs are covered.

Faculty need to be aware of all of these flavors of services because faculty are invested in the future of the university and it’s their autonomy that is at stake.

One of the founders of the original Online Program Management companies (but who now has a vested interest in a different business model) describes a growing dissatisfaction with the OPM revenue-sharing model:

“He compared revenue-share OPMs to the businesses in the early 2000s that built websites for millions of dollars. At the time, they were the only people who knew how to do it, but as more workers learned HTML, these companies went from ‘very valuable to pretty much out of business’ in a very short span, he said.”

Inside Higher Ed, 2018

 

According to Inside Higher Ed, the bottom line is one that all faculty should recognize:

“To launch a successful online degree, institutions need expertise in instructional design, must be skilled in identifying areas where there is student demand, and must have enough funds to develop and market the program, which several sources said could cost upward of $1 million each.”

 

https://www.insidehighered.com/digital-learning/article/2018/06/04/shakeout-coming-online-program-management-companies

Publisher Platforms

Business analysts predict that the US digital education publishing market will register a compound annual growth rate of close to 12% by 2023. (Research And Markets, 2019) The digital education business is a huge and growing market.

Online faculty can choose to use digital publisher resources for part or all of their courses.  Textbooks often come with a publisher-based online learning platform where students can engage with course material.  In many cases, the publisher platform is integrated with the university learning management system.  Students log in to their university online course and seamlessly connect to the publisher resources without a second login and, in many cases, with no awareness that they are accessing the publisher platform. In some cases, the reverse is true.

Key players in the U.S. digital education publishing market are Cengage Learning, Inc., Houghton Mifflin, McGraw-Hill Education, and Pearson.

The upside to publisher platforms is that they save instructors time and that publishers are continuously improving their offerings, which, in some cases, include adaptive learning.  (McGraw Hill’s LearnSmart, for example.)  The downside is that, for some platforms, answers to quizzes and solutions to problems are discoverable on sites that students use in order to cheat on their assignments and exams.

The more insidious downside to publisher platforms is that they can lead instructors to acquiesce to all of the critical design decisions of a course.  In some, hopefully rare, cases instructors substitute publisher PowerPoints for their own advance organizers, explanations, guiding questions, graphical illustrations, and materials that are contextualized for the specific circumstances of the students, program, and environment.

As one online program manager cautions:  “Never allow publisher-made materials to be the meat of your course!“

Learning House

Adaptive Learning

Adaptive Learning has huge potential and should be continuously monitored and repeatedly evaluated – but again, the role of the faculty member should be carefully considered.

Contrasted with traditional Learning Management System content, adaptive learning is not a ‘one size fits all’ product.  Typically, we structure topics within a learning management system in a sequence.  All students, regardless of knowledge, experience, or ability, move through the same sequence.  Adaptive learning, in contrast, assesses students on what they know and what they need to learn.  Students then view or engage in the content that they need.  If students miss items or lack confidence, the adaptive system connects them to the appropriate prerequisite skills.
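The routing logic at the heart of such a system can be sketched very simply. The topics, prerequisite map, and mastery threshold below are hypothetical placeholders, and commercial systems use far more sophisticated models, but the basic ‘assess, then branch to what is needed’ idea looks something like this.

```python
# A hypothetical sketch of adaptive routing: assess first, then branch to what is needed.
# Topics, prerequisites, and the mastery threshold are invented placeholders.
PREREQUISITES = {
    "fractions": [],
    "linear equations": ["fractions"],
    "quadratic equations": ["linear equations"],
}
MASTERY_THRESHOLD = 0.8

def next_topics(target, assessment_scores, plan=None):
    """Return the ordered list of topics a student still needs before reaching the target."""
    plan = [] if plan is None else plan
    for prerequisite in PREREQUISITES[target]:
        next_topics(prerequisite, assessment_scores, plan)
    if assessment_scores.get(target, 0.0) < MASTERY_THRESHOLD and target not in plan:
        plan.append(target)
    return plan

# One student tests out of fractions but needs the rest; another needs everything.
print(next_topics("quadratic equations", {"fractions": 0.9, "linear equations": 0.5}))
print(next_topics("quadratic equations", {}))
```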

Adaptive Learning solutions are available in a variety of forms.  For one, they are available as turnkey systems.  McGraw Hill’s ALEKS is a popular product that assesses and teaches math subjects that range from pre-algebra to calculus.  They are also available as open platforms in which an instructor or department can build content and sequence learning pathways that capture the prerequisite relationships between topics.  Examples of open adaptive learning systems include Acrobatiq, CogBooks, and BrightSpace LeaP™.  Many of these platforms can be integrated with learning management systems through an interoperability standard called LTI (Learning Tools Interoperability).

(For a glimpse into adaptive learning, visit: https://campustechnology.com/articles/2019/04/24/new-frontiers-of-adaptive-learning.aspx)

Once the adaptive system has been designed/adopted and deployed, faculty need training on how to facilitate a group of students who are progressing at their own pace but still need the academic and social support of their peers and instructor.  There are many design decisions related to how an adaptive system dovetails into a course – and faculty need to be at the center of that decision-making.

Open Educational Resources (OER)

Open Educational Resources are already impacting us in so many ways.  You might be surprised to hear faculty denounce open textbooks, for example, and yet find them in your book store.  Faculty can engage with OER on so many levels.  They can find open resources cataloged in dozens of repositories such as OER Commons (https://www.oercommons.org/) and Merlot (https://www.merlot.org/merlot/).  They can purchase completely assembled OER-based courses from, ironically, publishers who earn more from their digital platforms than from underwriting and maintaining original content.  They can use repositories like OpenStax (https://openstax.org/) to find complete textbooks or sign up for a free account in OpenStax CNX (https://cnx.org/), which gives granular access to open material at the page and module level.  Finally, faculty can participate in the creation of OER by creating content, assessments, learning objects, and supplementary material and posting them to a repository.  In our state, we’ve just launched Opendora (http://www.opendora.com/), which houses materials created by MinnState faculty.  Faculty can also participate in textbook reviews.   In other words, faculty can engage in the use of OER in many ways before even considering authoring a book and making their intellectual property freely available.

Conclusion

Current trends and practices offer support to faculty, but also have the potential of rendering instructors passive bystanders in their own courses.  The online learning space is becoming more competitive and expensive.  To many, this seems counter-intuitive. After all, online learning should be opening up new markets and it should be cheaper.  Universities can decrease their physical footprint!

The reality is that universities will either invest internally in multifaceted teams in support of strategic program development or pay outsiders to design, build and market online programs.  Potentially, instructors could be supported or sidelined.   We will either invest in instructors populating adaptive systems or purchase off-the-shelf solutions that may not, in the end, be well adapted to our learners.  We will either support rich curriculum development or populate online courses with publisher materials and, in the end, pass on the cost to students.   We will either use OER in new ways of engaging students or purchase turn-key solutions built entirely on OER.

Faculty have the greatest stake in the future direction of the university and the impact of these key trends.  Their own autonomy and academic freedom are at stake.  Faculty need to be aware of the issues and be present wherever decisions that impact curriculum development are made.

References:

Michael Feldstein’s Blog (industry observer) eLiterate
https://eliterate.us/

Phil Hill’s Blog (industry observer)
https://philonedtech.com

Wil Thalheimer’s Debunker Club (research to practice)
https://debunker.club/debunking-resources/

Online Learning Consortium
https://onlinelearningconsortium.org/

Inside Higher Ed
https://www.insidehighered.com/digital-learning/views/2018/04/04/are-we-giving-online-students-education-all-nuance-and-complexity

Publishing Market Research
https://www.researchandmarkets.com/reports/4764929/digital-education-publishing-market-in-the-us?utm_source=CI&utm_medium=PressRelease&utm_code=4lszwc&utm_campaign=1237781+-+US+Digital+Education+Publishing+Market+Report+2019+-+Increasing+Number+of+E-Learning+Enrolments+in+the+Higher+Education+Sector&utm_exec=joca220prd

The Challenge of Online Learning is Challenge

Introduction: Placing Students at the Center

We know the capacity of good movies to stimulate our curiosity, make us alert, shock us, and tug on our emotions.  Movies are carefully scripted to evoke those audience experiences and maintain our interest.  The script and the production can’t be crafted without careful attention to their impact on the audience.

In higher education, we ought to be thinking about eLearning and our impact on students in a similar way but, evidently, we don’t.  We ought to be thinking about student emotion, curiosity, and motivation. Instead, we focus on content and appeal to the student’s sense of order and stability.  We reduce surprise and meet expectations.  In a somewhat cynical sense, it is a transaction.  Students need to know what to expect so they can plan their time, block out their schedules, and reduce the risk of failure. If they do these things correctly and put in the time, they get credit.

And so, we design accordingly.  As instructors, we introduce ourselves.  We then focus on good housekeeping.  We tell students about the goals and objectives.  We tell them what they are expected to do.  Each assignment has a clear due date and a point value.  And so on.

We think that good eLearning should have few surprises.  Good eLearning should meet all of the criteria of a good housekeeping rubric.  Objectives are stated.  They are measurable.  They align to activities and assessments.  All materials support the goals.  Technical support is identified.  And so on.

And it’s not that these things aren’t important.  It’s just that our rubrics don’t probe the depth of student experience in an online course.  In one quality review rubric, the learning activities occupy only one section.  The goal of active learning is but one essential standard out of many.

Some of our thought leaders decry the present state of online learning.  M. D. Merrill called it shovel-ware.  Michael Allen calls it boring.   Cathy Moore calls it an information dump.  Will Thalheimer says that ‘eLearning has had a reputation for being boring and ineffective at the same time it is wildly hyped by vendors and eLearning evangelists.’

(https://www.worklearning.com/wp-content/uploads/2017/10/Does-eLearning-Work-Full-Research-Report-FINAL2.pdf)

But more and more of higher education is being delivered online.  Students demand it.  So then, what is the remedy to this boring, ineffective, information dump hyped by eLearning evangelists like me?

The critics give us the answers – if we would only listen.  Michael Allen proposes learner challenge as a source of motivation and interest.  M. David Merrill, in his first principles of instruction, puts learner problem solving at the center.  Others have written about discovery or inquiry or active learning or constructivism.  All of these things put the learner at the center, and not the content.  As Michael Allen says, ‘Content is not king.’  No, content is not king; it’s not even prince.

Will Thalheimer, based on his extensive research, writes:

In general, providing learners with realistic decision making and authentic tasks, providing feedback on these activities, and spreading repetitions of these activities over time produces large benefits.

https://www.worklearning.com/wp-content/uploads/2017/10/Does-eLearning-Work-Full-Research-Report-FINAL2.pdf

So, what can we do as instructors to think in terms of student experiences, active learning, problem-based learning, emotion, curiosity, surprise, novelty, realistic decision making, authentic tasks, constructive feedback, repetition – and all of the things that place learners at the center of our design rather than content?

One answer lies in challenge.

Mystery Skull Interactive Challenge

The screenshot below represents an activity that engages students with a challenge designed by the Smithsonian Institution.  In the activity, students drag skulls into boxes, rotate them, and try to identify the species of each skull.  When stumped, students can ask for hints.

The same content without the challenge is covered in hundreds of courses.  You can picture the familiar old pattern. You can anticipate that there would be a topic named ‘Homo habilis’.  The topic would feature several paragraphs of text, perhaps with pictures, that describe the distinguishing features of this species. Homo erectus would be dutifully covered and then on to Homo neanderthalensis and Homo sapiens.  The instructor might even link to the Smithsonian activity.

What if, instead, the instructor designed the course with the challenge at the beginning or at the metaphorical center of the course?  The course content would serve as a resource to help students master the challenge.  In the challenge, students examine and compare skulls.  When stumped, they consult the hints – and look up resources.   In this scenario, students play an investigative role.  They are immediately challenged and immersed in the heart of the course. Their natural curiosity is piqued.  They experience the ‘pain’ of failure when they make incorrect choices.  Their imaginations are stirred as they role-play the scientist.

http://humanorigins.si.edu/evidence/human-fossils/mystery-skull-interactive

It may be difficult to imagine instructors designing challenges. The Smithsonian Institution obviously had a budget.  It is evident in the media. The skulls are in 3D.  Students can rotate them.  The interface is beautiful.  The learning object was done in Flash, which took some scripting.

Let’s set aside the media production for a moment. (We’ll return to that in the conclusion.)  Let’s focus first on the value of challenge.  If we had the resources and the creativity to design our courses around a challenge, is there value in that?


Screenshot of the Smithsonian Institution Mystery Skull Interactive

 

CCAF Model

Michael Allen would say there is tremendous value in challenge – and challenge is an important element in his CCAF model of design.  CCAF represents context, challenge, activity and feedback.

The CCAF model should be a source of inspiration to instructors who design online courses.  In brief, CCAF is where the fun and, forgive me, the challenge of instructional design begins.  Let’s explore CCAF for a minute.

Context

What motivates your students?  What is the situation in which they will be able to use and apply the learning?  Your course isn’t some abstract, impractical thing.  It has relevance.  It has meaning.  It will impact your students’ lives.  It will make a difference. Imagine the context in which these things are true.  Imagine the setting that makes the course material relevant, interesting, appealing, and life-like.

Dr. Linda Rening, an instructional designer, writes “What would the learner see, feel, and experience while he or she performed the correct behaviors?”

In nursing, we might place students in the context of a surgery or an outpatient’s home.  In managerial accounting, we might place students in an organization and ask them to gather and analyze information in support of the organization’s strategic goals.   In history, we might place students in Britain in the 1930s, faced with the choice between appeasement and aggression.

Challenge

‘Challenge’ turns traditional course design on its ear.   As I’ve said, many courses follow a tired pattern.  That pattern invites this prescription:  provide housekeeping details, state objectives, present content and assign readings, elicit performance, provide meaningful feedback (sometimes), and assess.

A ‘Challenge’ activity engages students immediately in thinking about the course content and using it in some way – perhaps unsuccessfully at first.  When developed artfully and skillfully, the ‘Challenge’ will immediately cause the learner to ‘feel’ the relevance of the course material and recognize the difference between what they know and don’t know. They will feel pain.   If the ‘Challenge’ presents too little pain, the student may develop a false sense of security about what they know.  If too much pain, the student may be scared off.

Challenge addresses all aspects of motivation.  Challenge causes students to act – to solve problems, to make decisions, to consult resources, etc.  The right level of challenge causes students to persist.  They engage because the challenge is not too easy and not too difficult.  It’s the Goldilocks engagement.  Just the right level.  Finally, Challenge engages students with a level of intensity.  That vigorous engagement promotes retention.  The things we work harder on are those that are remembered.

In a course on public leadership, the challenge might be to write a testimony in support of a provision in a legislative bill.  Students would have to draw upon their knowledge of history, the law, the public sentiment, and other things in support of their testimony.  In computer science, the challenge might be to write code to perform a task in as few lines as possible.   In law enforcement, students might play the role of a parole officer who needs to assess risk of recidivism without offending the client.

In all of these cases, when students are challenged early in the course, they might recognize what they don’t know and be more open to learning.

Activity

The challenge connects to activities that students must do to increase their level of knowledge and skill.  In a sense, the challenge provides enough cognitive dissonance to motivate the learner to learn.  Cognitive dissonance is a perceived inconsistency between what the learner knows and ought to know to realize the course outcomes.  That difference leads to discomfort that the learner is motivated to reduce.  Too much dissonance can be debilitating.  The right amount is motivating.

Loosely, Dr. Allen’s CCAF model is like M. D. Merrill’s ‘First Principles of Instruction’, which begins with the principle that learning is promoted when learners are engaged in solving real-world problems.   The models are similar in that they are problem-centered. The challenge causes learners to pull in knowledge as needed.  Enough cognitive dissonance is created to motivate learners to seek the resources that will lessen the discomfort of ‘not knowing’.  This is quite different from, and an improvement over, a presentation event of instruction. ‘Presentation’ suggests a push of relevant information to students, much like Robert Gagne’s ‘Present the content’ event of instruction.

In sharp contrast, many of our courses are content-centered and not challenge- or problem-centered.  Many of our online courses start the same.  Students meet the online instructor through some form of announcement and then get promptly led to the course housekeeping documents that spell out course title, instructor contact info, course prerequisites, description, objectives, reading list, etc.   Another section or document may spell out what is due at what time and for what number of points.  The transaction. Then there’s the boilerplate technical and disability support information and so on.

Before even reaching course content, students have run the gauntlet of course housekeeping information.  One horrific development is that occasionally students will run the gauntlet only to find in the main course content a series of publisher PowerPoints, interspersed with quizzes and a major project.

A cynic might look upon online courses as entirely transactional.  Students will say ‘if I do the work, I’ll get the grade.’  Or ‘I take online courses because it’s convenient.’   Today’s students balance work, life, family, and … school.  They understand that a certificate or a degree will lead to a job or better pay.  They will exchange their time and effort for earned credits.  They will accrue enough credits to graduate.  And then they will redeem that time and effort in the form of credits for a job or better pay.

Unlike the cynic, our inner instructional designer says that we can create a better learning experience for students online.  Why better?   We can individualize instruction.  We can programmatically add instruction that will help learners overcome obstacles.   We can challenge students in a ‘safe’ environment where their lack of knowledge isn’t exposed.  We can encourage students to take chances without the risk of embarrassment.

Will Thalheimer writes that online and hybrid courses often outperform traditional face-to-face courses.  He asserts that it’s not the modality of learning that makes the difference.  It is the teaching and learning methods used in the course.

The bottom line is that eLearning in the real world tends to outperform classroom instruction because eLearning programs tend to utilize more effective learning methods than classroom instruction, which still tends to rely on relatively ineffective lectures as the prime instructional method.  (Thalheimer)

Feedback

Feedback that is detailed and specific and directly related to the learner’s action is a critical element to any learning and particularly important in online instruction.  In the CCAF model, feedback is best when it can be applied to future actions.   Generally, in online learning, students benefit when they receive feedback from one quiz, project or activity that they can apply to the next.   In a CCAF challenge, students act and then receive guiding feedback that will help them with future actions.

So how do I apply CCAF to my course?

Briefly, a strategically placed challenge toward the beginning of the course might provide the level of motivation that causes students to act, to persist, and to work toward their goals with intensity.  This challenge might be in the form of a case study, or a decision-making scenario or an analysis.

Early challenges can be holistic and realistic.  By holistic, I mean that they resemble life itself and bring together the entire scope of the course in terms of facts, principles, rules, concepts, and problems.   Developmental challenges can be more focused on some subset of the course – building skill that can later be applied to the holistic challenge.  Students should get better with repeated attempts as they draw upon the course content.  Realistic challenges should help students transfer skills from the classroom to the real world.

Returning to the Fossil Challenge

Granted, the Smithsonian Institution Fossil Challenge is both engaging and beautifully designed.   But its real value is in getting students to think.  Low-tech alternatives can engage students just as well.

All instructors with a little ingenuity have the tools available to them.  With the following skills, one can incorporate challenges into learning management system content pages:

  • Creative storytelling
  • Editing and importing images
  • Editing and importing audio
  • Creating hyperlinks
  • Embedding Web 2.0 content from cloud-based applications

 

Instead of rotating objects, for example, instructors can embed a SlideShare viewer into their course or creatively display a series of photos in a film strip.

In the example below, still photos were dropped into a PowerPoint Online template, saved to a public OneDrive folder,  and then embedded into a course.

 


Screenshot of PowerPoint Online Template, which can be saved to a OneDrive public folder and embedded into a content page

 

Conclusion

Context, Challenge, Activity, and Feedback are all critical to motivating and effective online learning.  Online instructors can reorganize their courses with a contextually relevant challenge at the center, complete with activities and feedback to build student skill.  Most of the ingenuity is in the storytelling, the setting up of scenarios, and enabling students to make choices.    If challenges cause students to think and motivate them to learn more, then the online course will be effective and students will benefit.

 

Aligning Strategies to Types of Knowledge

The challenge:

Cross-country skiers use one type of wax for all conditions.  After all, snow is snow.

That statement is obviously absurd.  Snow varies in age and moisture.  Waxes behave differently given the temperature.  Skiers have different objectives; they may want their skis to grip or glide or both.

Similarly, instruction is instruction.  Course content dictates what must be taught and how.  Again, obviously absurd – but perhaps not so ‘obvious’.  Instructional strategies should vary based on the students, the situational factors, and the level of learning and type of knowledge represented by the learning outcome.  Instructors need different waxes and techniques based on the conditions.

Successful online course design requires a fundamental shift from instructors being content-centric to being aware of the snow, the temperature and the outcomes.

When I attended university, there was little attention to the pedagogy of instruction.  Snow was snow.  It was up to the student to work out the strategies for success.  The professors were knowledgeable and inspiring.  The best of them provided coherent and sometimes fascinating lectures.  History teachers could conjure up the 1905 Winter Palace revolution; biology teachers got animated over the hermaphroditic activity of earthworms; and so on.  The stage was set, but it was up to us to make sense of the lectures, strategize on how to understand, remember, and recall the pertinent information, and perform well on the assignments and exams.  It was college.  It was expected.

Given that more than one-quarter of the students drop out of college after their freshman year, clearly something isn’t working.  The reasons might be primarily social and financial, but they certainly include the academic.  Students who don’t have the strategies to learn in a university environment get academically disconnected very quickly.

Online learning doesn’t inherently help the situation.  In fact, it might accelerate a student’s problems.  Online faculty find it more challenging than on-campus instructors to facilitate genuine discourse among students and to engage students with the subject matter.

Online faculty also find it more challenging to gauge how their students are doing.  Faculty don’t get the immediate feedback from students online that they do in the classroom.  Is this content reaching students?  Is it going over their heads?  What questions are they having?  That immediacy doesn’t inherently exist in an asynchronous online environment.

It is therefore more critical than ever to take a teaching and learning approach to online instruction.  By ‘teaching and learning’, I mean that we need to understand the component skills (Ambrose, Bridges, Lovett, DiPietro and Norman) that we are trying to develop in our students.  We need to understand what type of knowledge those skills require and what strategies are best matched to the types of knowledge (Smith, Ragan).  What level of learning are we hoping our students will achieve?  Are they to remember key facts, understand important concepts, apply their learning to new situations?  Are we trying to promote retention of information or application of knowledge in novel situations?  What precisely are we trying to do?

Online environments, because of their remoteness, require that students practice and perform.  They require that students receive periodic feedback – feedback that they can apply to future assignments.  So, rather than one high stakes test, an online course might include multiple assignments that help the students develop in stages.

Instructors may need to be analytical about the course content.  What levels of learning are involved: remembering, understanding, applying, analyzing, evaluating, creating?  What types of knowledge: declarative, conceptual, procedural, attitudinal, and/or strategic?  What strategies will promote that knowledge?

This post provides a simple example based on photography.  The art and science of taking good photographs involves many types of knowledge and thereby invites different instructional strategies to help students acquire that knowledge.  Hopefully, you’ve taken pictures and enjoy looking at photographs.  Some simple technical elements are introduced in this example.  Many people will recognize them.  But for those who don’t, I’ll provide a short explanation along the way.

[Image: camera diagram labeling the aperture and shutter]

Declarative knowledge

The camera diagram above presents two labels:  aperture and shutter.   Both of these things feature prominently in the making of photographs.  At a declarative knowledge level, students should be able to identify an aperture and shutter, given an illustration.  This alone, however, is unlikely to be the end goal of instruction.  Labeling parts of an engine doesn’t mean you can fix an engine.  Labeling parts of a camera doesn’t mean that you take creative photographs.  The ability to label is a ‘stepping stone’ type of objective – but a necessary stepping stone to understanding the concepts of exposure and depth of field and the use of those concepts in the composition of a photograph.

Depending on our design, we might choose to focus first on labels and definitions and then on concepts.  Or we might choose to deal with concepts and definitions concurrently, in a more integrated manner.   In the former, we might choose to reduce the cognitive load on students so as not to overwhelm them.  In the latter, we might want to show the immediate relevance of these things to a conceptual understanding.  It obviously depends on the students and the context.  Those are decisions that an instructor is in the best position to make.

Whether or not we tackle declarative and conceptual knowledge as discrete instructional steps, we must recognize that they are separate.

A student demonstrates declarative knowledge when s/he can point to the opening in a camera lens and identify it as an aperture, or see an illustration of a shutter and identify it as such.  When a student can define an aperture as a controllable, variable opening in a lens, or a shutter as a device that lets light pass through for a precise length of time, then the student is demonstrating declarative knowledge.  In fact, the student can include these terms in organized discourse that makes the student appear very knowledgeable.  Use ‘focal plane shutter’ in a sentence.  It sounds quite technical.

But organized discourse might be quite misleading.  A student may have no knowledge of the underlying concept of exposure or of how to use aperture and shutter strategically to solve a composition or exposure problem.   An assessment that requires students to label and define things or use the terms in an essay might only be assessing declarative knowledge.  Again, probably not the end goal.

 

Conceptual knowledge

As mentioned, both aperture and shutter relate to the concept of exposure.  Exposure, in photography, is the amount of light that reaches a digital camera sensor or the light-sensitive crystals on film.  Controlling exposure with aperture and shutter is a balancing act.  The larger the lens opening (aperture), the shorter the time the shutter should open (shutter speed) to achieve proper exposure.  The smaller the lens opening, the longer the time the shutter should open.    If the shutter were opened for too long without a balancing small aperture, or for too short a time without a balancing large aperture, then the picture would be over- or underexposed.   That is the concept of exposure and its related measure: exposure value.  It can be understood mathematically as EV = log2(N²/t), where N is the f-number (aperture) and t is the shutter time in seconds, or metaphorically as a balancing seesaw.  In either case the understanding is a conceptual understanding.
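To make the formula concrete, here is a worked example of my own (the camera settings are illustrative, not drawn from the original text).  At f/8 and a shutter time of 1/60 of a second, EV = log2(8² ÷ (1/60)) = log2(3840) ≈ 11.9.  Open the aperture two stops to f/4, and holding the same exposure value requires a shutter time roughly four times shorter – about 1/250 of a second.  That trade-off is the seesaw in action.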

The strategies for declarative and conceptual knowledge will be different.  For declarative knowledge, we might help students relate to what they already know: the pupil of an eye for aperture; a window shutter for a camera shutter.  We’ll also come up with strategies for retention and retrieval.  For conceptual knowledge, we might use the analogy of a seesaw or have students craft an equation that requires an increase in one variable to offset the decrease in another.

In addition to exposure, aperture and shutter speed have a significant impact on the composition of a photograph.  The larger the aperture, the smaller the depth of field, which means that objects in the foreground and background will be blurrier.  The slower the shutter speed, the blurrier moving objects will be.  If you want to focus on a goldenrod and blur out the plants in the foreground and background (left), you choose a large aperture.  If you want to focus on a single branch and blur the background (top right), you choose a large aperture.  If you want to sweep across pines against the sunset and essentially paint with light, you set the shutter to a very slow speed (bottom left) and prevent overexposure with a very small aperture.


Compositions with aperture and shutter speed.  Left and top right images show small depth of field.  Bottom right image shows slow shutter speed.

Students can be presented with photographs that show these concepts in play.  They can be asked to guess at the aperture and shutter speed settings.  They will look for exposure and blurriness in the foreground, background and subject.  This is analysis.  The types of knowledge (declarative and conceptual) now interrelate with remembering terms, understanding concepts, applying concepts and analyzing.  We remember the definition of aperture; we understand that exposure value is a relationship of aperture to shutter speed; we apply our knowledge of aperture to blur a background; and we analyze a photograph for evidence of camera settings.  In short, levels of learning (Bloom’s taxonomy) intersect with types of knowledge.  Richard Mayer wrote about this in Applying the Science of Learning.  Patricia Smith and Tillman Ragan wrote about this in Instructional Design, as did many after them.

Strategic knowledge

Declarative knowledge supports conceptual knowledge, which supports strategic knowledge.  We might present the students with a composition problem that can only be solved by using aperture and shutter speed strategically.  Perhaps it is an unsolvable problem that requires yet another element:  film speed (sensitivity).

Some instructors might choose to begin with a composition problem – requiring students to work backwards to the underlying concepts and underlying declarative knowledge.  Some instructors will combine types of knowledge and reveal the interrelationships of things sooner rather than later.  Whatever the overall strategy, a clear awareness of types of knowledge will help in the instructional design.

 Conclusion

When instructors think about the component skills, levels of learning, types of knowledge and all of the factors that will impact students acquiring, assimilating and applying new knowledge, they are practicing instructional design.  Instructional design places the learner, rather than the content, at the center of focus.  Intentional instructional design promotes better courses and increases the probability that students will be successfully engaged in achieving the course outcomes.

 

References

Smith, P. L., & Ragan, T. J. (2005). Instructional design. Hoboken, NJ: J. Wiley & Sons.

Ambrose, S. A., Bridges, M. W., DiPietro, M., Lovett, M. C., & Norman, M. K. (2010). How learning works: Seven research-based principles for smart teaching. San Francisco, CA: Jossey-Bass.

Mayer, R. E. (2011). Applying the science of learning. Boston, MA: Pearson/Allyn & Bacon.

 

 

 

Interactive Case Studies: First Steps

Introduction

The complaint against eLearning is all too common:  eLearning applications are boring page turners.  The implication is that students flip through the material, learn enough to pass the exam and move on.  The experience is transactional, not transformational.  No behavioral change.  No cognitive change.

Interactive case studies are one strategy to remedy the problem – but, frankly, they are a bit of a challenge to create.  In past articles, I’ve introduced some of the research that supports the use of case studies, and I introduced interactive fiction as a way of getting started.  If you haven’t read those posts, no matter: I’ll introduce a new example in this article and then move on to ‘first steps’.

Interactive Case Studies aren’t a recent tech fad.  The example that I cite in this post dates back to 2006, but it is as relevant today as it was then.  The strategy stands the test of time.  More importantly, the ‘interactive’ nature of the case study is easy to reproduce technically.  I chose this example because it demonstrates that even the simplest approaches can be effective.

The example is taken from case studies that were created in the Department of Rheumatology, School of Medicine, University of Birmingham.   Thirty interactive case studies were created altogether.  The following is a description of one of them.  There are several critical points illustrated by this example.  Hopefully, they will motivate you to take the first steps in creating your own case study.

Background

The authors developed an interactive learning tool for teaching rheumatology. Their reason for doing so is best explained in their own words:

“Problem solving and decision analysis are essential skills for medical students and practitioners alike. The existing medical curriculum requires that medical students have a large factual knowledge base, and as such teaching has traditionally been through lectures and rote memorization paying little attention to nurturing key problem-solving skills.” 1


1. Wilson, S., Goodall, J. E., Ambrosini, G., Carruthers, D. M., Chan, H., Ong, S. G., Gordon, C., & Young, S. P. Rheumatology, Volume 45, Issue 9, 1 September 2006, Pages 1158–1161, https://doi.org/10.1093/rheumatology/kel077

 

Description of case studies

The rheumatology cases are short, reducing the burden on both authors and students.  In the graphical user interface, button clicks bring up resources.

The skill required to place buttons or hyperlinks on a web page is minimal.  Many authoring tools (Adobe Captivate, Articulate Storyline, and LodeStar) provide the ability to connect pages through button clicks or links.  Alternatively, you can partner with your computer science, technical communications or web design department and request a student who knows HTML and is comfortable with some basic JavaScript coding. (JavaScript is a popular scripting language that is commonly taught in schools.)

In the rheumatology case studies, buttons link to a physician letter, or a library that provides a range of background information.
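For instructors working directly in a web page rather than an authoring tool, the button-to-resource pattern takes only a few lines of JavaScript.  The sketch below is my own illustration of the idea, not code from the Birmingham case studies; the element ids, and the assumption that the resource panel starts out hidden, are mine.

```javascript
// A minimal sketch: clicking a button reveals (or hides) a resource panel,
// such as a physician letter. Assumes the page contains a button with
// id="physician-letter-button" and a div with id="physician-letter-panel"
// that starts out hidden (style="display:none"). These ids are hypothetical.
document.addEventListener('DOMContentLoaded', function () {
  var letterButton = document.getElementById('physician-letter-button');
  var letterPanel = document.getElementById('physician-letter-panel');

  letterButton.addEventListener('click', function () {
    var isHidden = letterPanel.style.display === 'none' || letterPanel.style.display === '';
    letterPanel.style.display = isHidden ? 'block' : 'none';
  });
});
```

The same show-and-hide pattern is all that a resource button of this kind requires, whether the resource is a letter, a lab report or a library page.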

Screenshot of Case Study, features links to resources

Rheumatology Case Study: Department of Rheumatology, School of Medicine, University of Birmingham

As pictured in the screenshot above, students can request patient details, ask questions, examine the patient, order tests and so forth.  In thinking about how you might replicate this in your own course, you should know that this is relatively simple to produce.  Patient details can be listed on a web page, contributing to the complete picture the student needs in order to make a diagnosis.

In the case study, the student navigates through a series of screens, each providing critical clinical information.

The user can order tests, but they come with a ‘real-world’ consequence:  a financial cost is incurred that gets tallied by the program.  This type of thing requires some simple JavaScript coding.   The costs are assigned to a variable that is shared by all pages.  If you wish to avoid that technical hurdle, you can state the cost of a test and still make an impact on the learner.
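As a rough sketch of how a cost tally might be shared across pages (my own example under my own assumptions, not the authors’ code), the browser’s sessionStorage can hold a running total: each test page adds its cost, and the diagnosis page reads the sum.  The key name and dollar amounts below are hypothetical.

```javascript
// A minimal sketch: tally the cost of ordered tests across pages with sessionStorage.
// 'caseStudyCost' is a hypothetical key; the amounts are illustrative only.
function addTestCost(amount) {
  var total = parseFloat(sessionStorage.getItem('caseStudyCost')) || 0;
  total += amount;
  sessionStorage.setItem('caseStudyCost', String(total));
  return total;
}

function getTotalCost() {
  return parseFloat(sessionStorage.getItem('caseStudyCost')) || 0;
}

// Example: the learner orders two tests on different pages.
addTestCost(25);   // cost of the first test
addTestCost(120);  // cost of the second test
console.log('Total spent so far: $' + getTotalCost());
```

The diagnosis page can then display the total alongside the feedback, so the learner sees the financial consequence of every test ordered.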

After the student collects information on the patient, s/he makes a diagnosis and prescribes treatment.  After completing the case study, the student is provided with feedback and a tally of the expenditures.

Formative Assessment

Students also take an assessment in the form of multiple-choice questions that test their knowledge about rheumatology. The student can repeat components that match missed test items.

Summative Assessment

Undergraduate students were asked to produce a written report based on one of the clinical cases.

“As part of this assessment the students were expected to:

Apply their investigative skills to diagnose a range of clinical rheumatological conditions.

Explain the use of a range of clinical and scientific investigations that are required to make a successful diagnosis.

Reports were marked by two independent rheumatologists according to the reporting of the approximately 30 pieces of information or actions relevant to the case, which they were able to find, and the explanation of how these were used in the diagnosis and treatment. Student marks ranged from 55 to 95% with the majority of students gaining 70% or more on their assignment reports. All students achieved both the learning outcomes, indicating the usefulness of the approach used.”

Student satisfaction

In a survey, twenty-eight undergraduate students out of thirty-one responded positively to the interactive case study.  Only thirty-eight out of fifty-three graduate students found the program useful enough to use in the future.  The sharp difference between undergraduate and graduate students may be attributed to access that students had to the case studies.  Graduate students were restricted to one case study.

“Both groups agreed that the program was well organized and clear, the cases were of appropriate difficulty (complexity), that it was realistic and that they had learnt from it.”

Now Your Turn

This and past posts have made it evident that interactive case studies can be useful.  But given your time and technology constraints, how can you create your own case studies?

To get started writing interactive case studies, follow these suggestions:

  1. Consider patterning your first case study on one that is offered through Open Educational Resources (OER). In most cases, the author has thought through the case study and has done the hard work of including just enough detail to make the case educative and realistic.
  2. Keep it simple. Use button clicks or hyperlinks to enable students to navigate through the case or bring up resources.
  3. Include an analysis activity that requires the learner to consider the ‘evidence’ of the case and offer an opinion.
  4. Include the ability for the learner to compare his/her analysis with that of an expert or peers.
  5. Use the case study to prompt discussion.

Authoring an interactive case study might be a challenge at first.  It’s a bit like creative writing – crafting a story that reveals critical information at the right time.  Terse, yet engaging.  Focused on one important requirement: the case study must help the student achieve an outcome.

Interactive Case Studies require ingenuity, time and a little technical know-how.  To help faculty and instructional designers get started, I offer a simplified method.  The intent is to get students immersed in the story, drawing upon their knowledge to choose paths, make decisions, offer an analysis and share with other students.

Interactive case studies can offer lots of bells and whistles.  In contrast, this is a simplified approach – more like an interactive story or a choose-your-own adventure.  Our inspiration came from a finance professor at our university.  We started with an Open Educational Resource titled Personal Finance by Rachel Siegel, which our finance professor selected.

An important side note:  Personal Finance is now in its 3rd edition and is available from Flatworld.  Flatworld’s stated mission is: “We are rewriting the rules of textbook economics to make textbooks affordable again.”

Personal Finance begins with the story of Bryon and Tomika, a young couple who are currently in school and plan to get married soon.  Both will earn at least $30,000 in their first jobs after graduation and will likely double their salaries in fifteen years – but they are worried about the economy and about their job prospects after graduation.  They have critical decisions to make to secure their financial future.

Rachel Siegel follows the case study with questions that the young couple or a financial advisor should answer about their situation.  She then proceeds to outline the macro and micro factors that affect thinking about finances.

Set a Learning Goal

Before getting to work on patterning an interactive case study on the story in the text, we need to be clear on the learning goal.  You shouldn’t start any eLearning development without a clear goal in mind. You need to answer what learning outcomes the learner will achieve by engaging in the case study.   Rachel Siegel’s intent was to use the case study to make a point: there are a lot of factors to consider.  Our goal in the example was to use the case study to help the learner identify the macro and micro factors that affect finances.  In other words, we narrowed the scope.

Find an existing case study

In an effort to keep things simple, we patterned a case study closely on the one well thought out and communicated by the author.  This might help you get started.  Find a case study narrative in your own field and pattern your own case study closely on that one.  If it’s a good case study it will be short but feature enough detail to provide interest and a learning situation.

Case studies are found in open education resources (eBooks, PDFs, learning repositories like Merlot.org).  They are also found in case study websites like:

Science Cases

AMA Case Studies

But be careful.  Case studies are usually copyrighted.  Seek permission or ensure that the case study or text is offered under a Creative Commons license.

So, to recap, the first step is to set a learning goal.  The second step is to find a case study that already exists in the literature or on the web that you can pattern your case study on.

The design of our interactive case study is to provide the reader with a story (closely patterned on the text) and challenge the reader to determine which factors threaten the financial health of the characters.

This is a stepping stone case.  It is not a ‘putting it all together’ case, in which there are numerous factors, no clear-cut right and wrong answers, and plenty of room for interpretation.   In our case study, the learner is presented with the facts; parts of the story are revealed based on learner choice; and, at the conclusion, the learner answers some objective questions, performs an analysis, submits the analysis and compares his/her submission to the ‘expert’ analysis, which is revealed.   Alternatively, the end product could have been an analysis that was submitted to a drop box or to an online discussion.

How we built it

First, we didn’t use Rachel Siegel’s story – but one closely based on it.  As an easy first step, you have the option of converting an existing case study into an interactive case study or creating a derivative case study that changes some critical details to challenge the learner.  If there is any doubt about the legality or the ethics of copying the intellectual property, please contact the owner of the creative work.

Once we chose our characters, we licensed images to match the characters.  Alternatively, you can use Wikimedia or find images licensed under Creative Commons.    I did a search at https://search.creativecommons.org/  and immediately found images of student couples that I could easily have used.

On the first page, we provided instructions.  The instructions tell learners about the built-in notepad and the transcript button.   These aren’t necessary.  Students can take notes in any way they prefer.  The transcript button shows a report of all of the feedback – but these items are more for the convenience of the learner.  Traditional methods are just as effective.

Screenshot of instructions and first introduction to Chris and Divya, the characters in the case study.

Provide Choice of Paths

Put the learner in control.   The characters Chris and Divya share a lot of personal financial details.   The readers (learners) of the interactive story can decide what details they want to read and pay attention to.  As depicted in the screen below, the reader can read details from Chris’ or Divya’s background or decide at any time to assess their situation.  The reader will obviously not provide an accurate or meaningful analysis until s/he reads most or all of the facts of the case.

The essential thing here is choice.  Adult learners like choice.  The more complicated the case study, the more that choice matters.   Given the objective, the answers to some questions will be important to pursue; others will be irrelevant.

Depending on the software that the interactive case study author chooses, choices can be presented as hyperlinks, buttons or hot spots.

If the author were using a PDF or HTML authoring tool (like Word or PowerPoint), then choices can be presented to the reader as hyperlinks.  If the author were using Captivate or StoryLine, the choices can be presented as hot spots (clickable areas).  In LodeStar the choices can be presented as hyperlinks or buttons.
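In plain HTML and JavaScript the pattern is equally small.  The sketch below is my own illustration (the class names, data attributes and ids are hypothetical): each choice link reveals one section of the story and records that the learner has read it.

```javascript
// A minimal sketch: choice links reveal story sections and record which ones were read.
// Assumes links like <a class="story-choice" data-section="chris-background">...</a>
// and matching hidden divs. The names are hypothetical placeholders.
var sectionsRead = [];

document.querySelectorAll('a.story-choice').forEach(function (link) {
  link.addEventListener('click', function (event) {
    event.preventDefault();                              // stay on the current page
    var sectionId = link.getAttribute('data-section');
    document.getElementById(sectionId).style.display = 'block';  // reveal the chosen details
    if (sectionsRead.indexOf(sectionId) === -1) {
      sectionsRead.push(sectionId);                      // remember what the learner has seen
    }
  });
});
```

Keeping track of what has been read is optional, but it lets the course gently prompt learners who attempt the final analysis before exploring most of the details.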

Screenshot that shows choices presented to the reader as hyperlinks.

 

Make Resources Available

In the case study, one of the important financial factors comes from economic data. Economic data is represented as a resource that is always available to the reader.

In the screenshot above, economic data can be accessed with a button click.  The button is visible on the screen at all times.

(In some of our more evolved case studies, resources are shown only when they are needed. Some behind-the-scenes scripting allows us to show the right resources at the right time.)

Performance and Feedback

At any point in the case study, the learner can opt to complete the assessment of the characters’ financial situation.  A link to the final analysis exists on every page.  It can also be presented as a permanent button on the screen.  This is akin to a supermarket: the shopper can go down any aisle in any order and check out whenever he or she is ready.

Screenshot featuring Objective questions

In this case the ‘checkout’ is the final analysis.  It can be presented as a series of multiple-choice questions (objective), an essay question, or both.  In our example, we chose both.

The objective items provide quick feedback.  The essay item comes up after the learner clicks on ‘Complete your analysis’.  The essay question reads as follows:

Write a very short report in the space below.  Include the macro and micro factors that are likely to contribute to Chris and Divya’s financial security and what factors represent a possible threat to their security.  Click on the ‘Submit’ button when you are done.  You can always amend your report and re-submit.

The learner can consult his or her notes and base an opinion on the facts.  The learner can cite the case study to support the findings.  When the learner clicks on the ‘submit’ button, the expert analysis is revealed.
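Technically, the reveal can be as modest as the sketch below – my own illustration with hypothetical ids, not the actual course code.  The learner’s text stays on screen so it can be compared with the expert’s analysis.

```javascript
// A minimal sketch: when the learner submits the essay, reveal the expert analysis.
// Assumes a textarea (id="learner-analysis"), a submit button (id="submit-analysis"),
// and a hidden div holding the expert analysis (id="expert-analysis"). Ids are hypothetical.
document.getElementById('submit-analysis').addEventListener('click', function () {
  var learnerText = document.getElementById('learner-analysis').value;
  if (learnerText.trim().length === 0) {
    alert('Please write your analysis before clicking Submit.');
    return;
  }
  document.getElementById('expert-analysis').style.display = 'block';  // reveal for comparison
});
```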

At our university, the student may see immediate feedback, or they may be asked to copy and paste their analysis into a discussion post or into an assignment folder text field (a drop box that allows text entries as well as attachments).

Both of these options can be supported with the same basic technical know-how used in the rest of the case study.   If an eLearning authoring system like LodeStar were used, the essay submission could appear in the SCORM report of the learning management system.

Screenshot showing a prompt for an open-ended analysis based on the case.

Conclusion

The interactive case study presented here relies more on storytelling than on technical know-how.  In this type of case study, learners can choose the information that they wish to read, or ask the questions that they choose to ask.  In response to their decisions, information is revealed that will be used in the final analysis.

The final analysis can include objective test questions or essay items or both.  In a simple low-tech situation, the learner can write the essay in Word and then submit that to a drop box or assignment folder.  The interactive case study simply provides the background information and the essay prompts.

Low-tech or high-tech, the learner is asked to examine the information and consider its importance in the final analysis related to the learning outcome.  The learner is being asked to ‘activate’ thinking rather than mindlessly store and retrieve content.  The result is better outcomes for students.

 

Online Instructors and Design Thinking

Introduction:

For me, the excitement of building online courses comes from the power of design.  I love the idea of designing with intention.  Perhaps that is why I’m drawn to the work of Frank Lloyd Wright, Apple Computer, MIT Media Lab, modern architecture and, as you read in my last post, art galleries.   When faculty treat online courses less as the assembly of course documents and more as the product of thoughtful design, students benefit.

Stanford’s d.school (design school), with its origins in mechanical engineering, may seem like an odd source of inspiration for instructors who design online courses.  However, it turns out to be not only inspirational but quite practical.

d.school is the fountainhead of Design Thinking.  Design Thinking helps us to apply human-centered techniques to solve problems in a creative way.  It is used to make art, design products, solve business problems – and even to create online courses.

 


The Five Steps of Design Thinking: Empathize, Define, Ideate, Prototype, Test

 

What is Design Thinking?

Stripped down to its essentials, Design Thinking requires empathy – it requires us, for example, to ask who our current or prospective students really are, what they need, what drives them, what they know, and what their constraints are.

Secondly, it requires definition.  After information gathering on student needs, program needs and employer needs, what is the problem that the course is intended to solve?  What will the students be able to do that they haven’t been able to do before – cognitively, physically, emotionally?

Thirdly, it requires idea generation.  What are all the strategies available to help students master a type of knowledge or skill at a particular level to a defined degree of success?

Fourthly, it requires playing around with ideas – sketching on white boards or on paper.

Finally, we need to test the usability and effectiveness of our ideas on real people.

That is Design Thinking in a nutshell:  Empathize, Define, Ideate, Prototype, and Test.

 

Design Thinking and Instructional Design

For the last several years, instructional designers have written about Design Thinking and its interrelationship with various traditional and agile design approaches. Corporations have used it in building user-friendly products that meet needs.  But the benefits of Design Thinking and even of Instructional Design have bypassed online learning instructors.  Why?

For one, online instructors can be fiercely independent.  They are the subject matter experts – the content experts.  Of more than two thousand faculty members who responded to Inside Higher Ed’s 2017 Survey of Faculty Attitudes on Technology, only 25 percent said they have worked with an instructional designer to develop or revise an online course.  That is a very low number but not completely unexpected.  Jean Dimeo, in her article ‘Why Instructional Designers Are Underutilized’, cites possible reasons why:

  • Faculty are busy
  • Institutions have few or no instructional designers and/or learning support personnel
  • Instructors may be unaware that instructional design services exist
  • Faculty don’t want to be told how to teach

 

Design Thinking Applied

In a Design Thinking approach, with the help of an instructional designer, faculty don’t need to develop a course alone.  At our university, we have a conference space surrounded by whiteboards.  Our training space is clad in whiteboards.  The instructor can invite colleagues, and we can invite team members who understand design, the technology, and the media – and who can help get things done.

Empathize

The instructor, with some help, can gather background information on the students, the curriculum, the program goals, the employer and community needs, and whatever else will drive the curriculum.  A large part of this is human factors.  The table of contents of a textbook may not be the best place to start.  Understanding the learner is a much better starting point.  Dee Fink describes this as shifting the center from content to the learner.  José Antonio Bowen describes this as finding the entry point – that is, the student entry point.  Instructors already know and love their content; but how will students first be introduced… and hooked?

Define

The definition phase is like holding a magnifying glass to paper on a sunny day.  It is where something as broad and diffuse as goals, aspirations, needs, and requirements sharpens to a focal point.  The course author focuses on the objectives of the course, or the problem that must be solved, or the task that students must master.  M. David Merrill, in his First Principles of Instruction, places the problem at the center of everything.  The activation of prior knowledge, the presentation of new information, the practice and feedback, the application of knowledge outside of the course, and so on are all centered on the definition of the objective, task or problem.  This is tricky work.  Most of our less stellar efforts can be traced to poor definition of what the course or module or learning object needed to do.

Ideate and Prototype

After this hard work, the fun begins.  The whiteboards come alive with ideas and quick prototype sketches.  Instructors can benefit from folks who really understand the breadth of strategies that can help students achieve an outcome.  In our conference space, we talk about everything from journals to electronic portfolios, VoiceThreads, interactive case studies, simulations, electronic books, OER, publisher resources – whatever it takes.  The challenge is to find strategies that help students with a certain level of learning (apply, analyze, synthesize, etc.), a type of knowledge (procedural, problem-based, conceptual, etc.) and a degree of mastery.  Is this something we are introducing, reinforcing or, indeed, mastering?  Do we involve students in discussion?  Does the instructor model a practice?  Does she observe student performance and provide feedback?  Do students interact with the content – check their mastery, build their skills?  Faculty may have one or two favorite strategies.  Centers for Online Learning, Centers for Teaching and Learning (CTL), Centers for Teaching Excellence, eTeaching Services, innovation labs – whatever they are called – have a much deeper tool chest to choose from than the individual instructor.  Seeking their help is a critical first step.

Test

These ideas then need to be tested.  We can design websites or interactive content and theorize how effectively students will use them.   The validation comes with the testing.  We can simply observe students interacting with course elements.  We can assess students for performance and survey them for attitudes.  We can do simple control and experiment group comparisons.  The scope and effort will vary but we need some form of validation and feedback before faculty commit to full development of the project.  A recent faculty project featured a very long survey.  It is one thing to anticipate and imagine the wear on students after many minutes of survey taking; it is quite another to observe students complete a long survey.

The First Step

The first step for some faculty can be to seek out their institution’s instructional designers.  Many professionals with different titles play the instructional designer role.  In some places, instructional technologists, learning management specialists or curriculum specialists may be instructional designers. As mentioned, they also live in places with different names.  Seek out the places with all of the whiteboards. Finding the instructional designers may lead to finding other professionals who can help with idea generation.  Oftentimes, the instructional designers can bring the right people together.

Faculty can begin with Define and Ideate.  An instructional designer and her colleagues can help them sharpen objectives and brainstorm strategies that help students achieve the outcome.  Think of it as just hanging out with people and brainstorming, with two very important requirements: faculty must do their homework, and they must supply critical background information.

Next Steps

From there, faculty can engage the instructional design team to whatever level they feel comfortable.  Maybe they walk away after getting some ideas.  Perhaps they engage in the testing of ideas.  If the instructor’s locus of control is respected, more of the benefit of Design Thinking will be realized.

The beneficiary is always the student.

Visual Design for eLearning

Introduction

In eLearning, good visual design is yet another challenge.  As instructors, we want our interactive lessons to look good – but we aren’t trained in layout and graphic design.  In many of my own projects, I’ve relied on graphic designers – but often I’ve had to make do with my own limited skills.  I’ve learned a couple of things over the years and am happy to share what little I know – more as a starting point than an ending point.

Let’s begin with the premise that we want our pages to be visually appealing to students.  Of course, more importantly, we want our pages and layouts to support our instructional objectives.  We want things to look good and function well.  At the very least, we don’t want our design to distract the students or confuse them.

Fortunately, visual design is a combination of art and science.   We can draw from a body of knowledge that is evidence-based and not as subjective as we might imagine.

To describe visual design, I can start with the basic concepts of  flow, color, style, reading order, consistency, contrast and structure.

When in doubt, simplify

Whenever I’m in any doubt about visual design, I think about the art gallery.  In most galleries, the walls don’t compete with the artwork.  Plain walls.  Open spaces.  Strategically lit rooms.  The labels and interpretive text are positioned so the information is easily associated with the artwork.  The label doesn’t compete and isn’t crammed.  The text is printed in high contrast to the background.  I can move easily from piece to piece all around the room and then on to the next.  The flow is well thought out.


Tufts University Art Gallery

Our interactive lessons can be designed similarly.  Text can be cleanly separated  from imagery – with an adequate margin between text and image.  Margins can provide clean separation of the other page elements. The page background can be selected to not compete or distract from the lesson.  The developer can be intentional about guiding the eye from one thing to the next.

Or not

Or sometimes, for effect, we can do the exact opposite.  Agitate, provoke, move students out of their comfort zones.  But, in either case, visual design requires intentionality.

Visual flow

Screen elements have different visual weights or powers of attraction based on the size, color, and even shape.  Unusual things attract the student’s attention.

Instructors should decide where students should look first.  If one element is larger than the others, students’ eyes might be drawn there.  If all elements are in black and white but there is a splash of color somewhere on the page, the student’s eye will go there.  We’ve known these things for some time, but recently, usability labs have provided us with eye tracking sensors, which produce heat maps. Heat maps graphically display how people look at a software screen, for example, and which elements they look at. Areas that attract the most attention appear in hot red.

From usability studies and from age-old observation, we know that visual designs have an entry point. We need to plan or consider where that entry point might be.

We also know that visual designs can have unintended exit points.  As an example, hyperlinks can be hugely counterproductive to controlling visual flow.  For good reason, we think of hyperlinked information as being highly useful to students (another resource), but hyperlinks introduce the risk of students losing the flow, being distracted, and perhaps never returning to the lesson.

If our visual design is a simple text page, our job is easier.  We can use headings, sub-headings, and text wrapped around images, as well as size, italics and color, to signal very important information.  If a page is a free-form layout, we need to plan visual flow more carefully.  In that planning, we need to note that the eye is attracted to color and strong contrasts, and that it follows thick lines or elements composed in a way that suggests directionality.

Color

Color can be used to direct the eye and to attract the student’s attention to key information.  Richard Mayer, in his book Multimedia Learning (Cambridge University Press, 2001), describes the signaling principle.  The signaling principle states that people learn better when cues that highlight the organization of the essential material are added.  Instructors can use color to provide that cue, but color-blind students will not benefit.  Multiple cues are needed to highlight essential material – italics, for example.


Color used sparingly to draw the eye.  Layout created by Clint Clarkson

I’ve always been cautious of the ‘circus’ effect of too many colors.  One color will clearly signal important information or draw the student’s attention if s/he is not color blind.  Two or three colors can be used effectively.  Introducing more colors leans toward a circus effect, where color ceases to attract attention.  Graphic design sites describe a 60-30-10 rule, which states that:

The dominant color should be used 60% of the time, your secondary color 30% of the time, and an accent color 10% of the time. Typically, the most dominant color should also remain the least saturated color, while your bold or highly saturated accent color should be saved for your most important content.

http://www.eyequant.com/blog/2013/06/27/capturing-user-attention-with-color

 Style

Style may be the most fickle thing to embrace in your visual design approach.

In the early 20th century, graphic designers were influenced by modern art, the Bauhaus school, posters, the De Stijl movement (think Piet Mondrian), constructivism, architecture and more.  Today, graphic designers are as likely to be influenced by styles on the web.

Just a couple of years ago, instructional screens featured gradients, beveled buttons, drop shadows, textured backgrounds and an attempt to imitate the material world in the digital medium.  Microsoft and Apple, in the redesign of their graphical user interfaces, reflected the sudden shift away from imitating the material world.  Buttons lost their three-dimensionality and became flat, single-color, textureless features.  The new look became, in a sense, minimalist and, perhaps, more functional.  The rise of mobile computing favored flat designs over texture, minute detail and other features that didn’t translate well to the small screens of smartphones.


Apple Interface: Shift to a flat design

Flat design is a thing.

“Flat design is a minimalistic design approach that emphasizes usability. It features clean, open space, crisp edges, bright colours and two-dimensional illustrations.”  –Tom May, 2018

But styles change.  So, what is an instructor to do?  My hunch is that we should focus on evidence-based practices and embrace minimalism not for its trendy appeal but for its functionality.    We should probably pay attention to the world around us.  Pay attention to styles on the web.  Pick your favorite website and think about the underlying elements that make it visually appealing and functional.  Visit the website of a college of art and design.  Follow it over time.  But don’t get too hung up on style.  It is a black hole.  Once you pass the event horizon, you’ll never return to creating anything useful for your students.

Reading Order

Focus instead on some simple things – such as reading order.  Highlight important words to ‘signal’ their importance.  Use headings and sub-headings to expose the organizational structure of your page and to help students with visual disabilities who rely on a screen reader.  Students with screen readers scan pages by moving from heading to heading.  A blind student who uses JAWS (a popular screen reader) can hit the 1 key to navigate to a level 1 heading and get a sense of the structure and organization of the document, and hit the 2 key to move to a level 2 heading.

Use bulleted lists and numbered lists where appropriate and reduce the amount of writing.   The traditional wisdom was to ‘chunk’ writing by separating it into pages – but mobile devices may be changing students’ habits.  They are accustomed to endless scrolls.  More research is needed on the cognitive-load effects of endlessly scrolling pages.

Again, when in doubt, simplicity is preferable.

Consistency

Consistency is key. As students navigate the lesson, they shouldn’t burn brain cells on figuring out each page.   Pages that function the same should be styled the same.   For example, imagine that your page summarizes key concepts with a bulleted list.  Summarizing key concepts is an important strategy.  Our  pages may dive deeply into the details – but we want students to emerge with a clear map of the key ideas.  A bulleted list can be set off to the side of the page (left or right) or placed underneath, separated by space, color, and possibly a border.   The placement should be consistent so that students know where to find the summary in each part of the lesson.  They’ll look for it.

Contrast

At all times we need a strong contrast between the text and the background.  Lack of contrast affects readability.   Strong contrast also directs the eye.   I break this rule too often when I style hyperlinks to be colored in something other than the standard, boring blue with no decorative underline.  And I always regret it.  I strive for elegance and create a problem instead.

Some of these key principles relate to work done on perception by the Gestalt psychologists of the early twentieth century.  One of their principles, ‘Figure-Ground’ relates to an object and its surroundings.  Photographers embrace this principle when they want the subject of a photograph to be clearly known – in other words separation of the subject from the background.  Photographers will use a large aperture setting to blur the background (reduced depth of field) and thus create a clear distinction between figure and ground.  All elements in the lesson need to be distinct from the background – and that especially applies to text and the background.

Structure

Structure relates to the organization of elements on the screen.  It is concerned with proportion, symmetry, asymmetry, and balance.  These concepts are expressed in many ways.  In photography, artists may think in terms of the rule of thirds – whether they are following or breaking the rule.  Two-thirds land; one-third sky.  One-third rocky foreground; two-thirds blurred valley background.   Two-thirds of blank space on the left; one-third of birds on the right.  Halves, in symmetry, have quite a different effect and can be a statement in and of themselves.  The Parliament buildings of London reflected in perfect symmetry in the Thames, for example.

We can make similar decisions with the placement of images on the page.  They can be set with a width of 66%, which means that they will always scale to two-thirds of the page, regardless of page size.  Or the image can be set to 33%, with text wrapping the image and taking up the remaining space.  Or images can be wrapped in negative space (e.g. a white background), with the ratio of image to negative space a very deliberate choice.  Again, photographers might subdivide the plane into a three-by-three grid, which gives them nine spaces in which to organize the structural elements of the photograph.  Traditional layout artists, similarly, had grids that subdivided the page.  Instructors can get a sense of their layout by abstracting the visual elements on the page as shapes.  The paragraph becomes a dark block.  The negative space becomes a white block.  What proportion of the overall space do the blocks occupy?  What is their relationship to one another?  Are they pleasing and pure?  Are they distracting and confusing?

Ratios or proportions reduced to formulas probably don’t explain why some layouts are pleasing to the eye and others are not – but it is still interesting to consider the use of math in the pursuit of beauty.  The divine proportion, or golden ratio, was probably used to plan some of the great pyramids, and it is evidently used today to construct websites.  We know that from, again, abstracting web elements into dark and light shapes.  The ratio is defined by a simple equation:

a/b = (a + b)/a = 1.6180339887… (the golden ratio)

So, if our text block was denoted by ‘a’ and our image block was denoted by ‘b’, the ratio of text to image would be the same as the ratio of text plus image to text alone.  So, the secret to all good learning is in the golden ratio?  Not quite.  The only point I am making is that the proportion of things will have an effect.  We should at least be aware of how things are laid out on the screen. Proportion matters.
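A quick worked example of my own (the pixel values are illustrative, not from the original post): on a content area 1,000 pixels wide, a text block of roughly 618 pixels beside an image block of roughly 382 pixels approximates the golden ratio, since 1000 ÷ 618 ≈ 618 ÷ 382 ≈ 1.618.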


Layout created by Lauren Franza

Conclusion

The instructor who consciously and conscientiously includes visual design in the planning of his or her eLearning lesson will reap the reward.  Students will benefit from being guided through the lesson, not distracted by clashing colors, crammed elements, inconsistency, poor readability, and an off-putting layout.  Visual design is a large field of study – but the application of a few principles will greatly improve one’s eLearning design.