Learner Experience Design (LXD) has captured the attention and the imagination of just about everybody. Some have cast LXD as a discipline in direct opposition to instructional design; others consider it a rebranded instructional design.
My own perspective comes directly from my community of practice. For one, I worked as an instructional designer for creative studios that practiced learner experience design well before it became a thing. We worked in teams that blended the disciplines of user experience design, cognitive psychology, learning technology, and design thinking, which included ideation and prototyping. LXD as a discipline captures the best of the principles espoused in the CCAF (Context-Challenge-Activity-Feedback) model and the design processes that include situational and user analysis, successive approximations, sketches, quick prototypes, a focus on the user, and a focus on doing. The process of creating Allen Technology’s ZebraZapps, an eLearning authoring tool, included the best of design thinking and user experience design.
So what is Learner Experience Design?
So for me, LXD is what we’ve been doing for years, and that is:
- Centering on the learner versus the content (Dee Fink)
- Focusing on the experience of the learner — on the doing (CCAF, problem-based learning)
- Applying how people learn (cognitive science)
- Empathizing, defining, idea-generating, prototyping, and testing (Design Thinking)
- Following the principles of User Experience Design (Human Factors)
- Collecting and analyzing data (Data analytics with the help of SCORM, and now xAPI, CMI5)
- Using learning technology as enablers or affordances
- Recognizing that formal training is but one part of improving human performance
In my view, LXD is the power of all of these things combined under one label. To illustrate the interplay of the learner, experience, cognition, behavior, UX, Design Thinking, data, technology, and human performance, I’ll draw upon a current project.
The project goal is to help supervisors act more like coaches than formal evaluators. The context is public accounting. CPAs require deep technical skills and, as they progress in their careers, a host of success skills that include business development, leadership, supervision, and more. In Minnesota, for example, CPAs complete 120 credits every three years to maintain their license. They must also routinely attend trainings and updates related to changes in the law, technology, and business practices.
In addition to this continuous training, the company seeks to improve employee retention, maintain good morale, and continue to grow rapidly. To achieve its goals, the company adopted an employee engagement system that, among other things, helps supervisors collect feedback on employees from their tax reviewers or audit in-charges. More importantly, the company is switching from an annual review to monthly meetings that help supervisors and their reports improve their work.
There’s already a lot going on. Learner Experience Design recognizes that all of these factors come into play:
- Employees train a lot
- New technology is in place
- Industry is experiencing high turnover of staff
- Company wants supervisors to be good coaches
- Company is shifting from annual review to monthly meetings
At the heart of all of this lies a set of experiences shared between supervisors and their reports:
- Requesting, providing, and organizing feedback with the employee engagement platform
- Delivering effective feedback
- Receiving feedback effectively
To illustrate the power of LXD, let’s focus on one experience: giving feedback.
There are underlying psychological principles, as well as best and poor practices, related to giving feedback. Giving feedback might elicit a perception of threat in the receiver and can easily be dismissed. The feedback provider must use concrete examples, remain non-judgmental, draw from different perspectives, work toward a positive outcome, and so on.
As designers, we can treat the topic of giving feedback in many different ways. We can explain the function of the amygdala in the human brain and underscore its importance in decision making and emotional responses; feedback triggers those emotional responses and can evoke a fight-or-flight response. We could show video clips of good and bad practice, cartoon strips, excerpts from medical journals, or any media that conveys information. Our design might include this type of information sharing and then some form of assessment: a quiz or essay.
In contrast, LXD tends to favor placing the experience at the heart of the lesson. In this case, the experience is the giving of feedback. One design treatment might place the learner in a first-person scenario or simulation. The context is the office with a new employee who is not performing well. The learner acts as supervisor and selects the best thing to say in a conversation with the employee. If the learner’s choices disagree with the principles and best practices of providing feedback, then the instruction may come in the form of an employee thought bubble, a pop-up outlining best practices, references to a text or a video, and other visual indicators of success or failure.
In the prototype below, some of these ideas come together. The learner has selected one of three options. The choice causes a change in the employee’s outward expression (full figure on the left), in her inward expression and thoughts, and in the information that is collected on the interaction. In this prototype, the learner can access a transcript or review it at the end. At this point in the scenario, the employee has come in expecting to be coached, only to be confronted by the reality that she is being evaluated (because of the learner’s choice). She outwardly smiles while inwardly expressing her concern about being evaluated. A meter shows generally how things are going.
At the bottom of the screenshot, the learner has access to feedback given about the employee from two sources. Just as in real life, the learner can consult that feedback to get different perspectives on the employee’s performance.
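Under the hood, a branching scenario like this can be modeled as a small data structure: each choice carries the line the learner speaks, the employee’s outward reaction, her private thought, and a nudge to the “how it’s going” meter. Here is a minimal Python sketch of that idea; the class names, field names, and the sample choice are illustrative assumptions, not taken from the actual prototype:

```python
from dataclasses import dataclass, field

@dataclass
class Choice:
    text: str          # what the learner (acting as supervisor) says
    outward: str       # employee's visible reaction (the full figure on the left)
    inward: str        # employee's private thought (the thought bubble)
    meter_delta: int   # how the choice moves the "how it's going" meter

@dataclass
class Scenario:
    meter: int = 50                                  # 0 = going badly, 100 = going well
    transcript: list = field(default_factory=list)   # running log for end-of-scenario review

    def select(self, choice: Choice) -> None:
        """Apply a choice: clamp-update the meter and log it to the transcript."""
        self.meter = max(0, min(100, self.meter + choice.meter_delta))
        self.transcript.append((choice.text, choice.outward, choice.inward))

# Hypothetical choice mirroring the moment described above: the learner frames
# the meeting as an evaluation rather than a coaching session.
evaluate = Choice(
    text="Let's go over your performance numbers for the quarter.",
    outward="smiles politely",
    inward="I thought this was coaching... am I being evaluated?",
    meter_delta=-15,
)

s = Scenario()
s.select(evaluate)
print(s.meter)            # 35
print(len(s.transcript))  # 1
```

The design choice here is that instruction (the thought bubble, the meter) is a consequence of the learner’s action rather than a separate content screen, which is exactly the CCAF emphasis on feedback flowing from the challenge.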
The Design Thinking that led to this prototype began with analysis: we needed to know something about the audience, their situation, and the processes that had been in place in the past. In fact, while thinking about the actual problem we were trying to solve, we placed feedback ‘training’ on the back burner. Other things needed to be in place first: clear processes, and role definition among supervisors, audit in-charges, tax reviewers, and other personnel. We also needed to work out how the workplace engagement platform would be used optimally to solicit and collect feedback in preparation for the one-on-one meetings between supervisors and their employees.
As we continue to think about people and processes, we’ll come up with new ideas, build new prototypes and test them out.
Well… admittedly, to a point. For a mid-sized company, the return on time and effort is calculated quite differently than for a creative agency that plans training for thousands. Design thinking still plays a role, but perhaps at a smaller scale.
The cognitive aspects of this training relate to how we can help the learners acquire and retain new knowledge without overload, how they can assimilate that new knowledge, and how they can apply the knowledge to their daily lives. Human Performance Improvement considers any job aids or prompts that support the learner’s application of the principles and procedures. User Experience Design challenges us to think about a lot of things on the screen (fonts, colors, layout, flow, navigation, interactive elements, accessibility, desire paths) and off (cognitive overload, attention, memory, and more).
All of these things interplay and intersect. Cognitive load might cause us to scaffold or plan out the curriculum differently (instructional design), create a job aid (human performance), or map out the experience (UX) so that it doesn’t overwhelm the learner. As we build prototypes or test the product, we collect data and analyze it. Learning technology (xAPI, CMI5, SCORM) helps us collect the data from the learning experience. xAPI and CMI5 are standards that are centered on experience. (As I’ve written in the past, the x in xAPI is ‘experience’.) Statistical methods help us make sense of the data. For example, are learners benefiting from one design over another?
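To make the data-collection point concrete: at its core, an xAPI statement is just a JSON record of actor–verb–object describing an experience. The sketch below builds a minimal statement in Python using the standard ADL verb vocabulary; the activity URI and learner email are made up for illustration, and a real implementation would POST the statement to a Learning Record Store (LRS) rather than print it:

```python
import json
from datetime import datetime, timezone

def make_statement(learner_email: str, verb: str,
                   activity_id: str, activity_name: str) -> dict:
    """Build a minimal xAPI statement: who (actor) did what (verb) to what (object)."""
    return {
        "actor": {"mbox": f"mailto:{learner_email}", "objectType": "Agent"},
        "verb": {
            "id": f"http://adlnet.gov/expapi/verbs/{verb}",  # ADL verb vocabulary
            "display": {"en-US": verb},
        },
        "object": {
            "id": activity_id,  # hypothetical activity URI
            "definition": {"name": {"en-US": activity_name}},
            "objectType": "Activity",
        },
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

stmt = make_statement(
    "learner@example.com",
    "experienced",
    "https://example.com/activities/giving-feedback-scenario",  # made-up ID
    "Giving Feedback scenario",
)
print(json.dumps(stmt, indent=2))
```

Because every interaction in a scenario can emit a statement like this, designers can later compare how learners fared under different design treatments, which is precisely the kind of A/B question raised above.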
Since the term Learner Experience Design was first introduced, it has become part of our vocabulary and a rallying cry against content-centric designs, training-centric human performance improvement, and ineffective user interfaces. LXD may not be anything new and yet it feels new and it feels exciting.