Mobile Learning

Mobile Learning means much more than easy access to responsive educational applications from a smartphone or tablet. It is a remarkable confluence of technologies that marks a new era in technology-assisted instruction. Researchers have a name for the new capabilities that a technology brings us: affordances. I once hated the word, but now I embrace it. Recent advances in technology afford designers new opportunities to engage students.

New technologies bring new capabilities and help us redefine what is possible. When we had our shoulder to the wheel, working with computer-based training, floppy disks and stick figures, we looked up, saw the approach of interactive videodisc players, and imagined the possibilities. We worked with videodiscs for a time and then saw the virtue of CD-ROMs. We gave up full-screen, full-motion video for the ease of use of the CD-ROM and bought our first single-speed burners for $5,000. The CD-ROM gave way to the internet and the web application. Flash-based applications on the web gave way to HTML5. And now the desktop is making room for the mobile app and the mobile browser experience.

We always lose something, but we gain something more important in return. New technology affords us new capabilities and new opportunities.

Organization

To make the best use of these capabilities, mobile learning demands that we think about old ideas in new ways. To start with a simple example, our current projects may have forward and back buttons that chunk the content into nice bite-sized pieces. We recognize that chunking can be useful to learners. But mobile users are in the habit of swiping up, down and sideways. Content is laid out for them in one long flow or in slides, and the chunks on the screen are the result of how aggressively users swipe their fingers. This challenges us to organize content in a new way.

Responsiveness

Mobile apps, whether run in a browser or natively on the mobile device’s operating system, must conform to all sorts of device shapes and sizes, collectively called form factors. The iPhone alone comes in multiple sizes ranging from 4 to 5½ inches. There are smartphones, phablets, mini-tablets and large tablets. There are wearables and optical displays. An application may run on anything from a multiscreen desktop configuration to the smallest smartphone, and it may be viewed in portrait mode (vertical) or landscape (horizontal). The ability of a single application to conform to all of these display configurations is called responsiveness. Responsively designed applications automatically size and scale their views, pick readable font sizes, lay out components appropriately and provide easy navigation.
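To make this concrete, here is a minimal sketch of how a browser-based application can respond to form factor and orientation at runtime. The 600-pixel breakpoint and the class names are illustrative assumptions, not how any particular authoring tool implements responsiveness.

```typescript
// A minimal responsiveness sketch. The 600px breakpoint and the
// "compact"/"portrait" class names are illustrative assumptions.
const phoneQuery = window.matchMedia("(max-width: 600px)");
const portraitQuery = window.matchMedia("(orientation: portrait)");

function applyLayout(): void {
  // Toggle CSS classes; the stylesheet maps these to font sizes,
  // component layout and navigation suited to each form factor.
  document.body.classList.toggle("compact", phoneQuery.matches);
  document.body.classList.toggle("portrait", portraitQuery.matches);
}

// Re-apply the layout whenever the device is rotated or resized.
phoneQuery.addEventListener("change", applyLayout);
portraitQuery.addEventListener("change", applyLayout);
applyLayout();
```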


Responsive Application Created with LodeStar Learning FlowPageMaker

 

Designing Mobile Learning Experiences

But the challenge of mobile is not just in screen sizes and navigation. It is in the pedagogically appropriate design of applications. When we moved from computer-based training to videodisc, we considered the power of full-motion video and the ability of the learner to make decisions, indicate those decisions by touching the screen, and cause the program to branch. When we moved to CD-ROM we made use of 640 megabytes of data, which seemed massive and afforded us embedded encyclopedias, glossaries and other information and media at our fingertips. When we moved to the web, suddenly WebQuests harnessed the full power of the internet and sent learners on inquiry-based expeditions for answers.

But what now?  What are the opportunities that mobile devices give us – in exchange for extremely small screen sizes, slower processors and slower connectivity?

Part of the answer lies in student access to resources when they are on a bus or on lunch break – spaces in their busy lives.   The more interesting answer is access to resources and guidance from environments where learning can happen: city streets, nature trails, museums, historical and geographical points of interest – in short, from outside of the classroom and the home office.

This is what mobile learning, or M-Learning, is all about. M-Learning requires much more from applications than responsiveness. They should support students who are disconnected from the internet. They should link back to the mother ship, the institutional learning management system, once students reconnect. And they should report on all forms of student activity: not just quiz scores, but what students have read or accomplished, or what a trained person has observed in the student’s performance.

Responsiveness is an important start, but this added ability to report remotely to a learning management system is facilitated by one of several closely related technologies. You may have heard these terms and acronyms: Tin Can, xAPI, IMS Caliper and CMI5.

To really appreciate the contributions of these standards to the full meaning of mobility, we need to take a deeper dive into them. Bear with me. If you haven’t heard of these terms, don’t be disconcerted; you are in good company. They represent a tremendous new capability that goes hand in hand with mobile devices, one best explained by the tin can telephone metaphor. We’re only on the leading edge of the M-Learning tsunami.

Tin Can

Tin Can was the working title for a new set of specifications that will eventually change the kinds of information instructors can collect on student performance. To explain, let’s start with the basic learning management system. In the system, a student takes a quiz. The score gets reported to the grade book. The quiz may have been generated inside the learning management system, and the student most likely logged into the system to complete it. But quizzes are just one form of assessment, and no learning management system has the tools to generate the full range of assessments and activities that are possible. Not Blackboard. Not Moodle. Not D2L. Hence, these systems support the import or integration of activities generated by third-party authoring tools like Captivate, Raptivity, StoryLine, LodeStar and dozens of others. With third-party tools, instructors can broaden the range of student engagement.

Learning management systems support tool integration through standards like Learning Tools Interoperability (LTI), IMS content packages and a set of specifications called SCORM. SCORM has been the reigning standard since the dawn of the new millennium. It represents a standardized way of packaging learning content, reporting performance and sequencing instruction; SCORM is therefore a grouping of specifications. Imagine packages of content that instructors can share (Shareable Content Object) and that follow standards that make them playable in all of the major learning management systems (Reference Model).

But SCORM has its limitations. The Tin Can API is a newer specification that remedies them. A SCORM-based application finds its connection (an API object) in a parent window of the application. That’s limiting: it means the application has to be launched from within the learning management system. Tin Can-enabled applications can be launched from any environment and can communicate remotely with a learner record store. Imagine two tin cans linked by a string. One tin can may be housed in a mobile application, and the other in a learner record store or integrated with a learning management system. The string is the internet.

SCORM has a defined and limited data set. An application can report on user performance per assessment item or on overall performance. It can report on the number of tries, time spent, responses to questions and dozens of other things, but it is ultimately limited to a finite list of data fields. (Only one field, suspend_data, allowed arbitrary data, and it was severely limited in size.)
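To make the contrast concrete, here is a simplified sketch of how a SCORM 1.2 package finds and talks to its LMS. The discovery walk shown is the well-known pattern, slightly simplified; note that the content must live inside an LMS-launched window, and the data fields come from a fixed vocabulary.

```typescript
// Simplified SCORM 1.2 API discovery: climb the chain of parent
// windows until the LMS-provided API object is found.
function findSCORMAPI(win: Window): any | null {
  let current = win;
  let hops = 0;
  while (!(current as any).API && current.parent !== current && hops < 10) {
    current = current.parent;
    hops++;
  }
  return (current as any).API ?? null;
}

const api = findSCORMAPI(window);
if (api) {
  api.LMSInitialize("");
  // Only predefined fields like these may be set.
  api.LMSSetValue("cmi.core.score.raw", "85");
  api.LMSSetValue("cmi.core.lesson_status", "passed");
  api.LMSCommit("");
  api.LMSFinish("");
}
```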

Tin Can isn’t limited in the same way. Tin Can communicates a statement composed of a noun, a verb and an object. The noun is the learner. The verb is an action. And the object provides more information about the action. “Jill Smith read ‘Ulysses’” is a simple example. Imagine the learner using an eBook reader that communicates the student’s reading activity back to the school’s learner record store (housed in an LMS). Tin Can is M-Learning’s bedfellow. The mobile device gives students freedom of movement; Tin Can frees students from the learning management system. Any environment can become a learning environment. Learning, and a record of that learning, can happen anywhere.
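Here is a minimal sketch of what sending that statement might look like. The endpoint, credentials and verb IRI are illustrative placeholders; a real project would use its LRS account details and a published verb registry.

```typescript
// A minimal xAPI (Tin Can) statement: actor (noun), verb and object.
// Endpoint, credentials and the verb IRI are illustrative placeholders.
const statement = {
  actor: { name: "Jill Smith", mbox: "mailto:jill.smith@example.edu" },
  verb: {
    id: "http://example.org/xapi/verbs/read",
    display: { "en-US": "read" }
  },
  object: {
    id: "http://example.edu/activities/ulysses",
    definition: { name: { "en-US": "Ulysses" } }
  }
};

// Statements travel over plain HTTPS, so they can be sent from a
// mobile app, an eBook reader or any page outside the LMS.
async function sendStatement(): Promise<void> {
  await fetch("https://lrs.example.com/xapi/statements", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-Experience-API-Version": "1.0.3",
      "Authorization": "Basic " + btoa("lrsKey:lrsSecret")
    },
    body: JSON.stringify(statement)
  });
}
```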


LodeStar 7.2: configuring a Learner Record Store (LRS) service and exporting a Tin Can API-enabled learning object

The next acronym, xAPI, is simply the formal name for Tin Can; Tin Can was a working title. When I was at Allen Interactions working on ZebraZapps, our team provided early comments on this evolving specification, which became xAPI. The eXperience API is a cool term for a cool concept, but Tin Can has stuck as a helpful metaphor.

The openness of Tin Can, however, presents its own challenge.   If one application reports on student reading performance in one way, and another application reports on a similar activity but in a different way, it is hard to aggregate the data and analyze it effectively.  It’s hard to compare apples and oranges.

IMS Caliper attempts to solve this problem. IMS Global is the collaborative body that brought us standards for a variety of things, including learning content packages and quiz items. IMS Caliper is a set of standards that supports the analysis of learning data, defining a common language for labeling that data and measuring performance.

Which leads us to the last standard: CMI5. CMI5 bridges Tin Can with SCORM. Applications still benefit from the grade book and reporting infrastructure built around SCORM, but they are free to connect remotely outside the confines of the LMS, once again supporting M-Learning.

Had I written this entry a year ago, I would have found it difficult to try out various learner record stores.  Today, they abound.  The following link lists tools and providers:  http://tincanapi.com/adopters/

The following two LRS providers offer an inexpensive service for testing this technology yourself.

Rustici SCORM Cloud

https://cloud.scorm.com

Saltbox Wax LRS

http://www.saltbox.com/

So what?

Now that we’re free to roam around the world, what do we do with that freedom? Mobile applications, even browser-based ones, use GPS, cell towers and Wi-Fi to locate our phones geographically. We can construct location-aware learning. We can guide students on independent field trips, where they collect information and complete assessments of their learning; all of that can be shipped back to the institution through the learner record store. Mobile devices have accelerometers and gyroscopes that help the phone detect orientation (e.g. horizontal or vertical) and the rate of rotation around the x, y and z axes. With that, we can create applications that assess a learner’s coordination in completing a task that requires manual dexterity. Devices have cameras and microphones, both of which can support rich field experiences.
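As a rough illustration of the sensor side, a browser-based application can listen for motion events. The “wobble” score below is invented for the sketch, and recent iOS versions also require a user-triggered permission request before these events fire.

```typescript
// A sketch of reading motion sensors in a mobile browser. The
// "wobble" score is an invented example of a dexterity measure.
window.addEventListener("devicemotion", (event: DeviceMotionEvent) => {
  const rotation = event.rotationRate;  // degrees/second around x, y, z
  const accel = event.accelerationIncludingGravity;
  if (rotation && accel) {
    // Sum the rotation rates as a crude measure of how steadily
    // the learner is holding the device during a task.
    const wobble = Math.abs(rotation.alpha ?? 0)
                 + Math.abs(rotation.beta ?? 0)
                 + Math.abs(rotation.gamma ?? 0);
    console.log(`wobble: ${wobble.toFixed(1)} deg/s, z-accel: ${accel.z}`);
  }
});
```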

The smart pedagogy for M-Learning is one that recognizes these affordances and uses them – rather than shrinking a desktop experience into a smaller form factor.

An Example

Aside from our work at LodeStar Learning and at the university, my most recent encounter with this technology came from a serendipitous meeting with a local community leader who introduced me to Pivot The World.

Pivot The World (http://www.pivottheworld.com) represents a good starting point. It is a start-up company that works with universities, museums, cities, towns and anyone interested in revealing the full richness of a location’s history and cultural significance. It combines the freedom of movement of a mobile device with the device’s ability to detect location, overlay imagery and geographical information, and match what its camera sees against a visual database to retrieve related information. The combination of camera, maps, imagery, audio, location and other services engages learners in a new kind of experience.

The Pivot The World founders and developers started in Palestine, have since applied their technology to a tour of Harvard University and are currently working with a volunteer group of history buffs to create a Pivot Stillwater experience in our own hometown.  At the north end of town, where there are condominiums, a simple swipe of the finger can reveal the old Stillwater Territorial Prison with elements of the prison preserved in the design of the new site.

If a university or museum wished to keep a record of student or visitor experiences with the application, an integration with Tin Can (xAPI) would add that dimension. As users engaged with the content, statements of their experience could be sent to a learner record store.

Conclusion

LodeStar Learning’s mission is to make these technologies and capabilities accessible to instructors. We have done that with the addition and improvement of our templates, and we have incorporated the ability to export any learning object with Tin Can capability. Now instructors can choose between SCORM 1.2, SCORM 1.3, SCORM Cloud, SimpleZip (for Schoology and other sites) and, most recently, TinCan 1.0.

We have improved Activity Mobile Maker and added ARMaker (for geographically located content) and FlowPageMaker for a new style of mobile design.

We’ve already gone global.  Now we’re going mobile.  We’re embracing M-Learning and all of its amazing affordances.

 

Augmented Reality For Educators

Introduction

The New Media Consortium predicts the sharply rising use of Augmented Reality (AR) in higher education over the next five years. As with any new technology, I am always interested in how AR can be made viable for busy instructors – so that a reasonable effort yields a commensurate return. I’ll introduce a prototype project that can be replicated by instructors. But first, let’s take a broad look at AR.

Augmented Reality covers a wide spectrum of applications, which is reflected in the consortium’s description of AR as “the incorporation of digital information including images, video, and audio into real-world spaces. AR aims to blend reality with the virtual environment, allowing users to interact with both physical and digital objects.” (NMC, Horizon Report, 2016 Higher Education Edition)

In this article I walk through the making of a simple AR application with the LodeStar authoring tool, which now includes the ARMaker template. Any intrepid instructor can create something similar for his or her own course.

Our use of AR fits closely with a common use defined in a research article that appeared in Computers & Education in March 2013, titled “Current status, opportunities and challenges of augmented reality in education”:

First, AR technologies help learners engage in authentic exploration in the real world, and virtual objects such as texts, videos, and pictures are supplementary elements for learners to conduct investigations of the real-world surroundings (Dede, 2009). One of the most prevalent uses of AR is to annotate existing spaces with an overlay of location-based information (Johnson et al., 2010a).

AR supporters make claims of deeper student engagement, connection of academic content to the ‘real world’ and deeper levels of cognition. TechTarget defines Augmented Reality as the “integration of digital information with the user’s environment in real-time. Unlike virtual reality, which creates a totally artificial environment, augmented reality uses the existing environment and overlays new information on top of it.”

You have already seen AR applications outside of education:

In watching football, you’ll notice the yellow first-down line painted across the television screen. That has stuck as a useful and accepted addition to the game. Other ideas were not so well received: Fox Sports’ glowing, streaking hockey puck was the culmination of a $2 million R&D project that got hockey fans…well, glowing mad.

More relevantly, in education, teachers use technology to create their own “auras” around, for example, works of art that suddenly come to life when scanned with the mobile phone camera. An aura can cause music to play, or a video to show, or an animation to display. Math students can point their smart phone at an equation and watch it jump to life on the screen (Aurasma).

The QR tag is a simple form of Augmented Reality. Special QR reader apps enable museum visitors, for example, to scan a QR tag and launch a web site devoted to the art exhibit and its interpretation. JISC, formerly the Joint Information Systems Committee and now a non-profit company, describes a project in England where students scan rare manuscripts with their smart phones and have digital facsimiles appear so that they can turn the pages and get supporting videos, text and images to help them interpret the old texts.

Finally, the University of Oklahoma library created a smart phone app that guides visitors by sensing their physical location and revealing information about nearby content resources. They placed Bluetooth beacons in strategic places, set to transmit data at regular intervals. The smart phone receives each beacon’s unique ID and as a result knows precisely where it is and what content should be displayed. Out of doors, the application uses GPS and the smart phone’s location services.

Imagining the Possibilities at a Simpler Level

I recently chatted with an environmental science professor at our university. Near our main campus we have a wonderful natural treasure called Swede Hollow. Swede Hollow is a wooded ravine at the foot of Dayton’s Bluff in East Saint Paul. Poor immigrant families settled in the hollow starting in the late 1800s. Phalen Creek once ran through it in full force. At the top of the bluff stood the Hamm’s Mansion until it burned down in the 1950s. At one end of the hollow stood the Hamm Brewery.

Swede Hollow is rich with historical, geological and natural interest. Of course, the environmental science prof had the knowledge to uncover the layers of significance of this area. We discussed a mobile application that would do just that. Students could visit the area with their cell phones and be presented with location-specific information that may not be readily apparent to the casual observer. For example, Phalen Creek is now “entombed” in an underground tunnel that has attracted a following of urban adventurers.

The instructor has led student tours through Swede Hollow. On her tour, she mentions the changing appearance of the trees through the seasons or the tunnel underneath, and promises to show the imagery of urban adventurers when students return to the classroom. It is difficult to replace her personal touch with a digital application, but an augmented reality application could capture the instructor’s expertise and present information and digital assets to students at specific locations. Students would be able to take the tour at their leisure, in a sense asynchronously, spending more or less time at each location according to their interest. The dependency on the instructor’s availability would be removed.

About twenty miles from Swede Hollow is my home town – Stillwater, Minnesota. That’s where the story of our first prototype begins.

A working prototype

Stillwater is also rich in history, geography, plant and animal life, and politics. The same is true of many areas, and yet we pass through them at fifty miles an hour, oblivious to the layers of interest that surround us, or remotely contemplate them from our computer terminals, perhaps in the context of an online learning class.

In Stillwater, we have the history of the saw mills, the bursting of a dam that sent tons of mud and debris down a ravine to reshape the downtown, the sandstone and limestone bluffs, the restoration of prairie grasses and oak savannas along the river, the wildlife, the reign of the lumber barons and the Victorian architecture. As in any area, all of this can be lost on the casual observer.

A walking tour can get us out of the car or away from the computer and into the world – aided by a smart phone and the captured knowledge of an educator like our environmental scientist.

Educators know the points of interest. Depending on their discipline, they know the civil rights history of an inner-city area; they know the trees, plants and shrubs featured in a tucked-away ravine; they know the source and destination of streams. With the help of technology, they can now tell their story to all who are interested in an unprecedented manner.

Of course, education aside, Pokemon, portals and anomalies have gotten people out of their chairs and into the world. The company Niantic created Ingress and Pokemon Go to get people away from their game consoles and wandering about their neighborhoods and cities in search of game features tied to locations by latitude and longitude coordinates. In the case of Pokemon Go, gamers are in search of uncaptured Pokemon found at specific locations; gamers must physically go to those locations. In the case of Ingress, gamers find portals that they try to either destroy or restore. In both games, people move about with their smart phones, traveling to locations and causing the app to display something of interest.

In contrast, the type of interaction that we propose is simpler but rooted in the richness of a particular discipline. We propose something that instructors can create with the help of a template and a little creativity. Students are led on a guided tour of an area where they are introduced to the history or geography of that area, or whatever matches the discipline. They are guided from point to point. Their instruction comes from observing the physical thing and hearing or reading about its significance, or from being challenged to take notes and draw conclusions from their observations, or any variation thereof.

In the project that we are building as a proof of concept, we explore the history of Stillwater. The City of Stillwater has already produced a walking tour. It is well done with vetted historical content and professionally produced media. Currently, visitors can access the Historic Downtown Walking Tour website and view each location from the convenience of their computers.

We propose that students travel to the location and experience all of the sights, sounds and smells of the location in addition to learning about its significance.

The current tour is concentrated in downtown Stillwater, both east and west of Main Street.

In our prototype, students are guided to a location and then given information on how to find the next location. In the following screen shots from the prototype, students start at the pergola by the river. Once there, they can access an audio presentation on the preservation efforts at the turn of the last century and the resulting Lowell Park. They are then guided to a mill, old freight house, caves that stored beer kegs, and more.

We created the prototype by launching LodeStar and selecting the ARMaker template. For each page we put in the precise location with the help of Google Maps and a Google Earth overlay. For each page, we inserted images, typed text and imported audio matched to the location. In the future, you will see the results of this project; we are awaiting permission from the city council for this proof of concept. In the meantime, we can tell you some of the benefits and challenges of designing this prototype.


Matching content to latitude and longitude coordinates with LodeStar

Lessons Learned

The theme of the Stillwater walking tour is the ingenuity of humans to eke out their livelihoods from the natural resources of the area: lumber, wheat, and beer, to name a few. The walking tour covers the triumphs and the trials of the various local businesses and enterprises. It’s a sneak peek into the past.

To date, we have learned several things from creating this walking tour. Here are some of the more important lessons:

  • Stay out-of-doors. Accurate locations come from GPS satellites, and results indoors will vary greatly by location. When GPS is unavailable, locations are derived through other, less reliable means. Whereas GPS signals can give us coordinates that are two or three meters off target, which is fairly precise, the alternative means may give us coordinates that are dozens of meters off target.
  • Add a fudge factor. Set each location with a proximity of 40 feet, meaning the content will display or play when students are within forty feet of the target. That may seem like a wide radius, but once students are on a field trip and approaching landmarks, 40 feet is not a large distance at all. (A minimal proximity check is sketched after this list.)
  • Make it easy for students to know where the next location is. Have students follow a street, a path or a riverbank. Alternatively, give precise directions to the next stop.
  • Use text, images and audio. Video can pose a problem: students will be connected through 3G or 4G, and 4G, at 20 megabits per second or higher, is roughly ten times faster than 3G’s 2 megabits per second, so the experience will be quite different for the two groups of users.
  • Use simple questions, with feedback, to check students’ understanding at a site.
  • Be careful of making students walk great distances without frequent points of interest.
  • Consider visual and hearing impairments when designing the application.
  • Be mindful of students who can’t walk great distances. Distances look short on a map, but not in the field. Consider an alternative, shorter tour.
  • Instruct students to load the project website into their browser while they still have a good connection to the internet, so that images and audio are cached, resulting in a better playback experience in the field.
  • When producing a self-guided tour, use Google Maps on the desktop to set locations with at least six decimal places of precision, for example 45.094156. Google Maps will let you zoom into a location and click to set a marker; overlay Google Maps with Google Earth to see exactly where you are and get very accurate locations. Copy the coordinates of the marker into your application. If you must walk the tour to set locations, download an app that records good coordinates, such as LocMarker Lite, which lets you add and record locations with six decimal places of precision. The compass app on the iPhone, by contrast, gives you coordinates in degrees, minutes and seconds, which is not enough resolution: a second of latitude is roughly 100 feet.
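Here is a minimal sketch of the proximity check behind the 40-foot fudge factor, using the browser’s geolocation API and a standard haversine distance. The target coordinates and the structure are illustrative assumptions, not how any particular template implements it.

```typescript
// A minimal proximity check for one tour stop. The coordinates are
// illustrative; the 40-foot radius is the tour's fudge factor.
const TARGET = { lat: 45.094156, lon: -92.804920 };
const RADIUS_FEET = 40;

// Haversine distance between two points, in feet.
function distanceFeet(lat1: number, lon1: number,
                      lat2: number, lon2: number): number {
  const R = 20_902_231; // Earth's mean radius in feet
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Watch the device's position and reveal the stop's content on arrival.
navigator.geolocation.watchPosition(
  (pos) => {
    const d = distanceFeet(pos.coords.latitude, pos.coords.longitude,
                           TARGET.lat, TARGET.lon);
    if (d <= RADIUS_FEET) {
      console.log("Within 40 feet: show this stop's images, text and audio.");
    }
  },
  (err) => console.warn(err.message),
  { enableHighAccuracy: true }
);
```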

Why it works

When we hear, see, read, discuss and reflect upon things, we encode information and experiences in semantically rich ways that help us retrieve the experience later and relate it to other knowledge. We experience the moment, the sights and the smells. We note the texture of an object, its placement and its size, and we ponder the relationship of some newly presented content to this tree or building or riverway.

Augmented Reality can also challenge us to think critically about what we are seeing. I remember, as a boy, going on a technology-assisted field trip that I will never forget. The technology was the orienteering compass. We moved from location to location by being given a directional bearing and a number of paces. One of the locations was a tree that was obviously diseased. We were challenged to identify the disease and were then introduced to Dutch elm disease. I had never known the devastating effects of disease on trees, and I recalled the experience later in life when our own woods were ravaged by oak wilt.

Conclusion

This is a first attempt at AR. We have already published the ARMaker template with the latest release of the LodeStar eLearning authoring tool. You can download the trial version and immediately access the ARMaker template. Try it for your own class and give us feedback on how you designed your walking tour. Eventually, we will propose an AR assisted walking tour design pattern that reflects best practice.

Download LodeStar at http://www.lodestarlearning.com. Look for the Try link at the top for the trial version, then select the ARMaker template.

Happy exploration.