mLearnCon is coming up in two weeks! June 21-23, we’ll be doing some intensive deep dives into how mobile devices can (and are) transforming learning across a wide range of domains (corporate, government, medical, military, academic, etc.). I’m going to (try to) give you a sneak preview of what’s happening on the mLearning Future Zone stage I’m hosting, where some of my “bright lights” about mobile devices and learning will share openly (and deeply, if you want) about how we can design richer mobile experiences that result in higher-order learning outcomes (and measurable results).
Kris Rockwell (@hybridkris) of Hybrid Learning Systems and Richard Culatta (@rec54) of the US Department of Education will chat with me on Tuesday, June 21 about how the ability to add virtual elements and layers onto your physical surroundings creates new opportunities and challenges for how we think about simulations and learning activities, and how we design them for mLearning.
Kris was available to answer a few questions to give you a taste of what we’ll be going into at mLearnCon.
Me: As learning people, why should we care about Augmented Reality?
Augmented Reality can extend the world around a user by layering digital reference information on top of the physical world. This experience can be something as simple as the first-down marker overlaid on a football field during televised games, or as complex as a 3D model rendered in the viewport of a smartphone. While AR content is typically thought of as 3D content overlaid on a real-world environment when viewed through the screen of a smartphone, the reality is that AR is any data pushed to a device, be it onscreen text, audio recordings, or other data, that is used to augment what the user is viewing. Examples of how AR content might be used in an educational context include:
- A user is touring a historic district of a city. The user has a smartphone with a map that shows her position in the city. As she approaches an area of interest, the map notifies her of the interesting site and provides further information about the history of that spot.
- A user is visiting the ruins of the Parthenon. Launching an application on his smartphone, he points the camera at the Parthenon and sees a 3D model of the building as it appeared when it was completed in 432 BC, along with information about the building and its history. This allows him to see what the building looked like when it was built and get an idea of how it has changed.
These are basic examples of what can be done, but they offer a look at how AR can be used as a learning tool to supplement an experience with additional information in an engaging manner. As smartphones become more widely available, the use of AR will continue to grow and become an even more relevant technology for the learning community. As an example of that growth: in one year, the Layar platform went from launch to 1,000 published applications, 3,000 applications in development, and 4,000 active developers.
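The proximity-notification pattern in the walking-tour example above boils down to a geofence check: compare the user's position against each point of interest and trigger when one is within range. Here is a minimal Python sketch of that idea; the place names, info text, and 50-meter trigger radius are my own illustrative assumptions, not drawn from any particular AR platform.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical points of interest for a historic-district tour.
POINTS_OF_INTEREST = [
    {"name": "Old Courthouse", "lat": 40.4406, "lon": -79.9959,
     "info": "Placeholder text the app would display or read aloud."},
]

def nearby_sites(user_lat, user_lon, radius_m=50):
    """Return the sites the user should be notified about right now."""
    return [poi for poi in POINTS_OF_INTEREST
            if haversine_m(user_lat, user_lon, poi["lat"], poi["lon"]) <= radius_m]
```

A real app would run this check as the device's location updates and push a notification (or audio) for each newly entered geofence; mobile OSes also offer built-in region monitoring that does the distance math for you.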
Me: What kinds of organizations are using AR for learning or training, and how are they using it?
My experience has been that the corporate world is not adopting AR at a fast rate. Having said that, BMW has produced a video showing a concept of how AR could be used as a job aid for a mechanic conducting repairs on a vehicle (VW has done the same through Metaio, an AR development company). The United States Marine Corps has implemented a prototype of this type of system as a demonstration of AR-based maintenance procedures on an LAV-25 Armored Personnel Carrier. In this case the user wears a head-mounted display connected to a wrist-mounted device. As the user performs the tasks, 3D instructions are relayed to the HMD based on the current step and what the user is looking at. This idea is not new; it originated at Boeing during the development of the 777 aircraft. That said, the technology behind it has advanced significantly and now enables much more interactive content to run on lower-powered devices.

In education, AR is being used to great effect. For example, the company 7Scenes (www.7scenes.com) uses geolocation to send data to users as they travel. This was notably used in a project titled “Frequency 1550” (http://freq1550.waag.org/index.html), in which groups of students “competed” against one another to discover and learn about medieval Amsterdam. The game turned the entire city into a game board and pitted teams against one another to demonstrate and capture their knowledge of history. It was very well received and earned a great deal of acclaim for its innovative implementation.
There is a project in the greater Pittsburgh area called Pittsburgh: Public Record, by an artist named Justin Hopper (@oldweirdalbion). The idea behind Public Record is that people can walk around the city with their smartphones and, through geolocation, be notified when they are near a point of interest, in this case locations of crimes committed in Pittsburgh in the 19th century. Users are notified on their iPhones through an app when they get close to a location. They can then listen to a poem and information about the crime that took place at that very spot.

Applications such as Layar also provide a good AR experience. Layar uses geolocation to display information applicable to the user's area. For example, there is a Layar application that allows users to visually see the most recent Tweets published near them. With the aid of the gyroscope and accelerometer in the iPhone, the user can see the Tweets along with their direction and distance from where the user is standing. A great number of Layar apps are available for download, on both the iPhone and Android platforms.

More recently, both Nintendo, with the release of its 3DS device, and Sony, with the pending release of the NGP handheld, are including AR games as part of the system. In the case of the 3DS, the system includes marker cards that can be placed on a surface and, when viewed through the built-in camera, show a 3D model that you can interact with. Additionally, the Nintendo device includes a game called “Face Raiders” in which your photo, or that of a friend, is mapped onto flying objects on the screen that you must shoot. The objects appear in space around you, and you physically turn with the device to aim at and shoot them.
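The “direction and distance” overlay Kris describes for the Tweet layer reduces to computing a compass bearing from the user's position to each geotagged item. A minimal Python sketch of that calculation follows; the function names and eight-sector compass labels are my own illustration, not Layar's actual API.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing (0-360 degrees, 0 = north) from point 1 to point 2."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return (math.degrees(math.atan2(y, x)) + 360) % 360

def compass_sector(bearing):
    """Map a bearing to one of eight labels for a simple on-screen indicator."""
    labels = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]
    return labels[int((bearing + 22.5) // 45) % 8]
```

On the device, an AR browser would subtract the phone's own compass heading (from the magnetometer/gyroscope) from this bearing to decide where on screen to draw each item's marker.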