Demos


For additional information please see the program overview.

Forty demos have been selected for presentation on 27 and 28 October; all 40 are presented on both days.

 

Demos provide us with our first glimpse of new ideas, a venue for directly interacting with implemented versions of novel concepts, and an opportunity to engage with the people of this community in the context of their latest work. This year we were able to select from a particularly wide variety of submissions.

 

Volkswagen AR interactive

Thomas Balduff (Total Immersion)

 

The demo runs on an iPad 2. The Golf Cabriolet is finally back after nine years of absence; production stopped in 2002 after 23 years of success. Across all brands and types, soft-tops have always held a particular attraction, and the challenge remains to present the latest vehicles to the general public as quickly as possible.

Total Immersion's D'Fusion is used to put the new model on show at dealerships even though it is not yet available: Augmented Reality makes it possible to discover the new version of this old favorite before it arrives.

In the dealer’s showroom, visitors are either invited to download the Volkswagen Virtual Golf Cabriolet application onto their own smartphone (iPhone or Android), or the authorized dealer provides them with an iPad with the application already loaded. Once the application is launched, the user can view and experience the vehicle as if it were on site. A wealth of possibilities is offered, such as opening the roof, turning the car, checking details, and changing the body colour or the style of the rims.

 

Micro AR for Tourist: Hybrid Physical/Virtual Mobile Media Using a Loupe Metaphor

Koh Sueda, Jian Gu, Shintaro Kitazawa, Henry Been-Lirn Duh (Interactive and Digital Media Institute, National University of Singapore)

Micro AR is hybrid physical/virtual hypermedia that integrates paper (a static, tangible medium) and Augmented Reality (AR, an interactive, immersive medium) using the metaphor of a loupe to represent magnification functionality. Making use of the familiar action of observing small objects as part of the metaphor, we can embed intuitive and semantic browsing operations into paper media. This system provides the user with an immersive, interactive experience through the use of a handheld magnification AR device and micro AR markers.

 

ARA Indoor-Outdoor Navigation for Pedestrians

Mathieu Razafihamazo, Yohan Lasorsa, David Liodenot, Audrey Colbrant (INRIA), Jacques Lemordant (UJF-INRIA-LIG)

Visually impaired people have difficulty navigating indoors and outdoors, yet they seek independent travel and have considerable success at it. Our idea is that by understanding the reasons behind this astonishingly successful navigation, we can build an autonomous indoor-outdoor navigation system for pedestrians (visually impaired or not). We have built a high-precision (step-level) navigation system using path integration, 3D audio cues, and a structured environment representation, which are probably the navigation aids used implicitly by visually impaired people. This demo shows how these interrelated concepts have been implemented using Pedestrian Dead Reckoning (PDR), Augmented Reality Audio (ARA), and an OpenStreetMap representation of a regular indoor or outdoor layout.
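A minimal sketch of the path-integration idea behind PDR, for readers who want to see the mechanics: each detected step advances the position estimate along the current heading. The function name and the fixed step length below are illustrative assumptions, not taken from the authors' system.

```python
import math

def pdr_update(x, y, heading_rad, step_length_m=0.7):
    """Advance a 2D position estimate (x east, y north) by one step.

    Path integration: each detected step of an assumed length is
    taken along the heading obtained from the inertial sensors.
    """
    x += step_length_m * math.sin(heading_rad)  # east component
    y += step_length_m * math.cos(heading_rad)  # north component
    return x, y

# Five detected steps heading due east (pi/2 from north) move the
# estimate 3.5 m east; 3D audio cues would be re-spatialized after each.
x, y = 0.0, 0.0
for _ in range(5):
    x, y = pdr_update(x, y, heading_rad=math.pi / 2)
```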

 

Argon AR web browser

Blair MacIntyre, Alex Hill, Hafez Rouzati, Maribeth Gandy, Brian Davidson (Georgia Institute of Technology)

This demo accompanies our paper “The Argon AR Web Browser and Standards-based AR Application Environment”.   The goal of Argon, which we hope to illustrate during the demo, is to move us a step toward achieving the popular vision of Augmented Reality (AR), where a person is immersed in a diverse collection of virtual information, superimposed on their view of the world around them. If such a vision is to become reality, an ecosystem for AR must be created that satisfies at least these properties: multiple sources (or channels of interactive information) must be able to be simultaneously displayed and interacted with, channels must be isolated from each other (for security and stability), channel authors must have the flexibility to design the content and interactivity of their channel, and the application must fluidly integrate with the ever-growing cloud of systems and services that define our digital lives.  Argon combines KARML (our extension to KML, the spatial markup language for Google Earth and Maps) with the full range of standard web technologies to create a standards-based web browser for mobile AR. Argon lets users develop 2D and 3D content using existing web technologies and facilitates easy deployment from standard web servers. We will demonstrate a number of projects that have used Argon and point out the ways in which our web-based architecture has made previously impractical AR concepts possible.

 

Handheld AR Games at the Qualcomm AR Game Studio at Georgia Tech

Blair MacIntyre, Yan Xu, Maribeth Gandy (Georgia Institute of Technology)

In this demo, we will show a collection of games that have been produced over the past year at the Qualcomm Augmented Reality Game Studio, a partnership between the Augmented Environments Lab  at the Georgia Institute of Technology  (Georgia Tech), the Atlanta campus of the Savannah College of Art and Design  (SCAD-Atlanta) and Qualcomm.  The studio is a research and design center aimed at pioneering new advancements in mobile gaming and interactive media.  The game studio builds upon the creative and technical talents of the Georgia Tech and SCAD students and faculty, and uses Qualcomm’s augmented reality platform  and related graphics technologies to produce new application concepts and prototypes.  Many of the games being demonstrated are discussed in our ISMAR AMH paper, “Pre-Patterns for Designing Embodied Interactions in Handheld Augmented Reality Games.”

 

Use of panoramic images for high accuracy urban infrastructure augmentation

Stéphane Côté, Yannick Barrette, Stéphane Poirier (Bentley)

In this demo, we will show a prototype running on a tablet device that enables onsite users to display stable (jitter-free), high-accuracy augmentation through the use of panoramic images. The demo is not quite augmented reality but rather augmented panoramas, visualized in an emulated augmented reality experience. It is still very convincing, and offers high-quality (jitter-free and accurate) augmentation. Further details can be found at: http://bit.ly/l31F6t

 

Augmented Reality Pop-Up Book, “Who’s Afraid of Bugs?”

Helen Papagiannis

The Augmented Reality Pop-Up Book, “Who’s Afraid of Bugs?” combines hand-crafted paper engineering and AR on mobile devices to create a tactile, hands-on storybook which explores the fear of bugs through narrative and play. It is the first AR pop-up book designed specifically for iPad 2 and iPhone 4 using image recognition. Integrating natural feature tracking in the design, the book can be enjoyed alone as a regular pop-up book, or supplemented with augmented digital content when viewed through a mobile device equipped with a camera. The Augmented Reality Pop-Up Book is a playful exploration of fears using AR in a meaningful and fun way. Rhyming text takes the reader through the storybook, where various ‘creepy crawlies’ (a spider, an ant, and a butterfly) are waiting to be discovered, appearing virtually as 3D models you can interact with. A tarantula attacks when you touch it, an ant hyperlinks to educational content with images and diagrams, and a butterfly appears flapping its wings atop a flower in a meadow.

website: http://augmentedstories.wordpress.com

 

BurnAR: Feel the Heat (Best Demo Award)

Matt Swoboda (Fairlight), Thanh Nguyen (University of South Australia, Graz University of Technology), Ulrich Eck (University of South Australia), Gerhard Reitmayr, Stefan Hauswiesner, Rene Ranftl (Graz University of Technology), Christian Sandor (University of South Australia) 

In this demo, users experience their own hands interacting with complex graphics simulating smoke and fire effects. Looking through a stereo head-worn display (HWD), a user sees their own hands start to smoke and interact with flames. A real-time fluid simulation calculates the volumetric effects, using the user's hands as input for motion and as an interaction surface. The hands' location and depth are estimated from the stereo view delivered by the HWD's camera pair. Overall, the immersive experience of the user's own body interacting with striking, high-quality graphics effects makes for an exciting demo.

 

AR Micromachines: An Augmented Reality Racing Game

Adrian Clark, Thammathip Piumsomboon (University of Canterbury)

AR Micromachines is a tabletop car racing game in which three-dimensional information obtained using a Microsoft Kinect camera is combined with augmented reality graphics to create a new gaming experience. Users create their own tracks using real objects, and then race virtual cars head to head with realistic physics, occlusion and shadows.

 

AR Coloring Book

Adrian Clark, Andreas Dünser (University of Canterbury)

In this augmented reality coloring book demonstration, users color in the pages, and these pages are then recognized by the system and used to produce three-dimensional scenes and textured models reflecting the artwork created by the users. This three-dimensional virtual content is then overlaid on the real book pages, providing a three-dimensional experience built from the users' own content.

 

Interactive 3D Audio Panorama

Yohan Lasorsa, Jacques Lemordant (INRIA Rhône-Alpes)

This demonstration presents technologies for discovering and getting information about one's surroundings using spatialized audio and natural gestures. Navigation in a set of PoIs (Points of Interest) is usually performed through visual AR (Augmented Reality) techniques on mobile devices, using software such as Layar or Argon. Our technology is unique in giving users the ability to search and navigate through PoIs much as visual AR browsers do, but using audio only. AR experts, who are quite familiar with visual AR systems, will discover new perspectives by experimenting with interactive 3D audio in an AR context.
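To give a concrete flavor of audio-only PoI navigation, here is a toy sketch under strong simplifications (plain equal-power stereo panning with 1/distance attenuation; the actual system uses full 3D audio rendering, and all names here are ours):

```python
import math

def poi_stereo_gains(user_xy, user_heading_rad, poi_xy):
    """Equal-power stereo gains for a point of interest.

    A PoI to the user's right sounds louder in the right ear, and
    farther PoIs are attenuated. A real AR audio engine would use
    HRTF-based 3D rendering instead of simple panning.
    """
    dx = poi_xy[0] - user_xy[0]
    dy = poi_xy[1] - user_xy[1]
    distance = math.hypot(dx, dy)
    bearing = math.atan2(dx, dy)              # absolute bearing from north
    azimuth = bearing - user_heading_rad      # relative to view direction
    # Clamp PoIs behind the user to hard left/right for this sketch.
    pan = max(-math.pi / 2, min(math.pi / 2, azimuth))
    theta = (pan + math.pi / 2) / 2           # 0 (left) .. pi/2 (right)
    attenuation = 1.0 / max(distance, 1.0)    # simple 1/d rolloff
    return math.cos(theta) * attenuation, math.sin(theta) * attenuation

# A PoI 10 m directly to the right of a north-facing user:
# the right channel dominates, scaled down by distance.
print(poi_stereo_gains((0, 0), 0.0, (10, 0)))
```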

 

Computer Aided Medical Diagnosis and Surgery System: Augmented Reality for Ultrasound Procedure Guidance on the International Space Station

Yashodhan Nevatia, Keshav Chintamani, Tangi Meyer, Michel Ilzkovitz (Space Applications Services), Arnaud Runge, Nadine Fritz (European Space Agency)

The Computer Aided Medical Diagnosis and Surgery System (CAMDASS) is a tool that uses Augmented Reality and Virtual Reality technologies to provide guidance and automated diagnostic support to non-experts performing medical procedures, focusing on ultrasound examinations for diagnostic purposes. Developed with support from the European Space Agency, it is motivated by the needs of astronauts on long-term spaceflights, where restricted communication makes traditional tele-medicine solutions impractical.

 

Portable Virtual Assembly, Integration and Testing Visualizer

Benedikt Naessens, David De Weerdt, Michel Ilzkovitz (Space Applications Services), Mark Wagner (European Space Agency)

The Portable Virtual Assembly, Integration and Testing Visualizer (PVAITV) aims to assist Assembly, Integration and Testing personnel by providing them with a light-weight, wearable, powerful and multifunctional Augmented Reality system that allows context-sensitive information visualization using highly user-friendly and intuitive user interactions, based on hands-free technologies. The system was initially developed for the European Space Agency for use in the integration and testing of satellites.

 

Mass Market AR Games

Ori Inbar (Ogmento)

As the first venture-backed AR Games company, Ogmento spends all its resources and energy on AR Games. Ogmento's AR Games are not only a great demonstration of successful engaging products in the mass market, but also introduce new AR technologies.

 

Gravity-aware Handheld Augmented Reality

Daniel Kurz, Selim Benhimane (metaio GmbH)

This demo showcases how different stages of handheld Augmented Reality (AR) applications can benefit from knowing the direction of gravity, measured with inertial sensors. It presents approaches that incorporate the gravity vector to improve the description and matching of feature points, the detection and tracking of planar templates, and the visual quality of rendered virtual 3D objects. All demonstrations are shown on mobile devices.
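A sketch of the core idea as we read it from this description: the measured 3D gravity vector is projected into the image, and its local 2D direction can replace the usual intensity-based dominant orientation of a keypoint, making descriptors independent of image content. The code below is an illustrative reconstruction, not metaio's implementation.

```python
import numpy as np

def gravity_direction_in_image(K, R, keypoint_xy):
    """Unit 2D direction of gravity at an image point.

    K: 3x3 camera intrinsics; R: world-to-camera rotation, so world
    gravity (0, 0, -1) maps to R @ g in camera coordinates. The
    returned vector can serve as a content-independent keypoint
    orientation when building a feature descriptor.
    """
    g_cam = R @ np.array([0.0, 0.0, -1.0])
    # Back-project the keypoint to a ray, nudge it along gravity,
    # and re-project; the image-space displacement is the direction.
    ray = np.linalg.inv(K) @ np.array([keypoint_xy[0], keypoint_xy[1], 1.0])
    p0 = K @ ray
    p1 = K @ (ray + 1e-3 * g_cam)
    d = (p1 / p1[2])[:2] - (p0 / p0[2])[:2]
    return d / np.linalg.norm(d)  # degenerate only if gravity // viewing ray
```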

 

Multi-texture tracking of planar structures using FIRST, Fast Invariant to Rotation and Scale feature Transform

Rafael Bastos (Vision-Box & ADETTI/ISCTE-IUL), Filipe Gaspar (ADETTI-IUL/ISCTE-IUL), Miguel Sales Dias (MLDC & ISCTE-IUL)

This demo shows the potential of FIRST, a novel feature transform invariant to rotation and scale, in the Augmented Reality field, namely for multiple-texture tracking generalized to a near model-based tracking method. Pre-calibrated static planar structures are used for off-line scene calibration, allowing 3D objects to be registered in the on-line stage with only one plane visible, in real time and with millimetric and sub-degree precision.

 

RGB-D camera-based parallel tracking and meshing

Sebastian Lieberknecht, Andrea Huber (metaio GmbH), Slobodan Ilic (TUM), Selim Benhimane (metaio GmbH)

This demonstration showcases an approach to RGB-D camera-based tracking and meshing. We investigated how a camera like the Microsoft Kinect can be used to simplify the SLAM problem thanks to the additional depth information. Besides tracking the camera's motion, the available per-pixel depth is used to simultaneously create a meshed and textured reconstruction of the environment. The mesh can be used for occlusion of virtual objects, as will be shown using augmented furniture. We further present a live demonstration of how the sparse and meshed maps are built. More details on our approach can be found in our accompanying paper "RGB-D camera-based parallel tracking and meshing" from this year's ISMAR.
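To see why per-pixel depth simplifies the SLAM problem, note that every pixel with a valid depth back-projects directly to a metric 3D point from the intrinsics alone, with no multi-view triangulation. A minimal sketch (our illustration, not the authors' code):

```python
import numpy as np

def backproject_depth(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) into an HxWx3 point cloud.

    Pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy.
    The resulting points can be meshed and textured directly.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.dstack((x, y, depth))

# Toy example: a flat wall 2 m away seen by a tiny 4x4 camera.
cloud = backproject_depth(np.full((4, 4), 2.0), fx=525, fy=525, cx=2, cy=2)
print(cloud[0, 0])  # 3D point for the top-left pixel
```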

 

Augmented video conferencing

Łukasz Boszko, Mathilde Cosquer, Pawel Rubach (Orange Labs) 

Augmented reality technology is little used for interpersonal communication services. This prototype exploits this new opportunity for video conferencing: the developed augmented reality function lets users augment their own video picture with added information, which interlocutors then see above the video picture as clickable icons.

For the user, the application makes it possible to keep the focus on the video, to easily distinguish who owns which content, and to enhance the possibilities for exchange and expression. The prototype also permits cooperative work via Google Docs and browser sharing. This prototype is not a service in itself; it illustrates a function that can be integrated into any videophone service.

 

iConference system - Social networking in a conference using handheld augmented reality technology

Weijian Heng, Yuan Wang,  Henry Been-Lirn Duh (National University of Singapore)

This demonstration showcases a mobile application, iConference, that facilitates networking and ice-breaking among conference attendees using a combination of social networking, facial recognition and augmented reality technologies. Users can scan and identify faces in the crowd using their mobile phones, then obtain relevant profile information retrieved from social networking sites. The faces are tracked and the information is augmented on the screen, giving conference participants an easier and more intuitive way to network.

 

AR-supported Remote Collaborative Experimentation: AR Circuit

Jian Gu, Nai Li, Henry Been-Lirn Duh  (National University of Singapore)

This research demo describes the implementation of a handheld AR-supported collaborative learning system, “AR Circuit”, designed to enhance the effectiveness of remote collaborative experimentation in physics. The application employs the TCP/IP protocol to enable multiplayer functionality in a mobile AR environment. The iPhone functions as the server and the iPad serves as the client: the server captures and processes each video frame and sends the current frame and the markers' transformation matrices to the client.
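For illustration, a frame-plus-poses message of the kind described might be serialized as below; the byte layout is invented for this sketch and is not taken from AR Circuit.

```python
import struct

def pack_frame_message(jpeg_bytes, marker_matrices):
    """Length-prefixed message: one video frame plus marker poses.

    Hypothetical layout: frame length, frame bytes, marker count,
    then each 4x4 transformation matrix as 16 big-endian floats.
    """
    msg = struct.pack("!I", len(jpeg_bytes)) + jpeg_bytes
    msg += struct.pack("!I", len(marker_matrices))
    for m in marker_matrices:
        msg += struct.pack("!16f", *(v for row in m for v in row))
    return msg

# One frame and one marker at the identity pose, ready to send
# over a TCP socket from the server (iPhone) to the client (iPad).
identity = [[float(i == j) for j in range(4)] for i in range(4)]
payload = pack_frame_message(b"\xff\xd8...", [identity])
```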

“AR Circuit” provides hands-on experience with collaborative visualization of electric circuits, enabling distributed individuals to engage in collaborative experimentation. Collaborators can access and manipulate shared virtual electronic components over distance while verbally interacting with each other to jointly solve the problem.

 

Showcasing Layar Vision & Layar Player

Jens de Smit, Ronald van der Lingen (Layar)

Layar is the world’s largest augmented reality platform, available for iPhone and Android mobile phones. Within Layar, AR content is displayed through layers of digital data, while the Layar Player allows seamless integration of AR experiences within the context of any other application.

In this demo, we will showcase Layar Vision and its integration into the Layar Player. With Layar Vision, digital experiences are launched from visual triggers such as magazines, posters and newspapers. Computer vision techniques are used to augment objects in the physical world based on extracted visual features. Layar Vision has been added to the existing location-based layers in the Layar Player, so developers can easily upgrade their layers as well as create completely new, vision-based experiences.

 

Toward Augmenting Everything: Detecting and Tracking Geometrical Features on Planar Objects

Hideaki Uchiyama, Eric Marchand (INRIA Rennes)

We propose an approach for detecting and tracking various types of planar objects with geometric features. In the demonstration, we show that classical textures, handwriting, ARToolKit markers and random dot markers are detected and tracked in real time. The details will be presented in the ISMAR S&T track.

 

A ready-to-use USB memory stick for TRAKMARK evaluation and AR learning

Masayuki Hayashi, Itaru Kitahara, Yoshinari Kameda, Yuichi Ohta (University of Tsukuba)

We present a USB-bootable Linux system designed for AR system evaluation and AR technology learning. We release several USB images ("casper cartridges") on which people can run well-known applications, including ARToolKit, PTAM and OpenCV, as well as a basic kit for camera calibration evaluation on a newly introduced dataset: TRAKMARK.

In the demonstration, we will show how the USB memory stick can easily boot Linux on an ordinary PC (desktop or laptop), and how to run the AR and TRAKMARK applications. If you bring a 4 GB USB memory stick, we will set it up for you.

 

Real-Time Accurate Localization in a Partially Known Environment: Application to Augmented Reality on 3D Objects

Mohamed Tamaazousti, Vincent Gay-Bellile, Sylvie Naudet Collette, Steve Bourgeois (CEA, List), Michel Dhome (LASMEA/ CNRS)

This demo addresses the challenging issue of real-time camera localization in a partially known environment, i.e. one for which a geometric 3D model of a single static object in the scene is available.

We propose a constrained bundle adjustment framework for keyframe-based SLAM that simultaneously includes the geometric constraints provided by the 3D model, the multi-view constraints relative to the known part of the environment (i.e. the object observations), and the multi-view constraints relative to the unknown part of the environment. We use two different model-based constraints to handle both textured and textureless 3D objects.
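In our notation (not necessarily the authors'), the minimized cost can be summarized as reprojection error over both the unknown and the known parts of the environment, with points on the known object additionally constrained to lie on the model surface M:

```latex
\min_{\{C_j\},\,\{Q_i\}} \;
  \sum_{i \in \mathcal{U}} \sum_{j} \bigl\| q_{ij} - \pi(C_j, Q_i) \bigr\|^2
  + \sum_{i \in \mathcal{K}} \sum_{j} \bigl\| q_{ij} - \pi(C_j, Q_i) \bigr\|^2
  \quad \text{s.t.} \quad Q_i \in \mathcal{M} \;\; \forall i \in \mathcal{K}
```

where the C_j are keyframe camera poses, the Q_i are 3D points partitioned into the unknown environment (U) and the known object (K), the q_ij are image observations, and pi is camera projection.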

Consequently, our solution offers both the accuracy of a model-based tracking solution and the robustness of SLAM (handling fast movements, partial or total occlusion, large viewpoint changes, etc.).

 

Augmented Reality glasses try-on

Miaolong Yuan, Ishtiaq Rasool Khan, Farzam Farbiz, Zhiyong Huang, Arthur Niswar (A*STAR Institute)

This is an augmented reality system for virtually trying on eyeglasses. The user can select from the available models displayed on screen with a mouse click and see them fitted on his or her face. A webcam captures the user's image, and the glasses are fitted to the face in real time in the correct pose. The resulting image is displayed on the screen, giving users the impression of trying on the glasses in front of a mirror.

  

Augmenting Magnetic Field Lines for School Experiments

Florian Mannuß, André Hinkenjann (Computer Graphics Lab, Bonn-Rhine-Sieg University)

We present a system for interactive magnetic field simulation in an AR setup. The aim of this work is to investigate how AR technology can help develop a better understanding of the concept of fields and field lines and their relationship to the magnetic forces in typical school experiments. Haptic feedback is provided by real magnets that are optically tracked. In a stereo video see-through head-mounted display, the magnets are augmented with the dynamically computed field lines.
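For context, field lines like these can be traced numerically by stepping along the local field direction of each magnet; a toy dipole sketch (physical constants dropped, not the authors' simulation code):

```python
import numpy as np

def dipole_field(p, m=np.array([0.0, 0.0, 1.0])):
    """Field of a point dipole with moment m, up to a constant factor."""
    r = np.linalg.norm(p)
    return 3.0 * p * np.dot(m, p) / r**5 - m / r**3

def trace_field_line(start, step=0.02, n_steps=50):
    """Trace a field line by stepping along the normalized field."""
    pts = [np.asarray(start, dtype=float)]
    for _ in range(n_steps):
        b = dipole_field(pts[-1])
        pts.append(pts[-1] + step * b / np.linalg.norm(b))
    return np.array(pts)

# One meter of a field line starting on the dipole's equatorial plane;
# the resulting polyline could be rendered over the tracked magnet.
line = trace_field_line([1.0, 0.0, 0.0])
```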

 

SmartAR

Masaki Fukuchi, Takayuki Yoshigahara (System Technology Lab, Sony Corporation)

In this demo, we will present our AR technology, SmartAR, on mobile devices. SmartAR has two main features: object recognition and 3D space recognition, the latter based on single-camera SLAM. It is notably fast and robust enough for various AR applications on mobile devices such as smartphones, tablets and game consoles. We will also demonstrate our intuitive interface for retrieving and operating on information by simply touching detected objects on the screen. Our demos can be viewed on many video-sharing sites: http://www.youtube.com/watch?v=XCEp7udJ2n4

 

Homography-Based Planar Mapping and Tracking: Monocular SLAM on Mobile Phones

Christian Pirchheim, Gerhard Reitmayr (ICG Graz University of Technology)

This demo will show a real-time camera pose tracking and mapping system that uses the assumption of a planar scene to implement a highly efficient mapping algorithm. We demonstrate our system running on a mobile phone (and a laptop PC), mapping and tracking table-sized indoor scenes such as game boards and posters.

 

AR-supported document inspection

Andreas Hartl, Gerhard Reitmayr (ICG Graz University of Technology)

The inspection of documents such as passports or ID cards is carried out a myriad of times each day. Despite involving various tools, such as document readers or ultraviolet lamps, this task still requires the operator to have detailed knowledge of the document in question; at the time of inspection, such information may already be outdated or even unavailable. In this demo we show a prototype system for AR-supported mobile document inspection. A predefined set of documents is recognized by an off-the-shelf smartphone and instantly augmented. The user interacts with the system by changing the relative position of the phone with respect to the document.

With this setup it is possible to interactively select and present document information at various levels of detail, simply by pointing at it.

 

AR Soccer: A Multi-player Augmented Reality Demo on Mobile Phones

Markus Perndorfer, Lukas Gruber (ICG Graz University of Technology)

We present a mobile phone demo called AR Soccer, a multiplayer game in which one or more players per team (each with one phone) play soccer against each other and try to score enough goals to win. The demo is based on a peer-to-peer approach to networking in Studierstube ES called the ICE Connector. The ICE Connector enables an application programmer to register objects of specially designed classes in the network structure, obtain such objects from other peers, and use them in the application as if they were local objects.
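Conceptually this is a distributed object registry with local-feeling proxies. A toy single-process sketch of the pattern (all names hypothetical; the real ICE Connector marshals such calls between peers over the network):

```python
class ObjectRegistry:
    """Toy stand-in for a peer-to-peer shared-object registry."""
    _objects = {}

    @classmethod
    def register(cls, name, obj):
        cls._objects[name] = obj

    @classmethod
    def lookup(cls, name):
        # In a real middleware this would return a network proxy.
        return cls._objects[name]

class GameState:
    def __init__(self):
        self.score = {"team_a": 0, "team_b": 0}

    def goal(self, team):
        self.score[team] += 1

ObjectRegistry.register("game_state", GameState())    # peer 1
ObjectRegistry.lookup("game_state").goal("team_a")    # peer 2, as if local
print(ObjectRegistry.lookup("game_state").score)
```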

 

AR Indoor Navigation using World-in-Miniature

Alessandro Mulloni, Hartmut Seichter (ICG Graz University of Technology)

We present an interface that provides continuous navigational support for indoor scenarios where only sparse localization technology is available. We do so by adapting World-in-Miniature views to the quality of localization through transitions within Mixed Reality: we use Augmented Reality to provide an overview of the whole path at localization points, and Virtual Reality to communicate the upcoming path segment when localization is not available.

 

JavaScript-based Natural Feature Tracking

Christoph Oberhofer, Jens Grubert, Gerhard Reitmayr (ICG Graz University of Technology)

We present a Natural Feature Tracking pipeline written entirely in HTML5 and JavaScript. It runs in real time on desktop computers in all major web browsers supporting WebGL and achieves interactive frame rates on modern smartphones with mobile web browsers. The tracking pipeline will be made available to the community as open source.

 

SmartReality

Jens Grubert, Gerhard Reitmayr (ICG Graz University of Technology), Lyndon Nixon (STI International), James Scicluna (seekda GmbH), Thomas Buchstädter (Play.fm GmbH)

We present a mobile Augmented Reality prototype for context-aware access to digital information connected to physical posters. Unlike existing mobile AR browsers with image recognition capabilities, the system does not load predefined assets but uses Semantic Web technologies to retrieve information relevant to the users' current situation at runtime.

 

SMARTVIDENTE

Stefanie Zollmann, Gerhard Schall, Gerhard Reitmayr (ICG Graz University of Technology)

SMARTVIDENTE is a mobile Augmented Reality system that supports surveying in the field, allowing on-site inspection, planning and data capture. The system targets the visualization of underground infrastructure, such as electrical cables and different kinds of pipes. Applying naïve visualization techniques to underground objects may lead to misinterpretation by the technician; to avoid this, we apply a range of image-based visualization techniques that support better perceptual differentiation.

Another area of interest is support for manual user input in surveying tasks. Planning and data capture require the user to interact intuitively with geo-referenced data. We use a laser range-finder to measure new objects and mouse-pen-based interaction to manipulate objects.

 

Retrospection on Mobile Augmented Reality

Tobias Langlotz, Lukas Gruber, Alessandro Mulloni  (ICG Graz University of Technology), Daniel Wagner (Qualcomm)

Since 2003, a number of researchers in the Handheld AR group at ICG have demonstrated the feasibility of AR on mobile devices. We will give a brief overview of the history of mobile AR and of the tracking and visualization methods that evolved over these years. We will run these demonstrations on devices that were of interest at the time, such as the Dell X51, Gizmondo, HTC iMate and the like.

 

Handheld AR/AV indoor navigation and detailed information with contextual interaction

Masakatsu Kourogi, Koji Makita (AIST), Thomas Vincent (University Joseph Fourier Grenoble 1), Takashi Okuma, Jun Nishida, Tomoya Ishikawa (AIST), Laurence Nigay (University Joseph Fourier Grenoble 1), Takeshi Kurata (AIST / University of Tsukuba)

The demonstration shows a handheld system for indoor navigation to a specific exhibit item followed by detailed information about the exhibit with contextual AR/AV interaction. The system provides the following four key functions: 

  1. Indoor navigation based on a PDR (Pedestrian Dead Reckoning) localization method combined with map matching, using the built-in sensors (3-axis accelerometers, gyroscopes and magnetometers) of a waist-mounted device.
  2. Coarse estimation of location and orientation by making correspondence between Virtualized-Reality (VR) models of environments and images from the handheld device camera.
  3. Fine estimation of location and attitude of the handheld device based on visual AR tracking methods.
  4. Contextual AR/AV (Augmented Virtuality) interaction widgets (e.g., buttons and menus) that provide detailed information about the exhibit. Widgets are contextual according to the relative position of the user to the exhibit.

Any participant can experience the AR/AV system by being directed to search for a target exhibit and then obtaining further detailed information about it.

What makes our demonstration unique is the integration of indoor navigation capabilities with interactive AR/AV functionalities for augmenting an exhibit.

 

Nokia City Scene

David Murphy, Esa Koskinen, Yu You, Ville-Veikko Mattila (Nokia Research Center, Media Technologies Lab), Radek Grzeszczuk (Nokia Research Center, North America Lab)

Nokia City Scene is a stand-alone application for browsing, interacting with, and searching for city data in a mixed-reality modality on mobile devices. The application allows users to browse or search for point-of-interest (POI) data in cities, share locations, and find friends via social networks such as Facebook Places and Foursquare. Features include per-building interaction, with buildings modeled as separate, clickable 3D entities; each building's associated point-of-interest data can be presented directly on the building façade. Nokia City Scene can be seen as a stepping stone between location-based services and AR: it provides mirror-world interaction with the city model as a proxy for in-situ AR, and complements future ubiquitous AR by allowing users to interact with remote locations much as one interacts locally using AR.

 

Transformative Reality: Augmented Reality for Visual Prostheses

Wen Lik Dennis Lui, Damien Browne, Lindsay Kleeman, Tom Drummond, Wai Ho Li (Monash University, Australia)

Visual prostheses such as retinal implants provide bionic vision that is limited in spatial and intensity resolution. Transformative Reality (TR) improves resolution-limited bionic vision by performing real-time transformations of visual and non-visual sensor data into symbolic renderings. Our demo will immerse users in a visual world limited to 25x25 binary dots, simulating low-resolution bionic vision: what does the world look like in 625 bits?
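A simple way to get a feel for the simulated display: block-average a grayscale image into a 25x25 grid and threshold each cell, yielding exactly 625 bits. A sketch of that simulation (our illustration, not the authors' simulator):

```python
import numpy as np

def simulate_bionic_vision(gray, grid=25):
    """Reduce a grayscale image to a grid x grid binary dot pattern.

    Block-averages the image into grid x grid cells and thresholds
    each cell at the global mean: 625 bits for a 25x25 display.
    """
    h, w = gray.shape
    cells = gray[: h - h % grid, : w - w % grid].reshape(
        grid, h // grid, grid, w // grid
    ).mean(axis=(1, 3))
    return cells > cells.mean()

# A bright square on a dark background survives as a cluster of on-dots.
img = np.zeros((250, 250))
img[75:175, 75:175] = 1.0
dots = simulate_bionic_vision(img)
print(int(dots.sum()), "of", dots.size, "dots on")
```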


