The Use of Augmented Reality in Geomatics

Author(s):  
Michał Bednarczyk

User interfaces are in continuous development. As the computing power of modern machines grows, they become more user-friendly and intuitive. Not all solutions gain wide acceptance; some remain mere "curiosities", while others achieve success. Recently, some user interface designers have striven for solutions in which the user has the impression of the system "merging" with reality, and thus of a kind of software integration with the environment. This is achieved by various methods utilizing interfaces controlled by voice or touch. Particularly spectacular and interesting are solutions that integrate a computer-generated image with a real view. This technology is called AR (Augmented Reality), and it is the core of the author's considerations about its application in contemporary surveying and GIS practice. This article presents issues related to the possibilities that lie in the use of this technology in the daily work of the geo-engineer.

2021, Vol 17 (1), pp. 247-255
Author(s):  
Konstantinos CHARISI
Andreas TSIGOPOULOS
Spyridon KINTZIOS
Vassilis PAPATAXIARHIS

Abstract. The paper aims to introduce the ARESIBO project to a wider but targeted audience and to outline its main scope and achievements. ARESIBO stands for "Augmented Reality Enriched Situation awareness for Border security". In recent years, border security has become one of the highest political priorities in the EU and needs the support of every Member State. The ARESIBO project is developed under the HORIZON 2020 EC Research and Innovation programme and is the joint effort of 20 participant entities from 11 countries. Scientific excellence and technological innovation are top priorities, as ARESIBO advances the current state of the art through technological breakthroughs in mobile augmented reality and wearables, robust and secure telecommunications, robot swarming techniques and planning of context-aware autonomous missions, and artificial intelligence (AI), in order to implement user-friendly tools for border and coast guards. The system aims to improve the cognitive capabilities and perception of border guards through intuitive user interfaces that help them acquire improved situation awareness by filtering the huge amount of information available from multiple sources. Ultimately, it will help them respond faster and more effectively when a critical situation occurs.


Author(s):  
A. W. W. Yew
S. K. Ong
A. Y. C. Nee

It is the goal of ubiquitous computing (UbiComp) to hide computers from the users. Instead, everyday objects embedded with computer processing capability become smart objects that act as interfaces to computer software. A challenge with this new paradigm of computing is to create natural and obvious ways for people to interact with objects and receive output from the computer software that these objects serve as interfaces to. In this chapter, a solution is proposed whereby virtual user interfaces are added to smart objects. These virtual interfaces are viewed in augmented reality through personal viewing devices which also allow people to interact directly with them. The implementation of UbiComp environments and personal viewing devices is described in order to illustrate the use of current technology in creating user-friendly UbiComp environments.


Author(s):  
Hanna Poranen
Giancarlo Marafioti
Gorm Johansen
Eivind Sæter

A user interface (UI) is the platform that enables interaction between a human and a machine: the visual part of an information device, such as a computer or software, with which the user interacts. A good user interface design makes operating a machine efficient, safe and user-friendly in a way that gives the desired result. This paper describes a set of guidelines defined for marine autonomous operations, in which many actors, devices and sensors interact. The UI should manage and present a large amount of data in a user-friendly manner, ensuring situation awareness for the operator/user. The design guidelines for the user interface consist of both a work-process part and a content part, also called user experience (UX) design. The work process consists of four sections: manage, plan, operate and evaluate, while the content part focuses on how to present the information. Both parts are detailed and discussed and can be taken as a reference for designing user interfaces, in particular for marine autonomous operations.



2008, Vol 1 (1), pp. 125-146
Author(s):  
D. A. Ham
P. E. Farrell
G. J. Gorman
J. R. Maddison
C. R. Wilson
...

Abstract. The interfaces by which users specify the scenarios to be simulated by scientific computer models are frequently primitive, under-documented and ad hoc text files, which make the model in question difficult and error-prone to use and significantly increase its development cost. In this paper, we present a model-independent system, Spud, which formalises the specification of model input formats in terms of formal grammars. This is combined with an automated graphical user interface, which guides users to create valid model inputs based on the grammar provided, and a generic options-reading module, which minimises the development cost of adding model options. Together, this provides a user-friendly, well-documented, self-validating user interface which is applicable to a wide range of scientific models and which minimises the developer input required to maintain and extend the model interface.
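The core idea, validating a model's input options against a declared grammar instead of parsing an ad hoc text file by hand, can be illustrated with a minimal sketch. This is not Spud's actual API (Spud uses full formal grammars and a compiled options library); all names below are hypothetical, and the "schema" here is just a nested dict of expected types:

```python
def validate(options, schema, path=""):
    """Check a nested options dict against a schema of {key: type or sub-schema}.

    Returns a list of human-readable error strings; an empty list means the
    options are valid. Mirrors, in toy form, how a grammar lets the input
    self-validate before the model ever runs.
    """
    errors = []
    for key, expected in schema.items():
        where = f"{path}/{key}"
        if key not in options:
            errors.append(f"missing required option {where}")
        elif isinstance(expected, dict):
            # Sub-schema: recurse into the nested options tree.
            errors.extend(validate(options[key], expected, where))
        elif not isinstance(options[key], expected):
            errors.append(f"{where}: expected {expected.__name__}, "
                          f"got {type(options[key]).__name__}")
    return errors

# A hypothetical grammar for a simple time-stepping model.
schema = {"timestepping": {"dt": float, "finish_time": float},
          "mesh": {"file": str}}

good = {"timestepping": {"dt": 0.1, "finish_time": 10.0},
        "mesh": {"file": "channel.msh"}}
bad = {"timestepping": {"dt": "0.1"}}  # wrong type, missing options

print(validate(good, schema))  # []
print(validate(bad, schema))   # three errors: bad dt type, missing options
```

A graphical front end of the kind the paper describes can be driven from the same schema: because the grammar enumerates every valid option and its type, the GUI can be generated rather than hand-written, and the model itself only needs a generic reader for the validated tree.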


Author(s):  
Giovanni Mosiello
Andrey Kiselev
Amy Loutfi

Abstract. Mobile Robotic Telepresence (MRP) helps people to communicate in natural ways despite being physically located in different parts of the world. The user interfaces of such systems are as critical as the design and functionality of the robot itself for creating the conditions for natural interaction. This article presents an exploratory study analysing different robot teleoperation interfaces. The goals of this paper are to investigate the possible effect of using augmented reality as the means to drive a robot, to identify key factors of the user interface in order to improve the user experience through a driving interface, and to minimize interface familiarization time for non-experienced users. The study involved 23 participants whose robot driving attempts via different user interfaces were analysed. The results show that an interface incorporating augmented reality resulted in a better driving experience.


Author(s):  
Namratha Birudaraju
Adiraju Prasanth Rao

Many customers shop online, and these users interact with the system only through well-defined user interfaces. There is therefore a need to develop effective user interfaces that are more user-friendly, minimize keyboard operations, and maximize effective interaction with the system. This chapter focuses on designing effective user interfaces for e-commerce applications by considering human parameters.


2013, pp. 465-479
Author(s):  
Victor Pascual Ayats

The Spatial Data Infrastructure of Catalonia (IDEC) was launched in 2002. From the beginning, the Metadata Catalog (MC) service has been considered one of the main components of the infrastructure. Building a metadata catalog is important for any Spatial Data Infrastructure in order to foster resource interoperability and integration. In addition to organizing, classifying, and sorting metadata records, one of the hardest parts of the IDEC was designing web applications that allow users to easily discover and access such geospatial resources. This chapter reviews the different trends in building user-friendly web applications for searching and discovering metadata records, traced through the evolution of the IDEC Geoportal's user interface.


2009 ◽  
pp. 629-644
Author(s):  
Christian Sandor
Gudrun Klinker

In recent years, a number of prototypical demonstrators have shown that augmented reality has the potential to improve manual work processes as much as desktop computers and office tools have improved administrative work (Azuma et al., 2001; Ong & Nee, 2004). Yet, it seems that the "classical concept" of augmented reality is not enough (see also http://www.ismar05.org/IAR). Stakeholders in industry and medicine are reluctant to adopt it wholeheartedly due to current limitations of head-mounted display technology and due to the overall dangers involved in overwhelming a user's view of the real world with virtual information. It is more likely that moderate amounts of augmented reality will be integrated into a more general interaction environment with many displays and devices, involving tangible, immersive, wearable, and hybrid concepts of ubiquitous and wearable computing. We call this emerging paradigm ubiquitous augmented reality (UAR) (MacWilliams, 2005; Sandor, 2005; Sandor & Klinker, 2005). It is not yet clear which UAR-based human-computer interaction techniques will be most suitable for users to simultaneously work within an environment that combines real and virtual elements. Their success is influenced by a large number of design parameters. The overall design space is vast and difficult to understand. In Munich, we have worked on a number of applications for manufacturing, medicine, architecture, exterior construction, sports, and entertainment (a complete list of projects can be found at http://ar.in.tum.de/Chair/ProjectsOverview). Although many of these projects were designed in the short-term context of one-semester student courses or theses, they provided insight into different aspects of design options, illustrating trade-offs for a number of design parameters.
In this chapter, we propose a systematic approach toward identifying, exploring, and selecting design parameters, using three of our projects as examples: PAARTI (Echtler et al., 2003), FataMorgana (Klinker et al., 2002), and a monitoring tool (Kulas, Sandor, & Klinker, 2004). Using a systematic approach of enumerating and exploring a defined space of design options is useful, yet not always feasible. In many cases, the dimensionality of the design space is not known a priori but rather has to be determined as part of the design process. To cover the variety of aspects involved in finding an acceptable solution for a given application scenario, experts with diverse backgrounds (computer science, sensing and display technologies, human factors, psychology, and the application domain) have to collaborate. Due to the highly immersive nature of UAR-based user interfaces, it is difficult for these experts to evaluate the impact of various design options without trying them. Authoring tools and an interactively configurable framework are needed to help experts quickly set up approximate demonstrators of novel concepts, similar to "back-of-the-envelope" calculations and sketches. We have explored how to provide such first-step support to teams of user interface designers (Sandor, 2005). In this chapter, we report on lessons learned in creating authoring tools and a framework for immersive user interfaces for UAR scenarios. By reading this chapter, readers should understand the rationale and the concepts for defining a scheme of different classes of design considerations that need to be taken into account when designing UAR-based interfaces. Readers should see how, for classes with finite numbers of design considerations, systematic approaches can be used to analyze such design options. For less well-defined application scenarios, the chapter presents authoring tools and a framework for exploring interaction concepts.
Finally, a report on lessons learned from implementing such tools and from discussing them within expert teams of user interface designers is intended to provide an indication of progress made thus far and next steps to be taken.

