GEUINF: Real-Time Visualization of Indoor Facilities Using Mixed Reality

Sensors ◽  
2021 ◽  
Vol 21 (4) ◽  
pp. 1123
Author(s):  
David Jurado ◽  
Juan M. Jurado ◽  
Lidia Ortega ◽  
Francisco R. Feito

Mixed reality (MR) enables a novel way to visualize virtual objects in real scenes while respecting physical constraints. This technology arises alongside other significant advances in sensor fusion for human-centric 3D capturing. Recent advances in scanning the user environment, real-time visualization and 3D vision using ubiquitous systems such as smartphones allow us to capture 3D data from the real world. In this paper, a disruptive application for assessing the status of indoor infrastructures is proposed. The installation and maintenance of hidden facilities such as water pipes, electrical lines and air-conditioning ducts, which are usually occluded behind walls, are tedious and inefficient tasks. Most of these infrastructures are digitized, but they cannot be visualized onsite. In this research, we focused on the development of a new application (GEUINF) to be launched on smartphones that are capable of capturing 3D data of the real world by depth sensing. This information is relevant to determining the user's position and orientation. Although previous approaches used fixed markers for this purpose, our application estimates both parameters with centimeter accuracy without them. This novelty is possible because our method is based on a matching process between reconstructed walls of the real world and 3D planes of the replicated world in a virtual environment. Our markerless approach scans planar surfaces of the user environment and then geometrically aligns them with their corresponding virtual 3D entities. In a preprocessing phase, the 2D CAD geometry available from an architectural project is used to generate 3D models of the indoor building structure. In real time, these virtual elements are tracked against the real ones modeled using the ARCore library.
Once the alignment between the virtual and real worlds is done, the application enables visualization, navigation and interaction with the virtual facility networks in real time. Thus, our method may be used by private companies and public institutions responsible for indoor facilities management, and it may also be integrated with other applications focused on indoor navigation.
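The markerless alignment step described above — matching a wall plane reconstructed on the device against its counterpart in the virtual building model — can be sketched as follows. This is a hypothetical simplification (function name, plane representation as unit normal plus offset, and the vertical-walls assumption are ours, not the paper's): with vertical walls, only a yaw rotation and a translation along the wall normal are needed.

```python
import numpy as np

def align_to_virtual_plane(real_n, real_d, virt_n, virt_d):
    """Estimate the rigid correction (yaw + translation) mapping a wall
    plane detected by the device (unit normal real_n, offset real_d,
    i.e. the plane is n . x = d) onto its counterpart in the virtual model.
    Simplifying assumption: walls are vertical, so a rotation about the
    z-axis plus a translation along the wall normal suffices."""
    # Yaw that rotates the detected normal onto the virtual normal (xy-plane)
    yaw = np.arctan2(virt_n[1], virt_n[0]) - np.arctan2(real_n[1], real_n[0])
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    # Rotation preserves the offset (n . x is invariant under joint rotation),
    # so translate along the normal until the offsets match:
    # virt_n . (x + t) = virt_d  =>  t = (virt_d - real_d) * virt_n
    t = (virt_d - real_d) * np.asarray(virt_n, dtype=float)
    return R, t
```

Applying `R` then adding `t` to every reconstructed point moves the detected wall exactly onto the virtual plane; in practice several wall pairs would be combined for a robust estimate.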

2002 ◽  
Vol 11 (2) ◽  
pp. 176-188 ◽  
Author(s):  
Yuichi Ohta ◽  
Yasuyuki Sugaya ◽  
Hiroki Igarashi ◽  
Toshikazu Ohtsuki ◽  
Kaito Taguchi

In mixed reality, occlusions and shadows are important to realize a natural fusion between the real and virtual worlds. In order to achieve this, it is necessary to acquire dense depth information of the real world from the observer's viewing position. The depth sensor must be attached to the see-through HMD of the observer because he/she moves around. The sensor should be small and light enough to be attached to the HMD and should be able to produce a reliable dense depth map at video rate. Unfortunately, however, no such depth sensors are available. We propose a client/server depth-sensing scheme to solve this problem. A server sensor located at a fixed position in the real world acquires the 3-D information of the world, and a client sensor attached to each observer produces the depth map from his/her viewing position using the 3-D information supplied from the server. Multiple clients can share the 3-D information of the server; we call it Share-Z. In this paper, the concept and merits of Share-Z are discussed. An experimental system developed to demonstrate the feasibility of Share-Z is also described.
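The core of the client/server idea — the client producing a depth map from its own viewpoint using the server's 3-D information — can be sketched as a point-cloud reprojection with z-buffering. This is a minimal sketch under assumed conventions (pinhole intrinsics `K`, world-to-camera pose `R`, `t`), not the Share-Z system's actual pipeline:

```python
import numpy as np

def client_depth_map(points_world, R, t, K, width, height):
    """Render a depth map for one client by projecting the server's
    world-space 3-D points into the client's view.
    R, t: the client's world-to-camera rotation and translation.
    K: 3x3 pinhole intrinsic matrix. Z-buffering keeps the nearest
    depth per pixel, which is what occlusion handling requires."""
    depth = np.full((height, width), np.inf)
    cam = points_world @ R.T + t           # world -> camera coordinates
    cam = cam[cam[:, 2] > 0]               # keep points in front of the camera
    proj = cam @ K.T                       # pinhole projection
    u = (proj[:, 0] / proj[:, 2]).astype(int)
    v = (proj[:, 1] / proj[:, 2]).astype(int)
    ok = (u >= 0) & (u < width) & (v >= 0) & (v < height)
    for x, y, z in zip(u[ok], v[ok], cam[ok, 2]):
        depth[y, x] = min(depth[y, x], z)  # z-buffer: nearest surface wins
    return depth
```

Because the heavy sensing happens once at the server, each additional client only pays the cost of this reprojection, which is what lets multiple observers share one set of 3-D data.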


2019 ◽  
Vol 2019 (1) ◽  
pp. 237-242
Author(s):  
Siyuan Chen ◽  
Minchen Wei

Color appearance models have been extensively studied for characterizing and predicting the perceived color appearance of physical color stimuli under different viewing conditions. These stimuli are either surface colors reflecting illumination or self-luminous emitted radiation. With the rapid development of augmented reality (AR) and mixed reality (MR), it is critically important to understand how the color appearance of objects produced by AR and MR is perceived, especially when these objects are overlaid on the real world. In this study, nine lighting conditions, with different correlated color temperature (CCT) levels and light levels, were created in a real-world environment. Under each lighting condition, human observers adjusted the color appearance of a virtual stimulus, which was overlaid on a real-world luminous environment, until it appeared the whitest. It was found that the CCT and light level of the real-world environment significantly affected the color appearance of the white stimulus, especially when the light level was high. Moreover, a lower degree of chromatic adaptation was found when viewing the virtual stimulus overlaid on the real world.


Author(s):  
Ritesh Srivastava ◽  
M.P.S. Bhatia

Twitter behaves as a social sensor of the world. The tweets provided by the Twitter Firehose exhibit the properties of big data (i.e. volume, variety, and velocity). With millions of users, Twitter's virtual communities now replicate real-world communities, and real-world events are consequently often discussed on Twitter. This work performs real-time analysis of tweets related to a targeted event (e.g. an election) to identify potential sub-events that occurred in the real world, were discussed on Twitter, and caused a significant change in the aggregated sentiment score of the targeted event over time. This type of analysis can enrich the real-time decision-making ability of the event bearer. The proposed approach uses a three-step process: (1) real-time sentiment analysis of tweets; (2) application of Bayesian Change Point Detection to determine the sentiment change points; (3) detection of the major sub-events that influenced the sentiment of the targeted event. This work was evaluated on Twitter data from the Delhi Election 2015.
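Step (2) of the pipeline can be sketched with a deliberately simplified Bayesian change-point model. This is our illustration, not the paper's implementation: a single change point in the aggregated sentiment series, Gaussian segments with known variance, and a uniform prior over the change location.

```python
import numpy as np

def change_point_posterior(scores, sigma=1.0):
    """Posterior over a single change point in an aggregated sentiment
    series. Simplified Bayesian treatment: both segments are Gaussian
    with known variance sigma, the segment means are fit by maximum
    likelihood, and the prior over the change location is uniform."""
    n = len(scores)
    log_lik = np.full(n, -np.inf)
    for k in range(2, n - 1):          # split into scores[:k] and scores[k:]
        left, right = scores[:k], scores[k:]
        # Residual sum of squares around each segment's own mean
        ssq = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        log_lik[k] = -ssq / (2 * sigma ** 2)
    post = np.exp(log_lik - log_lik.max())   # normalize in a stable way
    return post / post.sum()
```

The index with the highest posterior mass marks where the sentiment regime shifts; step (3) would then inspect the tweets around that timestamp to name the sub-event.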


Author(s):  
Yulia Fatma ◽  
Armen Salim ◽  
Regiolina Hayami

Along with technological development, applications can be used as a medium for learning. Augmented Reality is a technology that combines two-dimensional and three-dimensional virtual objects with a real three-dimensional environment, projecting the virtual objects in real time. In introducing the Solar System material, students are invited to get to know the planets, which directly encourages them to imagine conditions in the Solar System. Textbook explanations of the planets' forms and of how the planets revolve and rotate are considered insufficient because they display objects only in 2D. In addition, students cannot practice directly in arranging the layout of the planets in the Solar System. By applying Augmented Reality technology, the delivery of learning information can be clarified, because these applications combine the real world and the virtual world. Beyond displaying the material, the application also presents the planets as animated 3D objects with audio.


2006 ◽  
Vol 5 (3) ◽  
pp. 53-58 ◽  
Author(s):  
Roger K. C. Tan ◽  
Adrian David Cheok ◽  
James K. S. Teh

For better or worse, technological advancement has changed the world: at a professional level, working executives are required to spend more hours in the office or on business trips; at a social level, the population (especially the younger generation) is glued to the computer, playing video games or surfing the internet. Traditional leisure activities, especially interaction with pets, have been neglected or forgotten. This paper introduces Metazoa Ludens, a new computer-mediated gaming system which allows pets to play new mixed reality computer games with humans via custom-built technologies and applications. During game-play, the real pet chases a physical movable bait within a predefined area in the real world; an infra-red camera tracks the pet's movements and translates them into the virtual world of the system, mapping them to the movement of a virtual pet avatar running after a virtual human avatar. The human player plays the game by controlling the human avatar's movements in the virtual world, which in turn drives the movement of the physical bait in the real world, so the bait moves as the human avatar does. This unique way of playing a computer game gives rise to a whole new form of mixed reality interaction between the pet owner and her pet, thereby bringing technology and its influence on leisure and social activities to the next level.
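The two coordinate couplings in the loop above — camera pixels to virtual-arena coordinates, and the human avatar's virtual position back to the physical bait — can be sketched as follows. Both functions, their names, and the assumption that the play area fills the camera frame are ours, for illustration only:

```python
def to_virtual(px, py, cam_w, cam_h, arena_w, arena_h):
    """Map a pet position detected in the infra-red camera image (pixels)
    to virtual-arena coordinates. Assumes the tracked play area fills the
    camera frame, so a linear rescale suffices."""
    return px / cam_w * arena_w, py / cam_h * arena_h

def bait_step(bait, human_avatar, speed):
    """Move the physical bait one control step toward the human avatar's
    virtual position, so the bait mirrors the avatar the player controls."""
    dx, dy = human_avatar[0] - bait[0], human_avatar[1] - bait[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= speed:
        return human_avatar          # close enough: snap to the target
    return (bait[0] + speed * dx / dist, bait[1] + speed * dy / dist)
```

Running `to_virtual` on each camera frame and `bait_step` on each control tick closes the mixed-reality loop: the pet chases the bait, the bait chases the avatar, the avatar chases the (virtual) pet.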


1996 ◽  
Vol 28 (4es) ◽  
pp. 187 ◽  
Author(s):  
Lui Sha

2004 ◽  
Vol 41 (02) ◽  
pp. 299-312 ◽  
Author(s):  
Juri Hinz

The purpose of this paper is to analyse the real-time trading of electricity. We address a model for auction-like trading which captures key features of real-world electricity markets. Our main result establishes that, under certain conditions, the expected total payment for electricity is independent of the particular auction type. This result is analogous to the revenue-equivalence theorem known for classical auctions and could contribute to an improved understanding of different electricity market designs and their comparison.
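The main result parallels classical revenue equivalence, which can be stated in a hedged sketch (notation assumed, not the paper's): if two mechanisms $A$ and $B$ induce the same equilibrium allocation rule and give the lowest bidder type the same expected payoff, then the expected total payments coincide.

```latex
% Sketch of the revenue-equivalence statement (assumed notation):
% P^A, P^B denote the total payments under mechanisms A and B.
\mathbb{E}\left[ P^{A} \right] \;=\; \mathbb{E}\left[ P^{B} \right]
```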


2012 ◽  
Author(s):  
Derek Gobel ◽  
Jan Briers ◽  
Frank de Boer ◽  
Ron Cramer ◽  
Kok-Lam Lai ◽  
...  

GEOMATICA ◽  
2014 ◽  
Vol 68 (2) ◽  
pp. 129-134
Author(s):  
Mingqiang Guo ◽  
Ying Huang ◽  
Zhong Xie

The real-time visualization of vector maps is the most common function in CyberGIS, and it is time-consuming, especially as data volumes become larger. How to improve the efficiency of visualizing large vector maps remains a significant research direction for GIS scientists. In this research, we point out that parallel optimization is appropriate for the real-time visualization of large vector maps. The main purpose of this research is to investigate a balanced decomposition approach which can balance the load of each server node in a CyberGIS cluster environment. The load balancer analyzes the spatial characteristics of the map requests and decomposes the real-time viewshed into multiple balanced sub-viewsheds, so as to balance the load of all the server nodes in the cluster. The test results demonstrate that the proposed approach can balance the load in a CyberGIS cluster environment.
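The balanced-decomposition idea can be sketched as splitting the requested viewshed's extent into strips carrying roughly equal work per node. This is our simplification, not the paper's algorithm: load is proxied by feature count along one axis, whereas a real CyberGIS balancer would also weight geometry complexity and use 2-D partitions.

```python
def balanced_strips(xmin, xmax, feature_xs, n_nodes):
    """Decompose a requested viewshed's x-extent into n_nodes sub-viewshed
    strips carrying roughly equal numbers of vector features, so each
    server node in the cluster renders a similar load.
    Simplification: load is proxied by feature count only."""
    xs = sorted(x for x in feature_xs if xmin <= x <= xmax)
    per = len(xs) / n_nodes
    cuts = [xmin]
    for i in range(1, n_nodes):
        cuts.append(xs[int(i * per)])   # quantile boundaries of feature density
    cuts.append(xmax)
    return [(cuts[i], cuts[i + 1]) for i in range(n_nodes)]
```

With a uniform split, a strip covering a dense downtown would stall the whole request; the quantile cuts above instead give each node a similar number of features to render.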

