Ontology-Based Representation and Modelling of Synthetic 3D Content: A State-of-the-Art Review

2017 ◽  
Vol 36 (8) ◽  
pp. 329-353 ◽  
Author(s):  
Jakub Flotyński ◽  
Krzysztof Walczak
2019 ◽  
Vol 8 (5) ◽  
pp. 221 ◽  
Author(s):  
Arttu Julin ◽  
Kaisa Jaalama ◽  
Juho-Pekka Virtanen ◽  
Mikko Maksimainen ◽  
Matti Kurkela ◽  
...  

The Internet has become a major dissemination and sharing platform for 3D content. The utilization of 3D measurement methods can drastically increase the production efficiency of 3D content in a growing number of use cases where 3D documentation of real-life objects or environments is required. We demonstrate a highly automated and integrated content creation process for providing reality-based photorealistic 3D models for the web. Close-range photogrammetry, terrestrial laser scanning (TLS) and their combination are compared using available state-of-the-art tools in a real-life project setting with real-life limitations. Integrating photogrammetry and TLS is a good compromise for both geometric and texture quality. Compared to approaches using only photogrammetry or TLS, it is slower and more resource-heavy, but it combines the complementary advantages of each method, such as direct scale determination from TLS and the superior image quality typical of photogrammetry. The integration is not only beneficial but clearly feasible in production using available state-of-the-art tools, which have become increasingly accessible to non-expert users. Despite the high degree of automation, some manual editing steps are still required in practice to achieve satisfactory results in terms of visual quality. This is mainly due to the current limitations of WebGL technology.
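The direct scale determination mentioned above addresses a known limitation of photogrammetry: image-based reconstruction yields a model in arbitrary units, while TLS delivers metric coordinates. As an illustrative sketch (not from the paper), the scale factor mapping a photogrammetric point set onto corresponding TLS points can be estimated from the ratio of their spreads about their centroids; the function and point names are hypothetical:

```python
import numpy as np

def estimate_scale(photo_pts, tls_pts):
    """Estimate the scale factor mapping an arbitrarily scaled
    photogrammetric point set onto metric TLS coordinates, using
    the ratio of mean centroid distances of corresponding points."""
    photo = np.asarray(photo_pts, dtype=float)
    tls = np.asarray(tls_pts, dtype=float)
    # Distance of each point from its own set's centroid
    d_photo = np.linalg.norm(photo - photo.mean(axis=0), axis=1)
    d_tls = np.linalg.norm(tls - tls.mean(axis=0), axis=1)
    return d_tls.mean() / d_photo.mean()

# Photogrammetric model in arbitrary units; TLS in metres (2x larger here)
photo = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
tls = [(0, 0, 0), (2, 0, 0), (0, 2, 0), (0, 0, 2)]
print(estimate_scale(photo, tls))  # → 2.0
```

In a real pipeline this scale (together with a rigid rotation and translation) would come from a full similarity-transform registration between the two point clouds, but the centroid-distance ratio shows the core idea in a few lines.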


Author(s):  
T. A. Welton

Various authors have emphasized the spatial information resident in an electron micrograph taken with adequately coherent radiation. In view of the completion of at least one such instrument, this opportunity is taken to summarize the state of the art of processing such micrographs. We use the usual symbols for the aberration coefficients, and supplement these with ℓ and δ for the transverse coherence length and the fractional energy spread, respectively. We also assume a weak, biologically interesting sample, with principal interest lying in the molecular skeleton remaining after obvious hydrogen loss and other radiation damage has occurred.
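For a weak-phase (weakly scattering) object such as the one assumed above, image processing is typically framed in terms of the phase-contrast transfer function sin χ(k), where χ depends on defocus and spherical aberration. The following minimal sketch (illustrative only, not taken from the paper; a common sign convention is assumed, and the coherence envelopes governed by ℓ and δ are omitted) evaluates it numerically:

```python
import numpy as np

def ctf(k, lam, Cs, df):
    """Phase-contrast transfer function sin(chi) for a weak-phase object,
    with chi(k) = pi*lam*df*k^2 - (pi/2)*Cs*lam^3*k^4 (one common convention).
    k: spatial frequency (1/m), lam: electron wavelength (m),
    Cs: spherical aberration coefficient (m), df: defocus (m)."""
    chi = np.pi * lam * df * k**2 - 0.5 * np.pi * Cs * lam**3 * k**4
    return np.sin(chi)

# Plausible 100 kV parameters: lam ~ 3.7 pm, Cs = 2 mm, near-Scherzer defocus
lam = 3.7e-12
Cs = 2e-3
df = 1.2 * np.sqrt(Cs * lam)            # ~ Scherzer condition, ~0.1 um
k = np.linspace(0.0, 5e9, 6)            # spatial frequencies, 1/m
print(ctf(k, lam, Cs, df))
```

In practice the finite transverse coherence ℓ and the energy spread δ multiply sin χ by damping envelopes that suppress high spatial frequencies, which is precisely why those two quantities matter for processing.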


Author(s):  
Carl E. Henderson

Over the past few years it has become apparent in our multi-user facility that the computer system and software supplied in 1985 with our CAMECA CAMEBAX-MICRO electron microprobe analyzer have the greatest potential for improvement and updating of any component of the instrument. While the standard CAMECA software running on a DEC PDP-11/23+ computer under the RSX-11M operating system can perform almost any task required of the instrument, the commands are not always intuitive and can be difficult to remember for the casual user (of which our laboratory has many). Given the widespread and growing use of other microcomputers (such as PC’s and Macintoshes) by users of the microprobe, the PDP has become the “oddball” and has also fallen behind the state of the art in terms of processing speed and disk storage capabilities. Upgrade paths within products available from DEC are considered too expensive for the benefits received. After we used a Macintosh for other tasks in the laboratory, such as instrument use and billing records, word processing, and graphics display, its unique and “friendly” user interface suggested an easier-to-use system for computer control of the electron microprobe automation. Specifically, a Macintosh IIx was chosen for its capacity for third-party add-on cards used in instrument control.


2010 ◽  
Vol 20 (1) ◽  
pp. 9-13 ◽  
Author(s):  
Glenn Tellis ◽  
Lori Cimino ◽  
Jennifer Alberti

Abstract The purpose of this article is to provide clinical supervisors with information pertaining to state-of-the-art clinic observation technology. We use a novel video-capture technology, the Landro Play Analyzer, to supervise clinical sessions as well as to train students to improve their clinical skills. We can observe four clinical sessions simultaneously from a central observation center. In addition, speech samples can be analyzed in real time; saved on a CD, DVD, or flash/jump drive; viewed in slow motion; paused; and analyzed with Microsoft Excel. Procedures for applying the technology to clinical training and supervision are discussed.


1995 ◽  
Vol 38 (5) ◽  
pp. 1126-1142 ◽  
Author(s):  
Jeffrey W. Gilger

This paper is an introduction to behavioral genetics for researchers and practitioners in language development and disorders. The specific aims are to illustrate some essential concepts and to show how behavioral genetic research can be applied to the language sciences. Past genetic research on language-related traits has tended to focus on simple etiology (i.e., the heritability or familiality of language skills). The current state of the art, however, suggests that great promise lies in addressing more complex questions through behavioral genetic paradigms. In terms of future goals it is suggested that: (a) more behavioral genetic work of all types should be done—including replications and expansions of preliminary studies already in print; (b) work should focus on fine-grained, theory-based phenotypes with research designs that can address complex questions in language development; and (c) work in this area should utilize a variety of samples and methods (e.g., twin and family samples, heritability and segregation analyses, linkage and association tests, etc.).
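The heritability analyses from twin samples mentioned above are often introduced via Falconer's classic estimate, which doubles the difference between identical (MZ) and fraternal (DZ) twin correlations. A minimal sketch, with hypothetical correlation values for a language-skill phenotype:

```python
def falconer_heritability(r_mz, r_dz):
    """Falconer's estimate of heritability: h^2 = 2 * (r_MZ - r_DZ),
    where r_MZ and r_DZ are trait correlations for identical and
    fraternal twin pairs, respectively."""
    return 2.0 * (r_mz - r_dz)

# Hypothetical twin correlations for a language-skill phenotype
print(falconer_heritability(0.80, 0.50))  # ~0.6
```

The formula rests on the assumptions that MZ twins share all and DZ twins half of their segregating genes and that shared environments are equal across twin types; the more complex designs the paper advocates (segregation, linkage, and association analyses) relax or test such assumptions.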

