Mid-Surfaces of Profile-based Freeforms for Mold Design

1999 ◽  
Vol 121 (2) ◽  
pp. 202-207 ◽  
Author(s):  
A. Fischer ◽  
A. Smolin ◽  
G. Elber

Mid-surfaces of complex thin objects are commonly used in CAD applications for the analysis of casting and injection molding. However, geometrical representation in CAD typically takes the form of a solid representation rather than a mid-surface; therefore, a process for extracting the mid-surface is essential. Contemporary methods for extracting mid-surfaces are based on numerical computations using offsetting techniques or Voronoi diagram processes, where the data is discrete and piecewise linear. These algorithms usually have high computational complexity, and their accuracy is not guaranteed. Furthermore, the geometry and topology of the object are not always preserved. To overcome these problems, this paper proposes a new approach for extracting a mid-surface from a freeform thin object. The proposed method reduces the mid-surface problem to a parametrization problem based on a matching technique, in which a nonlinear optimization function is defined and solved according to mid-surface criteria. The resulting mid-surface is then dictated by a reparametrization process. The algorithm is implemented for freeform ruled, swept, and rotational surfaces, which are commonly used in engineering products. Reducing the problem to the profile curves of these surfaces restricts the computation to a 2D case and thus alleviates the complexity of the 3D case. Error is controlled globally through an iterative refinement process that utilizes continuous symbolic computations on the parametric representation. The feasibility of the proposed method is demonstrated through several examples.
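The matching idea can be illustrated with a minimal 2D sketch (illustrative only, not the authors' symbolic optimization): sample the two profile curves of a thin cross-section, pair each sample on one profile with its closest point on the other, and take the midpoints of the matched pairs as the mid-curve.

```python
import numpy as np

def profile_midcurve(p, q):
    """p, q: (n, 2) and (m, 2) arrays of sampled profile points.
    Returns an (n, 2) array approximating the mid-curve."""
    mids = []
    for pt in p:
        d = np.linalg.norm(q - pt, axis=1)   # distances to all samples of q
        mate = q[np.argmin(d)]               # naive closest-point matching
        mids.append((pt + mate) / 2.0)       # mid-surface criterion: midpoint
    return np.array(mids)

# Two offset profiles of a thin object: a curve and its parallel offsets.
t = np.linspace(0.0, np.pi, 50)
top = np.stack([t, np.sin(t) + 0.1], axis=1)
bot = np.stack([t, np.sin(t) - 0.1], axis=1)
mid = profile_midcurve(top, bot)             # lies close to y = sin(x)
```

A production method would refine the correspondence (as the paper does via nonlinear optimization and reparametrization) rather than rely on closest points, which can fail in regions of high curvature or varying thickness.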

2007 ◽  
Vol 04 (01) ◽  
pp. 125-149 ◽  
Author(s):  
YASAR AYAZ ◽  
KHALID MUNAWAR ◽  
MOHAMMAD BILAL MALIK ◽  
ATSUSHI KONNO ◽  
MASARU UCHIYAMA

Unlike wheeled robots, humanoid robots are able to cross obstacles by stepping over or upon them. Conventional 2D methods for robot navigation fail to exploit this unique ability of humanoids and thus design trajectories only by circumventing obstacles. Recently, global algorithms have been presented that take this feature of humanoids into account; however, due to high computational complexity, most of them are very time consuming. In this paper, we present a new approach to footstep planning in obstacle-cluttered environments that employs a human-like strategy for terrain traversal. Simulation results of its implementation on a model of the Saika-3 humanoid robot are also presented. The algorithm, being reactive in nature, refutes previous claims that reactive algorithms fail to find successful paths in complex obstacle-cluttered environments.
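As a hypothetical illustration of such a reactive, human-like rule (the thresholds below are invented for illustration, not Saika-3 parameters), a planner can pick a locomotion primitive from the height of the obstacle directly ahead:

```python
# Invented example thresholds, in metres.
MAX_STEP_OVER_HEIGHT = 0.05   # obstacles below this can be stepped over
MAX_STEP_UPON_HEIGHT = 0.15   # obstacles below this can be stepped upon

def choose_action(obstacle_height_m):
    """Reactive choice of a locomotion primitive for the obstacle ahead."""
    if obstacle_height_m <= MAX_STEP_OVER_HEIGHT:
        return "step_over"
    if obstacle_height_m <= MAX_STEP_UPON_HEIGHT:
        return "step_upon"
    return "circumvent"       # too tall: fall back to 2D-style avoidance
```

Because the decision depends only on the obstacle currently ahead, the cost per step is constant, which is what distinguishes reactive schemes from the time-consuming global planners mentioned above.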


1991 ◽  
Vol 14 (3) ◽  
pp. 367-385
Author(s):  
Andrzej Jankowski ◽  
Zbigniew Michalewicz

A number of approaches have been taken to represent compound, structured values in relational databases. We review a few such approaches and discuss a new one, in which every set is represented as a Boolean term. We show that this approach generalizes the other approaches, leading to a more flexible representation. Boolean term representation also seems appropriate for handling incomplete information: the approach generalizes several others (e.g., null value marks, null variables). We consider definitions of algebraic operations on such sets, such as join, union, and selection. Moreover, we introduce a measure of the computational complexity of these operations.
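A minimal sketch of the idea in Python (illustrative only, not the paper's formalism): model a set by a Boolean term, i.e. a predicate over a universe of values, so that union and intersection of sets become disjunction and conjunction of their terms, and selection conjoins an extra condition.

```python
# Sets as Boolean terms: a set is the extension of a predicate.
def union(p, q):
    return lambda x: p(x) or q(x)        # disjunction of terms

def intersect(p, q):
    return lambda x: p(x) and q(x)       # conjunction of terms

def select(p, cond):
    return lambda x: p(x) and cond(x)    # selection adds a condition

def extension(p, universe):
    """Enumerate the finite set a Boolean term denotes."""
    return {x for x in universe if p(x)}

evens = lambda x: x % 2 == 0
small = lambda x: x < 5
u = range(10)
members = extension(select(union(evens, small), lambda x: x > 1), u)
```

In this encoding the algebraic operations never materialize intermediate sets; the term is only evaluated when its extension is requested, which is one source of the representation's flexibility.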


2018 ◽  
Vol 1 (1) ◽  
pp. 139-156 ◽  
Author(s):  
Wen-wen Tung ◽  
Ashrith Barthur ◽  
Matthew C. Bowers ◽  
Yuying Song ◽  
John Gerth ◽  
...  

Author(s):  
Faten Mashta ◽  
Mohieddin Wainakh ◽  
Wissam Altabban

Spectrum sensing in cognitive radio must meet difficult and complex requirements, such as speed and sensing accuracy at very low SNRs. In this paper, the authors propose a novel fully blind sequential multistage spectrum sensing detector to overcome the limitations of a single-stage detector and to exploit the advantages of each detector in each stage. In the first stage, energy detection is used because of its simplicity; however, its performance degrades at low SNRs. In the second and third stages, the maximum eigenvalue detector is adopted, with a different smoothing factor in each stage. The maximum eigenvalue detection technique provides good detection performance at low SNRs but requires high computational complexity. In this technique, the probability of detection improves as the smoothing factor increases, at the expense of higher computational complexity. The simulation results illustrate that the proposed detector has better sensing accuracy than the three individual detectors and a computational complexity that lies between the three individual complexities.
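A minimal sketch of the two detector types combined in the multistage scheme, assuming real-valued samples; the thresholds and smoothing factor below are illustrative, not the paper's values.

```python
import numpy as np

def energy_detect(x, threshold):
    """Stage 1: compare the average sample energy with a threshold."""
    return np.mean(np.abs(x) ** 2) > threshold

def max_eigenvalue_detect(x, L, threshold):
    """Stages 2-3: largest eigenvalue of the L x L sample covariance
    matrix (L is the smoothing factor), compared against the average
    power -- a blind ratio test needing no noise-power estimate."""
    n = len(x) - L + 1
    X = np.stack([x[i:i + L] for i in range(n)])  # n windows of length L
    R = (X.T @ X) / n                             # L x L sample covariance
    lam_max = np.max(np.linalg.eigvalsh(R))
    return lam_max > threshold * np.trace(R) / L  # max eigenvalue vs avg power

rng = np.random.default_rng(0)
noise = rng.normal(size=1000)                          # noise-only band
signal = noise + 2.0 * np.sin(0.1 * np.arange(1000))   # sinusoid in noise
```

Chained sequentially, the cheap energy test resolves high-SNR bands first, escalating to the eigenvalue test (with a larger smoothing factor in the later stage) only when needed, which is how the combined detector keeps its complexity between the individual ones.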


2020 ◽  
Vol 10 (3) ◽  
pp. 24
Author(s):  
Stefania Preatto ◽  
Andrea Giannini ◽  
Luca Valente ◽  
Guido Masera ◽  
Maurizio Martina

High Efficiency Video Coding (HEVC) is the latest video standard developed by the Joint Collaborative Team on Video Coding (JCT-VC). HEVC offers better compression than preceding standards but suffers from high computational complexity. In particular, one of the most time-consuming blocks in HEVC is the fractional-sample interpolation filter, which is used in both the encoding and decoding processes. Integrating different state-of-the-art techniques, this paper presents an architecture for interpolation filters that trades quality for energy and power efficiency by exploiting approximate interpolation filters and by halving the amount of required memory with respect to state-of-the-art implementations.
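For context, HEVC's luma half-sample interpolation is an 8-tap FIR filter with coefficients (-1, 4, -11, 40, 40, -11, 4, -1)/64. The sketch below applies it to one row of integer-pel samples; variable names are ours, and a real codec additionally clips results and manages intermediate bit depth.

```python
# HEVC luma half-sample filter taps; they sum to 64, so the result is
# normalized by a 6-bit right shift with rounding.
HALF_PEL_TAPS = (-1, 4, -11, 40, 40, -11, 4, -1)

def interp_half_pel(row, i):
    """Half-sample value between row[i] and row[i+1]; needs three
    integer-pel samples of context on each side."""
    window = row[i - 3:i + 5]                       # 8 integer-pel samples
    acc = sum(c * s for c, s in zip(HALF_PEL_TAPS, window))
    return (acc + 32) >> 6                          # round, divide by 64

flat = [100] * 16                                   # flat area: filter is neutral
interp_half_pel(flat, 8)
```

The eight multiplications per output sample, repeated for every fractional position of every motion-compensated block, are what makes this filter one of the most expensive blocks in the codec and a natural target for approximation.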


2017 ◽  
Vol 23 (4) ◽  
pp. 405-441 ◽  
Author(s):  
PAVEL PUDLÁK

Motivated by the problem of finding finite versions of classical incompleteness theorems, we present some conjectures that go beyond NP ≠ coNP. These conjectures formally connect computational complexity with the difficulty of proving some sentences, which means that high computational complexity of a problem associated with a sentence implies that the sentence is not provable in a weak theory, or requires a long proof. Another reason for putting forward these conjectures is that some results in proof complexity seem to be special cases of such general statements, and we want to formalize and fully understand these statements. Roughly speaking, we are trying to connect syntactic complexity, by which we mean the complexity of sentences and the strengths of the theories in which they are provable, with the semantic concept of complexity of the computational problems represented by these sentences. We have introduced the most fundamental conjectures in our earlier works [27, 33–35]. Our aim in this article is to present them in a more systematic way, along with several new conjectures, and to prove new connections between them and some other statements studied before.
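The classical bridge behind this program is the Cook–Reckhow theorem, a standard background fact (not stated in the abstract) that already ties NP vs. coNP to proof length:

```latex
% Cook--Reckhow (1979): propositional proof length captures NP vs. coNP.
% TAUT denotes the set of propositional tautologies.
\[
  \mathrm{NP} = \mathrm{coNP}
  \iff
  \text{some propositional proof system } P \text{ is polynomially bounded:}\
  \exists c \; \forall \tau \in \mathrm{TAUT} \;
  \text{($\tau$ has a $P$-proof of length} \le |\tau|^{c} + c\text{)}.
\]
```

The conjectures discussed above can be read as strengthenings of the right-hand side, relating the strength of the theory proving a sentence to the complexity of the associated computational problem.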

