Allocation of Dimensional Tolerances for Multiple Loop Planar Mechanisms

1989 ◽  
Vol 111 (4) ◽  
pp. 465-470 ◽  
Author(s):  
R. G. Fenton ◽  
W. L. Cleghorn ◽  
Jing-fan Fu

A method is presented to determine tolerance bands for the dimensions of multiple loop planar mechanisms such that output motions will be kept within specified allowable limits. The kinematic equations of mechanisms are generated by combining various link groups. A preliminary set of estimated tolerance bands is calculated using an analytical technique. An optimization and checking routine is then employed to determine the set of input parameters which satisfies the prescribed output motion requirements. Examples have been included to illustrate the method.
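The multiple-loop formulation itself is not reproduced in this listing, but the flavour of the checking routine can be sketched on a single-loop four-bar linkage: sample the link dimensions inside candidate tolerance bands and record the worst output-angle deviation. The link lengths, tolerance band and input angle below are hypothetical, and a Freudenstein-equation position solution stands in for the paper's generated kinematic equations.

```python
import math, random

def fourbar_output_angle(a, b, c, d, theta2):
    """Open-branch output angle of a four-bar linkage via the Freudenstein equation.
    a: crank, b: coupler, c: rocker (output), d: ground link."""
    K1, K2 = d / a, d / c
    K3 = (a * a - b * b + c * c + d * d) / (2 * a * c)
    A = math.cos(theta2) - K1 - K2 * math.cos(theta2) + K3
    B = -2.0 * math.sin(theta2)
    C = K1 - (K2 + 1.0) * math.cos(theta2) + K3
    disc = B * B - 4.0 * A * C          # >= 0 for an assemblable configuration
    return 2.0 * math.atan2(-B - math.sqrt(disc), 2.0 * A)

random.seed(1)
nominal = (1.0, 3.0, 3.0, 4.0)   # hypothetical Grashof crank-rocker
tol = 0.002                      # candidate +/- tolerance band on every link
theta2 = math.radians(60.0)
ref = fourbar_output_angle(*nominal, theta2)

# Checking step: worst output-angle deviation over sampled dimension sets.
worst = max(
    abs(fourbar_output_angle(*[x + random.uniform(-tol, tol) for x in nominal], theta2) - ref)
    for _ in range(2000)
)
```

If `worst` exceeds the allowable output deviation, the candidate bands would be tightened; the paper's optimization routine automates this adjustment for multiple loops.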

2019 ◽  
Vol 25 (2) ◽  
pp. 394-402 ◽  
Author(s):  
Reinout Heijungs

Introduction: The Monte Carlo technique is widely used and recommended for including uncertainties in LCA. Typically, 1000 or 10,000 runs are done, but a clear argument for that number is not available, and with the growing size of LCA databases, an excessively high number of runs may be time-consuming. We therefore investigate whether a large number of runs is useful, or whether it might be unnecessary or even harmful.
Probability theory: We review the standard theory of probability distributions for describing stochastic variables, including the combination of different stochastic variables in a calculation. We also review the standard theory of inferential statistics for estimating a probability distribution given a sample of values. For estimating the distribution of a function of probability distributions, two major techniques are available: an analytical one, applying probability theory, and a numerical one, using Monte Carlo simulation. Because the analytical technique is often unavailable, the obvious way out is Monte Carlo. However, we demonstrate and illustrate that it leads to overly precise conclusions on the values of estimated parameters, and to incorrect hypothesis tests.
Numerical illustration: We demonstrate the effect for two simple cases: one system in a stand-alone analysis, and a comparative analysis of two alternative systems. Both cases illustrate that statistical hypotheses that should not be rejected are in fact rejected in a highly convincing way, pointing out a fundamental flaw.
Discussion and conclusions: Apart from the obvious recommendation to use larger samples for estimating input distributions, we suggest restricting the number of Monte Carlo runs to a number not greater than the sample sizes used for the input parameters. As a final note, when the input parameters are not estimated from samples but through a procedure, such as the popular pedigree approach, the Monte Carlo approach should not be used at all.
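The over-precision effect can be sketched in a few lines: fit a distribution to a small input sample (here a hypothetical n = 10 Gaussian sample), then note that the Monte Carlo confidence interval on the output mean shrinks like 1/√runs, far below the honest uncertainty floor set by the input sample size. All numbers are illustrative, not from the paper.

```python
import math, random, statistics

random.seed(42)
true_mu, true_sigma = 10.0, 2.0
n = 10                                    # small input sample, as in practice
sample = [random.gauss(true_mu, true_sigma) for _ in range(n)]
mu_hat = statistics.mean(sample)
sigma_hat = statistics.stdev(sample)      # fitted input distribution

def mc_ci_halfwidth(runs):
    """95% CI half-width on the output mean from a Monte Carlo of given size."""
    draws = [random.gauss(mu_hat, sigma_hat) for _ in range(runs)]
    return 1.96 * statistics.stdev(draws) / math.sqrt(runs)

hw_small = mc_ci_halfwidth(100)
hw_large = mc_ci_halfwidth(100_000)       # shrinks like 1/sqrt(runs)
# Honest floor: uncertainty of mu_hat itself, fixed by the n = 10 input sample.
hw_floor = 1.96 * sigma_hat / math.sqrt(n)
```

With 100,000 runs the Monte Carlo interval is roughly a hundred times narrower than the floor, which is exactly the spurious precision the paper warns against.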


Author(s):  
Wu DeRong

The dynamic equations of planar mechanisms with flexible pin shafts and clearance connections are developed here using a Lagrangian formulation. The general flexible-connection planar mechanism can be described by a damped, circulatory, gyroscopic system. Quite often the damping and circulatory forces are small, and one may regard them as perturbations. Accordingly, a perturbation method for obtaining the eigensolution and dynamic response is presented, taking the undamped, non-circulatory, gyroscopic system as the unperturbed system. The procedure may be used to investigate the fundamental nature of the dynamic interaction between the elastic deflection of pin shafts and clearance connections. A major objective of this paper, in addition to the development of the analytical technique, is to obtain some insight into the phenomena at flexible connections, so that these effects may be included in the design of mechanisms.
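As a minimal numerical sketch of the unperturbed system (not the paper's formulation), consider a 2-DOF undamped, non-circulatory gyroscopic system q̈ + Gq̇ + Kq = 0 with skew-symmetric G: its characteristic polynomial is biquadratic and its eigenvalues are purely imaginary, which is the marginally stable starting point a perturbation method builds on. The stiffness and gyroscopic values below are hypothetical.

```python
import cmath

def gyro_eigenvalues(k1, k2, g):
    """Eigenvalues of q'' + G q' + K q = 0 with G = [[0, g], [-g, 0]],
    K = diag(k1, k2).  det(l^2 I + l G + K) expands to the biquadratic
    l^4 + (k1 + k2 + g^2) l^2 + k1 k2 = 0, solved here in closed form."""
    b = k1 + k2 + g * g
    c = k1 * k2
    disc = cmath.sqrt(b * b - 4.0 * c)
    roots = []
    for mu in ((-b + disc) / 2.0, (-b - disc) / 2.0):   # roots in l^2
        r = cmath.sqrt(mu)
        roots += [r, -r]
    return roots

eigs = gyro_eigenvalues(4.0, 9.0, 1.0)   # hypothetical stiffnesses and spin rate
```

Both l² roots are negative real, so all four eigenvalues sit on the imaginary axis; small damping and circulatory terms then shift them off the axis, which is what the perturbation expansion tracks.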


Author(s):  
C. Colliex ◽  
P. Trebbia

The physical foundations for the use of electron energy loss spectroscopy for analytical purposes now seem rather well established and have been extensively discussed in recent publications. In this brief review we intend only to mention the most recent developments in this field that have come to our knowledge. We also offer some lines of discussion to define more clearly the limits of this analytical technique in materials science problems. The spectral information carried in both the low (0 < ΔE < 100 eV) and high (ΔE > 100 eV) energy regions of the loss spectrum is capable of providing quantitative results. Spectrometers have therefore been designed to work with all kinds of electron microscopes and to cover large energy ranges for the detection of inelastically scattered electrons (for instance, the L-edge of molybdenum at 2500 eV has been measured by van Zuylen with primary electrons of 80 kV). It is rather easy to fit post-specimen magnetic optics on a STEM, but Crewe has recently underlined that great care should be devoted to optimizing the collecting power and the energy resolution of the whole system.


Author(s):  
A. M. Bradshaw

X-ray photoelectron spectroscopy (XPS or ESCA) was not developed by Siegbahn and co-workers as a surface analytical technique, but rather as a general probe of electronic structure and chemical reactivity. The method is based on the phenomenon of photoionisation: the absorption of monochromatic radiation in the target material (free atoms, molecules, solids or liquids) causes electrons to be ejected into the vacuum continuum. Pseudo-monochromatic laboratory light sources (e.g. AlKα) have mostly been used hitherto for this excitation; in recent years synchrotron radiation has become increasingly important. A kinetic energy analysis of the so-called photoelectrons gives rise to a spectrum which consists of a series of lines corresponding to each discrete core and valence level of the system. The measured binding energy, EB, given by EB = hν − EK, where EK is the kinetic energy relative to the vacuum level, may be equated with the orbital energy derived from a Hartree-Fock SCF calculation of the system under consideration (Koopmans' theorem).
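The relation EB = hν − EK is a one-line computation; the sketch below uses the textbook Al Kα photon energy of 1486.6 eV and a hypothetical measured kinetic energy.

```python
AL_K_ALPHA_EV = 1486.6   # textbook Al K-alpha photon energy, eV

def binding_energy(photon_energy_eV, kinetic_energy_eV):
    """Binding energy E_B = h*nu - E_K, with E_K referred to the vacuum level."""
    return photon_energy_eV - kinetic_energy_eV

eb = binding_energy(AL_K_ALPHA_EV, 1202.0)   # hypothetical photoelectron line
```

The resulting 284.6 eV happens to match the familiar C 1s region, illustrating how a measured line is assigned to a core level.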


Author(s):  
S.J. Krause ◽  
W.W. Adams

Over the past decade low voltage scanning electron microscopy (LVSEM) of polymers has evolved from an interesting curiosity to a powerful analytical technique. This development has been driven by improved instrumentation, in particular reliable field emission gun (FEG) SEMs. The usefulness of LVSEM has also grown because of an improved theoretical and experimental understanding of sample-beam interactions, and because of advances in sample preparation and operating techniques. This paper will review progress in polymer LVSEM and present recent results and developments in the field.
In the early 1980s a new generation of SEMs produced beam currents that were sufficient to allow imaging at low voltages from 5 keV to 0.5 keV. Thus, for the first time, it became possible to routinely image uncoated polymers at voltages below their negative charging threshold, the "second crossover", E2 (Fig. 1). LVSEM also improved contrast and reduced beam damage in sputter metal coated polymers. Unfortunately, resolution was limited to a few tenths of a micron by the low brightness and chromatic aberration of thermal electron emission sources.


2005 ◽  
Vol 10 (1) ◽  
pp. 65-75 ◽  
Author(s):  
Z. Kala

The load-carrying capacity of a member with imperfections under axial compression is analysed in the present paper. The study is divided into two parts: (i) in the first, the input parameters are considered to be random variables (with probability distribution functions obtained from experimental results and/or tolerance standards), while (ii) in the second, the input parameters are considered to be fuzzy numbers (with membership functions). The load-carrying capacity was calculated by a geometrically nonlinear finite element analysis of a beam. In case (ii), the membership function of the load-carrying capacity was determined by applying fuzzy set theory, whereas in case (i) its probability distribution function was determined. For the stochastic solution (i), Monte Carlo numerical simulation was applied; for the fuzzy solution (ii), the method of so-called α-cuts was applied. The design load-carrying capacity was determined according to the EC3 and EN 1990 standards. The results of the fuzzy, stochastic and deterministic analyses are compared in the concluding part of the paper.
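The α-cut propagation of part (ii) can be sketched for a deliberately simplified, monotone resistance model R = f_y·A (the paper's geometrically nonlinear FEM capacity is not reproduced); the triangular fuzzy numbers below are hypothetical.

```python
def alpha_cut(tri, alpha):
    """Interval (alpha-cut) of a triangular fuzzy number (low, modal, high)."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

# Hypothetical squash resistance R = fy * A, monotone increasing in both inputs,
# so interval arithmetic on the alpha-cuts yields the alpha-cut of R directly.
fy = (215.0, 235.0, 255.0)       # yield stress, MPa (triangular fuzzy number)
A = (9.6e-4, 1.0e-3, 1.04e-3)    # cross-section area, m^2 (triangular fuzzy number)

def resistance_cut(alpha):
    (f_lo, f_hi), (a_lo, a_hi) = alpha_cut(fy, alpha), alpha_cut(A, alpha)
    return (f_lo * a_lo * 1e3, f_hi * a_hi * 1e3)   # kN

cuts = {alpha: resistance_cut(alpha) for alpha in (0.0, 0.5, 1.0)}
```

Sweeping α from 0 to 1 narrows the interval from the full support down to the modal value, tracing out the membership function of the capacity; a non-monotone model such as a buckling curve would instead require optimisation over each cut.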


2017 ◽  
Vol 5 (1) ◽  
pp. 8-15
Author(s):  
Sergii Hilgurt

Multi-pattern matching is a fundamental technique found in applications such as network intrusion detection systems, anti-virus and anti-worm software, and other signature-based information security tools. Due to rising traffic rates, the increasing number and sophistication of attacks, and the collapse of Moore's law, traditional software solutions can no longer keep up. Therefore, hardware approaches are frequently used by developers to accelerate pattern matching. Reconfigurable FPGA-based devices, providing the flexibility of software and near-ASIC performance, have become increasingly popular for this purpose. Hence, increasing the efficiency of reconfigurable information security tools is now a scientific issue. Many different approaches to constructing hardware matching circuits on FPGAs are known. The most widely used are based on discrete comparators, hash functions and finite automata. Each approach possesses its own pros and cons, and none of them has yet become dominant. In this paper, a method to combine several different approaches so as to exploit their advantages is developed. An analytical technique is proposed to quickly estimate, in advance, the resource costs of each matching scheme without the need to compile an FPGA project. It allows optimization procedures to be applied that near-optimally split the set of patterns between the different approaches in acceptable time.
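A sketch of the splitting idea, with deliberately invented per-scheme cost models (the paper's actual resource estimates depend on the FPGA architecture and are not reproduced): each pattern is assigned greedily to whichever scheme the analytical cost model says is cheapest, with no FPGA compilation in the loop.

```python
# Hypothetical per-scheme LUT cost models -- illustrative only.
def cost_comparator(p): return 8 * len(p)        # discrete comparators scale with length
def cost_hash(p):       return 40 + 2 * len(p)   # hash unit: fixed core + small per-char cost
def cost_automaton(p):  return 12 * len(set(p))  # automaton benefits from a small alphabet

SCHEMES = {"comparator": cost_comparator, "hash": cost_hash, "automaton": cost_automaton}

def split_patterns(patterns):
    """Greedy near-optimal split: send each pattern to its cheapest scheme."""
    assignment = {name: [] for name in SCHEMES}
    for p in patterns:
        best = min(SCHEMES, key=lambda s: SCHEMES[s](p))
        assignment[best].append(p)
    return assignment
```

Because the cost models are closed-form, the split over thousands of signatures is computed in milliseconds, which is what makes iterating an optimization procedure feasible.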


2015 ◽  
Vol 10 (3) ◽  
pp. 2825-2833
Author(s):  
Achala Nargund ◽  
R Madhusudhan ◽  
S B Sathyanarayana

In this paper, the homotopy analysis method (HAM) is applied to the nonlinear coupled differential equations of the classical Boussinesq system. We have applied HAM to the application problems in [1, 2, 3, 4]. We have also plotted the Domb-Sykes plot for the region of convergence. We have applied Padé approximants to the HAM series to identify the singularity and reflect it in the graph. HAM is an analytical technique used to solve nonlinear problems by generating a convergent series. HAM gives complete freedom to choose the initial approximation of the solution, and it is the auxiliary parameter h which gives a convenient way to guarantee the convergence of the homotopy series solution. It seems that more artificial degrees of freedom imply a greater possibility of obtaining better approximations by HAM.
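The Domb-Sykes construction can be sketched on a series whose singularity is known in closed form, log(1+x) with x0 = −1 (not the Boussinesq series from the paper): plot successive coefficient ratios a_n/a_{n−1} against 1/n; the intercept as 1/n → 0 estimates 1/x0, the reciprocal of the nearest singularity.

```python
# Illustrative series: log(1 + x) = sum_{n>=1} (-1)**(n+1) * x**n / n,
# whose nearest singularity sits at x0 = -1.
def coeff(n):
    return (-1) ** (n + 1) / n

# Domb-Sykes data: (1/n, a_n / a_{n-1}); here the ratio is -(n-1)/n exactly.
ratios = [(1 / n, coeff(n) / coeff(n - 1)) for n in range(2, 40)]

# Eliminate the 1/n term with two successive ratios: A = n*r_n - (n-1)*r_{n-1},
# which extrapolates the intercept A = 1/x0 = -1.
intercept = 39 * ratios[-1][1] - 38 * ratios[-2][1]
```

The same construction applied to HAM coefficients (where no closed form exists) locates the singularity numerically, which is then reflected in the Padé-resummed graph.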

