Computationally efficient winding loss calculation with multiple windings, arbitrary waveforms, and two-dimensional or three-dimensional field geometry

2001 ◽  
Vol 16 (1) ◽  
pp. 142-150 ◽  
Author(s):  
C.R. Sullivan

1995 ◽ 
Vol 291 ◽  
pp. 369-392 ◽  
Author(s):  
Ronald D. Joslin

The spatial evolution of three-dimensional disturbances in an attachment-line boundary layer is computed by direct numerical simulation of the unsteady, incompressible Navier–Stokes equations. Disturbances are introduced into the boundary layer by harmonic sources that involve unsteady suction and blowing through the wall. Various harmonic-source generators are implemented on or near the attachment line, and the disturbance evolutions are compared. Previous two-dimensional simulation results and nonparallel theory are compared with the present results. The three-dimensional simulation results for disturbances with quasi-two-dimensional features indicate growth rates only a few percent larger than pure two-dimensional results; however, the results are close enough to enable the use of the more computationally efficient, two-dimensional approach. In practice, however, true three-dimensional disturbances are more likely and are more stable than two-dimensional disturbances. Disturbances generated off (but near) the attachment line spread both away from and toward the attachment line as they evolve. The evolution pattern is comparable to wave packets in flat-plate boundary-layer flows. Suction stabilizes the quasi-two-dimensional attachment-line instabilities, and blowing destabilizes them; these results qualitatively agree with the theory. Furthermore, suction stabilizes the disturbances that develop off the attachment line. Clearly, disturbances that are generated near the attachment line can supply energy to attachment-line instabilities, but suction can be used to stabilize these instabilities.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Siewert Hugelier ◽  
Wim Vandenberg ◽  
Tomáš Lukeš ◽  
Kristin S. Grußmayer ◽  
Paul H. C. Eilers ◽  
...  

Abstract
Sub-diffraction or super-resolution fluorescence imaging allows the visualization of cellular morphology and interactions at the nanoscale. Statistical analysis methods such as super-resolution optical fluctuation imaging (SOFI) obtain an improved spatial resolution by analyzing fluorophore blinking but can be perturbed by the presence of non-stationary processes such as photodestruction or fluctuations in the illumination. In this work, we propose to use Whittaker smoothing to remove these smooth signal trends and retain only the information associated with the independent blinking of the emitters, thus enhancing the SOFI signals. We find that our method works well to correct photodestruction, especially when it occurs quickly. The resulting images show a much higher contrast, strongly suppressed background and a more detailed visualization of cellular structures. Our method is parameter-free and computationally efficient, and can be readily applied to both two-dimensional and three-dimensional data.
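The Whittaker smoother the authors build on is a penalized least-squares filter: it minimizes a fit term plus a roughness penalty on the d-th differences. A minimal sketch of using it to remove a slow photodestruction trend from a per-pixel intensity trace (the trace, decay constant, and penalty weight here are illustrative, not the paper's data or parameters):

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def whittaker_smooth(y, lam=1e4, d=2):
    """Whittaker smoother: minimize ||y - z||^2 + lam * ||D^d z||^2,
    solved as (I + lam * D'D) z = y with a sparse difference matrix D."""
    m = len(y)
    D = sparse.eye(m, format="csc")
    for _ in range(d):
        D = D[1:] - D[:-1]          # d-th order difference matrix
    A = sparse.eye(m, format="csc") + lam * (D.T @ D)
    return spsolve(A.tocsc(), y)

# Detrend a blinking trace: subtract the smooth trend (the slow
# photodestruction), keeping only the fast fluctuations SOFI analyzes.
t = np.arange(500)
trend = np.exp(-t / 300.0)                  # slow photobleaching decay
rng = np.random.default_rng(0)
blink = 0.1 * rng.standard_normal(500)      # stand-in for emitter blinking
y = trend + blink
residual = y - whittaker_smooth(y, lam=1e4)
```

Because the penalty vanishes on polynomials of degree below d, a second-difference smoother passes linear trends through unchanged, which is one way to sanity-check an implementation.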


1982 ◽  
Vol 22 (1) ◽  
pp. 205 ◽  
Author(s):  
G. M. Philip ◽  
D. F. Watson

Although the petroleum geologist is concerned with analysing three-dimensional data, he relies entirely on two-dimensional portrayals - cross-sections and particularly contour maps of all types. With the advent of digital computers, machine contouring has become increasingly common, but little attention has been directed to the limitations of the various algorithms that can be employed to generate contour maps from a set of control points. For example, it is not widely appreciated that contouring procedures which faithfully honour the value of original control points produce poor predictions at locations where no control is available. Contouring a published set of topographic data shows how this and other limitations lead to approximations and errors in machine-generated contours.

A new method based on triangulation interpolation using Delaunay tessellations (deltri analysis) is superior to existing methods. Not only does the method give the most accurate and objective measurement and display of the contoured surface, but it is also computationally efficient. Rapid calculation of volume of closure over contoured structures is possible. The method also allows estimation of the adequacy of the data on which the contouring is based by introducing a measure of 'roughness' of the surface. This is achieved by analysing the directions of normals to triangles surrounding each control point.
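The triangulation interpolation described can be sketched with modern library routines; scipy's Delaunay tessellation and piecewise-linear interpolation stand in for the authors' deltri implementation, and the control points and dipping-plane surface are illustrative:

```python
import numpy as np
from scipy.spatial import Delaunay
from scipy.interpolate import LinearNDInterpolator

# Scattered control points (x, y) with values z, e.g. depths to a horizon.
rng = np.random.default_rng(1)
pts = rng.uniform(0, 10, size=(50, 2))
pts = np.vstack([pts, [[0, 0], [0, 10], [10, 0], [10, 10]]])  # cover corners
z = 2.0 * pts[:, 0] - 0.5 * pts[:, 1] + 3.0   # a dipping plane, for checking

tri = Delaunay(pts)                    # Delaunay tessellation of control points
interp = LinearNDInterpolator(tri, z)  # piecewise-linear values on triangles

# Values on a regular grid, ready for contouring. Unlike some gridding
# algorithms, this interpolation honours every control point exactly.
gx, gy = np.meshgrid(np.linspace(1, 9, 50), np.linspace(1, 9, 50))
grid = interp(gx, gy)
```

A useful property of the linear-on-triangles scheme is that it reproduces any planar surface exactly, so errors appear only where the surface itself curves between control points.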


Geophysics ◽  
1991 ◽  
Vol 56 (11) ◽  
pp. 1778-1785 ◽  
Author(s):  
Dave Hale

Three‐dimensional seismic wavefields may be extrapolated in depth, one frequency at a time, by two‐dimensional convolution with a circularly symmetric, frequency‐ and velocity‐dependent filter. This depth extrapolation, performed for each frequency independently, lies at the heart of 3-D finite‐difference depth migration. The computational efficiency of 3-D depth migration depends directly on the efficiency of this depth extrapolation. McClellan transformations provide an efficient method for both designing and implementing two‐dimensional digital filters that have a particular form of symmetry, such as the circularly symmetric depth extrapolation filters used in 3-D depth migration. Given the coefficients of one‐dimensional, frequency‐ and velocity‐dependent filters used to accomplish 2-D depth migration, McClellan transformations lead to a simple and efficient algorithm for 3-D depth migration. 3-D depth migration via McClellan transformations is simple because the coefficients of two‐dimensional depth extrapolation filters are never explicitly computed or stored; only the coefficients of the corresponding one‐dimensional filter are required. The algorithm is computationally efficient because the cost of applying the two‐dimensional extrapolation filter via McClellan transformations increases only linearly with the number of coefficients N in the corresponding one‐dimensional filter. This efficiency is not intuitively obvious, because the cost of convolution with a two‐dimensional filter is generally proportional to N². Computational efficiency is particularly important for 3-D depth migration, for which long extrapolation filters (large N) may be required for accurate imaging of steep reflectors.
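The linear-in-N cost comes from applying the 2-D filter as a Chebyshev recursion in a small transform kernel rather than as an explicit 2-D convolution. A minimal sketch of that structure (the 3x3 kernel below is the standard first-order McClellan transform, not Hale's improved migration kernels, and the coefficients c are assumed to be already expressed in Chebyshev form):

```python
import numpy as np
from scipy.ndimage import convolve

# Standard 3x3 McClellan transform kernel: substitutes cos(w) in the 1-D
# filter response with an approximately circularly symmetric 2-D function.
T = np.array([[1.0, 2.0, 1.0],
              [2.0, -4.0, 2.0],
              [1.0, 2.0, 1.0]]) / 8.0

def mcclellan_apply(c, x):
    """Apply the 2-D filter sum_n c[n] * T_n(transform) to image x via the
    Chebyshev recursion T_n = 2 t T_{n-1} - T_{n-2}; the 2-D filter
    coefficients are never formed, and cost grows linearly with len(c)."""
    p_prev = x                               # T_0 term: identity
    p_curr = convolve(x, T, mode="nearest")  # T_1 term: one kernel pass
    out = c[0] * p_prev
    if len(c) > 1:
        out = out + c[1] * p_curr
    for n in range(2, len(c)):
        p_next = 2.0 * convolve(p_curr, T, mode="nearest") - p_prev
        out = out + c[n] * p_next
        p_prev, p_curr = p_curr, p_next
    return out
```

Each additional coefficient costs one 3x3 convolution of the whole image, so N coefficients cost O(N) kernel passes instead of the O(N²) of a direct 2-D convolution with an N x N filter.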


Author(s):  
H.A. Cohen ◽  
T.W. Jeng ◽  
W. Chiu

This tutorial will discuss the methodology of low dose electron diffraction and imaging of crystalline biological objects, the problems of data interpretation for two-dimensional projected density maps of glucose embedded protein crystals, the factors to be considered in combining tilt data from three-dimensional crystals, and finally, the prospects of achieving a high resolution three-dimensional density map of a biological crystal. This methodology will be illustrated using two proteins under investigation in our laboratory, the T4 DNA helix destabilizing protein gp32*I and the crotoxin complex crystal.


Author(s):  
B. Ralph ◽  
A.R. Jones

In all fields of microscopy there is an increasing interest in the quantification of microstructure. This interest may stem from a desire to establish quality control parameters or may have a more fundamental requirement involving the derivation of parameters which partially or completely define the three dimensional nature of the microstructure. This latter category of study may arise from an interest in the evolution of microstructure or from a desire to generate detailed property/microstructure relationships. In the more fundamental studies some convolution of two-dimensional data into the third dimension (stereological analysis) will be necessary.

In some cases the two-dimensional data may be acquired relatively easily without recourse to automatic data collection and further, it may prove possible to perform the data reduction and analysis relatively easily. In such cases the only recourse to machines may well be in establishing the statistical confidence of the resultant data. Such relatively straightforward studies tend to result from acquiring data on the whole assemblage of features making up the microstructure. In this whole-field data mode, when parameters such as phase volume fraction, mean size etc. are sought, the main case for resorting to automation is in order to perform repetitive analyses since each analysis is relatively easily performed.
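For the whole-field parameters mentioned, classical stereology makes the individual measurement itself simple: for a random section through an isotropic microstructure, the volume fraction equals the area fraction on the section, which in turn equals the point fraction of a superimposed test grid (the Delesse/Glagolev relations). A sketch on a synthetic binary section (illustrative data, not from the text):

```python
import numpy as np

# A binary 2-D section: 1 where the second phase is present, 0 in the matrix.
rng = np.random.default_rng(2)
section = (rng.random((512, 512)) < 0.3).astype(int)  # ~30% second phase

# Delesse: volume fraction V_V = area fraction A_A on a random section.
area_fraction = section.mean()

# Glagolev point counting: V_V is also estimated by the fraction of
# regularly spaced test points that land on the phase.
grid = section[::32, ::32]          # 16 x 16 test-point grid
point_fraction = grid.mean()
```

The point-count estimate is cheaper but noisier, which is exactly why repetitive analyses over many fields, and hence automation, become attractive when tight statistical confidence is required.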


Author(s):  
Yu Liu

The image obtained in a transmission electron microscope is the two-dimensional projection of a three-dimensional (3D) object. The 3D reconstruction of the object can be calculated from a series of projections by back-projection, but this algorithm assumes that the image is linearly related to a line integral of the object function. However, there are two kinds of contrast in electron microscopy, scattering and phase contrast, of which only the latter is linearly related to the optical density (OD) in the micrograph. Therefore the OD can be used as a measure of the projection only for thin specimens where phase contrast dominates the image. For thick specimens, where scattering contrast predominates, an exponential absorption law holds, and a logarithm of OD must be used. However, for large thicknesses, the simple exponential law might break down due to multiple and inelastic scattering.
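The logarithmic correction for scattering contrast can be illustrated directly. In this sketch the incident intensity I0, absorption coefficient mu, and thickness map are illustrative synthetic values, and the simple exponential absorption law is assumed to hold:

```python
import numpy as np

# Synthetic specimen: a map of projected thickness t (arbitrary units).
rng = np.random.default_rng(3)
thickness = rng.uniform(50, 200, size=(64, 64))

I0 = 1000.0     # incident beam intensity
mu = 0.004      # effective absorption coefficient

# Scattering (amplitude) contrast follows an exponential absorption law:
# I = I0 * exp(-mu * t), so the line integral mu * t is NOT linear in I.
I = I0 * np.exp(-mu * thickness)

# Taking the logarithm linearizes the measurement and recovers the
# projection needed by back-projection-style reconstruction.
projection = -np.log(I / I0) / mu
```

Only after this log transform does the image satisfy the linearity assumption of back-projection; for very thick specimens even this fails, as the abstract notes, once multiple and inelastic scattering dominate.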


Author(s):  
D. E. Johnson

Increased specimen penetration, the principal advantage of high voltage microscopy, is accompanied by an increased need to utilize information on three dimensional specimen structure available in the form of two dimensional projections (i.e. micrographs). We are engaged in a program to develop methods which allow the maximum use of information contained in a through-tilt series of micrographs to determine three dimensional specimen structure.

In general, we are dealing with structures lacking in symmetry and with projections available from only a limited span of angles (±60°). For these reasons, we must make maximum use of any prior information available about the specimen. To do this in the most efficient manner, we have concentrated on iterative, real space methods rather than Fourier methods of reconstruction. The particular iterative algorithm we have developed is given in detail in ref. 3. A block diagram of the complete reconstruction system is shown in fig. 1.
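The authors' particular algorithm is given in their ref. 3 and is not reproduced here. As a generic stand-in, the family it belongs to, iterative real-space reconstruction under a limited (±60°) tilt range with prior constraints, can be sketched with a SIRT-style iteration and a nonnegativity prior; the toy nearest-bin projector and phantom below are purely illustrative:

```python
import numpy as np

def projection_matrix(n, angles_deg):
    """Toy parallel-beam projector for an n x n image: each pixel adds its
    value to the detector bin nearest its rotated coordinate."""
    ys, xs = np.mgrid[0:n, 0:n]
    xc, yc = xs - (n - 1) / 2.0, ys - (n - 1) / 2.0
    rows = []
    for th in np.deg2rad(angles_deg):
        s = xc * np.cos(th) + yc * np.sin(th)
        bins = np.clip(np.round(s + (n - 1) / 2.0).astype(int), 0, n - 1)
        A = np.zeros((n, n * n))
        A[bins.ravel(), np.arange(n * n)] = 1.0
        rows.append(A)
    return np.vstack(rows)

# Limited tilt span, as for a through-tilt series: -60 to +60 degrees.
n = 16
angles = np.arange(-60, 61, 10)
A = projection_matrix(n, angles)

truth = np.zeros((n, n))
truth[5:11, 5:11] = 1.0          # simple square phantom
p = A @ truth.ravel()            # the measured tilt-series projections

# SIRT-style iteration; the clip step injects prior information
# (density is nonnegative), which matters most with missing angles.
f = np.zeros(n * n)
col_sums = A.sum(axis=0); col_sums[col_sums == 0] = 1.0
row_sums = A.sum(axis=1); row_sums[row_sums == 0] = 1.0
for _ in range(50):
    f += (A.T @ ((p - A @ f) / row_sums)) / col_sums
    f = np.clip(f, 0.0, None)
recon = f.reshape(n, n)
```

Real-space iterations like this make it easy to enforce constraints (support, nonnegativity, known density bounds) at every step, which is the stated reason for preferring them over Fourier methods when the angular coverage is incomplete.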

