A Linearization Method for Determining the Effect of Loads, Shunts, and System Uncertainties on Line Protection with Distance Relays

1981 ◽  
Vol PER-1 (11) ◽  
pp. 24-25
Author(s):  
E. R. Sexton ◽  
D. Crevier

2016 ◽  
Vol 3 (1) ◽  
pp. 16-36 ◽  
Author(s):  
Ahmad A. Al-Subhi ◽  
Hesham K. Alfares

This paper presents an optimal solution of the economic dispatch (ED) problem, neglecting transmission losses, using linear programming (LP). In the ED problem, several on-line units (generators) are available, and the power to be produced by each unit must be determined so that the required load is met at minimum total cost. To apply LP, the nonlinear cost functions of all generators are approximated by piecewise linear functions. To examine the effectiveness of this linearization method, a comprehensive set of benchmark test problems consisting of 3, 6, 18, 20, 38, and 40 generators is used. On this set, LP solutions of the linearized ED problems are compared with several other techniques from the literature. The LP technique with piecewise linearization shows an overall competitive advantage in terms of total cost, solution time, and load-satisfaction accuracy. The impact of varying the width of the linearized pieces (segments) is also discussed. All computational analysis is performed in the MATLAB software environment.
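As a rough sketch of the piecewise linearization described above, the following example dispatches a residual load over fixed-width segments of each generator's cost curve. The generator coefficients, demand, and segment count are made-up illustrative values (not from the paper), and SciPy's `linprog` stands in for the authors' MATLAB LP setup:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical 3-generator data (not from the paper):
# cost C(P) = a + b*P + c*P^2 in $/h, output P in MW.
gens = [  # (a, b, c, Pmin, Pmax)
    (500.0, 5.3, 0.004, 200.0, 450.0),
    (400.0, 5.5, 0.006, 150.0, 350.0),
    (200.0, 5.8, 0.009, 100.0, 225.0),
]
demand = 975.0   # required load in MW, transmission losses neglected
K = 10           # number of linear segments per generator


def cost(a, b, c, P):
    return a + b * P + c * P ** 2


slopes, bounds = [], []
base_P = base_cost = 0.0
for a, b, c, Pmin, Pmax in gens:
    base_P += Pmin                      # every unit runs at least at Pmin
    base_cost += cost(a, b, c, Pmin)
    pts = np.linspace(Pmin, Pmax, K + 1)
    for lo, hi in zip(pts[:-1], pts[1:]):
        # chord slope of the convex cost curve over this segment:
        # the LP's incremental cost of loading the segment
        slopes.append((cost(a, b, c, hi) - cost(a, b, c, lo)) / (hi - lo))
        bounds.append((0.0, hi - lo))   # loading limited to the segment width

# One equality constraint: segment loadings supply the residual demand.
A_eq = np.ones((1, len(slopes)))
b_eq = [demand - base_P]
res = linprog(c=slopes, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
total_cost = base_cost + res.fun
```

Because each cost curve is convex, the segment slopes increase monotonically, so the minimizing LP fills cheaper segments first without any explicit ordering constraints; shrinking the segment width (larger K) tightens the approximation, mirroring the segment-width discussion in the abstract.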


Author(s):  
Tao Liu ◽  
Zhonghui Hu ◽  
Rupo Yin ◽  
Xiaoming Xu

In this paper, a new analytical Smith predictor (SP) controller design method is proposed for high-order industrial and chemical systems. First, using the integral-squared-error (ISE) performance specification, the ideally optimal SP controller is derived analytically from the nominal high-order plant model, which inevitably yields a high-order controller. Analytical controller-reduction formulae based on Maclaurin and Padé expansions are then proposed to approximate it by a low-order controller such as a proportional-integral-derivative (PID) controller. The difficulty of implementing the controller in practice is thus significantly relieved, without appreciable performance degradation in comparison with many existing methods. The robust stability of the control system is also analysed. Accordingly, an on-line tuning rule for the single adjustable parameter of the proposed controller is provided to cope with actual system uncertainties. Finally, several illustrative simulation examples demonstrate the effectiveness of the proposed method.
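The Maclaurin-expansion step can be illustrated symbolically. The sketch below uses an arbitrary made-up controller transfer function, not one from the paper, and assumes the standard Maclaurin PID reduction: writing C(s) = f(s)/s, the first three Maclaurin coefficients of f give the integral, proportional, and derivative gains of an equivalent PID controller.

```python
import sympy as sp

s = sp.symbols("s")

# Hypothetical high-order controller (illustrative only), written so that
# C(s) = f(s)/s with f analytic at s = 0.
C = (s + 1) ** 2 / (s * (sp.Rational(1, 10) * s + 1))

f = sp.cancel(C * s)                     # f(s) = s*C(s)
Ki = f.subs(s, 0)                        # integral gain      Ki = f(0)
Kp = sp.diff(f, s).subs(s, 0)            # proportional gain  Kp = f'(0)
Kd = sp.diff(f, s, 2).subs(s, 0) / 2     # derivative gain    Kd = f''(0)/2
```

Truncating the Maclaurin series of f after the quadratic term is exactly what collapses the high-order controller into PID form; a Padé expansion would instead match the series with a low-order rational function.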


1995 ◽  
Vol 10 ◽  
pp. 585-587
Author(s):  
Keith Butler

In this paper I review some recent advances in the use of large amounts of atomic data in the modelling of atmospheres and winds of hot stars. The review is highly selective but representative of current developments. A more general overview is to be found in Kudritzki and Hummer (1990), although the field is changing so rapidly that much has happened since then. The paper breaks down into three parts: work on line formation, in which the atmospheric structure is known and held fixed, is described first; then follows a description of the inclusion of line opacities in non-LTE in the atmosphere problem itself; and finally recent developments in the theory of radiatively driven stellar winds are summarized. Here special emphasis is given to a novel distance determination method based entirely on spectroscopic quantities. I close with a brief shopping list.

In a series of papers, Becker and Butler (1992, 1994a,b,c) have investigated iron and nickel spectra in sub-dwarfs using the complete linearization method of Auer and Heasley (1976). The method scales linearly with the number of frequency points, so they were able to use well over 10000 frequencies to adequately describe the line opacities. Several thousand lines were treated explicitly, and the resultant computed spectra gave excellent fits to observed Hubble spectra in the wavelength ranges dominated by the ions concerned. The different ionization stages gave consistent results for the iron and nickel abundances, but only after line-blocking from millions of spectral lines in the far UV had been included. This was done using the Kurucz (1988) line lists coupled with line grouping as suggested by Anderson (1989) and described briefly in the next section.

The line-blanketed atmospheres of Kurucz (1991) are the best available up to about 30000 K, where non-LTE effects start to become important.
Non-LTE line-blanketed atmospheres have become feasible because the computational requirements of the accelerated lambda iteration (ALI) method (Werner and Husfeld, 1985) also scale linearly with the number of frequency points. On the other hand, Anderson (1989) suggested grouping energetically adjacent atomic levels together to form pseudo-levels, on the basis that although they might, as a group, be in non-LTE, they should be in LTE with respect to one another owing to the large number of collisions between them. This greatly reduces the number of levels to be considered, but instead gives rise to highly complicated pseudo line-profiles. Grigsby et al. (1992), who did not use ALI, constructed the first grid of line-blanketed non-LTE models by using a variation on the Opacity Distribution Function concept to group line opacities into blocks, thereby reducing the number of frequency points required. Dreizler and Werner (1993), in contrast, were able to sample the opacity as they used ALI in their models.


Author(s):  
William Krakow

In the past few years on-line digital television frame store devices coupled to computers have been employed to attempt to measure the microscope parameters of defocus and astigmatism. The ultimate goal of such tasks is to fully adjust the operating parameters of the microscope and obtain an optimum image for viewing in terms of its information content. The initial approach to this problem, for high resolution TEM imaging, was to obtain the power spectrum from the Fourier transform of an image, find the contrast transfer function oscillation maxima, and subsequently correct the image. This technique requires a fast computer, a direct memory access device and even an array processor to accomplish these tasks on limited size arrays in a few seconds per image. It is not clear that the power spectrum could be used for more than defocus correction since the correction of astigmatism is a formidable problem of pattern recognition.
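The power-spectrum step described above can be sketched in a few lines of NumPy. The image here is synthetic noise standing in for a micrograph, so no real CTF rings appear, but the pipeline (2-D FFT, squared modulus, rotational average in which the contrast transfer function oscillation maxima would show up as peaks) is the one the passage outlines:

```python
import numpy as np

# Synthetic illustration (not real micrograph data): random noise stands in
# for an image whose power spectrum would carry CTF rings.
rng = np.random.default_rng(0)
n = 256
img = rng.standard_normal((n, n))

# Power spectrum: squared modulus of the centred 2-D Fourier transform.
F = np.fft.fftshift(np.fft.fft2(img))
power = np.abs(F) ** 2

# Rotational average: mean power versus spatial-frequency radius. In a real
# micrograph, CTF oscillation maxima appear as peaks of this 1-D profile,
# from which defocus can be estimated.
y, x = np.indices(power.shape)
r = np.hypot(x - n // 2, y - n // 2).astype(int)
radial = np.bincount(r.ravel(), weights=power.ravel()) / np.bincount(r.ravel())
```

The rotational average is what makes defocus estimation cheap; astigmatism breaks the circular symmetry of the rings, which is why the passage calls its correction a pattern-recognition problem rather than a 1-D peak-finding one.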


Author(s):  
A.M.H. Schepman ◽  
J.A.P. van der Voort ◽  
J.E. Mellema

A Scanning Transmission Electron Microscope (STEM) was coupled to a small computer. The system (see Fig. 1) was built using a Philips EM400 equipped with a scanning attachment and a DEC PDP11/34 computer with 34K memory. The gun (Fig. 2) consists of a continuously renewed tip, of radius 0.2 to 0.4 μm, of a tungsten wire heated just below its melting point by a focussed laser beam (1). On-line operation procedures were developed aiming at reducing the radiation dose to the specimen area of interest while selecting the various imaging parameters and registering the information content. Whereas the theoretical limiting spot size is 0.75 nm (2), routine resolution checks showed minimum distances on the order of 1.2 to 1.5 nm between corresponding intensity maxima in successive scans. This value is sufficient for structural studies of regular biological material and for testing the performance of STEM against high-resolution CTEM.


Author(s):  
Neil Rowlands ◽  
Jeff Price ◽  
Michael Kersker ◽  
Seichi Suzuki ◽  
Steve Young ◽  
...  

Three-dimensional (3D) microstructure visualization in the electron microscope requires that the sample be tilted to different positions to collect a series of projections. This tilting should be performed rapidly for on-line stereo viewing and precisely for off-line tomographic reconstruction. Usually a projection series is collected using mechanical stage tilt alone. The stereo pairs must then be viewed off-line, and the 60 to 120 tomographic projections must be aligned with fiducial markers or digital correlation methods. The delay in viewing stereo pairs and the alignment problems in tomographic reconstruction could be eliminated or reduced by tilting the beam, if such tilt could be accomplished without image translation.

A microscope capable of beam tilt with simultaneous image shift to eliminate tilt-induced translation has been investigated for 3D imaging of thick (1 μm) biologic specimens. By tilting the beam above and through the specimen and bringing it back below the specimen, a brightfield image with a projection angle corresponding to the beam tilt angle can be recorded (Fig. 1a).
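The tilt-series bookkeeping and the stereo geometry behind on-line stereo viewing can be sketched numerically. The settings below (2° increments over ±60°, a 5° beam-tilt half-angle, a 12 nm measured parallax) are illustrative assumptions, not values from the text; the height formula h = p / (2 sin α) is the standard stereo-parallax relation for a pair recorded at ±α:

```python
import numpy as np

# Hypothetical tomographic series: projections every 2 degrees over +/-60
# degrees (within the 60-120 projection range mentioned in the text).
angles = np.arange(-60, 61, 2)           # projection angles in degrees

# Depth of a feature from the parallax p (image shift of the feature between
# the two members of a stereo pair recorded at +alpha and -alpha):
#   h = p / (2 * sin(alpha))
alpha = np.radians(5.0)                  # assumed beam-tilt half-angle
parallax_nm = 12.0                       # assumed measured parallax
height_nm = parallax_nm / (2 * np.sin(alpha))
```

With beam tilt replacing stage tilt, each pair of this series could be recorded without mechanical settling time, which is what makes the on-line stereo viewing described above feasible.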


Author(s):  
G.Y. Fan ◽  
J.M. Cowley

In recent developments, the ASU HB5 has been modified so that the timing, positioning, and scanning of the finely focused electron probe can be entirely controlled by a host computer. This made possible an asynchronous handshake between the HB5 STEM and the image processing system, which consists of a host computer (PDP 11/34), a DeAnza image processor (IP 5000) interfaced with a low-light-level TV camera, an array processor (AP 400), and various peripheral devices. This greatly facilitates the pattern recognition technique initiated by Monosmith and Cowley. Software called NANHB5 is under development which, instead of employing a set of photo-diodes to detect strong spots on a TV screen, uses various software techniques including on-line fast Fourier transforms (FFT) to recognize patterns of greater complexity, taking advantage of the sophistication of our image processing system and the flexibility of computer software.

