Prediction of Main/Secondary-Air System Flow Interaction in a High-Pressure Turbine

2005 ◽ Vol 21 (1) ◽ pp. 158-166
Author(s): Roger L. Davis ◽ Juan J. Alonso ◽ Jixian Yao ◽ Roger Paolillo ◽ Om P. Sharma

2017 ◽ Vol 121 (1242) ◽ pp. 1200-1215
Author(s): L. Pawsey ◽ D. J. Rajendran ◽ V. Pachidis

ABSTRACT
The rotor sub-assembly of the high-pressure turbine of a modern turbofan engine is typically free to move downstream because of the force imbalance acting on the disc and blades following an un-located shaft failure. This downstream movement results in a change in the geometry of the rotor blade, tip seals and rim/platform seals because of the interaction of the rotor sub-assembly with the downstream vane sub-assembly. Additionally, there is a change in the properties of the leakage flows that mix with the main flow, because of the change in engine behaviour and secondary air system dynamics. In the present work, the changes in geometry following the downstream movement of the turbine are obtained from a validated friction model and structural LS-DYNA simulations. Changes in leakage flow properties are obtained from a transient network source-sink secondary air system model. Three-dimensional Reynolds-averaged Navier-Stokes simulations are used to evaluate the aerodynamic effect of the inclusion of the leakage flows, tip seal domains, and downstream movement of the rotor for three displacement configurations (i.e. 0, 10 and 15 mm) with appropriate changes in geometry and leakage flow conditions. The results show a significant reduction in the expansion ratio, torque and power produced by the turbine with the downstream movement of the rotor, because of changes in the flow behaviour for the different configurations. These changes in turbine performance parameters are necessary to accurately predict the terminal speed of the rotor using an engine thermodynamic model. Further, it is to be noted that such reductions in turbine rotor torque will result in a reduction of the terminal speed attained by the rotor during an un-located shaft failure. Therefore, the terminal speed of the rotor can be controlled by introducing design features that result in the rapid rearward displacement of the turbine rotor.
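The coupling between rotor torque and terminal speed described above can be illustrated with a minimal torque-balance integration. This is a hypothetical Python sketch: the inertia, torque level and decay time are invented numbers, and the linear torque decay stands in for the geometry and leakage changes that the paper computes with its engine thermodynamic model.

```python
# Hypothetical sketch: rotor speed-up after an un-located shaft failure.
# I * domega/dt = tau(t); the torque schedule and all numbers are assumptions.

def terminal_speed_rise(inertia, torque_of, dt=1e-4, t_end=1.0):
    """Integrate the free rotor's acceleration until the net torque vanishes;
    return the speed rise (rad/s) above the speed at the failure instant."""
    omega, t = 0.0, 0.0
    while t < t_end:
        tau = torque_of(t)
        if tau <= 0.0:          # torque gone: rotor has reached terminal speed
            break
        omega += (tau / inertia) * dt
        t += dt
    return omega

# Assumed schedule: net torque decays linearly to zero over 50 ms as the
# rotor displaces rearward and the blade/seal geometry degrades.
tau0, decay = 5.0e3, 0.05       # N*m and s, both assumed
speed_rise = terminal_speed_rise(
    inertia=2.0,                # kg*m^2, assumed rotor polar inertia
    torque_of=lambda t: tau0 * max(0.0, 1.0 - t / decay),
)
```

A faster torque decay, for instance through design features that accelerate rearward displacement, directly lowers the integrated speed rise, which is the mechanism the abstract proposes for controlling terminal speed.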


Author(s):  
Lucas Pawsey ◽  
David John Rajendran ◽  
Vassilios Pachidis

An unlocated shaft failure in the high pressure turbine spool of an engine may result in a complex orbiting motion along with rearward axial displacement of the high pressure turbine rotor sub-assembly. This is due to the action of the resultant forces and the limitations imposed by constraints such as the bearings and turbine casing. Such motion of the rotor following an unlocated shaft failure results in the development of multiple contacts between the components of the rotor sub-assembly, the turbine casing, and the downstream stator casing. Typically, in the case of shrouded rotor blades, the tip region takes the form of a seal with radial protrusions called ‘fins’ between the rotor blade and the turbine casing. The contact between the rotor blade and the turbine casing will therefore result in excessive wear of the tip seal fins, changing the geometry of the tip seal domain and thereby the characteristics of the tip leakage vortex. The rotor sub-assembly with worn seals may also be axially displaced rearwards, and consequent to this displacement, changes in the geometry of the rotor blade may occur because of the contact between the rotor sub-assembly and the downstream stator casing. An integrated approach of structural analyses, secondary air system dynamics, and 3D CFD is adopted in the present study to quantify the effect of the tip seal damage and axial displacement on the aerodynamic performance of the turbine stage. The resultant geometry after wearing down of the fins in the tip seal, and rearward axial displacement of the rotor sub-assembly, is obtained from LS-DYNA simulations. 3D RANS analyses are carried out to quantify the aerodynamic performance of the turbine with worn fins in the tip seal at three different axial displacement locations, i.e. 0 mm, 10 mm and 15 mm. The turbine performance parameters are then compared with equivalent cases in which the fins in the tip seal are intact for the same turbine axial displacement locations.
From this study it is noted that the wearing of the tip seal fins results in reduced turbine torque, power output and efficiency, owing to changes in the flow behaviour in the turbine passages. The reduction in turbine torque will result in a reduction of the terminal speed of the rotor during an unlocated shaft failure. Therefore, a design modification that leads to rapid wearing of the fins in the tip seal after an unlocated shaft failure holds promise for the management of a potential over-speed event.
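Why fin wear increases tip leakage (and hence reduces the torque extracted in the blade passage) can be sketched with Martin's classical labyrinth-seal formula. The geometry and flow conditions below are invented for illustration and are not taken from the paper's LS-DYNA or CFD models.

```python
import math

def martin_leakage(area, p_in, p_out, T_in, n_fins, R=287.0):
    """Martin's labyrinth-seal estimate of leakage mass flow (kg/s),
    assuming ideal sharp fins and no carry-over correction."""
    return area * math.sqrt(
        (p_in**2 - p_out**2) / (R * T_in * (n_fins + math.log(p_in / p_out)))
    )

# Assumed tip-seal conditions (illustrative only).
area = 4.0e-4                       # m^2, annular clearance area under the fins
intact = martin_leakage(area, 8.0e5, 5.0e5, 1200.0, n_fins=4)
# Worn seal: fewer effective fins and a larger clearance after the rub.
worn = martin_leakage(3 * area, 8.0e5, 5.0e5, 1200.0, n_fins=1)
```

With these assumed numbers the worn seal passes several times the intact leakage; the extra flow bypasses the blade row, which is consistent with the reduced torque and power the study reports.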


Author(s):  
T. Wolf ◽  
K. Lehmann ◽  
L. Willer ◽  
A. Pahs ◽  
M. Rößling ◽  
...  

This paper introduces a new 2-stage high-pressure turbine rig for aerodynamic investigations. It is operated by DLR Göttingen (Germany) and installed in DLR’s new testing facility NG-Turb. The rig’s geometrical size as well as its non-dimensional parameters are comparable to a modern engine in the small to medium thrust range. The turbine rig closely resembles engine hardware and features all relevant blade and vane cooling as well as secondary air-system flows. The effect of variations of each individual flow and of different tip clearances on overall turbine efficiency will be studied. While the first part of the testing programme will be based on uniform inlet conditions, the second part will be run with a combustor simulator, which is based on electrical heaters and delivers a flow field similar to that of a rich-burn combustor. In order to find the optimum relative position for maximum turbine efficiency, the combustor simulator can be rotated relative to the HPT inlet (clocking). For the same reason, the stators can also be clocked. The paper gives a brief overview of the testing facility and then focuses on the HPT rig features such as aerodynamic design, cooling and sealing flows. The aerodynamic optimisation of the stator vanes and shroudless rotor blades will be outlined. Further topics are the aerodynamic design of the combustor simulator, a comparison with engine combustors, and its implementation in the rig. The paper also describes the rig instrumentation in the stationary and rotating systems, which focuses primarily on efficiency measurements and the capture of traverse data. The topic of blade and vane manufacturing via direct metal laser sintering will be briefly covered. The discussion of test results and comparison with numerical simulations will be the subject of a follow-up paper.
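The clocking search described above amounts to a one-dimensional sweep over one vane pitch. A hypothetical sketch follows; the pitch value and the sinusoidal efficiency response are invented stand-ins, not rig data.

```python
import math

def best_clocking(eta_of, pitch_deg, n_steps=36):
    """Sweep combustor-simulator clocking over one vane pitch and return
    the angle with the highest measured/predicted stage efficiency."""
    angles = [i * pitch_deg / n_steps for i in range(n_steps)]
    return max(angles, key=eta_of)

# Assumed response: efficiency varies sinusoidally with clocking, peaking
# when the hot streak aligns favourably with the vane row (illustrative).
pitch = 9.0  # deg, assumed vane pitch (40 vanes)
eta = lambda a: 0.90 + 0.002 * math.cos(2 * math.pi * (a - 4.5) / pitch)
angle = best_clocking(eta, pitch)
```

On a real rig the callable would wrap an efficiency measurement at each clocking position rather than an analytic function; the sweep logic is unchanged.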


Author(s):  
Winfried-Hagen Friedl ◽  
Dieter Peitsch ◽  
Dimitrie Negulescu

Conventional two-stage high pressure turbine (HPT) air system concepts are usually based on the exclusive use of compressor delivery air to cool HPT blades and vanes and to seal the gaps between rotors and stators. This air is expensive in terms of engine efficiency, since work has been done on it by all stages of the compressor and the air is lost to the main thermodynamic engine cycle. It is also very hot, leading to strong thermal loading of the turbine material. Improvements in this area thus lead to an immediate reduction of fuel consumption and increased cycle life of the turbine discs. On static components, it is common practice to use pre-swirl nozzles in order to reduce the relative total temperature seen by the downstream disc and blades. To feed the interstage cavity between the two HPT discs, the air is transferred through the first rotor disc or drive arm. In a conventional system, the air is passed through straight holes, so no benefit is taken from the energy potential of the cooling air, although work extraction from the flow would lead to an immediate reduction of the air temperature. The advanced air system presented in this paper uses de-swirl nozzles in the rotating part to extract energy from the fluid. This both reduces the air temperature for the downstream static part and increases the overall turbine efficiency through the work extracted from the fluid. This paper covers the effects of an air system design change from a conventional to an advanced HPT air system on the Rolls-Royce BR715 aeroengine, based on numerical analysis and test data. An overview of the change to the flow field and an outlook on current research programmes in this field will be given.
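The benefit of swirling the cooling air can be quantified with the standard relative total temperature relation Tt,rel = Tt,abs - (2*U*v_theta - U^2)/(2*cp): matching the tangential velocity of the flow to the disc speed lowers the temperature the rotating hardware actually sees. The numbers below are assumed for illustration, not BR715 data.

```python
def rel_total_temp(Tt_abs, v_theta, U, cp=1005.0):
    """Relative total temperature (K) seen by the rotating disc/blades:
    Tt_rel = Tt_abs - (2*U*v_theta - U**2) / (2*cp)."""
    return Tt_abs - (2.0 * U * v_theta - U**2) / (2.0 * cp)

U = 350.0                                     # m/s, assumed disc speed
no_swirl = rel_total_temp(800.0, 0.0, U)      # straight (radial) feed holes
pre_swirl = rel_total_temp(800.0, 300.0, U)   # air swirled toward disc speed
```

With zero swirl the rotating frame sees the feed air roughly 60 K hotter than its absolute total temperature; swirling it close to disc speed recovers most of that penalty, which is the effect the de-swirl/pre-swirl nozzles exploit.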


Author(s):  
Dieter Peitsch ◽  
Manuela Stein ◽  
Stefan Hein ◽  
Reinhard Niehuis ◽  
Ulf Reinmo¨ller

Modern jet engines require very high cycle temperatures for efficient operation. In turn, cooling air is needed for the turbine, since the materials are not yet capable of withstanding these temperatures. Air is taken from the compressor for the purposes of cooling and turbine rim sealing, bypassing the main combustion circuit. Since this affects the efficiency of the engine negatively, measures are taken to reduce the amount of air to an absolute minimum. These measures include investigating how to reduce pressure losses within the subsystems involved. One of these subsystems in the BR700 aeroengine series of Rolls-Royce is the vortex reducer device, which delivers bleed air to the secondary air system of the engine. The German government has set up a research project aiming for an overall improvement of aeroengines. This programme, Engine 3E, where 3E stands for Efficiency, Economy and Environment, concentrates on the main components of gas turbines. Programmes for the high pressure turbine and for the combustion chamber have been set up. The high pressure compressor has also been identified as a key component. A new 9-stage compressor is being developed at Rolls-Royce Deutschland to address the respective needs. From the point of view of the secondary air system, the vortex reducer in this component plays a major role with respect to the efficient use of cooling and sealing air. Rolls-Royce Deutschland has performed CFD studies on the performance of different vortex reducer geometries that are currently considered for incorporation into the future engine. The results of these investigations will be converted into simpler design rules to properly reflect the behaviour of this system in future designs. The paper presents the set-up of the geometries, the applied boundary conditions, as well as the final results.
To tackle the difference between a high pressure compressor rig and a typical two-shaft engine, a dedicated investigation to assess the difference between a pure high pressure core without an internal shaft and a realistic high/low pressure shaft configuration has been carried out and is included in the paper. Recommendations to improve the design with respect to minimized pressure losses will be shown as well.
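Why a vortex reducer cuts pressure losses can be sketched with the classical free-vortex versus forced-vortex radial pressure difference in a rotating cavity (dp/dr = rho*v_theta^2/r): without tubes, inward-flowing bleed air conserves angular momentum and spins up to very high swirl at small radius, while tubes enforce near solid-body rotation. The incompressible estimate and all numbers below are assumptions for illustration.

```python
def dp_free_vortex(rho, omega, r_o, r_i):
    """Radial pressure difference (Pa) that inflow must overcome when it
    conserves angular momentum (v_theta * r = const, entering at disc
    speed at the outer radius r_o)."""
    K = omega * r_o**2                      # angular momentum per unit mass
    return 0.5 * rho * K**2 * (1.0 / r_i**2 - 1.0 / r_o**2)

def dp_forced_vortex(rho, omega, r_o, r_i):
    """Same radial pressure difference when tubes enforce solid-body
    rotation (v_theta = omega * r), as a vortex reducer approximates."""
    return 0.5 * rho * omega**2 * (r_o**2 - r_i**2)

# Assumed cavity conditions (illustrative only): incompressible estimate.
dp_free = dp_free_vortex(5.0, 1000.0, 0.3, 0.1)     # no vortex reducer
dp_tubes = dp_forced_vortex(5.0, 1000.0, 0.3, 0.1)  # with tubes
```

For this assumed radius ratio the free vortex costs roughly an order of magnitude more pressure than the tubed configuration, which is the loss the device is designed to avoid.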


Author(s):  
Thomas Weiss ◽  
Jose Maria Rey Villazón ◽  
Arnold Kühhorn

High Pressure Turbine discs of aero engines are classified as Critical Parts. Critical Parts are those whose failure is likely to have hazardous or even catastrophic effects (e.g. damage to or loss of the aircraft structure, injury to or loss of the crew/passengers) and which therefore require special control in order to achieve an acceptably low probability of individual failure. Even though special care is taken during the design and manufacture of these parts, there are still tolerances within their manufacturing route and variations during operation. Historically, aero engine parts were designed and laid out not to fail by using large safety factors to allow for scatter in different parameters. With the advent of high-power computing, the time to conduct detailed thermo-mechanical assessments has drastically reduced, opening the way for probabilistic analytical methods to determine the influence of parameter scatter on life and integrity. This paper presents a parametric study of a typical two-stage High Pressure Turbine (HPT) disc arrangement with a micro-turbine system that feeds cooling air into the interstage cavity [1]. A series of automated studies was performed to determine the relevance of parameters, assess their sensitivity and evaluate their combined impact on the targets of the disc design process. The automated workflow couples a chain of programs that perform geometry manipulation, finite-element thermo-mechanical analysis and life prediction. This process was used to assess parameter variations in the air system, the thermal boundary conditions and the geometry of several turbine disc features. The resulting outputs of this study are the percentile impacts and correlations of each parameter on the life expectancy of the turbine discs. This provides a qualitative understanding of the relevance of each parameter when approaching the design of turbine discs.
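The sensitivity workflow can be mimicked in miniature with a Monte Carlo study on a surrogate. The life model, its coefficients and the parameter scatter below are all invented stand-ins for the geometry/FE/lifing chain described in the paper; only the idea of sampling scattered inputs and correlating them with predicted life is taken from the text.

```python
import random
import statistics

def life_model(bore_temp, fillet_radius, cooling_flow):
    """Hypothetical linear surrogate for disc LCF life (cycles); stands in
    for the geometry-manipulation / FE / lifing chain of the paper."""
    return (40000.0 - 80.0 * (bore_temp - 600.0)
            + 5000.0 * fillet_radius + 300.0 * cooling_flow)

random.seed(0)
# Assumed scatter: bore temperature (K), fillet radius (mm), flow (g/s).
samples = [(random.gauss(600, 10), random.gauss(5, 0.3), random.gauss(10, 1))
           for _ in range(2000)]
lives = [life_model(*s) for s in samples]

def corr(xs, ys):
    """Pearson correlation coefficient."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (sx * sy)

sens = {name: corr([s[i] for s in samples], lives)
        for i, name in enumerate(["bore_temp", "fillet_radius", "cooling_flow"])}
```

The resulting correlations rank the parameters by their impact on life, which is the kind of qualitative output the automated study produces (here, with these assumed coefficients, the fillet radius dominates and bore temperature correlates negatively).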


Author(s):  
Cheng-Wei Fei ◽  
Wen-Zhong Tang ◽  
Guang-chen Bai ◽  
Zhi-Ying Chen

Against the engineering background of the probabilistic design of high-pressure turbine (HPT) blade-tip radial running clearance (BTRRC), which contributes to the high performance and high reliability of aeroengines, a distributed collaborative extremum response surface method (DCERSM) is proposed for the dynamic probabilistic analysis of turbomachinery. Building on an investigation of the extremum response surface method (ERSM), the mathematical model of the DCERSM is established. The DCERSM is then applied to the dynamic probabilistic analysis of BTRRC. The results show that a blade-tip radial static clearance of δ = 1.82 mm is advisable when the reliability and efficiency of the gas turbine are considered together. As revealed by a comparison of three methods (DCERSM, ERSM and the Monte Carlo method), the DCERSM expands the possibilities of probabilistic analysis for turbomachinery and improves computational efficiency while preserving computational accuracy. The DCERSM offers useful insight for BTRRC dynamic probabilistic analysis and optimization. The present study enriches mechanical reliability analysis and design theory.
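The core extremum-response-surface idea (replace each costly dynamic simulation by its per-sample extremum, fit a cheap surface to those extrema, then run inexpensive Monte Carlo on the surface) can be sketched as follows. The clearance history, the input distribution and the trivial linear surface are all assumptions for illustration; only the δ = 1.82 mm static clearance is reused from the abstract.

```python
import random

def min_clearance(growth, delta0=1.82, t_steps=100):
    """Per-sample extremum: minimum tip clearance (mm) over a transient,
    for a hypothetical parabolic radial-growth pulse (illustration only)."""
    return min(delta0 - growth * 4.0 * (t / t_steps) * (1.0 - t / t_steps)
               for t in range(t_steps + 1))

# Step 1 (ERSM idea): run the expensive dynamic model at a few samples and
# fit a response surface to the extrema. Here the true relation is exactly
# linear (min = delta0 - growth), so a two-point fit suffices.
g_lo, g_hi = 1.0, 2.0
y_lo, y_hi = min_clearance(g_lo), min_clearance(g_hi)
slope = (y_hi - y_lo) / (g_hi - g_lo)

def surface(growth):
    """Cheap surrogate for the extremum response."""
    return y_lo + slope * (growth - g_lo)

# Step 2: inexpensive Monte Carlo on the surface for the rub probability
# P(min clearance < 0), with an assumed growth distribution (mm).
random.seed(1)
draws = [random.gauss(1.5, 0.2) for _ in range(20000)]
p_rub = sum(surface(g) < 0.0 for g in draws) / len(draws)
```

The "distributed collaborative" extension of the paper partitions this surrogate-building across sub-models; the sketch above shows only the single-surface version of the idea.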


Author(s):  
Qingjun Zhao ◽  
Fei Tang ◽  
Huishe Wang ◽  
Jianyi Du ◽  
Xiaolu Zhao ◽  
...  

In order to explore the influence of the hot streak temperature ratio on the low pressure stage of a Vaneless Counter-Rotating Turbine, three-dimensional multi-blade-row unsteady Navier-Stokes simulations have been performed. The predicted results show that the hot streaks are not mixed out by the time they reach the exit of the high pressure turbine rotor. The separation of colder and hotter fluids is observed at the inlet of the low pressure turbine rotor. After interacting with the inner-extending and outer-extending shock waves in the high pressure turbine rotor, the hotter fluid migrates towards the pressure surface of the low pressure turbine rotor, while most of the colder fluid migrates to the suction surface of the low pressure turbine rotor. The migration characteristics of the hot streaks are dominated by the secondary flow in the low pressure turbine rotor. The effect of buoyancy on the hotter fluid is very weak in the low pressure turbine rotor. The results also indicate that the secondary flow in the low pressure turbine rotor intensifies when the hot streak temperature ratio is increased. The effects of the hot streak temperature ratio on the relative Mach number and the relative flow angle at the inlet of the low pressure turbine rotor are significant. The isentropic efficiency of the Vaneless Counter-Rotating Turbine decreases as the hot streak temperature ratio is increased.
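The efficiency metric in the last sentence is the standard total-to-total isentropic efficiency of a turbine. A hypothetical evaluation follows; γ and the inlet/exit states are assumed round numbers, not the paper's VCRT data.

```python
def isentropic_efficiency(Tt_in, Tt_out, pt_in, pt_out, gamma=1.33):
    """Total-to-total isentropic efficiency of a turbine:
    actual temperature drop over the ideal (isentropic) drop for the
    same total pressure ratio."""
    ideal = 1.0 - (pt_out / pt_in) ** ((gamma - 1.0) / gamma)
    actual = 1.0 - Tt_out / Tt_in
    return actual / ideal

# Assumed turbine states (illustrative only).
eta = isentropic_efficiency(Tt_in=1800.0, Tt_out=1250.0,
                            pt_in=30.0e5, pt_out=6.0e5)
```

A stronger hot streak redistributes temperature and flow angles at the LP rotor inlet; in the simulations this shows up as a lower measured Tt drop for the same pressure ratio, i.e. a lower value of this efficiency.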

