Wind loading on intermediate height buildings

1992 ◽  
Vol 19 (1) ◽  
pp. 148-163 ◽  
Author(s):  
R. A. Sanni ◽  
D. Surry ◽  
A. G. Davenport

The current gust factor approach in the detailed method of the National Building Code of Canada (NBCC) for the estimation of wind loads on buildings was developed from research work that was largely directed towards very tall and flexible buildings for which resonant responses are very significant; however, the dynamic responses of the majority of intermediate height buildings are dominated by quasi-steady gust loading with little resonant response. This study has been carried out to assess the applicability of the detailed approach of the NBCC to that class of fairly common intermediate height buildings, of which apartment buildings are good examples. For the purposes of this study, these buildings have been defined as buildings whose heights are between 20 and 120 m and whose ratio of height to minimum width is not more than 4. The responses estimated from the detailed approach of the NBCC have been compared with those from wind tunnel tests with a view to verifying and simplifying its application to such intermediate height buildings.

Since intermediate height buildings are often arranged in groups, an experimental study of the interference effects between adjacent buildings was also undertaken to assess the effect of an upwind building on the wind-induced overall moments on a downwind building of a similar height. The influence of this interference effect on the member stresses or forces was investigated using the concept of joint action factors.

General agreement between the test and the code-estimated responses was obtained in the comparisons. The small resonant responses observed provided a basis for deriving a simplified method for estimating the gust factor in the detailed method without the requirement of knowing the structure's dynamic properties.

Significant interference effects were found, particularly for the across-wind and torsional moments on buildings in an open exposure; however, the amplification of the overall wind-induced moments does not necessarily translate into a similar amplification of member forces or stresses. For the buildings studied, the results have shown that for the majority of practical situations, interference effects are not likely to result in amplification of member stresses or forces. A set of additional factors of safety has been proposed, based on the limited experimental data set, to cover load amplification by interference effects for those members that are very sensitive to overall wind-induced torsional moments.

Key words: codes, wind loads, wind engineering, intermediate height buildings, interference effects.
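The gust factor idea at the heart of the NBCC detailed method lends itself to a short numerical illustration. The sketch below uses one common textbook (Davenport-style) form of the gust factor, in which the resonant term can simply be dropped for buildings dominated by quasi-steady gust loading, which is the simplification the abstract's findings support. The function name, the 2·Iu quasi-steady linearization, and the numeric values are illustrative assumptions, not the NBCC's exact expression.

```python
import math

def gust_factor(g_p, i_u, background, resonant=0.0):
    """Davenport-style gust factor: peak response = mean response * G.

    g_p        -- statistical peak factor (typically ~3.5-4)
    i_u        -- turbulence intensity at the reference height
    background -- quasi-static (background) energy term B
    resonant   -- resonant energy term R (small for intermediate
                  height buildings, per the abstract's findings)

    Uses the quasi-steady linearization sigma/mean ~ 2*i_u*sqrt(B + R);
    an illustrative textbook form, not the NBCC formula itself.
    """
    return 1.0 + g_p * 2.0 * i_u * math.sqrt(background + resonant)

# With little resonant response, dropping R barely changes G:
print(gust_factor(3.8, 0.2, background=0.6, resonant=0.05))  # ~2.23
print(gust_factor(3.8, 0.2, background=0.6))                 # ~2.18
```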

2020 ◽  
Vol 206 ◽  
pp. 104227 ◽  
Author(s):  
Ying Sun ◽  
Zhiyuan Li ◽  
Xiaoying Sun ◽  
Ning Su ◽  
Shitao Peng

2020 ◽  
Vol 13 (3) ◽  
pp. 381-393
Author(s):  
Farhana Fayaz ◽  
Gobind Lal Pahuja

Background: The Static VAR Compensator (SVC) has the capability of improving the reliability, operation and control of the transmission system, thereby improving the dynamic performance of the power system. The SVC is a widely used shunt FACTS device and an important tool for reactive power compensation in high voltage AC transmission systems. Transmission lines compensated with an SVC may experience faults and hence need a protection system that guards against the damage caused by these faults while maintaining an uninterrupted supply of power.

Methods: The research work reported in the paper is a successful attempt to reduce the time to detect faults on an SVC-compensated transmission line to less than a quarter of a cycle. The relay algorithm involves two artificial neural networks (ANNs), one for detection and the other for classification of faults, including the identification of the faulted phase or phases. RMS (root mean square) values of line voltages and ratios of sequence components of line currents are used as inputs to the ANNs. Extensive training and testing of the two ANNs have been carried out using data generated by simulating an SVC-compensated transmission line in PSCAD at a signal sampling frequency of 1 kHz; the back-propagation method has been used for training and testing. A criticality analysis of the existing relay and the modified relay has also been carried out using three fault tree importance measures: Fussell-Vesely (FV) importance, Risk Achievement Worth (RAW) and Risk Reduction Worth (RRW).

Results: It is found that the relay detects any type of fault occurring anywhere on the line with 100% accuracy within a short time of 4 ms. It also classifies the type of the fault and indicates the faulted phase or phases, as the case may be, with 100% accuracy within 15 ms, well before a circuit breaker can clear the fault. As demonstrated, fault detection and classification by the use of ANNs is reliable and accurate when a large data set is available for training. The results of the criticality analysis show that the criticality ranking differs between the two designs (the existing relay and the modified relay), and that the ranking of the improved measurement system in the modified relay changes from 2 to 4.

Conclusion: A relaying algorithm is proposed for the protection of a transmission line compensated with a Static VAR Compensator (SVC), and a criticality ranking of the different failure modes of a digital relay is carried out. The proposed scheme has significant advantages over more traditional relaying algorithms: it is suitable for high resistance faults and is affected neither by the fault inception angle nor by the fault location.
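The two-ANN structure described above can be sketched in a few lines. The snippet below is a minimal illustration, assuming scikit-learn's MLPClassifier (a back-propagation-trained feedforward network) as a stand-in for the paper's ANNs; the feature layout, network sizes and label encodings are assumptions, and the actual training data would come from the PSCAD simulations described in the abstract.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def sequence_components(ia, ib, ic):
    """Fortescue symmetrical components of three phase-current phasors."""
    a = np.exp(2j * np.pi / 3)
    i0 = (ia + ib + ic) / 3.0             # zero sequence
    i1 = (ia + a * ib + a**2 * ic) / 3.0  # positive sequence
    i2 = (ia + a**2 * ib + a * ic) / 3.0  # negative sequence
    return i0, i1, i2

def features(vab_rms, vbc_rms, vca_rms, ia, ib, ic):
    """Assumed feature vector: RMS line voltages plus ratios of
    sequence-current magnitudes, per the abstract's description."""
    i0, i1, i2 = sequence_components(ia, ib, ic)
    return [vab_rms, vbc_rms, vca_rms,
            abs(i2) / abs(i1), abs(i0) / abs(i1)]

# Two networks, trained by back-propagation on PSCAD-generated samples:
detector = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000)
classifier = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000)
# detector.fit(X_train, y_fault)    # y_fault: 0 = healthy, 1 = fault
# classifier.fit(X_train, y_type)   # y_type: fault type + faulted phase(s)
```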


Sensors ◽  
2021 ◽  
Vol 21 (7) ◽  
pp. 2532
Author(s):  
Encarna Quesada ◽  
Juan J. Cuadrado-Gallego ◽  
Miguel Ángel Patricio ◽  
Luis Usero

Anomaly detection research focuses on the development and application of methods for identifying data that differ enough from the rest of the data set under analysis to be considered anomalies (or, as they are more commonly called, outliers). These values mainly originate from two sources: they may be errors introduced during the collection or handling of the data, or they may be correct but very different from the rest of the values. It is essential to identify each type correctly: in the first case they must be removed from the data set, whereas in the second case they must be carefully analyzed and taken into account. The correct selection and use of the model to be applied to a specific problem is fundamental to the success of an anomaly detection study, and in many cases a single model cannot provide sufficient results; these can only be reached with a mixture model built by integrating existing and/or ad hoc-developed models. That is the kind of model developed and applied to the problem presented in this paper. This study deals with the definition and application of an anomaly detection model that combines statistical models with a new method defined by the authors, the Local Transilience Outlier Identification Method, in order to improve the identification of outliers in the sensor-obtained values of variables that affect the operation of wind tunnels. The correct detection of outliers in the variables involved in wind tunnel operations is very important for the industrial ventilation systems industry, especially for vertical wind tunnels, which are used as training facilities for indoor skydiving, since incorrect performance of such devices may put human lives at risk. Consequently, the use of the presented model for outlier detection may have a high impact in this industrial sector. In this research work, a proof of concept is carried out using data from a real installation, in order to test the proposed anomaly analysis method and its application to monitoring the correct performance of wind tunnels.
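Since the Local Transilience Outlier Identification Method is the authors' own contribution and is not reproduced here, the sketch below only illustrates the general shape of such a mixture model: a global statistical test (z-score) combined with a generic local density method (scikit-learn's Local Outlier Factor, used purely as a stand-in). The threshold and neighbourhood values are conventional assumptions.

```python
import numpy as np
from scipy import stats
from sklearn.neighbors import LocalOutlierFactor

def mixed_outlier_flags(readings, z_thresh=3.0, n_neighbors=20):
    """Flag a sensor reading if EITHER the global statistical test
    or the local density test marks it as anomalous.

    LOF is an illustrative stand-in for the paper's Local Transilience
    method; the z-score threshold of 3 is a conventional assumption.
    """
    x = np.asarray(readings, dtype=float).reshape(-1, 1)
    z_flags = np.abs(stats.zscore(x, axis=0)).ravel() > z_thresh
    lof = LocalOutlierFactor(n_neighbors=n_neighbors)
    lof_flags = lof.fit_predict(x) == -1  # -1 marks outliers
    return z_flags | lof_flags
```

Flagged readings would then be triaged as the abstract describes: removed if traced to collection or handling errors, or analyzed further if they reflect genuine operating conditions.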


2016 ◽  
Vol 2016 ◽  
pp. 1-14
Author(s):  
Michalina Markousi ◽  
Dimitrios K. Fytanidis ◽  
Johannes V. Soulis

Reducing the wind loading of photovoltaic structures is crucial for their structural stability. In this study, two solar panel arrayed sets were numerically tested for load reduction purposes. All panel surfaces in each arrayed set are similarly exposed to the wind. The first set was comprised of conventional panels; the second was fitted with square holes located at the gravity center of each panel. Wind flow analysis on a standalone arrayed set of panels at fixed inclination was carried out to calculate the wind loads at various flow velocities and directions. Compared with the panels without holes, the panels with holes reduced the velocity in the downwind flow region and extended the low velocity flow region. The loading reduction in the arrayed set of panels with holes ranged from 0.8% to 12.53%. The maximum load reduction occurred at 6.0 m/s upwind velocity and a 120.0° approach angle. At a 30.0° approach angle, the wind load increased, but only marginally. The findings of the current research work suggest that the panel holes greatly affect the flow pattern and, subsequently, the wind load reduction. The computational analysis indicates that it is possible to considerably reduce the wind loading by using panels with holes.
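The load comparison reported above reduces to the quasi-steady wind force relation F = ½·ρ·Cf·A·V². The sketch below shows how a percentage load reduction follows from two force coefficients; the coefficient and area values are placeholders, not the paper's CFD results.

```python
def wind_load(cf, area_m2, v_ms, rho=1.225):
    """Quasi-steady wind force in newtons: F = 0.5 * rho * Cf * A * V^2."""
    return 0.5 * rho * cf * area_m2 * v_ms ** 2

def load_reduction_pct(f_solid, f_perforated):
    """Percentage reduction of the perforated panel relative to the solid one."""
    return 100.0 * (f_solid - f_perforated) / f_solid

# Placeholder force coefficients for a solid vs. perforated panel:
f_solid = wind_load(cf=1.30, area_m2=1.6, v_ms=6.0)
f_holes = wind_load(cf=1.14, area_m2=1.6, v_ms=6.0)
print(f"{load_reduction_pct(f_solid, f_holes):.1f}%")  # ~12.3%
```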


2021 ◽  
Vol 50 (1) ◽  
pp. 138-152
Author(s):  
Mujeeb Ur Rehman ◽  
Dost Muhammad Khan

Recently, anomaly detection has attracted growing attention from data mining researchers, and its reputation has risen steadily in practical domains such as product marketing, fraud detection, medical diagnosis, fault detection and many other fields. High dimensional data subjected to outlier detection poses exceptional challenges for data mining experts because of the curse of dimensionality and the diminishing contrast between distant and nearby points. Traditional algorithms and techniques perform outlier detection on the full feature space. Such conventional methods concentrate largely on low dimensional data and hence prove ineffective at discovering anomalies in data sets comprised of a high number of dimensions. Digging out the anomalies present in a high dimensional data set becomes difficult and computationally expensive when all subsets of projections need to be explored. All data points in high dimensional data begin to look like similar observations because of an intrinsic property of such spaces: the contrast between the distances to the nearest and farthest neighbours approaches zero as the number of dimensions extends towards infinity. This research work proposes a novel technique that measures the deviation among all data points and embeds its findings inside well-established density-based techniques. The technique opens a new breadth of research towards resolving an inherent problem of high dimensional data: outliers residing within clusters of different densities. A high dimensional data set from the UCI Machine Learning Repository is chosen to test the proposed technique, and its results are compared with those of density-based techniques to evaluate its efficiency.
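The paper's exact technique is not reproduced here, but the general idea of embedding a per-point deviation measure inside a density-based detector can be sketched as follows; the blending rule and parameter values are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np
from sklearn.neighbors import LocalOutlierFactor
from sklearn.preprocessing import StandardScaler

def deviation_weighted_density_score(X, n_neighbors=20):
    """Illustrative combination of a deviation measure with a
    density-based score (LOF); NOT the paper's exact algorithm.

    Larger returned values indicate more anomalous points.
    """
    Xs = StandardScaler().fit_transform(X)
    # Deviation of each point from the data centre, averaged over dims:
    deviation = np.linalg.norm(Xs, axis=1) / np.sqrt(Xs.shape[1])
    lof = LocalOutlierFactor(n_neighbors=n_neighbors)
    lof.fit(Xs)
    density = -lof.negative_outlier_factor_  # ~1 for inliers, >1 for outliers
    return deviation * density
```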


Author(s):  
Ponugupati Narendra Mohan et al.

The recent occurrence of global crises in the environment (emission of pollutants) and in health (the COVID-19 pandemic) has created a recession in all sectors. Innovations in technology have led to heavy competition in the global market, forcing the development of new variants, especially in the automobile sector. This creates further turbulence in demand, affecting both the production of new models and the maintenance of existing models rendered obsolete by the implementation of the BS-VI standard of India's automobile regulatory authority (Bharat Stage VI). This research work develops a novel value analysis model in which a multi-objective function is integrated with multi-criteria decision-making analysis, incorporating big data analytics into green supply chain management, to bridge the demand gap for an Indian manufacturing sector using a firm-level data set and a matrix chain multiplication dynamic programming algorithm. The computational results illustrate that the proposed algorithm is effective.
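The matrix chain multiplication dynamic programming algorithm named in the abstract is a classical O(n³) method; a minimal textbook version is sketched below (the paper's integration of it with the value analysis model is not reproduced).

```python
def matrix_chain_order(dims):
    """Minimum scalar multiplications to evaluate a matrix chain.

    dims = [p0, p1, ..., pn], where matrix i has shape (dims[i-1], dims[i]).
    Classical O(n^3) dynamic program over increasing chain lengths.
    """
    n = len(dims) - 1
    m = [[0] * n for _ in range(n)]
    for length in range(2, n + 1):        # chain length being solved
        for i in range(n - length + 1):
            j = i + length - 1
            m[i][j] = min(
                m[i][k] + m[k + 1][j] + dims[i] * dims[k + 1] * dims[j + 1]
                for k in range(i, j)
            )
    return m[0][n - 1]

# (10x30)(30x5)(5x60): best order ((A B) C) costs 4500 multiplications.
print(matrix_chain_order([10, 30, 5, 60]))  # 4500
```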


2017 ◽  
Vol 5 (6) ◽  
pp. 953-963 ◽  
Author(s):  
Adam G Tennant ◽  
Nasir Ahmad ◽  
Sybil Derrible

A general model of the complexity of the sport of boxing, exploring the match play that goes on between combatants, has yet to be produced. The sport has a long history dating back to the eighth century before the common era (BCE), to the time of ancient Greece. Also known as the 'sweet science', boxing has attracted research that has legitimately focused on the combat sport's long-term health effects concerning brain trauma. The present study seeks to explore the complexity of the sport by utilizing a data set of welterweights (63.5–67 kg). This data set was used to build a contact network with the boxers as nodes and the actual fights as the links. Additionally, a PageRank algorithm was used to rank the boxers from the contact network. Devon Alexander was calculated as the top welterweight from the data set. This was compared with the rankings of the sport's notoriously corrupt sanctioning bodies, journalistic rankings, and a more standard non-network-based ranking system. The network visualization displayed features typical of many others seen in the literature. A closer look was taken at several of the boxers using the visualization technique known as the rank clock, which allowed each boxer's rank history to be tracked and gave insight into their career trajectories. Timothy Bradley and Vyacheslav Senchenko had rank clocks that showed them to be the most consistent boxers in the 2004–2014 decade. These research findings supply further confirmation of the value of network-based approaches to athletic match play.
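The contact-network ranking described above can be reproduced in miniature with networkx: boxers as nodes, fights as links, PageRank over the graph. The toy edge list below is illustrative only (it is not the paper's welterweight data set), and since the abstract does not say whether edges were directed by fight outcome, an undirected graph is assumed.

```python
import networkx as nx

# Toy contact network: each tuple is one fight between two boxers.
fights = [
    ("Devon Alexander", "Timothy Bradley"),
    ("Timothy Bradley", "Vyacheslav Senchenko"),
    ("Devon Alexander", "Vyacheslav Senchenko"),
    ("Devon Alexander", "Marcos Maidana"),
]
G = nx.Graph(fights)

# PageRank scores each boxer by his position in the fight network;
# more (and better-connected) opponents raise the score.
ranks = nx.pagerank(G, alpha=0.85)
for boxer, score in sorted(ranks.items(), key=lambda kv: -kv[1]):
    print(f"{boxer}: {score:.3f}")
```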

