Comparison of Collisions of Rigid Trucks and Articulated Trucks Against Road Safety Barriers

2001 ◽  
Vol 1779 (1) ◽  
pp. 141-149
Author(s):  
Vittorio Giavotto ◽  
Mariano Pernetti

Heavy vehicles used for road transport are essentially rigid trucks, rigid trucks with trailers, and articulated trucks. A collision of such a vehicle against a safety barrier has different outcomes, depending on the vehicle characteristics, even when the impact energy (Ie) is the same. The factors responsible for the different behavior of rigid and articulated trucks during collision are addressed. The study, carried out by computer simulation, was divided into three parts. The first part compared the overall behavior of the two types of vehicles during collisions to identify the most influential factors. In the second phase, the individual features that characterize each type of vehicle were tested. The third phase sought to define a relationship between the two types of vehicles. Results from Tests TB81 and TB71, established by the European Committee for Standardization in standard EN 1317, were compared. The results show that a collision of an articulated truck is less severe than one of a rigid truck because of its greater length, suspension stiffness, inertia, and configuration. However, the difference in behavior depends on the kinetic Ie and the side-friction coefficient (SFC). Four analytical expressions were found that relate the Ies producing the same maximum transversal displacement or vehicle roll angle for the two types of vehicles. The study concerning the European tests on safety barriers shows that a hierarchy exists between these tests and that it depends on the SFC.

2014 ◽  
Vol 116 (3) ◽  
pp. 527-543 ◽  
Author(s):  
Dragan Tešanovic ◽  
Milovan Krasavcic ◽  
Bojana Miro Kalenjuk ◽  
Milijanko Portic ◽  
Snježana Gagic

Purpose – The aim of this paper is to determine the sensory quality of food in restaurants through professional food evaluators and to research the impact of education, age and number of employees on the quality of food. Design/methodology/approach – In the first phase, five trained food tasters evaluated the sensory quality of food. In the second phase, the structure of employees was analysed by establishing their level of education, their age and the number of employees. In the third phase, regression and correlation analyses were performed to establish the impact of the level of education, age and number of employees on the sensory quality of food. Findings – The sensory evaluation showed that the evaluated food is of moderate quality. The correlation matrix showed that the education level of employees has a high impact on the sensory quality of food. There is a correlation between the number of employees, their age and their education. Practical implications – The obtained results are indicators of the quality of food in restaurants in the region, and they can serve to improve that quality. They show that education and staff training can contribute to better food quality. The established methodology can also contribute to the practical evaluation of quality. Originality/value – This paper reflects the specific application of the methodology of sensory analysis of food in restaurants. Using statistical methods, the paper points to the impact of employees on the sensory quality of food. Particularly valuable are the statistical results pointing to the great impact of the employees' level of education on the sensory quality of food in restaurants.


2020 ◽  
Vol 24 (5) ◽  
pp. 2791-2815
Author(s):  
Christian Onof ◽  
Li-Pen Wang

Abstract. The use of Poisson cluster processes to model rainfall time series at a range of scales now has a history of more than 30 years. Among them, the randomised (also called modified) Bartlett–Lewis model (RBL1) is particularly popular, while a refinement of this model was proposed recently (RBL2; Kaczmarska et al., 2014). Fitting such models essentially relies upon minimising the difference between theoretical statistics of the rainfall signal and their observed estimates. The first statistics are obtained using closed form analytical expressions for statistics of the orders 1 to 3 of the rainfall depths, as well as useful approximations of the wet–dry structure properties. The second are standard estimates of these statistics for each month of the data. This paper discusses two issues that are important for the optimal model fitting of RBL1 and RBL2. The first issue is that, when revisiting the derivation of the analytical expressions for the rainfall depth moments, it appears that the space of possible parameters is wider than has been assumed in past papers. The second issue is that care must be exerted in the way monthly statistics are estimated from the data. The impact of these two issues upon both models, in particular upon the estimation of extreme rainfall depths at hourly and sub-hourly timescales, is examined using 69 years of 5 min and 105 years of 10 min rainfall data from Bochum (Germany) and Uccle (Belgium), respectively.
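The fitting procedure described above, minimising the difference between theoretical statistics and their sample estimates, can be sketched generically. The statistic names, values and unit weights below are illustrative assumptions, not the papers' exact configuration (which uses moments of orders 1 to 3 and wet–dry properties per calendar month):

```python
# Generic sketch of a method-of-moments fitting objective: a weighted sum of
# squared differences between model statistics and their sample estimates.
# Statistic names and weights are illustrative, not the papers' exact setup.

def fitting_objective(model_stats: dict[str, float],
                      sample_stats: dict[str, float],
                      weights: dict[str, float]) -> float:
    """Objective minimised during fitting; 0 means a perfect match."""
    return sum(weights[name] * (model_stats[name] - sample_stats[name]) ** 2
               for name in sample_stats)

# Hypothetical hourly rainfall statistics for one calendar month.
observed = {"mean": 0.10, "variance": 0.32, "autocorr_lag1": 0.45}
candidate = {"mean": 0.11, "variance": 0.30, "autocorr_lag1": 0.48}
score = fitting_objective(candidate, observed, {k: 1.0 for k in observed})
```

In practice each term is often normalised by the observed value so that statistics of different magnitudes contribute comparably, and a numerical optimiser varies the model's parameter vector (for RBL1 or RBL2, the cell and storm process parameters) to drive this objective down.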


2016 ◽  
Vol 11 (2) ◽  
pp. 127-135
Author(s):  
Biljana Maljković ◽  
Dražen Cvitanić

An experimental investigation was conducted on a 24 km long segment of a two-lane state road to collect driver behavior data. The research involved 20 drivers driving their own cars equipped with a GPS device. Considering the impact of path radius and speed on the side friction demand, the design consistency of horizontal curves was evaluated by determining margins of safety. The analysis showed that vehicle path radii were mostly smaller than the curve radius, by 12% on average. Regression analysis indicated that the percentage difference between the curve radius and the vehicle path radius is not affected by speed, speed differential or the geometric characteristics of the curve and surrounding elements. Two different margins of safety were analyzed: one is the difference between the maximum permissible side friction (based on design speed) and the side friction demand, while the other is the difference between the side friction supply (based on operating speed) and the side friction demand. Generally, demand exceeded supply side friction factors on curves with radii smaller than 150 m, whereas “poor” conditions (in terms of Lamm’s consistency levels) were noted for curves under approximately 220 m. Both values are very close to the critical radius below which higher accident rates were observed in several accident studies. Based on the results of the research, it is proposed to use a 12% smaller curve radius for the evaluation of the margin of safety, and to avoid curves with radii smaller than 200 m on two-lane state roads outside built-up areas.
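The margins of safety above can be illustrated with the standard point-mass curve formula, f = V²/(127·R) − e (V in km/h, R in m, superelevation e as a decimal). This is a sketch under that usual assumption, with hypothetical input values, not the paper's exact computation:

```python
# Margin-of-safety sketch for a horizontal curve, using the standard
# point-mass model (an assumption; the paper's exact formulation may differ).
# f_demand = V^2 / (127 * R) - e, with V in km/h, R in m, e as a decimal.

def side_friction_demand(speed_kmh: float, radius_m: float, superelevation: float) -> float:
    """Side friction factor demanded by a vehicle tracking a path of given radius."""
    return speed_kmh ** 2 / (127.0 * radius_m) - superelevation

def margin_of_safety(friction_supply: float, speed_kmh: float,
                     radius_m: float, superelevation: float) -> float:
    """Supply minus demand; a negative value means demand exceeds supply."""
    return friction_supply - side_friction_demand(speed_kmh, radius_m, superelevation)

# Hypothetical example: 90 km/h operating speed, 150 m curve, 6% superelevation,
# with the driven path radius taken 12% smaller than the curve radius,
# as the study observed on average.
curve_radius = 150.0
path_radius = 0.88 * curve_radius  # 132 m
demand = side_friction_demand(90.0, path_radius, 0.06)
margin = margin_of_safety(0.30, 90.0, path_radius, 0.06)
```

Evaluating the demand at the (smaller) path radius rather than the designed curve radius raises the friction demand and shrinks the margin, which is exactly why the paper proposes using a 12% smaller radius in the evaluation.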


2019 ◽  
Author(s):  
Christian Onof ◽  
Li-Pen Wang

Abstract. The use of Poisson-cluster processes to model rainfall time series at a range of scales now has a history of more than 30 years. Among them, the randomised (also called modified) Bartlett–Lewis model (RBL1) is particularly popular, while a refinement of this model was proposed recently (RBL2; Kaczmarska et al., 2014). Fitting such models essentially relies upon minimising the difference between theoretical statistics of the rainfall signal and their observed estimates. The first are obtained using closed-form analytical expressions for statistics of orders 1 to 3 of the rainfall depths, as well as useful approximations of the wet–dry structure properties. The second are standard estimates of these statistics for each month of the data. This paper discusses two issues that are important for optimal model fitting of RBL1 and RBL2. The first is that, when revisiting the derivation of the analytical expressions for the rainfall depth moments, it appears that the space of possible parameters is wider than has been assumed in past papers. The second is that care must be exerted in the way monthly statistics are estimated from the data. The impact of these two issues upon both models, in particular upon the estimation of extreme rainfall depths at hourly and sub-hourly timescales, is examined using 69 years of 5-min and 105 years of 10-min rainfall data from Bochum (Germany) and Uccle (Belgium), respectively.


2013 ◽  
Vol 53 (2) ◽  
pp. 498 ◽  
Author(s):  
Mark Hayward

This extended abstract discusses the top 10 risks and opportunities for oil and gas companies in 2013, which have been identified in our biannual global survey. It has been said that the difference between a business risk and an opportunity is the organisational speed of recognition and response. In this biannual update to the Ernst & Young oil and gas risk and opportunities report, we provide the latest views about the key risks and opportunities facing the oil and gas sector. Our three-phase approach provides a unique insight into the sector's leading risks and opportunities. We interview a panel of industry executives and experts, and ask them to identify the top risks and opportunities, as well as those below the radar that could rise into the top 10 in the years ahead. These are then grouped and aggregated to form a strategic challenge list for the oil and gas sector. The second phase of our research is to conduct a large-sample survey of companies and governments to rank the strategic challenges, obtain forecasts on whether these challenges will be more or less important in the future, and discover how leading organisations are responding to them. The third phase of our research is to conduct interviews with leading industry executives to gain insights on how the risks and opportunities impact their organisations and how these executives are managing or preparing for them. The latest edition of the Ernst & Young Oil and Gas Risk and Opportunities Report was released in March 2013.


2012 ◽  
Vol 24 (2) ◽  
pp. 287-303 ◽  
Author(s):  
Michael A. Pitts ◽  
Antígona Martínez ◽  
Steven A. Hillyard

An inattentional blindness paradigm was adapted to measure ERPs elicited by visual contour patterns that were or were not consciously perceived. In the first phase of the experiment, subjects performed an attentionally demanding task while task-irrelevant line segments formed square-shaped patterns or random configurations. After the square patterns had been presented 240 times, subjects' awareness of these patterns was assessed. More than half of all subjects, when queried, failed to notice the square patterns and were thus considered inattentionally blind during this first phase. In the second phase of the experiment, the task and stimuli were the same, but following this phase, all of the subjects reported having seen the patterns. ERPs recorded over the occipital pole differed in amplitude from 220 to 260 msec for the pattern stimuli compared with the random arrays regardless of whether subjects were aware of the patterns. At subsequent latencies (300–340 msec) however, ERPs over bilateral occipital-parietal areas differed between patterns and random arrays only when subjects were aware of the patterns. Finally, in a third phase of the experiment, subjects viewed the same stimuli, but the task was altered so that the patterns became task relevant. Here, the same two difference components were evident but were followed by a series of additional components that were absent in the first two phases of the experiment. We hypothesize that the ERP difference at 220–260 msec reflects neural activity associated with automatic contour integration whereas the difference at 300–340 msec reflects visual awareness, both of which are dissociable from task-related postperceptual processing.


2020 ◽  

Background and Objectives: Globally, cardiovascular disease (CVD) is the number one cause of mortality. This study therefore aimed to provide policies for the management of CVD by focusing on reducing the myocardial infarction (MI) mortality rate in Iran. Materials and Methods: A sequential mixed-methods design will be employed to forecast the prevalence of MI in Iran over the next 10 years. The study consists of five phases. In the first phase, the risk factors of cardiovascular disease will be investigated using a systematic review. In the second phase, the uncertainty and impact of those factors will be assessed by experts, and an impact/uncertainty grid will be used to separate the less important drivers from the critical uncertainties. In the third phase, the cross-impact matrix will be developed with Scenario Wizard, and the scenario logic and the scenarios will be developed. Once the scenario logic is established, details can be added to the scenarios. The next phase consists of statistical estimation of the rate of mortality due to heart attack using artificial neural networks. Finally, the policies will be developed based on the opinions of a panel of experts. The initial results will be published in mid-2020. Results: This foresight study will develop policies to prevent MI using scenario-based and modeling approaches. The findings can be useful for healthcare professionals, and they can improve our understanding of the future of MI to enhance the management of MI patients. Conclusion: The resulting policies will help policymakers make evidence-based decisions, redesign the structures and processes of healthcare interventions, and plan to decrease the MI mortality rate.


Symmetry ◽  
2018 ◽  
Vol 10 (12) ◽  
pp. 757 ◽  
Author(s):  
Olegas Prentkovskis ◽  
Živko Erceg ◽  
Željko Stević ◽  
Ilija Tanackov ◽  
Marko Vasiljević ◽  
...  

The daily requirements and needs imposed on the executors of logistics services imply the need for a higher level of quality. Here, the proper execution of all sustainability processes and activities plays an important role. In this paper, a new three-phase methodology for improving the measurement of service quality has been developed. The first phase is the application of the Delphi method to determine the quality dimension ranking. After that, in the second phase, using the FUCOM (full consistency method), we determined the weight coefficients of the quality dimensions. The third phase determines the level of quality using the SERVQUAL (service quality) model, i.e., the differences between the established gaps. The new methodology considers the assessment of the quality dimensions by a large number of participants (customers) on the one hand, and experts' assessments on the other. The methodology was verified through research carried out in an express post company. After processing and analyzing the collected data, the Cronbach alpha coefficient was calculated for each dimension of the SERVQUAL model to determine the reliability of the responses. To determine the validity of the results and the developed methodology, an extensive statistical analysis (ANOVA, Duncan, Signum, and chi-square tests) was carried out. The integration of certain methods and models into the new methodology has demonstrated greater objectivity and more precise results in determining the level of quality of sustainability processes and activities.
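The gap measurement and reliability check mentioned above can be sketched as follows. This is a minimal illustration with hypothetical Likert-scale data, not the authors' instrument or code:

```python
# Sketch of two computations used in the methodology above: the SERVQUAL gap
# (perception minus expectation) for one quality dimension, and Cronbach's
# alpha for the items of a dimension. All data shown are hypothetical.
from statistics import pvariance

def servqual_gap(perceptions: list[float], expectations: list[float]) -> float:
    """Mean perception score minus mean expectation score for one dimension."""
    return sum(perceptions) / len(perceptions) - sum(expectations) / len(expectations)

def cronbach_alpha(item_scores: list[list[float]]) -> float:
    """item_scores[i][j] = respondent j's score on item i of the dimension.

    alpha = k/(k-1) * (1 - sum of item variances / variance of respondent totals)
    """
    k = len(item_scores)
    item_vars = sum(pvariance(item) for item in item_scores)
    totals = [sum(resp) for resp in zip(*item_scores)]
    return k / (k - 1) * (1 - item_vars / pvariance(totals))

# Hypothetical dimension with 3 items rated by 4 respondents.
items = [[7, 8, 6, 7], [6, 8, 5, 7], [7, 9, 6, 8]]
alpha = cronbach_alpha(items)
gap = servqual_gap([7.0, 8.0, 6.5], [8.5, 9.0, 8.0])  # negative: expectations unmet
```

A negative gap on a dimension signals that perceived service falls short of expectations there, and an alpha near or above 0.7 is conventionally read as acceptable internal consistency for the items of that dimension.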


2021 ◽  
pp. 80-81
Author(s):  
Unmesh. A.K ◽  
Biju Bahuleyan

Introduction: In higher education, an outcome-based approach to teaching is the dictum. Assessment criteria should be designed to ensure that learning takes place at the level appropriate to the assigned skill. Familiarising students with the assessment criteria results in a self-motivated approach to attaining that skill. The objective of the present study is to determine the impact of awareness of assessment criteria on the performance of students in group activities. Materials and Methods: 100 phase-one MBBS students were included in the study. The whole batch was divided into 15 small groups, and each group was given a problem-based question to discuss. The role of each participant in the group was assessed by assessors using specified assessment criteria. Assessment was done in two phases; in the second phase, students were made aware of the assessment criteria. Students' reflections regarding the assessment criteria were also collected. Results: The scores obtained after the students were aware of the assessment criteria were higher, and the difference was found to be statistically significant. The majority of students reflected on the positive impact of being aware of the assessment criteria. Conclusion: Knowing the criteria on which one is being assessed motivates the student to perform better. In activities that were not assessed earlier, using assessment criteria and making the students aware of them would assure better performance.


Author(s):  
Paul Luna

This chapter considers the technologies that emerged in the publishing trade between 1970 and 2004. OUP’s response to technological change can be considered in three phases. Initially the Press invested in the computerization of typesetting as both a cost- and time-saving measure. During the second phase, the Press introduced the efficient use of computers in book design and experimented with the sale of software and packaged electronic publications. The third phase witnessed the advent of the worldwide web, which allowed the Press to develop and exploit new methods of advertisement, sales, distribution, and publication. Throughout these phases, the Press demonstrated a consistent desire to reduce costs, protect intellectual property, and expand into new markets. Along with these developments in publication and distribution, the chapter briefly considers the impact of the computerization of administrative, editorial, and other office tasks.

