To Optimize Network Call Drops and Enhance Real-Time (CDR) Data Handling with Security

2020 ◽  
Author(s):  
Ashok G V ◽  
Dr. Vasanthi Kumari P

Telecom networks generate large and diverse data sets relating to networks, applications, users, network operations and real-time call processing (Call Detail Records, CDRs). These data can yield valuable business insights, for example real-time user quality of service, network issues, call drop issues, customer satisfaction index, customer churn, network capacity forecasts and many other revenue-impacting indicators. Simply installing more towers for better coverage is not an ideal remedy, since it can also directly affect the health of nearby inhabitants. In this paper, the overall condition of call drops is reviewed, along with possible ways to minimize the occurrence of network call drops. A linear regression algorithm, a widely used form of predictive analysis, is applied; regression analysis has three major uses: determining the strength of predictors, forecasting an effect and trend forecasting. This paper helps telecom service providers to improve their networks, minimize network call drops with security, and deliver quality of service to their subscribers using advanced technologies with accurate algorithms.
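The abstract does not include an implementation, but the regression step it describes might look like the following minimal sketch, assuming hypothetical CDR-derived features (signal strength, tower load, handovers per call) and a synthetic call-drop-rate target; none of the feature names or data come from the paper.

```python
# Minimal sketch: predicting call-drop rate from CDR-derived features.
# Feature names and data are hypothetical illustrations, not from the paper.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.uniform(-110, -60, n),   # mean signal strength (dBm)
    rng.uniform(0.1, 1.0, n),    # tower load (fraction of capacity)
    rng.integers(0, 10, n),      # handovers per call
])
# Synthetic target: drop rate rises with load and handovers, falls with signal.
y = 0.02 - 0.0004 * X[:, 0] + 0.05 * X[:, 1] + 0.01 * X[:, 2] \
    + rng.normal(0, 0.01, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LinearRegression().fit(X_train, y_train)

# Coefficient magnitudes indicate predictor strength (one of the three
# uses of regression named in the abstract).
print("coefficients:", model.coef_)
print("R^2 on held-out data:", model.score(X_test, y_test))
```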

2020 ◽  
Vol 21 (3) ◽  
pp. 181-190
Author(s):  
Jaroslav Frnda ◽  
Marek Durica ◽  
Mihail Savrasovs ◽  
Philippe Fournier-Viger ◽  
Jerry Chun-Wei Lin

Abstract. This paper analyses the possibility of using a Kohonen map for real-time evaluation of end-user video quality perception. The Quality of Service (QoS) framework describes how network impairments (network utilization or packet loss) influence picture quality, but it does not precisely reflect the customer's subjectively perceived quality of the received video stream. There are several objective video assessment metrics based on mathematical models that try to simulate the human visual system, but each of them has its own evaluation scale. This makes it difficult for service providers to identify the critical point at which intervention in the network behaviour is needed. On the other hand, subjective tests (the Quality of Experience concept) are time-consuming and costly and, of course, cannot be performed in real time. Therefore, we propose a mapping function able to predict subjective end-user quality perception based on the situation in the network, video stream features and results obtained from an objective video assessment method.
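The paper's trained mapping function is not reproduced here; as a rough sketch of the underlying technique, a small Kohonen (self-organising) map can be trained on network and stream features and its nodes later labelled with subjective scores. The feature choice below (packet loss, utilisation, objective metric score) and all data are illustrative assumptions.

```python
# Minimal self-organising (Kohonen) map sketch in NumPy.
# Inputs are assumed to be [packet loss %, utilisation %, objective score];
# the paper's actual feature set and training data differ.
import numpy as np

rng = np.random.default_rng(1)

def train_som(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0):
    """Train a 2-D Kohonen map on `data` (n_samples x n_features)."""
    h, w = grid
    weights = rng.random((h, w, data.shape[1]))
    coords = np.argwhere(np.ones((h, w))).reshape(h, w, 2)
    n_steps = epochs * len(data)
    step = 0
    for _ in range(epochs):
        for x in data[rng.permutation(len(data))]:
            frac = 1 - step / n_steps
            lr = lr0 * frac                  # decaying learning rate
            sigma = sigma0 * frac + 1e-3     # shrinking neighbourhood
            # Best-matching unit (BMU): node whose weights are closest to x.
            bmu = np.unravel_index(
                np.argmin(((weights - x) ** 2).sum(axis=2)), (h, w))
            # Gaussian neighbourhood pulls nodes near the BMU towards x.
            d2 = ((coords - np.array(bmu)) ** 2).sum(axis=2)
            g = np.exp(-d2 / (2 * sigma ** 2))[..., None]
            weights += lr * g * (x - weights)
            step += 1
    return weights

# Hypothetical inputs: [packet loss %, utilisation %, objective metric score].
data = rng.random((200, 3))
som = train_som(data)
# Each node can then be labelled with the subjective MOS of the training
# samples that map to it, yielding an objective-to-subjective mapping.
bmu = np.unravel_index(np.argmin(((som - data[0]) ** 2).sum(axis=2)), (8, 8))
print("sample 0 maps to node", bmu)
```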


2019 ◽  
Vol 13 (2) ◽  
Author(s):  
Catherine A. Bliss ◽  
Betty Lawrence

Asynchronous text-based discussion boards are included in many online courses; however, strategies to compare their use within and between courses, from a disciplinary standpoint, have not been well documented in the literature. The goal of this project was to develop a multi-factor metric that could be used to characterize discussion board use in a large data set (n = 11,596 message posts) and to apply this metric to all Mathematics courses offered in the January 2008 term by the Center for Distance Learning at Empire State College. The results of this work reveal that student participation rates, the quantity of student posts, the quality of student posts and the extent of threading are all well correlated with instructor activity.
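The exact factors of the metric are defined in the paper; purely as an illustration of the final correlation step, per-course participation measures could be correlated against instructor activity along these lines (column names and values are hypothetical):

```python
# Illustrative sketch: correlating per-course discussion-board measures
# with instructor activity. Column names and data are made up.
import pandas as pd

courses = pd.DataFrame({
    "instructor_posts":   [12, 30, 8, 22, 18],
    "student_posts":      [80, 210, 55, 160, 120],
    "participation_rate": [0.55, 0.82, 0.40, 0.75, 0.66],
    "mean_thread_depth":  [2.1, 3.4, 1.8, 3.0, 2.6],
})

# Pairwise Pearson correlations of each measure with instructor activity.
print(courses.corr()["instructor_posts"].drop("instructor_posts"))
```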


2019 ◽  
Vol 37 (3) ◽  
pp. 429-446 ◽  
Author(s):  
Michal Kačmařík ◽  
Jan Douša ◽  
Florian Zus ◽  
Pavel Václavovic ◽  
Kyriakos Balidakis ◽  
...  

Abstract. An analysis of the impact of processing settings on estimated tropospheric gradients is presented. The study is based on the benchmark data set collected within the COST GNSS4SWEC action, with observations from 430 Global Navigation Satellite Systems (GNSS) reference stations in central Europe for May and June 2013. Tropospheric gradients were estimated in eight different variants of GNSS data processing using precise point positioning (PPP) with the G-Nut/Tefnut software. The impacts of the gradient mapping function, elevation cut-off angle, GNSS constellation, observation elevation-dependent weighting and real-time versus post-processing mode were assessed by comparing the variants with each other and by evaluating them with respect to tropospheric gradients derived from two numerical weather models (NWMs). Tropospheric gradients estimated in post-processing GNSS solutions using final products were in good agreement with NWM outputs. Obtaining high-quality, high-resolution gradients in (near-)real-time PPP analysis remains challenging due to the quality of the real-time orbit and clock corrections. Comparisons of GNSS and NWM gradients suggest a 3° elevation cut-off angle and the GPS+GLONASS constellation for obtaining optimal gradient estimates, provided precise models for antenna phase-centre offsets and variations and tropospheric mapping functions are applied for low-elevation observations. Finally, systematic errors can affect the gradient components solely through the use of different gradient mapping functions, with a further dependence on the observation elevation-dependent weighting. A latitudinal tilting of the troposphere on a global scale causes a systematic difference of up to 0.3 mm in the north-gradient component, while large local gradients, usually pointing in the direction of increasing humidity, can cause differences of up to 1.0 mm (or even more in extreme cases) in any component, depending on the actual direction of the gradient. Although the Bar-Sever gradient mapping function provided slightly better results in some respects, it is not possible to give a strong recommendation on the choice of gradient mapping function.
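The benchmark comparison itself requires the full GNSS processing chain, but the evaluation step, comparing GNSS-estimated gradient components against NWM-derived ones via bias and standard deviation, can be sketched as follows with synthetic stand-in values:

```python
# Illustrative comparison of GNSS-derived and NWM-derived tropospheric
# gradient components; the arrays are synthetic stand-ins, not the
# benchmark data from the paper.
import numpy as np

rng = np.random.default_rng(2)
n = 1000
nwm_north = rng.normal(0.0, 0.5, n)                    # north gradient (mm)
gnss_north = nwm_north + 0.3 + rng.normal(0, 0.2, n)   # assumed 0.3 mm offset

diff = gnss_north - nwm_north
print(f"north-component bias: {diff.mean():.2f} mm, "
      f"std dev: {diff.std(ddof=1):.2f} mm")
```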


2020 ◽  
Author(s):  
Markus Wiedemann ◽  
Bernhard S.A. Schuberth ◽  
Lorenzo Colli ◽  
Hans-Peter Bunge ◽  
Dieter Kranzlmüller

Precise knowledge of the forces acting at the base of tectonic plates is of fundamental importance, but models of mantle dynamics are still often qualitative in nature to date. One particular problem is that we cannot access the deep interior of our planet and therefore cannot make direct in situ measurements of the relevant physical parameters. Fortunately, modern software and powerful high-performance computing infrastructures allow us to generate complex three-dimensional models of the time evolution of mantle flow through large-scale numerical simulations.

In this project, we aim to visualize the resulting convective patterns that occur thousands of kilometres below our feet and to make them "accessible" using high-end virtual reality techniques.

Models with several hundred million grid cells are nowadays possible using modern supercomputing facilities, such as those available at the Leibniz Supercomputing Centre. These models provide quantitative estimates of the inaccessible parameters, such as buoyancy and temperature, as well as predictions of the associated gravity field and seismic wavefield that can be tested against Earth observations.

3-D visualizations of the computed physical parameters allow us to inspect the models as if one were actually travelling down into the Earth. In this way, convective processes that occur thousands of kilometres below our feet become virtually accessible by combining the simulations with high-end VR techniques.

The large data set used here poses severe challenges for real-time visualization, because it cannot fit into graphics memory while requiring rendering with strict deadlines. This raises the need to balance the amount of displayed data against the time needed to render it.

As a solution, we introduce a rendering framework and describe our workflow for visualizing this geoscientific dataset. Our example exceeds 16 TByte in size, which is beyond the capabilities of most visualization tools. To display this dataset in real time, we reduce and declutter it through isosurfacing and mesh optimization techniques.

Our rendering framework relies on multithreading and data-decoupling mechanisms that allow us to upload data to graphics memory while maintaining high frame rates. The final visualization application can be executed in a CAVE installation as well as on head-mounted displays such as the HTC Vive or Oculus Rift. The latter devices will allow our example to be viewed on-site at the EGU conference.
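The project's rendering framework is not shown here, but the isosurfacing step it describes can be sketched with standard tools; the synthetic temperature volume and the 0.5 iso-level below are stand-ins for the actual 16 TByte mantle-flow data.

```python
# Minimal isosurfacing sketch: extract an isosurface from a 3-D scalar
# field, as a small stand-in for the much larger mantle-flow data set.
import numpy as np
from skimage import measure

# Synthetic 3-D "temperature" volume (a hot blob in the centre).
z, y, x = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
volume = np.exp(-(x**2 + y**2 + z**2) * 4)

# Marching cubes turns the volume into a triangle mesh at one iso-level;
# mesh decimation (not shown) would then reduce it further for rendering.
verts, faces, normals, values = measure.marching_cubes(volume, level=0.5)
print(f"{len(verts)} vertices, {len(faces)} triangles")
```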


2020 ◽  
Vol 8 (2) ◽  
pp. 16-23
Author(s):  
Lilik Mulyati

Community satisfaction measurements are intended to determine the level of community satisfaction with the services provided and to give the community an opportunity to assess the services they have received. The design of this study is quantitative with a cross-sectional approach. The study involved 59 respondents selected by accidental sampling and was conducted from April to May 2019 at Wonosari Health Center, Bondowoso Regency. The measuring instrument used was a questionnaire, and bivariate analysis was then performed with Spearman rank analysis at a significance level of 0.05. The Spearman rank test yielded a p-value of 0.000 (p < 0.05), meaning that there is a relationship between the quality of inpatient services and the Wonosari Community Health Center community satisfaction index, with a correlation coefficient of 0.714. The quality of inpatient care is reflected in the index of community satisfaction with the health services received. This quality of service can ultimately provide several benefits, including harmonious relationships between service providers and customers, a good basis for creating customer loyalty, and favourable recommendations for these service providers.
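As an illustration of the statistical test used, a Spearman rank correlation between two questionnaire score series can be computed as follows; the scores are invented, not the study's data:

```python
# Illustrative Spearman rank correlation, mirroring the analysis described.
# The per-respondent scores below are made up.
from scipy.stats import spearmanr

service_quality = [72, 85, 90, 60, 78, 88, 65, 95, 70, 82]
satisfaction    = [70, 80, 92, 58, 75, 85, 68, 96, 66, 79]

rho, p_value = spearmanr(service_quality, satisfaction)
print(f"rho = {rho:.3f}, p = {p_value:.4f}")   # reject H0 if p < 0.05
```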


Author(s):  
Vesna Radonjic Djogatovic ◽  
Marko Djogatovic

This chapter aims to provide new possibilities for service providers to enhance their revenues using an appropriate pricing scheme. The features and applicability of a responsive pricing scheme and of hybrid pricing for charging end users in next-generation networks are discussed. Game theory is used as the underlying concept for implementing the pricing. In addition, transparent mapping of quality-of-service parameters to quality of business is considered, encompassing the dependence of the service price on quality-of-service violations, which is consequently reflected in the service provider's revenue.
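The chapter's pricing model is not reproduced here; as a rough sketch of the idea that the service price depends on quality-of-service violation, a base price could be rebated in proportion to how far a measured QoS parameter exceeds its agreed bound. All names and values below are illustrative:

```python
# Hypothetical sketch of a QoS-dependent price: the base price is rebated
# in proportion to how badly a QoS parameter (here, delay) violates its
# agreed bound. Parameter names and values are illustrative only.
def service_price(base_price: float, measured_delay_ms: float,
                  agreed_delay_ms: float, rebate_per_ms: float = 0.02) -> float:
    violation = max(0.0, measured_delay_ms - agreed_delay_ms)
    return max(0.0, base_price - rebate_per_ms * violation)

# Provider revenue falls when delivered quality drops below the agreement.
print(service_price(10.0, measured_delay_ms=120, agreed_delay_ms=100))  # 9.6
print(service_price(10.0, measured_delay_ms=90, agreed_delay_ms=100))   # 10.0
```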


Author(s):  
Axel Aulin ◽  
Khurram Shahzad ◽  
Robert MacKenzie ◽  
Steven Bott

Abstract. Effective and efficient crack management programs for liquids pipelines require consistent, high-quality non-destructive examination (NDE) to allow validation of crack in-line inspection (ILI) results. Enbridge leveraged multiple NDE techniques on a 26-inch flash-welded pipe as part of a crack management program. This line is challenging to inspect given the irregular geometry of the weld. In addition, the majority of the flaws are located on the internal surface, so buffing to obtain accurate measurements in the ditch is not possible. As such, to ensure a robust validation of crack ILI performance on the line, phased array ultrasonic testing (PAUT), time-of-flight diffraction (TOFD) and a full matrix capture (FMC) technology were all used as part of the validation dig program. PAUT and FMC were used on most of the flaws characterized in the dig program, providing a relatively large data set for further analysis. Encoded scans of the flash-welded long seam were collected in the ditch, and additional analyses were performed off-site to characterize and size the flaws. Where possible, buff-sizing and coupon cutouts were completed to provide an additional source of truth. Secondary review of the results by an NDE specialist improved their quality and identified locations for rescanning due to data-quality concerns. Physical defect examinations completed after destructive testing of sample coupon cutouts were used to generate a correlation between the actual defect size, from fracture-surface observation, and the field measurements obtained with the various NDE methods. This paper reviews the findings from the program, including quality-related learnings implemented into standard NDE procedures, as well as comparisons of detection and sizing from each methodology. Finally, the benefits and limitations of each technique, based on experience from this challenging inspection program, are summarized.
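The program's correlation between field NDE measurements and destructively confirmed defect sizes is not reproduced here; that analysis step might be sketched as a simple linear fit plus an error summary, with invented depth values standing in for the real data:

```python
# Illustrative sketch of correlating field NDE depth measurements with
# "true" depths from destructive coupon examination; values are made up.
import numpy as np

true_depth = np.array([1.2, 2.5, 0.8, 3.1, 1.9, 2.2])   # mm, from cutouts
paut_depth = np.array([1.4, 2.3, 1.0, 2.8, 2.1, 2.4])   # mm, PAUT in-ditch

slope, intercept = np.polyfit(true_depth, paut_depth, 1)
rmse = np.sqrt(np.mean((paut_depth - true_depth) ** 2))
print(f"fit: PAUT = {slope:.2f} * true + {intercept:.2f}, RMSE = {rmse:.2f} mm")
```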


2018 ◽  
Vol 14 (10) ◽  
pp. 155014771880568 ◽  
Author(s):  
Wu Jiawei ◽  
Qiao Xiuquan ◽  
Nan Guoshun

Recently, there has been a surge in video services over the Internet. However, service providers still have difficulty providing high-quality video streaming because of inefficient scheduling and wide fluctuations in end-to-end delay in existing multi-path algorithms. To solve these two problems affecting video transmission quality, networks are expected to be able to manage network nodes dynamically to satisfy quality-of-service requirements, which is a challenging issue for media streaming applications. Against this changing network landscape, this article proposes a dynamic and adaptive multi-path routing algorithm under three constraints (packet loss, time delay and bandwidth), based on software-defined networking for centralized route computation and real-time network-state updating in multimedia applications. Compared with related multi-path routing proposals, dynamic and adaptive multi-path routing makes efficient use of the latest global network-state information obtained by the OpenFlow controller and calculates optimal routes dynamically according to the real-time status of each link. Moreover, the proposed algorithm can significantly reduce the computational overhead of the controller while achieving a fine-grained flow balance. Experimental results show that dynamic and adaptive multi-path routing significantly outperforms other existing scheduling approaches, achieving a 35%–70% improvement in quality of service.
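The article's algorithm runs inside an OpenFlow controller; as a simplified single-path sketch of three-constraint routing (not the authors' implementation), links failing the bandwidth or loss constraints can be pruned before minimising delay over the remainder. The topology and thresholds below are invented:

```python
# Sketch of constrained path selection in the spirit of the three-constraint
# (loss, delay, bandwidth) routing described; topology, constraint values
# and the single-path simplification are illustrative assumptions.
import networkx as nx

G = nx.Graph()
# (u, v, delay ms, loss fraction, bandwidth Mbps) -- hypothetical links
links = [("A", "B", 10, 0.01, 100), ("B", "D", 15, 0.02, 80),
         ("A", "C", 12, 0.005, 50), ("C", "D", 8, 0.01, 120),
         ("B", "C", 5, 0.03, 200)]
for u, v, delay, loss, bw in links:
    G.add_edge(u, v, delay=delay, loss=loss, bw=bw)

def constrained_path(g, src, dst, min_bw, max_loss):
    # Keep only links satisfying the bandwidth and loss constraints,
    # then minimise total delay over what remains.
    ok = [(u, v) for u, v, d in g.edges(data=True)
          if d["bw"] >= min_bw and d["loss"] <= max_loss]
    return nx.shortest_path(g.edge_subgraph(ok), src, dst, weight="delay")

print(constrained_path(G, "A", "D", min_bw=60, max_loss=0.02))  # ['A','B','D']
```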


2019 ◽  
pp. 016555151986334 ◽  
Author(s):  
Shah Khalid ◽  
Shengli Wu ◽  
Aftab Alam ◽  
Irfan Ullah

Scholars routinely search for relevant papers to discover and put a new idea into proper context. Despite ongoing advances in scholarly retrieval technologies, locating relevant papers through keyword queries is still quite challenging due to the massive expansion in the size of research paper repositories. To tackle this problem, we propose a novel real-time feedback query expansion technique, a two-stage interactive scholarly search process. Upon receiving the initial search query, the retrieval system provides a ranked list of results. In the second stage, the user selects a few relevant papers, from which useful terms are extracted for query expansion. The newly expanded query is run against the index in real time to generate the final list of research papers. In both stages, citation analysis is used to further improve the quality of the results. The novelty of the approach lies in the combined exploitation of query expansion and citation analysis, which can bring the most relevant papers to the top of the search results list. Experimental results on the Association for Computational Linguistics (ACL) Anthology Network data set demonstrate that this technique is more effective and robust than several state-of-the-art approaches for locating relevant papers, in terms of normalised discounted cumulative gain (nDCG), precision and recall.
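The paper's term-selection and citation-analysis details are its own; as a minimal sketch of the second-stage expansion alone, high-weight terms can be pulled from the user-selected papers and appended to the query. The corpus and selection rule below are assumptions, and citation analysis is omitted:

```python
# Sketch of relevance-feedback query expansion: aggregate TF-IDF weights
# over the user-selected relevant papers and append the top terms to the
# query. Documents and the top-3 rule are illustrative assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer

selected_docs = [
    "query expansion improves scholarly retrieval effectiveness",
    "relevance feedback selects expansion terms from top documents",
]
query = "scholarly search"

vec = TfidfVectorizer(stop_words="english")
tfidf = vec.fit_transform(selected_docs)
scores = tfidf.sum(axis=0).A1                 # aggregate term weights
terms = vec.get_feature_names_out()
top = [t for _, t in sorted(zip(scores, terms), reverse=True)[:3]]

expanded_query = query + " " + " ".join(t for t in top if t not in query)
print(expanded_query)
```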

