Estimating Effective Web Server Response Time

Author(s):  
Mehul Nalin Vora ◽  
Dhaval Shah
Author(s):  
Ibrahim Mahmood Ibrahim ◽  
Siddeeq Y. Ameen ◽  
Hajar Maseeh Yasin ◽  
Naaman Omar ◽  
Shakir Fattah Kak ◽  
...  

Today, web services have grown rapidly and are accessed by many users, generating massive traffic on the Internet. As the number of users grows, it becomes challenging for a single web server to manage this traffic: the server becomes overloaded, response times rise, and it turns into a bottleneck, so the traffic must be shared among several servers. Load balancing technologies and server clusters are therefore effective methods for relieving server bottlenecks. Load balancing techniques distribute the load among the servers in a cluster so that no single web server is overwhelmed. The motivation of this paper is to give an overview of the load balancing techniques used to enhance the efficiency of web servers in terms of response time, throughput, and resource utilization. Researchers have proposed various algorithms with good results; for example, the pending-job and IP-hash algorithms achieve better performance.
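The two algorithms singled out in the abstract can be sketched in a few lines. A minimal illustration (the backend names and the pending-request counts are invented for the example, not taken from the paper):

```python
import hashlib

# Hypothetical backend pool; the names are illustrative only.
SERVERS = ["web1", "web2", "web3"]

def ip_hash(client_ip: str, servers: list) -> str:
    """IP-hash selection: the same client IP always maps to the same
    server, which preserves session affinity without shared state."""
    digest = hashlib.md5(client_ip.encode()).hexdigest()
    return servers[int(digest, 16) % len(servers)]

def least_pending(pending: dict) -> str:
    """Pending-job (least-connections) selection: dispatch the next
    request to the server with the fewest outstanding requests."""
    return min(pending, key=pending.get)

# A given client IP is sticky under IP hash:
assert ip_hash("10.0.0.7", SERVERS) == ip_hash("10.0.0.7", SERVERS)
# The least-loaded server wins under pending-job scheduling:
print(least_pending({"web1": 4, "web2": 1, "web3": 3}))  # → web2
```

IP hash trades balance for affinity (a hot client stays pinned to one server), while pending-job scheduling adapts to uneven request cost at the price of tracking per-server state.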


2010 ◽  
Vol 29 (4) ◽  
pp. 214 ◽  
Author(s):  
Margaret Brown-Sica ◽  
Jeffrey Beall ◽  
Nina McHale

Response time as defined for this study is the time that it takes for all files that constitute a single webpage to travel across the Internet from a Web server to the end user’s browser. In this study, the authors tested response times on queries for identical items in five different library catalogs, one of them a next-generation (NextGen) catalog. The authors also discuss acceptable response time and how it may affect the discovery process. They suggest that librarians and vendors should develop standards for acceptable response time and use it in the product selection and development processes.
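Under the study's definition, a page's response time is governed by the slowest-finishing of its constituent files, fetched over a limited number of parallel connections. A toy model of that definition (the greedy scheduler and connection cap are simplifying assumptions; real browsers pipeline and prioritize):

```python
import heapq

def page_response_time(file_times, max_connections=6):
    """Estimate whole-page response time: the moment the last
    constituent file arrives, given per-file transfer times and a
    browser-style cap on parallel connections. Illustrative model only."""
    # Each heap entry is the time at which one connection becomes free.
    connections = [0.0] * min(max_connections, len(file_times))
    heapq.heapify(connections)
    for t in sorted(file_times, reverse=True):  # schedule longest first
        free_at = heapq.heappop(connections)
        heapq.heappush(connections, free_at + t)
    return max(connections)

# Four 1-second files over two connections finish in two "rounds":
print(page_response_time([1.0, 1.0, 1.0, 1.0], max_connections=2))  # → 2.0
```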


Author(s):  
Fatma Mbarek ◽  
Volodymyr Mosorov ◽  
Rafał Wojciechowski

This paper investigates the characteristics of web server response delay in order to understand and analyze optimization techniques for reducing latency. The latency behavior of a multi-process Apache HTTP server was analyzed with different thread counts and under various workloads. The results indicate that an insufficient number of threads for handling concurrent client requests is responsible for increased latency under load. The problem can be mitigated by a modified web server configuration that reduces response time.
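The remedy the abstract points at, raising the thread pool so concurrent clients are not queued behind exhausted workers, is set through Apache's MPM directives. A sketch of such a configuration (the specific values are illustrative and would need tuning against the actual workload; the paper does not publish its settings):

```apache
# Event MPM: enlarge the worker-thread pool so concurrent requests
# are served rather than queued. Values are illustrative only.
<IfModule mpm_event_module>
    StartServers             4
    ServerLimit             16
    ThreadsPerChild         64
    MaxRequestWorkers     1024   # ServerLimit * ThreadsPerChild
    MinSpareThreads         64
    MaxSpareThreads        256
</IfModule>
```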


2018 ◽  
Vol 5 (2) ◽  
pp. 68-78 ◽  
Author(s):  
Intan Yuli Andhica ◽  
Dadan Irwan

ABSTRACT

One of the most frequently used server services is providing access to websites, i.e., a web server. In this study, two servers, one running the Ubuntu operating system and one running Turnkey Linux, are compared in terms of web performance, measured by response time and throughput. The tests were carried out with request rates of 10 to 100 over 1000 and 2000 connections. Based on the results obtained, the web server running Ubuntu Linux outperforms Turnkey Linux, as indicated by its smaller response time (faster responses) and larger throughput. With 1000 connections the web server produces good response time and throughput values, while with 2000 connections both metrics degrade, because more than 1000 connections affects the speed of the system.

Keywords: performance, web server, response time, throughput
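The two metrics the study compares can be aggregated from raw benchmark output with a small helper (an illustrative sketch, not the tool the authors used; the sample numbers are invented):

```python
def summarize(latencies_ms, duration_s):
    """Reduce per-request latencies (ms) and the test duration (s) to
    the two metrics compared in the study: mean response time (ms)
    and throughput (completed requests per second)."""
    mean_rt = sum(latencies_ms) / len(latencies_ms)
    throughput = len(latencies_ms) / duration_s
    return mean_rt, throughput

# 1000 requests completed in 20 s, each taking 15 ms:
rt, tp = summarize([15.0] * 1000, duration_s=20.0)
print(rt, tp)  # → 15.0 50.0
```

Note the two metrics move independently: a server can keep mean response time low yet complete few requests per second, which is why the study reports both.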


2004 ◽  
Vol 22 (1) ◽  
pp. 49-93 ◽  
Author(s):  
David Olshefski ◽  
Jason Nieh ◽  
Dakshi Agrawal


