Application of a Mixed Effects Model in Predicting Quality of Experience in World Wide Web Services

Author(s):  
Le Thu Nguyen ◽  
Richard Harris ◽  
Amal Punchihewa ◽  
Jusak Jusak
1996 ◽  
Vol 2 (1) ◽  
pp. 101-110
Author(s):  
Wayne Myles

We live under the spectre of never quite getting beyond the last upgrade in our array of new electronic tools. We have become unwittingly tied to an ever-increasing set of demands to learn, relearn, and apply the latest addition to our technological inventory. The advent of e-mail has compressed communication patterns, committing us to “immediate” responses. World Wide Web home pages explode information sources, leaving us floundering for the best hypertext link to follow. Computer databases spin out reports on every imaginable aspect of our work. How do we feel about our new status as “electronic advisors”? How is our interaction with students faring in all of this? Have we been able to secure more time for students to draw on our experience and knowledge through these labor-saving devices? What has happened to our priorities? Has quality of service to the students kept abreast of the demands of processing ever-increasing amounts of information?


1996 ◽  
Vol 5 (2) ◽  
pp. 16-18 ◽  
Author(s):  
Alistair Inglis

A comparative study was made of the ways in which Australian universities are disseminating information about their courses over the World Wide Web. The study examined the quantity and quality of the information provided, the forms in which information is presented, and means of access to the information. The results of the survey indicated that while the majority of universities are now publishing at least some information over the World Wide Web, both the quantity and quality of information is variable. Implications for further development of institutional course information databases are discussed.


Author(s):  
Dr. Manish L Jivtode

The broker architecture, involving clients and servers, became popular early on. Representational State Transfer (REST) is the architectural style of the World Wide Web. REST uses the HTTP protocol; web services based on Servlet and ASMX technology have been replaced by WCF web service technology. SOAP and REST are the two kinds of WCF web services. REST is lightweight compared to SOAP and has therefore emerged as the popular technology for building distributed applications in the cloud. In this paper, a study was conducted by exposing an HTTP endpoint address, an HTTP relay binding (webHttpRelayBinding), and a CRUD contract through an interface. The interface is decorated with WebGet and WebInvoke attributes. A WCF configuration file was created using XML tags for use with the REST web service.
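The abstract above describes exposing a CRUD contract over HTTP endpoints. As a rough illustration of that pattern, here is a minimal in-memory CRUD service sketched with Python's standard library rather than the paper's WCF/C# stack (WebGet/WebInvoke map loosely onto the GET/POST/DELETE handlers below); the resource name "items" and the JSON payload shape are illustrative assumptions, not taken from the paper.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

store = {}    # in-memory "database": id -> record
next_id = 1

class CrudHandler(BaseHTTPRequestHandler):
    def _send(self, code, body):
        # Serialize the body as JSON and write a complete HTTP response.
        data = json.dumps(body).encode()
        self.send_response(code)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

    def do_GET(self):       # Read: GET /items or GET /items/<id>
        parts = self.path.strip("/").split("/")
        if parts == ["items"]:
            self._send(200, list(store.values()))
        elif len(parts) == 2 and parts[0] == "items" and parts[1] in store:
            self._send(200, store[parts[1]])
        else:
            self._send(404, {"error": "not found"})

    def do_POST(self):      # Create: POST /items with a JSON body
        global next_id
        length = int(self.headers.get("Content-Length", 0))
        record = json.loads(self.rfile.read(length))
        record["id"] = str(next_id)
        store[record["id"]] = record
        next_id += 1
        self._send(201, record)

    def do_DELETE(self):    # Delete: DELETE /items/<id>
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "items" and store.pop(parts[1], None):
            self._send(200, {"deleted": parts[1]})
        else:
            self._send(404, {"error": "not found"})

    def log_message(self, *args):  # silence per-request console logging
        pass

# To serve: HTTPServer(("127.0.0.1", 8080), CrudHandler).serve_forever()
```

In WCF the same routing decisions would be declared declaratively via [WebGet] and [WebInvoke] attributes on the service contract interface, with the binding chosen in the XML configuration file.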


Replication [7] must work. In fact, few cyberneticists would disagree with the significant unification of the lookaside buffer and I/O automata, which embodies the practical principles of Bayesian complexity theory. In order to solve this question, we describe a novel methodology for the deployment of object-oriented languages (YAMP), disconfirming that the World Wide Web and robots can collude to realize this intent.


Author(s):  
Bill Karakostas ◽  
Yannis Zorgios

Chapter II presented the main concepts underlying business services. Ultimately, as this book proposes, business services need to be decomposed into networks of executable Web services. Web services are the primary software technology available today that closely matches the characteristics of business services. To understand the mapping from business to Web services, we need to understand the fundamental characteristics of the latter. This chapter therefore introduces the main Web services concepts and standards. It does not intend to be a comprehensive description of all standards applicable to Web services, as many of them are still in a state of flux. It focuses instead on the more important and stable standards. All such standards are fully and precisely defined and maintained by the organizations that have defined and endorsed them, such as the World Wide Web Consortium (http://w3c.org), the OASIS organization (http://www.oasis-open.org) and others. We advise readers to periodically visit the Web sites describing the various standards to obtain the up-to-date versions.


2011 ◽  
Vol 126 (2) ◽  
pp. 116-119 ◽  
Author(s):  
S Muthukumarasamy ◽  
Z Osmani ◽  
A Sharpe ◽  
R J A England

Abstract
Introduction: This study aimed to assess the quality of information available on the World Wide Web for patients undergoing thyroidectomy.
Methods: The first 50 web links generated by internet searches using the five most popular search engines and the key word ‘thyroidectomy’ were evaluated using the Lida website validation instrument (assessing accessibility, usability and reliability) and the Flesch Reading Ease Score.
Results: We evaluated 103 of a possible 250 websites. Mean scores (ranges) were: Lida accessibility, 48/63 (27–59); Lida usability, 36/54 (21–50); Lida reliability, 21/51 (4–38); and Flesch Reading Ease, 43.9 (2.6–77.6).
Conclusion: The quality of internet health information regarding thyroidectomy is variable. High ranking and popularity are not good indicators of website quality. Overall, none of the websites assessed achieved high Lida scores. In order to prevent the dissemination of inaccurate or commercially motivated information, we recommend independent labelling of medical information available on the World Wide Web.
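The Flesch Reading Ease score used in this study follows a standard formula: 206.835 − 1.015 × (words/sentences) − 84.6 × (syllables/words). A minimal sketch is below; the formula's constants are standard, but the syllable counter is a rough vowel-group heuristic (an assumption of this sketch), so its scores only approximate those of published readability tools.

```python
import re

def count_syllables(word):
    """Approximate syllables as runs of vowels, discounting a trailing silent 'e'."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1 and not word.endswith(("le", "ee")):
        count -= 1
    return max(count, 1)

def flesch_reading_ease(text):
    """FRE = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))
```

Higher scores indicate easier text: the study's mean of 43.9 sits in the range conventionally read as "difficult", consistent with its conclusion that much online thyroidectomy information is hard for patients to read.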

