Probabilistic Models for Ad Viewability Prediction on the Web

2017 ◽  
Vol 29 (9) ◽  
pp. 2012-2025 ◽  
Author(s):  
Chong Wang ◽  
Achir Kalra ◽  
Li Zhou ◽  
Cristian Borcea ◽  
Yi Chen




2018 ◽  
Author(s):  
Shaun C. D'Souza

Cognitive neuroscience is the study of how the human brain performs tasks such as decision making, language, perception and reasoning. Deep learning is a class of machine learning algorithms that use neural networks, which are designed to model the responses of neurons in the human brain. Learning can be supervised or unsupervised. N-gram token models are used extensively in language prediction: an N-gram is a probabilistic model that predicts the next word or token from the preceding sequence. Such statistical models of word or token sequences are called language models (LMs), and N-grams are essential in building them. We explore a broader sandbox ecosystem enabling AI, specifically deep learning applications on unstructured content on the web.
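As a minimal sketch of the N-gram idea described in this abstract, the following bigram model predicts the next token from the previous one using maximum-likelihood counts. The corpus and function names are illustrative, not taken from the paper:

```python
from collections import Counter, defaultdict

def train_bigram_model(tokens):
    """Count bigrams and derive P(next | prev) by maximum likelihood."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1
    return {
        prev: {nxt: c / sum(nxts.values()) for nxt, c in nxts.items()}
        for prev, nxts in counts.items()
    }

def predict_next(model, word):
    """Return the most probable next token, or None for an unseen word."""
    dist = model.get(word)
    if not dist:
        return None
    return max(dist, key=dist.get)

corpus = "the brain models the world and the brain learns".split()
model = train_bigram_model(corpus)
print(predict_next(model, "the"))  # "brain" follows "the" most often here
```

Real language models use higher-order N-grams with smoothing for unseen sequences; this sketch shows only the core counting-and-normalizing step.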


Author(s):  
S. Zimeras

Information system users, administrators, and designers are all interested in performance evaluation, since their goal is to obtain or provide the highest performance at the lowest cost. This goal has driven the continuing evolution of higher-performance, lower-cost systems, leading to today's proliferation of workstations and personal computers, many of which outperform earlier supercomputers. As the variety of Web services applications (Websites) grows, it becomes more important to have a set of criteria for evaluating their effectiveness. Based on those criteria, the quality of the services that the Web applications provide can be analysed. This work presents software metrics that could (or should) be used to quantify the quality of the information that Web services provide. These measures could help identify problematic frameworks during the implementation of Websites and could lead to solutions that prevent those problems.
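The abstract does not name specific metrics, but one plausible effectiveness criterion of the kind it describes is response-time quality against a target service level. The metric, sample values, and SLO threshold below are hypothetical, purely to illustrate quantifying Web-service performance:

```python
import statistics

def service_quality(samples, slo_ms=500):
    """Summarize response-time samples (ms) against a hypothetical SLO.

    Returns mean latency, 95th-percentile latency, and the fraction of
    requests that met the SLO.
    """
    met = sum(1 for s in samples if s <= slo_ms)
    ordered = sorted(samples)
    p95 = ordered[min(len(ordered) - 1, int(0.95 * len(ordered)))]
    return {
        "mean_ms": statistics.mean(samples),
        "p95_ms": p95,
        "slo_compliance": met / len(samples),
    }

print(service_quality([120, 180, 240, 300, 900], slo_ms=500))
```

A tail percentile is reported alongside the mean because a single slow outlier (the 900 ms sample above) can dominate user experience while barely moving the average.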




2011 ◽  
pp. 1896-1928 ◽  
Author(s):  
Livia Predoiu ◽  
Heiner Stuckenschmidt

Recently, there has been an increasing interest in formalisms for representing uncertain information on the Semantic Web. This interest is triggered by the observation that knowledge on the web is not always crisp, and we must be able to deal with incomplete, inconsistent and vague information. The treatment of this kind of information requires new approaches for knowledge representation and reasoning on the web, as existing Semantic Web languages are based on classical logic, which is known to be inadequate for representing uncertainty in many cases. While different general approaches for extending Semantic Web languages with the ability to represent uncertainty are being explored, we focus our attention on probabilistic approaches. We survey existing proposals for extending Semantic Web languages, or the formalisms underlying them, in terms of their expressive power, reasoning capabilities, and suitability for supporting typical tasks associated with the Semantic Web.
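The core idea the survey addresses can be sketched as attaching probabilities to Semantic-Web-style triples and propagating them through inference. This toy example is not any specific formalism from the survey; the entities, predicates, and probabilities are invented:

```python
# Hypothetical probabilistic facts: (subject, predicate, object) -> P(fact).
facts = {
    ("ex:Tweety", "rdf:type", "ex:Bird"): 0.9,
    ("ex:Bird", "rdfs:subClassOf", "ex:Animal"): 1.0,
}

def prob_type(entity, cls):
    """P(entity rdf:type cls), chaining through rdfs:subClassOf edges."""
    direct = facts.get((entity, "rdf:type", cls), 0.0)
    chained = 0.0
    for (s, p, o), pr in facts.items():
        if p == "rdfs:subClassOf" and o == cls:
            # Derive membership in cls via the subclass, multiplying
            # probabilities as if the facts were independent.
            chained = max(chained, prob_type(entity, s) * pr)
    return max(direct, chained)

print(prob_type("ex:Tweety", "ex:Animal"))  # 0.9, via the Bird subclass
```

Classical RDFS entailment would either derive `Tweety rdf:type Animal` or not; the probabilistic reading instead assigns the derived fact a degree of belief, which is the kind of extension the surveyed formalisms make precise.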




2008 ◽  
Vol 11 (2) ◽  
pp. 83-85
Author(s):  
Howard Wilson
