Function Summarization Modulo Theories

10.29007/d3bt ◽  
2018 ◽  
Author(s):  
Sepideh Asadi ◽  
Martin Blicha ◽  
Grigory Fedyukovich ◽  
Antti Hyvärinen ◽  
Karine Even-Mendoza ◽  
...  

SMT-based program verification can achieve high precision using bit-precise models or combinations of different theories. Often such approaches suffer from scalability problems due to the complexity of the underlying decision procedures. Precision is traded for performance by increasing the abstraction level of the model, but as the level of abstraction increases, missing important details of the program model becomes problematic. In this paper we address this problem with an incremental verification approach that adjusts the precision of the program modules on demand. The idea is to model a program using the lightest possible (i.e., least expensive) theories that suffice to verify the desired property. To this end, we employ safe over-approximations of the program based on both function summaries and light-weight SMT theories. If during verification it turns out that the precision is too low, our approach lazily strengthens all affected summaries or the theory through an iterative refinement procedure. The resulting summarization framework provides a natural and light-weight way to carry information between different theories. An experimental evaluation with a bounded model checker for C on a wide range of benchmarks demonstrates that our approach scales well, often effortlessly solving instances on which the state-of-the-art model checker CBMC runs out of time or memory.
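As a toy illustration of the "lightest theory first, refine on demand" idea (not the authors' tool), the sketch below checks a bound on f(x) = x·x − x by first trying a cheap interval abstraction, and escalating to an exhaustive evaluation (standing in for a bit-precise SMT query) only when the abstraction is too coarse. All names are illustrative.

```python
def f(x):
    return x * x - x

def interval_check(lo, hi, bound):
    """Light-weight theory: interval arithmetic over-approximates f by
    treating the two occurrences of x as independent variables."""
    sq_hi = max(lo * lo, hi * hi)      # upper bound of x*x on [lo, hi]
    return sq_hi - lo < bound          # upper bound of (x*x) - x

def bit_precise_check(lo, hi, bound):
    """Heavy theory: exhaustive evaluation over every input
    (stands in for a bit-precise SMT query)."""
    return all(f(x) < bound for x in range(lo, hi + 1))

def verify(lo, hi, bound):
    # Try the cheapest sufficient theory first; refine only on demand.
    if interval_check(lo, hi, bound):
        return "safe (interval abstraction)"
    if bit_precise_check(lo, hi, bound):
        return "safe (bit-precise)"
    return "unsafe"
```

On [0, 3] the interval abstraction proves f < 10 outright, but for the tighter bound f < 8 it is inconclusive (it cannot see that the two occurrences of x are correlated), so the precise check is needed.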

Sensors ◽  
2020 ◽  
Vol 20 (15) ◽  
pp. 4114
Author(s):  
Shao-Kang Huang ◽  
Chen-Chien Hsu ◽  
Wei-Yen Wang ◽  
Cheng-Hung Lin

Accurate estimation of 3D object pose is highly desirable in a wide range of applications, such as robotics and augmented reality. Although significant advances have been made in pose estimation, there is room for further improvement. Recent pose estimation systems use an iterative refinement process to revise the predicted pose and obtain a better final output. However, such a refinement process takes only geometric features into account during the iteration. Motivated by this observation, this paper designs a novel iterative refinement process that exploits both color and geometric features for object pose refinement. Experiments show that the proposed method reaches 94.74% and 93.2% in the ADD(-S) metric with only 2 iterations, outperforming the state-of-the-art methods on the LINEMOD and YCB-Video datasets, respectively.


2020 ◽  
Vol 12 ◽  
Author(s):  
Francisco Basílio ◽  
Ricardo Jorge Dinis-Oliveira

Background: Pharmacobezoars are specific types of bezoars formed when medicines, such as tablets, suspensions, and/or drug delivery systems, aggregate; they may cause death by occluding airways with tenacious material or by eluting drugs, resulting in toxic or lethal blood concentrations. Objective: This work aims to fully review the state of the art regarding the pathophysiology, diagnosis, treatment and other relevant clinical and forensic features of pharmacobezoars. Results: Patients of a wide range of ages and of both sexes present with signs and symptoms of intoxication or, more commonly, of gastrointestinal obstruction. The exact mechanisms of pharmacobezoar formation are unknown but are likely multifactorial. Diagnosis and treatment depend on the gastrointestinal segment affected and should be personalized to the medication and the underlying factor. A good and complete history, physical examination, imaging tests, upper endoscopy, and surgery through laparotomy for the lower tract are useful for diagnosis and treatment. Conclusion: Pharmacobezoars are rarely seen in clinical and forensic practice. They are related to controlled- or immediate-release formulations, liquid or non-digestible substances, normal or altered digestive motility/anatomy, and overdoses or therapeutic doses, and should be suspected in the presence of risk factors or in patients taking drugs which may form pharmacobezoars.


This volume vividly demonstrates the importance and increasing breadth of quantitative methods in the earth sciences. With contributions from an international cast of leading practitioners, chapters cover a wide range of state-of-the-art methods and applications, including computer modeling and mapping techniques. Many chapters also contain reviews and extensive bibliographies which serve to make this an invaluable introduction to the entire field. In addition to its detailed presentations, the book includes chapters on the history of geomathematics and on R.G.V. Eigen, the "father" of mathematical geology. Written to commemorate the 25th anniversary of the International Association for Mathematical Geology, the book will be sought after by both practitioners and researchers in all branches of geology.


Author(s):  
Jeremias Prassl

The rise of the gig economy is disrupting business models across the globe. Platforms’ digital work intermediation has had a profound impact on traditional conceptions of the employment relationship. The completion of ‘tasks’, ‘gigs’, or ‘rides’ in the (digital) crowd fundamentally challenges our understanding of work in modern labour markets: gone are the stable employment relationships between firms and workers, replaced by a world in which everybody can be ‘their own boss’ and enjoy the rewards—and face the risks—of independent businesses. Is this the future of work? What are the benefits and challenges of crowdsourced work? How can we protect consumers and workers without stifling innovation? Humans as a Service provides a detailed account of the growth and operation of gig-economy platforms, and develops a blueprint for solutions to the problems facing on-demand workers, platforms, and their customers. Following a brief introduction to the growth and operation of on-demand platforms across the world, the book scrutinizes competing narratives about ‘gig’ work. Drawing on a wide range of case studies, it explores how claims of ‘disruptive innovation’ and ‘micro-entrepreneurship’ often obscure the realities of precarious work under strict algorithmic surveillance, and the return to a business model that has existed for centuries. Humans as a Service shows how employment law can address many of these problems: gigs, tasks, and rides are work—and should be regulated as such. A concluding chapter demonstrates the broader benefits of a level playing field for consumers, taxpayers, and innovative entrepreneurs.


2021 ◽  
Vol 15 (5) ◽  
pp. 1-32
Author(s):  
Quang-huy Duong ◽  
Heri Ramampiaro ◽  
Kjetil Nørvåg ◽  
Thu-lan Dam

Dense subregion (subgraph & subtensor) detection is a well-studied area with a wide range of applications, and numerous efficient approaches and algorithms have been proposed. Approximation approaches are commonly used for detecting dense subregions due to the complexity of the exact methods. Existing algorithms are generally efficient for dense subtensor and subgraph detection, and can perform well in many applications. However, most of the existing works utilize the state-of-the-art greedy 2-approximation algorithm, which provides solutions with only a loose theoretical density guarantee. The main drawback of most of these algorithms is that they can estimate only one subtensor, or subgraph, at a time, with a low guarantee on its density. While some methods can, on the other hand, estimate multiple subtensors, they can give a density guarantee with respect to the input tensor for the first estimated subtensor only. We address these drawbacks by providing both theoretical and practical solutions for estimating multiple dense subtensors in tensor data with a higher lower bound on the density. In particular, we guarantee and prove a higher lower bound on the density of the estimated subgraphs and subtensors. We also propose a novel approach to show that there are multiple dense subtensors with a density guarantee greater than the lower bound used in the state-of-the-art algorithms. We evaluate our approach with extensive experiments on several real-world datasets, which demonstrate its efficiency and feasibility.
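For context, the greedy 2-approximation baseline the abstract refers to (Charikar-style peeling for the densest subgraph, with density |E|/|V|) can be sketched as follows; the implementation and names are illustrative, not the authors' code.

```python
from collections import defaultdict

def densest_subgraph_peel(edges):
    """Greedy 2-approximation: repeatedly remove a minimum-degree vertex,
    remembering the densest intermediate subgraph (density = |E|/|V|)."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    nodes = set(adj)
    m = sum(len(nbrs) for nbrs in adj.values()) // 2
    best_density, best_nodes = m / len(nodes), set(nodes)
    while len(nodes) > 1:
        v = min(nodes, key=lambda x: len(adj[x]))   # min-degree vertex
        for u in adj[v]:                            # detach v from the graph
            adj[u].discard(v)
        m -= len(adj[v])
        adj.pop(v)
        nodes.remove(v)
        density = m / len(nodes)
        if density > best_density:
            best_density, best_nodes = density, set(nodes)
    return best_nodes, best_density
```

The subgraph returned is guaranteed to have at least half the optimal density, which is exactly the "loose theoretical density guarantee" the abstract's stronger bounds improve on.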


2021 ◽  
Vol 50 (1) ◽  
pp. 33-40
Author(s):  
Chenhao Ma ◽  
Yixiang Fang ◽  
Reynold Cheng ◽  
Laks V.S. Lakshmanan ◽  
Wenjie Zhang ◽  
...  

Given a directed graph G, the directed densest subgraph (DDS) problem is to find the subgraph of G whose density is the highest among all subgraphs of G. The DDS problem is fundamental to a wide range of applications, such as fraud detection, community mining, and graph compression. However, existing DDS solutions suffer from efficiency and scalability problems: on a three-thousand-edge graph, it takes three days for one of the best exact algorithms to complete. In this paper, we develop an efficient and scalable DDS solution. We introduce the notion of the [x, y]-core, which is a dense subgraph of G, and show that the densest subgraph can be accurately located through the [x, y]-core with theoretical guarantees. Based on the [x, y]-core, we develop both exact and approximation algorithms. We have performed an extensive evaluation of our approaches on eight large real-world datasets. The results show that our proposed solutions are up to six orders of magnitude faster than the state-of-the-art.
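To make the objective concrete: a standard density measure for directed graphs picks two (possibly overlapping) vertex sets S and T and maximizes ρ(S, T) = |E(S, T)| / √(|S|·|T|). The brute-force sketch below computes it exactly on toy-sized graphs; it is exponential and only illustrates the objective, not the paper's [x, y]-core algorithms.

```python
from itertools import chain, combinations
from math import sqrt

def subsets(xs):
    """All non-empty subsets of xs."""
    return chain.from_iterable(combinations(xs, r) for r in range(1, len(xs) + 1))

def directed_densest(nodes, edges):
    """Exhaustively maximize rho(S, T) = |E(S, T)| / sqrt(|S||T|),
    where E(S, T) are edges from S to T."""
    edge_set = set(edges)
    best = (0.0, None, None)
    for S in subsets(nodes):
        for T in subsets(nodes):
            e = sum((u, v) in edge_set for u in S for v in T)
            rho = e / sqrt(len(S) * len(T))
            if rho > best[0]:
                best = (rho, set(S), set(T))
    return best
```

The exact algorithms the abstract mentions solve this same objective via flow-based techniques; the [x, y]-core prunes the search space they must examine.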


2021 ◽  
Author(s):  
Danila Piatov ◽  
Sven Helmer ◽  
Anton Dignös ◽  
Fabio Persia

We develop a family of efficient plane-sweeping interval join algorithms for evaluating a wide range of interval predicates such as Allen’s relationships and parameterized relationships. Our technique is based on a framework, components of which can be flexibly combined in different manners to support the required interval relation. In temporal databases, our algorithms can exploit a well-known and flexible access method, the Timeline Index, thus expanding the set of operations it supports even further. Additionally, employing a compact data structure, the gapless hash map, we utilize the CPU cache efficiently. In an experimental evaluation, we show that our approach is several times faster and scales better than state-of-the-art techniques, while being much better suited for real-time event processing.
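As a minimal illustration of the plane-sweep idea (for the simple overlap predicate only; this is not the paper's Timeline Index or gapless hash map), one can join two interval lists by sweeping a merged endpoint stream and pairing each opening interval with the currently active partners:

```python
def overlap_join(R, S):
    """Return all (r, s) index pairs where R[r] and S[s] overlap.
    Intervals are (start, end) with start <= end, closed on both sides."""
    # One merged endpoint stream: (time, is_end, relation, index).
    events = []
    for i, (b, e) in enumerate(R):
        events.append((b, 0, 'R', i))
        events.append((e, 1, 'R', i))
    for j, (b, e) in enumerate(S):
        events.append((b, 0, 'S', j))
        events.append((e, 1, 'S', j))
    events.sort()   # at equal timestamps, starts (0) come before ends (1)
    active = {'R': set(), 'S': set()}
    out = []
    for _, is_end, rel, idx in events:
        if not is_end:                     # interval opens:
            other = 'S' if rel == 'R' else 'R'
            for k in active[other]:        # join with every open partner
                out.append((idx, k) if rel == 'R' else (k, idx))
            active[rel].add(idx)
        else:                              # interval closes
            active[rel].discard(idx)
    return sorted(out)
```

Sorting starts before ends at equal timestamps makes touching closed intervals count as overlapping; other Allen relations are obtained by varying which active partners qualify at each event, which is the kind of combinable component the framework describes.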


Electronics ◽  
2021 ◽  
Vol 10 (5) ◽  
pp. 537
Author(s):  
Hongxiang Gu ◽  
Miodrag Potkonjak

Physical Unclonable Functions (PUFs) are known for their unclonability and light-weight design. However, several known issues with state-of-the-art PUF designs exist, including vulnerability to machine learning attacks, low output randomness, and low reliability. To address these problems, we present a reconfigurable interconnected PUF network (IPN) design that significantly strengthens the security and unclonability of strong PUFs. While the IPN structure itself significantly increases the system complexity and nonlinearity, the reconfiguration mechanism remaps the input–output mapping before an attacker can collect sufficient challenge-response pairs (CRPs). We also propose using an evolution strategies (ES) algorithm to efficiently search for a network configuration that is capable of producing random and stable responses. The experimental results show that applying state-of-the-art machine learning attacks results in less than 53.19% accuracy for single-bit output prediction on a reconfigurable IPN with random configurations. We also show that, when applying configurations explored by our proposed ES method instead of random configurations, the output randomness is significantly improved by 220.8% and the output stability by at least 22.62% in different variations of the IPN.
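A minimal sketch of an evolution-strategies-style search over bit-vector configurations, of the kind the abstract describes: the elitist (1+λ) scheme and the fitness function below are generic stand-ins, not the authors' IPN model or scoring.

```python
import random

def evolve(fitness, n_bits=32, lam=8, generations=50, seed=0):
    """Elitist (1+lambda) ES over bit vectors: each generation produces
    lam mutated children and keeps the best configuration seen so far."""
    rng = random.Random(seed)
    parent = [rng.randint(0, 1) for _ in range(n_bits)]
    best = fitness(parent)
    for _ in range(generations):
        for _ in range(lam):
            # Flip each bit independently with probability 1/n_bits.
            child = [b ^ (rng.random() < 1.0 / n_bits) for b in parent]
            f = fitness(child)
            if f >= best:
                parent, best = child, f
    return parent, best
```

In the paper's setting the fitness would score a candidate IPN configuration by the randomness and stability of its responses; here any objective over bit vectors (e.g., a simple bit count) demonstrates the search loop.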


2020 ◽  
Vol 499 (4) ◽  
pp. 5732-5748 ◽  
Author(s):  
Rahul Kannan ◽  
Federico Marinacci ◽  
Mark Vogelsberger ◽  
Laura V Sales ◽  
Paul Torrey ◽  
...  

We present a novel framework to self-consistently model the effects of radiation fields, dust physics, and molecular chemistry (H2) in the interstellar medium (ISM) of galaxies. The model combines a state-of-the-art radiation hydrodynamics module with a hydrogen and helium non-equilibrium thermochemistry module that accounts for H2, coupled to an empirical dust formation and destruction model, all integrated into the new stellar feedback framework SMUGGLE. We test this model on high-resolution isolated Milky Way (MW) simulations. We show that the effect of radiation feedback on galactic star formation rates is quite modest in low gas surface density galaxies like the MW. The multiphase structure of the ISM, however, is highly dependent on the strength of the interstellar radiation field. We are also able to predict the distribution of H2, which allows us to match the molecular Kennicutt–Schmidt (KS) relation without calibrating for it. We show that the dust distribution is a complex function of the density, temperature, and ionization state of the gas. Our model is also able to match the observed dust temperature distribution in the ISM. Our state-of-the-art model is well-suited for performing next-generation cosmological galaxy formation simulations, which will be able to predict a wide range of resolved (∼10 pc) properties of galaxies.


Author(s):  
Jose A. Gallud ◽  
Monica Carreño ◽  
Ricardo Tesoriero ◽  
Andrés Sandoval ◽  
María D. Lozano ◽  
...  

Technology-based education of children with special needs has become the focus of many research works in recent years. The wide range of disabilities encompassed by the term “special needs”, together with the educational requirements of the children affected, represents an enormous multidisciplinary challenge for the research community. In this article, we present a systematic literature review of technology-enhanced and game-based learning systems and methods applied to children with special needs. The article analyzes the state of the art of research in this field by selecting a group of primary studies and answering a set of research questions. Although there are some previous systematic reviews, it is still not clear which tools, games or academic subjects (with technology-enhanced, game-based learning) have obtained the best results with children with special needs. The 18 articles selected (carefully filtered out of 614 contributions) have been used to reveal the most frequent disabilities, the different technologies used in the prototypes, the number of learning subjects, and the kinds of learning games used. The article also summarizes the research opportunities identified in the primary studies.

