The Distribution-Free Newsboy Problem with Multiple Discounts and Upgrades

2016 ◽  
Vol 2016 ◽  
pp. 1-11 ◽  
Author(s):  
Ilkyeong Moon ◽  
Dong Kyoon Yoo ◽  
Subrata Saha

Most papers on the newsboy problem assume that excess inventory is either sold at a discount or discarded. In the real world, however, overstocks are handled with multiple discounts, upgrades, or a combination of these measures. For example, a seller may offer a series of progressively increasing discounts for units that remain on the shelf, or may apply incremental innovations aimed at making the product more sophisticated. Moreover, the normal distribution does not necessarily provide better protection than other distributions with the same mean and variance. In this paper, we examine the differences between normal-distribution and distribution-free approaches in four scenarios in which the mean and variance of demand are the only data available to decision-makers. First, we solve the newsboy problem with multiple discounts. Second, we formulate and solve the newsboy problem with multiple upgrades. Third, we formulate and solve a mixed newsboy problem characterized by both multiple discounts and upgrades. Finally, we extend the model to a multiproduct newsboy problem with a storage or budget constraint and develop an algorithm to find the solutions of the models. Concavity of the models is proved analytically. Extensive computational experiments confirm that the distribution-free approach is robust.
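The multiple-discount and upgrade models themselves are not reproduced in the abstract, but the baseline distribution-free rule they build on can be sketched. The snippet below compares Scarf's distribution-free order quantity (in the closed form popularized by Gallego and Moon) with the normal-distribution solution for the classical single-period problem; the parameter values are illustrative, not taken from the paper.

```python
from math import sqrt
from statistics import NormalDist

def scarf_order_quantity(mu, sigma, price, cost, salvage=0.0):
    """Scarf's distribution-free order quantity: optimal against the
    worst-case demand distribution with the given mean and variance."""
    cr = (price - cost) / (price - salvage)  # critical ratio
    return mu + (sigma / 2) * (sqrt(cr / (1 - cr)) - sqrt((1 - cr) / cr))

def normal_order_quantity(mu, sigma, price, cost, salvage=0.0):
    """Order quantity assuming demand is normally distributed."""
    cr = (price - cost) / (price - salvage)
    return mu + sigma * NormalDist().inv_cdf(cr)

mu, sigma = 100.0, 20.0
q_df = scarf_order_quantity(mu, sigma, price=12.0, cost=6.0, salvage=2.0)
q_n = normal_order_quantity(mu, sigma, price=12.0, cost=6.0, salvage=2.0)
```

With a critical ratio of 0.6 the two rules land close together (roughly 104.1 versus 105.1 units here), which is the kind of closeness the paper's robustness experiments probe across its four scenarios.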

2020 ◽  
Author(s):  
Niamh Lennox-Chhugani ◽  
Simon Harris ◽  
Jacqueline Moxon ◽  
Vipul Patel

BACKGROUND The application of artificial intelligence (AI) in healthcare is accelerating, but relatively little is yet known about the real-world implementation of AI in clinical workflows. OBJECTIVE In this paper, we focus on one application of AI as a second reader of breast mammograms in the context of a national breast screening programme. We look at the development and testing of an AI image-reading tool for mammograms and the effect of organisational readiness on AI tool adoption. We focus on two aspects of organisational readiness, as conceptualised by Weiner (2009), for AI technology specifically, and answer the questions: (1) What are the views of technology adopters in a healthcare organisation toward the use of AI technology in breast screening? (2) What are the emerging organisational factors likely to affect adoption and spread, and are any unique to AI technology? METHODS We conducted a prospective mixed-methods study of the real-world development of AI tools for use in the National Breast Screening Programme in England. We recruited 67 radiologists and reporting radiographers in four breast screening services and 18 organisational leaders who were the AI project decision-makers. Data were collected using an online survey of breast screening staff (adopters), semi-structured interviews with organisational leaders, participant observation of project meetings, and document review. Data on organisational and adopter readiness for technology adoption were analysed over the duration of the project. RESULTS Sixty-seven clinicians and eighteen organisational leaders participated in the study. Commitment to adoption is positive, but adopters want to see clinical evidence of AI safety and accuracy. Decision-makers and other organisational adopters do not yet have shared views on their resources, capacity, and capability to adopt and spread the technology, and significant challenges related to task demands and situational factors emerged during the project, causing substantial delays to adoption. The nature of AI and machine learning technology surfaced novel complexities, not encountered with traditional health technology, related to explainability and meaningful decision support. CONCLUSIONS The case study shows that adopter commitment to AI technology in breast screening is growing, but gaps remain in the collective capability of organisations to adopt these novel technologies. CLINICALTRIAL Not applicable


2021 ◽  
Author(s):  
Leonardo de Lima

The literature on network reliability shows that Harary networks are designed so that link reliability is maximized in many cases. The question of which network topologies maximize node reliability, however, remains open. In this paper, we performed computational experiments on eleven real-world networks and their corresponding Harary graphs, computing the node reliability of both sets of networks. The results indicate that Harary networks have topologies with high node reliability compared with the real-world networks studied.
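The node-reliability computation is not detailed in the abstract. Under one common definition — the probability that the nodes surviving independent failures induce a connected subgraph — it can be estimated by Monte Carlo simulation. The sketch below applies this to the Harary graph H(4, 8), built as a circulant graph; the definition, failure probability, and graph choice are assumptions for illustration.

```python
import random

def node_reliability_mc(nodes, edges, p_up, trials=20000, seed=1):
    """Monte Carlo estimate of node reliability: the probability that the
    surviving nodes (each up independently with probability p_up) induce
    a connected, non-empty subgraph."""
    rng = random.Random(seed)
    adj = {v: set() for v in nodes}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    hits = 0
    for _ in range(trials):
        up = {v for v in nodes if rng.random() < p_up}
        if not up:
            continue  # no surviving nodes counts as a failure
        # Depth-first search restricted to surviving nodes
        start = next(iter(up))
        seen, stack = {start}, [start]
        while stack:
            u = stack.pop()
            for w in adj[u]:
                if w in up and w not in seen:
                    seen.add(w)
                    stack.append(w)
        hits += (seen == up)
    return hits / trials

# Harary graph H(4, 8): circulant on 8 nodes with offsets 1 and 2
n = 8
edges = [(i, (i + d) % n) for i in range(n) for d in (1, 2)]
rel = node_reliability_mc(range(n), edges, p_up=0.9)
```

Because H(4, 8) is 4-connected, the surviving nodes stay connected under up to three failures, so the estimated reliability is close to 1 at this failure rate.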


Author(s):  
Daniel Link ◽  
Markus Raab

Human behavior is often assumed to be irrational, full of errors, and affected by cognitive biases. One of these biases is base-rate neglect, which occurs when the base rates of a specific category are not considered when making decisions. We argue here that while naïve subjects demonstrate base-rate neglect in laboratory conditions, experts tested in the real world do use base rates. Our explanation is that lab studies use single questions, whereas in the real world most decisions are sequential in nature, leading to a more realistic test of base-rate use. One decision that lends itself to testing base-rate use in real life occurs in beach volleyball — specifically, deciding whom to serve to win the game. Analyzing the sequential choices of expert athletes in more than 1,300 games revealed that they were sensitive to base rates and adapted their decision strategies to the performance of the opponent. Our data describe a threshold at which players change their strategy and use base rates. We conclude that the debate over whether decision makers use base rates should be shifted to real-world tests, and the focus should be on when and how base rates are used.


Mathematics ◽  
2021 ◽  
Vol 9 (20) ◽  
pp. 2636
Author(s):  
Napat Harnpornchai ◽  
Wiriyaporn Wonggattaleekam

The paper addresses a new facet of the problem of applying AHP in the real world. There are occasions when decision makers are not certain about the relative importance assigned in a pairwise comparison: they consider the relative importance to lie among a set of scales, each associated with a different possibility degree. A Discrete Single-Valued Neutrosophic Number (DSVNN) with specified degrees of truth, indeterminacy, and falsity is employed to represent each assignment, taking into account all scales the decision maker considers possible. Each DSVNN assignment is transformed into a crisp value via deneutrosophication using a similarity-to-absolute-truth measure. The resulting crisp scales are input to a pairwise comparison matrix for further analysis. The proposed neutrosophic set-based relative importance assignment is an additional novelty of the paper, differing from prior studies that focus only on the definition of measurement scales. The presented assignment emulates real-world human decision making, which may consider more than one possibility. It is also shown that the single, crisp relative importance assignment in Saaty's original AHP is a special case of the proposed methodology. Sensitivity analysis indicates that when decision makers have neither absolute truth nor absolute falsity about a scale, the proposed methodology is recommended for obtaining reliable relative importance scales. The applicability of the methodology to real-world problems is demonstrated through an investment problem in the equity market.
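A minimal sketch of the deneutrosophication step, assuming a normalised-Hamming similarity to the absolute-truth triplet (1, 0, 0) and a similarity-weighted average over the candidate scales; the exact measure used in the paper may differ.

```python
def similarity_to_truth(t, i, f):
    """Similarity of a single-valued neutrosophic number (t, i, f) to
    absolute truth (1, 0, 0); normalised-Hamming form (an assumption)."""
    return 1.0 - ((1.0 - t) + i + f) / 3.0

def deneutrosophy(dsvnn):
    """Collapse a discrete SVNN, given as (scale, t, i, f) tuples, to one
    crisp pairwise-comparison scale via a similarity-weighted average
    (an illustrative choice, not necessarily the paper's measure)."""
    weighted = [(s, similarity_to_truth(t, i, f)) for s, t, i, f in dsvnn]
    total = sum(w for _, w in weighted)
    return sum(s * w for s, w in weighted) / total

# Decision maker hesitates between Saaty scales 3 and 5,
# with more confidence in 3
dsvnn = [(3, 0.8, 0.2, 0.1), (5, 0.5, 0.4, 0.3)]
crisp = deneutrosophy(dsvnn)
```

A fully certain assignment `[(7, 1.0, 0.0, 0.0)]` collapses to exactly 7, which is how Saaty's single crisp assignment emerges as a special case of the neutrosophic one.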


2020 ◽  
Author(s):  
Ion Agirrezabal

According to the World Health Organization, the key goal of health systems is to improve the average level of population health and to reduce health inequalities in the population. To realise this goal, health system decision-makers need to decide which health technologies to invest in and which not to. Health technology assessment (HTA) provides a framework for decision-makers to make resource allocation and priority-setting decisions based on the existing evidence. Considering increasingly tight healthcare budgets and the rich pipeline of high-cost, innovative drugs likely to come to market in the next few years, it is crucial that a robust and transparent HTA process be undertaken to assess these drugs, evaluating all aspects of the disease and treatment and involving all affected stakeholders. We conducted three standalone projects analysing different aspects of recently launched innovative drugs. In our first study, we combined high-quality sources of evidence, from both the real world and randomised controlled trials, to evaluate the cost-effectiveness of carfilzomib for treating multiple myeloma patients. By harnessing the power of these data sources, we demonstrated that the reimbursement of carfilzomib is likely to represent an efficient allocation of existing resources. Despite the availability of good sources of evidence, the real-world distribution and use of innovative drugs may be neither efficient nor fair, as our two other studies demonstrated. First, we showed that significant inequalities exist in the distribution of anti-osteoporosis drugs in primary care in England. The most striking case was that of denosumab, a high-cost innovative treatment, with prescriptions disproportionately concentrated among the least deprived.
Substantial inequalities also exist in the use of insulin glargine biosimilars in primary care in England, even though guidelines and initiatives to promote the use of biosimilars have been put in place. In this study we observed that the real-world savings realised from the use of insulin glargine biosimilars represent only a small proportion of what could have been achieved had their uptake been higher. The results of these two studies therefore show that resource allocation may be neither efficient nor fair in the real world, and similar situations are likely to exist in other disease areas. In summary, even though in many cases ample evidence exists to assist healthcare authorities in making resource allocation decisions, we have demonstrated that resource allocation in the real world may not be optimal.


2020 ◽  
pp. 18-26
Author(s):  
Philippe Schweizer ◽  

Uncertainty is inherent to the real world: everything is only probable, the precision of measurements is finite, and noise is everywhere. Moreover, science is based on a modeling of reality that can only be approximate. We therefore postulate that uncertainty should be considered in our models, and to make this easier we propose a simple operational conceptualization of uncertainty. Starting from the simple model of associating a probability p with a statement supposed to be true, our proposed modeling bridges the gap towards the more complex representation proposed by neutrosophy, a triplet of probabilities. The neutrosophic representation uses a triplet of probabilities (t, i, f) instead of a single probability: t represents the probability of the statement being true, and f the probability of it being false. The specific point of neutrosophy is that the probability i represents the probability of the statement being uncertain, imprecise, or neutral, among other meanings depending on the application. Our proposed representation uses only two probabilities instead of three, and it can easily be translated into the neutrosophic representation. By being simpler, we give up some power to represent the uncertain, but we encourage the modeling of uncertainty (instead of ignoring it) by making it simpler. Briefly put, we prepare the path towards using neutrosophy. Our proposed representation of uncertainty consists, for a statement, of adding not only its probability p of being true, but also a second probability pp that models the confidence we have in the first probability p. This second parameter pp represents the plausibility of p, and therefore the opposite of its uncertainty. It is the confidence given to the value of p; in short, pp is the probability of p (hence the name pp). This is simple to understand, and it allows calculation of combined events using classical probability, for example based on the concepts of mean and variance.
The key advantage of our modeling by the couple (p, pp) is that experts can easily be interrogated to provide their expertise, simply by asking them the chance they give to an event occurring (this is p) and the confidence they have in that prediction (this is pp). We also give a formula to transform from our model to the neutrosophic representation. Finally, we offer a short discussion of entropy as a measure of uncertainty.
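The transformation formula itself is not reproduced in the abstract. One consistent mapping from the couple (p, pp) to a triplet (t, i, f), treating all lack of confidence as indeterminacy, could look like the sketch below; the specific split is an assumption, not necessarily the author's formula.

```python
def to_neutrosophic(p, pp):
    """Map the two-parameter model (p = probability the statement is true,
    pp = confidence in p) to a neutrosophic triplet (t, i, f).
    The split below is an assumed formula for illustration."""
    i = 1.0 - pp        # lack of confidence becomes indeterminacy
    t = p * pp          # confident mass assigned to "true"
    f = (1.0 - p) * pp  # confident mass assigned to "false"
    return t, i, f

# An expert says an event has a 70% chance, with 90% confidence
t, i, f = to_neutrosophic(p=0.7, pp=0.9)
```

Under this mapping the triplet always sums to 1, and full confidence (pp = 1) recovers the ordinary single-probability model as (p, 0, 1 - p).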


Author(s):  
Raanan Lipshitz

We analyzed 112 self-reports of decision-making under uncertainty to find how decision makers conceptualize uncertainty and cope with it in the real world. The results show that decision makers distinguish between three types of uncertainty (inadequate understanding, incomplete information, and undifferentiated alternatives), to which they apply five coping strategies: reducing uncertainty, assumption-based reasoning, weighing pros and cons of competing alternatives, suppressing uncertainty, and forestalling. The relationships between these types of uncertainty and coping tactics suggest a R.A.W.F.S. (Reduction, Assumption-based reasoning, Weighing pros and cons, Forestalling, and Suppression) heuristic of contingent coping with uncertainty in naturalistic settings.


Author(s):  
Lawrence A. Boland

In the real world, the process of reaching the assumed equilibrium involves decision makers’ knowledge and their awareness of any disequilibrium. Equilibrium attainment also requires that they make the correct decisions required for a ‘stable’ equilibrium. Any model which fails to explicitly address the equilibrium process and its requirements is vulnerable to criticism of the model’s realism. This chapter explores, specifically, whether the knowledge required to reach equilibrium can ever be attained by participants, and whether the process of obtaining that knowledge can be consistent with the requirements of achieving an equilibrium. It also explores the ‘ignorant consumer’, who has no way of knowing that he or she is not maximizing.


2012 ◽  
Vol 178-181 ◽  
pp. 2854-2858
Author(s):  
Ming Yu Zhao ◽  
Zhen Yu Wang ◽  
Qiong Liu

Crew rostering is one of the most important scheduling problems for large airlines. In this study, we propose an LP re-solving algorithm in the branch-and-bound (B&B) tree context and a heuristic for the integer programming (IP) solution of the crew rostering problem. The efficiency mainly comes from the special structure of the crew rostering problem. Computational experiments on real-world problems showed that the running time of the proposed re-solving algorithm does not exceed 30% of the time needed to solve from scratch, even when the problem has changed substantially, and that the IP heuristic can rapidly produce a sufficiently good roster for all crew members.
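The paper's LP re-solving algorithm and IP heuristic are not specified in the abstract. As a generic illustration of the underlying idea — re-solving a slightly changed problem from the previous solution instead of from scratch — the toy sketch below warm-starts a 2-swap local search on a small crew-to-duty assignment. Everything here (the local search, the random instance) is illustrative, not the authors' method.

```python
import random

def improve(assign, cost):
    """2-swap local search on an assignment (crew a gets duty assign[a]):
    swap the duties of two crew members whenever that lowers total cost."""
    n = len(assign)
    improved = True
    while improved:
        improved = False
        for a in range(n):
            for b in range(a + 1, n):
                delta = (cost[a][assign[b]] + cost[b][assign[a]]
                         - cost[a][assign[a]] - cost[b][assign[b]])
                if delta < 0:
                    assign[a], assign[b] = assign[b], assign[a]
                    improved = True
    return assign

rng = random.Random(0)
n = 8
cost = [[rng.randint(1, 100) for _ in range(n)] for _ in range(n)]

# Solve from scratch, starting from the identity assignment
from_scratch = improve(list(range(n)), cost)

# Perturb a few costs (the changed problem), then re-solve by
# warm-starting from the previous roster rather than from scratch
for _ in range(4):
    cost[rng.randrange(n)][rng.randrange(n)] = rng.randint(1, 100)
warm = improve(from_scratch[:], cost)
```

The warm start typically needs far fewer improving passes than a cold start because most of the previous roster remains near-optimal after a small change, which is the intuition behind re-solving inside a B&B tree.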



