A Friendly Smoothed Analysis of the Simplex Method

2020 ◽  
Vol 49 (5) ◽  
pp. STOC18-449-STOC18-499
Author(s):  
Daniel Dadush ◽  
Sophie Huiberts
Liquidity ◽  
2018 ◽  
Vol 2 (1) ◽  
pp. 59-65 ◽  
Author(s):  
Yanti Budiasih

The purposes of this study are to (1) determine the combination of inputs used in producing products such as beef sausages and veal meatball sausages, and (2) determine whether the optimal product combination can provide the maximum profit. Linear programming, using the graphical and simplex methods, can be applied to determine the input combination and the maximum profit. The valuation results show that the optimal input combination would give a profit of Rp 1.115 million per day.
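The graphical (corner-point) approach the abstract mentions can be sketched for a hypothetical two-product mix. The profit and resource coefficients below are illustrative assumptions, not the study's actual data; the optimum of a linear program lies at a corner of the feasible region, so it suffices to check every pairwise intersection of constraint boundaries:

```python
# Hypothetical product-mix LP: maximize P = 5*x1 + 4*x2
# subject to 6*x1 + 4*x2 <= 24 (meat, kg/day)
#            1*x1 + 2*x2 <= 6  (labour, hours/day)
#            x1, x2 >= 0
# All coefficients are illustrative, not taken from the study.

from itertools import combinations

# Constraints as (a1, a2, b) meaning a1*x1 + a2*x2 <= b,
# including the non-negativity constraints -x1 <= 0 and -x2 <= 0.
constraints = [(6, 4, 24), (1, 2, 6), (-1, 0, 0), (0, -1, 0)]
profit = lambda x1, x2: 5 * x1 + 4 * x2

def intersection(c1, c2):
    """Solve the 2x2 system formed by two constraint boundaries."""
    (a, b, e), (c, d, f) = c1, c2
    det = a * d - b * c
    if det == 0:
        return None  # parallel boundaries: no corner point
    return ((e * d - b * f) / det, (a * f - e * c) / det)

def feasible(pt):
    return all(a * pt[0] + b * pt[1] <= rhs + 1e-9 for a, b, rhs in constraints)

# Enumerate corner points of the feasible region and pick the most profitable.
corners = [p for c1, c2 in combinations(constraints, 2)
           if (p := intersection(c1, c2)) is not None and feasible(p)]
best = max(corners, key=lambda p: profit(*p))
print(best, profit(*best))  # → (3.0, 1.5) 21.0
```

For more than two products, corner enumeration becomes impractical and the simplex method (the abstract's second technique) pivots between corners instead of enumerating them all.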


Author(s):  
Qiusheng WANG ◽  
Xiaolan GU ◽  
Yingyi LIU ◽  
Haiwen YUAN

Author(s):  
A. V. Katernyuk

In all spheres of business, experts try to raise the competitiveness of their companies in different ways, for instance through more efficient redistribution of available resources (costs). Problems of modeling and optimizing the resources used in advertising are becoming increasingly topical, and deeper knowledge of planning and conducting marketing and advertising campaigns is in demand among many specialists. The search for optimal costs of Internet advertising, as a factor in a company's sustainability, can be successfully shaped through universal matrix solution methods (e.g., the simplex method). Problems that cannot be resolved by this method can be supplemented with economic indicators such as return on investment and return per ruble. The article summarizes the instrumental base for estimating the efficiency of customer-attraction activities in the fast-growing Internet-services industry. Besides traditional ways of optimizing expenses, the author proposes taking into account economic indicators connected with the profitability of each sales channel. The following tools were used in the research: modeling, the induction method, investment analysis, statistical methods and formal logic, multi-criteria optimization, and specialized software for solving such tasks, in particular special macros for Excel spreadsheets.


2021 ◽  
Vol 3 (1) ◽  
Author(s):  
Zhikuan Zhao ◽  
Jack K. Fitzsimons ◽  
Patrick Rebentrost ◽  
Vedran Dunjko ◽  
Joseph F. Fitzsimons

Machine learning has recently emerged as a fruitful area for finding potential quantum computational advantage. Many quantum-enhanced machine learning algorithms critically hinge on the ability to efficiently produce states proportional to high-dimensional data points stored in a quantum-accessible memory. Even given query access to exponentially many entries stored in a database, the construction of which is considered a one-off overhead, it has been argued that the cost of preparing such amplitude-encoded states may offset any exponential quantum advantage. Here we prove, using smoothed analysis, that if the data analysis algorithm is robust against small entry-wise input perturbations, state preparation can always be achieved with a constant number of queries. This criterion is typically satisfied in realistic machine learning applications, where input data is subject to moderate noise. Our results are equally applicable to the recent seminal progress in quantum-inspired algorithms, where specially constructed databases suffice for polylogarithmic classical algorithms in low-rank cases. The consequence of our finding is that, for the purposes of practical machine learning, polylogarithmic processing time is possible under a general and flexible input model, with quantum algorithms or with quantum-inspired classical algorithms in the low-rank cases.


Author(s):  
Hongwei Liu ◽  
Rui Yang ◽  
Pingjiang Wang ◽  
Jihong Chen ◽  
Hua Xiang

The objective of this research is to develop a novel correction mechanism that reduces the fluctuation range of the tool in numerical control (NC) machining. Error compensation is an effective method for improving the machining accuracy of a machine tool, but if the difference between two adjacent compensation values is too large, the fluctuation range of the tool increases, seriously degrading the surface quality of the machined parts. The compensation data are processed with the simplex method of linear programming, which reduces the fluctuation range of the tool and optimizes the tool path. The key step of the software error compensation is to modify the initial compensation data iteratively; the corrected tool-path data are then converted into compensated NC codes by a postprocessor, implemented in the compensation module, to ensure a smooth running path of the tool. The generated, calibrated, and amended NC codes were fed directly to the machine tool controller, and the technique was verified by repeated measurements. The experimental results demonstrate efficient compensation and a significant improvement in the machining accuracy of the NC machine tool.
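The abstract does not give the paper's iterative correction procedure in detail. As a rough sketch of the underlying idea, one can repeatedly pull adjacent compensation values together so that no tool jump exceeds a chosen step limit, while keeping each value close to its measured error so the correction remains effective. The function name, step limit, and tolerance below are hypothetical, not the paper's actual parameters:

```python
def smooth_compensation(comp, max_step, tol, max_iters=100):
    """Limit jumps between adjacent compensation values (illustrative sketch).

    Repeatedly splits any adjacent difference larger than max_step between
    the two neighbouring values, clamping each value to within +/- tol of
    its original (measured) compensation so the correction still cancels
    most of the machine error.
    """
    clamp = lambda v, lo, hi: max(lo, min(hi, v))
    orig, c = list(comp), list(comp)
    for _ in range(max_iters):
        changed = False
        for i in range(len(c) - 1):
            d = c[i + 1] - c[i]
            # Small epsilon avoids endless churn on floating-point ties.
            if abs(d) > max_step + 1e-12:
                excess = (abs(d) - max_step) / 2
                s = 1 if d > 0 else -1
                c[i] = clamp(c[i] + s * excess, orig[i] - tol, orig[i] + tol)
                c[i + 1] = clamp(c[i + 1] - s * excess,
                                 orig[i + 1] - tol, orig[i + 1] + tol)
                changed = True
        if not changed:
            break
    return c

# Compensation values (mm) with one abrupt 0.06 mm jump between points.
smoothed = smooth_compensation([0.00, 0.06, 0.05], max_step=0.02, tol=0.02)
print(smoothed)
```

On this data the 0.06 mm jump is split between the two neighbours, giving approximately [0.02, 0.04, 0.05], so no adjacent difference exceeds the 0.02 mm step limit; the paper's actual method additionally optimizes the path via the simplex method rather than this simple local averaging.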

