An O(n² log n log log n) expected time algorithm for the all shortest distance problem

Author(s):  
Tadao Takaoka ◽  
Alistair Moffat
1999 ◽  
Vol 10 (04) ◽  
pp. 465-472 ◽  
Author(s):  
Alan P. Sprague ◽  
Tadao Takaoka

We present an algorithm for the All Pairs Shortest Distance problem on an interval graph on n vertices: after O(n) preprocessing time, the algorithm can answer a distance query in O(1) time. The method used here is simpler than that of Chen et al. [4], which has the same preprocessing and query times. It is assumed that an interval model of the graph is given and that the interval endpoints are already sorted by coordinate. The preprocessing algorithm can be executed in the EREW PRAM model in O(log n) time, using n/log n processors. These algorithms (sequential and parallel) may be extended to circular-arc graphs, with the same time and processor bounds.
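The paper's O(1)-query structure is not reproduced here, but the underlying greedy-jump idea for distances in interval graphs can be sketched as follows. This is a minimal illustrative version that answers a query in O(n·d) time for a distance-d pair rather than O(1); the function name and the interval representation are assumptions, not the paper's.

```python
def interval_distance(intervals, u, v):
    """Shortest-path distance between vertices u and v of an interval graph.

    intervals: list of (left, right) pairs; two vertices are adjacent
    iff their closed intervals overlap.  Greedy-jump idea: repeatedly
    move to the interval that extends the reachable right endpoint the
    farthest.  Returns the number of edges on a shortest u-v path, or
    None if u and v are in different connected components.
    """
    (lu, ru), (lv, rv) = intervals[u], intervals[v]
    if u == v:
        return 0
    if lu > lv:                       # orient so u starts to the left of v
        (lu, ru), (lv, rv) = (lv, rv), (lu, ru)
    if ru >= lv:                      # intervals overlap: adjacent
        return 1
    dist, reach = 1, ru               # farthest right endpoint reached so far
    while reach < lv:
        # jump to the farthest-reaching interval starting within reach
        best = max(r for (l, r) in intervals if l <= reach)
        if best == reach:             # no progress: disconnected
            return None
        reach, dist = best, dist + 1
    return dist
```

Each loop iteration adds one vertex to the greedy path, and the standard exchange argument for interval graphs shows this greedy choice yields a shortest path.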


1987 ◽  
Vol 25 (2) ◽  
pp. 77-86 ◽  
Author(s):  
Jyrki Katajainen ◽  
Olli Nevalainen ◽  
Jukka Teuhola

1980 ◽  
Vol 27 (3) ◽  
pp. 428-444 ◽  
Author(s):  
Ronald L. Graham ◽  
Andrew C. Yao ◽  
F. Frances Yao

1986 ◽  
Vol 34 (2) ◽  
pp. 223-231 ◽  
Author(s):  
Pierre Hansen ◽  
Brigitte Jaumard ◽  
Michel Minoux

2008 ◽  
Vol 19 (01) ◽  
pp. 219-242 ◽  
Author(s):  
Corinna Cortes ◽  
Mehryar Mohri ◽  
Ashish Rastogi ◽  
Michael Riley

We present an exhaustive analysis of the problem of computing the relative entropy of two probabilistic automata. We show that the problem of computing the relative entropy of unambiguous probabilistic automata can be formulated as a shortest-distance problem over an appropriate semiring, give efficient exact and approximate algorithms for its computation in that case, and report the results of experiments demonstrating the practicality of our algorithms for very large weighted automata. We also prove that the computation of the relative entropy of arbitrary probabilistic automata is PSPACE-complete. The relative entropy is used in a variety of machine learning algorithms and applications to measure the discrepancy of two distributions. We examine the use of the symmetrized relative entropy in machine learning algorithms and show that, contrary to what is suggested by a number of publications in that domain, the symmetrized relative entropy is neither positive definite symmetric nor negative definite symmetric, which limits its use and application in kernel methods. In particular, the convergence of training for learning algorithms is not guaranteed when the symmetrized relative entropy is used directly as a kernel, or as the operand of an exponential as in the case of Gaussian kernels. Finally, we show that our algorithm for the computation of the entropy of an unambiguous probabilistic automaton can be generalized to the computation of the norm of an unambiguous probabilistic automaton by using a monoid morphism. In particular, this yields efficient algorithms for the computation of the Lp-norm of a probabilistic automaton.
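As a point of reference for the quantity discussed above, the relative entropy and its symmetrized variant can be computed directly for two explicit discrete distributions. This is a generic sketch of the definitions only, not the automata-based shortest-distance algorithm of the paper; the function names are assumptions.

```python
import math

def relative_entropy(p, q):
    """D(p || q) = sum_i p_i log(p_i / q_i), in nats.

    p, q: discrete distributions over the same support, given as
    sequences of probabilities.  Terms with p_i = 0 contribute 0;
    a point with q_i = 0 but p_i > 0 makes the divergence infinite.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def symmetrized_relative_entropy(p, q):
    """D(p || q) + D(q || p): symmetric and non-negative, but, as the
    abstract notes, not a positive definite symmetric kernel."""
    return relative_entropy(p, q) + relative_entropy(q, p)
```

The symmetrized form vanishes exactly when p = q and is symmetric in its arguments, which is what makes its failure to be a PDS kernel easy to overlook.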


2016 ◽  
Vol 2016 ◽  
pp. 1-6
Author(s):  
Yu Zhou ◽  
Wenfeng Zheng ◽  
Zhixi Shen

This paper investigates the distributed shortest-distance problem for multiagent systems in which all agents obey the same continuous-time dynamics. The objective is to find a common point for all agents that minimizes the sum of the distances to the agents' respective convex regions. A distributed consensus algorithm based on local information is proposed, and a sufficient condition guaranteeing consensus is given. A simulation example shows that the distributed shortest-distance consensus algorithm is effective, supporting the theoretical results.
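The kind of scheme described in the abstract can be illustrated with a forward-Euler discretization of one plausible form of such dynamics: each agent averages toward the other agents while being pulled toward its own convex region. Here the regions are intervals on the real line and the communication graph is fully connected; the dynamics, gains, regions, and step size are illustrative assumptions, not the paper's algorithm.

```python
def clamp(x, lo, hi):
    """Euclidean projection of x onto the interval [lo, hi]."""
    return min(max(x, lo), hi)

def consensus_step(xs, regions, dt=0.05):
    """One Euler step of x_i' = sum_j (x_j - x_i) + (P_i(x_i) - x_i),
    where P_i projects onto agent i's convex region (an interval)."""
    return [x + dt * (sum(xj - x for xj in xs)          # consensus term
                      + clamp(x, *regions[i]) - x)      # projection term
            for i, x in enumerate(xs)]

# three agents whose convex regions do not intersect
regions = [(0.0, 1.0), (2.0, 3.0), (4.0, 5.0)]
xs = [0.5, 2.5, 4.5]              # start each agent at its region midpoint
for _ in range(2000):
    xs = consensus_step(xs, regions)
# the agents cluster inside [2, 3], the set of points minimizing the
# sum of distances to the three regions
```

With fixed gains the agents reach only approximate consensus when the regions are disjoint (each agent is still pulled toward its own region at equilibrium); driving the disagreement to zero requires conditions such as those the paper establishes.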

