dense sequence
Recently Published Documents

TOTAL DOCUMENTS: 7 (five years: 0)
H-INDEX: 2 (five years: 0)

Water ◽  
2020 ◽  
Vol 12 (12) ◽  
pp. 3553
Author(s):  
Ioannis Petikas ◽  
Evangelos Keramaris ◽  
Vasilis Kanakoudis

Flood simulation and hydrodynamic modeling of river flow require a dense sequence of river cross-sections. These cross-sections should be perpendicular to the flow path and are usually obtained through an in-field survey, a procedure that is both costly and time-consuming. An alternative is to extract them from Digital Elevation Models (DEMs), although the accuracy achieved then depends on the quality and resolution of the DEM available. Specialized computer programs exist for this process, but the work must still largely be done manually. Some researchers have presented methods for automatic extraction, but the cross-sections produced are restricted to being planar. This restriction neither ensures that they are perpendicular to the flow at all positions nor allows them to be placed close to each other. In this paper, a new method is presented that, together with the algorithm developed, is fully parametric and allows non-planar (or dog-legged) river cross-sections to be extracted. These cross-sections offer two important advantages: (a) they are perpendicular to the flow at each subsection; and (b) they allow a much denser sequence to be formed. Moreover, as the proposed procedure is fully parametric, it can be repeated as many times as necessary, simply by altering any of the specified parameters, until the desired result is achieved.
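As context for the abstract above, the core geometric step of any such extraction (sampling DEM elevations along a line perpendicular to the local flow direction) can be sketched as follows. This is an illustrative sketch, not the authors' algorithm: the function names, the synthetic DEM in the usage note, and the per-subsection planar sampling are all assumptions. The paper's non-planar cross-sections would chain several such perpendicular subsections into one polyline.

```python
import numpy as np

def bilinear(dem, x, y):
    """Bilinearly interpolate the elevation of a gridded DEM (indexed
    dem[row, col] = dem[y, x]) at fractional grid coordinates (x, y)."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    dx, dy = x - x0, y - y0
    return (dem[y0, x0] * (1 - dx) * (1 - dy)
            + dem[y0, x0 + 1] * dx * (1 - dy)
            + dem[y0 + 1, x0] * (1 - dx) * dy
            + dem[y0 + 1, x0 + 1] * dx * dy)

def cross_section(dem, p_prev, p_next, half_width, n_pts):
    """Sample one straight cross-section subsection through the midpoint
    of the centerline segment p_prev -> p_next, perpendicular to that
    segment's direction (the local flow direction)."""
    p_prev, p_next = np.asarray(p_prev, float), np.asarray(p_next, float)
    mid = 0.5 * (p_prev + p_next)
    t = p_next - p_prev
    t /= np.linalg.norm(t)               # unit tangent along the flow
    normal = np.array([-t[1], t[0]])     # tangent rotated 90 degrees
    s = np.linspace(-half_width, half_width, n_pts)  # station along section
    pts = mid + np.outer(s, normal)
    z = np.array([bilinear(dem, px, py) for px, py in pts])
    return s, z
```

On a synthetic V-shaped valley (dem[y, x] = |x - 10|, flow along the y axis), the sampled profile reproduces the valley shape, z = |s|, which is a quick sanity check for the perpendicularity construction.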


2020 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Yuling Hong ◽  
Yingjie Yang ◽  
Qishan Zhang

Purpose – The purpose of this paper is to address the problems of topic popularity prediction in online social networks and to propose a fine-grained, long-term prediction model that copes with a lack of sufficient data.

Design/methodology/approach – Based on GM(1,1) and neural networks, a co-training model for topic tendency prediction is proposed. Interpolation based on GM(1,1) is employed to generate fine-grained prediction values of the topic popularity time series, and two neural network models are trained jointly, achieving convergence by transmitting training parameters via their loss functions.

Findings – The experimental results indicate that the integrated model can effectively predict dense sequences with higher performance than other algorithms, such as NN and RBF_LSSVM. Furthermore, a Markov chain state transition probability matrix model is used to improve the prediction results.

Practical implications – The model supports fine-grained, long-term topic popularity prediction; further improvement could be made by predicting interpolated values at any point within the time interval between popularity data points.

Originality/value – The paper succeeds in constructing a co-training model with GM(1,1) and neural networks. A Markov chain state transition probability matrix is deployed to further improve popularity tendency prediction.
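The GM(1,1) grey model used as one half of the co-training scheme above is a standard small-sample forecaster; a minimal sketch of fitting and forecasting follows. The function names and example series are illustrative assumptions, and the paper's neural-network co-training and interpolation steps are not reproduced here.

```python
import numpy as np

def gm11_fit(x0):
    """Fit a GM(1,1) grey model to a short, positive time series x0.

    Returns (a, b): the development coefficient and grey input of the
    whitened equation dx1/dt + a*x1 = b."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                     # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])          # mean sequence of consecutive AGO terms
    # Least-squares solution of x0(k) + a*z1(k) = b for k = 1..n-1
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    return a, b

def gm11_predict(x0, steps):
    """Forecast `steps` future values by differencing the fitted AGO curve."""
    a, b = gm11_fit(x0)
    n = len(x0)
    k = np.arange(n + steps)
    # Time-response function of the whitened equation, anchored at x0[0]
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.concatenate([[x1_hat[0]], np.diff(x1_hat)])  # inverse AGO
    return x0_hat[n:]
```

Because the model's time response is an exponential plus a constant, GM(1,1) recovers near-geometric growth series almost exactly, which is why it is popular for sparse popularity data.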


2019 ◽  
Vol 32 (3) ◽  
pp. 317-329 ◽  
Author(s):  
Upasana Tayal ◽  
Ricardo Wage ◽  
Pedro Filipe Ferreira ◽  
Sonia Nielles-Vallespin ◽  
Frederick Howard Epstein ◽  
...  

2016 ◽  
Vol 22 (4) ◽  
pp. 445-468 ◽  
Author(s):  
ZVONKO ILJAZOVIĆ ◽  
LUCIJA VALIDŽIĆ

A computability structure on a metric space is a set of sequences which satisfy certain conditions. Of particular interest are those computability structures which contain a dense sequence, the so-called separable computability structures. In this paper we study maximal computability structures, which are more general than separable computability structures, and examine their properties. In particular, we examine maximal computability structures on subspaces of Euclidean space, give their characterization, and investigate conditions under which a maximal computability structure on such a space is unique. We also give a characterization of separable computability structures on a segment.
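For context, the classical notion (in the style of Pour-El and Richards) can be sketched as follows; the precise axioms used in the paper may differ in detail, so this is a hedged outline rather than the authors' definition.

```latex
% Sketch of the classical notion; the paper's axioms may differ in detail.
Let $(X, d)$ be a metric space. A \emph{computability structure} on $X$
is a nonempty set $\mathcal{S}$ of sequences in $X$ such that:
(i) for all $(x_i), (y_j) \in \mathcal{S}$, the double sequence
$\bigl(d(x_i, y_j)\bigr)_{i,j}$ is a computable double sequence of reals;
(ii) $\mathcal{S}$ is closed under computable re-indexing and under
effective limits of its sequences.
A computability structure $\mathcal{S}$ is \emph{separable} if some
$(x_i) \in \mathcal{S}$ is dense in $X$.
```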


2004 ◽  
Vol 93 (6) ◽  
Author(s):  
Ralf Drautz ◽  
Alejandro Díaz-Ortiz ◽  
Manfred Fähnle ◽  
Helmut Dosch
