parametric embedding
Recently Published Documents

TOTAL DOCUMENTS: 9 (FIVE YEARS: 0)
H-INDEX: 2 (FIVE YEARS: 0)

New Astronomy ◽  
2020 ◽  
Vol 80 ◽  
pp. 101403 ◽  
Author(s):  
Jaya Upreti ◽  
Satyanarayana Gedela ◽  
Neeraj Pant ◽  
R.P. Pant

Author(s):  
Peng Hu ◽  
Rong Du ◽  
Yao Hu ◽  
Nan Li

Nowadays, item-item recommendation plays an important role in modern recommender systems. Traditionally, this task is solved either by behavior-based collaborative filtering or by content-based methods. However, both kinds of methods often suffer from cold-start problems or from poor performance due to limited behavioral supervision; hybrid methods that leverage the strengths of both are therefore needed. In this paper, we propose a semi-parametric embedding framework for this problem. Specifically, the embedding of an item is composed of two parts, i.e., the parametric part derived from content information and the non-parametric part designed to encode behavior information; a deep learning algorithm is proposed to learn both parts simultaneously. Extensive experiments on real-world datasets demonstrate the effectiveness and robustness of the proposed method.
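The two-part decomposition described in this abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the names (`item_embedding`, `behavior`, the linear content encoder `W`) and the zero initialization of the behavior part are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

n_items, content_dim, embed_dim = 5, 8, 4

# Content features for each item (e.g., text or image features).
content = rng.normal(size=(n_items, content_dim))

# Parametric part: a learned projection of content features
# (a linear map stands in for a deeper encoder here).
W = rng.normal(size=(content_dim, embed_dim)) * 0.1

# Non-parametric part: a free embedding per item, fitted from behavior
# data; zero-initialized, so a cold-start item falls back to content alone.
behavior = np.zeros((n_items, embed_dim))

def item_embedding(i):
    """Semi-parametric embedding: content projection + behavior residual."""
    return content[i] @ W + behavior[i]

def similarity(i, j):
    """Cosine similarity between two item embeddings."""
    a, b = item_embedding(i), item_embedding(j)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# A warm item accumulates a behavior residual during training;
# a cold item (no interactions) is still embeddable via content.
behavior[0] += 0.5 * rng.normal(size=embed_dim)
sim = similarity(0, 1)
```

In a real system both `W` and `behavior` would be trained jointly against an interaction loss; the point of the decomposition is that the non-parametric table captures signals content cannot, while the parametric encoder covers items with no behavior history.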


2017 ◽  
Vol 12 (1) ◽  
pp. 151-164 ◽  
Author(s):  
Mayer Alvo ◽  
Tze Leung Lai ◽  
Philip L. H. Yu

Author(s):  
Martin Renqiang Min ◽  
Hongyu Guo ◽  
Dongjin Song

Metric learning methods for dimensionality reduction, combined with k-Nearest Neighbors (kNN), have been extensively deployed in classification, data embedding, and information retrieval applications. However, most of these approaches involve pairwise comparisons of training data and thus have quadratic computational complexity with respect to the size of the training set, preventing them from scaling to large datasets. Moreover, during testing, comparing test data against all training data points is expensive in both computation and memory. Furthermore, previous metrics are either too constrained or too expressive to be learned well. To address these issues, we present an exemplar-centered supervised shallow parametric data embedding model based on a Maximally Collapsing Metric Learning (MCML) objective. Our strategy learns a shallow high-order parametric embedding function and compares training/test data only with learned or precomputed exemplars, resulting in a cost function with linear computational complexity for both training and testing. We also demonstrate empirically, on several benchmark datasets, that for classification in a two-dimensional embedding space our approach not only speeds up kNN by hundreds of times but also outperforms state-of-the-art supervised embedding approaches.
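A minimal sketch of the exemplar-centered idea, assuming per-class means as the precomputed exemplars and a plain linear map as the shallow embedding (the paper's high-order embedding function and the MCML training loop are omitted). The names `embed`, `exemplars`, and `predict` are this sketch's own, not the authors':

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy labeled data: two well-separated Gaussian classes in 10-D.
X = np.vstack([rng.normal(0, 1, (50, 10)), rng.normal(3, 1, (50, 10))])
y = np.array([0] * 50 + [1] * 50)

# Shallow parametric embedding: a single linear map to 2-D.
A = rng.normal(size=(10, 2)) * 0.1
embed = lambda Z: Z @ A

# Exemplars: here simply the per-class means, precomputed once, so each
# point is compared to k exemplars instead of all n training points
# (linear rather than quadratic complexity).
exemplars = np.vstack([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(Z):
    """Soft nearest-exemplar classification in the embedding space."""
    E = embed(Z)                                          # (n, 2)
    C = embed(exemplars)                                  # (k, 2)
    d2 = ((E[:, None, :] - C[None, :, :]) ** 2).sum(-1)   # squared distances
    p = np.exp(-d2)
    p /= p.sum(axis=1, keepdims=True)                     # posterior over exemplars
    return p.argmax(axis=1)

preds = predict(X)
```

Training would adjust `A` so that each point's exemplar posterior collapses onto its true class (the MCML objective); at test time only the `k` exemplar comparisons are needed.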


2017 ◽  
Vol 11 (7) ◽  
pp. 1543-1556 ◽  
Author(s):  
Mark Elin ◽  
David Shoikhet ◽  
Nikola Tuneski

2007 ◽  
Vol 19 (9) ◽  
pp. 2536-2556 ◽  
Author(s):  
Tomoharu Iwata ◽  
Kazumi Saito ◽  
Naonori Ueda ◽  
Sean Stromsten ◽  
Thomas L. Griffiths ◽  
...  

We propose a new method, parametric embedding (PE), that embeds objects with their class structure into a low-dimensional visualization space. PE takes as input a set of class-conditional probabilities for given data points and tries to preserve that structure in an embedding space by minimizing a sum of Kullback-Leibler divergences, under the assumption that samples are generated by a Gaussian mixture with equal covariances in the embedding space. PE has many potential uses depending on the source of the input data, providing insight into the classifier's behavior in supervised, semisupervised, and unsupervised settings. The PE algorithm has a computational advantage over conventional embedding methods based on pairwise object relations, since its complexity scales with the product of the number of objects and the number of classes. We demonstrate PE by visualizing supervised categorization of Web pages, semisupervised categorization of digits, and the relations of words and latent topics found by an unsupervised algorithm, latent Dirichlet allocation.
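The PE objective can be sketched directly from this abstract: given class posteriors p(c|x), fit object coordinates and class centers so that the embedding-space posteriors of an equal-covariance Gaussian mixture match them under a sum of KL divergences. This toy version uses finite-difference gradients purely for brevity; the actual algorithm uses analytic gradients, which is what gives it O(objects × classes) complexity:

```python
import numpy as np

rng = np.random.default_rng(2)

n, k, dim = 6, 2, 2
# Input: class posteriors p(c|x) for each object (e.g., classifier outputs).
P = rng.dirichlet(np.ones(k), size=n)

# Embedding coordinates for objects (E) and class centers (M), learned jointly.
E = rng.normal(size=(n, dim)) * 0.01
M = rng.normal(size=(k, dim)) * 0.01

def posteriors(E, M):
    """q(c|x) induced by an equal-covariance Gaussian mixture in embedding space."""
    d2 = ((E[:, None, :] - M[None, :, :]) ** 2).sum(-1)
    q = np.exp(-0.5 * d2)
    return q / q.sum(axis=1, keepdims=True)

def kl_objective(E, M):
    """Sum over objects of KL(p(.|x) || q(.|x))."""
    q = posteriors(E, M)
    return float((P * np.log(P / q)).sum())

def grad(f, Z, eps=1e-5):
    """Central finite-difference gradient of f at Z (sketch only)."""
    G = np.zeros_like(Z)
    for idx in np.ndindex(Z.shape):
        Zp = Z.copy(); Zp[idx] += eps
        Zm = Z.copy(); Zm[idx] -= eps
        G[idx] = (f(Zp) - f(Zm)) / (2 * eps)
    return G

before = kl_objective(E, M)
for _ in range(200):
    E -= 0.1 * grad(lambda Z: kl_objective(Z, M), E)
    M -= 0.1 * grad(lambda Z: kl_objective(E, Z), M)
after = kl_objective(E, M)
```

Objects whose input posteriors agree end up near the same class centers, which is what makes the 2-D layout readable as a picture of the classifier's behavior.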


1999 ◽  
Vol 49 (3) ◽  
pp. 359-371 ◽  
Author(s):  
Guillermo López ◽  
Francisco Guerra
