Singular Value Decomposition and Ligand Binding Analysis

2013 ◽  
Vol 2013 ◽  
pp. 1-7 ◽  
Author(s):  
André Luiz Galo ◽  
Márcio Francisco Colombo

Singular value decomposition (SVD) is one of the most important computations in linear algebra because of its vast range of applications in data analysis. It is particularly useful for solving problems involving least-squares minimization, the determination of matrix rank, and certain problems involving Euclidean norms. Such problems arise in the spectral analysis of ligand binding to macromolecules. Here, we present a spectral data analysis method using SVD (SVD analysis) and nonlinear fitting to determine the binding characteristics of intercalating drugs to DNA. This methodology reduces noise and identifies distinct spectral species, much as traditional principal component analysis does, while also fitting the nonlinear binding parameters. We applied SVD analysis to investigate the interaction of actinomycin D and daunomycin with native DNA. The methodology does not require prior knowledge of the ligand molar extinction coefficients (free and bound), the lack of which can otherwise limit binding analysis. Data are analyzed simply by reconstructing the experimental data and adjusting the product of the deconvoluted matrices and the matrix of model coefficients determined by the Scatchard and McGhee-von Hippel equations.
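The core idea can be sketched numerically. The following is a minimal illustration (with synthetic Gaussian spectra, not the paper's data) of how SVD separates the distinct spectral species in a titration matrix and discards noise; all names and values here are invented:

```python
import numpy as np

# Hypothetical example: a matrix of absorbance spectra recorded during a
# titration. Rows are wavelengths, columns are titration points.
rng = np.random.default_rng(0)
wavelengths = np.linspace(400, 600, 201)

# Two assumed spectral species (free and bound ligand), modeled as Gaussians.
free_spec = np.exp(-((wavelengths - 480) / 20) ** 2)
bound_spec = np.exp(-((wavelengths - 520) / 25) ** 2)

# Assumed binding progress: the bound fraction rises across 15 titration points.
frac_bound = np.linspace(0.0, 0.9, 15)
A = np.outer(free_spec, 1 - frac_bound) + np.outer(bound_spec, frac_bound)
A += rng.normal(scale=1e-3, size=A.shape)  # instrumental noise

# SVD: A = U @ diag(s) @ Vt. The significant singular values count the
# distinct spectral species; the remaining components are noise.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
rank = int(np.sum(s > 10 * s[-1]))          # crude significance threshold
A_denoised = U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank, :]

print("significant components:", rank)
print("reconstruction error:", np.linalg.norm(A - A_denoised))
```

In a real analysis the columns of the truncated `Vt` would then be fitted against the binding model (e.g. the McGhee-von Hippel isotherm) rather than against known extinction coefficients.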

Author(s):  
Chauhan Usha ◽  
Singh Rajeev Kumar

Digital watermarking is a technology that facilitates the authentication, copyright protection, and security of digital media. The objective in developing a robust watermarking technique is to achieve the maximum possible robustness without compromising transparency. Singular value decomposition (SVD) combined with the Firefly Algorithm meets this objective of an optimal robust watermarking technique. Multiple scaling factors are used to embed the watermark image into the host by multiplying these scaling factors with the singular values (SVs) of the host image. The Firefly Algorithm is used to optimize the modified host image to achieve the highest possible robustness and transparency. This approach can significantly increase the quality of the watermarked image and make the embedded watermark more robust against various attacks, such as noise, geometric attacks, and filtering attacks.
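The embedding step itself can be sketched without the optimizer. Below is a minimal illustration of one common SVD-based embedding variant (adding scaled watermark singular values to the host's singular values); the 8x8 toy images and the fixed per-value scaling factors `alpha` are assumptions, standing in for the factors the Firefly Algorithm would tune:

```python
import numpy as np

# Toy host and watermark "images" (8x8 random matrices for illustration).
rng = np.random.default_rng(1)
host = rng.uniform(0, 255, size=(8, 8))
watermark = rng.uniform(0, 255, size=(8, 8))

U, S, Vt = np.linalg.svd(host)
alpha = np.full(8, 0.05)              # multiple scaling factors (fixed here)

# Embed: modify the host's singular values with the watermark's.
Sw = np.linalg.svd(watermark, compute_uv=False)
S_marked = S + alpha * Sw
watermarked = U @ np.diag(S_marked) @ Vt

# Extract (non-blind): recover the watermark's singular values by
# re-decomposing the watermarked image and inverting the embedding rule.
S_rec = np.linalg.svd(watermarked, compute_uv=False)
Sw_rec = (S_rec - S) / alpha
print("max recovery error:", np.max(np.abs(Sw_rec - Sw)))
```

Larger `alpha` values make the watermark more robust but less transparent; that trade-off is exactly what the Firefly Algorithm searches over in the paper's scheme.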


Author(s):  
Ambar Widianingrum ◽  
Joko Sulianto ◽  
Rahmat Rais

The purpose of this study was to describe the feasibility of teaching materials based on an open-ended approach for improving the reasoning abilities of fourth-grade elementary school students. This is a research and development (R&D) study. The subjects of the study were 3 classroom teachers. The data analysis techniques used were descriptive qualitative analysis (data reduction, data presentation, and drawing conclusions) and descriptive quantitative analysis. Stage 1 media validation yielded 84.8%, and stage 2 media validation yielded 94.8%. Stage 1 material validation yielded 84.6%, and stage 2 material validation yielded 93.3%. Initial field trials yielded 93.7% for media and 92.3% for material. This shows that the teaching material is valid and suitable for use. Based on these results, teaching materials based on an open-ended approach can be recommended as a tool and a learning resource for students.


1997 ◽  
Vol 12 (4) ◽  
pp. 276-281 ◽  
Author(s):  
Gunnar Forsgren ◽  
Joana Sjöström

Abstract Headspace gas chromatograms of 40 different food packaging board and paper qualities, containing in total 167 detected peaks, were processed with principal component analysis. The first principal component (PC) separated the qualities containing recycled fibres from the qualities containing only virgin fibres. The second PC was strongly influenced by peaks representing volatile compounds from coating, and the third PC was influenced by the type of pulp used as raw material. The same 40 board and paper samples were also analysed with a so-called electronic nose, which essentially consisted of a selection of gas-sensitive sensors and software based on multivariate data analysis. The electronic nose showed potential to distinguish between qualities from different mills, although the experimental conditions were not yet fully developed. The capability of the two techniques to recognise "fingerprints" of compounds emitted from board and paper suggests that the techniques can be developed further to partly replace human sensory panels in the quality control of paper and board intended for food packaging materials.
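The kind of separation described for the first PC can be sketched on synthetic data. The following illustration (invented sample counts and peak areas, not the study's chromatograms) performs PCA via SVD on a peak-area table, assuming recycled-fibre samples carry elevated levels of a few volatile peaks:

```python
import numpy as np

# Synthetic peak-area table: rows are paper/board samples, columns are
# detected chromatogram peaks. All values are invented for illustration.
rng = np.random.default_rng(2)
n_virgin, n_recycled, n_peaks = 8, 8, 30

virgin = rng.normal(0.0, 1.0, size=(n_virgin, n_peaks))
recycled = rng.normal(0.0, 1.0, size=(n_recycled, n_peaks))
recycled[:, :5] += 4.0            # assumed extra volatiles in recycled fibre

X = np.vstack([virgin, recycled])
X -= X.mean(axis=0)               # mean-centre before PCA

# PCA via SVD: rows of Vt are the principal-component loadings.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
scores = X @ Vt.T                 # sample coordinates on the PCs

# PC1 should separate the two fibre types, as in the study.
pc1_virgin = scores[:n_virgin, 0].mean()
pc1_recycled = scores[n_virgin:, 0].mean()
print("PC1 group means (virgin vs recycled):", pc1_virgin, pc1_recycled)
```

The sign of a principal component is arbitrary, so only the gap between the two group means is meaningful.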


Molecules ◽  
2021 ◽  
Vol 26 (5) ◽  
pp. 1393
Author(s):  
Ralitsa Robeva ◽  
Miroslava Nedyalkova ◽  
Georgi Kirilov ◽  
Atanaska Elenkova ◽  
Sabina Zacharieva ◽  
...  

Catecholamines are physiological regulators of carbohydrate and lipid metabolism during stress, but their chronic influence on metabolic changes in obese patients is still not clarified. The present study aimed to establish the associations between catecholamine metabolites and metabolic syndrome (MS) components in obese women, as well as to reveal possible hidden subgroups of patients through hierarchical cluster analysis and principal component analysis. The 24-h urine excretion of metanephrine and normetanephrine was investigated in 150 obese women (54 non-diabetic without MS, 70 non-diabetic with MS, and 26 with type 2 diabetes). The interrelations between carbohydrate disturbances, metabolic syndrome components, and stress-response hormones were studied. Exploratory data analysis was used to determine different patterns of similarities among the patients. Normetanephrine concentrations were significantly increased in postmenopausal patients and in women with morbid obesity, type 2 diabetes, and hypertension, but not with prediabetes. Both metanephrine and normetanephrine levels were positively associated with glucose concentrations one hour after glucose load, irrespective of insulin levels. The exploratory data analysis revealed different risk subgroups among the investigated obese women. The development of predictive tools that include not only traditional metabolic risk factors but also markers of stress-response systems might help with specific risk estimation in obese patients.
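The exploratory approach described (hierarchical clustering of patients on standardized metabolic variables) can be sketched as follows. This is an illustrative example on synthetic values only; the two subgroups, the variable set, and all numbers are invented, not the study's data:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(3)

# Two hypothetical patient subgroups with different metabolic profiles.
# Columns stand in for variables such as glucose, normetanephrine, BMI.
group_a = rng.normal([5.0, 150.0, 30.0], [0.5, 20.0, 2.0], size=(20, 3))
group_b = rng.normal([9.0, 260.0, 42.0], [0.8, 30.0, 3.0], size=(20, 3))
X = np.vstack([group_a, group_b])

# Standardize so each variable contributes comparably to the distances.
Z = (X - X.mean(axis=0)) / X.std(axis=0)

# Ward-linkage hierarchical clustering, cut into two clusters.
labels = fcluster(linkage(Z, method="ward"), t=2, criterion="maxclust")
print("cluster sizes:", np.bincount(labels)[1:])
```

Standardizing first matters: without it, the variable with the largest numeric range (here the hormone concentration) would dominate the distance calculation.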


Electronics ◽  
2021 ◽  
Vol 10 (15) ◽  
pp. 1771
Author(s):  
Ferdinando Di Martino ◽  
Irina Perfilieva ◽  
Salvatore Sessa

The fuzzy transform is a technique for approximating a function of one or more variables, applied by researchers in various image and data analysis tasks. In this work we present a summary of the fuzzy transform methods proposed in recent years in different data mining disciplines, such as the detection of relationships between features, the extraction of association rules, time series analysis, and data classification. After giving the definition of the fuzzy transform in one or more dimensions, in which the constraint of sufficient data density with respect to the fuzzy partition is also explored, we analyze the data analysis approaches recently proposed in the literature that are based on the fuzzy transform. In particular, we examine the strategies these approaches adopt for managing the constraint of sufficient data density, and compare their performance results with those measured for other methods in the literature. The last section is dedicated to final considerations and future scenarios for using the fuzzy transform for the analysis of massive and high-dimensional data.
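A minimal sketch of the one-dimensional direct and inverse fuzzy (F-)transform follows, assuming a uniform triangular fuzzy partition and data dense enough that every basic function covers several sample points (the sufficient-density constraint mentioned above); the signal and parameter choices are illustrative:

```python
import numpy as np

def triangular_partition(a, b, n):
    """Uniform triangular fuzzy partition of [a, b] with n basic functions."""
    nodes = np.linspace(a, b, n)
    h = nodes[1] - nodes[0]
    def A(k, x):
        # Membership of x in the k-th triangular basic function.
        return np.maximum(0.0, 1.0 - np.abs(x - nodes[k]) / h)
    return nodes, A

def f_transform(x, y, n):
    """Direct F-transform: weighted mean of y under each basic function."""
    nodes, A = triangular_partition(x.min(), x.max(), n)
    F = np.array([np.sum(y * A(k, x)) / np.sum(A(k, x)) for k in range(n)])
    return nodes, A, F

def inverse_f_transform(x, nodes, A, F):
    """Inverse F-transform: membership-weighted combination of components."""
    W = np.array([A(k, x) for k in range(len(nodes))])
    return (F[:, None] * W).sum(axis=0) / W.sum(axis=0)

# Smooth a noisy signal by passing it through the F-transform and back.
x = np.linspace(0, 2 * np.pi, 200)
y = np.sin(x) + np.random.default_rng(4).normal(0, 0.1, x.size)
nodes, A, F = f_transform(x, y, n=20)
y_hat = inverse_f_transform(x, nodes, A, F)
print("max error vs sin(x):", np.max(np.abs(y_hat - np.sin(x))))
```

Because the triangular basic functions form a partition of unity on [a, b], the inverse transform reduces to linear interpolation of the components between the nodes, which is what gives the method its smoothing behaviour.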


2015 ◽  
Vol 471 (3) ◽  
pp. 403-414 ◽  
Author(s):  
M. Florencia Rey-Burusco ◽  
Marina Ibáñez-Shimabukuro ◽  
Mads Gabrielsen ◽  
Gisela R. Franchini ◽  
Andrew J. Roe ◽  
...  

Necator americanus fatty acid and retinol-binding protein-1 (Na-FAR-1) is an abundantly expressed FAR protein from a parasitic hookworm. The present work describes its tissue distribution, structure, and ligand-binding characteristics, and shows that Na-FAR-1 expands to accommodate multiple fatty acid (FA) molecules in its internal cavity.


Algorithms ◽  
2021 ◽  
Vol 14 (1) ◽  
pp. 18
Author(s):  
Michael Li ◽  
Santoso Wibowo ◽  
Wei Li ◽  
Lily D. Li

Extreme learning machine (ELM) is a popular randomization-based learning algorithm that provides a fast solution for many regression and classification problems. In this article, we present an ELM-based method for the spectral data analysis problem, which is essentially a class of inverse problems: it requires determining the structural parameters of a physical sample from given spectroscopic curves. We propose approximating the unknown target inverse function with an ELM, adding a linear neuron to correct the localized effect caused by the Gaussian basis functions. Unlike conventional methods involving intensive numerical computations, under the new conceptual framework the task of performing spectral data analysis becomes a learning task from data. As spectral data are typically high-dimensional, the dimensionality-reduction technique of principal component analysis (PCA) is applied to reduce the dimension of the dataset and ensure convergence. The proposed framework is illustrated using a set of simulated Rutherford backscattering spectra. The results show that the proposed method achieves prediction errors of less than 1%, outperforming the predictions of multi-layer perceptron and numerical-based techniques. The presented method could be implemented as application software for real-time spectral data analysis by integrating it into a spectroscopic data collection system.
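The ELM training procedure itself is simple enough to sketch. The example below shows the generic algorithm on a toy regression task (not the paper's Rutherford backscattering spectra, and without the linear correction neuron or the PCA step): input weights are drawn at random and never trained, and only the output weights are solved in closed form by least squares:

```python
import numpy as np

rng = np.random.default_rng(5)

def elm_fit(X, y, n_hidden=100):
    """Train an ELM: random hidden layer, least-squares output weights."""
    W = rng.normal(size=(X.shape[1], n_hidden))   # random, never trained
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # closed-form solution
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy smooth target function standing in for an inverse mapping.
X = np.linspace(-1, 1, 400).reshape(-1, 1)
y = np.sin(3 * X[:, 0]) + 0.3 * X[:, 0]
W, b, beta = elm_fit(X, y)
err = np.max(np.abs(elm_predict(X, W, b, beta) - y))
print("max training error:", err)
```

The speed advantage the abstract mentions comes from this structure: the only "training" is one linear least-squares solve, with no iterative backpropagation.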


2006 ◽  
Vol 47 (7) ◽  
pp. 1399-1405 ◽  
Author(s):  
Anh T. Nguyen ◽  
Tomoko Hirama ◽  
Vinita Chauhan ◽  
Roger MacKenzie ◽  
Ross Milne
