Design Time Optimization for Hardware Watermarking Protection of HDL Designs

2015 ◽  
Vol 2015 ◽  
pp. 1-14 ◽  
Author(s):  
E. Castillo ◽  
D. P. Morales ◽  
A. García ◽  
L. Parrilla ◽  
E. Todorovich ◽  
...  

HDL-level design offers important advantages for applying watermarking to IP cores, but its complexity also calls for tools that automate these watermarking procedures. This work proposes a new tool for distributing a signature through combinational logic. IPP@HDL, a previously proposed high-level watermarking technique, was used to evaluate the tool. IPP@HDL spreads the bits of a digital signature at the HDL design level using combinational logic included within the original system. The new signature-distribution tool not only extends and eases the applicability of this IPP technique, but also improves the signature-hosting process itself. Three algorithms were studied in developing the automated tool; a cost function selects the best hosting solutions in terms of area and performance penalties on the IP core to be protected. A 1D-DWT core and MD5 and SHA-1 digital signatures illustrate the benefits of the new tool and its optimization of the extraction-logic resources. Among the proposed algorithms, the alternative based on simulated annealing reduces the additional resources while maintaining an acceptable computation time, also saving designer effort and time.
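The simulated-annealing alternative described above can be sketched as follows. The candidate hosting locations, their penalty values, and the weighted area/delay cost are hypothetical stand-ins for illustration only; the paper's actual cost function and design representation are not reproduced here.

```python
import math
import random

def anneal_hosting(n_bits, candidates, w_area=1.0, w_delay=1.0,
                   t0=10.0, cooling=0.95, steps=2000, seed=0):
    """Choose a hosting location for each signature bit by simulated
    annealing, minimising a weighted area/delay cost.  `candidates` maps a
    location name to an (area_penalty, delay_penalty) pair -- hypothetical
    inputs, not the paper's actual cost model."""
    rng = random.Random(seed)
    names = list(candidates)

    def cost(sol):
        return sum(w_area * candidates[c][0] + w_delay * candidates[c][1]
                   for c in sol)

    sol = [rng.choice(names) for _ in range(n_bits)]
    cur = cost(sol)
    best, best_cost = list(sol), cur
    t = t0
    for _ in range(steps):
        i = rng.randrange(n_bits)
        old = sol[i]
        sol[i] = rng.choice(names)          # perturb one bit's hosting choice
        new = cost(sol)
        # Metropolis rule: always accept improvements; accept worse moves
        # with a probability that shrinks as the temperature cools.
        if new <= cur or rng.random() < math.exp((cur - new) / t):
            cur = new
            if cur < best_cost:
                best, best_cost = list(sol), cur
        else:
            sol[i] = old                    # reject the move
        t *= cooling
    return best, best_cost
```

With two candidate locations of unequal penalty, the search converges on the cheaper one for every signature bit.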

Author(s):  
Wim Vanderbauwhede

With the increase in System-on-Chip (SoC) complexity and CMOS technology capabilities, the SoC design community has recently observed a convergence of a number of critical trends, all of them aimed at addressing the design gap: the advent of heterogeneous multicore SoCs and Networks-on-Chip and the recognition of the need for design reuse through Intellectual Property (IP) cores, for dynamic reconfigurability and for high abstraction-level design. In this chapter, we present a solution for High-level Programming of Dynamically Reconfigurable NoC-based Heterogeneous Multicore SoCs. Our solution, the Gannet framework, allows IP core-based Heterogeneous Multicore SoCs to be programmed using a high-level language whilst preserving the full potential for parallelism and dynamic reconfigurability inherent in such a system. The required hardware infrastructure is small and low-latency, thus adding full dynamic reconfiguration capabilities with a small overhead both in area and performance.


2016 ◽  
Author(s):  
Ahmed A. Metwally ◽  
Yang Dai ◽  
Patricia W. Finn ◽  
David L. Perkins

Abstract. Metagenome shotgun sequencing presents opportunities to identify organisms that may prevent or promote disease. The analysis of sample diversity is achieved by taxonomic identification of metagenomic reads followed by generation of an abundance profile. Numerous tools have been developed based on different design principles. Tools achieving high precision can lack sensitivity in some applications; conversely, tools with high sensitivity can suffer from low precision and require long computation times. In this paper, we present WEVOTE (WEighted VOting Taxonomic idEntification), a method that classifies metagenome shotgun sequencing DNA reads based on an ensemble of existing methods using k-mer-based, marker-based, and naive similarity-based approaches. Our evaluation on fourteen benchmarking datasets shows that WEVOTE improves classification precision by reducing false-positive annotations while preserving a high level of sensitivity. WEVOTE is an efficient and automated tool that combines multiple individual taxonomic identification methods to produce more precise and sensitive microbial profiles. It was developed primarily to identify reads generated by metagenome shotgun sequencing; it is expandable and has the potential to incorporate additional tools to produce a more accurate taxonomic profile. WEVOTE was implemented in C++ and shell scripting and is available at www.bitbucket.org/ametwally/wevote.
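The ensemble idea can be illustrated with a minimal per-read voting sketch. This is a simplified flat-label vote: the actual WEVOTE algorithm resolves disagreements on the taxonomy tree, which this illustration omits, and the tool names and weights below are hypothetical.

```python
from collections import Counter

def weighted_vote(tool_calls, weights=None, min_score=0.5):
    """Combine per-read taxon calls from several classifiers by weighted
    voting.  `tool_calls` maps a tool name to its taxon call (None means
    the tool left the read unclassified)."""
    weights = weights or {tool: 1.0 for tool in tool_calls}
    total = sum(weights.values())
    scores = Counter()
    for tool, taxon in tool_calls.items():
        if taxon is not None:
            scores[taxon] += weights[tool]
    if not scores:
        return None                       # no tool classified the read
    taxon, score = scores.most_common(1)[0]
    # Accept the consensus only if it carries at least `min_score` of the
    # total voting weight; otherwise leave the read unclassified.  This is
    # how an ensemble trades a little sensitivity for fewer false positives.
    return taxon if score / total >= min_score else None
```

Reads on which the classifiers disagree too strongly are left unclassified rather than annotated with a weak majority, which is what suppresses false-positive calls.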


Author(s):  
G. Rozi ◽  
V. Thanopoulos ◽  
N. Geladas ◽  
E. Soultanaki ◽  
M. Dopsaj

The effect of anthropometric characteristics on performance has been the subject of many studies (Reilly, T., Bangsbo, J., & Franks, A. (2000). Anthropometric and physiological predispositions for elite soccer. Journal of Sports Science, 18(9), 669–683), but performance also depends on different physiological parameters. The aim of the present study is to define the anthropometric and physiological variables that best predict 100 m freestyle swimming performance time. Twenty-five competitive male swimmers (age: 15 ± 1.2 years) participated in the research. Multiple stepwise regression analysis showed that arm span is the best predictor of 100 m freestyle swimming performance (r = 0.835); arm span explains 68.5% of the variance of the dependent variable (Adj R2: 0.685). In the final model, the variables that best describe 100 m freestyle swimming are the number of strokes over 100 m, triceps skinfold, and pelvis and shoulder width (Adj R2: 0.882). These findings confirm the importance of anthropometric variables for swimming performance and could help coaches in the selection of high-level athletes.
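Stepwise regression of the kind used above can be sketched as a greedy forward search over adjusted R². The data below are synthetic and the predictor names are placeholders; the study's actual measurements are not reproduced here.

```python
import numpy as np

def adj_r2(X, y):
    """Adjusted R^2 of an ordinary-least-squares fit with intercept."""
    n, k = X.shape
    A = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    r2 = 1.0 - float(resid @ resid) / float(((y - y.mean()) ** 2).sum())
    return 1.0 - (1.0 - r2) * (n - 1) / (n - k - 1)

def forward_stepwise(X, y, names):
    """Greedy forward selection: repeatedly add the predictor that most
    improves adjusted R^2; stop when no addition helps."""
    chosen, remaining, best = [], list(range(X.shape[1])), -np.inf
    while remaining:
        score, j = max((adj_r2(X[:, chosen + [j]], y), j) for j in remaining)
        if score <= best:
            break
        best = score
        chosen.append(j)
        remaining.remove(j)
    return [names[j] for j in chosen], best
```

On synthetic data in which swim time depends mainly on one predictor, the procedure selects that predictor first, mirroring how arm span entered the study's model first.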


2018 ◽  
Vol 6 (3) ◽  
pp. 118-123 ◽  
Author(s):  
Lailan M. Haji ◽  
Subhi R.M. Zeebaree ◽  
Karwan Jacksi ◽  
Diyar Q. Zeebaree

With the huge growth of computationally heavy applications that demand a high level of performance, interest in monitoring operating system performance has also grown widely. Since OS performance became a critical issue, many studies over the past several years have investigated and evaluated the stability of OS performance. This paper presents a survey of the most important state-of-the-art approaches and models used for performance measurement and evaluation. Furthermore, it highlights the performance-improvement capabilities of different operating systems using multiple metrics. The selection of metrics used for monitoring performance depends on the monitoring goals and the performance requirements. Many previous works on this subject are addressed, explained in detail, and compared to highlight the most important features on which selection of the best approach can be based.


2019 ◽  
Vol 214 ◽  
pp. 01051 ◽  
Author(s):  
Julie Kirk

The design and performance of the ATLAS Inner Detector (ID) trigger algorithms running online on the High Level Trigger (HLT) processor farm for 13 TeV LHC collision data with high pileup are discussed. The HLT ID tracking is a vital component in all physics signatures in the ATLAS trigger for the precise selection of the rare or interesting events necessary for physics analysis without overwhelming the offline data storage in terms of both size and rate. To cope with the high interaction rates expected in the 13 TeV LHC collisions, the ID trigger was redesigned during the 2013-15 long shutdown. The performance of the ID trigger in Run 2 data from 13 TeV LHC collisions has been excellent and exceeded expectations, even at the very high interaction multiplicities observed at the end of data-taking in 2017. The detailed efficiencies and resolutions of the ID trigger in a wide range of physics signatures are presented for the Run 2 data. The superb performance of the ID trigger algorithms in these extreme pileup conditions demonstrates how the ID tracking continues to lie at the heart of the trigger performance to enable the ATLAS physics program, and will continue to do so in the future.


1986 ◽  
Vol 34 (10) ◽  
pp. 1245-1252 ◽  
Author(s):  
G Shippey ◽  
A D Carothers ◽  
J Gordon

The Medical Research Council's fast interval processor (FIP) has been adapted for metaphase finding and selection. This article summarizes recent improvements to the hardware, and describes the selection of image features. The system uses a highly simplified but effective clustering procedure to reduce computation time, and incorporates a ranking algorithm based on computed cluster features so that high-quality metaphases can be preferentially selected. Experimental results indicate that the system can detect high-quality metaphases rapidly in "rich" material and a high proportion of the available metaphases in "sparse" material. It can handle a wide range of material with good repeatability of performance.
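The clustering-and-ranking pipeline can be illustrated with a toy sketch: a greedy centroid-threshold clustering to group detected objects cheaply, followed by a ranking on a density-style quality score. Both the clustering rule and the score below are hypothetical stand-ins; the article's actual cluster features are not reproduced here.

```python
def cluster_points(points, radius=10.0):
    """Greedy single-pass clustering: each point joins the first cluster
    whose running centroid lies within `radius`, otherwise it starts a new
    cluster.  A cheap stand-in for FIP's simplified clustering step."""
    clusters = []   # running [sum_x, sum_y, count] per cluster
    members = []    # the points assigned to each cluster
    for x, y in points:
        for i, (sx, sy, n) in enumerate(clusters):
            if ((x - sx / n) ** 2 + (y - sy / n) ** 2) ** 0.5 <= radius:
                clusters[i] = [sx + x, sy + y, n + 1]
                members[i].append((x, y))
                break
        else:
            clusters.append([x, y, 1])
            members.append([(x, y)])
    return members

def rank_clusters(members):
    """Rank clusters by a toy quality score: more objects packed into a
    smaller bounding box scores higher (illustrative only)."""
    def score(pts):
        xs, ys = zip(*pts)
        area = (max(xs) - min(xs) + 1) * (max(ys) - min(ys) + 1)
        return len(pts) ** 2 / area
    return sorted(members, key=score, reverse=True)
```

A single pass over the detections keeps the computation time low, and ranking then lets dense, metaphase-like clusters be examined preferentially.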


Author(s):  
Michael J Schmid ◽  
Achim Conzelmann ◽  
Claudia Zuber

Talent selection in rowing is often based solely on anthropometric and performance variables, even though psychological characteristics are considered important contributors to successful talent development. Because multidimensional talent models and holistic theories represent the state of the art in talent research, we aimed to find patterns connecting psychological and performance variables to future success in rowing. Therefore, 22 coaches rated the achievement-motivated behavior, represented by the variables proactivity, ambition, and commitment, of 65 competitive to high-level athletes (Mage = 17.2 ± 1.55 years) for the past year (t1). Additionally, the athletes performed several 2,000 m ergometer tests during that same period. At t2 (30 months later), each rower's performance was evaluated based on success at different competitions. We used the person-oriented Linking of Clusters after removal of a Residue (LICUR) method to identify the relationships between achievement-motivated behavior and ergometer results at t1 and success at t2. The rowers could be assigned to five clusters. Although the highly motivated rowers were not the fastest on the ergometer at t1, they were more likely than the other clusters to be at the highest performance level at t2 (OR = 3.5, p < .05). By contrast, all the ambitionless and unmotivated rowers were either racing at national level or had dropped out. In conclusion, certain patterns of achievement-motivated behavior and current performance are associated with future success 30 months later, and considering achievement-motivated behavior in the selection of rowers seems promising.
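The odds ratio reported above (OR = 3.5) comes from a 2x2 cross-tabulation of cluster membership against later performance level; it can be computed as follows. The cell counts in the usage example are invented for illustration and are not the study's data.

```python
def odds_ratio(a, b, c, d, correction=0.5):
    """Odds ratio for a 2x2 contingency table:

                        top level   not top level
    highly motivated        a             b
    other clusters          c             d

    Applies the Haldane-Anscombe continuity correction when any cell is
    zero, so the ratio stays defined."""
    if 0 in (a, b, c, d):
        a, b, c, d = (x + correction for x in (a, b, c, d))
    return (a * d) / (b * c)
```

For example, hypothetical counts of 7/2 successes versus 10/10 in the comparison group give an odds ratio of 3.5.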


Author(s):  
Ahmad Arif Santosa ◽  
Anak Agung Ngurah Perwira Redi

Abstract. The digital signature system has been widely used, especially for document-approval activities during the COVID-19 pandemic in Indonesia. This study aims to determine the factors of the sustainability dimension that affect continued use of a digital signature system, based on the AHP approach. The AHP method was chosen because it can solve problems within an organized framework, so that decisions on a research problem can be made effectively and accurately. The results indicate that the continuous-improvement sub-criterion of the economic dimension is the main priority supporting the sustainability of the digital signature provider's business. The study then analyzes the selection of a digital signature platform among PrivyID, DigiSign, and Manual Input. Based on the calculation over the three alternatives, the DigiSign platform is superior to the PrivyID and Manual Input options. This result is in line with the commitment of the DigiSign platform, which makes it easy for users to quickly check pending documents, to sign documents with a high level of security and internationally standard encryption, and to track document status easily. By contrast, electronic signatures made by Manual Input have a critical weakness: the signature is not encrypted, so it cannot protect documents from theft of identity data or corporate-entity data by irresponsible parties.
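The core AHP calculation is the priority vector of a pairwise-comparison matrix, usually taken as its principal eigenvector, together with Saaty's consistency ratio. The judgment matrix in the usage example is hypothetical and is not the study's actual data.

```python
import numpy as np

RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}  # Saaty's RI values

def ahp_priorities(M):
    """Priority weights of an AHP pairwise-comparison matrix via its
    principal eigenvector, plus Saaty's consistency ratio (CR < 0.1 is
    conventionally acceptable)."""
    M = np.asarray(M, dtype=float)
    n = M.shape[0]
    vals, vecs = np.linalg.eig(M)
    k = int(np.argmax(vals.real))          # principal (Perron) eigenvalue
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                           # normalise weights to sum to 1
    ci = (vals[k].real - n) / (n - 1)      # consistency index
    cr = ci / RANDOM_INDEX[n] if RANDOM_INDEX[n] else 0.0
    return w, cr
```

With illustrative judgments favouring the first alternative, the first weight dominates and the judgments are acceptably consistent.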


Methodology ◽  
2007 ◽  
Vol 3 (1) ◽  
pp. 14-23 ◽  
Author(s):  
Juan Ramon Barrada ◽  
Julio Olea ◽  
Vicente Ponsoda

Abstract. The Sympson-Hetter (1985) method provides a means of controlling the maximum exposure rate of items in Computerized Adaptive Testing. Through a series of simulations, control parameters are set that mark the probability that an item, once selected, is administered. This method presents two main problems: calculating the parameters requires a long computation time, and the maximum exposure rate ends up slightly above the fixed limit. Van der Linden (2003) presented two alternatives that appear to solve both problems, but their impact on measurement accuracy had not yet been tested. We show how these methods over-restrict the exposure of some highly discriminating items and thus decrease accuracy. It is also shown that, when the desired maximum exposure rate is near the minimum possible value, these methods yield an empirical maximum exposure rate clearly above the goal. A new method, based on an initial estimation of the probability of administration and the probability of selection of the items under the restricted method (Revuelta & Ponsoda, 1998), is presented in this paper. It can be used with the Sympson-Hetter method and with both of van der Linden's methods. When used with Sympson-Hetter, this option speeds the convergence of the control parameters without decreasing accuracy.
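The Sympson-Hetter mechanism can be sketched in a few lines: each item carries a control parameter K that gates administration after selection, and the parameters are recalibrated from the selection probabilities observed in simulation. The probabilities in the example are hypothetical; this is a sketch of the general scheme, not of any of the specific variants compared in the paper.

```python
import random

def update_control_params(p_selected, r_max):
    """One Sympson-Hetter calibration step: after a simulation pass, each
    item's control parameter K_i becomes r_max / P(selected_i), capped at 1,
    so that P(administered_i) = P(selected_i) * K_i approaches r_max."""
    return [min(1.0, r_max / p) if p > 0 else 1.0 for p in p_selected]

def administer(item, K, rng=random):
    """An item picked by the item-selection rule is actually administered
    only with probability K[item]; otherwise selection is repeated."""
    return rng.random() < K[item]
```

Only items whose selection probability exceeds the target exposure rate are throttled; the rest keep K = 1 and are administered whenever selected.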

