A Parallel Version of the Lindsey-Fox algorithm for factoring High Degree Polynomials in Signal Processing

Author(s):  
C. Burrus ◽  
James Fox ◽  
Gary Sitton ◽  
Sven Treitel
1986 ◽  
Vol 14 (3) ◽  
pp. 139-159 ◽  
Author(s):  
A. G. Veith

Abstract A system, called the “Driving Severity Monitor” (DSM), has been developed for characterizing tire force distribution as related to treadwear, either in normal tire use or in tire fleet testing in a convoy. The system consists of an accelerometer for monitoring lateral accelerations, a wheel revolution counter, and a module for signal processing and read-out. The output of the DSM is reduced to a single index, the Driving Severity Number (DSN), which characterizes a vehicle journey. The DSN is equal to the sum of squares of lateral acceleration, measured once per tire revolution during a trip, divided by the number of wheel revolutions. The DSN had a high degree of correlation (R ≧ 0.95) with treadwear in two wear programs when pavement abrasiveness was held constant. This supports the concept that the three basic treadwear components (tire force distribution, pavement abrasiveness, and ambient temperature) can be separated for a better understanding of tire treadwear.
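The DSN defined above is simply the mean of squared per-revolution lateral accelerations. A minimal sketch in Python (the function name and sample values are illustrative, not from the paper):

```python
def driving_severity_number(lateral_accels):
    """DSN: sum of squared lateral accelerations (one sample per wheel
    revolution over the trip) divided by the number of revolutions."""
    if not lateral_accels:
        raise ValueError("need at least one per-revolution sample")
    return sum(a * a for a in lateral_accels) / len(lateral_accels)

# Three revolutions with lateral accelerations 0.1, -0.2, 0.3:
print(driving_severity_number([0.1, -0.2, 0.3]))  # (0.01 + 0.04 + 0.09) / 3
```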


Sensors ◽  
2018 ◽  
Vol 18 (9) ◽  
pp. 2988 ◽  
Author(s):  
Zhenhong Chen ◽  
Yingtao Ding ◽  
Shiwei Ren ◽  
Zhiming Chen

Recently, the concept of the difference and sum co-array (DSCa) has attracted much attention in array signal processing due to its high degrees of freedom (DOFs). In this paper, the DSCa of the nested array (NA) is analyzed and an improved nested configuration, the diff-sum nested array (DsNA), is proposed. We find and prove that the sum set of the NA contains all the elements of the difference set. Thus, a dual characteristic exists between the two sets: for the difference between any two sensor locations of the NA, an equivalent non-negative/non-positive sum of two other sensor locations can always be found. To reduce this redundancy for further DOF enhancement, we develop a new DsNA configuration by moving nearly half of the dense sensors of the NA to the right side of the sparse uniform linear array (ULA) part. These moved sensors, together with the original sparse ULA, form an extended sparse ULA. For analysis, we provide closed-form expressions for the DsNA sensor locations as well as the DOF. Compared with sparse arrays with large apertures, such as the NA, the coprime array, and the augmented nested array, the DsNA achieves a higher number of DOFs. The effectiveness of the proposed array is verified by simulations.
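The containment claim above (every non-negative difference of two NA sensor locations also appears as a sum of two sensor locations) is easy to check numerically for a concrete array. A small Python sketch, assuming one common NA parameterization with the first sensor at position 0 (the paper's exact convention may differ):

```python
from itertools import combinations_with_replacement

def nested_array(n1, n2):
    # Assumed parameterization: dense ULA at 0..n1-1, sparse ULA at
    # (n1 + 1) * m - 1 for m = 1..n2.
    return sorted(set(range(n1)) | {(n1 + 1) * m - 1 for m in range(1, n2 + 1)})

def diff_set(s):
    # All pairwise differences between sensor locations.
    return {a - b for a in s for b in s}

def sum_set(s):
    # All pairwise sums (self-sums included).
    return {a + b for a, b in combinations_with_replacement(s, 2)}

s = nested_array(3, 3)                     # [0, 1, 2, 3, 7, 11]
diffs = {d for d in diff_set(s) if d >= 0}  # non-negative differences
sums = sum_set(s)
print(diffs <= sums)  # True: the sum set covers the difference set here
```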


2000 ◽  
Vol 10 (02n03) ◽  
pp. 165-176 ◽  
Author(s):  
EDWIN RIJPKEMA ◽  
ED F. DEPRETTERE ◽  
BART KIENHUIS

High-level modeling and (quantitative) performance analysis of signal processing systems requires high-level models of the applications (algorithms) and the implementations (architectures), a mapping of the former onto the latter, and a simulator for fast execution of the whole. Signal processing algorithms are very often nested-loop algorithms with a high degree of inherent parallelism. This paper presents, for such applications, a suitable application model and a method to convert a given imperative executable specification into a specification in terms of this application model. The methods and tools are illustrated by means of an example.


Sensors ◽  
2021 ◽  
Vol 21 (20) ◽  
pp. 6812
Author(s):  
Shane Reid ◽  
Sonya Coleman ◽  
Philip Vance ◽  
Dermot Kerr ◽  
Siobhan O’Neill

Retail shoplifting is one of the most prevalent forms of theft and accounted for over one billion GBP in losses for UK retailers in 2018. An automated approach to detecting behaviours associated with shoplifting using surveillance footage could help reduce these losses. Until recently, most state-of-the-art vision-based approaches to this problem have relied heavily on black-box deep learning models. While these models have been shown to achieve very high accuracy, the lack of understanding of how their decisions are made raises concerns about potential bias in the models. This limits the ability of retailers to implement these solutions, as several high-profile legal cases have recently ruled that evidence taken from these black-box methods is inadmissible in court. There is an urgent need for models which can achieve high accuracy while providing the necessary transparency. One way to alleviate this problem is to use social signal processing to add a layer of understanding in the development of transparent models for this task. To this end, we present a social signal processing model for shoplifting prediction which has been trained and validated using a novel dataset of manually annotated shoplifting videos. The resulting model provides a high degree of understanding and achieves accuracy comparable with current state-of-the-art black-box methods.


Geophysics ◽  
1988 ◽  
Vol 53 (4) ◽  
pp. 553-557 ◽  
Author(s):  
Charles T. Young ◽  
John R. Booker ◽  
Ricardo Fernandez ◽  
George R. Jiracek ◽  
Mario Martinez ◽  
...  

Given the complexity of modern magnetotelluric (MT) instrumentation, comparison of the total performance of two or more systems is an important verification test. This paper compares the processed data from five MT systems which were designed and constructed separately, and which employ different electrode types, electrode separations, magnetometers, and methods of signal processing. The comparison shows a high degree of agreement among the data from the different systems. The study also demonstrates the compatibility and reliability of the MT systems employed as part of EMSLAB Juan de Fuca (Electromagnetic Sounding of the Lithosphere and Asthenosphere Beneath the Juan de Fuca Plate). This project, proposed by a consortium of institutions, involves not only magnetotelluric studies but also studies of magnetic variation, on land and on the sea bottom. The project calls for the real‐time MT systems to occupy stations along segments of a profile in Oregon; a composite profile will be created from the segments. Prior to commencing the main MT profiling phase, one week was set aside in August 1984 for all groups to record and process MT data sequentially at six sites in diverse geologic terrains; this experiment was called mini‐EMSLAB.


2007 ◽  
Vol 20 (3) ◽  
pp. 437-459 ◽  
Author(s):  
Mariusz Rawski ◽  
Bogdan Falkowski ◽  
Tadeusz Łuba

This paper discusses the efficiency of different methodologies for implementing DSP algorithms on modern FPGA architectures. Modern programmable structures are equipped with specialized embedded DSP blocks that allow digital signal processing algorithms to be implemented using the methodology known from digital signal processors. First and foremost, however, programmable architectures give the designer the possibility of increasing the efficiency of the designed system by exploiting the parallelism of the implemented algorithms. Moreover, special techniques such as distributed arithmetic (DA) can be applied to boost the performance of the designed processing systems. Additionally, applying functional decomposition based methods, known to be best suited to FPGA structures, allows the capabilities of programmable technology to be utilized to a very high degree. The paper presents a comparison of different design approaches in this area.
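The distributed arithmetic technique mentioned above replaces multipliers with a lookup table of partial coefficient sums, accumulated bit-serially over the input words. A minimal unsigned-integer sketch in Python (an illustration of the principle, not FPGA code; a hardware version would use two's-complement inputs):

```python
def da_inner_product(coeffs, xs, bits):
    """Compute sum(c*x) bit-serially via a DA lookup table."""
    n = len(coeffs)
    # LUT[p] = sum of the coefficients selected by bit pattern p.
    lut = [sum(c for k, c in enumerate(coeffs) if p >> k & 1)
           for p in range(1 << n)]
    acc = 0
    for b in reversed(range(bits)):  # MSB first: shift-and-accumulate
        pattern = 0
        for k, x in enumerate(xs):
            pattern |= ((x >> b) & 1) << k  # bit b of every input word
        acc = (acc << 1) + lut[pattern]
    return acc

print(da_inner_product([3, 5, 7], [2, 4, 6], bits=4))  # 3*2 + 5*4 + 7*6 = 68
```

The LUT has 2^n entries for n coefficients, so in practice the filter is partitioned into small groups of taps, one LUT per group.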


2021 ◽  
Vol 5 (EICS) ◽  
pp. 1-29
Author(s):  
Jakob Karolus ◽  
Francisco Kiss ◽  
Caroline Eckerth ◽  
Nicolas Viot ◽  
Felix Bachmann ◽  
...  

Body movements, from a short smile to a marathon run, are driven by muscle activity. Although measuring muscle activity with electromyography (EMG) is technically well established, it is highly complex, and its use in interfaces has been limited. Easy access to muscle sensing can offer new opportunities to Human-Computer Interaction (HCI) research. Off-the-shelf sensors often provide only low-level access, requiring expertise in signal processing and widening the gulf of execution for users without engineering skills. To address this challenge, we introduce EMBody, a data-centric toolkit for EMG-based interface prototyping and experimentation. EMBody offers multiple levels of prototyping fidelity for EMG sensing, signal processing, and data interpretation. Our data-centric toolkit encapsulates the different data representation stages, offering a wide range of customization opportunities to experts while also allowing non-technical designers to focus on creating new interaction techniques. EMBody features a lightweight form factor and wireless connectivity. Additionally, the system supports an exploration-centered workflow by allowing rapid access to measurement data via the accompanying software. Users define a set of motions to be recognized and interactively provide example data points. The toolkit then handles signal processing and classification. The recognized movements are streamed on the local network, ready to be used by interactive applications. This paper reports on how to use EMBody and on its implementation. We iteratively developed the toolkit in a series of workshops and example applications. Users who had no or very limited knowledge of EMG could rapidly create engaging functional prototypes, while experts appreciated the modularity of the software component, which allows for a high degree of customization. We contribute the software and hardware components of EMBody as a tool for the research community to stimulate creative exploration of EMG systems.
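The workflow described above (users supply labeled example windows, the toolkit extracts features and classifies new data) can be illustrated with a deliberately simple sketch. This is not EMBody's actual API; the RMS feature and nearest-centroid classifier are stand-ins for whatever pipeline the toolkit uses internally:

```python
import math

def rms(window):
    # Root-mean-square amplitude: a common EMG activation feature.
    return math.sqrt(sum(s * s for s in window) / len(window))

def train(examples):
    # examples: {gesture_name: [window, ...]} -> per-gesture mean RMS.
    return {name: sum(rms(w) for w in ws) / len(ws)
            for name, ws in examples.items()}

def classify(model, window):
    # Assign the gesture whose mean feature is nearest the new window's.
    f = rms(window)
    return min(model, key=lambda name: abs(model[name] - f))

examples = {
    "rest": [[0.01, -0.02, 0.01, 0.00]],
    "flex": [[0.8, -0.9, 0.7, -0.6]],
}
model = train(examples)
print(classify(model, [0.7, -0.8, 0.9, -0.7]))  # flex
```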


Author(s):  
Adrian F. van Dellen

The morphologic pathologist may require information on the ultrastructure of a non-specific lesion seen under the light microscope before he can make a specific determination. Such lesions, when caused by infectious disease agents, may be sparsely distributed in any organ system. Tissue culture systems, too, may have only widely dispersed foci suitable for ultrastructural study. In these situations, when only a few small foci in large tissue areas are useful for electron microscopy, it is advantageous to employ a methodology which rapidly selects, from amongst the surrounding tissue, a single tissue focus that is expected to yield useful ultrastructural data. This is in essence what "LIFTING" accomplishes. We have developed LIFTING to a high degree of accuracy and repeatability utilizing the Microlift (Fig 1), and have successfully applied it to tissue culture monolayers, histologic paraffin sections, and tissue blocks with large surface areas that had been initially fixed for either light or electron microscopy.


Author(s):  
Cecil E. Hall

The visualization of organic macromolecules such as proteins, nucleic acids, viruses, and virus components has reached its high degree of effectiveness owing to the refinement and reliability of instruments and to the invention of methods for enhancing the structure of these materials within the electron image. The latter techniques have been most important because what can be seen depends upon the molecular and atomic character of the object as modified, which is rarely evident in the pristine material. Structure may thus be displayed by the arts of positive and negative staining, shadow casting, replication, and other techniques. Enhancement of contrast, which delineates the bounds of isolated macromolecules, has been effected progressively over the years by these methods, as illustrated in Figs. 1, 2, 3, and 4. We now look to the future, wondering what other visions are waiting to be seen. The instrument designers will need to exact from the arts of fabrication the performance that theory has prescribed, as well as methods for phase and interference contrast, with explorations of the potentialities of very high and very low voltages. Chemistry must play an increasingly important part in future progress by providing specific stain molecules of high visibility, substrates of vanishing “noise” level, and means for preserving molecular structures that usually exist in a solvated condition.

