A note on the relationship between high-frequency trading and latency arbitrage

2016 ◽  
Vol 47 ◽  
pp. 281-296 ◽  
Author(s):  
Viktor Manahov

PLoS ONE ◽  
2021 ◽  
Vol 16 (12) ◽  
pp. e0260724
Author(s):  
Ke Meng ◽  
Shouhao Li

This paper uses NASDAQ order book data for the S&P 500 exchange traded fund (SPY) to examine the relationship between one-minute informational market efficiency and high-frequency trading (HFT). We find that the level of efficiency varies widely over time and appears to cluster. Periods of high efficiency are followed by periods of low efficiency and vice versa. Further, we find that HFT activity is higher during periods of low efficiency. This supports the argument that HFTs seek profits and risk reduction by actively processing information, through limit order additions and cancellations, during periods of lower efficiency, and revert to more passive market-making and rebate generation during periods of higher efficiency. These findings support the argument that the adaptive market hypothesis (AMH) is an appropriate description of how prices evolve to incorporate information.
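The abstract does not specify how one-minute informational efficiency is measured. A minimal sketch, assuming a rolling Lo–MacKinlay variance ratio as the efficiency proxy (the function names, window, and aggregation period are all illustrative choices, not the paper's method):

```python
import numpy as np

def variance_ratio(returns, q=5):
    """Lo-MacKinlay variance ratio: variance of overlapping q-period returns
    divided by q times the one-period variance; close to 1 under a random walk."""
    returns = np.asarray(returns, dtype=float)
    q_sums = np.convolve(returns, np.ones(q), mode="valid")  # overlapping q-period returns
    return q_sums.var(ddof=1) / (q * returns.var(ddof=1))

def rolling_inefficiency(returns, window=60, q=5):
    """|VR - 1| over a rolling window: larger values suggest lower efficiency,
    so clustering in this series would mirror the paper's finding."""
    r = np.asarray(returns, dtype=float)
    return np.array([abs(variance_ratio(r[i:i + window], q) - 1.0)
                     for i in range(len(r) - window + 1)])

# Synthetic one-minute returns for a single 390-minute trading day.
rng = np.random.default_rng(0)
minute_returns = rng.normal(0.0, 1e-3, size=390)
ineff = rolling_inefficiency(minute_returns, window=60, q=5)
```

One could then correlate `ineff` with an HFT-activity proxy (e.g., order additions plus cancellations per minute) to reproduce the qualitative comparison described above.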


Author(s):  
Matteo Aquilina ◽  
Eric Budish ◽  
Peter O’Neill

Abstract We use stock exchange message data to quantify the negative aspect of high-frequency trading, known as “latency arbitrage.” The key difference between message data and widely familiar limit order book data is that message data contain attempts to trade or cancel that fail. This allows the researcher to observe both winners and losers in a race, whereas in limit order book data you cannot see the losers, so you cannot directly see the races. We find that latency arbitrage races are very frequent (about one per minute per symbol for FTSE 100 stocks), extremely fast (the modal race lasts 5–10 millionths of a second), and account for a remarkably large portion of overall trading volume (about 20%). Race participation is concentrated, with the top six firms accounting for over 80% of all race wins and losses. The average race is worth just a small amount (about half a price tick), but because of the large volumes the stakes add up. Our main estimates suggest that races constitute roughly one-third of price impact and the effective spread (key microstructure measures of the cost of liquidity), that latency arbitrage imposes a roughly 0.5 basis point tax on trading, that market designs that eliminate latency arbitrage would reduce the market’s cost of liquidity by 17%, and that the total sums at stake are on the order of $5 billion per year in global equity markets alone.
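The key idea above is that message data reveal failed attempts, so races become directly observable. A minimal sketch of race detection under assumed, simplified message semantics (the `Message` fields, the 500-microsecond horizon, and the grouping rule are hypothetical illustrations, not the authors' definitions):

```python
from dataclasses import dataclass

@dataclass
class Message:
    t_us: float    # exchange timestamp in microseconds
    firm: str
    action: str    # "take" (aggressive order) or "cancel"
    price: float
    success: bool  # False = failed take (no fill) or failed cancel (too late)

def detect_races(messages, horizon_us=500):
    """Group messages aimed at the same price level within `horizon_us`.
    A race requires at least two distinct firms, with at least one success
    (the winner) and one failure (a loser visible only in message data)."""
    by_price = {}
    for m in messages:
        by_price.setdefault(m.price, []).append(m)
    races = []
    for msgs in by_price.values():
        msgs.sort(key=lambda m: m.t_us)
        i = 0
        while i < len(msgs):
            group = [msgs[i]]
            j = i + 1
            while j < len(msgs) and msgs[j].t_us - group[0].t_us <= horizon_us:
                group.append(msgs[j])
                j += 1
            firms = {m.firm for m in group}
            if (len(firms) >= 2 and any(m.success for m in group)
                    and any(not m.success for m in group)):
                races.append(group)
            i = j
    return races

# Two firms hit the same stale quote 50 microseconds apart: one race.
msgs = [Message(0.0, "A", "take", 100.0, True),
        Message(50.0, "B", "take", 100.0, False),
        Message(10_000.0, "C", "cancel", 101.0, True)]
races = detect_races(msgs)
```

In limit order book data only firm A's fill would appear; firm B's failed attempt, and hence the race itself, is visible only in the message stream.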


2020 ◽  
Vol 29 (3) ◽  
pp. 429-435
Author(s):  
Patricia C. Mancini ◽  
Richard S. Tyler ◽  
Hyung Jin Jun ◽  
Tang-Chuan Wang ◽  
Helena Ji ◽  
...  

Purpose The minimum masking level (MML) is the minimum intensity of a stimulus required to just totally mask the tinnitus. Treatments aimed at reducing the tinnitus itself should attempt to measure the magnitude of the tinnitus. The objective of this study was to evaluate the reliability of the MML. Method The sample consisted of 59 tinnitus patients who reported stable tinnitus. We obtained MML measures on two visits, separated by about 2–3 weeks. We used two noise types: speech-shaped noise and high-frequency emphasis noise. We also investigated the relationship between the MML and tinnitus loudness estimates and the Tinnitus Handicap Questionnaire (THQ). Results There were differences across the different noise types. The within-session standard deviation averaged across subjects varied between 1.3 and 1.8 dB. Across the two sessions, the Pearson correlation coefficient was r = .84. There was a weak relationship between the dB SL MML and loudness, and between the MML and the THQ. A moderate correlation ( r = .44) was found between the THQ and loudness estimates. Conclusions We conclude that the dB SL MML can be a reliable estimate of tinnitus magnitude, with expected standard deviations in trained subjects of about 1.5 dB. It appears that the dB SL MML and loudness estimates are not closely related.


1986 ◽  
Vol 51 (4) ◽  
pp. 362-369 ◽  
Author(s):  
Donna M. Risberg ◽  
Robyn M. Cox

A custom in-the-ear (ITE) hearing aid fitting was compared to two over-the-ear (OTE) hearing aid fittings for each of 9 subjects with mild to moderately severe hearing losses. Speech intelligibility via the three instruments was compared using the Speech Intelligibility Rating (SIR) test. The relationship between functional gain and coupler gain was compared for the ITE and the higher rated OTE instruments. The difference in input received at the microphone locations of the two types of hearing aids was measured for 10 different subjects and compared to the functional gain data. It was concluded that (a) for persons with mild to moderately severe hearing losses, appropriately adjusted custom ITE fittings typically yield speech intelligibility that is equal to the better OTE fitting identified in a comparative evaluation; and (b) gain prescriptions for ITE hearing aids should be adjusted to account for the high-frequency emphasis associated with in-the-concha microphone placement.


Author(s):  
Yacine Aït-Sahalia ◽  
Jean Jacod

High-frequency trading is an algorithm-based computerized trading practice that allows firms to trade stocks in milliseconds. Over the last fifteen years, the use of statistical and econometric methods for analyzing high-frequency financial data has grown exponentially. This growth has been driven by the increasing availability of such data, the technological advancements that make high-frequency trading strategies possible, and the need of practitioners to analyze these data. This comprehensive book introduces readers to these emerging methods and tools of analysis. The book covers the mathematical foundations of stochastic processes, describes the primary characteristics of high-frequency financial data, and presents the asymptotic concepts that their analysis relies on. It also deals with estimation of the volatility portion of the model, including methods that are robust to market microstructure noise, and addresses estimation and testing questions involving the jump part of the model. As the book demonstrates, the practical importance and relevance of jumps in financial data are universally recognized, but only recently have econometric methods become available to rigorously analyze jump processes. The book approaches high-frequency econometrics with a distinct focus on the financial side of matters while maintaining technical rigor, which makes this book invaluable to researchers and practitioners alike.
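The separation of the volatility and jump parts of the model described above can be illustrated with two standard estimators. A minimal sketch, assuming realized variance and Barndorff-Nielsen–Shephard bipower variation (one of many estimators the literature covers; the simulation parameters are illustrative):

```python
import numpy as np

def realized_variance(prices):
    """Sum of squared log returns: converges to integrated variance
    plus the sum of squared jumps."""
    r = np.diff(np.log(prices))
    return float(np.sum(r ** 2))

def bipower_variation(prices):
    """Bipower variation: products of adjacent absolute returns are robust
    to jumps, so RV - BV estimates the jump contribution."""
    r = np.abs(np.diff(np.log(prices)))
    return float((np.pi / 2) * np.sum(r[1:] * r[:-1]))

# Simulated high-frequency log-price path with one large jump.
rng = np.random.default_rng(1)
returns = rng.normal(0.0, 1e-3, size=1000)
returns[500] += 0.05  # jump
prices = 100 * np.exp(np.cumsum(returns))
rv = realized_variance(prices)
bv = bipower_variation(prices)
```

On a pure diffusion the two estimators roughly agree; the jump inflates `rv` but barely moves `bv`, and testing whether `rv - bv` is significantly positive is the kind of question the book treats rigorously.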


Author(s):  
Peter Gomber ◽  
Björn Arndt ◽  
Marco Lutat ◽  
Tim Elko Uhle
