Markus Bibinger from the University of Marburg, Moritz Jirak from TU Braunschweig and Markus Reiss from Humboldt University Berlin published a paper using LOBSTER data. It is titled Volatility estimation under one-sided errors with applications to limit order books and is forthcoming in the Annals of Applied Probability.
Abstract: For a semi-martingale X_t, which forms a stochastic boundary, a rate-optimal estimator for its quadratic variation ⟨X,X⟩_t is constructed based on observations in the vicinity of X_t. The problem is embedded in a Poisson point process framework, which reveals an interesting connection to the theory of Brownian excursion areas. We derive n^−1/3 as the optimal convergence rate in a high-frequency framework with n observations (in mean). We discuss a potential application for the estimation of the integrated squared volatility of an efficient price process X_t from intra-day order book quotes.
A working paper version can be found here.
Markus Bibinger from the University of Marburg, Nikolaus Hautsch from the University of Vienna, Peter Malec from the University of Cambridge and Markus Reiss from Humboldt University Berlin published a paper using LOBSTER data. It is titled Estimating the Spot Covariation of Asset Prices — Statistical Theory and Empirical Evidence and is forthcoming in the Journal of Business and Economic Statistics.
Abstract: We propose a new estimator for the spot covariance matrix of a multi-dimensional continuous semi-martingale log asset price process which is subject to noise and non-synchronous observations. The estimator is constructed based on a local average of block-wise parametric spectral covariance estimates. The latter originate from a local method of moments (LMM) which has recently been introduced by Bibinger et al. (2014). We prove consistency and a point-wise stable central limit theorem for the proposed spot covariance estimator in a very general setup with stochastic volatility, leverage effects and general noise distributions. Moreover, we extend the LMM estimator to be robust against autocorrelated noise and propose a method to adaptively infer the autocorrelations from the data. Based on simulations we provide empirical guidance on the effective implementation of the estimator and apply it to high-frequency data of a cross-section of Nasdaq blue chip stocks. Employing the estimator to estimate spot covariances, correlations and volatilities in normal but also unusual periods yields novel insights into intraday covariance and correlation dynamics. We show that intraday (co-)variations (i) follow underlying periodicity patterns, (ii) reveal substantial intraday variability associated with (co-)variation risk, and (iii) can increase strongly and nearly instantaneously if new information arrives.
A working paper version can be found here.
Torben G. Andersen from Northwestern University, Gökhan Cebiroglu and Nikolaus Hautsch, both from the University of Vienna, published a CFS working paper using LOBSTER data, titled Volatility, Information Feedback and Market Microstructure Noise: A Tale of Two Regimes.
Abstract: We extend the classical “martingale-plus-noise” model for high-frequency prices by an error correction mechanism originating from prevailing mispricing. The speed of price reversal is a natural measure for informational efficiency. The strength of the price reversal relative to the signal-to-noise ratio determines the signs of the return serial correlation and the bias in standard realized variance estimates. We derive the model’s properties and locally estimate it based on mid-quote returns of the NASDAQ 100 constituents. There is evidence of mildly persistent local regimes of positive and negative serial correlation, arising from lagged feedback effects and sluggish price adjustments. The model performance is decidedly superior to existing stylized microstructure models. Finally, we document intraday periodicities in the speed of price reversion and noise-to-signal ratios.
Read the working paper version here.
Gökhan Cebiroglu of Universität Wien and Ulrich Horst of Humboldt-Universität zu Berlin published an article in April 2015 with the title Optimal order display in limit order markets with liquidity competition using LOBSTER data. Abstract:
Order display is associated with benefits and costs. Benefits arise from increased execution priority, while costs are due to adverse market impact. We analyze a structural model of optimal order placement that captures the trade-off between the costs and benefits of order display. For a benchmark model of pure liquidity competition, we give a closed-form solution for optimal display sizes. We show that competition in liquidity supply incentivizes the use of hidden orders to prevent losses due to over-bidding. Thus, because aggressive liquidity competition is more prevalent in liquid stocks, our model predicts that the proportion of hidden liquidity is higher in liquid markets. Our theoretical considerations are supported by an empirical analysis using high-frequency order-message data from NASDAQ. We find that there are no benefits in hiding orders in illiquid stocks, whereas the performance gains can be significant in liquid stocks.
Julius Bonart and Martin Gould of Imperial College London published an article in April 2017 using LOBSTER data, titled Latency and Liquidity Provision in a Limit Order Book. Abstract:
We use a recent, high-quality data set from Nasdaq to perform an empirical analysis of order flow in a limit order book (LOB) before and after the arrival of a market order. For each of the stocks that we study, we identify a sequence of distinct phases across which the net flow of orders differs considerably. We note that some of our results are consistent with the widely reported phenomenon of stimulated refill, but that others are not. We therefore propose alternative mechanical and strategic motivations for the behaviour that we observe. Based on our findings, we argue that strategic liquidity providers consider both adverse selection and expected waiting costs when deciding how to act.
Read the working paper version here.
VieCo 2017 aims to bring together leading experts and practitioners in financial econometrics, financial statistics, quantitative financial economics and applied mathematical finance. It is jointly organized by the Department of Statistics and Operations Research of the University of Vienna, in cooperation with the Wolfgang Pauli Institute Vienna and the Department of Economics of the University of Copenhagen. The “Vienna–Copenhagen Conference on Financial Econometrics” will take place on March 9–11, 2017 in Vienna.
LOBSTER will be present at the conference. If you would like to get in touch, please come to our desk during the poster session on Friday or send us an email.
HFT2016 aims at bringing together some of the world’s leading experts on high-frequency trading. The focus will be on a critical analysis as well as the perspectives of this recent development in global financial markets.
LOBSTER will be presented at the conference by our academic adviser and co-founder Prof. Nikolaus Hautsch (who also happens to organize the event).
After two years of running LOBSTER with the support of Humboldt-Innovation GmbH, the incubator of Humboldt-Universität zu Berlin, we, the developers and creators of LOBSTER, have founded our own company to further develop and distribute LOBSTER.
Our company’s name, frischedaten UG (haftungsbeschränkt), roughly translates to “fresh data” and expresses what we strive for.
If you are already a customer, you have received an email legally informing you of the transition of your current contract from Humboldt-Innovation to frischedaten UG. Your terms remain unchanged, but you will shortly need to confirm the transition so that you can continue to access LOBSTER after the 14th of November.
New customers can now conclude a contract directly with frischedaten UG. We will update the relevant pages for the onboarding process shortly; in the meantime, just contact us with any questions regarding enrolment.
As our customer base grows, so does the variety of our users’ usage patterns. In recent weeks we saw an increase in very large data queries, which our queuing system handled efficiently but not entirely fairly towards users who wanted their small queries fulfilled within a reasonable time frame. To accommodate all of our users, we have tweaked LOBSTER’s queuing system.
From now on, each user has a fixed number of slots (currently three) that run queries in parallel. You can of course still enter all your queries at once: three of them will start immediately, while the others are queued until one of the running queries finishes.
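The idea is easiest to see as a per-user semaphore with a fixed number of slots. The following minimal Python sketch is purely illustrative (the class and names are our own for this post, not LOBSTER’s actual implementation): at most three of a user’s jobs run concurrently, and any further jobs wait until a slot frees up.

```python
import threading
import time

# Assumed per-user slot count; the post says "currently three".
MAX_PARALLEL = 3

class UserJobQueue:
    """Runs at most `max_parallel` jobs of one user concurrently."""

    def __init__(self, max_parallel=MAX_PARALLEL):
        self._slots = threading.Semaphore(max_parallel)  # the per-user slots
        self._threads = []

    def submit(self, job, *args):
        """Accept a job immediately; it starts running once a slot is free."""
        def runner():
            with self._slots:   # blocks while all three slots are busy
                job(*args)
        t = threading.Thread(target=runner)
        t.start()
        self._threads.append(t)
        return t

    def join(self):
        """Wait until every submitted job has finished."""
        for t in self._threads:
            t.join()

if __name__ == "__main__":
    q = UserJobQueue()
    for i in range(7):                        # enter all queries at once...
        q.submit(lambda i=i: time.sleep(0.05))
    q.join()                                  # ...only 3 ever run in parallel
```

Submitting seven queries at once, the first three start immediately and the remaining four are picked up as slots free, which matches the behaviour described above.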