The SEC's Former Top "HFT Expert" Joins HFT Titan Citadel
Last April, we commented on the most blatant (pre) revolving door we had ever seen at the SEC (and there have been many): the departure of the SEC's head HFT investigator, Gregg Berman, who during his tenure at the agency (whose alleged purpose is to keep the "market" fair, efficient and unmanipulated) did everything in his power to draw attention away from HFTs. He did that, for example, by blaming Waddell and Reed for the May 2010 flash crash. This is what Berman, whose full title was the SEC's "Associate Director of the Office of Analytics and Research in the Division of Trading and Markets" said in the final version of the agency's Flash Crash report:
Several years later, when the HFT lobby made a coordinated push to eliminate human spoofers (which algos were apparently helpless against without regulatory intervention), the SEC changed its story entirely and blamed the flash crash on one solitary trader, Navinder Sarao. By then the SEC had lost all credibility. It had also lost Gregg Berman, who six months after quitting the SEC ended up taking a nondescript job at EY, where he joined the Financial Services Organization (FSO) of Ernst & Young LLP as a Principal focusing on market risk and data analytics.
We, for one, were surprised: having expended so much energy to cater to the HFT lobby, we were confident Berman would end up collecting a 7-figure paycheck from one of the world's most prominent high frequency frontrunning parasite firms. As a reminder, this is what we predicted when the creator of Midas, and Eric Hunsader's archnemesis, quit the SEC:
In retrospect, Berman's detour into E&Y ended up being just that: an attempt to mask his true career intentions by taking a less than 1 year "sabbatical" from his true calling: getting compensated from the very HFT industry whom he did everything in his power to reward generously during his tenure at the SEC.
Well, as it turns out, we were right after all, because lo and behold, as the WSJ first reported, Gregg Berman is now director of market-structure research at the world's most levered hedge fund, HFT powerhouse and massive electronic market-making firm: Citadel, which also happens to be the entity through which the NY Fed intervenes in the market.
And just like that all is well again in the corrupt world, in which the market "regulators" pretend to protect the little guy, when in reality they only cater to the most criminal actors, in the simple hope of landing a job there one day and getting paid in one year what they would make in ten at the SEC or any other government agency.
Automated high-frequency trading has grown tremendously in the past 20 years and is responsible for about half of all trading activity at stock exchanges worldwide. Geography is central to the rise of high-frequency trading due to a market design of "continuous trading" that allows traders to engage in arbitrage based upon informational advantages built into the socio-technical assemblages that make up current capital markets. Enormous investments have been made in creating transmission technologies and optimizing computer architectures, all in an effort to shave milliseconds off order travel time (or latency) within and between markets. We show that as a result of the built spatial configuration of capital markets, "public" is no longer synonymous with "equal" information. High-frequency trading increases information inequalities between market participants.
Quantitative tools have been widely adopted in order to extract the massive amounts of information contained in a variety of financial data. Mathematics, statistics and computer algorithms have never been as important to financial practitioners as they are today. Investment banks develop equilibrium models to evaluate financial instruments; mutual funds apply time-series models to identify the risks in their portfolios; and hedge funds hope to extract market signals and statistical arbitrage from noisy market data. The rise of quantitative finance in the last decade relies on the development of computing techniques that make processing large datasets possible. As more data becomes available at higher frequencies, more research in quantitative finance has shifted to the microstructure of financial markets. High-frequency data is a typical example of big data characterized by the 3V's: velocity, variety and volume. In addition, the signal-to-noise ratio in financial time series is usually very small. High-frequency datasets are more likely to be exposed to extreme values, jumps and errors than low-frequency ones, so specific data-processing techniques and quantitative models are carefully designed to extract information from financial data efficiently. In this chapter, we present quantitative data analysis approaches in finance. First, we review the development of quantitative finance in the past decade. Then we discuss the characteristics of high-frequency data and the challenges it brings. Quantitative data analysis consists of two basic steps: (i) data cleaning and aggregation; (ii) data modeling. We review the mathematical tools and computing technologies behind the two steps. The valuable information extracted from raw data is represented by a group of statistics. The most widely used statistics in finance are expected return and volatility, which are the fundamentals of modern portfolio theory.
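The two-step pipeline described above (clean and aggregate, then extract the summary statistics of expected return and volatility) can be sketched in a few lines. This is an illustrative example, not code from the chapter: the outlier threshold, the synthetic price series, and the annualization factor of 252 trading periods are all assumptions made for the demonstration.

```python
import numpy as np

def clean_and_aggregate(prices, max_abs_ret=0.05):
    """Step (i): drop bad ticks, compute log returns, filter extreme jumps.

    max_abs_ret is an assumed threshold for discarding returns that are
    more likely data errors or jumps than genuine price moves.
    """
    prices = np.asarray(prices, dtype=float)
    prices = prices[prices > 0]                      # remove zero/negative error ticks
    rets = np.diff(np.log(prices))                   # aggregate ticks into log returns
    return rets[np.abs(rets) < max_abs_ret]          # filter extreme values

def summary_stats(rets, periods_per_year=252):
    """Step (ii): annualized expected return and volatility."""
    mu = rets.mean() * periods_per_year
    sigma = rets.std(ddof=1) * np.sqrt(periods_per_year)
    return mu, sigma

# Synthetic price path standing in for a cleaned high-frequency feed.
rng = np.random.default_rng(0)
prices = 100 * np.exp(np.cumsum(rng.normal(0.0002, 0.01, 1000)))
rets = clean_and_aggregate(prices)
mu, sigma = summary_stats(rets)
```

In practice the cleaning step is far more elaborate (exchange-specific error codes, bid-ask bounce filtering, time-of-day aggregation), but the shape of the pipeline is the same: raw ticks in, a small set of statistics out.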
We further introduce some simple portfolio optimization strategies as an example of the application of financial data analysis. Big data has already changed the financial industry fundamentally, but quantitative tools for addressing massive financial data still have a long way to go. The adoption of advanced statistics, information theory, machine learning and faster computing algorithms is inevitable in order to predict complicated financial markets. These topics are briefly discussed in the later part of this chapter.
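As one concrete instance of the "simple portfolio optimization strategies" the abstract mentions, the global minimum-variance portfolio has a closed form: given a return covariance matrix Σ, the weights are w = Σ⁻¹1 / (1ᵀΣ⁻¹1). The sketch below is a hedged illustration with simulated returns, not the chapter's own implementation.

```python
import numpy as np

def min_variance_weights(returns):
    """Global minimum-variance portfolio: w = inv(Σ) @ 1, normalized to sum to 1.

    `returns` is a (T, N) array of per-period returns for N assets.
    """
    cov = np.cov(returns, rowvar=False)   # N x N sample covariance
    inv = np.linalg.inv(cov)
    ones = np.ones(cov.shape[0])
    w = inv @ ones
    return w / w.sum()                    # normalize so weights sum to 1

# Simulated returns for 4 hypothetical assets over 500 periods.
rng = np.random.default_rng(1)
rets = rng.normal(0.001, 0.02, (500, 4))
w = min_variance_weights(rets)
```

This ignores expected returns entirely, which is what makes it "minimum variance" rather than full mean-variance optimization; adding a target return turns it into the classic Markowitz problem, typically solved with a quadratic-programming routine rather than a closed form.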