Philip Gladwin


Googling “Big Data” returns 2.1 billion hits in 0.4 seconds; that in itself is big data. According to a McKinsey study, big data is reshaping the private sector as well as government, retail, healthcare, the computing industry, and academic research. The financial industry was moving into this area well before the “big data” label was coined. In this post, we explore what it means to work with large sets of market data, and the challenge of doing that well enough to deliver best execution for the client.

The challenges of big data are enormous: the cost of storing the data securely, with the ability to recover it after a disaster; the cost of hardware with room for expansion; gathering a complete data set, without holes, that updates consistently in real time; obtaining accurate results quickly; and, finally, giving senior management something actionable. How do you avoid drowning in a sea of data, run on decrepit technology, that delivers results too slowly and that nobody trusts?

What is “Big Data,” and how big is it? Data has always been generated, but we now have the ability to record and store very high volumes securely and to query them efficiently. The data also scales very quickly. Think of a stock churning out bid and ask prices, perhaps several times a second. Multiply that by the number of tickers trading on an exchange, multiply again by the number of trading days, and repeat for every exchange. Scale that across other asset classes, and the data set suddenly becomes very large.
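
To make the scaling concrete, here is a rough back-of-envelope sketch in Python. Every figure in it is an illustrative assumption for the sake of the arithmetic, not an actual market statistic:

```python
# Illustrative back-of-envelope estimate of quote-data volume.
# All figures below are assumptions, not actual market statistics.

quote_updates_per_second = 2           # bid/ask updates per ticker (assumed)
seconds_per_trading_day = 6.5 * 3600   # a 6.5-hour trading session
tickers_per_exchange = 3_000           # listed symbols per venue (assumed)
exchanges = 20                         # venues covered (assumed)
trading_days_per_year = 252
bytes_per_quote = 64                   # timestamp, ticker, bid, ask, sizes (assumed)

quotes_per_year = (quote_updates_per_second
                   * seconds_per_trading_day
                   * tickers_per_exchange
                   * exchanges
                   * trading_days_per_year)

terabytes_per_year = quotes_per_year * bytes_per_quote / 1e12
print(f"{quotes_per_year:.2e} quotes/year, roughly {terabytes_per_year:.0f} TB/year")
```

Even with these modest assumptions, a single asset class generates hundreds of billions of quotes and tens of terabytes a year, before trades, order-book depth, or other asset classes are added.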

Manipulating the data is the next challenge. How do you deliver actionable results to a desktop PC within a few minutes when the underlying data sits in an enormous database on the other side of the planet? Big data software has advanced dramatically in recent years. Instead of the older, more time-consuming approach of writing analysis code line by line, modern tools let analysts assemble an analysis as a flow chart of connected processing steps.
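
The post does not name a specific tool, but the idea behind these flow-chart, or dataflow, styles of analysis can be sketched in Python: the computation is declared as a chain of transformations, where each step corresponds to one box in the chart. The column names (ticker, price, size) are hypothetical:

```python
# A minimal sketch of dataflow-style analysis: the work is declared as a
# chain of transformations (filter -> derive -> aggregate) rather than
# written out as explicit loops. Column names are illustrative.
import pandas as pd

def average_trade_size_by_ticker(trades: pd.DataFrame) -> pd.DataFrame:
    """Each chained step maps onto one box in a flow chart."""
    return (
        trades
        .loc[trades["size"] > 0]                            # filter out bad prints
        .assign(notional=lambda t: t["price"] * t["size"])  # derive a column
        .groupby("ticker")[["size", "notional"]]            # aggregate per ticker
        .mean()
        .rename(columns={"size": "avg_size", "notional": "avg_notional"})
    )

trades = pd.DataFrame({
    "ticker": ["ABC", "ABC", "XYZ"],
    "price":  [10.0, 10.1, 55.0],
    "size":   [100, 200, 50],
})
print(average_trade_size_by_ticker(trades))
```

Because each step is an independent transformation, the same pipeline can be re-pointed at a remote database or a distributed engine without rewriting the analysis logic.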

For an agency broker like Tradebook, this opens up a significant number of opportunities to provide more detail and analysis to help optimize trade execution, including, but not limited to:

  • Several flavors of toxicity reports
  • Transaction cost analysis that can be examined against a set of advanced benchmarks (a simple benchmark calculation is sketched after this list)
  • Bloomberg terminal functions that provide robust proprietary models and statistics to inform clients’ trading decisions (for example, AV/ LN Equity STAZ)
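
As a flavor of what a transaction-cost benchmark looks like, here is a minimal sketch of two textbook measures, slippage versus the interval VWAP and versus the arrival price. These are standard definitions with hypothetical fills, not Tradebook’s proprietary models:

```python
# Two standard transaction-cost benchmarks: slippage versus interval VWAP
# and versus the arrival price. Textbook definitions with hypothetical data.

def vwap(prices, volumes):
    """Volume-weighted average price over the measurement interval."""
    return sum(p * v for p, v in zip(prices, volumes)) / sum(volumes)

def slippage_bps(exec_price, benchmark_price, side):
    """Cost in basis points; positive is worse than the benchmark.
    side is +1 for a buy, -1 for a sell."""
    return side * (exec_price - benchmark_price) / benchmark_price * 1e4

# Hypothetical fills for one buy order, plus the market tape over the interval.
fill_prices, fill_sizes = [10.02, 10.04, 10.05], [300, 500, 200]
tape_prices, tape_sizes = [10.00, 10.03, 10.06], [5_000, 8_000, 4_000]

avg_exec = vwap(fill_prices, fill_sizes)
market_vwap = vwap(tape_prices, tape_sizes)
arrival = 10.00  # mid price when the order arrived (assumed)

print(f"vs VWAP:    {slippage_bps(avg_exec, market_vwap, +1):.1f} bps")
print(f"vs arrival: {slippage_bps(avg_exec, arrival, +1):.1f} bps")
```

Advanced benchmarks build on the same idea: compare what the order actually paid against a reference price that captures what was achievable in the market at the time.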

Big data is an exciting area for agency brokers. Bloomberg Tradebook’s approach to equity trading venues, for example, means that we can offer clients an impartial view on seeking best execution.

Philip Gladwin joined Bloomberg Tradebook in 2009 and is responsible for the Tradebook product in Europe. He specializes in data analysis and pre- and post-trade transaction cost analysis. Prior to joining Tradebook, Philip worked at Citigroup and Dresdner Bank developing automated trading models for FX and FX options. Philip holds a PhD in Astrophysics from Cardiff University.
