Tech At Bloomberg

Tackling the enterprise customer’s need for speed: real-time meets its mandate

August 28, 2017

Economics 101 teaches us the concept of perfect information: markets function best when everyone has access to the same information. In this scenario, no party has an unfair competitive advantage.

This is why so many millions of hours and investment dollars go into refining market data technologies: to meet market structure mandates and to enhance market efficiency and transparency.

In a world where computer algorithms react instantaneously to market-moving news and minute price changes, fortunes can be made or lost in the blink of an eye. Speed has always been important when it comes to market data, but in today’s marketplace, aggregating and distributing diverse data sets as fast as possible has stretched the capabilities of existing network hardware and software. A paradigm shift of sorts is needed to support the data demands of modern institutional investors — even when they don’t exclusively rely on automated trading strategies.

Investors of all types need access to relevant market-moving information — a news headline about a company they’re invested in, landmark litigation, FDA approval of a new drug, or an M&A scoop. When an investor needs this information as soon as possible, speed starts with the efficiency and skill of the newsroom. The faster a reporter can get the news and publish it in a headline, the faster it gets out to the public.

The best market-moving news feeds owe a lot to the speed of the newsroom, but while their competitive advantage starts there, that’s certainly not where it ends. Speed also depends heavily on the networking technology used to deliver market-moving news, data and sentiment analysis to investors everywhere at the same time. Multicast distribution technology helps level the information playing field: it allows data providers to distribute market data from the data center to all clients simultaneously, and Bloomberg harnesses this technology to enhance the speed and agility of its news feeds.
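Bloomberg’s production distribution stack isn’t described here, but the basic mechanics of multicast (one send from the data center, simultaneous delivery to every subscribed client) can be sketched with standard UDP multicast sockets. The group address, port, TTL and payload below are purely illustrative.

```python
# Minimal sketch of one-to-many distribution over UDP multicast.
# The group address, port, TTL and payload are illustrative only and
# are not Bloomberg's actual feed parameters.
import socket
import struct

GROUP = "239.1.1.1"   # hypothetical multicast group
PORT = 5007

def publish(payload: bytes) -> None:
    """Send one datagram; every subscribed receiver gets the same packet."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 2)
    sock.sendto(payload, (GROUP, PORT))
    sock.close()

def subscribe() -> None:
    """Join the multicast group and print each headline as it arrives."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))
    membership = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, membership)
    while True:
        data, _addr = sock.recvfrom(1500)
        print(data.decode("utf-8", errors="replace"))

if __name__ == "__main__":
    publish(b"Hypothetical market-moving headline")
```

The point of the pattern is that the publisher sends each datagram once and the network replicates it to every group member, so no subscriber is served ahead of another by the sender itself.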

To enable market makers and liquidity takers to act on the same information at precisely the same time, the real-time content and the technology used to deliver it both needed to evolve. Bloomberg first produced a stripped-down, machine-readable version of news stories. Rather than the ‘heavy’ XML feed in which the content is usually delivered, this version contains only the information most essential to making a trading decision. And all of it had to fit in a single network packet, the smallest unit of information on the network.
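The arithmetic behind the single-packet constraint is straightforward: a standard 1,500-byte Ethernet MTU, minus the IPv4 and UDP headers, leaves roughly 1,472 bytes for the payload, so everything relevant to a trading decision has to be encoded within that budget. A minimal sketch of that check, using textbook header sizes and a hypothetical message:

```python
# Rough arithmetic behind the single-packet constraint: a standard
# 1,500-byte Ethernet MTU minus a 20-byte IPv4 header and an 8-byte UDP
# header leaves 1,472 bytes for the application payload. The numbers are
# the usual textbook values; the message itself is hypothetical.
MTU = 1500
IPV4_HEADER = 20
UDP_HEADER = 8
MAX_PAYLOAD = MTU - IPV4_HEADER - UDP_HEADER  # 1,472 bytes

def fits_in_one_packet(encoded_message: bytes) -> bool:
    """True if the encoded headline can ride in a single UDP datagram."""
    return len(encoded_message) <= MAX_PAYLOAD

print(fits_in_one_packet(b"x" * 600))    # True: the whole story arrives at once
print(fits_in_one_packet(b"x" * 2000))   # False: would fragment across packets
```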

“Imagine if you’re a client and there’s an important story that requires multiple packets of data to transmit. They can’t trade on it until they receive the last packet. For them, this could severely impact their trading,” says Ali Mohsin, a product manager with Bloomberg’s Event-Driven Feeds team. “We had to strip any redundant information out. Any static reference content was removed so that only content relevant for trading was published.”

The team had to develop a whole new approach to move the data quickly through the Bloomberg infrastructure, including writing the libraries necessary to encode and decode the information stream. The goal was to create a message packet that contained this information, but was also lightweight and optimized for the customer’s systems.
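The article doesn’t specify the wire format those libraries implement, but the idea of a purpose-built encoder and decoder for a lightweight message can be sketched with a fixed binary layout; every field name and width below is an assumption made for illustration, not Bloomberg’s actual protocol.

```python
# Hypothetical fixed binary layout for a stripped-down headline message:
# 8-byte event id, 8-byte publish timestamp in nanoseconds, 2-byte flag
# bitmask, 2-byte headline length, then the UTF-8 headline bytes.
# This is an illustrative format, not Bloomberg's wire protocol.
import struct
import time

HEADER = struct.Struct("!QQHH")  # network byte order, 20-byte header

def encode(event_id: int, flags: int, headline: str) -> bytes:
    body = headline.encode("utf-8")
    return HEADER.pack(event_id, time.time_ns(), flags, len(body)) + body

def decode(packet: bytes) -> dict:
    event_id, ts_ns, flags, length = HEADER.unpack_from(packet)
    headline = packet[HEADER.size:HEADER.size + length].decode("utf-8")
    return {"event_id": event_id, "ts_ns": ts_ns, "flags": flags, "headline": headline}

message = encode(42, 0b0101, "Hypothetical M&A scoop headline")
print(len(message), decode(message))  # a few dozen bytes, well under one packet
```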

The team also added flags to label packets that contain notable news, another request from clients. The flags include notifications for Bloomberg News stories tagged as scoops, for exclusive stories on M&A deals, and for all hot headlines (the headlines that appear on a flashing red background on the Bloomberg Terminal).
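The article doesn’t say how these labels are encoded; one common approach is a small bitmask carried in the packet, sketched here with hypothetical flag names matching the categories above.

```python
# Hypothetical bitmask for the notable-news labels described above:
# scoops, M&A exclusives and hot (red) headlines. A single small integer
# in the packet can carry any combination of them.
from enum import IntFlag

class HeadlineFlags(IntFlag):
    SCOOP = 0x01          # Bloomberg News story tagged as a scoop
    MA_EXCLUSIVE = 0x02   # exclusive story on an M&A deal
    HOT = 0x04            # hot headline (red background on the Terminal)

flags = HeadlineFlags.SCOOP | HeadlineFlags.HOT
print(bool(flags & HeadlineFlags.HOT))           # True: a client can filter on hot headlines
print(bool(flags & HeadlineFlags.MA_EXCLUSIVE))  # False
```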

Clients connect to Bloomberg’s feed at the Equinix Secaucus campus. This location was selected because all the major exchanges and trading firms have a presence there, Mohsin said.

The team got the feed up and running in a beta environment in less than six months. Tom Mineo, an engineering lead on the team, said the new Event-Driven Feeds are faster than previous products by an average of eight to eleven milliseconds. Even the low end of that range, eight milliseconds, is 1/125th of a second, or about the amount of time it takes a typical camera’s shutter to open and close.

“When we set out to do this, it was clear it had to be our fastest product 100 percent of the time,” says Mineo. “Because we knew we wouldn’t have performance data until it was too late to turn back, we needed to scrutinize every byte and every hop through the infrastructure during the design and implementation phases.”

To ensure this was successful, the team built monitoring tools to track feed performance in parallel with developing the feed itself. This component of the design was every bit as important as everything else in the system. The team set up servers in the same space – as if they were a client – to compare every message received on the new feed with the messages that appear in the existing infrastructure. This generates detailed performance metrics for every packet that passes through the system. “We wanted to be the most demanding customer of our own feed,” he said. The servers track every aspect of the feed’s behavior and are programmed to set off alarms when anything unexpected happens.
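The monitoring design (receiving the feed exactly as a client would and comparing it against the existing infrastructure) can be sketched as matching messages across the two feeds and measuring the arrival-time difference; the field names and the alarm threshold below are assumptions for illustration, not Bloomberg’s actual tooling.

```python
# Sketch of client-perspective monitoring: each message is timestamped
# when it arrives on the new feed, matched by id when it shows up on the
# existing infrastructure, and the lead is recorded. The field names and
# the 1 ms alarm floor are assumptions, not Bloomberg's actual thresholds.
import time

MIN_EXPECTED_LEAD_NS = 1_000_000  # hypothetical floor on the new feed's lead (1 ms)

pending = {}  # message id -> arrival time on the new feed (nanoseconds)

def on_new_feed(msg_id: str) -> None:
    pending[msg_id] = time.monotonic_ns()

def on_existing_feed(msg_id: str) -> None:
    new_feed_arrival = pending.pop(msg_id, None)
    if new_feed_arrival is None:
        alert(f"{msg_id}: seen on the existing feed but never on the new feed")
        return
    lead_ns = time.monotonic_ns() - new_feed_arrival
    record_metric(msg_id, lead_ns)
    if lead_ns < MIN_EXPECTED_LEAD_NS:
        alert(f"{msg_id}: new feed led by only {lead_ns / 1e6:.3f} ms")

def record_metric(msg_id: str, lead_ns: int) -> None:
    print(f"{msg_id} lead={lead_ns / 1e6:.3f} ms")

def alert(message: str) -> None:
    print(f"ALARM: {message}")

on_new_feed("HL-123")
on_existing_feed("HL-123")
```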

The monitoring is so thorough and precise that it has exposed potential issues before the standard monitoring tools already in place detect them. For example, this added resolution has led to capacity upgrades and route optimizations, Mineo shared.

Adopting multicast technology and testing it in this fashion exemplifies Bloomberg’s dedication to being agile and meeting the evolving needs of its enterprise data customers. When every millisecond counts, technologists must revisit how content is created, formatted and distributed in order to maximize its quality, performance and value.