Tuesday, November 26, 2013

Streaming Map/Reduce on Wall Street

I’ve been building event processing systems for over 20 years and am very familiar with the issues companies run into when deploying them. Over the last decade, I’ve focused on Complex Event Processing (CEP), bringing one of the first CEP engines to market in 2005 and subsequently working with many of the top vendors and products in the space.

One problem with most of these products back in 2009, when I got started with Darkstar, was that CEP engines just didn’t scale. I wanted to get past that limitation – I wanted something that could sift through the entire Internet in real time. My specific use case was the Consolidated Audit Trail for the SEC – something they could use to look for patterns in real time, throughout the day.

I wanted to build a system that could handle millions of events/messages per second, scan those data streams for patterns, persist the data, and make that data *immediately* available for subsequent query. There was nothing like this on the market at that time. Most people said it couldn’t be done. What I needed was a business partner who’d believe in me and my team.

NYSE Technologies wanted to start offering a post-trade surveillance system but didn’t want to stand up a separate set of machines and systems for each client. They wanted to uncouple revenue growth from expense, which made it a perfect opportunity to build a clustered solution that could scale simply by adding hardware. The problem with many systems is that in order to scale, they have to be re-architected. I wanted to avoid that.

The answer came in the form of mixing CEP and Hadoop. We needed a way to distribute queries across the cluster and then reassemble the results, which seemed like a perfect use case for map/reduce.
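To make the split concrete, here is a minimal sketch in Python (not the actual Darkstar or NYSE Technologies code) of that map/reduce idea: each worker scans its own partition of the event stream for a pattern, and a reduce step reassembles the partial results into one answer. The event fields and the rapid-cancel pattern are hypothetical, purely for illustration.

# Hypothetical sketch of distributing a pattern-scan query and merging results.
from collections import Counter
from multiprocessing import Pool

def map_scan(partition):
    """Map step: scan one partition of the event stream for a pattern
    (here, orders cancelled within 1 second) and emit partial counts by symbol."""
    counts = Counter()
    for event in partition:
        if event["type"] == "cancel" and event["age_ms"] < 1000:
            counts[event["symbol"]] += 1
    return counts

def reduce_merge(partials):
    """Reduce step: reassemble the partial results from every worker
    into a single answer for the query."""
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

if __name__ == "__main__":
    # Toy stream split into partitions, standing in for data spread across a cluster.
    partitions = [
        [{"type": "cancel", "symbol": "IBM",  "age_ms": 450},
         {"type": "new",    "symbol": "IBM",  "age_ms": 0}],
        [{"type": "cancel", "symbol": "MSFT", "age_ms": 300},
         {"type": "cancel", "symbol": "IBM",  "age_ms": 120}],
    ]
    with Pool() as pool:
        partial_results = pool.map(map_scan, partitions)
    print(reduce_merge(partial_results))  # Counter({'IBM': 2, 'MSFT': 1})

In a real deployment the partitions would live on the cluster nodes and the map step would run where the data sits, but the shape of the problem – scan locally, merge centrally – is the same.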
