We will soon be formally announcing the next major version of the Apama product, but I thought I would give a bit of a sneak preview here. We've added a number of significant enhancements to the product; one in particular that I want to highlight is a new parallel execution model. Our CEP engine's new parallelism allows massive vertical scalability on multi-core hardware: we can leverage 8-way, 16-way, even 32-way processors with ease. We've enabled this capability by extending the Apama EPL with four new keywords. Explaining it requires a bit of background ...
In the Apama EPL, we have the notion of a “sub-monitor”, which can be thought of as a separate, parameterized “instance” of a set of business logic that encompasses a scenario, for example a trading strategy. Each sub-monitor can load and maintain its own data (e.g. consider a sub-monitor as corresponding to a different occurrence, with different timeouts, key indicators, client data, etc.) and can set up its own listeners to key into a (different or overlapping) subset of the event stream. This allows us to easily model factory-like behavior, with each sub-monitor maintaining its own state and possibly listening for different (sequences of) events, but applying the same basic logic, including the actions to take when an event condition becomes true. We call this our micro-threading or mThread architecture.
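As a rough illustration, here is a minimal EPL sketch of the mThread factory pattern; the event, monitor and action names are illustrative rather than taken from a real application:

    event Tick {
        string symbol;
        float price;
    }

    monitor PriceWatcher {
        action onload() {
            // each spawn creates an independent sub-monitor (mThread)
            // with its own copy of state and its own listeners
            spawn watch("IBM", 100.0);
            spawn watch("MSFT", 30.0);
        }

        action watch(string symbol, float threshold) {
            // this listener only matches ticks for this instance's symbol
            on all Tick(symbol=symbol) as t {
                if t.price > threshold {
                    // same logic for every instance, different parameters and state
                    log "Threshold crossed for " + symbol at INFO;
                }
            }
        }
    }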
In the latest Apama release, v4.1, we extended this and introduced the notion of contexts. These are silos of execution which take the factory model to the next level. “context” is a reserved word in Apama 4.1; it defines a collection of sub-monitors which can inter-operate in a protected address space, with strong semantics for inter-context communication (via event passing), similar in concept to the Erlang message-passing programming model. Most importantly, it is also the unit of parallelization, allowing the same CEP engine to spawn multiple “contexts” which key into the event flow but execute in parallel on multi-core architectures.
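To give a feel for the model, here is a minimal sketch. The event name is illustrative, and the exact statement used to pass events between contexts has varied across releases, so treat the syntax as indicative rather than definitive:

    event StartSignal {}

    monitor ContextDemo {
        action onload() {
            // create a named context: a private silo that the correlator
            // is free to schedule on another core
            context worker := context("worker");
            spawn run() to worker;
            // contexts communicate only by passing events
            send StartSignal() to worker;
        }

        action run() {
            // everything in here executes inside the "worker" context
            on StartSignal() {
                log "Running in parallel" at INFO;
            }
        }
    }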
Contexts in Apama's EPL adhere to our language's basic event paradigm, providing safe semantics for concurrency while avoiding the race conditions typical of multi-threaded programming in other languages (e.g. Java), which requires the use of mutexes, semaphores, etc.
"The world IS concurrent. It IS parallel. Things happen all over the place at the same time. I could not drive my car on the highway if I did not intuitively understand the notion of concurrency; pure message-passing concurrency is what we do all the time."
A great quote, and one that we've taken to heart in the design of the parallelism now available in our EPL. Our approach is based on a deep understanding of the types of applications being built with the Apama platform, an advantage our broad customer base gives us. For example, we took our Smart Order Router Solution Accelerator, enhanced it to use the context model, and ran a performance benchmark: it achieved a 6.5-times increase in overall capacity on an 8-core server while holding to a steady low-latency threshold (notice that we also improved overall performance in v4.1 over previous versions).
This is a graph that compares the capacity (the number of concurrent open orders that can be processed within a specific timescale) of the Equities Smart Order Router for an increasing number of symbols, across three versions of the Apama product. In the parallel version we modified the SOR to partition symbols across contexts (each context running on a processor core). The machine used for the experiment was an 8-core Intel server.
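The partitioning itself can be expressed quite compactly. The following is a hypothetical sketch of the idea, not the actual SOR code; the event names, the fixed count of eight workers, and the event-passing statement are all assumptions for illustration:

    event Order {
        string symbol;
        integer qty;
    }

    monitor SymbolPartitioner {
        sequence<context> workers;            // one context per core
        dictionary<string, context> owner;    // symbol -> owning context
        integer next := 0;

        action onload() {
            integer i := 0;
            while i < 8 {                     // 8 cores assumed
                context c := context("worker" + i.toString());
                spawn route() to c;
                workers.append(c);
                i := i + 1;
            }
            on all Order() as o {
                // pin each symbol to one context so its state never needs locking
                if not owner.hasKey(o.symbol) {
                    owner[o.symbol] := workers[next];
                    next := next + 1;
                    if next = workers.size() { next := 0; }
                }
                send o to owner[o.symbol];    // event passing between contexts
            }
        }

        action route() {
            on all Order() as o {
                // per-symbol routing logic runs here, in parallel with other contexts
            }
        }
    }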
Apama's new, enhanced parallel execution model is a reflection of how our customers use CEP technology to build real-world applications. The competitive landscape of CEP dwells on performance, with wild claims of speed often made without substance. It reminds me of a teenager revving their car's engine at a traffic light: you can see the needle on the tachometer race up and the engine makes a lot of noise, but to what purpose? The new release of the Apama CEP platform shows that we know how, and when, to engage the transmission.