First push… the event store

So we went with Event Store and, so far, it's working pretty well out of the box. But I think we're already starting to deal with one of the issues that comes with event sourcing: making sure that events are processed in the right order. The problem arises when you have a few instances of the same denormalizer processing events from the store. If, say, you have 3 events published by the event store and the 2nd event in the chain takes longer to handle than the 3rd, you end up with a bad state, since the 2nd will be applied after the 3rd. There are a number of techniques that were discussed in a couple of threads with Greg Young, myself, and some others here and here.
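To make the hazard concrete, here's a minimal sketch (not using Event Store itself) that simulates two consumer instances splitting a 3-event sequence. The event names and amounts are made up for illustration; the point is just that a slow handler lets a later event overtake an earlier one:

```python
import threading
import time

applied_order = []          # the order events actually get applied
lock = threading.Lock()

# Hypothetical events; seq is the intended order.
events = [
    {"seq": 1, "type": "Deposited"},
    {"seq": 2, "type": "Withdrawn"},  # this one will be slow
    {"seq": 3, "type": "Deposited"},
]

def handle(event):
    if event["seq"] == 2:
        time.sleep(0.2)     # simulate the 2nd event taking longer than the 3rd
    with lock:
        applied_order.append(event["seq"])

# One instance handles seq 1, then two instances race on seq 2 and 3.
handle(events[0])
t1 = threading.Thread(target=handle, args=(events[1],))
t2 = threading.Thread(target=handle, args=(events[2],))
t1.start()
t2.start()
t1.join()
t2.join()

print(applied_order)        # seq 3 overtakes the slower seq 2
```

With independent consumers and no stream affinity, nothing stops event 3 from being applied before event 2, which is exactly the bad state described above.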

Event Store has a few different subscription strategies that can help. You generally want to use catch-up subscriptions so that each projection can record exactly where in the event log it last left off. Doing this, you can create new projections and have them backfill their own data. Unfortunately, these subscriptions just pull events as they come in, so competing consumers don't know which event belongs to which stream. For this, ES offers Competing Consumers as a subscription type. Basically, you name a bucket and have your projections subscribe to it by name. The server manages how streams are delivered, and if you choose the 'pinned' strategy, it will ensure that events are delivered in batches, grouped by stream id, to the same consumer, which should alleviate most of the issues that come from events being processed out of order when you have multiple instances (consumers) of the same projection.
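The idea behind the 'pinned' strategy can be sketched without Event Store at all: route every event by a stable hash of its stream id, so all events for a given stream land on the same consumer and keep their relative order. The hash below is a stand-in for whatever Event Store uses internally; the stream ids and event names are made up:

```python
from collections import defaultdict

NUM_CONSUMERS = 3

def pin(stream_id: str) -> int:
    # Stand-in for the server's pinning hash: stable, so a given
    # stream id always maps to the same consumer slot.
    return sum(stream_id.encode()) % NUM_CONSUMERS

# Interleaved events from two hypothetical streams.
events = [
    ("order-1", "Created"),
    ("order-2", "Created"),
    ("order-1", "Paid"),
    ("order-2", "Shipped"),
    ("order-1", "Shipped"),
]

# Each consumer gets its own in-order queue; events for a stream
# never split across consumers, so per-stream order is preserved.
queues = defaultdict(list)
for stream_id, event_type in events:
    queues[pin(stream_id)].append((stream_id, event_type))
```

Within one consumer's queue, a stream's events arrive in publication order, which is why pinning removes the multi-instance ordering problem for per-stream projections.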

At the moment, I'm blocked because even with the subscription set up, the events seem to be coming in out of order. I'm sure it's just a problem with the requested ordering (last first vs. first first), but I don't see a setting for that. I have this running in a different service just fine; not sure what the issue is here…
