Tutorial, practical samples and other resources about Event Sourcing in .NET. See also my similar repositories for JVM and NodeJS.
Event Sourcing is a design pattern in which results of business operations are stored as a series of events.
It is an alternative way to persist data. In contrast with state-oriented persistence that only keeps the latest version of the entity state, Event Sourcing stores each state change as a separate event.
Thanks to that, no business data is lost. Each operation results in an event stored in the database. That enables extended auditing and diagnostics capabilities (both technical and business-wise). What's more, as events contain the business context, they allow broad business analysis and reporting.
In this repository I'm showing different aspects and patterns around Event Sourcing, from basic to advanced practices.
Read more in my articles:
Events represent facts in the past. They carry information about something that was accomplished. They should be named in the past tense, e.g. "user added", "order confirmed". Events are not directed to a specific recipient - they're broadcast information. It's like telling a story at a party. We hope that someone listens to us, but we may quickly realise that no one is paying attention.
Events:
Read more in my articles:
Events are logically grouped into streams. In Event Sourcing, streams are the representation of the entities. All the entity state mutations end up as the persisted events. Entity state is retrieved by reading all the stream events and applying them one by one in the order of appearance.
A stream should have a unique identifier representing the specific object. Each event has its own unique position within a stream. This position is usually represented by a numeric, incremental value. This number can be used to define the order of the events while retrieving the state. It can also be used to detect concurrency issues.
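As a sketch of how the stream position can be used (the `EventEnvelope` record and helper class below are illustrative, not a real event store API): ordering by position lets us rebuild the state deterministically, and comparing the caller's expected position against the actual one detects concurrent writes.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical envelope: each event records its stream and its position within it.
public record EventEnvelope(string StreamId, long StreamPosition, object Data);

public static class StreamPositionChecks
{
    // Events are replayed in stream-position order when rebuilding the state.
    public static IReadOnlyList<EventEnvelope> Ordered(IEnumerable<EventEnvelope> events) =>
        events.OrderBy(e => e.StreamPosition).ToList();

    // Optimistic concurrency: with positions numbered 1..n, appending succeeds
    // only if the caller has seen the latest event (expectedPosition == n).
    public static bool CanAppend(IReadOnlyList<EventEnvelope> stream, long expectedPosition) =>
        stream.Count == expectedPosition;
}
```

If two clients both read the stream at position 2 and then both try to append, the second append fails the check, signalling a concurrency conflict to retry or merge.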
Technically, events are messages.
They may be represented, e.g. in JSON, Binary, or XML format. Besides the data, they usually contain:
- id,
- type,
- stream id,
- stream position,
- timestamp,
- metadata (e.g. correlation id, causation id).

A sample event JSON can look like:
{
"id": "e44f813c-1a2f-4747-aed5-086805c6450e",
"type": "invoice-issued",
"streamId": "INV/2021/11/01",
"streamPosition": 1,
"timestamp": "2021-11-01T00:05:32.000Z",
"data":
{
"issuedTo": {
"name": "Oscar the Grouch",
"address": "123 Sesame Street"
},
"amount": 34.12,
"number": "INV/2021/11/01",
"issuedAt": "2021-11-01T00:05:32.000Z"
},
"metadata":
{
"correlationId": "1fecc92e-3197-4191-b929-bd306e1110a4",
"causationId": "c3cf07e8-9f2f-4c2d-a8e9-f8a612b4a7f1"
}
}
Read more in my articles:
Event Sourcing is not related to any particular type of storage implementation. As long as it fulfills the assumptions, it can be implemented with any backing database (relational, document, etc.). The state has to be represented by an append-only log of events. The events are stored in chronological order, and each new event is appended after the previous one. Event Stores are the category of databases explicitly designed for this purpose.
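To make the append-only log idea concrete, here's a minimal, illustrative in-memory sketch (not a production event store - real ones add durability, subscriptions, and concurrency checks on top):

```csharp
using System;
using System.Collections.Generic;

// A minimal in-memory event store: each stream is an append-only list of events.
public class InMemoryEventStore
{
    private readonly Dictionary<string, List<object>> streams = new();

    public void Append(string streamId, object @event)
    {
        if (!streams.TryGetValue(streamId, out var stream))
            streams[streamId] = stream = new List<object>();

        // Events are only ever appended - never updated or deleted.
        stream.Add(@event);
    }

    public IReadOnlyList<object> Read(string streamId) =>
        streams.TryGetValue(streamId, out var stream) ? stream : Array.Empty<object>();
}
```

Reading a stream returns the events in the order they were appended, which is exactly what the state-rebuilding process described below relies on.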
Read more in my articles:
In Event Sourcing, the state is stored in events. Events are logically grouped into streams. Streams can be thought of as the entities' representation. Traditionally (e.g. in relational or document approach), each entity is stored as a separate record.
Id | IssuerName | IssuerAddress | Amount | Number | IssuedAt |
---|---|---|---|---|---|
e44f813c | Oscar the Grouch | 123 Sesame Street | 34.12 | INV/2021/11/01 | 2021-11-01 |
In Event Sourcing, the entity is stored as the series of events that happened for this specific object, e.g. `InvoiceInitiated`, `InvoiceIssued`, `InvoiceSent`.
[
{
"id": "e44f813c-1a2f-4747-aed5-086805c6450e",
"type": "invoice-initiated",
"streamId": "INV/2021/11/01",
"streamPosition": 1,
"timestamp": "2021-11-01T00:05:32.000Z",
"data":
{
"issuer": {
"name": "Oscar the Grouch",
"address": "123 Sesame Street",
},
"amount": 34.12,
"number": "INV/2021/11/01",
"initiatedAt": "2021-11-01T00:05:32.000Z"
}
},
{
"id": "5421d67d-d0fe-4c4c-b232-ff284810fb59",
"type": "invoice-issued",
"streamId": "INV/2021/11/01",
"streamPosition": 2,
"timestamp": "2021-11-01T00:11:32.000Z",
"data":
{
"issuedTo": "Cookie Monster",
"issuedAt": "2021-11-01T00:11:32.000Z"
}
},
{
"id": "637cfe0f-ed38-4595-8b17-2534cc706abf",
"type": "invoice-sent",
"streamId": "INV/2021/11/01",
"streamPosition": 3,
"timestamp": "2021-11-01T00:12:01.000Z",
"data":
{
"sentVia": "email",
"sentAt": "2021-11-01T00:12:01.000Z"
}
}
]
All of those events share the stream id ("streamId": "INV/2021/11/01"), and have incrementing stream positions.
In Event Sourcing each entity is represented by its stream: the sequence of events correlated by the stream id ordered by stream position.
To get the current state of an entity, we need to perform the stream aggregation process. We're translating the set of events into a single entity. This can be done with the following steps:
1. Read all events for the specific stream.
2. Order them ascending in the order of appearance (by the event's stream position).
3. Construct an empty object of the entity type (e.g. with the default constructor).
4. Apply each event on the entity.

This process is also called state rehydration.
We could implement that as:
public record Person(
string Name,
string Address
);
public record InvoiceInitiated(
double Amount,
string Number,
Person IssuedTo,
DateTime InitiatedAt
);
public record InvoiceIssued(
string IssuedBy,
DateTime IssuedAt
);
public enum InvoiceSendMethod
{
Email,
Post
}
public record InvoiceSent(
InvoiceSendMethod SentVia,
DateTime SentAt
);
public enum InvoiceStatus
{
Initiated = 1,
Issued = 2,
Sent = 3
}
public class Invoice
{
public string Id { get; set; }
public double Amount { get; private set; }
public string Number { get; private set; }
public InvoiceStatus Status { get; private set; }
public Person IssuedTo { get; private set; }
public DateTime InitiatedAt { get; private set; }
public string IssuedBy { get; private set; }
public DateTime IssuedAt { get; private set; }
public InvoiceSendMethod SentVia { get; private set; }
public DateTime SentAt { get; private set; }
public void Evolve(object @event)
{
switch (@event)
{
case InvoiceInitiated invoiceInitiated:
Apply(invoiceInitiated);
break;
case InvoiceIssued invoiceIssued:
Apply(invoiceIssued);
break;
case InvoiceSent invoiceSent:
Apply(invoiceSent);
break;
}
}
private void Apply(InvoiceInitiated @event)
{
Id = @event.Number;
Amount = @event.Amount;
Number = @event.Number;
IssuedTo = @event.IssuedTo;
InitiatedAt = @event.InitiatedAt;
Status = InvoiceStatus.Initiated;
}
private void Apply(InvoiceIssued @event)
{
IssuedBy = @event.IssuedBy;
IssuedAt = @event.IssuedAt;
Status = InvoiceStatus.Issued;
}
private void Apply(InvoiceSent @event)
{
SentVia = @event.SentVia;
SentAt = @event.SentAt;
Status = InvoiceStatus.Sent;
}
}
and use it as:
var invoiceInitiated = new InvoiceInitiated(
34.12,
"INV/2021/11/01",
new Person("Oscar the Grouch", "123 Sesame Street"),
DateTime.UtcNow
);
var invoiceIssued = new InvoiceIssued(
"Cookie Monster",
DateTime.UtcNow
);
var invoiceSent = new InvoiceSent(
InvoiceSendMethod.Email,
DateTime.UtcNow
);
// 1,2. Get all events and sort them in the order of appearance
var events = new object[] {invoiceInitiated, invoiceIssued, invoiceSent};
// 3. Construct empty Invoice object
var invoice = new Invoice();
// 4. Apply each event on the entity.
foreach (var @event in events)
{
invoice.Evolve(@event);
}
and generalise this into an `Aggregate` base class:
public abstract class Aggregate<T>
{
public T Id { get; protected set; }
protected Aggregate() { }
public virtual void Evolve(object @event) { }
}
The biggest advantage of "online" stream aggregation is that it always uses the most recent business logic. So after a change in the apply method, it's automatically reflected on the next run. If the events' data is correct, no migration or update is needed.
In Marten, the `Evolve` method is not needed. Marten uses a naming convention and calls the `Apply` method internally. The method has to:
- be public,
- take a single parameter with the event type,
- have `void` type as the result.

See samples:
Read more in my article:
Strongly typed ids (or, in general, a proper type system) can make your code more predictable. They reduce the chance of trivial mistakes, like accidentally swapping parameters of the same primitive type.
So for such code:
var reservationId = "RES/01";
var seatId = "SEAT/22";
var customerId = "CUS/291";
var reservation = new Reservation(
reservationId,
seatId,
customerId
);
the compiler won't catch it if you switch `reservationId` with `seatId`.
If you use strongly typed ids, then the compiler will catch that issue:
var reservationId = new ReservationId("RES/01");
var seatId = new SeatId("SEAT/22");
var customerId = new CustomerId("CUS/291");
var reservation = new Reservation(
reservationId,
seatId,
customerId
);
They're not ideal, as they usually don't play well with storage engines. Typical issues are serialisation, LINQ queries, etc. In some cases they may just be overkill. You need to pick your poison.
To reduce tedious, copy/paste code, it's worth defining a strongly-typed id base class, like:
public class StronglyTypedValue<T>: IEquatable<StronglyTypedValue<T>> where T: IComparable<T>
{
public T Value { get; }
public StronglyTypedValue(T value)
{
Value = value;
}
public bool Equals(StronglyTypedValue<T>? other)
{
if (ReferenceEquals(null, other)) return false;
if (ReferenceEquals(this, other)) return true;
return EqualityComparer<T>.Default.Equals(Value, other.Value);
}
public override bool Equals(object? obj)
{
if (ReferenceEquals(null, obj)) return false;
if (ReferenceEquals(this, obj)) return true;
if (obj.GetType() != this.GetType()) return false;
return Equals((StronglyTypedValue<T>)obj);
}
public override int GetHashCode()
{
return EqualityComparer<T>.Default.GetHashCode(Value);
}
public static bool operator ==(StronglyTypedValue<T>? left, StronglyTypedValue<T>? right)
{
return Equals(left, right);
}
public static bool operator !=(StronglyTypedValue<T>? left, StronglyTypedValue<T>? right)
{
return !Equals(left, right);
}
}
Then you can define specific id class as:
public class ReservationId: StronglyTypedValue<Guid>
{
public ReservationId(Guid value) : base(value)
{
}
}
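Serialisation is one of the typical pain points of strongly typed ids (as noted above). One possible way to handle it with System.Text.Json is a custom converter that stores just the inner value - a sketch, with simplified versions of the classes repeated for self-containment and a hypothetical converter name:

```csharp
using System;
using System.Text.Json;
using System.Text.Json.Serialization;

// Simplified versions of the classes defined above, repeated for self-containment.
public class StronglyTypedValue<T> where T : IComparable<T>
{
    public T Value { get; }
    public StronglyTypedValue(T value) => Value = value;
}

public class ReservationId : StronglyTypedValue<Guid>
{
    public ReservationId(Guid value) : base(value) { }
}

// The converter serialises the id as its raw inner value instead of a nested
// object, so the stored JSON stays flat and easy to query.
public class ReservationIdJsonConverter : JsonConverter<ReservationId>
{
    public override ReservationId Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options) =>
        new(reader.GetGuid());

    public override void Write(Utf8JsonWriter writer, ReservationId value, JsonSerializerOptions options) =>
        writer.WriteStringValue(value.Value);
}
```

Registering the converter in `JsonSerializerOptions.Converters` makes the id round-trip as a plain GUID string.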
You can even add additional rules:
public class ReservationNumber: StronglyTypedValue<string>
{
public ReservationNumber(string value) : base(value)
{
if (string.IsNullOrEmpty(value) || !value.StartsWith("RES/") || value.Length <= 4)
throw new ArgumentOutOfRangeException(nameof(value));
}
}
A base class that works with Marten can be defined as:
public abstract class Aggregate<TKey, T>
where TKey: StronglyTypedValue<T>
where T : IComparable<T>
{
public TKey Id { get; set; } = default!;
[Identity]
public T AggregateId {
get => Id.Value;
set {}
}
public int Version { get; protected set; }
[JsonIgnore] private readonly Queue<object> uncommittedEvents = new();
public object[] DequeueUncommittedEvents()
{
var dequeuedEvents = uncommittedEvents.ToArray();
uncommittedEvents.Clear();
return dequeuedEvents;
}
protected void Enqueue(object @event)
{
uncommittedEvents.Enqueue(@event);
}
}
Marten requires the id to have a public getter and setter of type `string` or `Guid`. We used a trick and added an `AggregateId` property with the strongly-typed `Id` as its backing field. We also marked it with the `Identity` attribute to tell Marten to use this field in its internals.
An example aggregate can look like:
public class Reservation : Aggregate<ReservationId, Guid>
{
public CustomerId CustomerId { get; private set; } = default!;
public SeatId SeatId { get; private set; } = default!;
public ReservationNumber Number { get; private set; } = default!;
public ReservationStatus Status { get; private set; }
public static Reservation CreateTentative(
SeatId seatId,
CustomerId customerId)
{
return new Reservation(
new ReservationId(Guid.NewGuid()),
seatId,
customerId,
new ReservationNumber(Guid.NewGuid().ToString())
);
}
// (...)
}
See the full sample here.
Read more in the article:
Feel free to create an issue if you have any questions or request for more explanation or samples. I also take Pull Requests!
If this repository helped you, I'd be more than happy if you joined the group of my official supporters at:
GitHub Sponsors
Starring this repo on GitHub or sharing it with your friends will also help!
For running the Event Store examples you need to have Docker installed. Then go to the `docker` folder and run: `docker compose --profile all up`.
You can find more information about using .NET, WebApi and Docker in my other tutorial: WebApi with .NET.
See also fully working, real-world samples of Event Sourcing and CQRS applications in Samples folder.
Samples are using CQRS architecture. They're sliced based on the business modules and operations. Read more about the assumptions in "How to slice the codebase effectively?".
Some of the samples use helpers like `WriteToAggregate` and `AggregateStream` to simplify the processing, as well as subscriptions to the `$all` stream. One sample is a variation of the previous example.
One sample shows how to handle basic event schema versioning scenarios using event and stream transformations (e.g. upcasting). Another shows how to compose event handlers into processing pipelines.
I prepared self-paced training kits for Event Sourcing. See more in the Workshop description.
Event Sourcing is perceived as a complex pattern. Some believe that it's like Nessie: everyone's heard of it, but hardly anyone has seen it. In fact, Event Sourcing is a pretty practical and straightforward concept. It helps build predictable applications closer to business. Nowadays, storage is cheap, and information is priceless. In Event Sourcing, no data is lost.
The workshop aims to build the knowledge of the general concept and its related patterns for the participants. The acquired knowledge will allow for the conscious design of architectural solutions and the analysis of associated risks.
The emphasis will be on a pragmatic understanding of architectures and applying it in practice using Marten and EventStoreDB.
You can do the workshop as a self-paced kit. That should give you a good foundation for starting your journey with Event Sourcing and learning tools like Marten and EventStoreDB. If you'd like to get full coverage with all nuances of the private workshop, feel free to contact me via email.
It teaches the event store basics by showing how to build your own Event Store on top of a relational database. It starts with the tables setup, goes through appending events, aggregations, projections, snapshots, and finishes with the Marten basics.
Read also more on the Event Sourcing and CQRS topics in my blog posts:
var streamId = Guid.NewGuid();
documentSession.Events.StartStream<IssuesList>(streamId);
var @event = new IssueCreated { IssueId = Guid.NewGuid(), Description = "Description" };
var streamId = documentSession.Events.StartStream<IssuesList>(@event);
var @event = new IssueCreated { IssueId = Guid.NewGuid(), Description = "Description" };
var streamId = Guid.NewGuid();
documentSession.Events.Append(streamId, @event);
var eventsList = documentSession.Events.FetchStream(streamId);
var @event = documentSession.Events.Load<IssueCreated>(eventId);
var dateTime = new DateTime(2017, 1, 11);
var events = documentSession.Events.FetchStream(streamId, timestamp: dateTime);
var versionNumber = 3;
var events = documentSession.Events.FetchStream(streamId, version: versionNumber);
var onlineAggregation = documentSession.Events.AggregateStream<TEntity>(streamId);
documentSession.Store<TEntity>(onlineAggregation);
documentSession.SaveChanges();
I gathered and generalised all of the practices used in this tutorial and its samples in NuGet packages that I maintain as the GoldenEye Framework. See more in:
GoldenEye DDD package - it provides a set of base and bootstrap classes that help you reduce boilerplate code and focus on writing business code. You can find classes like Commands/Queries/Event handlers and many more. To use it run:
dotnet add package GoldenEye
GoldenEye Marten package - contains helpers, and abstractions to use Marten as document/event store. Gives you abstractions like repositories etc. To use it run:
dotnet add package GoldenEye.Marten
If you're interested in Architecture resources, check my other repository: https://github.com/oskardudycz/ArchitectureWeekly/.
It contains a weekly updated list of materials I found valuable and educational.
This blog is licensed under the Creative Commons BY-SA 4.0 license.
EventSourcing.NetCore is Copyright Β© 2017-2022 Oskar Dudycz and other contributors under the MIT license.