Sunday, January 27, 2013

Organizing commands and queries

In the last few posts I settled on an architecture for handling commands and queries. A byproduct of the described approach is that your codebase quickly racks up plenty of little classes: a class to hold the data, and a handler to act on that data, for each use case.

There are a few ways you can go about organizing things.

Everything in one location

When there is very little going on in your application, you can just dump everything in one location without getting hurt too much.

Folder per functionality

When your application grows, and the cohesion between different use cases becomes obvious, you can use folders (and corresponding namespaces) to organize your commands and queries, and to draw their functional boundaries. Uncle Bob and Mark Needham have sold me on structuring my code based on functionality instead of technical concepts.
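As a quick, hypothetical sketch (the feature names and classes below are made up for illustration), organizing by functionality rather than by technical concept could translate to folders and matching namespaces like these.
// Folder: Subscriptions
namespace App.Subscriptions
{
    public class SubscribeCommand { }

    // the handler for a use case lives right next to its data
    public class SubscribeCommandHandler
    {
        public void Handle(SubscribeCommand command) { /* ... */ }
    }
}

// Folder: Orders
namespace App.Orders
{
    public class ConfirmOrderCommand { }

    public class ConfirmOrderCommandHandler
    {
        public void Handle(ConfirmOrderCommand command) { /* ... */ }
    }
}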


Keeping each command handler in a separate class is especially interesting when the handlers are rather bulky and contain a good amount of logic. You can think of each class as a little piece of functionality in itself. Take a look at the RavenDB codebase to get an idea of what that could look like.

This way also seems to bring your code closer to the Solution Explorer: just one double-click and you are looking at the implementation; no scrolling or searching between method definitions necessary. Maybe the problem just shifts a level higher in the hierarchy though.

Composing a service class

You can also opt to group command handler implementations in one service class. This variation might make more sense when your implementations are rather skinny, and don't do much more than translate and forward the invocation.
public class SubscriptionService : 
        ICommandHandler<SubscribeCommand>,
        ICommandHandler<UnsubscribeCommand>
{
    public void Handle(SubscribeCommand command)
    {
        throw new NotImplementedException();
    }

    public void Handle(UnsubscribeCommand command)
    {
        throw new NotImplementedException();
    }
}
Use the hints your dependency graph gives you to find a composition that makes sense.

How do you go about organizing commands and queries?

Thursday, January 24, 2013

RavenDB: Drop all collections

I never stub or mock the database when I'm using RavenDB. Generally, I use an embedded document store running in memory, and initialize a new instance for every test. However, I like to run some stress tests against a real instance, and there I found myself wanting to wipe the state left behind by previous tests, without having to create a new database (which is rather slow).

First I create the default DocumentsByEntityName index to make sure it's there; it normally only gets created when you open the studio for the first time. Then I use one of the advanced database commands, DeleteByIndex, querying for all tags.
using (var session = _documentStore.OpenSession())
{
    new RavenDocumentsByEntityName().Execute(_documentStore);
    session.Advanced.DatabaseCommands.DeleteByIndex(
            "Raven/DocumentsByEntityName",
            new IndexQuery { Query = "Tag: *" });                
}
This technique doesn't seem to be widely used judging by the first page of Google search results. If there is a reason for that though, let me know!

Sunday, January 20, 2013

Separating command data from logic and sending it on a bus

In my first post on this topic, I started out with an attempt to limit abstractions to solely commands and queries. Commands and queries were self-contained and could be invoked by passing them to a context-providing generic handler. The drawback of this approach was that it made constructor dependency injection impossible. In a follow-up post, I separated data from logic, but never got around to writing a dispatcher that associates command data with its handler. Last week, I revisited the first approach and added an unconventional way of injecting dependencies through an Inject method convention.

I still believe that last approach is very simple and works fine if extra dependencies are exceptional. I do admit that it makes architectural shifts harder to handle; everything is rather tightly coupled. So let's look at an alternative architecture that pulls all the bits apart, and should be better equipped to handle change.

Let's first separate command data from logic.
public class CreateSubscriptionCommand 
{     
    public CreateSubscriptionCommand(string value, string category, string emailAddress)
    {
        Guard.StringIsNullOrEmpty(value, "value");
        Guard.StringIsNullOrEmpty(category, "category");
        Guard.StringIsNullOrEmpty(emailAddress, "emailAddress");

        Value = value;
        Category = category;
        EmailAddress = emailAddress;
    }

    public string Value { get; private set; }

    public string Category { get; private set; }

    public string EmailAddress { get; private set; }

    public override bool Equals(Object other)
    {
        if (other == null)
            return false;

        var otherCommand = other as CreateSubscriptionCommand;
        if (otherCommand == null)
            return false;

        return otherCommand.Value == Value && 
            otherCommand.Category == Category && 
            otherCommand.EmailAddress == EmailAddress;
    }    

    public override int GetHashCode()
    {
        return Value.GetHashCode() ^ 
            Category.GetHashCode() ^ 
            EmailAddress.GetHashCode();
    }
}
The data is just a POCO. Notice the equality overrides; these come in handy when you're testing.
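For instance, a test can now compare a captured command against a freshly constructed one by value (a minimal sketch, assuming MSTest):
[TestMethod]
public void commands_with_the_same_data_are_considered_equal()
{
    var expected = new CreateSubscriptionCommand("value", "category", "user@example.com");
    var actual = new CreateSubscriptionCommand("value", "category", "user@example.com");

    // value equality thanks to the Equals and GetHashCode overrides
    Assert.AreEqual(expected, actual);
}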

The class that acts on the data needs to implement the ICommandHandler interface.
public class CreateSubscriptionCommandHandler 
    : ICommandHandler<CreateSubscriptionCommand>
{    
    private IDocumentSession _session;

    public CreateSubscriptionCommandHandler(IDocumentSession session)
    {
        _session = session;
    }

    public void Handle(CreateSubscriptionCommand command)
    {
        var subscription = new Documents.Subscription(
            command.Value, command.Category, command.EmailAddress);

        _session.Store(subscription);    
    }
}
Compared to the previous approach, we're now injecting the session instead of having it handy as a property; that coupling is now completely gone.
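The ICommandHandler interface itself isn't spelled out in this post; given how it's used here, a minimal version comes down to this.
public interface ICommandHandler<TCommand>
{
    void Handle(TCommand command);
}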

The last thing left to do is create an interface that consumers can use to send commands on: a bus.
public class Bus : IBus
{
    private readonly IKernel _kernel;
    private readonly IDocumentSession _session;

    public Bus(IKernel kernel, IDocumentSession session)
    {
        _kernel = kernel;
        _session = session;
    }

    public void ExecuteCommand<T>(T command) where T : class
    {
        var handler = _kernel.Get<ICommandHandler<T>>();

        handler.Handle(command);

        _session.SaveChanges();
    }
}
The ExecuteCommand method dispatches data to the correct handler by resolving it from the container, and also commits the unit of work.
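The IBus abstraction consumers take a dependency on isn't shown either; based on the ExecuteCommand signature above, something like this would do (a query-dispatching method, if you go that route, would follow the same pattern).
public interface IBus
{
    void ExecuteCommand<T>(T command) where T : class;
}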

The consumer can execute commands like this.
bus.ExecuteCommand(
    new CreateSubscriptionCommand(value, category, emailAddress));
With all the bits spread out in this approach, there's a bit more work gluing the pieces together. The session is now known to the container, and is request scoped. The command handlers are also all registered in the container.
protected override void ConfigureRequestContainer(IKernel container, NancyContext context)
{
    // you don't want to register them all individually
    container
        .Bind<ICommandHandler<CreateSubscriptionCommand>>()
        .To<CreateSubscriptionCommandHandler>();
    // snip..
    container.Bind<IDocumentSession>()
        .ToMethod((ctx) => {
            return ctx.Kernel.Get<IDocumentStore>().OpenSession();
        })
        .InSingletonScope();
}
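The "you don't want to register them all individually" comment hints at binding handlers by convention instead. A rough sketch of what that could look like with plain reflection inside ConfigureRequestContainer, assuming all handlers live in the assembly containing CreateSubscriptionCommandHandler:
// scan the assembly for concrete handler classes and bind every
// ICommandHandler<T> interface they implement to the implementing type
var handlerTypes = typeof(CreateSubscriptionCommandHandler).Assembly
    .GetTypes()
    .Where(type => type.IsClass && !type.IsAbstract);

foreach (var handlerType in handlerTypes)
{
    var handlerInterfaces = handlerType.GetInterfaces()
        .Where(i => i.IsGenericType &&
            i.GetGenericTypeDefinition() == typeof(ICommandHandler<>));

    foreach (var handlerInterface in handlerInterfaces)
        container.Bind(handlerInterface).To(handlerType);
}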

I think this approach might suit a lot of projects better when commands are dependency-heavy. What I like most is that handling architectural shifts will be easier. For example: right now, all behaviour lives in my entities and in my commands; the segregation between application services and domain services is non-existent. And this works fine so far; I have yet to find a use case where I would benefit from more separation. If that changes in the future though, I can introduce abstractions and concepts without breaking consumer code, and without having to do awkward things to manage the newly introduced dependencies.

As always, your thoughts are appreciated.

Sunday, January 13, 2013

Self-contained commands with dependencies

Also read: separating command data from logic and sending it on a bus

In October I looked at an architecture that limits abstractions to solely commands and queries. In that post, I had some infrastructure that looked like this.
public abstract class Command
{
    public abstract void Execute();
}

public abstract class Query<T>
{
    public abstract T Execute();
}

public interface ICommandHandler
{
    void Execute(Command command);
}

public class CommandHandler : ICommandHandler
{
    public void Execute(Command command)
    {
        command.Execute();
    }
}

public interface IQueryHandler 
{
    T Execute<T>(Query<T> query);
}

public class QueryHandler : IQueryHandler
{
    public T Execute<T>(Query<T> query)
    {
        return query.Execute();
    }
}
Commands and queries are each accompanied by their specific handler. In this example, the handler does nothing but invoke the command or query. In reality, you want your handlers to do a little more: provide the commands and queries with a context to work with, add logging, handle your unit of work, and all of that good stuff.
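To make that concrete, a handler that adds logging around the invocation could look something like this (a sketch; the ILogger abstraction is made up for illustration, it's not part of the original infrastructure).
public interface ILogger
{
    void Info(string message);
}

public class LoggingCommandHandler : ICommandHandler
{
    private readonly ICommandHandler _inner;
    private readonly ILogger _logger;

    public LoggingCommandHandler(ICommandHandler inner, ILogger logger)
    {
        _inner = inner;
        _logger = logger;
    }

    public void Execute(Command command)
    {
        // log before and after delegating to the wrapped handler
        _logger.Info("Executing " + command.GetType().Name);
        _inner.Execute(command);
        _logger.Info("Executed " + command.GetType().Name);
    }
}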

Let's look at the infrastructure, and particularly the handlers of a project that uses RavenDB to store its data.
public abstract class Command
{
    public IDocumentSession Session { get; set; }

    public abstract void Execute();           
}

public abstract class Query<T>
{
    public IDocumentSession Session { get; set; }

    public abstract T Execute();
}

public class CommandHandler : ICommandHandler
{
    public void Execute(Command command)
    {            
        var store = DocumentStore.Get();
        using (var session = store.OpenSession())
        {
            command.Session = session;
            command.Execute();

            session.SaveChanges();
        }
    }
}
    
public class QueryHandler : IQueryHandler
{    
    public T Execute<T>(Query<T> query)
    {
        var store = DocumentStore.Get();
        using (var session = store.OpenSession())
        {
            query.Session = session;

            return query.Execute();
        }
    }
}
The handlers take care of creating and managing the session, but also provide the commands and queries with a reference to the session.

An actual command could look like this.
public class ConfirmOrderCommand : Command
{
    private Guid _token;

    public ConfirmOrderCommand(Guid token)
    {
        _token = token;
    }

    public override void Execute()
    {
        var order = Session.Query<Documents.Order>().Where(x => x.Token == _token).First();

        order.ChangeStatus(Documents.Status.Confirmed);
    }
}
I really like this style of commands and queries; very little ceremony. The downside, though, is that you can't use constructor dependency injection. As discussed in this post, you could split your classes into two parts, the handler and the data, and that would solve the problem.

I wasn't that keen on that approach. Also, now that I'm so accustomed to fast in-memory integration tests with RavenDB, it's rare that I need to inject dependencies at all. I worked out an alternative which allows me to inject dependencies without having to move my data somewhere else.

Instead of injecting the dependencies through the constructor, we're going to use an Inject method. This is a convention; there is no interface that enforces this. Returning to our ConfirmOrderCommand, we'll add support for creating a new folder on the file system.
public class ConfirmOrderCommand : Command
{
    private IFileSystem _fileSystem;

    private Guid _token;

    public ConfirmOrderCommand(Guid token)
    {
        _token = token;
    }

    public override void Execute()
    {    
        var order = Session.Query<Documents.Order>().Where(x => x.Token == _token).First();
        
        _fileSystem.CreateDirectory(Path.Combine(@"D:\", order.Customer.Id));

        order.ChangeStatus(Documents.Status.Confirmed);
    }
    
    public void Inject(IFileSystem fileSystem) 
    {
        _fileSystem = fileSystem;
    }
}
We can now extend the command handler to automatically resolve and inject dependencies into our commands. The handler uses reflection to look for a method named Inject on the command type. If that method exists, it inspects its parameters, resolves an instance for each of them from the container, and finally invokes the Inject method with those resolved arguments.
public class CommandHandler : ICommandHandler
{
    private IKernel _kernel;

    public CommandHandler() { }

    public CommandHandler(IKernel kernel)
    {
        _kernel = kernel;
    }

    public void Execute(Command command)
    {
        if (_kernel != null)
            ResolveDependenciesIfNeeded(command);
            
        var store = DocumentStore.Get();
        using (var session = store.OpenSession())
        {
            command.Session = session;
            command.Execute();

            session.SaveChanges();
        }
    }

    private void ResolveDependenciesIfNeeded(Command command)
    {
        var method = command.GetType().GetMethod("Inject");
        if (method != null)
        {
            var parameters = method.GetParameters();
            var parameterInstances = new List<object>();

            foreach (var parameter in parameters)
            {
                var type = parameter.ParameterType;
                var instance = _kernel.Get(type);

                parameterInstances.Add(instance);
            }

            method.Invoke(command, parameterInstances.ToArray());
        }
    }
}
Consumers now don't have to care about the dependencies; they just have to be registered in the container. In the tests, however, we can explicitly inject mocks or stubs, and take advantage of having discoverable dependencies through the Inject convention.
[TestClass]
public class When_confirming_an_order
{
    private Mock<IFileSystem> _fileSystem;

    [TestInitialize]
    public void When()
    {
        DocumentStore.InitializeEmbedded();

        var cmd = new ConfirmOrderCommand(Guid.NewGuid()); // ConfirmOrderCommand takes a Guid token
        
        // Inject mock
        _fileSystem = new Mock<IFileSystem>();    
        cmd.Inject(_fileSystem.Object);

        new CommandHandler().Execute(cmd);
    }

    [TestMethod]
    public void the_status_is_changed_to_confirmed()
    {
        ...
    }
    
    [TestMethod]
    public void a_new_folder_is_created()
    {
        _fileSystem.Verify(...);
    }
}
Although I haven't really gone the distance with this implementation yet (I only have one command with extra dependencies), I find this technique very promising. You get the leanness of self-contained commands and queries, while still allowing discoverable dependency injection by convention, supported by a tiny bit of infrastructure in the handlers.

I'd like to hear your opinion.

Sunday, January 6, 2013

Keeping your AppHarbor application pool alive

By default, IIS will shut down your application pool when it has been idle for more than 20 minutes. This is annoying when your website is only visited sporadically; visitors might not have the patience to wait for your application pool to spin up again. When you're running your own machine, you can raise or disable the idle timeout, but when you're running on a cloud service like AppHarbor you can't.

One solution is to frequently make a request to your site yourself to keep the application pool alive. You could use a third-party service (like Pingdom or StillAlive), but chances are you don't want to take on an extra dependency for something this trivial.

AppHarbor contains the required infrastructure to do this yourself: background workers and scheduling.

First create a new Quartz job which makes a request to your web application when it's invoked.
public class KeepAliveJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        using (WebClient client = new WebClient())
        {
            client.DownloadString("http://your_webapp.com");
        }
    }
}
Then schedule your job to be triggered every 19 minutes or so.
var keepAliveJob = JobBuilder.Create<KeepAliveJob>().Build();
var keepAliveTrigger = TriggerBuilder.Create()
    .WithSimpleSchedule(x => x.WithIntervalInMinutes(19).RepeatForever())
    .Build();

scheduler.ScheduleJob(keepAliveJob, keepAliveTrigger);
scheduler.Start();
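On an AppHarbor background worker this typically lives in a console application, and that process has to keep running for the scheduler to keep firing. A minimal sketch of the hosting bits, assuming Quartz's StdSchedulerFactory is where the scheduler variable above comes from:
public class Program
{
    public static void Main(string[] args)
    {
        // obtain a scheduler from the standard Quartz factory
        var scheduler = new StdSchedulerFactory().GetScheduler();

        // ... build and schedule the KeepAliveJob as shown above ...

        scheduler.Start();

        // keep the worker process alive so the scheduler keeps firing
        Thread.Sleep(Timeout.Infinite);
    }
}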
And that should be it; you're now running your own ping service.