Sunday, October 26, 2014

Programmatically force create a new LocalDB database

I have spent the last week working on an integration test suite that seemed to take ages to run its first test. I ran a profiler on the setup and noticed a few things that were cheap to improve. The first one was how a new LocalDB database was being created.
An empty database file was included in the project. When running the setup, this file would replace the existing test database. However, when there were open connections to the test database - SQL Server Management Studio, for example - replacing it would fail. To avoid this, the SQL Server process was killed before copying the file, and the setup then waited for it to come back up.

Another way to create a new database is by running a script on master. You can force close open connections to the existing database by putting the database in single user mode and rolling back open transactions. You can also take advantage of creating the database by script to set a sane initial size, avoiding the database having to grow while running your tests. When you specify the database size, you also need to specify the filename; I'm using the default location.
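Such a script could look something like this (a sketch, not the post's actual script; the database name, size and file path are my assumptions - LocalDB keeps its files in the user profile directory by default):

```sql
USE master;
GO

-- Force close open connections by putting the database in single user
-- mode and rolling back open transactions, then drop it
IF DB_ID('TestDatabase') IS NOT NULL
BEGIN
    ALTER DATABASE TestDatabase SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    DROP DATABASE TestDatabase;
END
GO

-- Recreate it with a sane initial size, so it doesn't have to grow
-- while the tests run; specifying SIZE also requires FILENAME
CREATE DATABASE TestDatabase
ON PRIMARY (
    NAME = TestDatabase,
    FILENAME = 'C:\Users\<you>\TestDatabase.mdf',
    SIZE = 100MB
);
GO
```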

Switching to this approach shaved seven seconds off database creation.

Sunday, October 19, 2014

Print out the maximum depth of recursion allowed

Karl Seguin tweeted the following earlier this week: "An interview question I sometimes ask: Write code that prints out the maximum depth of recursion allowed."

This question is interesting for a couple of reasons. First, it's a shorter FizzBuzz: can the candidate open an IDE, write a few lines of code, and compile and run them? And second, does he know what recursion is?

Now let's say the interviewee knows how to write code and is familiar with the concept of recursion. If he had to do this exercise in C#, he might come up with something along these lines.
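A minimal sketch of such a program (hypothetical; a candidate's actual code will vary):

```csharp
using System;

class Program
{
    // Prints the current depth before recursing one level deeper.
    // The last number printed before the crash is the maximum
    // depth of recursion that was reached.
    static void Recurse(int depth)
    {
        Console.WriteLine(depth);
        Recurse(depth + 1);
    }

    static void Main()
    {
        Recurse(1);
    }
}
```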

Before you let him run his code, you ask him to guess the output of this little program. If he's smart, he won't give you much of an answer. Instead he will point out that the result depends on the runtime, compiler, compiler switches, machine architecture, the amount of available memory and what not.

If he's not familiar with the C# compiler and runtime, he might even say there's a chance the integer will overflow before the call stack does.
The recursive method call is the last call in this method, making it tail-recursive. A smart compiler might detect the tail call and convert the recursion into a plain loop, avoiding growing the stack.

Running this program shows that the C# compiler isn't that smart; it prints the maximum depth of recursion just before crashing.

If we port this snippet to F#, a functional language in which recursion is a first-class citizen, the results are a bit different.
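A direct port might look like this (again a sketch, assuming the same shape as the C# version; the recursive call is still in tail position):

```fsharp
// Tail-recursive: the recursive call is the last thing the
// function does, so the compiler can turn it into a loop
let rec recurse (depth: int) =
    printfn "%d" depth
    recurse (depth + 1)

recurse 1
```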

This just kept running until I killed it, when the count was far over 171427. Looking at the generated IL, you can see that the compiler was smart enough to turn this recursive function into a loop.

If we want the F# implementation to behave more like the C# one, we need to make sure the recursive call isn't in tail position, so the compiler can't optimize it into a loop.
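One way to take the call out of tail position (my assumption; any use of the call's return value will do) is to do something with the result after the call returns:

```fsharp
// The addition happens *after* the recursive call returns, so the
// call is no longer in tail position and every level keeps a frame
// on the stack until it overflows
let rec recurse (depth: int) : int =
    printfn "%d" depth
    1 + recurse (depth + 1)

recurse 1 |> ignore
```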

Running this also ends in a StackOverflowException pretty early on.

I love how this question seems shallow on the surface, but reveals more and more depth the closer you look.

Sunday, October 5, 2014

Read an Event Store stream as a sequence of slices using F#

I'm slowly working on some ideas I've been playing around with all summer. Since that's taking me into unknown territory, I guess I'll be putting out more technical bits here over the next few weeks.

Using the Event Store, I tried to read all events of a specific event type. This stream turned out to be a tad too large to be sent over the wire in one piece, leading to a PackageFramingException: Package size is out of bounds.

The Event Store already has a concept of reading slices, allowing you to read a range of events in the stream, avoiding sending too much over the wire at once. This means that if you want to read the whole stream, you have to read from the start, moving up one slice at a time, until you've reached the end of the stream.

Avoiding mutability, I ended up with a recursive function that returns a sequence of slices.
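A sketch of what such a function might look like, using the Event Store .NET client (the function name, batch size and exact client API are my assumptions and depend on the client version, not the post's actual code):

```fsharp
open EventStore.ClientAPI

// Lazily reads a stream from the start, one slice at a time,
// yielding each slice until the end of the stream is reached
let readSlices (connection: IEventStoreConnection) stream sliceSize =
    let rec read fromEventNumber =
        seq {
            let slice =
                connection.ReadStreamEventsForwardAsync(
                    stream, fromEventNumber, sliceSize, false)
                |> Async.AwaitTask
                |> Async.RunSynchronously
            yield slice
            if not slice.IsEndOfStream then
                yield! read slice.NextEventNumber
        }
    read 0
```

Because the function returns a sequence, slices are only fetched as the consumer pulls them, and no mutable cursor is needed.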