Sunday, October 3, 2010

It was a busy summer

This last summer I spent time getting embedded in a new group at work. It's been a blast because the group is really enthusiastic about agile development and about adopting new, efficient practices. I've had a chance to practice a lot of what I've been studying in my free time, including acceptance test driven development, NHibernate, ASP.NET MVC 2 and lots more BDD.

There's nothing sweeter than a greenfield project, and I was able to get my hands on one. I was tasked with building an internal dashboard for some of our back-end (Windows) services. I'm still in .NET land and we're primarily a Microsoft shop, so I used a pretty small but flexible stack.

For testing, I tried out doing acceptance test driven development with SpecFlow which is the .NET analog to Ruby's Cucumber.  My unit/integration testing framework of choice is still MSpec.
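For anyone who hasn't seen SpecFlow, here's a rough sketch of its shape. The scenario text and the class/method names below are invented for illustration; the [Binding]/[Given]/[When]/[Then] attributes are the real SpecFlow plumbing that glues plain-text scenarios to C# step methods:

```csharp
using TechTalk.SpecFlow;

// Steps for a scenario like:
//   Given a blog entry
//   When I write a comment
//   Then I should see my comment under the blog post
[Binding]
public class LeaveCommentSteps
{
    [Given(@"a blog entry")]
    public void GivenABlogEntry()
    {
        // Drive the app (e.g. an automated browser) to a blog entry.
    }

    [When(@"I write a comment")]
    public void WhenIWriteAComment()
    {
        // Fill in and submit the comment form.
    }

    [Then(@"I should see my comment under the blog post")]
    public void ThenIShouldSeeMyComment()
    {
        // Assert the comment shows up on the rendered page.
    }
}
```

At run time SpecFlow matches each line of a scenario against these attributes and executes the bound method, which is exactly how Cucumber works in Ruby.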

I took this experience and made a presentation to the company about BDD. For anyone who's looked around the web at BDD presentations, the content won't be incredibly surprising. Lots of the same points are made, and I even borrowed the excellent high gear/low gear analogy for ATDD from someone else's presentation (I can't remember whose :-(...sorry, uncredited person, whoever you are). You can download it from my dropbox here.

I know I promised an example of doing ATDD and I'd really like to do it.  Now that work is semi-calm I can try to whip something up.  There are tons of other things I want to talk about like the Java codebase I'm working in now, when NOT to unit test, the current .NET craze around CQRS, and other interesting topics swimming around in my head.  

Sunday, June 27, 2010

BDD: Just tell me what the eff it is!

Here's my big problem with BDD: at first blush, all you can see of it are the testing tools and a bunch of acronyms and buzzwords. You get all these testing frameworks that try to produce natural language tests your stakeholder can both read and understand. For a long time, the questions still stood for me: what is BDD? How do you do BDD? Sure, my tests were now human readable, but it still hadn't added up.

For those who don't understand what BDD is, and even those who do, I want to walk you through what I've learned over the last year. I want to give you an alternative narrative to the stuffy version on Wikipedia and the dated top-ranked Google results, which offer breadcrumbs to understanding but didn't give me the clarity I was looking for.

Let's look at the two-sentence summary Dan North gave last year:
BDD is a second-generation, outside-in, pull-based, multiple-stakeholder, multiple-scale, high-automation, agile methodology. It describes a cycle of interactions with well-defined outputs, resulting in the delivery of working, tested software that matters.
Get it? Neither did I when I first read it. After a lot of research, investigation and experimentation, I finally get it. Now that I do, that two-sentence definition is an excellent summary. There's a lot of information packed into those sentences, and we won't try to drill into all of it here. Let's start with the simple building blocks of the BDD process.

It starts with a story


BDD is firmly rooted in the agile universe.  It relies on a lot of the mechanisms and practices you find in agile processes.  The first and most important is the practice of capturing user requirements as stories.  I'm not going to spend time in this post defining what a story is but let's just use an example we can refer to.
In order to build readership of the blog
As a blog reader
I want to leave comments
This is the first step in BDD, the first tangible piece of the process.  Your stakeholder(s) and your agile group develop stories.

Create examples/scenarios


You have your story. Take that story and imagine (realistic) scenarios of it in practice. The scenarios may be generated by multiple people. Your scenarios are your acceptance criteria; they tell you when the story is done.

Here are some sample scenarios based on the above story. (Remember, for this story, we're in the context of a reader of the blog and their desire to leave comments.) The scenarios are written in the Given-When-Then format.
Given a blog entry
When I write a comment
And I submit the comment
Then I should see my comment under the blog post

Given a blog entry
And the blog entry is no longer accepting comments
When I view the blog entry
Then I should not be allowed to write a comment
You can see how there would be numerous scenarios and contexts for the users of your application. When generating your scenarios, you must take them all into account.

You're just about to write code but...AUTOMATION!


Any good development environment should be as automated as (in)humanly possible. With a few quick commands/scripts you should be able to build your app, build your DB (if you have one), execute tests, deploy the app, and run it. The more you have to do manually, the harder this whole thing is.

Also, you should be able to run this in isolation on your workstation, if possible. If your app is sharing a DB with 5 other workstations in your department, chances are you're going to have collisions. You won't be able to make predictable tests and you'll be stepping on each other's toes.

When you write your code you want to be continuously running tests. This should be a push-button activity that does not derail you for several minutes. If it does, analyze what you're doing and see if you can streamline it, either through tooling or by eliminating deficiencies in your testing strategy.

Outside-in: It's time to TDD


In BDD, we do things outside-in.  Meaning, you start at the level of the user's interaction with your application and then you build downwards.  It's a huge shift for a lot of people.  I haven't been doing it for very long but it's not a huge departure if you were already doing TDD.  Most of my difficulty was just learning how to automate UI interactions.

Here is where the rubber (finally) meets the road.  You have a story, you have some scenarios of your story that crystallize your acceptance criteria, and you have an automated development environment that allows you to build and test your app rapidly.

Using the story and scenarios from this blog post, we can write our first lines of code. But...I'm not going to do that here. I have a separate blog post coming (this topic certainly deserves its own) that will walk through the TDD process under the BDD umbrella. I'll update this post to refer to it once it's done.

In summary

  1. Write user stories
  2. Create scenarios based on the stories
  3. TDD those stories starting at the top level of your application (typically a UI) and keep developing downwards until it's functionally complete
Yes, that's seriously it. There are methodologies, practices and tools that go inside of it but, at its simplest, that's all you're really doing. If you're reading about BDD then I have no doubt you're already aware of most of these topics anyway.

The mystical cloak of BDD has now been removed!

DISCLAIMER: This was meant to be a high level, entry level explanation of BDD because I have rarely seen it simplified. In most of the information I've found online, the explanation is a deep dive into how to write stories, stakeholder interaction, how DDD dictates the ubiquitous language, how it's the natural evolution of TDD (without ever showing you how!), and a variety of other entirely relevant information. The problem is that there's a lot of cognitive overload.

In other words, for BDD enthusiasts, or developers in general (because we like to be intellectual), don't dog me out and poke holes in my examples. They're horribly contrived, I readily admit I am still learning these concepts, and I don't believe anything I've written is incredibly misleading.

Thursday, June 24, 2010

wtf, with a side order of bbq

Just a blurb; been trying out new technologies and shiny toys. Here's my naive/ignorant take on things:

It's been years since Linq hit .NET. Why is the NHibernate criteria API so incredibly non-intuitive, and why hasn't there been a huge move to Linq? It's still clinging to the uber-verbose criteria API that, even fluently, makes hardly any sense. Yeah, I tried Linq2NHibernate too, but it was buggy for anything outside of simple selects and joins. I had an inner join that would magically turn itself into an outer join when I added an orderby clause. Hot tip for NHibernate: ditch the criteria API and just use Linq like all the other cool kids. It's ok to abandon the bizarre RDBMS abstraction that's been created.
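To illustrate the gripe, here's a sketch of the same query both ways. Post, Title and PublishedOn are made-up names, and the second form assumes the 2010-era NHibernate.Linq contrib API:

```csharp
// The classic criteria API: stringly-typed property names,
// so a typo in "PublishedOn" only blows up at run time.
var posts = session.CreateCriteria<Post>()
    .Add(Restrictions.Gt("PublishedOn", cutoff))
    .AddOrder(Order.Desc("PublishedOn"))
    .List<Post>();

// The same query via Linq2NHibernate: plain C#, compiler-checked.
var posts2 = session.Linq<Post>()
    .Where(p => p.PublishedOn > cutoff)
    .OrderByDescending(p => p.PublishedOn)
    .ToList();
```

Same query, but the Linq form reads like the rest of your C# and survives a rename refactoring.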

I've been spoiled by Ruby. I can't tell you how many times in my day I think, "goddam, why can't I have mixins!?" It would make a lot of things a lot easier/sexier.

Finally, how can there not be a fully automated solution for deploying .NET web apps? I want something that builds the DB, deploys an MVC app and creates one in IIS if it doesn't exist yet. I haven't found anything like that. Granted, I haven't looked too hard, but I didn't find anything with a handful of Google searches. This is another area where other open source communities blow .NET out of the water. If I cared, I'd try to make something like this but I don't have the friggin' time.

Finally, finally I am again reminded of how small the open source community is in .NET. Still can't figure out why.

Sunday, May 9, 2010

Here we go again; where is my mind?

Something is itching at my mind and I haven't given it a name or a face yet. Whether I use Ruby, Java, C#, Scala or Javascript, it doesn't matter. They're details. When you go to Facebook do you think "Wow, this really feels like PHP." or when you hit Twitter do you sense the presence of Ruby and Erlang? Not so much. Then what counts here? What is it that matters?

Software is just a means to an end. I'm not in the business of writing code, I'm in the business of creating tools. Software just happens to be the vehicle in which I deliver them. You need a tool that will remind you to call your mother on her birthday? I'll write software that generates email reminders for important dates. Alternatively, I could have created a phone service that would call your home or cell phone as a reminder. Maybe I could create a network of carrier pigeons that did the same. Unfortunately for the pigeon trade and call centers, writing software is a lot cheaper and more maintainable. Do you see what I'm getting at though?

It's more about what I'm delivering than it is how I deliver it. And, for me, the "how" is no longer so narrow. I'm not just a .NET developer and I'm more than comfortable stepping outside of the MS world. I've learned a fair number of engineering practices and principles that are language agnostic like DRY, YAGNI and some of SOLID. I can pick things up quickly no matter which language I choose to use. It's satisfying to know that within 2 weeks of reading and practicing, I have a fairly good idea of how I'd do modular development in Ruby and I even grok some of the deeper concepts like Ruby's compositional nature and metaprogramming.

One benefit of this line of thinking is that I'm no longer narrowing my vision to how I build what the customer wants. I have a much broader perspective of the business I'm in. Ultimately, I'm sure 98% of what I make will be a software product, but someday there's going to be something I think of that would be better served by a non-software solution. If I'm lucky, nobody else will be doing it and I can make my fortune from it.

What drove me to this whole post is that I'm trying to figure out what I'd like to create with my newfound Ruby skillz. It dawned on me that, while Ruby is a blast to use, it's just details. The hard work is figuring out what the hell I want to bring to the world. What's my product? What's my vision? What do people want? That's the new mental hurdle for me.

Sunday, May 2, 2010

Ruby is cool, Time for Rails

I've spent the last couple weeks reading about the fine details of Ruby. My first impressions are good. I dig the language. I haven't put it to good use yet though; mostly just dicking around in IRB and putting some small classes and modules together. I've been using RubyMine and NetBeans but I don't have a clear winner on a Ruby IDE yet. I think once I get into actually writing an application I'll have something more substantial to share in that regard. Or maybe I'll shed my dependency on a GUI and go all command line...well, let's not go overboard here. One thing at a time.

Now the next thing to check out is Rails. At first blush, I can see where ASP.NET MVC gets its influence. They share a similar DSL, and even class/method names are echoed between the two. But, once again, I'm starved for details. I don't have any books on Rails so I may have to shop for one. I'm mostly just putzing around with what I know from ASP.NET MVC. Did I mention that they're close relatives?

Wednesday, April 21, 2010

Playing with Ruby

With the 1.0 release of IronRuby I decided to take a brief detour from my prior game plan of outside-in development. Part of that decision was because I'd wanted to check out Ruby for a while as a dynamic language with a lot of hype around it (I've lived in a stuffy, static world for too long). Part of it was because most of the cutting edge BDD work is happening in the Ruby space. And the last part is that Ruby is in the .NET space now (at least up to 1.8.6), so the barrier to entry couldn't possibly get any lower. Yeah, I know the IronRuby project has been going on for years and prior to that it was dirt simple to get Ruby running on Windows, but whatever. I'm here now.

Last year I did manage to pick up the book Beginning Ruby which I, unfortunately, didn't take a shine to. The first handful of chapters were light on the details of Ruby, deferring them to later chapters that I never got to. But that's also my fault because I'm a reference book kind of guy. I just picked up The Ruby Programming Language the other day, which totally scratches that itch.

I've been playing around with the interactive Ruby shell, which has been fun. It gives me a chance to take the language for a spin before I do anything serious. Now I'm looking to pick an IDE. I downloaded NetBeans and RubyMine. I'll post back with impressions on both the language and the tools.

Tuesday, April 13, 2010

Analysis and Investigation on Outside-In Development

BDD has a variety of testing tools surrounding it in a number of languages. Since I'm primarily in the .NET world (looking to expand that soon), I've dabbled with MSpec and had a lot of fun/success creating descriptive tests that spoke the language of the business and communicated value. That was my first experiment in the world of domain-driven design in an agile process.

The one thing these tools have in common is that they promote outside-in development. In BDD that literally means starting at the touch point with the consumer/customer/stakeholder. For most applications, that's a GUI of some sort but for others it may be a web service or something similar.

I've seen outside-in development go by a couple names like story driven development or acceptance test driven development. Unless I'm mistaken, they're both trying to achieve the same goal. What outside-in development is trying to do is make the software customer-oriented not unlike the agile processes that fit with this thinking. You start building the software from the customer's point of view based on their specification of it (usually as a story). Coupled with agile, you're doing something along the lines of rapid prototyping (with a bit more discipline). It's also keeping you focused on what the customer wants. It discourages gold plating and extra work that are typically waste.

<tangent>
I don't like to say gold plating because I know there are developers who think that discouraging gold plating is like discouraging free thought and creativity. I would disagree. The hard pill for us to swallow is that what we perceive to be a "better way to do things" may not be better from the perspective of the actual consumers of the application. They may have no use for the bells and whistles that we believe to be productivity boosters. What we may be doing instead is creating excess inventory that costs money but has no ROI.

However, if you're in an agile process (or even if you're not), don't despair! Agile, by its very nature, relies on high-bandwidth communication. Reach out to the customer and communicate your creative thoughts and ideas. They will rely on your expertise in this arena and most times accept your recommendations. But, just remember, if they don't find use for your widget/tool/interface then don't do it (at least on company time :-P)!
</tangent>

Sooooo this is what I've understood so far. I am by no means an expert on outside-in development and I'm still trying to understand the practice. In the past, I've started from the application logic and worked my way downwards. I haven't done it from the UI down, which is certainly an interesting experiment. I'm also coming to find that I haven't done nearly enough acceptance/functional testing in my travels.

I plan on following up this post with my first attempt at outside-in development. I need to find a good book on it; the RSpec Book is probably what I'm looking for, but any recommendations are appreciated. In the meantime, I've just been reading blogs, articles and comments. One blog I ran into has a tremendous amount of insight into testing and offers good information to get you started on writing acceptance tests using ASP.NET MVC and some other tools and frameworks. Check out Steve Sanderson's blog for some really great information.

Wednesday, March 10, 2010

TDD Tidbits: Your unit tests suck (and so does your design)!

TDD is a design process but there are skills required outside of its discipline that you need to master as well. One such skill is writing proper unit tests. Prior to TDD I thought I was unit testing. I really wasn't for a number of reasons. The simple answer is that my design wasn't permitting unit testing and what I was calling "unit" tests were actually integration tests. Classes were tightly coupled, methods were more procedural than OO, I didn't do a good job of separating responsibilities/concerns, etc.

A unit test has some simple requirements you need to fulfill. I'll rattle off a list of them off the top of my head.
  • The test must be fast
  • The test must be repeatable
  • The test must be predictable
  • Only one assertion per test
  • The test must isolate the behavior under observation
Easy, right? You'd think so. I'd heard these in some shape or form before I began to do TDD but it never really hit home. I've seen a couple blog posts with lists like this one so I'll try to describe my thoughts on each of the items above. I'm sure I could go on for hours about them and more so I'll try to keep it brief.

DISCLAIMER: The following code examples are extremely naive and serve only to demonstrate.

The test must be fast

Unit tests are meant to be run frequently. You have them local to your development workstation and they (should) run as part of your continuous integration. Unit tests are regression tests and your first line of defense. They report to you when something is no longer behaving the way it's supposed to. You need that feedback immediately and there's absolutely no reason for them to be slow.

For any system of significant complexity you don't want to wait 30 or even 10 minutes to find out that a bug was introduced. Unit tests should execute in milliseconds and running a whole suite of them should execute within seconds. For example, a project I had at work had around 300 tests that would execute in 4 to 5 seconds. That's the type of speed you should be aiming for.

Having tests that run in the timespan of minutes immediately introduces context switching. What happens when you kick off a full system build for a project that may take a minute or two or more to complete? You open up your browser and see what's going on in the Twitterverse, Facebook, StackOverflow, etc. You don't want that to happen for your testing. It's good to keep focused on the task and to plug along without too much interruption.

The test must be repeatable

This one is simple: I can run a unit test as many times as I want and it will not fail. Take the following as an example of a test that is not repeatable. The domain is rabbits. You have some rabbits, and you add them to a collection of rabbits. It's a cruel domain: no two rabbits may have the same name. When you run this test a second time it will fail because it picks up state from the last run. "Thumper" will already be in the database, so adding him again will cause an error.


[TestFixture]
public class RabbitFixture
{
    [Test]
    public void Should_create_rabbit()
    {
        var rabbitRepo = new RabbitRepository();

        rabbitRepo.Add(new Rabbit("Thumper"));

        var thumper = rabbitRepo.GetByName("Thumper");

        Assert.IsNotNull(thumper);
    }
}

public class Rabbit
{
    public Rabbit(string name)
    {
        Name = name;
    }

    public string Name { get; private set; }
}

public class RabbitRepository
{
    private readonly ISessionFactory _sessionFactory;

    public RabbitRepository()
    {
        // Assume some helper builds a session factory pointed at the
        // real database -- exactly the dependency that makes this fragile.
        _sessionFactory = NHibernateHelper.CreateSessionFactory();
    }

    public void Add(Rabbit rabbit)
    {
        using (var session = _sessionFactory.OpenSession())
        {
            if (Exists(rabbit))
            {
                throw new Exception("Rabbit of the same name already exists!");
            }

            session.Save(rabbit);
        }
    }

    private bool Exists(Rabbit rabbit)
    {
        return GetByName(rabbit.Name) != null;
    }

    public Rabbit GetByName(string name)
    {
        using (var session = _sessionFactory.OpenSession())
        {
            return session.Linq<Rabbit>()
                .FirstOrDefault(r => r.Name == name);
        }
    }
}

There are various tricks to make this test repeatable, but not without violating some of the other guidelines like isolation and speed. Unit tests shouldn't rely on external state, and likewise they shouldn't create any state that outlives the test itself. As written, this test will be hard to keep repeatable and will be brittle.

The test must be predictable

1 + 1 will always equal 2, right? Conceptually, your test should do the same: the same inputs always yield the same expected output. I don't think this needs much more explanation.
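Okay, one small illustration anyway. A classic way a test loses predictability is a hidden dependency on the clock. This is a contrived sketch (Greeter and its Func&lt;DateTime&gt; constructor are my own invention) of pinning that input down:

```csharp
// Code that read DateTime.Now directly would pass or fail depending
// on what time of day the test suite happened to run.
public class Greeter
{
    private readonly Func<DateTime> _clock;

    // Inject the clock so tests can pin "now" to a known value.
    public Greeter(Func<DateTime> clock)
    {
        _clock = clock;
    }

    public string Greet()
    {
        return _clock().Hour < 12 ? "Good morning" : "Good afternoon";
    }
}

[TestFixture]
public class GreeterFixture
{
    [Test]
    public void Should_greet_with_good_morning_before_noon()
    {
        // Pinned input -> predictable output, no matter when this runs.
        var greeter = new Greeter(() => new DateTime(2010, 3, 10, 9, 0, 0));

        Assert.AreEqual("Good morning", greeter.Greet());
    }
}
```

In production you'd pass `() => DateTime.Now`; in tests you pass a frozen timestamp.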

Only one assertion per test

This is one of the more misunderstood practices of unit testing. Don't map the word "assertion" directly to the Assert function called within a unit test; in this instance, they are not the same. You may call the Assert function multiple times to assert a single behavior. Take the following example.


[TestFixture]
public class RectangleFixture
{
    [Test]
    public void Should_resize_rectangle()
    {
        Rectangle rectangle = new Rectangle(40, 20);

        rectangle.ReduceSizeByPercent(50);

        Assert.AreEqual(20, rectangle.Length);
        Assert.AreEqual(10, rectangle.Width);
    }
}

public class Rectangle
{
    public Rectangle(int length, int width)
    {
        Width = width;
        Length = length;
    }

    public double Length { get; private set; }

    public double Width { get; private set; }

    public void ReduceSizeByPercent(int percent)
    {
        // Reducing by 50% leaves 50% of each dimension.
        Length *= (100 - percent) * .01;
        Width *= (100 - percent) * .01;
    }
}

There are two calls to Assert to verify that the rectangle's size was reduced by 50%. That's what we mean when we say a test makes only one assertion: it asserts one behavior. Don't hate yourself if you call Assert more than once, and don't feel the need to create another test.
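The flip side is worth a contrived illustration too. Suppose the rectangle also grew an Area property (my invention, not part of the example above). Resizing and area calculation are two distinct behaviors, so they get separate tests rather than one test that asserts both:

```csharp
public class Rectangle
{
    public Rectangle(int length, int width)
    {
        Length = length;
        Width = width;
    }

    public double Length { get; private set; }
    public double Width { get; private set; }

    // Hypothetical second behavior, invented for this example.
    public double Area
    {
        get { return Length * Width; }
    }

    public void ReduceSizeByPercent(int percent)
    {
        Length *= (100 - percent) * .01;
        Width *= (100 - percent) * .01;
    }
}

[TestFixture]
public class RectangleFixture
{
    // One behavior: resizing. Two Assert calls, one assertion.
    [Test]
    public void Should_resize_rectangle()
    {
        var rectangle = new Rectangle(40, 20);

        rectangle.ReduceSizeByPercent(50);

        Assert.AreEqual(20, rectangle.Length);
        Assert.AreEqual(10, rectangle.Width);
    }

    // A different behavior entirely, so it gets its own test instead
    // of piggybacking its Assert calls onto the resize test.
    [Test]
    public void Should_calculate_area()
    {
        var rectangle = new Rectangle(40, 20);

        Assert.AreEqual(800, rectangle.Area);
    }
}
```

If the area math breaks, you want a failure named Should_calculate_area, not a resize test failing for a reason that has nothing to do with resizing.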

The test must isolate the behavior under observation

Now that we're warmed up let's look at the one guideline to rule them all. This is the one that will have the greatest impact on the design of your code. You have to isolate the behavior you wish to test.

You can read this another way. The unit test cannot fail for any reason other than the implementation of the behavior being tested is incorrect. I have read blogs that mandate that a unit test cannot touch the file system, a database, a network, etc. That's not just because of how slow that may be. It's because now your unit test may fail for any reason related to an external dependency that has absolutely nothing to do with what you're unit testing. When you get the red light that a unit test fails it shouldn't be because you didn't install a database, it shouldn't be because you neglected to include a configuration file, it shouldn't be because your network cable isn't plugged in. Those have nothing to do with the unit test at hand.

Following this guideline will teach you how to detach your class from its external dependencies. The external dependencies aren't just things like databases and LAN access. They can be other classes that have rules of their own that need to be satisfied. You do not want your test to fail if some rule in a class that you consume isn't satisfied. That's outside of the scope of the behavior you're testing.

This is where people start to introduce language like "mocking" or "faking" your external dependencies. These are ways of allowing you to truly isolate the behavior you're trying to test. You're replacing pieces of your class that would cause the test to fail for reasons outside of the class's cares or awareness.
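To make that concrete, here's a sketch in the spirit of the rabbit example above. The IRabbitStore interface, the hand-rolled fake and RabbitService are all my own inventions (no mocking library involved); the point is that the class under test depends on an abstraction, and the test swaps in an in-memory fake so no database is ever touched:

```csharp
// Same Rabbit as in the earlier example, repeated so this stands alone.
public class Rabbit
{
    public Rabbit(string name) { Name = name; }
    public string Name { get; private set; }
}

// The abstraction the service depends on; in production this would be
// backed by NHibernate, but the test never needs to know that.
public interface IRabbitStore
{
    Rabbit GetByName(string name);
    void Save(Rabbit rabbit);
}

// A hand-rolled fake: all state lives in memory and dies with the test.
public class InMemoryRabbitStore : IRabbitStore
{
    private readonly Dictionary<string, Rabbit> _rabbits =
        new Dictionary<string, Rabbit>();

    public Rabbit GetByName(string name)
    {
        Rabbit rabbit;
        _rabbits.TryGetValue(name, out rabbit);
        return rabbit;
    }

    public void Save(Rabbit rabbit)
    {
        _rabbits[rabbit.Name] = rabbit;
    }
}

public class RabbitService
{
    private readonly IRabbitStore _store;

    public RabbitService(IRabbitStore store)
    {
        _store = store;
    }

    public void Add(Rabbit rabbit)
    {
        if (_store.GetByName(rabbit.Name) != null)
        {
            throw new Exception("Rabbit of the same name already exists!");
        }

        _store.Save(rabbit);
    }
}

[TestFixture]
public class RabbitServiceFixture
{
    [Test]
    public void Should_reject_duplicate_rabbit_names()
    {
        var service = new RabbitService(new InMemoryRabbitStore());
        service.Add(new Rabbit("Thumper"));

        // Fast, repeatable, isolated: a failure here can only mean
        // the duplicate-name rule itself is broken.
        Assert.Throws<Exception>(() => service.Add(new Rabbit("Thumper")));
    }
}
```

Compare this to the earlier RabbitRepository test: same business rule, but the duplicate-name behavior is now tested without a session factory anywhere in sight.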

When you first start to really isolate the code you're trying to unit test, the first thought that may pop into your mind is "I'm just writing it this way so that I can test it!" Yes, you are, but there's a truth you may not yet realize (and it may take some time to settle in): well designed code is easily testable, and easily testable code is well designed. Writing proper unit tests can expose major (or minor) issues in your design.

For me, this is where I learned the most lessons and what made me truly appreciate things like the SOLID principles. Isolation leads you there and, in my opinion, you would have to go way out of your way to succeed in isolating your tests while avoiding good design. It puts bumpers on your bowling lane, so to speak.

Summary

Unit testing in and of itself is a huge part of the design process. Without even engaging in TDD, it is going to highlight deficiencies in your design. There is a lot of talk that TDD isn't about testing, it's about design. Unit testing is also about design, so it's a perfect marriage.

Tuesday, March 9, 2010

TDD Tidbits: Red, Green, REFACTOR!!!

So I was a TDD noob at one point. I had to start somewhere, so I picked up some recommended reading, Kent Beck's Test-Driven Development: By Example. I'll be honest; I didn't like it. I actually returned the book. Was I stupid? No! It just didn't seem to add up.

To Mr. Beck's credit, I wasn't rejecting his book but rather the notion that I should put myself on a leash when I code and do "stupid" stuff like return static data to satisfy a test. I was feeling the discomfort a lot of developers feel when they first dive into this stuff. It's a whole different way of writing code. I would never fault anyone for having reactions like my own.

The idea of returning junk data to satisfy a test was probably the hardest thing for me to overcome. I spent many frustrating sessions trying to figure out when to stop returning dummy data, or why I was even doing it. It was frustrating knowing how to code things but having to hold myself back and stick with the TDD process, which won't let you wander off and start implementing all sorts of code.

Why the hell would I return dummy values in my production code? Why would I fake out my code at all? I could write code all day that would return dummy data. Hey look, my code returns zero! On the next unit test I can have it just return one and fake that out as well! What the hell is this proving? I'm not writing any meaningful code! I KNOW what the answer is. Why do I have to go about it this way? What a waste of time! *head explodes*

To speak to my frustrations, I didn't fully understand the process and I didn't (immediately) see the benefits. I was missing the important step of refactoring after the test passes. Sure, you may write a test that returns a static value of zero. Then you write another test that expects the code to behave a new way and return another value. At this point, feel free to refactor your guts out. You don't need permission to write code. Don't let the dogma of "only write code when you have a failing test" fool you. It comes in two stages: you write code to get the test to pass, and then you write code again to refactor! Your test passes, but that doesn't mean all hands have to come off the keyboard!

For example, here's a bucket. You put apples in the bucket and then you want to know how many apples you have. The first test and implementation may look like this.


public class BucketFixture
{
    [Test]
    public void Should_have_five_apples_in_bucket()
    {
        Bucket bucket = new Bucket();
        bucket.AddApples(5);

        bucket.TotalAppleCount.ShouldEqual(5);
    }
}

public class Bucket
{
    public void AddApples(int appleCount)
    {
    }

    public int TotalAppleCount
    {
        get
        {
            return 5;
        }
    }
}

Then you want another test to hammer out more of the behavior of this awesome bucket.


public class BucketFixture
{
    [Test]
    public void Should_have_five_apples_in_bucket()
    {
        Bucket bucket = new Bucket();
        bucket.AddApples(5);

        bucket.TotalAppleCount.ShouldEqual(5);
    }

    [Test]
    public void Should_have_no_apples_in_bucket()
    {
        Bucket bucket = new Bucket();

        bucket.TotalAppleCount.ShouldEqual(0);
    }
}

The new test breaks as expected but now you can't pass back dummy data. What code can you write to satisfy both tests?


public class Bucket
{
    int _totalAppleCount;

    public void AddApples(int appleCount)
    {
        _totalAppleCount += appleCount;
    }

    public int TotalAppleCount
    {
        get
        {
            return _totalAppleCount;
        }
    }
}

Huzzah! But I don't necessarily have to stop there. I can still refactor if my heart so desires. I don't need a failing test. What if I wanted to be super cool and use auto properties? Don't bother writing a new test for it. Just do it.


public class Bucket
{
    public void AddApples(int appleCount)
    {
        TotalAppleCount += appleCount;
    }

    public int TotalAppleCount
    {
        get; private set;
    }
}

No failing test required! But bear in mind this is refactoring; we aren't changing behavior, only structure. The code must continue to behave the same. The tests will verify that.

Another thing to point out: if the only behavior the bucket exhibited was that it had 5 apples, then the test and implementation stop right there. This is what's meant by doing the simplest thing to satisfy the requirements. Even when new behavior is added, you're taking the shortest route to functional without being a complete chimp about it. Add patterns where applicable and such.

The last point I'll make is that I now view my first test as probably the most important to my API. The first test figures out the names of my classes, interfaces, properties, methods and arguments. It's the first stab at the design of the class you're making. This is where I'm extra sensitive to the needs of the client code (typically called dog fooding). It's a big deal to me that my code demonstrates its intent and usage without the need for lengthy comments.


So there you go kids. Take advantage of the refactoring step! Do all the stuff that you want to do when you think that the TDD process has put you on a leash.

TDD Tidbits

I've been practicing TDD for two years, and when I take a step back and look at what it's done for me, I see I've learned a lot. I'm a far better coder because of it. Had I not picked it up, I doubt I'd be as well off. I began writing a post about it but it became monolithic; there's too much to be said. I decided to break it down into individual posts to make them easier to digest.

Just doing superficial searches on TDD in google, I don't see much in the way of individual experiences on TDD practices. I'm sure it's sprinkled about in blogs but most of what I find are the howto's, tutorials and videos surrounding it. I think what I plan to write is a mixture of tips, insight and experiences in TDD. I'm hoping it has value for the novice, the intermediate and the seasoned veteran. It's probably going to be in C# but I'll see if I can't do examples in other languages (feel free to comment/request one for all 3-4 of my possible readers).