Jeremy’s Fourth Law of TDD: Keep Your Tail Short
When you pull a class off the shelf, what else is coming with it? – Stuart Halloway
My First Foray into TDD Bombed
Move all of the persistence out into a Database Mapper, or just use NHibernate, so the core workflow logic doesn’t even know the database exists. Isolate the state machine code into a smaller class that basically only knows how to change its own state as a result of workflow state transitions, and maybe direct other services to perform actions. I would tie the whole thing together with some sort of Controller class that directs both the state machine class and the persistence service. The Controller would be tested with mock objects in place of the persistence and the actual workflow state machine logic.
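Here’s a minimal sketch of that split. All of the names (Workflow, IWorkflowPersistence, WorkflowController) are hypothetical stand-ins of mine, not the original system’s code:

```csharp
using NUnit.Framework;

// The state machine piece: it only knows how to change its own state.
public class Workflow
{
    public string State { get; private set; } = "New";

    public void Transition(string nextState)
    {
        // Real transition rules (legal moves, guards) would live here.
        State = nextState;
    }
}

// The persistence concern, hidden behind an interface so the Controller
// never knows whether NHibernate or raw ADO.NET is underneath.
public interface IWorkflowPersistence
{
    void Save(Workflow workflow);
}

// The Controller directs both the state machine and the persistence service.
public class WorkflowController
{
    private readonly IWorkflowPersistence _persistence;

    public WorkflowController(IWorkflowPersistence persistence)
    {
        _persistence = persistence;
    }

    public void Approve(Workflow workflow)
    {
        workflow.Transition("Approved");
        _persistence.Save(workflow);
    }
}

// A hand-rolled mock stands in for the database in the Controller test.
public class RecordingPersistence : IWorkflowPersistence
{
    public Workflow Saved;
    public void Save(Workflow workflow) => Saved = workflow;
}

[TestFixture]
public class WorkflowControllerTester
{
    [Test]
    public void approving_transitions_state_and_saves()
    {
        var persistence = new RecordingPersistence();
        var controller = new WorkflowController(persistence);
        var workflow = new Workflow();

        controller.Approve(workflow);

        Assert.AreEqual("Approved", workflow.State);
        Assert.AreSame(workflow, persistence.Saved);
    }
}
```

The payoff is in the test: the database never has to exist for the Controller’s coordination logic to be verified.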
Aerosmith and Roadies
A key concept in Responsibility Driven Design is to think of classes in terms of stereotypes.
Reasons to limit the responsibility of the analytics engine:
- The analytics code is potentially very complex.
- The analytics code has to be easy to test, and the easiest possible class (or cluster of classes) to test is one that takes in data through its API and returns a result without calling out into anything else. … In the automated tests, I can just set up the market and trade data in memory, run it through the analytical engine, and check the results, again in memory (see the sketch after this list).
- The analytics code will change over time as the trading gurus tweak their trading algorithms. The underlying data source and the downstream systems probably won’t change at the same time.
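To make that second point concrete, here’s a hedged sketch of what one of those purely in-memory tests might look like. Trade, AnalyticsEngine, and TotalExposure are hypothetical names of mine; the real analytics are far more involved:

```csharp
using System.Collections.Generic;
using System.Linq;
using NUnit.Framework;

public record Trade(string Symbol, decimal Price, int Quantity);

public class AnalyticsEngine
{
    // Data in through the API, result out; no calls to anything else.
    public decimal TotalExposure(IEnumerable<Trade> trades) =>
        trades.Sum(t => t.Price * t.Quantity);
}

[TestFixture]
public class AnalyticsEngineTester
{
    [Test]
    public void calculates_total_exposure_entirely_in_memory()
    {
        // Set up the trade data in memory...
        var trades = new[]
        {
            new Trade("MSFT", 30m, 100),
            new Trade("IBM", 80m, 50)
        };

        // ...run it through the engine, and check the result in memory.
        Assert.AreEqual(7000m, new AnalyticsEngine().TotalExposure(trades));
    }
}
```

No database, no downstream system, no setup scripts: the whole feedback loop runs inside the test runner.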
What we gain from the Aerosmith/Roadie/Class Stereotype exercise in designing the software:
- Applying the concept of class stereotypes to the large problem quickly suggests a division of responsibilities within the larger trade analysis subsystem: classic “divide and conquer.”
- The analytical code has a short tail and can be tested in isolation
- The data provider is only that, a data provider. We can test the data provider in isolation from the analytics.
- Because the analytical code has no tight dependency on the data source, we can potentially use the analytics engine in a completely new context.
Can you get there from here?
- The unit tests required less mechanical setup work and therefore unit testing was faster.
- The unit tests were easier to understand because there was less code noise from all of the test data setup. That’s an important quality because unit tests serve an important secondary role as low level API documentation.
Test by Measuring What You are Trying to Test!
Separate the responsibilities for the user display and the business logic into different classes or subsystems.
The user interface and business logic are likely to change at different times.
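One common shape for that separation is a small presenter/view split, so each piece can be measured by what it is actually responsible for. This is a sketch with hypothetical names (IInvoiceView, InvoiceCalculator, InvoicePresenter), not the one true pattern:

```csharp
public interface IInvoiceView
{
    // The display responsibility, and nothing else.
    void ShowTotal(decimal total);
}

public class InvoiceCalculator
{
    // Pure business logic: measured directly by asserting on numbers.
    public decimal Total(decimal subtotal, decimal taxRate) =>
        subtotal * (1 + taxRate);
}

public class InvoicePresenter
{
    private readonly IInvoiceView _view;
    private readonly InvoiceCalculator _calculator = new InvoiceCalculator();

    public InvoicePresenter(IInvoiceView view)
    {
        _view = view;
    }

    // Measured by checking what was pushed to a stubbed view,
    // never by scraping the screen.
    public void Display(decimal subtotal, decimal taxRate)
    {
        _view.ShowTotal(_calculator.Total(subtotal, taxRate));
    }
}
```

You test the calculator by checking the math and the presenter by checking what it hands to a stubbed view; the screen itself never enters into it.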
Isolate the Churn
Some elements of your codebase are going to change much more frequently than the rest of the system. Some modules may need a very large number of testing permutations to fully cover all of the input possibilities. In either case it’s very advantageous to isolate these areas of your code from everything else. It’s smart to optimize the mechanics of testing for these modules by having a quick path to create test inputs and measure the outcome without involving any other piece of code.
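One way to get that quick path for test inputs is a test data builder: every field defaults to something valid, so a test only spells out the one value it cares about. PricingRequest and PricingRequestBuilder below are hypothetical names for illustration:

```csharp
public record PricingRequest(string Symbol, int Quantity, decimal Price);

public class PricingRequestBuilder
{
    // Sensible defaults keep irrelevant details out of the tests.
    private string _symbol = "MSFT";
    private int _quantity = 100;
    private decimal _price = 25m;

    public PricingRequestBuilder WithQuantity(int quantity)
    {
        _quantity = quantity;
        return this;
    }

    public PricingRequestBuilder WithPrice(decimal price)
    {
        _price = price;
        return this;
    }

    public PricingRequest Build() =>
        new PricingRequest(_symbol, _quantity, _price);
}

// Usage in a test: only the value under test is stated explicitly.
// var request = new PricingRequestBuilder().WithQuantity(500).Build();
```

When a module needs dozens of input permutations, shaving the setup down to one line per permutation is what makes covering them all practical.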
Whenever you can, keep the Database on the Sidelines
- Database access in tests will make the tests run slower than tests that run completely within an AppDomain. Automated test execution time is a big deal, worthy of serious design consideration.
- You have to make sure that dependent data is loaded first for referential integrity.
- You have to supply some data to the database for not-nullable fields that isn’t relevant to the test.
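The usual escape hatch is to put a gateway interface between the business logic and the database and give the tests an in-memory stand-in. A minimal sketch, with hypothetical names (Customer, ICustomerRepository, InMemoryCustomerRepository):

```csharp
using System.Collections.Generic;

public record Customer(int Id, string Name);

// Business logic depends on this interface, never on the database itself.
public interface ICustomerRepository
{
    Customer Find(int id);
}

// The in-memory stand-in used by tests: no dependent rows to insert
// first, no irrelevant not-nullable columns to fill in, and no round
// trips out of the AppDomain.
public class InMemoryCustomerRepository : ICustomerRepository
{
    private readonly Dictionary<int, Customer> _customers =
        new Dictionary<int, Customer>();

    public void Add(Customer customer)
    {
        _customers[customer.Id] = customer;
    }

    public Customer Find(int id)
    {
        return _customers[id];
    }
}
```

The real implementation backed by the database still gets tested, but in a handful of focused integration tests rather than in every unit test that happens to touch a customer.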
Nothing in this section should be interpreted to mean you shouldn’t use referential integrity checks in the database.
Conclusion
Sometimes the fastest way to get code working is to write code in more, smaller pieces.
Software design and construction is all about divide and conquer.