Three levels of TDD

Introduction

I’ve been using the TDD technique for a few years, most of the time with satisfactory results. But it wasn’t an easy journey; it was a trip full of ups and downs. During this period my thinking about TDD has changed dramatically. Or maybe I have changed my perception of testing and software development during this time? Indeed, I have.

In his book “Test Driven: TDD and Acceptance TDD for Java Developers”, Lasse Koskela wrote that “TDD is a technique that evolves together with the practitioner.” In this blog post, I would like to describe my own evolution in this matter.

Red Green Refactor Circle

Level 1 - Fundamentals

You begin your journey with TDD. When you are new to something, you want to follow the rules strictly. One of them is the TDD cycle: “RED, GREEN, REFACTOR”. You have also heard about the three laws of TDD defined by Uncle Bob:

  • You are not allowed to write any production code unless it is to make a failing unit test pass.
  • You are not allowed to write any more of a unit test than is sufficient to fail, and compilation failures are failures.
  • You are not allowed to write any more production code than is sufficient to pass the one failing unit test.
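
To make the cycle concrete, here is a minimal sketch of one RED-GREEN iteration in Java with JUnit. PriceCalculator and its API are invented purely for illustration:

One RED-GREEN iteration
// RED: written first, this test fails because applyDiscount does not exist yet.
@Test
public void appliesTenPercentDiscount() {
    PriceCalculator calculator = new PriceCalculator();
    assertEquals(new BigDecimal("90.00"),
            calculator.applyDiscount(new BigDecimal("100.00")));
}

// GREEN: the simplest production code that makes the test pass.
// REFACTOR would follow, with the test as a safety net.
public BigDecimal applyDiscount(BigDecimal price) {
    return price.multiply(new BigDecimal("0.90"))
            .setScale(2, RoundingMode.HALF_UP);
}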

You are very confused about TDD, because all the examples you can find relate to mathematical or algorithmic problems, while in your daily job you are obliged to write features and talk to a database and other systems. You are probably struggling with complex dependencies; maybe you have to use mocks.

But finally, after some practice, you start to see the benefits:

  • You’ve noticed a short feedback loop. As soon as you complete your implementation, you can launch the tests to verify the correctness of your code.
  • Test code coverage gets higher. No matter how you measure it, it will be higher.
  • Regression is not a problem, because when you break previous functionality during refactoring, you will know it instantly.

Level 2 - Requirements

Task lists

Task lists work perfectly for me. When I implement a business requirement, each small step and each corner case is represented by one task on my task list.

Then I write one test for each task; I often use parameterized tests to extend the tests quickly. After a few TDD cycles, the task is completed, and I can move on.

But sometimes new system requirements appear during my work, often because the domain is so complicated that it’s hard to predict all the functionality up front. There is a big temptation to handle them now, during the work on the current task, but that is dangerous. By doing it, you can lose focus on your current goal.

I’ve developed the habit of adding such a new requirement as a new task to my task list and completing it after the current one. This way you gain some time to think about the need and to decide whether the functionality is essential at all.

BDD

One day, you will discover Behaviour Driven Development. For example, look at this specification:

Scenario: Customer has a broker policy so DOB is requested
  Given I have a "Broker" policy
  When I submit my policy number
  Then I should be asked for my date of birth

It is a very well written test scenario. Moreover, it is executable: the text can be run with a tool called Cucumber. You don’t have to use it, though. You can use a standard test framework and write your tests with fluent test libraries, or you can build your own fluent API for tests if needed.
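
For instance, the scenario above could be approximated in plain JUnit with a given/when/then layout. This is only a sketch; the Policy, policies and SubmissionResult names are invented for illustration:

Scenario as a fluent JUnit test
@Test
public void customerWithBrokerPolicyIsAskedForDateOfBirth() {
    // Given I have a "Broker" policy
    Policy policy = policies.create("Broker");
    // When I submit my policy number
    SubmissionResult result = policies.submit(policy.getNumber());
    // Then I should be asked for my date of birth
    assertTrue(result.requiresDateOfBirth());
}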

Start writing tests that not only check your code but also serve as valuable documentation for your system.

Level 3 - Understanding

Show me your tests and I will tell you everything about your code.

TDD can also mean “Test Driven Design”. When you start thinking about it this way, your main reason for writing tests is to be able to refactor and re-engineer your codebase freely. For me, this is the highest value you can get from TDD. How do you achieve it? Try not to “cement” your code: test interfaces or facades, not the nuts and bolts of the implementation.

How do you check whether your tests are right? Remove the production code and try to rebuild it in a different way, based only on the tests.
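
For example, a test that exercises a facade survives internal refactoring. A sketch with invented names:

Testing through a facade
// The test pins down observable behaviour through the public facade,
// so validators, repositories and mappers behind it can be reshaped freely.
@Test
public void placedOrderIsPaid() {
    OrderFacade orders = new OrderFacade(new InMemoryPayments());
    Receipt receipt = orders.place(new Order("book", 1));
    assertTrue(receipt.isPaid());
}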

Summary

In this article, I presented the fundamental rules of TDD and discussed the topic of requirements. In the end, I told you about Test Driven Design, which for me is the most valuable part of this technique. I hope that your understanding of TDD will improve and that you will start writing better tests and better systems.

I gave a talk about TDD. The slides are available at tdd.lewandowski.io


Fish shell - Load environment variables from file

In many places, you can find environment files with the following structure:

FOO1=BAR1
FOO2=BAR2

When you try to evaluate this file with the source command, the fish shell reports an error:

$ source web.env
Unsupported use of '='. In fish, please use 'set FOO1 BAR1'.

This is very annoying, so I’ve decided to write a function that reads such a file and loads the variables. Here is how you can use it:

$ posix-source web.env
$ echo $FOO1
BAR1

Here is the source code of the function. Enjoy:

$ cat .config/fish/functions/posix-source.fish
function posix-source
    # Read the file line by line; each line has the form NAME=VALUE.
    for i in (cat $argv)
        # Split the line on '=' into a two-element list.
        set arr (echo $i | tr = \n)
        # Export the variable globally.
        set -gx $arr[1] $arr[2]
    end
end


Lessons learned from Decision Maker

In the past few weeks, I’ve read Getting Things Done, Technical Leadership, an Elon Musk biography and The Decision Maker.

Each of these books was good, but “The Decision Maker” is a game changer, and I can’t stop thinking about it. It was definitely worth reading. I’ve decided to write a short book review and note down the most important things I’ve learned from it.

Review

This book is a story about a company and its new owners, who left the corporate world and decided to build a great place to work. It is full of dialogues, issues, and situations.

By observing those scenes, the author presents ideas and values that matter when you lead a team or a company.

Is this book only for managers or bosses? Certainly not. If you work with other people or deal with non-trivial tasks, this book is for you. For me, it is a fitting supplement to any “Agile” book.

The blueprint presented in this book is a good starting point for setting up a company culture.

The story did not take place in reality. Each scene looks genuine, but as a whole it seems artificial, like a romance from the 90s where you know they will live happily ever after.

Lessons Learned

Spoiler Alert!

In the following section, some elements of the book are about to be revealed.

People

To begin with, you have to change your thinking about other people.

People:

  • are unique,
  • are creative,
  • are able to learn,
  • have different strong points,
  • have different needs,
  • like a challenge,
  • are capable of changing the environment,
  • are capable of making a contribution,
  • can be trusted.

In some people, you can see those qualities right away. In others, they are hidden, and you have to unlock them.

But there is always somebody who disagrees with this, and it is important to remember that. Do you see the similarities to Theory X and Theory Y employees?

Decision Maker

Secondly, you have to choose the Decision Maker: the person who makes the decision. How do you find them? It is simple.

The Decision Maker is the person who is closest to the action. Bosses and leaders are often not deeply familiar with the situation; team members are usually much closer to the problem.

The Decision Maker has to be capable of listening to and understanding other people. Making a decision is a process in which you have to talk and listen to others.

The Decision Maker should be aware of what is going on. Awareness of facts and consequences is crucial. If the person does not have the basic data for making decisions, such as the company’s current financial status, you are responsible for unlocking that data.

Wisdom and knowledge are desirable qualities of that person.

It is the leader’s job to choose the Decision Maker. The leader should also observe and monitor the Decision Maker to see whether they make good decisions. If not, it is the leader who should act.

Results of making decisions

It turns out that your employees’ decisions are often as good as or even better than yours can ever be.

People who are allowed to make the decision feel ownership; because of that, they will do everything they can to make the best possible decision.

Advisory process

The purpose of the advisory process is to look for a wider perspective.
The Decision Maker should ask at least a few people what they think about the decision.
He or she should ask:

  • team members,
  • other people with experience,
  • subordinates and superiors,
  • anyone who can help.

But it is the Decision Maker who makes the final call.

Silver bullet

The decision-maker process is not a silver bullet. It is only one tool or technique, and the bigger picture is not directly visible in the book.

Between the lines, you can see many behaviours and dialogues that will look familiar from “Teal Organizations”. If your organization is not ready, the decision-maker process is definitely not the road to follow.


Formatting Java Time with Spring Boot using JSON

The aim of this post is to summarize and review ways of formatting Java Time objects using Spring Boot and the Jackson library.

This post is organized into five steps. Each step represents one aspect of the issue and corresponds to one commit in the example project repository.

Step 0 - Prerequisites

Versions and dependencies

This tutorial is based on Spring Boot version 1.3.1.RELEASE with spring-boot-starter-web. It uses jackson-datatype-jsr310 from com.fasterxml.jackson.datatype in version 2.6.4, which is the default version for this Spring Boot release. All of this is based on Java 8.

The Code

In the example code repository, you can find one HTTP service made with Spring Boot. This service exposes a GET operation, which returns a class with Java Time objects.
You can also find an integration test that deserializes the response.
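
The endpoint itself can be as simple as the following outline. The repository contains the real implementation; this sketch uses an invented controller name:

Hypothetical controller sketch
@RestController
public class ClockController {

    @RequestMapping("/clock")
    public Clock clock() {
        return new Clock();  // Java Time fields pre-initialized in the constructor
    }
}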

Step 1 - The goal

I would like to return a class called Clock, containing a LocalDate, a LocalTime and a LocalDateTime, pre-initialized in the constructor.

Clock - Service response class
public final class Clock {
    private final LocalDate localDate;
    private final LocalTime localTime;
    private final LocalDateTime localDateTime;
    ...
}

By default, the response class is serialized to a JSON map. To some extent this is correct, but ISO-formatted strings in the response are preferable.

LocalDate - response as JSON Map
{
    "localDate":{
        "year":2016,
        "month":"JANUARY",
        "era":"CE",
        "dayOfYear":1,
        "dayOfWeek":"FRIDAY",
        "leapYear":true,
        "dayOfMonth":1,
        "monthValue":1,
        "chronology":{
            "id":"ISO",
            "calendarType":"iso8601"
        }
    }
}

Integration testing is an appropriate way to test our functionality.

Example of integration test
ResponseEntity<Clock> resp = sut.getForEntity("http://localhost:8080/clock", Clock.class);
assertEquals(OK, resp.getStatusCode());
assertEquals(c.getLocalDate(), resp.getBody().getLocalDate());
assertEquals(c.getLocalTime(), resp.getBody().getLocalTime());
assertEquals(c.getLocalDateTime(), resp.getBody().getLocalDateTime());

Unfortunately, the tests do not pass because of deserialization problems. An exception is thrown with the message: can not instantiate from JSON object.

Step 2 - Adds serialization

First things first: we have to add the JSR-310 module. It is a datatype module that makes Jackson recognize Java 8 Date & Time API data types.

Note that in this example the jackson-datatype-jsr310 version is inherited from the spring-boot-dependencies dependency management.

Dependency in pom.xml
<dependency>
    <groupId>com.fasterxml.jackson.datatype</groupId>
    <artifactId>jackson-datatype-jsr310</artifactId>
</dependency>
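
Spring Boot picks the module up automatically once it is on the classpath. Outside of Boot’s auto-configuration, the module would have to be registered by hand; a minimal sketch:

Manual module registration
ObjectMapper mapper = new ObjectMapper();
// JavaTimeModule comes from jackson-datatype-jsr310 (2.6+).
mapper.registerModule(new JavaTimeModule());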

The response is now consistent, but still not perfect. Dates are serialized as numbers and arrays:
Dates serialized to numbers and integers
{
    "version":2,
    "localDate":[
        2016,
        1,
        1
    ],
    "localTime":[
        10,
        24
    ],
    "localDateTime":[
        2016,
        1,
        1,
        10,
        24
    ],
    "zonedDateTime":1451640240.000000000
}

We are one step closer to our goal. The tests pass now, because this format can be deserialized without any additional deserializers.
How do I know?
Start the application server on commit Step 2 - Adds Object Mapper, then check out Step 1 - Introduce types and problems, and run the integration tests without the @WebIntegrationTest annotation.

Step 3 - Enables ISO formatting

ISO 8601 is a formatting standard; I’ve found it in many projects. We are going to enable and use it.
Edit the Spring Boot properties file application.properties and add the following line:

application.properties file - disabling timestamps write
spring.jackson.serialization.WRITE_DATES_AS_TIMESTAMPS=false
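
For reference, the same switch can be flipped on a hand-built ObjectMapper instead of via the properties file; a sketch, assuming you configure the mapper yourself:

Programmatic alternative
ObjectMapper mapper = new ObjectMapper();
mapper.registerModule(new JavaTimeModule());
// Equivalent to the spring.jackson.serialization property above.
mapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);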

Now, the response is what I expected:

Dates serialized as Strings
{
    "version":2,
    "localDate":"2016-01-01",
    "localTime":"10:24",
    "localDateTime":"2016-01-01T10:24",
    "zonedDateTime":"2016-01-01T10:24:00+01:00"
}

Step 4 - Adds on demand formatting pattern

Imagine that one of your client systems is not capable of formatting time: it may be a primitive device, or a microservice that treats the date as a plain collection of characters. That is why special formatting is required.

We can change the formatting in the response class by adding the @JsonFormat annotation with a pattern parameter. Standard DateTimeFormatter pattern rules apply.

Using @JsonFormat annotation
@JsonFormat(pattern = "dd::MM::yyyy")
private final LocalDate localDate;

@JsonFormat(pattern = "KK:mm a")
private final LocalTime localTime;

Below is the service response using the custom @JsonFormat patterns:

Custom response style
{
    "version":2,
    "localDate":"01::01::2016",
    "localTime":"10:24 AM",
    "localDateTime":"2016-01-01T10:24",
    "zonedDateTime":"2016-01-01T10:24:00+01:00"
}

Our tests still pass. This means that the pattern is used for serialization in the service and for deserialization in the tests.

Step 5 - Globally changes formatting

There are situations where you have to abandon ISO 8601 formatting in your whole application and apply a custom standard.

In this part, we will redefine the format pattern for LocalDate. This will change the formatting of LocalDate in every endpoint of your API.

We have to define:

  • DateTimeFormatter with our pattern.
  • Serializer using defined pattern.
  • Deserializer using defined pattern.
  • ObjectMapper bean with custom serializer and deserializer.
  • RestTemplate that uses our ObjectMapper.

The ObjectMapper bean is annotated with @Primary to override the default configuration.
My custom pattern for LocalDate is dd::MM::yyyy.

Object mapper bean with custom pattern
public static final DateTimeFormatter FORMATTER = ofPattern("dd::MM::yyyy");

@Bean
@Primary
public ObjectMapper serializingObjectMapper() {
    ObjectMapper objectMapper = new ObjectMapper();
    JavaTimeModule javaTimeModule = new JavaTimeModule();
    javaTimeModule.addSerializer(LocalDate.class, new LocalDateSerializer());
    javaTimeModule.addDeserializer(LocalDate.class, new LocalDateDeserializer());
    objectMapper.registerModule(javaTimeModule);
    return objectMapper;
}

The definitions of the serializer and deserializer for LocalDate:

Custom serializer and deserializer
public class LocalDateSerializer extends JsonSerializer<LocalDate> {
    @Override
    public void serialize(LocalDate value, JsonGenerator gen, SerializerProvider serializers) throws IOException {
        gen.writeString(value.format(FORMATTER));
    }
}

public class LocalDateDeserializer extends JsonDeserializer<LocalDate> {
    @Override
    public LocalDate deserialize(JsonParser p, DeserializationContext ctxt) throws IOException {
        return LocalDate.parse(p.getValueAsString(), FORMATTER);
    }
}

Now, the response is formatted with our custom pattern:
Formatted response
{
    "localDate":"01::01::2016"
}

Tests

When we define a custom serializer, our tests start to fail, because the RestTemplate knows nothing about our deserializer. We have to create a custom RestTemplateFactory that builds a RestTemplate with an object mapper containing our deserializer.

Custom RestTemplateFactory
@Configuration
public class RestTemplateFactory {

    @Autowired
    private ObjectMapper objectMapper;

    @Bean
    public RestTemplate createRestTemplate() {
        RestTemplate restTemplate = new RestTemplate();
        List<HttpMessageConverter<?>> converters = new ArrayList<>();
        MappingJackson2HttpMessageConverter jsonConverter = new MappingJackson2HttpMessageConverter();
        jsonConverter.setObjectMapper(objectMapper);
        converters.add(jsonConverter);
        restTemplate.setMessageConverters(converters);
        return restTemplate;
    }
}

Conclusion

Custom formatting of dates is relatively simple, but you have to know how to set it up. Luckily, Jackson works smoothly with Spring. If you know other ways of solving this problem, or you have other observations, please comment and let me know.


Clojure - Fascinated, Disappointed, Astonished

I had the pleasure of working with Piotrek Jagielski for about two weeks on a Clojure project. I’ve learned a lot, but there is still a lot about Clojure left for me to learn. In this post I’ll describe what fascinated, disappointed and astonished me about this programming language.

Clojure & IntelliJ IDEA tips

Before you start your journey with Clojure:

  • Use the Cursive plugin for IntelliJ IDEA. In the ‘14 edition it was not in the standard plugin repository (remove the La Clojure plugin and add the Cursive repository manually). In IDEA ‘15 it is in the repository.
  • Colored brackets help me a lot. You can find the configuration for colored brackets on Misophistful’s GitHub.

Fascinated

Syntax

For many people, Clojure’s brackets are a reason to laugh. Jokes like “How many brackets did you write today?” were funny at first.
I have to admit that in the beginning, using the brackets was not easy for me. Once I realized that the brackets are just on the other side of the function name, everything became simple and I could code very fast.
After a few days I realized that this bracket structure forces me to think more about the structure of the code. As a result, the code gets refactored and divided into small functions.
Clojure forces you to use good programming habits.

Data structure is your code

Clojure is homoiconic, which means that Clojure programs are represented by Clojure data structures. When you read Clojure code, you see lists, maps and vectors. How cool is that! You only have to know a few things and you can code.

Do not restart your JVM

Because Clojure code is represented as data structures, you can pass a data structure (a program) to a running JVM. Furthermore, compiling your code to bytecode (classes, jars) may be eliminated.

For example, when you want to test something, you are not obliged to start a new JVM to run the tests. Instead, you can just synchronize your working file with the running REPL and run the function.

The traditional way of working with the JVM is obsolete.

REPL code synchronization

In the picture above, you can see an editor on the left and a running REPL on the right.

You can run tests the same way, which is extremely fast. In our project we had ~80 tests; executing them all took about one second.

Easy to read

Simplicity is the ultimate sophistication.

Leonardo da Vinci

After getting familiar with the language, it was really easy to read the code. Of course, I was not aware of everything that was happening under the hood, but the consistency of the written program evoked a sense of control.

Disappointed

Data structure is your code

When a data structure is your code, you need some additional operators to write effective programs. You should get to know operators like ‘->>’, ‘->’, ‘let’, ‘letfn’, ‘do’, ‘if’, ‘recur’ …

Even if there is good documentation (e.g. for let), you have to spend some time analyzing it and trying out examples.

As time goes on, new operators will be developed, which may lead to multiple Clojure dialects. I can imagine teams (in the same company) using different sets of operators and dealing with the same problems in different ways. It is not good to have too many tools. Nevertheless, this is just my suspicion.

Know what you do

I wrote a function that rounds numbers. Even though this function was simple, I wanted to write a test, because I was not sure whether I had used the API correctly. Here is the test function:

Rounding test
(let [result (fixture/round 8.211M)]
  (is (= 8.21M result)))

Unfortunately, the test did not pass, and this is the only message I received:
Error output
:error-while-loading pl.package.calc-test
NullPointerException [trace missing]
(pst)
NullPointerException

Great. There is nothing better than a good exception message. I spent a lot of time trying to solve this, and the solution turned out to be extremely simple:
my function was defined with defn- instead of defn. defn- means private scope, so the test code could not access the tested function.

Do not trust assertions

Assertions can be misleading. When the tested code does not work properly and returns wrong results, the error messages look like this:

Assertions problems
ERROR in math-test/math-operation-test (RT.java:528)
should round using half up
expected: (= 8.31M result)
actual: java.lang.IllegalArgumentException: Don't know how to create ISeq from: java.math.BigDecimal

I didn’t have time to investigate it, but in my opinion it should work out of the box.

Summary

It is only a matter of time until the tools get better. For now, those problems slow you down, and they are not nice to work with.

Astonished

Clojure’s concurrency model impressed me. Until then, I knew only the standard Java synchronization model and the Scala actor model. I had never thought that concurrency problems could be solved in a different way. I will explain the Clojure approach to concurrency in detail.

Normal variables

The closest Clojure analogy to variables are vars, which can be created with def.

Vars
(defn a01 []
  (def amount 10)
  (def amount 100)
  (println amount))

We also have local variables, which exist only within a let scope. If we re-define the value of amount, the change takes place only in the local context.

Lets
(defn a02 []
  (let [amount 10]
    (let [amount 100]
      (println amount))
    (println amount)))

The following will print:

Lets output
100
10

Nothing unusual. We might expect this behavior.

Concurrent access variables

The whole idea of concurrent access variables can be expressed in one sentence: Refs ensure safe shared access to variables via STM, where mutation can occur only inside a transaction.
Let me explain it step by step.

What are Refs?

Refs (references) are a special type that holds references to your objects. As you can expect, the basic things you can do with them are storing and reading values.

What is STM?

STM stands for Software Transactional Memory. STM is an alternative to lock-based synchronization. If you like theory, continue with Wikipedia; otherwise, keep reading to see examples.

Using Refs

Refs reads
(defn a03 []
  (def amount (ref 10))
  (println @amount))

In the second line, we create a reference named amount; its current value is 10.
In the third line, we read the value of the reference called amount. The printed result is 10.

Modifying Refs without transaction

Refs writes without transaction
(defn a04 []
  (def amount (ref 10))
  (ref-set amount 100)
  (println @amount))

Using the ref-set command, we try to modify the value of the reference amount to 100. But it won’t work; instead, we get an exception:

Exception
IllegalStateException No transaction running  clojure.lang.LockingTransaction.getEx (LockingTransaction.java:208)

Using transaction

Refs writes with transaction
(defn a05 []
  (def amount (ref 10))
  (dosync (ref-set amount 100))
  (println @amount))

To make the modification work, we have to use the dosync operation. It creates a transaction, and only then is the referenced value changed.

Complete example

The aim of the previous examples was to get familiar with the new operators and the basic behavior.
Below, I’ve prepared an example that illustrates the nuts and bolts of STM: transactions and rollbacks.

The problem

Imagine we have two references for holding data:

  • source-vector containing three elements: “A”, “B” and “C”.
  • an empty destination-vector.

Our goal is to copy the whole source vector to the destination vector. Unfortunately, we can only use a function that copies elements one by one: copy-vector.

Moreover, we have three threads that will do the copying. The threads are started by the future function.

Keep in mind that this is probably not the best way to copy vectors, but it illustrates how STM works.

Refs writes with transaction
(defn copy-vector [source destination]
  (dosync
    (let [head (take 1 @source)
          tail (drop 1 @source)
          conj (concat head @destination)]
      (do
        (println "Trying to write destination ... ")
        (ref-set destination conj)
        (println "Trying to write source ... ")
        (ref-set source tail)
        (println "Successful write " @destination)))))

(defn a06 []
  (let [source-vector (ref ["A" "B" "C"]) destination-vector (ref [])]
    (do
      (future (copy-vector source-vector destination-vector))
      (future (copy-vector source-vector destination-vector))
      (future (copy-vector source-vector destination-vector))
      (Thread/sleep 500)
      @destination-vector)))

Execution

Below is the output of this function. We can clearly see that the result is correct: the destination vector has three elements. Between the Successful write messages there are many messages starting with Trying to write.
What does that mean? Rollbacks and retries occurred.

Printed messages
(l/a06)
Trying to write destination ...
Trying to write source ...
Trying to write destination ...
Trying to write destination ...
Successful write (A)
Trying to write destination ...
Trying to write destination ...
Trying to write source ...
Successful write (B A)
Trying to write destination ...
Trying to write source ...
Successful write (C B A)
=> ("C" "B" "A")

Rollback

Each thread started to copy the vector, but only one succeeded. The remaining two threads had to roll back their work and try again.

Software Transactional Memory

When thread A (the red one) wants to write the variable, it notices that the value has been changed by someone else: a conflict occurs. As a result, it abandons its current work and retries the whole dosync section. It will keep retrying until every write operation succeeds.

Pros and cons of STM

Cons:

  • Everything that happens in a dosync section has to be pure, without side effects. For example, you cannot send an email from it, because you might end up sending 10 emails instead of one.
  • From a performance perspective, it makes sense when you read from Refs a lot but write to them rarely.

Pros:

  • The code is easy to read, understand and modify.
  • Refs and transactions are part of the Clojure standard library, so you can use them from vanilla Java, as in the sketch below. Take a look at this blog post for more examples.
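
A minimal sketch of using Clojure’s STM from plain Java, assuming the Clojure jar is on the classpath:

Clojure STM from Java
import clojure.lang.LockingTransaction;
import clojure.lang.Ref;
import java.util.concurrent.Callable;

public class StmExample {
    public static void main(String[] args) throws Exception {
        Ref amount = new Ref(10);
        // Writes must happen inside a transaction, just like dosync in Clojure.
        LockingTransaction.runInTransaction((Callable<Object>) () -> amount.set(100));
        System.out.println(amount.deref());  // prints 100
    }
}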

Summary

There is a lot that Java developers can gain from Clojure. They can learn how to approach code and how to express a problem in code. They can also discover tools like STM.

If you want to develop your skills, you should definitely experiment with Clojure.