Does TDD really matter?

Test-Driven Development (TDD) has been around for quite a while now, and these days it almost works like a buzzword in developers’ resumes.

Many decent companies around the world pay close attention to the TDD skills, experience, and, more importantly, the TDD mindset of their job applicants during recruitment. Sometimes a candidate’s TDD adherence matters even more than any other skill, such as knowing the latest frameworks and technologies. But why is that? Does TDD really matter in real-world projects, or is it just another buzzword in developer recruitment?


In this article I’m not going to write yet another introduction to TDD, as there are already more than enough of those. What I’m going to focus on here is the difference it makes to software design, code quality, and a developer’s level of expertise.

Let’s have a quick look at TDD to see what it’s about, and then we’ll get back to our main point:

By definition, Test-Driven Development (TDD) is a software development process that relies on the repetition of a short development cycle. Each cycle consists of the following steps:

  • Write a test for the next bit of functionality
  • Run the tests and check that the new test fails
  • Write just enough code to make the tests pass
  • Run the tests and check that they all pass
  • Refactor the code if needed
  • Repeat the cycle for the next requirement


First of all, when we talk about tests here, we mean automated tests as opposed to manual end-to-end tests. In other words, we’re writing code to test the actual code!

Generally speaking, we’ve got the following three types of automated tests (specifically functional tests):

  • Unit testing: tests one and only one method (function), isolated from all of the method’s dependencies (such as the database, files, network resources, etc.)
  • Integration testing: tests a combination of methods and components to see if they work properly when integrated with each other.
  • End-to-end (E2E) automated testing: drives the whole application, using tools such as Selenium or Protractor (for AngularJS).

In TDD we could have all three; traditionally though, when we talk about automated tests we mean unit and integration tests.

At first, as a developer, it doesn’t make sense to write code that tests something which doesn’t actually exist. But that’s one of the main points: the tests represent the specs we are about to implement. Every component in our software is there because we expect it to do something. So before writing code in a TDD manner, we need to be clear about the specifications (expected behavior) of the system. If we have clear specs, we can write exactly the code that’s needed.

So let’s highlight the first point of TDD: be clear about the software module’s specs. Why is it important? To answer this question you need to have experienced working in both TDD and non-TDD teams. Well… I can bet you on this: if you do TDD for a while and then go back to a non-TDD environment, you can feel the chaos, the rework, and the ping-pong game between the Business Analysts (BAs), QA (the end-to-end testing team), and developers. It’s a bloody vicious and tedious game.

In the future I’ll be writing about BDD (Behavior-Driven Development), which is a kind of evolution of TDD in terms of having clear specs; I’m not going to go into more detail on that topic here though.

When you write the tests and then the code, you are actually guarding the code’s health and integrity with your unit tests. This means that if you develop a component and another developer later changes it to add a new feature, and their code breaks some of your functionality, they will notice the problem as soon as possible and be able to fix it easily and briskly. Even if they are careless and don’t run the tests before pushing to the code repository, a CI (Continuous Integration) server will send everyone on the team a notification email indicating that the change has broken the build and caused a test to fail.

So another benefit we get from TDD is: protecting our code from breaking changes and enabling the team to find bugs at the very moment they are introduced. Remember though, that we need code review plus Continuous Integration (CI) in conjunction with unit testing and integration testing to achieve a robust mechanism for protecting the code against breaking changes.

Note that code review plays a crucial role in this process. For instance, if a developer changes the code and the tests in such a way that the tests are wrong but still pass with the wrong code, then we’re screwed! :))) Therefore we can say that reviewing tests is even more important than reviewing the actual code! (I’ll dedicate separate articles to code review and CI/CD quite soon.)

To me, the most important plus of doing TDD is its impact on the design and quality of the code. Writing code in a TDD fashion means writing testable code, which requires a different style of software design. At first, testability might seem trivial, but quite a lot of value is buried behind it.

Testable code tends to adhere to the SOLID design principles, which are fundamental object-oriented design principles and the root of many design patterns and best practices. SOLID is actually an abbreviation of five basic principles:

  • (S): Single Responsibility Principle
    Each method should do one and only one thing, and a class should be responsible for a single job.
  • (O): Open/Closed Principle
    Your code should be open to extension and closed to modification. That is, to extend the software’s functionality, we should be adding code rather than modifying existing code.
  • (L): Liskov Substitution Principle
    Derived classes must be completely substitutable for their base classes. (This principle needs more explanation, which is not relevant to this topic.)
  • (I): Interface Segregation Principle
    Classes should not be forced to depend upon interfaces that they don’t use. (This principle needs more explanation, which is not relevant to this topic.)
  • (D): Dependency Inversion Principle
    Depend upon abstractions instead of concrete classes. That means writing code in which all dependencies are interfaces or pure abstract classes (with no implementation) rather than concrete types (non-abstract classes). In fact, Dependency Injection, a widely used design pattern, is based on this principle. I’ll cover it later on, in another post.

Writing testable code forces developers to stick to the Single Responsibility Principle: we need to write unit tests for our methods, and each unit test should test only one thing, so a method with multiple responsibilities would be hard and cumbersome to test.

We also have to apply the Open/Closed Principle while doing TDD, because if we do, we can just add new code and write tests for that specific extension, rather than modifying existing code and its pertinent tests. Therefore, a developer who breathes in a test-driven environment tends to adhere to the Open/Closed Principle by using best practices and design patterns (such as strategy, decorator, bridge, etc.).

More importantly, the need for testability pushes developers to use dependency injection in their code, so they can replace actual dependencies with fake implementations. For instance, consider a repository class that relies on a database context object (say it’s using Hibernate session classes or Entity Framework’s DbContext object). If this database context is injected into the repository, we can unit test the repository by faking the DB context object, without any need to connect to an actual database. Technically, this is called testing in isolation: when we’re unit testing, we should isolate the component or system under test (SUT) from all of its dependencies.

You can see some code I’ve pushed to GitHub as a sample of unit testing repository classes, to get a better sense of how dependency injection facilitates unit testing.
To see the sample repository class click here, and to see the unit test class for that repository click here!

Using dependency injection results in clean component decoupling in our app, which is a crucial factor in software design. And a test-driven attitude forces developers to write loosely coupled classes so that they can be tested in isolation.

The point I’m trying to make in this article is that a test-driven attitude is not just about writing automated tests for our code; it has an immense impact on our design and coding style as well. In fact, a developer with TDD skills and experience is one a professional development team can count on.

To wrap up, I would say: “Hell YES!!! TDD really matters, and it’s a must for a developer who wants to work in a professional development team.” So if you want to be a real developer, start learning TDD or brush up your TDD expertise, rather than whining about the companies that care so much about it!


.NET Core 1.0: a giant leap in the .NET world

Microsoft finally released its new generation of the .NET Framework, named .NET Core, on June 27th, 2016. I believe this version will be a turning point in .NET development and a giant leap for the .NET stack.

The reason I believe so is the combination of C# language features and coding convenience, plus a significant improvement in the framework’s performance, as well as cross-platform support for major operating systems like Windows, Linux, and macOS. Add to that the open source code and a potentially growing community, and I believe it’s not just yet another .NET version, but the foundation of a new, powerful technology stack.

The benchmarks published on GitHub show exciting results:

https://github.com/aspnet/benchmarks

Here is an example of benchmarks done on an HTTP server, showing that .NET performs about 3x faster than NodeJS!!! Which sounds unbelievable to me! Check out this comparative benchmark, copied from the GitHub page mentioned above:

Plain Text Performance benchmark

Similar to the plain text benchmark in the TechEmpower tests. Intended to highlight the HTTP efficiency of the server & stack. Implementations are free to cache the response body aggressively and remove/disable components that aren’t required in order to maximize performance.

| Stack | Server | Req/sec | Load Params | Impl | Observations |
|-------|--------|---------|-------------|------|--------------|
| ASP.NET 4.6 | perfsvr | 57,843 | 32 threads, 256 connections | Generic reusable handler, unused IIS modules removed | CPU is 100%, almost exclusively in user mode |
| IIS Static File (kernel cached) | perfsvr | 276,727 | 32 threads, 512 connections | hello.html containing “HelloWorld” | CPU is 36%, almost exclusively in kernel mode |
| IIS Static File (non-kernel cached) | perfsvr | 231,609 | 32 threads, 512 connections | hello.html containing “HelloWorld” | CPU is 100%, almost exclusively in user mode |
| NodeJS | perfsvr | 106,479 | 32 threads, 256 connections | The actual TechEmpower NodeJS app | CPU is 100%, almost exclusively in user mode |
| NodeJS | perfsvr2 (Linux) | 127,017 | 32 threads, 512 connections | The actual TechEmpower NodeJS app | CPU is 100%, almost exclusively in user mode |
| ASP.NET Core on Kestrel | perfsvr | 313,001 | 32 threads, 256 connections | Middleware class, multi IO thread | CPU is 100% |
| Scala – Plain | perfsvr | 176,509 | 32 threads, 1024 connections | The actual TechEmpower Scala plaintext app | CPU is 68%, mostly in kernel mode |
| Netty | perfsvr | 447,993 | 32 threads, 256 connections | The actual TechEmpower Netty app | CPU is 100% |

To be honest, I can’t believe such a performance improvement yet; I need to touch it myself and do my own benchmark! I’ll share my results and write about them in the future.

By the way, if you’re into Docker, there is a Docker image for .NET Core, downloadable from here: https://www.microsoft.com/net/core#docker

Check username availability!

What I’m gonna introduce in this post has nothing to do with tech or software development stuff, but it could come in handy once in a blue moon!

Today, I was looking for something else on the web and just came across a cool website where you can easily check the availability of a username on heaps of social networks in just a few seconds.

Here’s the address: http://checkusernames.com/ , have fun with it! :))

Tools to detect websites’ techs

As a web developer, I often come across cool stuff (technology-wise) while surfing the web, and I always wonder what kind of framework, library, or technology was used to develop it.

I used to go with old-school methods like viewing the website’s source code or inspecting elements (through dev tools) in my browser to figure out the libs and frameworks used in the page. But you know, it takes time and thought; sometimes the code doesn’t make much sense, and for libraries we know almost nothing about, it gets tricky to work out how things function in the page.

In this post, I’m going to introduce some handy tools I’ve found to detect tech stuff quickly, with practically no effort.

One of my favorite tech detectors is Wappalyzer, which is actually an add-on for the Chrome, Firefox, and Opera browsers. You just need to install the add-on and that’s it! As you surf the web, it shows you all the detected technologies as icons in your address bar. If you click on the icons, it shows the full list of what was used. It can detect server-side tech stacks/platforms, web servers, even the CMS used, and all the JavaScript libraries. If you’re wondering about any of them, you can see a brief introduction to the tech on the Wappalyzer website (by clicking the tech’s name in the list) and go to the official website of the pertinent technology.


BuiltWith is another cool web-based tech detector I normally use. All you need to do is put the URL into www.builtwith.com and see the result. It shows much more detail about the hosting and server-side aspects.


Robotics and IoT through Cylon.js

In this post, I’m going to introduce a cool JavaScript library for robotics and IoT lovers. It’s called Cylon.

Using this library you can easily interact with other devices, whether through a web browser or server-side NodeJS code.

Cylon brings the exciting world of robotics and the Internet of Things to JavaScript. At this point in time it supports 50+ different platforms, plus general-purpose I/O support with a shared set of drivers provided by the cylon-gpio module.

The figures below show the most famous platforms supported by Cylon. You can find a full list of them here.

Cylon supported platforms

Using these three basic plugins, Cylon enables you to send data to and receive data from other devices, even in a real-time streaming manner:

  • http/https (REST)
  • the socket.io library
  • MQTT (an Internet of Things machine-to-machine connectivity protocol)

Here are some examples of using Cylon:

The code below controls an AR.Drone: it takes off and then lands again:

"use strict";

var Cylon = require("cylon");

Cylon.robot({
  connections: {
    ardrone: { adaptor: "ardrone", port: "192.168.1.1" }
  },

  devices: {
    drone: { driver: "ardrone" }
  },

  work: function(my) {
    my.drone.takeoff();
    after((10).seconds(), my.drone.land);
    after((15).seconds(), my.drone.stop);
  }
}).start();

Working as a software architect and developer on network and environment monitoring applications, I think it would make a great API gateway for the next generation of monitoring applications. And I strongly believe that the combination of devices like smartphones, drones, sensors, robots, and so forth, plus libraries like Cylon.js and cloud services like AWS IoT, will open new horizons of innovation in the near future and bring the world much closer to what we imagine in sci-fi movies.

If Cylon is of interest to you, take a look at these tutorials and samples:

https://cylonjs.com/documentation/examples/

https://cylonjs.com/documentation/tutorials/dreamforce-2014/

var codopia=new Blog();

Hey guys!

We’re a team of Sydneysider software developers, obsessed with technology and development. We just started this blog to talk about coding and software development. Our main focus here will be on software architecture, design, frameworks, and technologies.