Profiling

CPU profiling for rescue

Prev – Test on smaller data set

I have used the Eclipse editor for my development for many years. A couple of years ago I looked for a good CPU profiling plugin for Eclipse and could not find one, so just for profiling Java code I would switch to the NetBeans editor, which has built-in support for CPU and memory profiling. This time, when I started NetBeans to carry out CPU profiling, for some unknown reason I simply could not profile my JUnit test suite. Since I use NetBeans only for profiling, I have to redo a lot of setup on it every time, which I hate, and now I also had to figure out why the profiling was not running. I had no patience for that, so instead I checked whether any new profiling plugins had become available for Eclipse. I came across JVM Monitor, and boy, am I loving it!

Profiling our tests highlighted the following issues in our code.

  • The code was scanning the resource bundle files, a disk-IO-intensive operation, multiple times. I simply cached them.
  • Many tests were loading the Spring file system application context. Some tests did this in the @Before method, causing the context to be loaded before every test method in the class. This is again a disk-IO-intensive operation. After some refactoring of the test code, I was able to reuse the context in the majority of the tests.
  • Our code was sending out emails during the tests, which took time and was not required, so I skipped sending them.
  • There was a class with a method that accepted a java.util.Date object and the number of days to be added to it. This method was invoked thousands of times. Code snippet below.

public static Date addDays(Date date, int numberOfDays)
{
    Calendar calendar = Calendar.getInstance();
    calendar.setTime(date);
    calendar.add(Calendar.DATE, numberOfDays);
    return calendar.getTime();
}

To my surprise, this simple-looking code is not efficient at all, largely because Calendar.getInstance() is a relatively expensive call to make thousands of times. I refactored the code as below and it is much more efficient.

public static Date addDays(Date date, int numberOfDays)
{
    // MILLISECONDS_IN_ONE_DAY = 24L * 60 * 60 * 1000.
    // Note: this assumes a fixed 24-hour day; unlike the Calendar
    // version, it does not account for DST transitions.
    long number = ((long) numberOfDays) * MILLISECONDS_IN_ONE_DAY;
    return new Date(date.getTime() + number);
}
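As a sanity check, the two versions can be compared side by side. This is a standalone sketch (the class name and the constant are mine, not from the original codebase); it pins the timezone to UTC so that the comparison is not affected by DST, which is the one case where the two versions can legitimately differ:

```java
import java.util.Calendar;
import java.util.Date;
import java.util.TimeZone;

public class AddDaysCheck {
    // Assumed constant: one day expressed in milliseconds.
    static final long MILLISECONDS_IN_ONE_DAY = 24L * 60 * 60 * 1000;

    // Original Calendar-based version.
    static Date addDaysCalendar(Date date, int numberOfDays) {
        Calendar calendar = Calendar.getInstance();
        calendar.setTime(date);
        calendar.add(Calendar.DATE, numberOfDays);
        return calendar.getTime();
    }

    // Refactored arithmetic version.
    static Date addDaysMillis(Date date, int numberOfDays) {
        return new Date(date.getTime() + ((long) numberOfDays) * MILLISECONDS_IN_ONE_DAY);
    }

    public static void main(String[] args) {
        // UTC has no DST, so every day is exactly 24 hours and the
        // two implementations must always agree.
        TimeZone.setDefault(TimeZone.getTimeZone("UTC"));
        Date now = new Date();
        for (int days : new int[] {0, 1, 7, -3, 365}) {
            Date a = addDaysCalendar(now, days);
            Date b = addDaysMillis(now, days);
            if (!a.equals(b)) {
                throw new AssertionError("Mismatch for " + days + " days");
            }
        }
        System.out.println("versions agree");
    }
}
```

In a timezone with DST, the Calendar version preserves wall-clock time across a transition while the millisecond version adds exactly 24 hours per day, so whether the speedup is safe depends on what the callers expect.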

After all the profiling and refactoring, most of the test jobs started finishing within 15 minutes. Without using SSDs or hybrid disks, I was able to get the CI build pipeline time down to about 25 minutes. It is now no surprise why the RAM drives did not show much improvement on our actual Jenkins: disk IO not related to the DB was our bottleneck. So though SSDs are fast, it turns out that SSDs are no cure for sloppy programming! lol.
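The two caching fixes above (the resource bundles and the reused application context) boil down to the same pattern: build the expensive object once and memoize it. A minimal standalone sketch of that pattern, using a stand-in ExpensiveContext class of my own invention rather than the real resource bundle or Spring context:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class CachedResources {
    // Stand-in for something expensive to build (a parsed resource
    // bundle, an application context); it counts how often it is built.
    static class ExpensiveContext {
        static int buildCount = 0;
        ExpensiveContext() { buildCount++; }
    }

    // Memoizing cache: each key is built at most once, even under
    // concurrent access.
    private static final Map<String, ExpensiveContext> CACHE = new ConcurrentHashMap<>();

    static ExpensiveContext get(String key) {
        return CACHE.computeIfAbsent(key, k -> new ExpensiveContext());
    }

    public static void main(String[] args) {
        // Simulate many tests asking for the same context: it is
        // built once and then served from the cache.
        for (int i = 0; i < 1000; i++) {
            get("applicationContext.xml");
        }
        System.out.println("builds = " + ExpensiveContext.buildCount);
    }
}
```

In real test suites the same effect can come for free from the framework, e.g. moving setup from @Before to @BeforeClass, or letting Spring's test support cache the context across test classes.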

Next – On a quest of reducing Jenkins build time.

 


On a quest of reducing Jenkins CI build time

In my organization we use Jenkins as our CI tool. The core build is followed by multiple jobs (unit tests, integration tests, SAS integration tests, PMD) all running in parallel, over 3000 tests in total, and the entire build pipeline took over 1 hour 30 minutes to produce the build artifacts. That was far too long, and it was especially frustrating when tests failed: multiple developers would check in files while the earlier build was in progress, the next job would start, and by the time the issue was identified and fixed it could take more than 4-5 hours to get a stable build. QA would not get build artifacts on time, valuable development time was being lost, and there were frustrations all around.

This persistent issue pushed me on a quest to reduce the CI build time.

The problem with duplication

The discovery of Ant JUnit task options

The assumptions around IO and SSD

The alternative for SSD – in-memory/in-process db

The “eureka” moment – discovery of RAM Disk Drives

The excitement and the disappointment

Test on smaller data set

CPU profiling for rescue

Today Ajay moved our Jenkins VM to a box with a hybrid disk, and the build pipeline time has dropped from 25 minutes to 15 minutes, with all the test jobs running in less than 10 minutes!!! I am feeling very happy and satisfied with this quest of reducing the build time. The journey took more than two months, during which I learnt a lot.

 On a quest of reducing Jenkins build time – Part 2

Resources