A long time ago, we were a typical Java waterfall shop, and by today's standards our build and deployment practices were Stonehenge. We had six-month release cycles: roughly two months for discovery/design, two months for development, and two months for testing. We had weekly builds every Wednesday; we maintained change logs manually, tagged files in CVS manually, and spent countless nights on build days integrating all the developers' code changes, stumbling through an error-prone process to produce a stable build, or, really speaking, a build that would compile.

A lot of time was spent on manual testing, and we had long regression cycles. Our discovery process effectively lasted into the testing cycle, because many requirements were only clarified during development and testing; not everyone had been involved in discovery and design, so features naturally took longer to complete. Working late nights and on weekends was the norm.

There was very little trust between product management, developers, and testers, so every piece of information had to be written down as a requirement, even the smallest of things. By the time information reached the developers and testers, the "why" was lost; the goal became completing what was written in the documents rather than delivering value. Reading a document is very subjective: individuals, based on their experience, exposure, and preconditioning, can interpret a statement in many ways, leading to features that were not implemented as product management envisioned, and it was always too late by the time something was discovered missing.

With such long release cycles, it was typical to think of every possible feature and pack it into the release, because if you missed the train you would get the feature a year later. Release delays were routine and did not meet client expectations. No one felt satisfied, no matter how much effort they put in.
The performance of testing team members was measured by the number of defects they found, so there was no vested interest in preventing bugs from getting into the product during the development cycle. Developers thought their job was done once they committed their code, and that testing was the testers' job. There was almost a wall between every team. Developers and testers had to estimate the time required to complete and test a feature, and the estimates were supposed to be "correct". If the estimates were not "correct", then the developers and testers were not doing a good job.
All this led to a lot of bad behavior and left a bitter taste.
Sound familiar?
From the start of my development career until 2009, I wrote code and spent a lot of time making sure it worked as expected through extensive manual unit testing. In 2009, my colleague Tristan Bezotte introduced me to JUnit. I started writing JUnit tests and was in awe of the power they brought. All the tests I had carried in my head, which I would run manually with every code change, spending a lot of time setting up the container, I could now code as JUnit tests and run outside the container in seconds. They ran very fast, so I could run them on every code change, saving me immense time. I just couldn't believe I had survived so long in my career without writing JUnit tests. Now I cannot write code without having tests.
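The kind of fast, container-free check that made JUnit such a revelation can be sketched as follows. This is a dependency-free approximation: in real JUnit each check would be an `@Test` method using `Assert.assertEquals`, and the `RateCalculator` class here is a hypothetical example, not product code.

```java
// A container-free unit test in the spirit of JUnit. In real JUnit each
// check below would be an @Test method; this sketch avoids the JUnit
// dependency and uses a hypothetical RateCalculator for illustration.
class RateCalculatorTest {

    // Hypothetical production class: applies a percentage discount to a rate.
    static class RateCalculator {
        double discounted(double rate, double discountPercent) {
            return rate - (rate * discountPercent / 100.0);
        }
    }

    public static void main(String[] args) {
        RateCalculator calc = new RateCalculator();

        // Each check runs in milliseconds, with no container setup required,
        // so the whole suite can be re-run on every code change.
        assertEquals(90.0, calc.discounted(100.0, 10.0));
        assertEquals(100.0, calc.discounted(100.0, 0.0));
        assertEquals(0.0, calc.discounted(100.0, 100.0));

        System.out.println("All tests passed");
    }

    static void assertEquals(double expected, double actual) {
        if (Math.abs(expected - actual) > 1e-9) {
            throw new AssertionError("expected " + expected + " but was " + actual);
        }
    }
}
```

The point is not the arithmetic; it is that checks like these, once coded, run in seconds instead of the minutes or hours manual verification used to take.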
In the same year, Tristan Bezotte also introduced me to Hudson CI. Having spent countless nights integrating code on weekly build days, the power of Continuous Integration was evident to me, and again I was in awe. I introduced Hudson for one of our core products in early 2010. It was a no-brainer and an easy sale:
- No need to maintain change logs manually.
- Automatic builds whenever someone committed code.
- We still had weekly builds, but the time spent on them dropped significantly. No more late nights for code integration.
- Quick feedback on build failures.
- Easy identification of the root cause of a failed build, since each build contained only a few commits.
- All unit tests ran after compilation, so if someone committed code that caused tests to fail, they immediately knew they had introduced a bug.
- Developers were encouraged to commit code frequently for faster feedback.
- Code quality violation reports via PMD and FindBugs.
- Constant availability of the last stable build.
- Earlier, QA had to wait 7 days for a build; now they could pick up the latest stable build and start testing.
- The ability to deploy the war after each build.
- Code coverage reports showing which flows were not covered by automated tests.
The team benefited a lot from this small change. I still remember that sometime in 2006, when I also played a build engineer role, Jim Mathews asked me to think about having multiple builds a day instead of one weekly build, and my reaction was "why?" What was the "need" for frequent builds? I realized my naivete only after being introduced to Hudson. I still regret not going the Continuous Integration route as early as 2006. Anyway, better late than never, and again I found myself wondering how I had survived as a developer without Continuous Integration in place. The next task was to showcase the power of JUnit to my colleagues. I managed to persuade some of them.
By the end of 2010, I moved to a new product that was being developed using the Agile philosophy, and I started working as a product owner. I was glad to get initial coaching from David Hussman, and I also got the opportunity to attend a couple of DevJam sessions. Over the two years I spent with the new product development team and the agile approach, I was again in awe. I connected with the agile focus on "value", "failing fast", and "learning and course correcting" over "lengthy project plans", "estimation", and "delivering what was written". It took me some time to move away from thinking about the entire problem in one go toward thinking "a mile wide and an inch deep", the steel-thread approach. Sometimes, even though the concepts were understood, they were difficult to apply in real life when a lot of back-end activity was involved. In those two years I learned a lot.
In 2012, while I was working on the new product, I had to come back to the existing product to manage the team and its delivery. This team was still working with a waterfall mindset. The team had many new members, product management had new members, and we had to create a new product offering. I was handed a big requirement document and had to deliver this new offering on time while simultaneously focusing on the new product I was already working on. From my point of view, there was no way to deliver the product with an almost entirely new team while continuing with the waterfall model. Yet I was also not in a position to completely alter how this product was being developed: every change has its own learning curve, and I did not want to cause a disruption. My option was to run a covert operation based on the experience I had gained in the prior two years. I decided to pick a few agile practices I had learned while working on the other product and apply them in the current context without uttering the word "Agile".
Practice 1 – Acceptance Tests & Value – I started adding acceptance tests to the requirement document. This helped a lot in clarifying the requirements. Product management was forced to think about why a feature was being added, what value it generated, and how we would know whether the development and testing teams had delivered as expected. Product management also went through the learning curve of writing acceptance tests and figuring out relative priority, and over time we all improved. The development and testing teams were involved in reviewing and adding acceptance tests, so many surprises were taken care of, and it helped the teams get on the same page for most of the requirements.
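An acceptance test embedded in a requirement document might have looked something like the following. This example is entirely hypothetical, written here in the Gherkin style that Cucumber (which we adopted later) uses; the feature and values are illustrative, not from the actual product.

```gherkin
# Hypothetical acceptance test as it might appear in a requirement
# document; feature, roles, and values are illustrative only.
Feature: Bulk rate update
  As a revenue manager
  I want to update rates for a date range in one action
  So that I do not have to edit each date individually

  Scenario: Update rates for a full week
    Given rates are loaded for 1 Jan to 7 Jan
    When I apply a rate of 120.00 to that date range
    Then every date from 1 Jan to 7 Jan shows a rate of 120.00
    And an audit entry records who made the change and when
```

Even in plain prose rather than this structured form, spelling out the "Given / When / Then" of a feature forces the "why" and the expected outcome into the open before any code is written.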
Practice 2 – Prioritizing by Value – Prioritizing the requirements based on their perceived value helped us deliver the most valuable features first. The idea is that if there are delays, the least valuable (nice-to-have) items get dropped and the most valuable items are taken care of. Teams also started understanding that not every piece of code or feature is equally important, so they could focus their energy well.
Practice 3 – Weekly Demos – As the team was new and lacked domain knowledge, it was imperative to give weekly demos to product management, to ensure we were not deviating from expectations and that red flags were raised early.
Practice 4 – Daily Roadblock-Removal Meetings (Stand-ups) – Written text is highly subjective to read; this meeting gave the team members involved an opportunity to interact directly with product management and get clarification on ambiguous text, resulting in failing fast and early course correction.
Practice 5 – Continuous Integration & Dev-Environment Deployment – Hudson was already in place; by now it had forked into Jenkins, and we opted to go with Jenkins. The goal was to add as many tests as possible and to deploy the war once a day using Jenkins, so that product management could look at progress on a daily basis and accept features as they were completed. The second goal was to make build artifacts available after every build, so that the concerned testers could pick up the latest artifacts and start testing immediately, moving away from weekly builds.
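The daily build-test-archive-deploy flow can be sketched in today's Jenkins declarative pipeline syntax. To be clear, this syntax did not exist in 2012; back then the same flow was configured as a freestyle job through the Jenkins UI. The stage names, the Maven command, and the `deploy-dev.sh` script are all illustrative assumptions, not the actual job configuration.

```groovy
// Illustrative declarative Jenkinsfile; in 2012 this flow was a
// Jenkins freestyle job, not pipeline code. All names are assumptions.
pipeline {
    agent any
    stages {
        stage('Build & Unit Tests') {
            steps {
                sh 'mvn clean verify'   // compile and run all unit tests
            }
        }
        stage('Archive Artifacts') {
            steps {
                // make the war available so testers can pick up any stable build
                archiveArtifacts artifacts: 'target/*.war'
            }
        }
        stage('Deploy to Dev') {
            steps {
                // hypothetical script: pushes the war to the dev environment
                // so product management can review progress daily
                sh './deploy-dev.sh target/app.war'
            }
        }
    }
}
```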
This covert operation was a great success, and I was very glad the small changes worked without too much disruption. With a few more successful releases delivered using the same approach, the trust between product management and SD/QA started increasing. As trust increases, what you can achieve goes up, because energy is not spent on blame games and the focus stays on what can be improved. The next goal was to formally introduce agile and agile tools, so I started running sessions on Agile, stories, Rally, and Continuous Integration and Deployment for both product management and SD/QA. After the success of 2012, product management and the SD/QA team were very supportive of the changes and ready to embrace additional agile practices.
In 2013, I started focusing completely on this product. In addition to the earlier practices, we introduced others such as iterations, stories, and backlog grooming. The biggest challenge of 2013 was to bring SD and QA to work together as one team, as partners rather than seeing each other as enemies. The second challenge was to complete both development and testing within the same iteration. The third was to get everyone writing JUnit tests. The fourth was to get around the notion that "if I now also have to write JUnit tests along with my code, I will need more time".
QA was no longer rated on the number of issues found, and a developer's job was no longer over after committing the code. Development and testing members had to collaborate on a story; their collective goal was to keep bugs out of the code in the first place and to deliver complete work, with appropriate tests, within an iteration, as per product management's expectations. This was not a smooth journey. For both teams, collaboration, pairing, writing automated tests, and completing testing within the same iteration were difficult habits to build. I had to spend a lot of time and energy with my leads, giving them an alternative perspective of the future and helping them through the required change in mindset, and later through the changes required in the teams' mindset. Looking back, it was a great experience, and I think the blood, sweat, and tears shed were worth it. Gradually our focus shifted from manual testing to test automation, which brought us closer to our goal of completing testing within the same iteration. We introduced a lot of GUI automation tests using QTP and then Selenium, with data validation done through the GUI. Our automation tests started running slowly, and only one person was able to write them, which created bottlenecks. By the end of 2013, we had started getting better at the process part, but we had not focused much on engineering practices.
For all of 2014, we had Naresh Jain and Dhaval Dalal in our Pune office coaching the teams on agile and engineering practices. We were very glad to have Naresh Jain, who had started the Agile India movement. Through Naresh, I got a chance to meet Jeff Patton and attend his "Story Mapping" workshop. Naresh and Dhaval helped the development teams with their engineering practices. Naresh helped us realize the importance of writing blogs, contributing to the community, and presenting at conferences. He also helped us with our hiring process.
I introduced IDeaS Rock Star, a gamification approach to help team members adopt agile practices and values. My blog talks about our effort to reduce Jenkins build time. Through Naresh's influence, I discovered that sharing what I have learned at conferences is my new-found love; I have spoken about both "Gamifying Agile Adoption" and "On a Quest to Reduce Jenkins Build Time" at various conferences. So far I have presented at Agile Pune 2014, Agile India 2015, Indic Thread 2015, and RallyOn 2015. Another of my blog posts covers my first public speaking experience. There is so much to learn during and after sharing at these conferences.
Naresh helped QA with the automation tests. We soon realized that our approach to test automation was wrong. He helped us understand the inverted test pyramid and changed our perspective on automation. Sachin Natu gave a presentation on the inverted test pyramid at both Agile Pune 2014 and Agile India 2015. Aditya Saigaonkar and Kirtesh Wani presented "Selenium DeTox for Achieving the Right Testing Pyramid" at Selenium Conference 2014.
This year we introduced more practices, such as pairing, test-driven development, and devbox testing. We had not hired our testing team to write code; at the time of their hiring, the focus was on manual testing, so coding skills were not required. In the changed circumstances, we expected the testing team members to pair with the development teams, ensure that most scenarios were covered in the unit tests, and eventually start contributing tests themselves. Because of their lack of coding skills, it was very difficult for them to judge whether the automation tests actually covered all the scenarios, let alone write new ones, so they would end up running the same scenarios manually. After realizing this problem, I decided to coach them in reading and writing Java. We started with test-driven development. I have come to realize that when anyone is taught a new language, one should always start with tests and, right from the get-go, instill the mindset that coding cannot be done without tests. I think the same approach should be followed in our educational institutions. This effort was carried forward by Kirtesh Wani and had varying levels of success. Most of the testing team understood the importance of upgrading their skills, writing automated tests, and knowing how to read and write code. Cucumber was introduced, and over time most of the testing team got involved in coding and was able to contribute to the automation effort.
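A first test-driven exercise of the kind used in such coaching might look like the sketch below. The `reverse` method is hypothetical teaching material, not product code, and the JUnit dependency is replaced by a small hand-rolled check so the example stands alone. In TDD, the checks in `main` are written first (they fail to compile or fail when run: "red"), then the simplest implementation that passes is written ("green"), and finally the code is refactored with the tests as a safety net.

```java
// A first TDD exercise for someone new to Java: tests before code.
// The reverse() method is hypothetical teaching material, not product code.
class ReverseTddExercise {

    // Step 2 ("green"): the simplest implementation that makes the tests pass.
    static String reverse(String s) {
        return new StringBuilder(s).reverse().toString();
    }

    public static void main(String[] args) {
        // Step 1 ("red"): these checks were written before reverse() existed.
        check("cba", reverse("abc"));
        check("", reverse(""));
        check("a", reverse("a"));
        System.out.println("green");
    }

    static void check(String expected, String actual) {
        if (!expected.equals(actual)) {
            throw new AssertionError("expected " + expected + " but was " + actual);
        }
    }
}
```

Starting learners with the checks rather than the implementation bakes in, from day one, the habit that code without tests is not done.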
Based on what I learned from Naresh and Dhaval, for one of our internal support products I decided to apply the agile approach and move away from a four-month waterfall release cycle to weekly deployments in production. The team worked directly with the internal clients, focused on story mapping and on the items that added the highest value, and started writing automated tests in JUnit, Cucumber, and Spock. Naresh and Dhaval helped refine our processes and practices and reviewed our code and tests. The internal clients were involved in prioritization, story mapping, planning, stand-ups, and demos. The outcome was magnificent. We had very happy internal clients, as we were able to deliver high value in the given time; we eliminated a lot of wasted effort and generated good returns on our development efforts.
In 2015, I moved back to the new product. By the end of 2014 and into early 2015, our expectation was that complete test automation would be done within the same iteration. Jasmine was introduced for GUI unit tests. I am very glad that Aditya Saigaonkar and Sameer Shah are leading this effort and taking it forward from where it started; they have managed to actually realize the dream. They are also working on bringing our regression time down by focusing on test automation. We are also working on continuous deployment and on reducing our release cycles.
Every day we are getting better at what we do, and in terms of agile adoption and engineering practices we are getting closer to our new product, which we deploy to production every two weeks. This is not the end; the journey continues. The next step, I believe, is to embrace the lean startup mindset... We are on our way forward!