LEGO Case Study – shop.lego.com

When we joined LEGO’s Digital Brick project – a complete rebuild of their website – the test team was split across three locations: Delhi, London and Connecticut. Manual regression testing still accounted for 70-80% of the effort, and the little automation in place was owned solely by the offshore team. A traditional Selenium/Java framework covered only around a third of the required regression scope, and a full test cycle for the team still took four to five days, with daily chase-the-sun handovers from India to London to the US. The automation suite produced very little reporting, and while the US-based leads knew that a certain level of coverage existed, it became harder and harder to quantify how much value it actually provided to ongoing project delivery.

The website rebuild brought with it both substantial architectural changes – a decoupled front-end/back-end split with a brand-new set of Oracle Web Commerce (ATG) APIs and a new React/Node front end – and a significant shift towards agile ways of working for the entire LEGO IT department. The project has run for close to two years with ongoing two-week sprints, frequent live releases, pair programming and test-driven development, a DevOps stability stream and a complete overhaul of how the test team operated and delivered its own code.

From day one, we introduced our Test Evolve agile automation framework, which immediately enabled two distinct streams of testing delivery in one package: API and front end. The API suite now stands at around 10,000 tests and the browser automation at around 1,800 tests, both run daily. A methodical tagging strategy gives the regression packs the scalability to run as a smoke test, a production test, a nightly test or a full regression test, so a quick 20-minute job or a full two-hour cycle can be executed as required.
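To illustrate the idea behind a tag-based pack selection, here is a minimal Python sketch. The test names, tag names and pack definitions are hypothetical examples, not taken from the actual Test Evolve framework:

```python
# Sketch of a tag-based regression pack selector. Each test carries a set of
# tags; a pack is simply a tag filter over the full suite. All names below
# are illustrative placeholders.
TESTS = {
    "checkout_happy_path": {"smoke", "production", "nightly", "regression"},
    "basket_edge_cases": {"nightly", "regression"},
    "vip_discount_rules": {"regression"},
}

PACKS = {
    "smoke": "smoke",            # quick run, minutes rather than hours
    "production": "production",  # only tests safe to run against live
    "nightly": "nightly",        # scheduled CI job
    "regression": "regression",  # full cycle
}

def select_tests(pack: str) -> list[str]:
    """Return the tests whose tags include the pack's filter tag."""
    tag = PACKS[pack]
    return sorted(name for name, tags in TESTS.items() if tag in tags)
```

Because every tagged test lives in one suite, adding a new pack is just a matter of defining a new tag filter rather than duplicating tests.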

From the outset, we ran requirements-gathering workshops with the business owners to harness the power of Behaviour Driven Development and Specification by Example. By engaging product owners, business analysts, developers and testers together, requirements are captured in a way that meets the needs of all parties while, at the same time, a set of high-level acceptance tests begins to take shape. Most importantly, this enables a multi-directional conversation about what is expected from the application, in clear and measurable terms.
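The essence of Specification by Example is that a business-readable scenario becomes an executable check. The following Python sketch shows the pattern in miniature; the basket and free-shipping rule are invented for illustration and are not LEGO requirements:

```python
# A Given/When/Then scenario (business-readable) bound to executable steps.
# The domain rule below is an assumed example, not a real requirement.
SCENARIO = """
Given a basket containing items worth 60.00
When the customer proceeds to checkout
Then shipping is free
"""

FREE_SHIPPING_THRESHOLD = 50.00  # hypothetical rule agreed in a workshop

def run_scenario(basket_total: float) -> dict:
    # Given: a basket with a known value
    order = {"total": basket_total}
    # When: checkout computes the shipping charge
    order["shipping"] = 0.0 if order["total"] >= FREE_SHIPPING_THRESHOLD else 4.95
    # Then: the caller asserts the expected outcome against this result
    return order
```

The value lies in the shared artefact: the same wording the product owner agreed in the workshop is what the automated test executes.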

To enable a cohesive continuous integration strategy, both the API and browser test packs have nightly and on-demand Jenkins jobs, and the framework lets the LEGO test team target any of the five or six environments in existence. A real-time dashboard for each suite provides immediate reporting on test success, refreshing as each test completes and reporting at a business-readable BDD feature level. The Jenkins jobs also provide granular at-completion reporting, listing any failed tests with debugging information for immediate investigation by the team. We have also built a “living documentation” feature into the CI jobs for complete peace of mind for the business owners: as long as a feature is represented in the documentation, it has a passing automated test and reflects how the application behaves in production.
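Environment targeting like this usually reduces to a tester-editable lookup from an environment name to its connection details. A hedged Python sketch, with invented environment names and URLs (only shop.lego.com is real), might look like:

```python
# Hypothetical environment registry: names, URLs and flags are illustrative.
ENVIRONMENTS = {
    "dev":     {"base_url": "https://dev.example.test",     "run_destructive": True},
    "qa":      {"base_url": "https://qa.example.test",      "run_destructive": True},
    "staging": {"base_url": "https://staging.example.test", "run_destructive": True},
    "prod":    {"base_url": "https://shop.lego.com",        "run_destructive": False},
}

def resolve_environment(name: str) -> dict:
    """Look up a target environment, failing fast on an unknown name."""
    try:
        return ENVIRONMENTS[name]
    except KeyError:
        raise ValueError(
            f"unknown environment {name!r}; choose from {sorted(ENVIRONMENTS)}"
        )
```

A CI job then only needs to pass the environment name as a parameter; flags such as `run_destructive` let the same suite behave safely against production.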

With a brand-new responsive front end and a strong business focus on mobile transactions, the test automation strategy needed to cover desktop cross-browser testing (IE9 through Edge, Chrome, Firefox, Safari) as well as mobile browser testing (three to four versions of Android with Chrome, iOS with Safari, and Windows). Within the Test Evolve framework we have enabled basic browser-based emulation of specific devices, and we have built plug-ins for Sauce Labs and our partner Perfecto Mobile to enable cloud-based mobile and desktop cross-browser testing without the need to host your own device, OS and browser infrastructure. Should a client choose to build their own device farm, the framework can also run the automated front-end tests against targeted real devices in your own lab. All of this is controlled from a simple, tester-driven configuration layer at the heart of the framework.
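A browser/device matrix of this kind is typically expanded from a small config into a list of capability dictionaries handed to the grid. The sketch below is a generic illustration; the dictionary keys are placeholders, not the actual capability schema of Sauce Labs or Perfecto:

```python
# Tester-editable matrix config, expanded into capability dictionaries.
# Browser/version pairs mirror the coverage described in the text.
DESKTOP = [
    ("internet explorer", "9"), ("internet explorer", "11"),
    ("MicrosoftEdge", "latest"), ("chrome", "latest"),
    ("firefox", "latest"), ("safari", "latest"),
]
MOBILE = [("Android", "chrome"), ("iOS", "safari")]

def build_matrix() -> list[dict]:
    """Expand the desktop and mobile configs into one list of capabilities."""
    caps = [{"browserName": b, "version": v} for b, v in DESKTOP]
    caps += [{"platformName": p, "browserName": b} for p, b in MOBILE]
    return caps
```

The same matrix can then be pointed at emulated browsers, a cloud grid, or an in-house device farm, since only the execution endpoint changes.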

The Test Evolve framework also enabled us to re-use the desktop and mobile automation to create a starting suite of Apache JMeter load tests. As a result, the number of load tests employed for the 2015 Black Friday/Cyber Monday peak period was approximately three times that of 2014, covering a much wider range of functionality – and the Oracle Web Commerce (ATG) application stayed online throughout the peak period for the first time in a number of years.

Finally, from a customer-independence perspective, we have provided almost daily one-to-one pairing between the Test Evolve and LEGO test teams from the very beginning of the project. LEGO testers have been fully immersed in everything needed to take full ownership of the framework at project closure and to maintain it going forward. A team of largely manual testers 18 months ago now has a robust, scalable and maintainable API, browser and mobile automation framework at its fingertips, with full transparency of reporting, alignment to business requirements, and the knowledge and confidence to take it into other areas of the LEGO organisation. A regression cycle now takes from 20 minutes to two hours instead of five days, can be run multiple times daily or weekly against multiple environments, and provides all the information the product owners and release management process need to make a sound risk-based decision on whether to deploy to production.
