Automation Journey in the World of L10n!
Feb’14, Reetika Ghai
This blog talks about the importance of automation in the world of localization and its increased need in the world of Agile.
Paradigm Shift from Waterfall to Agile in the World of Localization
Do you know which is the fastest land animal in the world, reaching speeds of up to 113 km/h?
Over the last two years, there’s been a gradual shift in the Software Development Life Cycle (SDLC) methodology for most of Adobe’s flagship products. Product management has moved from a year-long waterfall product development life cycle to a sprint-based Agile methodology, based on iterative and incremental development, where requirements and solutions evolve through collaboration between self-organizing, cross-functional teams.
As a result of changing market trends, we need to reinvent our approach to localization testing to meet the requirements of Agile methodology. In Agile, development and test cycles are shorter, with quick turnaround times. In localization, test volumes spike given sprint cycles of 2-3 weeks. Features require frequent validation across multiple locales and platforms before being certified for release in a simultaneous release model (sim-GM). In the Agile framework, it’s important to be cognizant of the business goals from a localization perspective. I would categorize these into three broad areas:
- Time-boxed, frequent releases to market: With Agile, most Adobe products release at least once a quarter, with frequencies as high as weekly/monthly releases
- Increased test scope leading to increased localization effort: With each sprint, the legacy scope to certify the localized build increases
- Higher focus on rapid new-feature development with simultaneous release to market: Certifying features on N locales and M platforms in a sprint of 3-4 weeks
These goals create the following challenges for an International Quality Engineering (IQE) team while deciding the scope on the localized builds for localization testing:
- Ensuring increased test coverage on the new features while balancing the coverage for legacy feature areas
- Ensuring re-usability of tests across various platform variants
- Ensuring test accuracy across repetitive scenarios on multiple languages
- Ensuring faster execution to uncover defects early on
- Ensuring all features work as expected on all supported platforms and locales
- Ensuring co-existence with different versions of the released product/patches
- Ensuring the product ships simultaneously in all supported locales across geographies (sim-GM)
- Ensure optimized test coverage on all the supported locales and platform variants
Automation in Agile & Localization
Why automation testing in Localization?
– Multiple releases in a year
– High volume of testing
– Complexity of platform/locale combinations
– Improved test coverage leading to better quality
– Faster time to market
– Cost effectiveness
With these initial thoughts, we proposed expanding the automation coverage of Dreamweaver product features from English to the localized languages in September 2012. Our initial goal was to attain 45% feature automation coverage on localized builds relative to the coverage on the English build on the Mac platform.
Gradually, we built the feature automation capabilities over the next six months, starting by enabling the automation framework for localization (i.e., adding support to run scripts on localized operating systems), then running daily smoke tests on all 15 supported languages, and eventually achieving good feature-level automation coverage.
Automation is a great way to overcome the above challenges and effectively achieve optimized test coverage on localized builds. With automation, it becomes possible to certify incremental Creative Cloud releases for all supported operating-system and language combinations, supporting the time-boxed releases.
With multiple releases to market in a year, manual execution of the repeatable test scope by the localization vendors leads to increased test effort. The major part of this increase can be attributed to the incrementally growing legacy test scope, i.e., the legacy scope is the cumulative sum of all past deliverables of a product and grows with each sprint. Automated tests, on the other hand, can be run over and over again, ensuring defined coverage across platform and language combinations and thereby contributing to the overall product quality for the time-boxed release. This not only eliminates test redundancy but also helps produce faster test results.
Having the legacy areas automated helps the localization tester focus manual effort on the current sprint deliverable, and hence uncover defects early in the test cycle. The IQE needs to be cautious in deciding the scope of automation on localized builds; prioritizing the automation coverage is very important.
With each quarterly release to market, the certification scope of a product’s legacy feature set increases, leading to amplified repeatable test effort across multiple sprints compared to the one-time validation of the yearly release model.
Journey into Dreamweaver Automation
The DW localization team achieved 88% functional coverage and 86.5% conditional coverage, against the core team’s 50% conditional and functional coverage, in the CC release for Mac!
For adopting product automation in localized languages, our journey started by answering a few initial questions:
- What features do we need to automate?
- What will be the sequence of feature automation?
- What locales should we consider to start with, based on data from prerelease and bug history?
- What would be the best approach for optimized test coverage in the different locales?
- Where in the automation framework should the locale-specific strings (used in test scripts) be placed? Or should we pull the strings for comparison directly from the Adobe Localization Framework (ALF) at runtime?
- How much effort is required for adopting automation on locales?
- What would be the initial setup required to start automation in the different locales?
- How much additional effort is required for running automation scripts in localized builds regularly?
- What will be the hardware needs and the challenge to meet them?
- What should be the frequency of automation runs (daily smoke, basic feature plan, and new feature plan)?
- How do we achieve the best execution turnaround time on all locales? What should the optimization matrix be, considering the fast turnaround times in Agile?
Initial six-month journey into adopting the automation framework for Dreamweaver localization
Dreamweaver Automation Framework
Dreamweaver automation is based on Adobe’s homegrown automation framework, ‘Jerry’, developed by the Dreamweaver English QE team. It is written in Core Java, supported by AppleScripts and JavaScripts in the backend, and makes use of the Dreamweaver APIs exposed by the developers.
The diagram depicts the automation workflow:
Step 1: A job ticket (containing configuration details such as TC numbers, platform, machine details, language information, etc.) is fed into the Pulpo server.
Step 2: The Pulpo server’s primary purpose is machine management and result logging. It invokes the test machine and executes the test automation based on the plan specified in the job ticket.
Step 3: Once the execution is completed, the logs/results are copied to the Pulpo server for further analysis.
Step 4: Results are logged to the automation dashboard “AutoDash”.
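The job ticket format itself isn’t shown in the post. As a rough sketch under stated assumptions (all field names below are hypothetical, not the actual Pulpo schema), a ticket carrying the Step 1 configuration might look like this:

```python
# Hypothetical sketch of a Pulpo job ticket as described in Step 1.
# Field names are illustrative assumptions; the real ticket format is not public.

job_ticket = {
    "test_cases": ["TC-1021", "TC-1022"],  # TC numbers to execute
    "platform": "mac",                     # target platform
    "machine": "imac-loc-01",              # test machine to invoke
    "language": "ja_JP",                   # locale under test
    "plan": "daily_smoke",                 # which test plan to run
}

# Fields the server would need before dispatching a run.
REQUIRED_FIELDS = {"test_cases", "platform", "machine", "language", "plan"}

def validate_ticket(ticket: dict) -> bool:
    """Check that a job ticket carries every field the server needs."""
    return REQUIRED_FIELDS.issubset(ticket)

print(validate_ticket(job_ticket))  # True
```

A validation step like this is why a malformed ticket can be rejected before a test machine is ever invoked.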
The Jerry framework contains automated test cases grouped under various test plans:
Daily Smokes – Basic tests for validating the daily build
Basic Features Plan – Contains test cases for legacy and new areas, covering feature smokes in Test Studio
Acceptance Plan – Contains acceptance and full test-pass coverage for features developed in legacy and the present release cycle in Test Studio
We started with one iMac machine dedicated to Dreamweaver automation. Soon after the proof of concept was successful, we added one more dedicated machine for automation on localized builds. The above test plans were executed on a pre-scheduled basis across all 15 locales per the predefined execution plan. Job tickets distributed across the 15 locales were fed to the Pulpo server either manually or automatically, triggered on the arrival of a new build in Codex. Typically, by the time we arrived at the office, build sanity was completed on all the locales and we were ready to share the builds with our vendor partners.
For monitoring and optimizing test coverage across the 15 languages, we followed a dedicated execution calendar. Based on the calendar, different automation test plans were executed on various locale/platform combinations daily. Daily smoke tests for build validation were executed, followed by dedicated full feature test passes on the weekends. Execution was pre-scheduled, and test coverage was distributed across locales for optimal results given the time and machine constraints.
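A calendar like this can be sketched as a simple rotation. The locale list and the three-locales-per-weekday slicing below are illustrative assumptions, not the actual Dreamweaver schedule; the point is that smokes run everywhere daily while heavier passes rotate to fit the machine constraints:

```python
# Hedged sketch of an execution calendar: daily smokes on every locale,
# a rotating slice of locales for the basic feature plan on weekdays,
# and the full acceptance pass on weekends. Locale codes are illustrative.

LOCALES = ["fr_FR", "de_DE", "ja_JP", "ko_KR", "zh_CN", "zh_TW", "es_ES",
           "it_IT", "pt_BR", "nl_NL", "sv_SE", "da_DK", "cs_CZ", "pl_PL",
           "ru_RU"]

def plans_for_day(day_index: int) -> dict:
    """Return the plan-to-locales mapping for a day (0 = Monday)."""
    # Every locale gets the daily smoke for build validation.
    schedule = {"daily_smoke": list(LOCALES)}
    if day_index < 5:
        # Weekdays: rotate three locales through the basic feature plan.
        start = (day_index * 3) % len(LOCALES)
        schedule["basic_features"] = LOCALES[start:start + 3]
    else:
        # Weekends: dedicated full feature pass on all locales.
        schedule["acceptance"] = list(LOCALES)
    return schedule
```

Over a five-day week the rotation touches all 15 locales with the deeper plan, which is the kind of optimization matrix the calendar encodes.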
Accomplishments & Learnings
In the Creative Cloud (CC) release, we benefitted from having automated test passes on localized builds across 15 languages:
- Overall test coverage efficiency improved fourfold compared to manual test execution
- Quick sanity tests for accepting a localized build before passing it to vendor partners increased efficiency
- Achieved quick turnaround time for basic feature testing by automation scripts
- Parallel certification on multiple builds (Patch and Main line builds)
- More focus by the localization functional testers on new features developed in the current sprint
- Prerelease build certification completely through automation
- Build blueprint and Code Sign verification through automation on all locales in 2 hours, compared to 32 hours of manual verification
- Support from the core team: It is essential to have the automation blessed from the English team for optimal support. In case of Dreamweaver, we got immense support from the team, especially from Kiran Patil (Quality Manager) and Arun Kaza (Sr. Quality Lead) for driving the automation efforts on localized builds
- Piloted the automation framework on one Tier-1, one double-byte, and one Cyrillic locale to ensure the framework was robust and would support automation on most of the locales
- Always document the issues/challenges you face while setting up automation; they act as a reference point later
- Ensure core scripts are independent of English strings. In Dreamweaver, updating the legacy automation scripts to run on localized builds was a big challenge, as the scripts were failing at string comparisons. Aishvarya Suhane (Localization Automation QE) was a great help, writing functions in the automation framework and creating a few new scripts to resolve localization-specific issues.
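The string-independence learning above can be sketched minimally: instead of comparing UI text against a hardcoded English literal, a script resolves the expected string for the locale under test from a string table keyed by a stable ID (the post notes strings could alternatively be pulled from ALF at runtime). The IDs and string values below are illustrative, not actual Dreamweaver resources:

```python
# Minimal sketch of a locale-independent string comparison, assuming a
# per-locale string table keyed by stable string IDs. Values below are
# illustrative assumptions, not real Dreamweaver strings.

STRINGS = {
    "en_US": {"menu.file.save": "Save"},
    "fr_FR": {"menu.file.save": "Enregistrer"},
    "ja_JP": {"menu.file.save": "保存"},
}

def expected_string(locale: str, string_id: str) -> str:
    """Resolve the expected UI string for the locale under test."""
    return STRINGS[locale][string_id]

def check_menu_label(actual_label: str, locale: str) -> bool:
    # Compare against the localized string, never a hardcoded English one;
    # hardcoded comparisons are exactly what made the legacy scripts fail.
    return actual_label == expected_string(locale, "menu.file.save")
```

With this pattern, the same script runs unchanged on all 15 locales; only the string table (or the runtime ALF lookup) varies.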
Cheetahs in the world of localization …
Special thanks to Guta Ribeiro for inspiring and mentoring me to write my first blog, and to Rakesh Lal for his support.