Last Friday the l10n-drivers team launched a Testing & QA survey targeted at the localizer community. The survey will stay open for another week, and I would like to encourage all of you who are involved in your l10n team's testing efforts to take it (if you haven't already, of course).
We all know the localization process is much more than translation. As much as it consists of actually translating the strings you see in the product's interface, it also involves many other activities: adapting the product to the needs of local users (think of the default search plug-in, for example), making sure the first-run experience is good for local users (web parts: the firstrun page, the 'Getting started' page, etc.), and ensuring that the software update process is available in the user's language. It also involves a fair share of testing and quality assurance, to make sure everything is in place on the day of the release.
Over their years of active work, many localization teams have developed their own testing practices and procedures. We call these "testing plans". Now, you may think that's too grand a term in some cases, but we like to think that even a short checklist of things to verify before a release (e.g. accesskeys) is, in fact, a simple testing plan. And it is equally interesting to see what items are included in such a list, as well as how the list itself is stored, presented and maintained (Litmus, a wiki, a Google Doc and many other possibilities).
So it's natural that when Mozilla starts thinking about localization testing plans, we first look at what has already been invented and proven to work well. There are many localization teams, each with its own way of testing its localization, suited to its needs and shaped by its approach and past experience. What if we could share these practices between localization teams and help other teams adopt them?
The above briefly summarizes the discussion we had about the objectives of the survey. If you would like to help us, please take the survey! :)
The survey consists of 20 questions, split into four sections:
- Section 1. Tell us about you and your team
- Section 2. Test Cases, Testing Coverage and Planning
- Section 3. How can Mozilla help?
- Section 4. May we contact you?
It should take about 10 minutes to complete. Thanks!