QUALITY POLICY
Our Value Proposition
CEO Foundry is a global consulting and technology services company offering industry-specific solutions, strategic outsourcing, and integration services through a unique onsite, offsite, and offshore delivery model that helps our clients achieve rapid deployment, world-class quality, and reduced costs.
Our Mission Statement
Our mission is to bring value to our employees, our stakeholders, and the global business community by delivering advanced technology solutions that enable our clients and trading partners to operate, interoperate and compete more effectively. In support of this mission, we commit to the following:
- Ensure that all of our products comply with relevant safety and regulatory requirements.
- Ensure our products meet or exceed their published specifications.
- Maintain and continually improve our product and service business management systems so that they conform, at a minimum, to the ISO 9001 Quality Management Standard, or to more stringent or legally required standards where specific markets demand them.
- Continually monitor the quality of our customer interactions, with the intent to improve our customers' total experience.
- Establish quality requirements for suppliers, partners, and contractors, and hold them accountable for compliance.
- Treat customers in accordance with CEO Foundry Standards of Business Conduct and Privacy policies.
Our Quality Assurance Procedures
This document outlines the general procedures for a typical localization project. It provides the lead test engineer with guidelines for developing a project-specific QA plan.
1. Prepare Testing Environment
Conduct a technical analysis based on client materials such as:
- user manuals
- reference documents
- configuration guides
- installation guides
- online help
- source-version software
- leverage (reusable content) from the previous version
- any other related materials, if available
2. QA Team Priorities
Prepare a project-specific test plan, which includes:
- Thorough analysis of the project specification and goals
- A complete, step-by-step methodology, including test cases and testing scripts for use by QA, engineers, and translators throughout the project (a test case sketch follows this list)
- Setup and configuration instructions for testing environment
- Detailed timetable for testing with project milestones
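For illustration, a single test case entry in such a plan might be captured in a structure like the Python sketch below; the field names and values are assumptions for the purpose of the example, not a prescribed schema.

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    """One entry in the project-specific test plan (illustrative fields only)."""
    case_id: str                    # e.g. "UI-0042"
    area: str                       # product area under test, e.g. "Installer"
    locale: str                     # target locale, e.g. "de-DE"
    steps: list[str] = field(default_factory=list)  # reproduction/execution steps
    expected_result: str = ""       # observable outcome that defines a pass
    milestone: str = ""             # timetable milestone the case belongs to

example = TestCase(
    case_id="UI-0042",
    area="Installer",
    locale="de-DE",
    steps=["Launch setup.exe", "Select German", "Complete a default installation"],
    expected_result="All installer screens display fully translated, untruncated text",
    milestone="Build Acceptance",
)
```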
Determine the testing approaches and methods, such as unit, integration, and functional testing, necessary for carrying out the project-specific QA process.
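Where automated testing is chosen, the different levels could be kept separate with test markers, as in the following hypothetical pytest sketch; the helper function and marker names are illustrative assumptions only.

```python
# test_levels.py - illustrative only; localized_title() stands in for real product code.
# Markers such as "unit" and "functional" would be registered under [pytest] markers
# in pytest.ini, and a single level can be run with, e.g., "pytest -m unit".
import pytest

def localized_title(locale: str) -> str:
    """Stand-in for a product function returning a localized window title."""
    titles = {"en-US": "Settings", "de-DE": "Einstellungen"}
    return titles[locale]

@pytest.mark.unit
def test_german_title_is_translated():
    # Unit level: one routine checked in isolation.
    assert localized_title("de-DE") == "Einstellungen"

@pytest.mark.functional
def test_every_supported_locale_has_a_title():
    # Functional level: behaviour checked across the published set of locales.
    for locale in ("en-US", "de-DE"):
        assert localized_title(locale)
```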
The project manager assigns the testing team, making sure all members understand the client's specifications and project requirements.
Ensure the correct preparation of hardware and software environments
Ensure all team members, including translators, are fully trained in all areas of the product to be localised
Ensure all team members are fully trained in the use of the bug tracking system and in correct bug reporting methods
3. Initiate Testing
3.1 Minimal Acceptance Tests
- Entry Criteria: A series of Build Acceptance Tests is performed on localised builds received from engineers. These tests require a certain level of quality before full testing commences, ensuring that QA is incorporated into all stages of the process (a build acceptance check is sketched after this list)
- Exit Criteria: A series of tests performed on the proposed final build of the product, comprising a subset of the overall testing plan and technical analysis. The results of these tests determine whether the product is approved for final linguistic testing.
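A minimal sketch of what an entry-criteria build acceptance check might look like is shown below, assuming a hypothetical localized build laid out on disk; the required file list and paths are illustrative only, not real project values.

```python
# smoke_check.py - hypothetical build acceptance (entry criteria) check for a localized build.
from pathlib import Path

REQUIRED_FILES = [
    "app.exe",
    "resources/strings.de-DE.json",
    "help/index.de-DE.html",
]

def build_is_acceptable(build_root: str) -> bool:
    """Entry criteria: every required localized file exists and is non-empty."""
    root = Path(build_root)
    ok = True
    for rel in REQUIRED_FILES:
        f = root / rel
        if not f.is_file() or f.stat().st_size == 0:
            print(f"FAIL: missing or empty file: {rel}")
            ok = False
    if ok:
        print("PASS: build meets minimal acceptance criteria")
    return ok

if __name__ == "__main__":
    build_is_acceptable("builds/de-DE/latest")
```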
3.2 Functional Testing
Functional testing of all areas of the product ensures that it performs as closely to the English version as possible. This entails a series of tests that perform a feature-by-feature validation of behaviour, using a wide range of standard and erroneous input data. Depending on the project specification, it can range from simple smoke testing of the product to thorough script-based testing of the product's user interface, APIs, database management, security, installation, networking, etc. The QA team at Oxford Conversis can perform functional testing on an automated or manual basis using black-box and white-box methodologies. Typical testing criteria for a localization project would include:
- Technical verification to ensure the quality of individual files in localized builds received from engineers. Criteria include code integrity, link integrity, variable integrity, string consistency, etc. (a consistency-check sketch follows this list)
- Compatibility testing to ensure compatibility of a software product or Web site with different hardware platforms, browsers, network configurations, operating systems, and other third-party software. Our testing labs at Oxford Conversis offer all the hardware and software needed for such testing, including:
- PCs, Macs, and UNIX workstations and servers
- Windows 95/98/ME/2000/NT/XP, Mac OS, and many platform versions of UNIX
- All versions of Internet Explorer, Netscape, and Mozilla
- Performance testing is very much project-specific; depending on the product to be localised, a number of testing procedures such as load, stress, and capacity testing can be applied to test the application's scalability.
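As an example, part of the technical verification above (string and variable integrity) could be automated along the following lines; the JSON catalogue format and file names are assumptions for illustration, and the real resource format may differ.

```python
# check_strings.py - illustrative consistency check between source and localized
# string catalogues, assuming simple key/value JSON files.
import json
import re

PLACEHOLDER = re.compile(r"%\w|\{\w+\}")  # e.g. %s, %d, {name}

def compare_catalogues(source_path: str, target_path: str) -> list[str]:
    """Report missing keys, obsolete keys, and placeholder (variable) mismatches."""
    with open(source_path, encoding="utf-8") as f:
        source = json.load(f)
    with open(target_path, encoding="utf-8") as f:
        target = json.load(f)

    problems = []
    for key, src_text in source.items():
        if key not in target:
            problems.append(f"missing translation for key: {key}")
            continue
        # Variable integrity: the same placeholders must survive translation.
        if sorted(PLACEHOLDER.findall(src_text)) != sorted(PLACEHOLDER.findall(target[key])):
            problems.append(f"placeholder mismatch in key: {key}")
    problems.extend(f"obsolete key in target: {k}" for k in target if k not in source)
    return problems

if __name__ == "__main__":
    for issue in compare_catalogues("strings.en-US.json", "strings.de-DE.json"):
        print(issue)
```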
3.3 User Interface Testing
User interface testing is performed on the running application or web site. Our visual QA procedures provide comprehensive testing of all possible cosmetic issues in the localised product. All testing involves valuable input from our language experts. Throughout the testing cycle, exploratory testing is also performed by expert QA engineers to find defects that may not be found by formal testing methods. Typical criteria for such testing would include:
- Verification of complete translation of text, and images if applicable, on all screens.
- Testing for correct translation, with particular focus on newly added strings in the built version
- Testing to ensure that translated text always conveys the correct meaning in the target locale.
- Verification of correct layout of all data on each screen
- Check for correct display of language-specific characters, such as the German umlaut, with regard to locale-specific keyboard settings.
- Verification of other international standards issues such as tags, fonts, date and time formats, and currencies
- Reporting and fixing user interface problems such as text truncated by longer translations; German, for example, typically expands considerably in translation and frequently causes truncation (a truncation and hotkey check is sketched after this list).
- Check for duplication and correct functioning of hot-keys on all screens.
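Two of these checks, text truncation and duplicated hotkeys, lend themselves to a simple automated pass over extracted dialog labels, as sketched below; the data structures and character limits are illustrative assumptions, not our actual tooling.

```python
# ui_checks.py - illustrative truncation and hotkey checks over extracted dialog labels.
# A "label" here is (text, max_chars); in practice the limits would come from the UI layout.
from collections import Counter

def truncated_labels(labels: list[tuple[str, int]]) -> list[str]:
    """Flag translations longer than the space available for them."""
    return [text for text, max_chars in labels if len(text) > max_chars]

def duplicate_hotkeys(labels: list[tuple[str, int]]) -> list[str]:
    """Flag hotkeys (the character after '&') used more than once on a screen."""
    keys = [
        text[text.index("&") + 1].lower()
        for text, _ in labels
        if "&" in text and not text.endswith("&")
    ]
    return [key for key, count in Counter(keys).items() if count > 1]

screen = [("&Datei", 12), ("&Drucken...", 12), ("Datenschutzeinstellungen", 16)]
print("Truncated:", truncated_labels(screen))           # the long German label overflows
print("Duplicate hotkeys:", duplicate_hotkeys(screen))  # 'd' is used by two labels
```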
3.4 Linguistic Testing
In the linguistic testing phase, our testing specialists add another level of quality assurance by thoroughly testing the final build, conclusively ensuring that the product is both linguistically and technically sound, i.e. that the translations are accurate, grammatically correct, and culturally appropriate.
4. Bug Reporting and Tracking
Using the testing methodology described above, our QA engineers, translators, and language specialists log issues, or bugs, in our online bug tracking system (BTS). Localization issues that can be fixed by our engineers are fixed accordingly. Internationalization and source-code issues are also logged and reported to the client with suggestions on how to fix them. The bug tracking process is as follows:
New bugs are submitted in the bug tracking system by the QA engineer.
When a bug is logged, our QA engineers include all information relevant to that bug, such as (a bug record sketch follows this list):
- Date/time logged
- Language
- Operating System
- Priority - Low/Medium/High/Urgent
- Screenshot of the problem, where possible
- Bug type - e.g. functional, UI, installation, translation
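For illustration, a logged bug could be represented by a record along these lines; the field names and example values are assumptions, not the actual schema of our bug tracking system.

```python
# bug_record.py - illustrative representation of a logged bug; not our BTS's actual schema.
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum

class Priority(Enum):
    LOW = "Low"
    MEDIUM = "Medium"
    HIGH = "High"
    URGENT = "Urgent"

@dataclass
class Bug:
    logged_at: datetime                  # date/time logged
    language: str                        # e.g. "de-DE"
    operating_system: str                # e.g. "Windows XP"
    priority: Priority                   # Low/Medium/High/Urgent
    bug_type: str                        # e.g. "functional", "UI", "installation", "translation"
    summary: str
    steps_to_reproduce: list[str] = field(default_factory=list)
    screenshot_path: str | None = None   # optional screenshot of the problem
    status: str = "Open"                 # every new bug starts in the Open state

example = Bug(
    logged_at=datetime.now(),
    language="de-DE",
    operating_system="Windows XP",
    priority=Priority.HIGH,
    bug_type="UI",
    summary="Truncated label on the print dialog",
    steps_to_reproduce=["Open the File menu", "Choose Drucken...", "Observe the cut-off label"],
)
print(example.status)  # "Open"
```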
The QA engineer also analyses the error and describes, in the minimum number of steps, how to reproduce the problem for the benefit of the engineer. At this stage the bug is labelled "Open". Each issue must pass through at least four states:
- Open: Opened by QA during testing
- Pending: Fixed by Engineer but not verified as yet
- Fixed: Fix Verified by QA
- Closed: Fix re-verified before sign-off
In order to manage the testing process efficiently, it is important to include several other bug status levels, such as: Not a Bug, Use as Is, Deferred, Regressed, and Cannot Reproduce.
- The engineering team fixes bugs according to the priority assigned, and changes the status to 'Pending'
- The engineer releases a new internal version with the fixed bugs and new features.
- QA verifies and checks the bugs in the new release, and changes the status accordingly
- Fixed bugs are closed in the BTS by QA; bugs that are not fixed are 'Reopened' in the BTS (the workflow is sketched below)
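The workflow above can be summarised as a small state machine. The transition table below is a simplified sketch built from the states named in this section, not a complete model of our tracking system.

```python
# bug_workflow.py - simplified sketch of the bug status transitions described above.
ALLOWED_TRANSITIONS = {
    "Open":      {"Pending", "Not a Bug", "Use as Is", "Deferred", "Cannot Reproduce"},
    "Pending":   {"Fixed", "Reopened"},   # engineer has fixed it; QA verifies or reopens
    "Fixed":     {"Closed", "Regressed"}, # re-verified before sign-off, or the defect returns
    "Reopened":  {"Pending"},
    "Regressed": {"Pending"},
}

def move(status: str, new_status: str) -> str:
    """Apply a status change only if the workflow allows it."""
    if new_status not in ALLOWED_TRANSITIONS.get(status, set()):
        raise ValueError(f"illegal transition: {status} -> {new_status}")
    return new_status

status = "Open"
for step in ("Pending", "Fixed", "Closed"):  # the four mandatory states, in order
    status = move(status, step)
print(status)  # "Closed"
```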
Depending on the quality of the translations and engineering, and on the number of bugs logged, the QA team decides on the number of test cycles for the project. This is usually no more than two or three full passes before the Final Bug Regression and release to Linguistic Testing.