
Quality Assurance Management


For clients who wish to outsource some or all of the quality assurance for a project, Waysys offers the Quality Assurance Management Service.  The service provides processes and tools for QA planning, test case analysis, inspections, manual and automated testing, and QA metrics.  Waysys can provide QA staffing or use the client's staff.

 

The diagram below depicts the components of the service.  Clients can select one or more of these components.

 

Components of Quality Assurance Management


 

Quality Assurance Planning

 

Waysys is experienced in project management and quality assurance methodologies, including the Scrum agile project management approach.  Based on this experience, Waysys helps clients plan their QA activities.  See Quality Assurance JumpStart™ for details of the planning process.

 

Because performance testing is one of the most difficult aspects of testing an application, the service offers a separate template for the performance test plan.  The template assists in planning the many tasks necessary for successful performance testing.

 

Test Case Writing

 

The service employs the Framework for Integrated Test (FIT).  FIT test cases are expressed as tables of inputs and expected outputs and are written by business analysts or other business experts based on business requirements.  FIT offers these benefits:

 

It provides a documented, non-proprietary methodology for writing test cases.  This documentation makes it easier to train QA personnel in developing test cases.

 

It reduces the work of writing test cases, so that the effort becomes manageable.

 

Test cases can be provided to developers to help them understand requirements, thereby helping to ensure more accurate implementations.

 

Test cases can be automated, making it possible to perform tests repeatedly during the project and during maintenance of the system.  
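
To illustrate how these table-based test cases connect to code, the sketch below shows a minimal Fit column fixture in Java.  The class name, field names, and calculation are hypothetical; the sketch assumes the standard Java Fit library's fit.ColumnFixture base class, in which input columns map to public fields and calculated columns (whose headers end in parentheses) map to public methods.

import fit.ColumnFixture;

// Hypothetical fixture backing an HTML test table whose header row names the
// columns "premium", "surcharge", and "total()".  For each row, Fit sets the
// public fields from the input columns and compares the value returned by
// total() with the expected value in the last column.
public class PremiumCalculation extends ColumnFixture {
    public double premium;    // input column
    public double surcharge;  // input column

    public double total() {   // calculated column checked against the table
        return premium + surcharge;
    }
}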

 

Automated Testing

 

Waysys emphasizes automated testing for projects.  Automated testing converts perishable manual effort into a reusable asset.  Waysys has developed or enhanced several tools for automated testing, including:

 

GFIT

 

These tools support the FIT methodology.
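
As a purely illustrative sketch of what an automated FIT run might look like, the example below assumes the standard Java Fit library's fit.FileRunner, which reads an HTML document of test tables and writes a copy with each expected value marked as passing or failing.  The file names are hypothetical.

// Sketch only: drives a single Fit test document through the standard Java runner.
public class RunAcceptanceTests {
    public static void main(String[] args) throws Exception {
        fit.FileRunner.main(new String[] {
            "tests/premium-calculation.html",    // input: tables written by analysts
            "results/premium-calculation.html"   // output: same tables marked pass/fail
        });
    }
}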

 

Test Execution

 

As part of the service, Waysys can use the client's staff as available and supplement this staff with both manual testers and test automation specialists.

 

Training

 

Testers require training in QA methodologies, the system under test, and the QA tools.  Waysys provides hands-on, on-site classroom training and on-the-job training in all of these areas.

 

QA Management

 

On-going management of the quality assurance team is vital to the success of the project.  Waysys can either work with the client's QA manager or provide a QA manager.

 

Document Management

 

Even in agile projects, QA uses a wide variety of documents.  Often, these documents go through multiple revisions.  Proper organization of this documentation improves quality assurance productivity.  As part of the service, Waysys provides approaches to organizing and managing documentation, including the use of wikis.

 

Metrics

 

The purpose of collecting metrics is to provide the project team, project management, IT management, business management, and other stakeholders with these types of information:

 

The impact of quality assurance activities on staffing and schedule as they evolve through the project

The impact of defect repair efforts on staffing and schedule

Measures of the quality of the system implementation and its impact on future maintenance of the system

Indications of when the system is ready for deployment

 

As part of the system test effort, the team will monitor certain metrics derived from the testing:

 

Metric: Total number of test cases written
Purpose: Measures the amount of testing being planned.  Can be compared to the initial planned number of test cases.

Metric: Number of test cases run and not run
Purpose: Measures the amount of testing completed and the amount remaining to be done.  Provides an indication of the amount of time needed to complete testing.

Metric: Number of test cases run per day
Purpose: Measures testing productivity.  Can be compared to the planned productivity to assess the accuracy of the schedule.  Can be used to predict the amount of time needed to complete testing.

Metric: Ratio of test cases passing to test cases run
Purpose: Measures the quality of the system.  Can be used to estimate the amount of testing that must be rerun.

Metric: Defects opened or reopened by severity
Purpose: A count of the number of defect reports opened or reopened during a week, by level of severity.

Metric: Defects closed by severity
Purpose: A count of the number of defects closed, by level of severity.  Measures productivity in closing defects.

Metric: Open defects by severity
Purpose: A count of open defects by level of severity.  An upward trend in this number indicates that the system is not ready for deployment.  A downward trend indicates that the system may be approaching a point where it is safe to deploy.
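
As a purely illustrative sketch of how a few of these monitored metrics might be computed from raw test counts, the Java class below derives the remaining-work count, the pass ratio, the daily run rate, and the projected days to completion.  All names and calculations are hypothetical and are not part of any Waysys tool.

// Illustrative only: derives some of the monitored metrics from simple counts.
public class TestRunMetrics {
    private final int testCasesWritten;
    private final int testCasesRun;
    private final int testCasesPassed;
    private final int elapsedTestDays;

    public TestRunMetrics(int written, int run, int passed, int elapsedDays) {
        this.testCasesWritten = written;
        this.testCasesRun = run;
        this.testCasesPassed = passed;
        this.elapsedTestDays = elapsedDays;
    }

    // Amount of testing remaining to be done
    public int testCasesNotRun() {
        return testCasesWritten - testCasesRun;
    }

    // Ratio of test cases passing to test cases run (system quality indicator)
    public double passRatio() {
        return testCasesRun == 0 ? 0.0 : (double) testCasesPassed / testCasesRun;
    }

    // Testing productivity: test cases run per day
    public double runsPerDay() {
        return elapsedTestDays == 0 ? 0.0 : (double) testCasesRun / elapsedTestDays;
    }

    // Predicted days needed to complete the remaining test cases at the current rate
    public double daysToComplete() {
        double rate = runsPerDay();
        return rate == 0.0 ? Double.POSITIVE_INFINITY : testCasesNotRun() / rate;
    }
}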

 

 

As a management tool, the QA Manager maintains estimates of:

 

the number of test cases to be produced

the number of tests that can be performed per day

the projected schedule for testing

 

As actual test case metrics are produced, they can be compared to these estimates to assess the impact on the actual schedule versus what was planned.
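
As a hedged illustration of that comparison, the sketch below projects the remaining testing effort from the QA Manager's estimates and the actual run rate observed so far.  All figures and names are hypothetical and serve only to show the arithmetic involved.

import java.time.LocalDate;

// Illustrative only: compares the planned testing productivity with the actual
// run rate observed so far, and projects when testing will finish at the current pace.
public class ScheduleProjection {
    public static void main(String[] args) {
        // Planning estimates maintained by the QA Manager (hypothetical figures)
        int plannedTestCases     = 400;   // test cases expected to be produced
        double plannedRunsPerDay = 20.0;  // tests expected to be performed per day

        // Actual metrics collected so far (hypothetical figures)
        int testCasesWritten = 380;
        int testCasesRun     = 150;
        int elapsedTestDays  = 10;

        double actualRunsPerDay     = (double) testCasesRun / elapsedTestDays;
        double plannedDaysRemaining = (plannedTestCases - testCasesRun) / plannedRunsPerDay;
        double actualDaysRemaining  = (testCasesWritten - testCasesRun) / actualRunsPerDay;

        System.out.printf("Planned days remaining: %.1f%n", plannedDaysRemaining);
        System.out.printf("Projected days remaining at the actual rate: %.1f%n", actualDaysRemaining);
        System.out.println("Projected completion date: "
                + LocalDate.now().plusDays((long) Math.ceil(actualDaysRemaining)));
    }
}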