QA City


Algorithm for Building Analytics-Driven Test Coverage

Ajay Jain
Engineering Manager, Quality, Creative Suites
Ajay is the Engineering Manager, Quality at Adobe Systems, Inc.
This article is a continuation of Analytics-Driven Test Coverage - Part I. It focuses on a step-by-step algorithm for building analytics-driven test coverage.

Analytics is defined as the process of measuring, collecting, and tracking information based on a user's actions. This proposal applies analytics principles to the testing domain.

This paper discusses building a system which can measure, compile, and track test coverage in a relatively easy and effective way. The approach suggests integrating Analytics with specific test objects to gather and analyze test data parameters and values when the test objects are acted upon (tested) by the users (testers).

To implement the suggested approach, each test module is broken into logical test variables and data values. When these specific data values are triggered (based on the actions of testers), results are automatically sent to a server that compiles, computes, and analyzes the test data and charts the overall test coverage.

Here is a step-by-step algorithm for building analytics-driven test coverage:
  1. Estimate total test cases and test coverage
  2. Create logical test objects
  3. Add analytics logic to test objects
  4. Testers act on the test objects
  5. Send data to the dashboard
  6. Process raw data and chart test coverage
Step 1: Estimate Total Test Cases and Test Coverage

Total test cases (TCcount) in a software project is represented as:

TCcount = Tc1 + Tc2 + ... + Tcn

Tc = Test case

Each test parameter should be identified, and all data values associated with that test parameter should be explicitly recorded.
Test Label = array [test data values]

This chart represents 7 test variables, each running with 8 possible associated data values.

Table 1: Test coverage matrix generated with all applicable test cases and data values

Consider an example. If the End User Licensing Agreement (EULA) can be displayed to the user in 34 languages, the test label is EULA Language Selection, and the values of this data set are the 34 languages in which the EULA can be displayed.
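Step 1 can be sketched in code. The following is an illustrative sketch only, assuming each data value counts as one test case; the label names and the `test_labels` dictionary are hypothetical, not part of the original proposal.

```python
# Step 1 sketch: each test label maps to an array of test data values.
# Labels and values here are illustrative assumptions.
test_labels = {
    "EULA Language": ["English", "French", "German", "Dutch"],
    "Install Location": ["Default", "Custom"],
    "Navigation": ["Next", "Cancel"],
}

# Estimate total test cases: one per (label, value) pair.
total_test_cases = sum(len(values) for values in test_labels.values())
print(total_test_cases)  # 8
```

In a real project the dictionary would hold every test parameter and its full value set, e.g. all 34 EULA languages under the EULA Language label.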

Step 2: Create Logical Test Objects

A logical test object encompasses a group of tests that have equal probability of execution. These objects can be identified as transitional objects that a tester selects while progressing in the testing activity.

Thus, logical test objects are created by grouping all possible test selections (user actions) available to a tester.

Object 1: Test cases {1, 2, 3, 4}
Object 2: Test cases {5, 6, 7, 8}
Object 3: Test cases {9, 10, 11, 12}

This configuration implies that Test Object 1 groups test cases 1, 2, 3 and 4 while test object 2 groups test cases 5, 6, 7 and 8, and so on.
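The grouping above can be sketched as a simple mapping. This is a minimal illustration; the object names and the `object_for` helper are assumptions for the sketch, not part of the proposal.

```python
# Step 2 sketch: group test cases into logical test objects.
test_objects = {
    "Object 1": {1, 2, 3, 4},    # e.g. EULA screen tests
    "Object 2": {5, 6, 7, 8},    # e.g. installer configuration tests
    "Object 3": {9, 10, 11, 12},
}

def object_for(test_case):
    """Return the logical test object that owns a given test case."""
    for name, cases in test_objects.items():
        if test_case in cases:
            return name
    return None

print(object_for(6))  # Object 2
```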

Consider an example in which a tester is traversing an installation using an executable file.

The logical entities that a tester reviews are:

1) Installer screen where the End User Licensing Agreement (EULA) is placed. The various test labels (parameters) to consider include EULA language, EULA content, different navigation buttons (Next, Cancel), etc.
2) Installer screen where the user can select the configuration of products to install, the install location, the installation folder name, different navigation buttons (Next, Cancel), etc.

Thus, in these two examples, the EULA screen and the installer configuration screen are the test objects.

Step 3: Add Analytics Logic to Test Objects

In this step, analytics code is associated with the logical objects, such as the installer screen. The analytics code is tagged to each test case and records the actions taken by the testers, such as the selection of an option or value.

Associating analytics with test objects helps in recording all user actions. These are then studied further to determine the parameter values invoked by the tester and workflows triggered by the user. Effectively, it provides details of all actions taken by the user on the test object.
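One way to picture this tagging is a small recording hook wired into a test object's actions. This is a hedged sketch: `record_event`, the `events` log, and the `EulaScreen` class are hypothetical stand-ins for a real application's UI layer and analytics SDK.

```python
# Step 3 sketch: tag analytics onto a test object's actions.
events = []

def record_event(label, value):
    """Append a (test label, data value) pair to the analytics log."""
    events.append({"label": label, "value": value})

class EulaScreen:
    """Hypothetical test object for the EULA installer screen."""

    def select_language(self, language):
        record_event("EULA Language", language)  # analytics tag

    def click(self, button):
        record_event("Navigation", button)       # analytics tag

screen = EulaScreen()
screen.select_language("Dutch")
screen.click("Next")
print(events)
```

Every action a tester takes on the screen now leaves a (label, value) record that can later be shipped to the analytics server.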

Step 4: Testers Act on the Test Objects

Analytics will not provide any data unless the tester acts on the application being tested. Step 4 covers the actions, intentional or unintentional, taken by the testers (once or multiple times) on all possible areas and options. During testing, a tester may complete a workflow, cancel a workflow, change and select different values, pause on a particular step, or get occupied with other work. All these actions are captured by analytics.

Data provided by analytics not only helps in analyzing how the tester is testing the application but also helps in predicting and determining what the user expects from the application. When working on an application that includes built-in analytics, analytics data helps in understanding user preferences, interests, and intent. Analytics helps in predicting how the user wants the application to behave.

Step 5: Send Data to Dashboard

Auto-data upload triggers are set in each test object, through which test variables and data values are sent to the server for analysis and processing. For each trigger, the application sends the captured data to the analytics server. If the user is not connected to the internet, a local copy of the captured data is maintained and sent to the server when the user is back online.

Data is sent to the analytics server in the form of Analytics paging data. The encapsulated format includes the test label that was tested and the corresponding values that the tester selected while performing the testing. A diagram of analytics paging data is shown below:

Figure 5: Analytics paging data format
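A minimal sketch of the upload step, assuming the paging data carries just the test label and selected value. The `send_to_server` function and the offline queue are illustrative stand-ins; a real implementation would POST the payload to the analytics server over HTTP.

```python
# Step 5 sketch: send captured data, queuing a local copy while offline.
import json

offline_queue = []

def send_to_server(payload, online=True):
    """Upload a payload, or keep a local copy while offline."""
    if not online:
        offline_queue.append(payload)
        return False
    # a real implementation would transmit `payload` here
    return True

payload = json.dumps({"label": "EULA Language", "value": "Dutch"})
send_to_server(payload, online=False)        # stored locally
results = [send_to_server(p) for p in offline_queue]  # flushed when online
offline_queue.clear()
print(results)  # [True]
```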

Step 6: Process Raw Data and Chart Test Coverage

On the server, analytics paging constantly brings data (test labels and their corresponding values) from the testers' computers. This raw data is processed and grouped based on the test labels. Because multiple testers are testing the product, data coming from these multiple channels requires effective grouping.
For a tester, there might be only one variable selection for a test case. However, when data comes from multiple sources, you might encounter duplicate entries of the same data values for a test case. For example, two testers might choose to set the license agreement language to Dutch. To accommodate such scenarios, the analytics program increases the count of the licensing agreement language parameter to two.

Analytics processing plays an important role. It groups relevant data into logical entities. For example, EULA language selection data is stored separately from values of install location.

This data is then used to plot test coverage charts that provide a testing snapshot. You can review the testing areas that are covered and determine areas that are not tested adequately and require re-prioritization of test cases and resources.

The three steps that convert raw data sent by analytics paging methods are Grouping, Analysis, and Charting. Grouping involves logically partitioning relevant data into single units. Analysis involves determining counts of the used and unused test values. Charting involves displaying data variables and their corresponding values in human-readable format.

Figure 6: Three components of data processing
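The Grouping and Analysis stages can be sketched with standard counting tools. This is an illustrative sketch assuming raw paging data arrives as (label, value) pairs; in practice the Charting stage would feed these counts into a plotting library.

```python
# Step 6 sketch: group raw paging data by test label and count value usage.
from collections import Counter, defaultdict

raw_data = [
    ("EULA Language", "Dutch"),
    ("EULA Language", "Dutch"),      # duplicate entry from a second tester
    ("EULA Language", "English"),
    ("Install Location", "Default"),
]

# Grouping: partition values under their test label.
grouped = defaultdict(Counter)
for label, value in raw_data:
    grouped[label][value] += 1

# Analysis: duplicate selections raise the count, as in the Dutch example.
print(grouped["EULA Language"]["Dutch"])  # 2
```

Untested values show up as labels or values with a count of zero, which is what the coverage charts surface for re-prioritization.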
