Software Performance Testing

In software engineering, performance testing is a practice that aims to determine how a system performs in terms of responsiveness and stability under a particular workload. It is also used to investigate, measure, validate, and verify other quality attributes of a system, such as scalability, reliability, and resource usage.

From: https://en.wikipedia.org/wiki/Software_performance_testing

Proposed approach

Testing area

Considering the testing landscape, the proposed approach focuses on the following areas:

Abstraction levels:

☐ Integration testing

☐ Component interface testing

☐ System testing

Types:

Software performance testing

☐ Smoke testing

☐ Non-functional testing

Conceptual Process – Define > Execute > Analyze

Platform Application Modules

To support the testing process, S2E integrates the following software modules:

Define

☐ Browser Automation

☐ Script & configuration repository

Execute

☐ Performance Test Engine

☐ Metrics repository

☐ Test Automation

Analyze

☐ Dashboard

☐ Metrics repository

☐ Performance Monitor Integration

Technological Stack

Engine

☐ Browser Automation: Selenium

☐ Performance Test Engine: JMeter

☐ Script & Configuration Repository: GIT

Dashboard & Metrics

☐ Dashboard: Grafana

☐ Metrics: InfluxDB

☐ Performance Monitor Integration: Make

Continuous Testing

☐ Test Automation: Taurus + Jenkins
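A test-automation entry point for this stack could be a Taurus configuration kept in the script repository and launched from Jenkins. The sketch below builds a minimal configuration as JSON (Taurus accepts YAML or JSON); the concurrency figures, scenario name, and URL are illustrative placeholders, not values from this document.

```python
import json

# Minimal Taurus configuration sketch (placeholder values throughout).
config = {
    "execution": [{
        "executor": "jmeter",   # run the scenario through the JMeter engine
        "concurrency": 50,      # number of virtual users (placeholder)
        "ramp-up": "1m",        # time to reach full concurrency
        "hold-for": "5m",       # steady-state duration
        "scenario": "checkout",
    }],
    "scenarios": {
        "checkout": {
            # Placeholder endpoint; a real scenario would list the
            # application's critical transactions.
            "requests": ["http://example.com/checkout"],
        }
    },
}

with open("load-test.json", "w") as f:
    json.dump(config, f, indent=2)
```

Jenkins would then run `bzt load-test.json` (the Taurus CLI) as a build step, making the load test part of the continuous-testing pipeline.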

Activities

☐ Core module implementation (Selenium, JMeter, GIT integration)

☐ Defining the test campaigns based on the critical applications

☐ Early log correlation.

☐ Early integration between “SwPerformance Testing” and “Monitor System Data”, to correlate performance events with the system and application logs.

☐ Dashboard and custom Performance Database.

Deliverables

Test case scripts

Load test configurations

Load test results, based on:

  • Number of concurrent users
  • Response time
  • Number of errors occurred
  • Exchanged bytes
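As a sketch, the four result figures above can be extracted from a JMeter CSV results file (.jtl). The column names assume JMeter's default CSV header, and the embedded sample rows are illustrative only:

```python
import csv
import io

# Illustrative .jtl sample; real files are produced by JMeter and the
# header may vary with the JMeter configuration.
sample_jtl = """timeStamp,elapsed,label,responseCode,success,bytes,allThreads
1700000000000,120,Home,200,true,5120,10
1700000000100,340,Login,200,true,2048,10
1700000000200,900,Search,500,false,512,10
"""

rows = list(csv.DictReader(io.StringIO(sample_jtl)))
summary = {
    # Peak of the active-thread counter approximates concurrent users.
    "concurrent_users": max(int(r["allThreads"]) for r in rows),
    "avg_response_ms": sum(int(r["elapsed"]) for r in rows) / len(rows),
    "errors": sum(r["success"] != "true" for r in rows),
    "exchanged_bytes": sum(int(r["bytes"]) for r in rows),
}
print(summary)
```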

Depending on the monitoring system integration, the load test results are correlated with basic system metrics such as:

  • CPU usage
  • Memory usage
  • I/O usage
  • Bandwidth usage

Dashboarding – KPI measures and representations

  • Number of Users
  • Hits per Second
  • Errors per Second
  • Response Time
  • Latency (time between request and first response)
  • Connect Time
  • Bytes/s (Throughput)
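Several of these KPIs are per-second aggregations of the raw request records. A minimal sketch, assuming illustrative records of the form (epoch second, success flag, bytes):

```python
from collections import defaultdict

# Illustrative raw request records: (epoch_second, success, bytes).
records = [
    (100, True, 4096), (100, True, 2048), (100, False, 512),
    (101, True, 4096), (101, True, 4096),
]

# Bucket records by second to obtain hits/s, errors/s, and bytes/s.
kpi = defaultdict(lambda: {"hits": 0, "errors": 0, "bytes": 0})
for second, ok, nbytes in records:
    bucket = kpi[second]
    bucket["hits"] += 1
    bucket["errors"] += 0 if ok else 1
    bucket["bytes"] += nbytes

for second in sorted(kpi):
    b = kpi[second]
    print(second, b["hits"], b["errors"], b["bytes"])
```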

Database

  • High-performance time-series database
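InfluxDB ingests points in its text line protocol (`measurement,tags fields timestamp`, with integer fields suffixed `i` and nanosecond timestamps). A minimal formatting sketch; the measurement, tag, and field names are illustrative, and a real setup would batch many lines and send them to InfluxDB's write endpoint:

```python
# Sketch: format one sample as an InfluxDB line-protocol string.
# Tag and field names are illustrative assumptions.
def to_line(measurement, tags, fields, ts_ns):
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(
        # Integer fields carry an "i" suffix in the line protocol.
        f"{k}={v}i" if isinstance(v, int) else f"{k}={v}"
        for k, v in sorted(fields.items())
    )
    return f"{measurement},{tag_str} {field_str} {ts_ns}"

line = to_line(
    "jmeter",
    {"application": "checkout", "transaction": "login"},
    {"avg_response_ms": 230.5, "hits": 42, "errors": 1},
    1700000000000000000,
)
print(line)
# jmeter,application=checkout,transaction=login avg_response_ms=230.5,errors=1i,hits=42i 1700000000000000000
```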

The basic KPIs will be correlated to produce derived measurements:

  • Number of Virtual Users, Average Response Time and Errors per Second, derived KPIs:
    • Bottlenecks
    • Bottlenecks that Might Need Fixing
    • System Failure
  • Number of Virtual Users and Hits per Second, derived KPIs:
    • No Bottlenecks
    • Bottlenecks that Need Fixing
    • System Failure
  • Some simple outcomes:
    • Hardware not used. “Consider reducing the available resources.”
    • Hardware not adequate. “Consider increasing the available resources.”
    • Hardware adequate but low performance. “Consider tuning the configuration.”
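These simple outcomes can be sketched as a rule-based classification over peak resource use and a response-time SLA. The thresholds below (30%, 85%, 2000 ms) are illustrative assumptions, not part of the platform:

```python
# Sketch: map peak resource use and a response-time SLA onto the
# simple outcomes listed above. All thresholds are illustrative.
def outcome(peak_cpu_pct, peak_mem_pct, avg_response_ms, sla_ms=2000):
    busy = max(peak_cpu_pct, peak_mem_pct)
    if busy < 30 and avg_response_ms <= sla_ms:
        return "Hardware not used: consider reducing the available resources"
    if busy > 85:
        return "Hardware not adequate: consider increasing the available resources"
    if avg_response_ms > sla_ms:
        return "Hardware adequate but low performance: consider tuning the configuration"
    return "Within expected limits"

print(outcome(20, 25, 800))    # idle hardware, fast responses
print(outcome(95, 60, 3500))   # saturated CPU, slow responses
print(outcome(50, 55, 3000))   # headroom left, yet still slow
```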

Dashboard Example 1

Dashboard Example 2