Software Performance Testing

In software engineering, performance testing is a practice that aims to determine how a system performs in terms of responsiveness and stability under a particular workload. It is also used to investigate, measure, validate, and verify other quality attributes of a system, such as scalability, reliability, and resource usage.

From: https://en.wikipedia.org/wiki/Software_performance_testing

The Proposed Approach

Testing Scope

Within the broader testing landscape, the proposed approach focuses on:

Abstraction levels:

  • Integration testing
  • Component interface testing
  • System testing

Types:

Software performance testing

  • Smoke testing
  • Non-functional testing

Conceptual Process: Define > Execute > Analyze
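As an illustration, the Define > Execute > Analyze loop can be sketched in a few lines of Python; the names and the simulated workload below are hypothetical, not part of S2E:

```python
import statistics
import time

def define_plan():
    """Define: declare the workload to apply (placeholder values)."""
    return {"users": 5, "iterations": 3, "target": "http://example.local/"}

def execute(plan):
    """Execute: run the plan and collect raw timing samples.
    The 'request' is simulated here with a trivial timed operation."""
    samples = []
    for _ in range(plan["users"] * plan["iterations"]):
        start = time.perf_counter()
        sum(range(1000))            # stand-in for an HTTP request
        samples.append(time.perf_counter() - start)
    return samples

def analyze(samples):
    """Analyze: reduce raw samples to summary metrics."""
    return {"count": len(samples),
            "avg_s": statistics.mean(samples),
            "max_s": max(samples)}

report = analyze(execute(define_plan()))
print(report["count"])   # 15 samples (5 users x 3 iterations)
```

In the real platform the Execute phase is delegated to the test engine and Analyze to the dashboard, but the data flow between the three phases is the same.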

Platform Application Modules

To support the testing process, S2E integrates the following software modules:

Define

  • Browser Automation
  • Script & configuration repository

Execute

  • Performance Test Engine
  • Metrics repository
  • Test Automation

Analyze

  • Dashboard
  • Metrics repository
  • Performance Monitor Integration

Technology Stack

Engine

  • Browser Automation: Selenium
  • Performance Test Engine: JMeter
  • Script & Configuration Repository: Git

Dashboard & Metrics

  • Dashboard: Grafana
  • Metrics: InfluxDB
  • Performance Monitor integration: Make
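To show what metrics persistence looks like at the wire level, the sketch below builds one InfluxDB line-protocol record for a hypothetical JMeter sample; the measurement, tag, and field names are illustrative, not the actual S2E schema:

```python
def to_line_protocol(measurement, tags, fields, timestamp_ns):
    """Serialize one point as: measurement,tags fields timestamp."""
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    return f"{measurement},{tag_str} {field_str} {timestamp_ns}"

# Hypothetical sample: one 'login' transaction, 230 ms, no errors.
line = to_line_protocol(
    "jmeter",
    {"transaction": "login"},
    {"responseTime": 230, "errorCount": 0},
    1609459200000000000,
)
print(line)
# jmeter,transaction=login errorCount=0,responseTime=230 1609459200000000000
```

Records in this format can be batched and posted to InfluxDB, which Grafana then queries for the dashboards.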

Continuous Testing

  • Test Automation: Taurus + Jenkins
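As a sketch of how the Define artifacts feed continuous testing, a minimal Taurus configuration such as the one generated below (scenario name and URL are placeholders) can be versioned in the repository and launched from a Jenkins job with `bzt load.yml`:

```python
# Illustrative Taurus config: 10 concurrent users, 1-minute ramp-up,
# 5-minute hold, hitting a placeholder endpoint.
TAURUS_CONFIG = """\
execution:
- concurrency: 10
  ramp-up: 1m
  hold-for: 5m
  scenario: critical-app

scenarios:
  critical-app:
    requests:
    - http://app.example.local/login
"""

print(TAURUS_CONFIG)
```

Jenkins only needs to check out the repository and invoke `bzt` on the file; Taurus drives JMeter underneath and reports the results.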

Activities

  • Core modules implementation (Selenium, JMeter, Git integration)
  • Definition of test campaigns based on the critical applications
  • Early log correlation
  • Early integration between “SW Performance Testing” and “Monitor System Data”, in order to correlate performance events with the system and application logs
  • Dashboard and custom performance database

Deliverables

Test case scripts

Load test configurations

Load test results, based on:

  • Number of concurrent users
  • Response time
  • Number of errors that occurred
  • Bytes exchanged

Depending on the monitoring system integration, the load test results are correlated with basic system metrics such as:

  • CPU usage
  • Memory usage
  • I/O usage
  • Bandwidth usage

Dashboarding - KPI measures and representations

  • Number of Users
  • Hits per seconds
  • Hits per second
  • Errors per Second
  • Response Time
  • Latency (time between request and first response)
  • Connect Time
  • Bytes/s (Throughput)
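The KPIs above can be derived directly from raw samples; a stdlib-only sketch follows, where the sample data and field layout are invented for illustration:

```python
# Each sample: (timestamp_s, response_time_ms, error_flag, bytes).
samples = [
    (0.0, 120, False, 2048),
    (0.4, 150, False, 2048),
    (0.9, 300, True,  512),
    (1.5, 140, False, 2048),
]

duration_s = samples[-1][0] - samples[0][0] or 1.0  # avoid div by zero
hits_per_s = len(samples) / duration_s
errors_per_s = sum(1 for s in samples if s[2]) / duration_s
avg_response_ms = sum(s[1] for s in samples) / len(samples)
throughput_bps = sum(s[3] for s in samples) / duration_s

print(f"{hits_per_s:.2f} hits/s, {errors_per_s:.2f} err/s, "
      f"{avg_response_ms:.1f} ms avg, {throughput_bps:.0f} B/s")
```

In the platform these aggregations are produced by the engine and the time-series database; the formulas are the same.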

Database

  • High-performance time-series database

The basic KPIs will be correlated to obtain derived measurements:

  • From Number of Virtual Users, Average Response Time, and Errors per Second, derived KPIs:
    • Bottlenecks
    • Bottlenecks that Might Need Fixing
    • System Failure
  • From Number of Virtual Users and Hits per Second, derived KPIs:
    • No Bottlenecks
    • Bottlenecks that Need Fixing
    • System Failure
  • Some simple outcomes:
    • Hardware underused: “Consider reducing the available resources.”
    • Hardware not adequate: “Consider increasing the available resources.”
    • Hardware adequate but performance low: “Consider tuning the configuration.”
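These rules can be sketched as a small decision function; the thresholds below are invented placeholders, since the real values depend on each application's SLA:

```python
def classify(users_delta, resp_time_delta, errors_per_s):
    """Map KPI trends during a ramp-up to one of the outcomes above.

    users_delta / resp_time_delta: relative growth of virtual users
    and average response time over the ramp; errors_per_s: error rate.
    Thresholds (1.0 err/s, 2x growth ratio) are illustrative only.
    """
    if errors_per_s > 1.0:
        return "System Failure"
    if resp_time_delta > 2 * users_delta:
        return "Bottlenecks that Need Fixing"
    if resp_time_delta > users_delta:
        return "Bottlenecks that Might Need Fixing"
    return "No Bottlenecks"

print(classify(1.0, 0.1, 0.0))   # response time flat while users grow
print(classify(1.0, 3.0, 0.2))   # response time grows much faster than users
```

Encoding the rules this way lets the dashboard annotate each run with an outcome automatically instead of relying on manual reading of the charts.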

Dashboard Example 1

Dashboard Example 2