In this talk, we discuss the most useful metrics for tracking QA teams and processes.
Full Session Description
Competition in the software world is fierce nowadays. Everyone tries to reduce time to market and cost while increasing coverage and quality as much as possible. To that end, managers try to minimize internal costs and investigate the efficiency of each process. QA teams and activities come under scrutiny as well:

* Are we spending too much money on QA? What are they doing? Is it worth it?
* Are they successful?

How can you tell whether your QA team is successful? This presentation discusses the most useful metrics for managing, tracking, and evaluating QA teams and processes. Bad metrics (those that say nothing about success) are also presented. For instance, concentrating only on test case numbers can lead to wrong perceptions: beyond raw counts, quality and coverage are of great importance for the maturity of testing processes. To avoid overlooking other weaknesses, I will discuss which maturity metrics should be followed comprehensively, without getting stuck on test case numbers alone. All claims are supported by the real-life experience of people who have been managing QA activities in a DevOps environment.
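To make the "beyond test case counts" point concrete, here is a minimal, self-contained sketch of one commonly cited maturity metric, Defect Removal Efficiency (DRE). The metric choice and all numbers are illustrative assumptions, not necessarily the exact metrics covered in the session.

```java
// Illustrative sketch: Defect Removal Efficiency (DRE), a maturity metric
// that looks past raw test case counts. All figures are hypothetical.
public class DefectRemovalEfficiency {

    // DRE = defects caught before release / total defects
    // (defects caught before release + defects found in production)
    static double dre(int caughtBeforeRelease, int foundInProduction) {
        int total = caughtBeforeRelease + foundInProduction;
        return total == 0 ? 1.0 : (double) caughtBeforeRelease / total;
    }

    public static void main(String[] args) {
        // 90 bugs caught by QA, 10 escaped to production
        System.out.printf("DRE = %.2f%n", dre(90, 10));
    }
}
```

A team running thousands of test cases but with a low DRE is still letting bugs escape to production, which is exactly the kind of insight a count-only metric hides.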
Finally, our aim is to find ways to convince our managers, and ourselves, of the necessity of QA activities and teams. After going over metrics that demonstrate the progress and quality of the product, we will compare how QA activities save money, time, and prestige, since potential bugs that would otherwise appear in production environments are eliminated at an early stage. I will give a short code demonstration in which we collect data from Jira (using the Jira API) and post it to CloudWatch to monitor it on graphs.

Outline
* Introduction: Metrics & Monitoring
* Activity: A short demo of what happens without progress checks and monitoring
* Important QA Metrics
* Practical Case Study: Storing Metrics in Amazon CloudWatch: collect data from Jira, post it to CloudWatch, create a dashboard, and monitor the graphs, all in Java code.
* Wrap-Up
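The case study above can be sketched in two small pieces: building the Jira REST search query that counts matching issues, and shaping the resulting count as a CloudWatch metric datum. This is a hedged sketch only; the base URL, project key, JQL, and metric names are hypothetical, and the real demo would authenticate against Jira and send the datum via the AWS SDK's PutMetricData call rather than building a plain map.

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.Map;

// Hypothetical sketch of the Jira -> CloudWatch flow.
// Endpoint, project key, and metric names are illustrative assumptions.
public class JiraMetricsSketch {

    // Build a Jira REST API search URL for a JQL query.
    // maxResults=0 asks only for the total count, not the issue bodies.
    static String jiraSearchUrl(String baseUrl, String jql) {
        return baseUrl + "/rest/api/2/search?maxResults=0&jql="
                + URLEncoder.encode(jql, StandardCharsets.UTF_8);
    }

    // Shape a count as the fields a CloudWatch PutMetricData call needs
    // (namespace, metric name, value, unit). The real demo would pass these
    // to the AWS SDK; here we only assemble the datum.
    static Map<String, String> metricDatum(String name, long value) {
        return Map.of(
                "Namespace", "QA/Jira",
                "MetricName", name,
                "Value", Long.toString(value),
                "Unit", "Count");
    }

    public static void main(String[] args) {
        String jql = "project = DEMO AND type = Bug AND status = Open";
        System.out.println(jiraSearchUrl("https://jira.example.com", jql));
        System.out.println(metricDatum("OpenBugs", 42));
    }
}
```

Posting such a datum on a schedule is what makes the CloudWatch dashboards and graphs mentioned in the outline possible: each run adds one point to the time series.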
Eylul Akar
Test Automation Engineer @ Siemens
Mesut Durukal
QA & Test Automation Manager @ Siemens
About the authors
Eylül works as a Test Automation Engineer at Siemens AG. She holds a BSc in Computer Science from Marmara University and an MSc in Software Engineering from Boğaziçi University. She has nine years of experience in software application development and software testing and has worked on web and mobile applications across various platforms. She is currently working on a cloud-based IoT system. She is a great Agile and Test Automation enthusiast.
Mesut Durukal is QA & Test Automation Manager at Siemens.
He holds BSc and MSc degrees in Electrical & Electronic Engineering from Boğaziçi University. He has seven years of experience in the defense industry, working on multi-location projects as the manager of Verification & Validation activities. He has since spent three years working on Agile software testing projects. He acts as Product Owner and E2E Test Automation Leader for the QA team.