Quality and speed are both important

Every digital organization wants to release new features quickly, frequently and with high quality. Unfortunately, in traditional IT organizations, speed and quality are often mutually exclusive, and management is forced to choose one over the other. This is because traditional quality assurance is performed manually: ensuring that an application is bug-free requires a slow, costly and inefficient process, while skipping quality assurance speeds up development but delivers a buggy, low-quality product.

When approaching digital transformation, companies need to rethink their culture, organizational structure and processes in order to achieve high efficiency without sacrificing quality, reliability and predictability.

Experts in QA automation

When Grid Dynamics was founded in 2006, we had the advantage of beginning our internal QA practice with the newly-established industry best practices of full automation, cross-functional teams and DevOps culture. Since then, we have grown a team of several hundred world-class engineers that focus on providing quality assurance via automation and close collaboration with development teams.

Over the past several years, we have implemented automation, test data management and service virtualization to help both our large and small clients achieve efficient quality assurance. Through test automation, we have provided close to 100% coverage to our clients, and have reduced test execution times from weeks to hours and even minutes.

Building blocks

Organization and architecture

Three key prerequisites enable high-quality, efficient testing:

  • Microservices architecture enables applications to be built with testability in mind. Strong contracts and loose coupling allow services to be tested in isolation, while well-defined APIs and UIs save costs on implementing and maintaining tests.
  • Cross-functional teams with quality engineers embedded into development teams reduce the lag between the readiness of functionality and the readiness of tests.
  • Close collaboration between quality engineers, product owners and system analysts ensures that the delivered functionality is aligned with business expectations.

Test data management

Quality engineering and test automation require data:

  • Testing on real production data is not always possible or advisable due to efficiency, security, compliance or test coverage concerns. A production data snapshot may not represent all corner cases, but synthetic data can also miss certain corner cases that appear in production due to human error.
  • Too often in the industry we see flaky tests that depend on hard-coded identifiers or uncontrolled changes in data sets, making the tests useless.
  • Our approach is to test with both carefully generated synthetic data sets and, when possible, obfuscated production data. If tests and test frameworks are properly implemented, the same test can run on both synthetic and real data, as in the sketch after this list.
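To make the "same test, different data" point concrete, here is a minimal sketch, assuming JUnit 5 and two hypothetical CSV files on the test classpath (one synthetic, one an obfuscated production extract), of a single parameterized test that runs unchanged against both data sets:

    import static org.junit.jupiter.api.Assertions.assertEquals;
    import static org.junit.jupiter.api.Assertions.assertTrue;

    import java.math.BigDecimal;

    import org.junit.jupiter.params.ParameterizedTest;
    import org.junit.jupiter.params.provider.CsvFileSource;

    class OrderTotalInvariantTest {

        // The same invariant runs against a generated synthetic file and an
        // obfuscated production extract; both file names are hypothetical.
        @ParameterizedTest(name = "order {0}")
        @CsvFileSource(resources = {"/data/synthetic-orders.csv",
                                    "/data/obfuscated-prod-orders.csv"},
                       numLinesToSkip = 1)
        void orderTotalMatchesPriceTimesQuantity(String orderId, BigDecimal unitPrice,
                                                 int quantity, BigDecimal total) {
            assertTrue(quantity > 0, "quantity must be positive for order " + orderId);
            assertEquals(0, unitPrice.multiply(BigDecimal.valueOf(quantity)).compareTo(total),
                    "total must equal unitPrice * quantity for order " + orderId);
        }
    }

The invariant checked here is deliberately trivial; the point is that only the data source varies, never the test logic.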

Dependency management

Most systems under test have dependencies, typically in the form of other services:

  • A number of techniques exist to isolate services from dependencies during testing, including development of mocks and stubs, and using tools for service virtualization.
  • Choosing the right level of isolation during testing is a must. Depending on the service and the nature of dependencies, service virtualization may increase or decrease the efficiency and cost of testing as well as the quality of the end result.
  • Our approach includes analysis of the business logic and contracts of the service under test and its dependencies to choose when to use stubs and mocks, and when to use real dependencies; a minimal stubbing sketch follows this list.
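As an illustration of the stub-versus-real-dependency choice, the sketch below uses WireMock with JUnit 5 to stand in for a downstream inventory service (the endpoint, port and payload are hypothetical). In a real suite the service under test would be configured to call the stub's URL instead of the live dependency; here the stub is called directly to keep the example self-contained:

    import static com.github.tomakehurst.wiremock.client.WireMock.get;
    import static com.github.tomakehurst.wiremock.client.WireMock.okJson;
    import static com.github.tomakehurst.wiremock.client.WireMock.urlEqualTo;

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    import com.github.tomakehurst.wiremock.WireMockServer;
    import org.junit.jupiter.api.AfterEach;
    import org.junit.jupiter.api.Assertions;
    import org.junit.jupiter.api.BeforeEach;
    import org.junit.jupiter.api.Test;

    class InventoryDependencyStubTest {

        private WireMockServer inventoryStub;

        @BeforeEach
        void startStub() {
            // Local stub playing the role of the real inventory service.
            inventoryStub = new WireMockServer(8089);
            inventoryStub.start();
            inventoryStub.stubFor(get(urlEqualTo("/stock/SKU-1"))
                    .willReturn(okJson("{\"sku\":\"SKU-1\",\"available\":42}")));
        }

        @AfterEach
        void stopStub() {
            inventoryStub.stop();
        }

        @Test
        void stubAnswersLikeTheRealDependency() throws Exception {
            // In a real suite the service under test would be pointed at this URL.
            HttpResponse<String> response = HttpClient.newHttpClient().send(
                    HttpRequest.newBuilder(URI.create("http://localhost:8089/stock/SKU-1"))
                            .GET().build(),
                    HttpResponse.BodyHandlers.ofString());

            Assertions.assertEquals(200, response.statusCode());
            Assertions.assertTrue(response.body().contains("\"available\":42"));
        }
    }

Whether a stub like this or the real service is used for a given test is exactly the trade-off described above: stubs buy speed and determinism, real dependencies buy confidence in the integration.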

Full automation of all quality aspects


Unit testing

Is typically performed by developers, but is an integral part of quality assurance.
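For example, a developer-level check of a single business rule might look like the following JUnit 5 sketch; the discount rule is hypothetical and inlined to keep the example self-contained:

    import static org.junit.jupiter.api.Assertions.assertEquals;

    import java.math.BigDecimal;

    import org.junit.jupiter.api.Test;

    class DiscountRuleTest {

        // Hypothetical production rule, inlined here to keep the sketch self-contained.
        static BigDecimal applyDiscount(BigDecimal price, int percent) {
            return price.multiply(BigDecimal.valueOf(100 - percent))
                        .divide(BigDecimal.valueOf(100));
        }

        @Test
        void tenPercentDiscountReducesPriceByTenth() {
            assertEquals(new BigDecimal("90"), applyDiscount(new BigDecimal("100"), 10));
        }
    }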


Service-level testing

Performed via API for individual services, and covers most test cases to allow releasing services independently.
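A service-level test of this kind drives a deployed service through its public API. The sketch below uses REST Assured (shown purely for illustration; any API testing tool that fits the stack works) against a hypothetical cart service endpoint:

    import static io.restassured.RestAssured.given;
    import static org.hamcrest.Matchers.equalTo;

    import org.junit.jupiter.api.Test;

    class CartServiceApiTest {

        // Base URL and endpoint are hypothetical; in CI they would typically come
        // from the environment of the deployed service under test.
        private static final String BASE_URL = "http://cart-service.test.local";

        @Test
        void addingAnItemIsReflectedInTheCartItemCount() {
            given()
                .baseUri(BASE_URL)
                .contentType("application/json")
                .body("{\"sku\":\"SKU-1\",\"quantity\":2}")
            .when()
                .post("/carts/42/items")
            .then()
                .statusCode(201)
                .body("itemCount", equalTo(2));
        }
    }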


Integration testing

Is performed end-to-end on the UI level, and may be required for high-risk changes.


Performance testing

Done continuously as part of the CI/CD pipeline to ensure that new changes do not degrade throughput or latency.
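Full load and throughput testing relies on dedicated tooling, but even a lightweight latency guard can run on every build. The following JUnit 5 sketch, with a hypothetical endpoint and budget, fails the pipeline if a key call exceeds its response-time budget:

    import static org.junit.jupiter.api.Assertions.assertEquals;
    import static org.junit.jupiter.api.Assertions.assertTimeout;

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.time.Duration;

    import org.junit.jupiter.api.Test;

    class LatencyBudgetSmokeTest {

        // Hypothetical endpoint and budget: this is only a per-build guard, not a
        // substitute for dedicated load and throughput testing.
        private static final URI HEALTH =
                URI.create("http://catalog-service.test.local/health");

        @Test
        void healthEndpointRespondsWithinBudget() {
            assertTimeout(Duration.ofMillis(300), () -> {
                HttpResponse<String> response = HttpClient.newHttpClient().send(
                        HttpRequest.newBuilder(HEALTH).GET().build(),
                        HttpResponse.BodyHandlers.ofString());
                assertEquals(200, response.statusCode());
            });
        }
    }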


Stability and reliability testing

An important sub-type of non-functional testing, and a component of chaos engineering.


Production testing and advanced monitoring

Performed on live service instances in production to ensure that services continue working as expected after release.


Data quality testing

A sub-type of production testing to monitor the correctness of data flows in transactional and analytical systems.


Security testing

Performed with modern code analysis and site vulnerability detection tools.

Integration with the continuous delivery pipeline

Technology stack

  • Docker: containerization platform
  • Maven: build automation tool
  • Jenkins: automates the software delivery process with continuous integration
  • Protractor: end-to-end testing framework for AngularJS
  • JUnit: unit testing framework for Java
  • Selenium: framework for testing web applications
  • JavaScript: high-level programming language
  • Java: popular general-purpose programming language
  • JBehave: framework for behavior-driven development
  • SoapUI: tool for writing, running and automating API tests
  • FitNesse: web server and automated testing tool
  • TestNG: testing framework for Java that covers many kinds of tests
  • Cucumber: automated acceptance tests written in a behavior-driven development style
  • Spring: application framework and inversion-of-control container
  • Ruby: general-purpose programming language
  • DbUnit: JUnit extension for database-driven projects
  • Framework for writing Android UI tests
  • Framework for Android unit testing
  • Perfecto Mobile: cloud platform for continuous delivery and testing of mobile apps
  • Appium: test automation framework for mobile apps


Our engagement model

Our approach is to embed specialized quality engineers directly into application development teams. We build cross-functional teams that combine all the necessary skills with distinct specializations, as we believe this setup provides the highest efficiency, quality and speed. In such teams, a developer and a quality engineer work on a feature together, so that when the developer completes the feature, it is already covered by automated tests.

When we engage with clients, we start by analyzing their existing processes and technical approach. This analysis helps us create tailored strategic and tactical improvement plans. We then embed quality engineers into development teams and begin improving product quality, or enhance efficiency by automating existing manual test cases. With this approach, our clients' teams are hands-on and learn quickly, which helps spread the right culture, skills and processes across the organization.
