ISV Testing Strategies for Reliability and Compatibility

Grant Miller | Jun 29, 2023

For independent software vendors (ISVs), validating the reliability and compatibility of a software release is crucial before it is distributed to end-customers. Implementing a comprehensive testing strategy is vital to identify and rectify any potential issues prior to deployment. In this post, we’ll explore the stages of release testing and the role they play in improving the end-customer deployment experience.

ISVs should always test each release before distributing it to end-customers. Tests cascade from simple, fast checks run on every application code commit to more time-consuming, complex tests run in preparation for a full customer-facing release.

By breaking the testing process into stages, we can optimize what to test and when, and reduce the need to execute a full matrix of all possible license values, app config options, Kubernetes config options, Kubernetes versions, and Kubernetes distros.

Summary of Common Release Test Stages

Common test stages include:

Smoke Test

Smoke test: check for install & compatibility; run against a limited config; frequency: every PR & commit.

A smoke test is an installation test performed in common environments. The goal is to ensure that the application can install and start; this stage typically doesn't include end-to-end (E2E) or performance tests for the application. An app release smoke test might be designed to test the Helm chart rendering, looking for missing or invalid configuration, or for container images that crash-loop on start. Smoke tests should run on every PR and commit to a vendor's application repo to validate that the changes don't break compatibility with a basic deployment target and static configurations. A good goal is to complete smoke tests in under 5 minutes, not counting the time to provision an environment.
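The "install and start within a budget" idea can be sketched as a simple polling loop. This is a minimal illustration, not Replicated's implementation: `wait_until_ready` and the readiness `check` callable are hypothetical names, and in a real CI job the check would query the cluster (e.g. pod readiness) rather than a Python callable.

```python
import time

def wait_until_ready(check, timeout_s=300, interval_s=5,
                     sleep=time.sleep, clock=time.monotonic):
    """Poll `check()` until it returns True or the smoke-test
    time budget (default: the 5-minute goal) expires."""
    deadline = clock() + timeout_s
    while clock() < deadline:
        if check():
            return True   # app installed and started in time
        sleep(interval_s)
    return False          # budget exhausted: fail the smoke test
```

A CI step would call this after `helm install`, failing the PR if it returns `False`.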

Policy Test

Policy test: check for Kubernetes policy violations; run against a complete config; frequency: every PR & commit.

Policy tests should run at the same time as smoke tests, against a single Kubernetes distro/version spun up with default values. They observe the Kubernetes API while a pre-defined set of vendor-supplied integration/end-to-end tests runs, reporting possible Kubernetes policy violations (e.g., the app attempts to write to a read-only disk). This is where the application E2E test suite should run, since policy violations are detectable only while the application is executing common workloads.
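To make the read-only-disk example concrete, here is a simplified static stand-in for one such policy. The real check described above happens at runtime via the Kubernetes API; this hypothetical helper just inspects a pod spec for containers that don't declare a read-only root filesystem, which is the setting that would surface such a write attempt as a failure.

```python
def check_read_only_root(pod_spec):
    """Hypothetical policy check: flag containers that do not declare
    securityContext.readOnlyRootFilesystem: true in their pod spec."""
    violations = []
    for container in pod_spec.get("containers", []):
        ctx = container.get("securityContext") or {}
        if not ctx.get("readOnlyRootFilesystem", False):
            violations.append(container.get("name", "<unnamed>"))
    return violations
```

A policy stage would run checks like this (or their runtime equivalents) and report each violation alongside the E2E results.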

Release Test

Release test: check for stability & readiness; run against a matrix of combinations; frequency: when preparing for a new release.

Release tests should be performed once smoke and policy tests have passed, in preparation for a release. A release test uses a single vendor-defined configuration of entitlements/values and runs the app against a matrix of default-configured Kubernetes distros and versions that are currently in support. Vendors can also remove specific distros and versions to reduce the set. The goal is to test all distro/version combinations supported by the ISV, setting the release up for success in every supported environment.
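The "matrix minus vendor exclusions" shape can be expressed in a few lines. A sketch, assuming placeholder distro and version names; in practice, supported version ranges differ per distro, so the pair-level exclusion set is doing real work here.

```python
from itertools import product

def release_matrix(distros, versions, exclusions=frozenset()):
    """All supported distro/version combinations, minus any
    specific pairs the vendor has explicitly removed."""
    return [(d, v) for d, v in product(distros, versions)
            if (d, v) not in exclusions]
```

For example, three distros and three versions with one excluded pair yields eight environments to run the release test against, instead of the full nine.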

Canary Test

Canary test: check for load-handling reliability; run against a matrix of combinations; frequency: promotion to beta.

Canary tests should run when a release is promoted to the Beta channel, testing against N environments, where N is the number of unique environments representing an ISV's paying customers' full configurations (entitlements, values, Kubernetes config, distro, Kubernetes version) as reported through telemetry. This can also include starting from the version a customer is currently running and then performing an upgrade of the application.
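Deriving N from telemetry amounts to deduplicating customer records on the full configuration tuple. A minimal sketch, assuming hypothetical telemetry field names; the actual telemetry schema is not specified in this post.

```python
def unique_environments(telemetry):
    """Return the set of unique customer environments, keyed on the
    full configuration tuple described above (field names assumed)."""
    keys = ("entitlements", "values", "k8s_config", "distro", "k8s_version")
    seen = set()
    for record in telemetry:
        seen.add(tuple(str(record.get(k)) for k in keys))
    return seen
```

`len(unique_environments(telemetry))` gives N: two customers with identical configurations collapse into one canary environment, keeping the canary matrix proportional to real-world diversity rather than customer count.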

Using Replicated’s Compatibility Matrix (coming soon) will allow the canary test to evolve automatically as an ISV’s customers change: if a new customer adds a new, untested environment, the test will update to accommodate it. We also recommend that vendors supply their standard E2E test suite to run alongside other tests, such as load tests for high-load, customer-representative environments.

By investing in these testing strategies, ISVs demonstrate their commitment to delivering high-quality, compatible software that meets the needs and expectations of their customers in various environments. Ultimately, a robust testing approach enhances the customer experience by providing them with a stable and reliable software solution that they can trust.

We've been putting a lot of thought into how to solve these problems. If you'd like to hear more, please reach out to our account teams for a sneak peek briefing.
