At 3SS we have been on a long journey of supporting our customers, developing and delivering our software and solutions worldwide. And we’ve lots of stories to tell about our evolution, and how our operations have changed, including when it comes to automated testing.
This journey has come about because we are always striving to achieve the best possible user experience for our customers who rely on our video solutions. Our priority is to deliver optimal results for the pay-TV providers and broadcasters who are building their businesses on our technology as their foundation. And in parallel, we are committed to doing all we can to help viewers get the very best entertainment experiences.
The quality of experience is of the highest importance: The interruption of a video stream, buffering or slow reaction times can have a devastating, and even irrecoverable, impact on user satisfaction.
A major success factor in enhancing the user experience has been the effective use of Automated Testing. Since initially establishing 3SS back in 2009, we’ve used many different testing approaches and tools. As always, we are happy to share our story, what we’ve learned along the way, and how our use of Automated Testing has evolved.
At first, we tried to automate everything, in great detail. This was wonderful, and efficient, at least at the beginning. But just over a year in, the thousands of tests we had implemented were requiring huge maintenance efforts. Moreover, for all that time invested, the outcomes weren’t as valuable as we’d hoped. So we analyzed, and re-analyzed, all the test scenarios. We realized, as with so much in life, that simpler is better. So we streamlined and adjusted our automation goals.
Most of the applications that we build for Smart TVs run on a high number of different devices with very different specs. Testing manually on all of them means a large amount of repetitive work and takes a huge amount of time. But how could all these devices be tested faster, with limited effort, while still finding any and all bugs as early as possible?
Automated Testing done right
Obviously, the answer was Automated Testing. We knew that our immediate goal had to be to minimize the accompanying development efforts required to support the testing, so that our teams could focus on implementation, testing and releasing new features faster. Therefore, we refined our processes so that testing would fall under the responsibility of our QA teams, rather than the developers.
The search for tools then began; we needed tools that allowed us to create tests without the need to write code. Based on this, our decision to work with Suitest for HTML5-based development for Smart TV was made: Tests are easy to implement and to maintain, using the system’s extensive test editor. The Suitest solution was the only one with an object-based approach like Selenium’s, and was therefore best suited for testing Smart TV applications. It did not require any complex set-up, and our QA engineers quickly learned how to use it efficiently.
We learned to embrace all testing activities and to adapt them based on what gives the most value to the project. We discovered powerful new manual and automated ways of testing in the process:
Smoke testing: We needed to perform basic tests to check critical functionality. Smoke testing assesses whether the build is stable and assures that the most important features of the system are working as expected. This is particularly useful right after a new build is made, helping us decide whether more expensive, project-impacting tests are required.
Regression testing: We also adopted this type of software testing which confirms whether a recent code change, such as an update or improvement, has affected existing features. With Regression Testing, old test cases are re-executed on the new version of the software.
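To illustrate the split between the two suites (this is a simplified sketch in plain Python, not our actual Suitest setup, and the check names are hypothetical), a fast smoke pass gates the fuller regression pass:

```python
# Hypothetical checks standing in for real app interactions.
def app_launches():      return True
def player_starts():     return True
def payment_flow_ok():   return True
def settings_persist():  return True

# Smoke suite: a handful of critical checks, run right after every build.
SMOKE = [app_launches, player_starts, payment_flow_ok]

# Regression suite: the full set, re-executed on the new version of the software.
REGRESSION = SMOKE + [settings_persist]

def run(suite):
    """Run every check in the suite and report which ones failed."""
    failures = [check.__name__ for check in suite if not check()]
    return {"stable": not failures, "failures": failures}

result = run(SMOKE)
if result["stable"]:
    # Only a stable smoke pass justifies the more expensive regression run.
    result = run(REGRESSION)
```

The point of the gate is cost: the cheap smoke suite decides whether the expensive, project-impacting suite is worth running at all.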
In the projects in which we applied them, these and other new testing approaches gave our QA teams more time for exploratory testing, and allowed them to take more responsibility in their work and be more independent. The various new testing methods also allowed apps to be validated for quality from the user’s perspective. This directly led to better stability and higher quality of user experience across all our apps and platforms.
Keeping it simple, and setting priorities
Our test strategy is simple: we implement the most valuable tests first and we run them continuously. But which tests are the most valuable and how do we decide this? We thoroughly analyze the business requirements of the product and we define the scenarios for tests which would have the highest impact in case of failure. For example, these can be tests related to the player functionality, payment process or app navigation.
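Ranking scenarios by the impact of failure can be sketched like this (a hypothetical example in plain Python; the scenario names and impact scores are illustrative, not real project data):

```python
# Hypothetical test scenarios with an estimated business impact of failure (1-10).
scenarios = [
    {"name": "app navigation",  "impact": 8},
    {"name": "player playback", "impact": 10},
    {"name": "payment process", "impact": 9},
    {"name": "settings page",   "impact": 3},
]

# Implement and continuously run the highest-impact scenarios first.
by_priority = sorted(scenarios, key=lambda s: s["impact"], reverse=True)
top = [s["name"] for s in by_priority[:3]]
# The top of the list is where automation effort pays off most.
```

However the scores are assigned, the principle is the same: automation effort goes first to the scenarios whose failure would hurt the business most.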
By using a tool like Suitest we can get much more done, faster. To give you a clearer picture: Conventional regression testing, typically manually done, would have taken us 5 days. By using Automated Testing, and adapting our way of working, we reduced 5-day long testing processes down to just two hours – and on 6 platforms! We’ve been able to perform Automated Testing on as many as 15 devices at once. Releases are more stable, and you can really see an improvement in quality.
We also combined manual and automated testing and as a result we could decrease the defect leakage by more than 50%!
Test stability was something we invested in early on – and it paid off: We consistently have clean test results that we can rely on.
How we’ve kept testing remotely at home
Due to COVID-19, we all worked from home for months. But this did not have any impact on the way we do testing. We’d set up a test lab to be fully functional, and to work without interruption, with a high-performance webcam to help us monitor the tests. It is now proven that such a testing set-up can effectively be used remotely during the pandemic. For development and manual testing, each of us has a TV at home, but this doesn’t mean that we test new features on only one device. To test on multiple devices and do debugging and manual testing when needed, we used the test lab at our office. By leveraging Automated Testing capabilities, and a remotely operable webcam, we controlled the test lab TVs remotely, without being blocked in any way.
Overall, the way we do testing has evolved immensely. Our new capabilities, the right tools (Suitest), testing remotely and picking the right methods have been a highly worthwhile investment for 3SS. We are now enabled to deliver better quality video services more efficiently, from our development centers in several European locations as well as from our home offices.
Want to learn more about Automated Testing at 3SS and how we could help you with your video products? Reach out to us via firstname.lastname@example.org.
Author: Crina Mocian, DevOps Engineer at 3SS