Sauce Labs published its first report analyzing how companies measure up to benchmarks across four key pillars of continuous testing. The company also announced its acquisition of Screener and the availability of Sauce Headless.

“As organizations continue to prioritize continuous testing as the foundation of their agile development efforts, we are excited to see how their performance against these benchmarks improves over time, and we look forward to doing our part to help them reach their goals,” said Charles Ramsay, CEO of Sauce Labs.

To expand its continuous testing capabilities, Sauce Labs acquired Screener, a provider of automated visual testing solutions. Screener allows users to test their UI across multiple browsers, devices and operating systems to automatically detect visual errors, making it easier to integrate visual testing into the DevOps workflow. It also allows developers to test individual UI components to get fast feedback in the early stages of the development cycle.

“As more code and complexity shifts to the front end of the development process, visual component testing is quickly becoming a critical part of any comprehensive shift-left testing strategy,” said Loyal Chow, founder of Screener.

Sauce Labs also released Sauce Headless, which enables development teams to get fast feedback on code by running atomic tests early in the delivery pipeline. It leverages headless Chrome and Firefox browsers on Linux in a container-based infrastructure, so teams can identify issues early and keep the pipeline moving by testing on every commit.

The Continuous Testing Benchmark Report was based on user data from the company’s continuous testing cloud between June and December of last year. It found that the majority of companies fared dismally against the test quality and test run-time benchmarks.
Only 18.75% of organizations passed 90% of the tests they ran, and just 35.94% completed their tests in an average of two minutes or less. However, the numbers were much higher for test platform coverage and test concurrency: 62.53% of organizations tested across five or more platforms on average, and 70.88% utilized at least three-quarters of their available testing capacity during peak testing periods. Just 6.23% of organizations achieved the benchmark for excellence across all four categories.