Friday, June 29th, 2012, 12am
Mobile Testing: Same Amusement Park Different Ride
I recently attended the STP Online Summit, Insights on Deploying a Mobile Testing Strategy, and during the panel discussion it was very evident that history is repeating itself. I am no mobile testing expert, but many of the common themes I heard in the discussion were the same ones testing professionals faced as we evolved from paper to mainframe to PC to client/server to the web; hence the title of this article.
As we have moved through these technology platforms and advances, the complexity of quality assurance and testing has progressed geometrically. The test coverage matrix is mind-boggling. The picture below, from Karen Johnson's presentation Mobile Testing: A Look at What Makes the Mobile Environment Different, is scary and mind-blowing.
Take a deep breath, hold, and exhale; let's see if we can find the tree in front of the forest and revisit some basic tools you will need to survive in this ever-changing mobile world we live in.
Goals, Objectives and Quality Dimensions
Understand what the business owners and the sales & marketing teams will be measuring in the mobile space. You need to make the connection between business goals, quality dimensions, and the underlying tests needed to validate that success has been attained. Quality dimensions generally fall into the following categories: reliability, usability, maintainability, conformability, interoperability, performance, etc. As testers, you need to be able to articulate your tests in terms of these dimensions, along with the effort to deliver them. Decision making becomes clearer, and so do the risks. When the testing phase gets crunched from eight weeks down to four, the team can quantify the effort necessary to attain the stated level of each quality dimension (reliability, accuracy, performance) and meet the business goal(s). How necessary is it (how much risk is mitigated) to test every browser/OS version/device/model/carrier combination? Proactively communicating the probability of meeting quality requirements (via test coverage) against the time remaining will be of great value to business owners.
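To see how quickly that browser/OS/device/carrier matrix explodes, a short Python sketch just multiplies the dimensions out. All of the browser, OS, device, and carrier names below are illustrative placeholders, not an inventory from the article:

```python
from itertools import product

# Hypothetical, illustrative inventory -- substitute your own lists.
browsers = ["Safari", "Chrome", "Opera Mini", "Stock Android"]
os_versions = ["iOS 5.1", "iOS 6.0", "Android 2.3", "Android 4.0", "Android 4.1"]
devices = ["iPhone 4S", "iPad 2", "Galaxy S II", "Galaxy Nexus", "Kindle Fire"]
carriers = ["AT&T", "Verizon", "Sprint", "T-Mobile"]

# Every full combination the "test everything" approach would imply.
combinations = list(product(browsers, os_versions, devices, carriers))
print(len(combinations))  # 4 * 5 * 5 * 4 = 400 combinations
```

Even these modest lists yield 400 configurations before you add a single test case per configuration, which is exactly why the risk question ("how much is mitigated by testing all of them?") has to be asked.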
Quantify the Test Effort & Duration
In order to approach an answer we need to define the investment in testing. We must make the transition from qualify to quantify. What is the quantified effort (investment) in delivering robust testing (strategy, planning, design, environment, methods, techniques, execution, analysis, and reporting) that supports reliability, installability, and scalability, ensuring the product/service/application operates as designed on the mobile platform? How many test cases (an actual number), and what duration is needed to execute them successfully and affirm certification? At a minimum you need an approximation and some material backing up your statement. Indeed, the power of information is at your fingertips. When faced with a 30% reduction in test cycle time, you can now present the risks (quantified and qualified) to the business goals: "The compressed schedule will cause test coverage to fall under 60% for mission-critical mobile certification. Product reliability, usability, and performance will have untested functionality going to production. This risk to business objectives is certain, and the negative impact to the business warrants considering alternate mitigation strategies before test cycle time is reduced."
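To make "quantify" concrete, here is a minimal back-of-the-envelope sketch in Python. Every number in it (case count, minutes per case, tester headcount, productive hours) is a hypothetical placeholder, not a figure from the summit; the point is the shape of the calculation, not the values:

```python
# Hypothetical planning inputs -- replace with your own estimates.
total_cases = 1200       # test cases needed for certification
minutes_per_case = 20    # average execution time per case
testers = 4              # people executing tests
hours_per_day = 6        # productive execution hours per tester per day

# Full effort and calendar duration.
total_hours = total_cases * minutes_per_case / 60           # 400 hours
days_needed = total_hours / (testers * hours_per_day)       # ~16.7 days

# Now the schedule gets crunched: only 10 days remain.
days_available = 10
cases_executable = int(days_available * testers * hours_per_day * 60 / minutes_per_case)
coverage = cases_executable / total_cases

print(f"{days_needed:.1f} days needed; coverage at cutoff: {coverage:.0%}")
```

With these placeholder inputs the compressed schedule leaves coverage at 60% of the certification suite, which is precisely the kind of quantified risk statement the paragraph above recommends bringing to the business owners.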
Risk-Based Testing
Risk is usually defined in terms of what we choose not to do, or do only to a certain level; by skipping or shortening an activity we incur risk to meeting the goals. How much investment in testing will be required to deliver quality (as defined by the business)? At what point in delivering that level of quality do we approach success? We determine the right mix of trade-offs among the dimensions to best meet the business goals, and therefore make risk-based decisions (cost, time, quality) to deliver the expected value to the business.
Understanding users' utilization patterns, how they navigate the site, will also let you ensure the most-traversed paths are tested before the less common ones. You cannot test every combination or path; there is a law of diminishing returns. But ensuring your tests cover the most important and most used features, under the most common configurations, should enable a level of success for the business. It is safe to say your marketing & sales teams (and some product managers too) will have tons of data about your customers, most likely from the analytics tools your company has bought. Seek them out and start digging. Dan Bartow from SOASTA had some good statistics on the mobile user community. Use the information you gather about your user community, and when the crunch is on you will know where you have to test and where the risk is minimal enough to allow less or no testing.
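The "most-traversed paths first" idea can be sketched as a simple greedy selection in Python. The path names and traffic shares below are hypothetical stand-ins for what you would pull from your analytics tool:

```python
# Hypothetical traffic share per user path (sums to 1.0).
path_traffic = {
    "search_and_browse": 0.42,
    "login": 0.21,
    "checkout": 0.15,
    "account_settings": 0.08,
    "order_history": 0.06,
    "store_locator": 0.05,
    "gift_cards": 0.03,
}

# Greedily pick the busiest paths until ~90% of real traffic is covered;
# everything left over is where reduced or no testing carries the least risk.
covered, selected = 0.0, []
for path, share in sorted(path_traffic.items(), key=lambda kv: kv[1], reverse=True):
    selected.append(path)
    covered += share
    if covered >= 0.90:
        break

print(selected, round(covered, 2))
```

With these numbers, five of the seven paths cover 92% of traffic, so when the crunch hits, the last two paths are the defensible candidates for lighter testing.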
Test automation tools have never been more important to the testing community. Simulators & emulators are mandatory if you plan on staying close to the change curve; upgrades & enhancements are deployed and installed before you even know about them. How much regression is required, and how deep does the regression go? Test cases written for the web should be portable to the mobile space; there is only incremental effort as we re-use test cases across multiple scenarios. Without automated test execution that extends tests from the web to mobile, you will be hard pressed to meet time-to-market goals. It is estimated that "in the lab," using emulators & simulators, you can cover roughly 75% of what would be experienced in the real world. Is that good enough? You need to have this conversation with your business owners. There might be risks out there requiring mitigation through additional "real world" test coverage.
A bad user experience can negatively affect revenue in both the short & long term, not to mention the bad press that can be generated through social media sites. Missed opportunities and the loss of future revenue streams can be a deadly combination for the survival of your business. Marketing organizations will continue to look for innovative ways to satisfy sales and business objectives, so technology must be able to respond in a timely, efficient, and effective manner to ensure we have the right products and services in place to deliver, and to validate the business benefits through testing.
Paul Fratellone, Program Director, Quality & Test Consulting, MindTree Inc - Paul Fratellone has had a 25-year career in information technology centered on Testing & Quality Management. Having performed every activity of the QA professional, from QA Analyst to Director, he has experienced many challenges and opportunities to deliver quality through testing.
Come see Paul at the Software Test Professionals Conference in Miami from October 15-18. Paul will lead sessions 101 & 102: Preparing the QA Budget, Effort & Test Activities (Parts 1 & 2), part of the Leadership Perspectives for Testers track.