Mobile software applications are becoming more prevalent in our daily lives. A recent study published by Andrea Smith on Mashable highlighted just how much we as a society are addicted to our mobile applications, going so far as to say, “Some people confess to using over 50 apps a day.” We see them everywhere: waiting in line, walking down the street, or even attending a sporting event. In fact, 82% of respondents believed they couldn’t be without their mobile applications for longer than one day. If this sample reflects society as a whole, then these mobile applications need to work correctly and consistently, as well as meet user needs.
More applications are being created every day, and finding people who can not only test these applications but also know how to test them on various types of devices is becoming quite a challenge. This article describes some of the configuration tests for testers to consider in their mobile application testing projects.
Last month, I facilitated a Weekend Testers Americas session on Configuration Testing of mobile devices using Facebook as a native application. We had an array of devices: a combination of iPhones, Android phones, iPads, and Android tablets. Across the various mobile devices and operating systems, testers experienced radical differences in using the same mobile application. The result: testers had an eye-opening experience and were able to broaden their perspective for when they went back to test their own mobile applications.
Some of the observations from our weekend testing session include:
- Sorting of the news feed posts appeared different based on the device used.
- The Facebook application showed information based on the size of the device’s displayable area; the tablet displayed more information than the phone.
- Default display and functional settings varied across all three configurations (tablet, desktop/laptop, and phone), including the Friends lists, refresh settings, and time stamps of posts.
- Search functionality behaved differently on the tablet than in the laptop/browser application.
- Knowing how to do something on one device configuration doesn’t mean you will automatically know how to perform the same function on another configuration, which matters for users who do switch from configuration to configuration.
Have you thought about comparing what appears on the screen of one device with what appears on a different device? Even among different Android phones, you will see a difference in the physical size of the viewing area. When designing your test cases, you need to consider not only the general screen real estate available to the mobile application, but also whether each of the application’s views remains clear and usable on different-sized devices. Does the native Facebook application fill the screen on both a 4-inch and a 5.5-inch Android phone? The question then comes up: how would you automate testing for such differences, and should you? Such a test may not be worth automating, especially if that part of the code does not change from release to release. Not all tests should be automated, and with mobile applications becoming so critical for companies to produce, testing projects need to be carefully planned, including an assessment of when it is worth investing in automation for mobile tests.
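If you do decide to automate a layout check like this, a minimal sketch is shown below. It assumes pytest, the Appium Python client, and an Appium server running at the default local endpoint; the device names and the “news_feed” locator are hypothetical placeholders you would replace with your own.

```python
# A minimal sketch, assuming pytest, the Appium Python client, and an Appium
# server at http://localhost:4723/wd/hub. The device names and the "news_feed"
# accessibility id are hypothetical placeholders, not Facebook's real locators.
import pytest
from appium import webdriver
from appium.webdriver.common.appiumby import AppiumBy

DEVICE_CONFIGS = [
    {"platformName": "Android", "deviceName": "Phone_4_inch"},    # hypothetical 4-inch phone
    {"platformName": "Android", "deviceName": "Phone_5_5_inch"},  # hypothetical 5.5-inch phone
]

@pytest.mark.parametrize("caps", DEVICE_CONFIGS)
def test_feed_uses_full_width(caps):
    # Classic desired-capabilities style; newer client versions prefer options objects.
    caps = dict(caps, appPackage="com.facebook.katana", appActivity=".LoginActivity")
    driver = webdriver.Remote("http://localhost:4723/wd/hub", caps)
    try:
        screen = driver.get_window_size()  # {'width': ..., 'height': ...}
        feed = driver.find_element(AppiumBy.ACCESSIBILITY_ID, "news_feed")
        # Expect the feed to span nearly the full display width on every phone size.
        assert feed.size["width"] >= screen["width"] * 0.95
    finally:
        driver.quit()
```

Even a small parametrized check like this makes the maintenance cost of such a test visible, which helps when deciding whether it is worth automating at all.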
Have you ever compared the Facebook application on a tablet to the mobile application on your phone? Even if both configurations share the same operating system, they are almost completely different applications, or at least different versions of code, with radical variations in the display. So how would you plan your testing around one mobile application? Here are some things to consider: different device configurations, rotation of the device, and whether what is displayed changes any viewable functionality. These may be tests you conduct only once in a release, but they should be part of a release at some point.
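A quick rotation check like the one below can be run once per release. It is a minimal sketch under the same assumed Appium setup as the earlier example, again with placeholder device and locator names.

```python
# A minimal sketch of a one-off rotation check; the device name, app identifiers,
# and "news_feed" locator are placeholders to adapt to your own application.
from appium import webdriver
from appium.webdriver.common.appiumby import AppiumBy

caps = {
    "platformName": "Android",
    "deviceName": "Android Emulator",       # placeholder device
    "appPackage": "com.facebook.katana",
    "appActivity": ".LoginActivity",
}

driver = webdriver.Remote("http://localhost:4723/wd/hub", caps)
try:
    for orientation in ("PORTRAIT", "LANDSCAPE"):
        driver.orientation = orientation
        # Key functionality should remain visible and reachable after rotation.
        assert driver.find_element(AppiumBy.ACCESSIBILITY_ID, "news_feed").is_displayed()
finally:
    driver.quit()
```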
The icons displayed in the mobile versions of Facebook differ depending on the configuration. Some test considerations for “ease of use” and for transitioning from one configuration to another should be included. What constitutes ease of use? Who determines the definition? Ideally, these factors should be “tested” out before design and coding begin. Remember, as a tester, you need clearly defined requirements or a clear understanding of how your application will be used on each configuration. If not, the lack of a “seamless experience” can have a destructive impact on a company’s market reputation.
What about learnability? Do your users typically switch from one configuration to the next? Tests covering the visual and functional transition between configurations should be considered as part of the release. With some mobile phone applications being different from their tablet counterparts, is the transition comfortable for the user? Testing for “comfort” or “ease of use” is a subjective call. Mobile testers have to know more about their users and how they use the application. This is where sales, marketing, and other customer-facing team members can share customer experiences and user stories with the testers.
As we’ve progressed into using so many mobile applications, personal biases and prejudices have built up in our minds. We have different expectations for display, usage, timing of feedback, and functionality. Those of us who more often than not use laptops and desktops for our daily activities do not typically use mobile applications the same way as someone who has no access to a laptop or desktop. Their bias in usage is completely different, and therefore “ease of use” has a different meaning. Our testing, therefore, has to consider different expectations based on the configuration.
Network connectivity while using the mobile application is yet another configuration test consideration. A tablet, for example, is generally used in fixed locations, like an armchair in front of the television or at a favorite coffee shop. Once connectivity is established, there is little fluctuation because there is little mobility. This is not necessarily true of a mobile phone. How often are you walking, or riding in a moving vehicle, while engaging with the internet? If your application requires internet connectivity, you will need to add appropriate testing based on the configuration.
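One way to probe this on Android is to toggle network state from the test itself. The sketch below assumes the same Appium setup as before and uses the client’s network-connection command; support for that command varies by device and emulator, so verify it on your own configuration.

```python
# A minimal sketch of toggling Android network state through Appium to observe how
# the application behaves when connectivity drops and returns. Device name and app
# identifiers are placeholders; emulator/device support for this command varies.
from appium import webdriver
from appium.webdriver.connectiontype import ConnectionType

caps = {
    "platformName": "Android",
    "deviceName": "Android Emulator",      # placeholder device
    "appPackage": "com.facebook.katana",
    "appActivity": ".LoginActivity",
}

driver = webdriver.Remote("http://localhost:4723/wd/hub", caps)
try:
    driver.set_network_connection(ConnectionType.WIFI_ONLY)
    # ...exercise a feature that needs the network, then drop connectivity...
    driver.set_network_connection(ConnectionType.NO_CONNECTION)
    # Observe: does the app show a clear offline message, queue the action, or hang?
    driver.set_network_connection(ConnectionType.ALL_NETWORK_ON)
finally:
    driver.quit()
```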
How many different types of tests exist specifically for mobile devices and mobile applications? This article is an introduction, covering general functionality, usability, and how appearance differs across configurations depending on the mobile device. Why does a developer design a mobile application to appear very different from a desktop application or a tablet application? Not all tests apply to all configurations. Definitions of usability must be carefully quantified in the requirements, and the usability of the application may depend on the specific market of customers expected to use it. Work closely with your stakeholders to learn as much as you can about the users/customers and their perspective.
Finally, continue to practice testing on mobile devices and mobile applications. The more time spent testing mobile applications, the more inspiration (and the better a mental model) a mobile tester will gain as to what kinds of tests to apply. Different types of performance tests, notification tests, and network communication tests can apply, along with general functional and behavioral tests. Learning that there are more types of tests beyond GUI functionality testing is critical to planning out your mobile testing projects.
About the Author
Jean Ann Harrison was formerly a Lead Quality Assurance Engineer at CardioNet, Inc., which provides ambulatory cardiac monitoring services for physicians’ patients. Jean Ann was the software quality assurance lead on the next-generation mobile heart monitor device and led all embedded software testing at CardioNet. Jean Ann’s background also includes a variety of projects involving large, multi-configured applications for client/server, web, Unix, and mainframe systems. Her experience is primarily manual testing with occasional automation and a strong focus on building quality into design. Constantly working to perfect her craft, Jean Ann attends and presents at conferences, takes various courses, networks, and actively participates in software testing forums. She believes software testing takes daily practice to contribute to a project’s success. Jean Ann currently works as a Project Manager for Project Realms, Inc.