Connect(); 2016

Volume 31 Number 12

[Connect(); Mobile Test]

Scale Your Automated Mobile App Testing with Xamarin Test Cloud

By Justin Raczak

In recent years, there’s been a dramatic shift in the way teams build and deliver software. Where it was once believed that lengthy requirements-gathering processes would ensure the delivery of a perfect product with the first release, it’s now understood that rapid learning coupled with rapid iteration is the key to success. As the thinking changes, so, too, must the workflows. Development cycles lasting months or years followed by lengthy waterfall QA phases don’t facilitate rapid learning. Feedback loops must be shortened, and small changes implemented quickly and released to users. To deploy readily, you must know your software is in a good state at all times. Test automation makes this possible.

Automated testing lets you test your apps in ways that used to take days or weeks. Rather than waiting until the end of a sprint comprising hundreds of lines of new code, you can test small changes added with every commit. This continual testing surfaces defects as soon as they’re introduced and reduces the time needed to debug them. And because the behavior of the app is continually validated, you have the confidence to deploy to users whenever you’re ready. Automated testing makes possible a world in which you discover defects and ship a fix within the same day. But the mobile ecosystem presents unique challenges with a diverse landscape of mobile device and OS makers.

Xamarin Test Cloud makes it fast and simple to scale your automated testing with minimal changes to your existing workflow. Offering more than 400 unique device configurations, Test Cloud enables you to validate your app’s behavior on the device models and OS versions that are important to your users without the expense or management overhead that comes with building and managing your own device lab. In most cases, you can tap into this immense value with few to no changes to your code.

Test Cloud supports authoring tests in C# (UITest), Ruby (Calabash) and Java (Appium and Espresso). In the project modification portion of this article, I’ll focus on our most requested test framework addition, Appium with JUnit, and walk through the changes you need to make to your project to run your existing tests in Test Cloud. I’ll also take a look at the Web interface where you’ll review your test results and troubleshoot failed tests. The specific modifications required might change over time. You can find the most recent version of these instructions at bit.ly/2dhp2VQ.

In this example I assume the following preconditions:

  • An active Test Cloud account (sign up at bit.ly/2e3YgTy)
  • The Test Cloud command-line tool installed (instructions are at bit.ly/2dcrbXS)
  • A native Android application project
  • An existing suite of Appium tests written in Java with JUnit (at least version 4.9) conforming to Appium 1.5
  • The Maven build system, at least version 3.3.9 (you can verify this as shown below)
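
Before changing anything, you can confirm the Maven requirement from the command line:

mvn -v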

Changes to the Build System

Before you can begin using Test Cloud, you’ll need to add the dependency to ensure the tasks for preparing the requisite files are available to your build.

Add the Test Cloud Dependency To include Test Cloud in your project and ensure its enhanced Android and iOS drivers are available at compile time, add the following dependency to your pom.xml file:

<dependency>
  <groupId>com.xamarin.testcloud</groupId>
  <artifactId>appium</artifactId>
  <version>1.0</version>
</dependency>

Add the Upload Profile Add the profile from Figure 1 to your pom.xml inside the <profiles> tag. If you don’t already have a <profiles> section, create one and add the profile. This profile will pack your test classes and all dependencies into the target/upload folder where they can then be uploaded to Test Cloud.

Figure 1 Test Cloud Upload Profile

<profile>
  <id>prepare-for-upload</id>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-dependency-plugin</artifactId>
        <version>2.10</version>
        <executions>
          <execution>
            <id>copy-dependencies</id>
            <phase>package</phase>
            <goals>
              <goal>copy-dependencies</goal>
            </goals>
            <configuration>
              <outputDirectory>${project.build.directory}/upload/dependency-jars/</outputDirectory>
              <useRepositoryLayout>true</useRepositoryLayout>
              <copyPom>true</copyPom>
            </configuration>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-resources-plugin</artifactId>
        <executions>
          <execution>
            <id>copy-pom-file</id>
            <phase>package</phase>
            <goals>
              <goal>testResources</goal>
            </goals>
            <configuration>
              <outputDirectory>${project.build.directory}/upload/</outputDirectory>
              <resources>
                <resource>
                  <directory>${project.basedir}</directory>
                  <includes>
                    <include>pom.xml</include>
                  </includes>
                </resource>
              </resources>
            </configuration>
          </execution>
          <execution>
            <id>copy-testclasses</id>
            <phase>package</phase>
            <goals>
              <goal>testResources</goal>
            </goals>
            <configuration>
              <outputDirectory>${project.build.directory}/upload/test-classes</outputDirectory>
              <resources>
                <resource>
                  <directory>${project.build.testOutputDirectory}</directory>
                </resource>
              </resources>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</profile>

Changes to the Tests

Now that your build is configured, you must modify your test classes to leverage the Test Cloud Java extensions.

Add the Imports to Test Classes Import the following packages into your test classes:

import com.xamarin.testcloud.appium.Factory;
import com.xamarin.testcloud.appium.EnhancedAndroidDriver;
import org.junit.rules.TestWatcher;
import org.junit.Rule;

Instantiate the TestWatcher Insert this instantiation into one of your test classes:

@Rule
public TestWatcher watcher = Factory.createWatcher();

Update Your Driver Declarations Replace every declaration of AndroidDriver<MobileElement> with EnhancedAndroidDriver<MobileElement>, like so:

private static EnhancedAndroidDriver<MobileElement> driver;

Update Your Driver Instantiations Replace every instantiation of your driver such that lines in the form of:

driver = new AndroidDriver<MobileElement>(url, capabilities);

become:

driver = new EnhancedAndroidDriver<MobileElement>(url, capabilities);

The enhanced driver enables you to “label” the steps in your test using driver.label("myTestStepLabel"). This method will produce a test step label and accompanying screenshot that will be viewable in the test report in Test Cloud. I recommend calling label in the @After method, which will capture a screenshot of the app in its final state before the test completes. The screenshot will be taken even if the test is failing, which might provide valuable insight into why it’s failing. In practice, this could look something like:

@After
public void tearDown() {
  driver.label("Stopping app");
  driver.quit();
}
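
Putting these pieces together, a minimal test class might look like the following sketch. The class name, capabilities and test body are illustrative placeholders; only the Factory, EnhancedAndroidDriver and label calls come from the Test Cloud extensions:

import java.net.URL;

import org.junit.After;
import org.junit.Before;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.TestWatcher;
import org.openqa.selenium.remote.DesiredCapabilities;

import io.appium.java_client.MobileElement;

import com.xamarin.testcloud.appium.EnhancedAndroidDriver;
import com.xamarin.testcloud.appium.Factory;

public class CheckoutTest {
  // Reports each test's outcome and labeled steps to Test Cloud.
  @Rule
  public TestWatcher watcher = Factory.createWatcher();

  private static EnhancedAndroidDriver<MobileElement> driver;

  @Before
  public void setUp() throws Exception {
    DesiredCapabilities capabilities = new DesiredCapabilities();
    // These capability values are placeholders for a local run;
    // in Test Cloud, devices are provisioned and the app installed for you.
    capabilities.setCapability("platformName", "Android");
    capabilities.setCapability("deviceName", "Android Device");
    capabilities.setCapability("app", "/path/to/your.apk");
    driver = new EnhancedAndroidDriver<>(
      new URL("http://127.0.0.1:4723/wd/hub"), capabilities);
  }

  @Test
  public void userCanAddProductToCart() {
    driver.label("Launched app");
    // ... find elements and make assertions here ...
    driver.label("Added product to cart");
  }

  @After
  public void tearDown() {
    driver.label("Stopping app");
    driver.quit();
  }
}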

Upload to Test Cloud

Now that your project is equipped with all the prerequisites, you’re ready to prepare your files and execute a run in Test Cloud. Before proceeding with the upload steps, it’s a good idea to try a local run and make sure everything works as expected. If you need to troubleshoot any of the configuration changes you’ve just made, it’s much faster to do so locally.
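
If your capabilities point at a local device or emulator and an Appium server is running, a local run is just the standard Maven test invocation:

mvn test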

To pack your test classes and all dependencies into the target/upload folder, run the following command:

mvn -DskipTests -P prepare-for-upload package

You might want to verify that the target/upload directory now exists in your project’s root folder to ensure you’re ready for upload. If this will be a new app in Test Cloud, you’ll create the app as part of the test run. In the Test Cloud Web interface, follow the flow to create a new test run to select your devices, set preferences and generate the command you’ll need to execute the run. For this exercise, I recommend selecting a small number of devices from the Tier 1 category so your results will be ready for review quickly. Copy the generated command and run it at the command line.

Once the file upload has been successfully negotiated and validated, devices will be provisioned, your app will be installed and your tests will execute. The Test Cloud operating model is based on device concurrency, or the number of physical devices that can be used in parallel. For example, a user with five concurrent devices can test an app on the Nexus 5X, Nexus 6P, Samsung Galaxy S5, Samsung Galaxy S6 and HTC M8 at the same time. This efficiency is one of Test Cloud’s most significant advantages, making it easy to increase your coverage to span more devices while adding little to no additional wait time.

The command-line tool will stream updates on the test run’s status and provide you with a link to the test report once the run has finished. Follow the provided test report link to examine your results.

There are three levels of granularity with which you can view your results:

  • Overview report.
  • Device grid.
  • Device detail report.

I’ll discuss each in turn.

The Overview Report The overview provides you with summary information about a test run including pass/fail details, failure stats by OS version, manufacturer, and form factor, and details about the run itself, including device configurations targeted and total run time (see Figure 2).

Figure 2 The Overview Report

If your test run produces failures, you’ll likely want to dig deeper to explore root causes and collect data for debugging. The device grid is the next level of detail.

The Device Grid The device grid provides a highly efficient mechanism for navigating through your test results step by step alongside the screenshots captured at each step. With failures clearly indicated in the test steps, you can quickly jump to a failed step and examine the visual state of your app on each device. For larger device sets, you can filter the devices displayed to only those that failed to create a cleaner field for inspection. If the cause of the failure isn’t apparent at this level of detail, you can drill down one more level to view device details (see Figure 3).

Figure 3 The Device Grid Report

The Device Detail Report The device detail view gives you the same access to test step navigation and screenshots but provides additional detail specific to the selected device, including CPU and memory usage. From this view you can also access the device logs and stack trace, artifacts that will likely be the most useful when investigating a test failure (see Figure 4).

Figure 4 The Device Detail Report

At this point I’ve followed the most common workflow in Test Cloud:

  1. Execute tests (manually or via continuous integration, or CI).
  2. Review results.
  3. Retrieve debugging artifacts.
  4. Fix.

Next, I’ll discuss a few simple ways to think about your device-targeting strategy and optimizing your testing workflow for performance to keep your pipeline flowing quickly.

Thinking About Device Coverage

Selecting the devices your organization will support and ultimately test against is nearly as important as the testing itself. While there are many sources of aggregate and generalized market data that can help guide your thinking in this area, the most impactful source is usage data from your own user base. The exception to this, of course, is an application only distributed internally to a set of known and managed devices. For external and consumer apps and internal apps distributed under a bring-your-own-device (BYOD) policy, usage data is your best source.

Many tools in the market can help you garner insight into the devices your audience uses. This is the data set from which you can extrapolate your supported device list. The exact methodology you use to determine which devices to support from the aggregate list is up to you and your organization. In most cases, it won’t make sense to support every device in your usage data, as this quickly becomes unwieldy and expensive. You might decide to cover however many devices account for a certain percentage of your user base. Or you might think in terms of absolute numbers of users and support as many devices as needed to leave fewer than 500 users uncovered. If you have an e-commerce app, you might want to cross-reference your usage data with transaction data, ensuring the devices that represent the highest spend and most frequent transactions are covered. Again, the specific approach you take to develop your device-support list should be based on the needs and goals of your business.
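
To make the percentage approach concrete, the selection reduces to a cumulative sum over devices sorted by usage. Here’s a minimal sketch; the device names and counts are invented for illustration:

import java.util.LinkedHashMap;
import java.util.Map;

public class DeviceCoverage {
  public static void main(String[] args) {
    // Hypothetical monthly active users per device model,
    // pre-sorted by usage, highest first.
    Map<String, Integer> usage = new LinkedHashMap<>();
    usage.put("Samsung Galaxy S6", 42000);
    usage.put("Nexus 5X", 31000);
    usage.put("Samsung Galaxy S5", 18000);
    usage.put("HTC M8", 6000);
    usage.put("Nexus 6P", 3000);

    double target = 0.90; // aim to cover 90 percent of users
    int total = usage.values().stream().mapToInt(Integer::intValue).sum();

    int covered = 0;
    for (Map.Entry<String, Integer> entry : usage.entrySet()) {
      covered += entry.getValue();
      System.out.printf("Support %s (cumulative coverage %.1f%%)%n",
        entry.getKey(), 100.0 * covered / total);
      if ((double) covered / total >= target) {
        break; // the support list stops here
      }
    }
  }
}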

Keep in mind the mobile market moves quickly. This means, in order for your support list to be accurate and meaningful, you must inspect your usage data regularly. Watch for market signals that might suggest it’s a good time to review the data again, such as the rollout of a new device model or OS.

Optimizing Your Testing Pipeline

The best way to extract the most value from test automation is to test early and often. This reduces the time and cost associated with fixing bugs and ensures the deployment pipeline stays clear. But as teams and operations scale, latency can build up in the pipeline and developer productivity can decrease. Let’s look at ways to keep the pipeline clear and productivity high.

Not All Tests Are Equal As projects grow over time, their test suites take longer to run. There’s an inflection point at which running the test suite after making a simple change becomes painful and cumbersome, often leading to bad habits such as skipping the tests altogether. You can preempt this by thinking early about your application’s critical paths—that is, what flows or experiences in your application must absolutely work? Using the earlier e-commerce app example, this might mean users can browse products, add products to the cart and check out. It’s less important that users can set their notification preferences. With this structure in place, it becomes much more practical to run tests on every push or even every commit. Instead of running the full test suite for small changes, you can run only those that are part of the critical paths. How, exactly, you accomplish this delineation will depend on the test framework you use.
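
With JUnit 4, which this project already uses, one way to draw that line is the Categories feature: tag critical-path tests with a marker interface and have Maven Surefire run only that group. The com.example package, CriticalPath interface and CartTests class below are illustrative, not part of any Test Cloud API:

// CriticalPath.java: an empty marker interface used as a JUnit category.
package com.example;

public interface CriticalPath {}

// CartTests.java
package com.example;

import org.junit.Test;
import org.junit.experimental.categories.Category;

public class CartTests {
  @Test
  @Category(CriticalPath.class)
  public void userCanCheckOut() {
    // Critical path: must pass on every push.
  }

  @Test
  public void userCanSetNotificationPreferences() {
    // Lower priority: exercised only in the full suite.
  }
}

Maven Surefire (2.11 or later, with JUnit 4.8 or later) can then run only the critical-path group on every push:

mvn test -Dgroups=com.example.CriticalPath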

The Right Devices at the Right Time While testing every push to a feature branch may be ideal from a quality perspective, this quickly becomes expensive for large teams, especially those that support many different device configurations. You can reduce the overhead here by applying a progressive strategy to your device targeting on these test runs. Does a build of a non-production branch need to be tested on every device you support? The answer is likely no. Instead, you can select a sensible number of devices that balances effective testing with shorter wait times. For a pre-production build from CI, a sampling of the most popular device models and OS versions from your device support list will provide a valuable level of coverage without raising your build time beyond an hour. For an individual developer testing from a local workstation, testing against one or two devices might suffice.

These are just a few examples of ways to think about configuring your testing workflow. The broader point is to invest the time to question whether your pipeline flow is optimal. Even if you’ve answered this question before, as with everything you do, it’s always a good idea to routinely inspect and adapt.

Wrapping Up

In this article you’ve seen how easily you can migrate from running your tests on a simulator or single local device to harnessing the power of hundreds of device configurations using Xamarin Test Cloud. I also touched on a few strategies for organizing your testing workflow and extracting the most value from your test resources. If you’re not already using Test Cloud, you can sign up for a free trial at bit.ly/2e3YgTy to begin using it with your projects today.


Justin Raczak is a senior program manager at Microsoft, leading the mobile test automation service. Although he only recently joined Microsoft, he has focused on automated testing and its role in advancing continuous delivery for the past three years. He can be reached at justin.raczak@microsoft.com.

Thanks to the following Microsoft technical expert for reviewing this article: Simon Søndergaard
Simon Søndergaard is a software engineer at Microsoft.

