AWS Device Farm is an AWS service for testing mobile applications; as the name suggests, it provides a "farm" of devices on which a developer may test apps in the AWS Cloud. Its objective is to improve the quality of mobile (Android and iOS) and Web applications by testing them on real mobile devices in the AWS Cloud. Its two main features are:
- An automated test platform
- Remote access to mobile devices
The automated testing platform provides a large collection of devices in the AWS Cloud. Remote access lets a user interact with remote devices in real time from a Web browser. Both features are easy to use. For automated testing, choose a native, hybrid, or Web app, and select the real devices in the AWS Device Farm on which to test the app. AWS Device Farm tests the app and generates results that inform the developer about any bugs or performance problems. Some built-in automated tests, such as "Explorer" and "Fuzz," are provided, or a developer may upload custom test scripts. The comprehensive test reports include high-level results, logs, screenshots, and performance data.
AWS Device Farm provides an initial free test time of 250 minutes, after which Automated Testing Android slots and Automated Testing iOS slots have to be purchased for automated testing. The automated testing platform does not display any real devices to the tester; it runs the tests internally on real devices. Automated testing does not let a tester manage the details of a built-in test, but a custom test script may be uploaded.
In this article, we shall use an Android Hello World app and run automated tests on the app. In a subsequent article, we shall interact with the app on a real device on the AWS Device Farm. This article has the following sections:
- Setting the Environment
- Creating a Device Farm Project
- Creating a Run for an Automated Test
- Managing the Uploads
- Configuring a Project
- Fixing Some Common Issues
To use AWS Device Farm, an Amazon Web Services account is required; one may be created at https://aws.amazon.com/. Subsequently, select the AWS Device Farm service, as shown in Figure 1.
Figure 1: AWS Device Farm Service
Alternatively, click Start Testing Today at https://aws.amazon.com/device-farm/, as shown in Figure 2.
Figure 2: Starting Testing on AWS Device Farm
Download the sample Android app APK HelloWorld_v1.0_apkpure.com.apk from https://apkpure.com/helloworld/my.v1rtyoz.helloworld. Or, use a different Android app APK for testing.
To create a Device Farm project, on the AWS Device Farm start page, click Get started. In the Create project dialog, specify a Project name (HelloAndroid, for example) and click Create project. A new Device Farm project gets created, as shown in Figure 3. Two tabs are provided: Automated tests and Remote access. Because this article discusses automated testing, use the Automated tests tab, which is selected by default after a new project is created.
Figure 3: HelloAndroid—A new AWS Device Farm Project
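Besides the console, a Device Farm project can also be created programmatically; Device Farm's API endpoint lives in the us-west-2 region. The sketch below only builds the request payload that an SDK call such as boto3's devicefarm `create_project` would take (the real call, mentioned in a comment, is not executed here):

```python
# Sketch: building a Device Farm create-project request. Device Farm's
# API is served from the us-west-2 region regardless of where the
# tester is located.

def make_project_request(name: str) -> dict:
    """Return the parameters for a create-project call."""
    if not name:
        raise ValueError("A project name is required")
    return {"name": name}

params = make_project_request("HelloAndroid")
# A real call would look like (not executed in this sketch):
#   boto3.client("devicefarm", region_name="us-west-2").create_project(**params)
print(params)  # {'name': 'HelloAndroid'}
```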
To create an automated test, click Create a new run, as shown in Figure 4.
Figure 4: Creating a new Test Run with Create a new run
The Create a new run wizard starts. Click the icon for “Test a native application on Android or iOS devices,” as shown in Figure 5. The other option icon is “Test a web application on Android or iOS devices.”
Figure 5: Selecting the option to test a native app on Android
Next, choose an application to test. Click the Upload button, as shown in Figure 6.
Figure 6: Selecting Upload to upload an Android App APK
Select the HelloWorld .apk file that was downloaded earlier, or choose a different APK to upload. The Hello World APK file starts to upload, as shown in Figure 7.
Figure 7: Uploading an Android app APK
When the upload is complete, the app details get listed: the app Package, Activity, Minimum SDK, Screens, Target SDK, Version Code, and Version Name. Click Next step, as shown in Figure 8.
Figure 8: Summary of Test Run
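The upload performed above can also be done through the API, where it is a two-step flow: an upload record is created first (returning a pre-signed S3 URL), and the APK file is then sent to that URL with an HTTP PUT. The sketch below builds the create-upload parameters; the project ARN is a placeholder, and the boto3/requests calls in the comments are not executed:

```python
# Sketch of Device Farm's two-step upload flow. create_upload returns a
# pre-signed S3 URL; the APK is then sent to that URL with an HTTP PUT.

def make_upload_request(project_arn: str, file_name: str) -> dict:
    """Build the parameters for a create-upload call for an Android APK."""
    if not file_name.endswith(".apk"):
        raise ValueError("Android app uploads must be .apk files")
    return {
        "projectArn": project_arn,
        "name": file_name,
        "type": "ANDROID_APP",  # upload type for an Android app package
    }

params = make_upload_request(
    "arn:aws:devicefarm:us-west-2:123456789012:project:EXAMPLE",  # placeholder ARN
    "HelloWorld_v1.0_apkpure.com.apk",
)
# With boto3 the flow would be (not executed here):
#   upload = client.create_upload(**params)["upload"]
#   requests.put(upload["url"], data=open("HelloWorld_v1.0_apkpure.com.apk", "rb"))
```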
In Configure a test, select from one of the Test types. Two built-in test types are provided: Built-in: Explorer and Built-in: Fuzz. The built-in test types do not require a user to upload any test scripts whereas the other test types do. Unless custom test scripts have been developed, choose the Built-in: Explorer test type, as shown in Figure 9. Optionally, a Username and Password may be provided. Click Next step.
Figure 9: Configuring a Test of type Built-in: Explorer
In Select devices, choose a Device pool from which to select devices. The default Device pool, "Top Devices," is pre-selected and includes devices such as the Samsung Galaxy and LG G Pad. A compatibility test indicates that the app is compatible with 5 out of 5 devices (100% compatibility) in the selected Top Devices pool, as shown in Figure 10. Click Next step.
Figure 10: Selecting a Device Pool
Next, select the device state in Specify device state, as shown in Figure 11. Options to Add extra data, Install other apps, and Set radio states are provided. The Add extra data button may be used to upload a zip file with test data that gets extracted before the test is run. The Install other apps button lets a user upload other apps. All supported "radio states" (wireless communication technologies such as WiFi, Bluetooth, GPS, and NFC) are selected by default. An option to select Device location is also provided so that a latitude and longitude may be set for the location-specific behavior of an app; we have used the default setting, which is based on the user's current location. The Device locale default setting should also be kept, unless testing in another region and/or with a different language.
Figure 11: Specifying Device State
The Network profile is based on the Uplink/Downlink bandwidth (bps), Uplink/Downlink delay (ms), Uplink/Downlink jitter (ms), and Uplink/Downlink loss (%). The definitions of the different network profile settings are shown in Table 1.
| Network profile setting | Description |
| --- | --- |
| Uplink/Downlink bandwidth (bps) | Data throughput rate in bits per second, as an integer from 0 to 104857600. |
| Uplink/Downlink delay (ms) | Delay time for all packets to destination in milliseconds, as an integer from 0 to 2000. |
| Uplink/Downlink jitter (ms) | Time variation in the delay of received packets in milliseconds, as an integer from 0 to 2000. |
| Uplink/Downlink loss (%) | Proportion of transmitted packets that fail to arrive, from 0 to 100 percent. |
Table 1: Network profile settings
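The ranges in Table 1 also apply when defining a custom network profile through the API. The small validator below enforces them; the field names are illustrative shorthand, not Device Farm's exact API field names, and the "3G Average"-style values at the end are for illustration only, not AWS's exact preset:

```python
# Validator for the network profile ranges in Table 1. Each uplink and
# downlink value must fall inside its allowed range.

LIMITS = {
    "bandwidth_bps": (0, 104_857_600),  # throughput, bits per second
    "delay_ms": (0, 2_000),             # packet delay, milliseconds
    "jitter_ms": (0, 2_000),            # delay variation, milliseconds
    "loss_pct": (0, 100),               # packet loss, percent
}

def validate_profile(settings: dict) -> dict:
    """Raise ValueError if any setting is outside its Table 1 range."""
    for field, value in settings.items():
        low, high = LIMITS[field]
        if not (low <= value <= high):
            raise ValueError(f"{field}={value} outside [{low}, {high}]")
    return settings

# Illustrative values in the spirit of a "3G Average" profile:
profile = validate_profile({
    "bandwidth_bps": 780_000,
    "delay_ms": 100,
    "jitter_ms": 10,
    "loss_pct": 0,
})
```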
Different network profiles are available, such as:
- 3G Average
- 3G Good
- 3G Lossy
- EDGE Average
- EDGE Good
- WiFi Average
- WiFi Good
- WiFi Lossy
The settings for a particular Network profile may be viewed by selecting the network profile. As an example, “3G Average” settings are shown in Figure 12.
Figure 12: Network profile 3G Average
For the “Disabled” Network profile, the settings are shown in Figure 13.
Figure 13: Network profile Disabled
The Network profile is set to “Full” by default, which has the settings shown in Figure 14.
Figure 14: Network profile Full
Keep the default Network profile of “Full” and click Next step, as shown in Figure 15.
Figure 15: Device State Settings
In Review and start run, review the summary of the run. The run timeout is configurable per device, from a minimum of 5 minutes to a maximum of 60 minutes; it is recommended that you set the timeout slightly in excess of the actual test time anticipated for a run. If the run exceeds the timeout, the test run is stopped and only partial test results are available. The maximum of 60 minutes per device is configured by default, and for the total run, an aggregated time over all the devices is also listed, as shown in Figure 16.
Figure 16: Summary of Test Run
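The aggregated time listed for the run is simply the per-device timeout multiplied by the number of devices in the pool. A small helper makes that worst-case arithmetic concrete:

```python
# Worst-case run time: each device may consume up to the per-device
# timeout, so the aggregate is timeout * device count. Device Farm
# permits per-device timeouts between 5 and 60 minutes.

MIN_TIMEOUT, MAX_TIMEOUT = 5, 60  # minutes per device

def aggregate_minutes(timeout_minutes: int, device_count: int) -> int:
    """Worst-case device-minutes consumed by a run."""
    if not (MIN_TIMEOUT <= timeout_minutes <= MAX_TIMEOUT):
        raise ValueError(f"Timeout must be {MIN_TIMEOUT}-{MAX_TIMEOUT} minutes")
    return timeout_minutes * device_count

# The default 60-minute timeout against the 5-device Top Devices pool:
print(aggregate_minutes(60, 5))  # 300 device-minutes
```

At 300 potential device-minutes, a single full-length run against the default pool can exceed the 250 free minutes, which is one reason to set the timeout close to the anticipated test time.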
Any of the configured run settings, including the app to be tested, the test type, and the devices to test the app on, may be modified with the Edit link. Click Confirm and start run, as shown in Figure 17.
Figure 17: Confirming and starting Run
The Android app test run starts, as shown in Figure 18.
Figure 18: A Test Run started
As tests are completed, the test run results are displayed. For example, three tests are indicated as having completed successfully, as shown in Figure 19.
Figure 19: Partial Test Results Displayed when available
Multiple test runs may be started in parallel. As an example, start another test run with the same HelloWorld APK, but with some different test settings. Choose the test type Built-in: Fuzz, as shown in Figure 20.
Figure 20: Configuring a 2nd Test of type Fuzz
A fuzz test sends random events to the app. The Event count is the number of events the UI fuzz test performs; it is set to 6000 by default, and a value between 1 and 10,000 may be specified, as shown in Figure 21. The Event throttle is the time, in milliseconds, between two consecutive events sent to the app, with a default setting of 50 ms and a supported range of 0 to 1000 ms. The Randomizer seed is the seed used to randomize the UI fuzz test; using the same seed value between tests generates identical event sequences.
Figure 21: Settings for Fuzz Test Type
Instead of selecting the Device pool “Top Devices” for the devices to test the app on, create a custom device pool with Create a new device pool, as shown in Figure 22.
Figure 22: Creating a new Device Pool
In the Create a new device pool dialog, specify a Name (AndroidLGDevices) and select devices either by choosing the itemized devices individually or by adding a rule; only one of the two options may be used. To set a rule, click Add rules. For the Field, one of three options may be selected: Type, Manufacturer, or Platform, as shown in Figure 23.
Figure 23: Selecting a Rule Field
Select Manufacturer. For Operator, select “EQUALS” and for Operand, select “LG”. Click Save device pool, as shown in Figure 24.
Figure 24: Settings for a new Device Pool
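The Manufacturer EQUALS "LG" rule configured above has a direct equivalent in the create-device-pool API, where rules are attribute/operator/value triples and the value is a JSON-encoded string; the sketch below builds that shape, and the details should be treated as an assumption to verify against the API reference:

```python
import json

# Sketch: the rule from the console dialog as an API-style rule object.
# The value field is JSON-encoded, so "LG" is sent with its quotes.

def manufacturer_rule(manufacturer: str) -> dict:
    """Build a rule selecting all devices from one manufacturer."""
    return {
        "attribute": "MANUFACTURER",
        "operator": "EQUALS",
        "value": json.dumps(manufacturer),  # JSON-encoded string value
    }

pool_request = {
    "name": "AndroidLGDevices",
    "rules": [manufacturer_rule("LG")],
}
print(pool_request["rules"][0]["value"])  # "LG" (with quotes)
```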
The Device pool is created and gets set for the test run. The initial compatibility test indicates 100% compatibility with 26 out of 26 devices, as shown in Figure 25.
Figure 25: Selecting the newly configured Device Pool
Start the second test run as well, as shown in Figure 26.
Figure 26: Starting the 2nd Test Run
Multiple test runs for the same Android app but using different test types and test devices get started simultaneously, as shown in Figure 27.
Figure 27: Multiple Test Runs running simultaneously
Partial test results get displayed as tests complete. As shown in Figure 28, one test run indicates 18 tests having completed successfully, whereas the other test run lists 12.
Figure 28: Partial Test Results for the two Test Runs
Different test types have different numbers of tests and run for different durations. When all tests for a test run complete successfully, the icon on the test run changes from the circular revolving arrows to a green circle with a check mark. The Test results indicate that 15 tests have completed successfully for a run of type Explorer and the test run is listed as “Passed.” The other test run of type Fuzz continues to run and indicates 1 test having failed, as shown in Figure 29.
Figure 29: One of the Test Runs listed as “Passed” and some tests in the other run Failed
Navigate to the projects page from the dashboard, as shown in Figure 30.
Figure 30: Navigating to the Projects Page in the Dashboard
The HelloAndroid project lists the Last run and its results: 15 PASSED, 0 FAILED, and 0 ERRORED, as shown in Figure 31.
Figure 31: Project Summary for HelloAndroid Project
To get details about the test run, click the test run, as shown in Figure 32.
Figure 32: Selecting a Test Run for detailed information
The Android app APK name and a message indicating that all tests have passed get displayed, as shown in Figure 33.
Figure 33: Test Run listed as having passed all Tests on all Devices
Screenshots for the different devices on which the tests were run also get displayed. The screenshots show how the HelloWorld app displays when run on a particular device type. Figure 34 shows the app on the LG G Pad 7.0″ (AT&T) 4.4.2.
Figure 34: Screenshots of Test on LG G Pad 7.0″ (AT&T) 4.4.2
The screenshots in Figure 35 show the app on Samsung Galaxy S5 (T-Mobile) 4.4.2. The app may appear slightly different on different devices.
Figure 35: Screenshots of Test on Samsung Galaxy S5 (T-Mobile) 4.4.2
Multiple runs may be created for different (or the same) apps. The uploaded apps may be managed by selecting Project settings, as shown in Figure 36.
Figure 36: Selecting Project settings
In Project settings, select the Uploads tab, as shown in Figure 37.
Figure 37: Selecting the Uploads Tab
The uploaded HelloWorld APK gets listed, as shown in Figure 38.
Figure 38: Uploaded Android App
The same app may be uploaded multiple times if used in multiple runs, as shown in Figure 39.
Figure 39: Multiple Uploads
To delete an upload, click the uploaded APK and click Delete in the dialog displayed, as shown in Figure 40.
Figure 40: Deleting an Upload
Some project-wide settings may be configured by selecting Project settings, as shown earlier in Figure 36. The Device slots tab lists the automated test and remote access slots purchased in excess of the free trial duration of 250 minutes.
To create a project-wide device pool, select the Device pools tab. Click the Create a new device pool button, as shown in Figure 41.
Figure 41: Configuring a Project to create a new device pool
The Create a new device pool dialog, which was discussed earlier, gets displayed, as shown in Figure 42.
Figure 42: Creating a new device pool dialog
Select the devices to add to the pool and click Save device pool, as shown in Figure 43.
Figure 43: Configuring a new Device Pool
A project-wide device pool gets configured, as shown in Figure 44.
Figure 44: New Device Pool
The device pool gets listed for selection when creating a test run, as shown in Figure 45.
Figure 45: New Device Pool available for creating a new Test Run
To create a new network profile, select the Network profiles tab and click Create a new network profile, as shown in Figure 46.
Figure 46: Creating a new Network profile
Specify the network profile detail in the Create a new network profile dialog and click Save network profile, as shown in Figure 47.
Figure 47: Creating a new Network profile dialog
As shown previously in Figure 38, the Android app uploaded when creating a new test run is listed in Project settings > Uploads. Project-wide apps (APKs) may also be uploaded from Project settings: select the Uploads tab and click Upload a file, as shown in Figure 48.
Figure 48: Upload a file
In the Upload a file dialog, select the type of the app, such as Android App. Click File>Upload to upload an app APK file, as shown in Figure 49.
Figure 49: Uploading an Android App
The app gets uploaded, as shown in Figure 50.
Figure 50: Uploaded Android App
When creating a test run, the app gets listed in the Select a recent upload drop-down list, as shown in Figure 51.
Figure 51: Selecting a recent upload when creating a new test run
Next, we shall discuss some common issues that a user may come across while testing an app. When uploading an app's APK file, if the file is not valid, an error message indicates that the APK file could not be unzipped and processed, as shown in Figure 52. A different APK file has to be used if an app is not valid.
Figure 52: When choosing an app, an error could be generated if the app is not valid
When one of the tests in a test run fails, the test run may continue to run. As shown in Figure 53, the Fuzz test run indicates "25 out of 26 devices completed"; 1 device failed, and the test continues to run.
Figure 53: Only 25 out of 26 devices completed the test run
If a test run continues to run because some tests have failed, click the test run and click Stop run, as shown in Figure 54. Also shown in Figure 54 is the cause of the failed test: "Application not responding." The device on which the test failed is also displayed for further debugging of the issue.
Figure 54: Stopping a Test Run
In this article, we introduced the AWS Device Farm for automated testing. The AWS Device Farm provides several device types. Not only Android apps, but also iOS apps and Web apps may be tested on the AWS Device Farm. In a subsequent article, we shall discuss remote access to a device for on-device testing of an Android application.