iOS Performance Comparison
iOS performance testing runs your app on real, physical devices using Emerge's testing suite, allowing you to detect performance regressions for any PR before it's merged.
How it works
Each uploaded build is tested against its base build to determine the change in duration for a specific "span". A span is the duration between a start and stop point that you specify in your app's code. Default spans for app launch are provided automatically. Performance testing tells you when there is a change in how long the app takes to mark a span as stopped after it is started.
Each time a new version of the app is uploaded, it gets installed on an iPhone SE 2020, along with the version it's being compared against (the app referenced by the specified baseSha). Both apps are installed on the device using the process described in App Installation. Then, for each version, a setup XCUITest is optionally run (e.g., logging the user in) before the UI test that exercises the span is run repeatedly. Each run generates a single sample of that span. Once Emerge has a statistically significant result from all the samples, the result is made available through a source control integration such as a GitHub comment or status check.
Integration
Marking span start and end times
For each span to be tracked, Emerge needs to know at what point it starts and stops. To do this, it uses NotificationCenter. To mark when a span has started, post the following:
NotificationCenter.default.post(name: Notification.Name("EmergeMetricStarted"), object: nil, userInfo: [
    "metric": "someSpanName", // Where "someSpanName" is the unique string to identify the span
])
To mark when a span has finished, post the same notification with "EmergeMetricEnded" as the name instead of "EmergeMetricStarted". Measuring startup time is a special case, however, because there's no way to post a Notification at the very start of a launch. To measure startup time, skip the "EmergeMetricStarted" notification, but add "use_process_start": true as a key-value pair alongside "metric": "myMetric" in the "EmergeMetricEnded" notification. This causes Emerge to use the process start time as the start of the span.
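As a sketch, measuring startup time this way looks like the following (the span name and call site are illustrative):

```swift
import Foundation

// Posted once launch work is done, e.g. at the end of
// application(_:didFinishLaunchingWithOptions:). No "EmergeMetricStarted"
// is posted; "use_process_start" tells Emerge to use the process start
// time as the span's start.
func markLaunchSpanEnded() {
    NotificationCenter.default.post(
        name: Notification.Name("EmergeMetricEnded"),
        object: nil,
        userInfo: [
            "metric": "customLaunch", // illustrative span name
            "use_process_start": true,
        ]
    )
}
```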
Each span needs a unique value for "metric" (the name of the span), with at most one start/stop for each unique "metric".
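For example, a custom span around loading a screen might be marked like this (the names are illustrative; each "metric" value must be unique, with at most one start/stop):

```swift
import Foundation

// Illustrative: a "feedLoad" span measuring how long the feed takes to load.
func feedWillStartLoading() {
    NotificationCenter.default.post(
        name: Notification.Name("EmergeMetricStarted"),
        object: nil,
        userInfo: ["metric": "feedLoad"]
    )
}

func feedDidFinishLoading() {
    NotificationCenter.default.post(
        name: Notification.Name("EmergeMetricEnded"),
        object: nil,
        userInfo: ["metric": "feedLoad"]
    )
}
```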
Recording the span
Test lifecycle
Note: Different steps of the test lifecycle use either the base or the head app. If you are setting things up and changing code in the app itself (not the test package), make sure you update both the base and head apps.
Crash reporters
Crash reporters should be disabled at runtime in builds uploaded for performance tests. For more details see Crash Reporters (iOS).
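One common approach, sketched here with a hypothetical environment variable and crash-reporter API since the exact mechanism depends on your setup, is to gate the reporter's startup behind a runtime check:

```swift
import Foundation

// Hypothetical sketch: skip crash-reporter startup in perf-test builds.
// "DISABLE_CRASH_REPORTER" is an illustrative variable name that your
// build configuration or test setup would need to set.
func startCrashReportingIfAllowed() {
    guard ProcessInfo.processInfo.environment["DISABLE_CRASH_REPORTER"] == nil else {
        return
    }
    // CrashReporter.start() // hypothetical reporter API
}
```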
Setup
Some of your performance tests may need a setup XCUITest that gets the app into a state where it can be efficiently tested repeatedly. For example, to test the app's startup time for a logged-in user, you might have a setup XCUITest that logs the user in, so the main test can simply launch the app without having to log the user in and out each run.
Performance test
Startup Tests
The below steps are for adding a custom test, which is a more complicated setup and should only be attempted after startup tests are working.
The actual performance test code is run from a wrapping XCUITest, which decides how many times and how exactly to run the test code to get more samples of the span. At a high level, your performance tests will look like this:
// In a UI test bundle
import XCTest
import EMGPerfTesting

class ExamplePerfTest: NSObject, EMGPerfTest {
    func runInitialSetup(withApp app: XCUIElement) {
        // One-time setup before all the perf test iterations, such as logging in.
        // The app is already launched.
        app.buttons["login"].tap()
    }

    func runIteration(withApp app: XCUIElement) {
        // Test that triggers the spans being measured. The app is already launched.
        app.buttons["searchPage"].tap()
    }
}
Each performance test needs to import the EMGPerfTesting framework and implement the methods in the EMGPerfTest protocol. Instructions to integrate this framework are here. Note that ExamplePerfTest doesn't subclass XCTestCase, even though it's in a test bundle; instead it has just two methods that each take a pre-constructed XCUIElement. This is because the app is launched by the test running infrastructure and provided to you as an XCUIElement. Also, since the test is run in a very specific way (setup runs once, then the test runs a variable number of times), it can't be a normal XCTestCase run with Xcode's test runner; it has to use this alternate EMGPerfTest protocol. Your code will be loaded from the test bundle and run as something roughly equivalent to:
let test = ExamplePerfTest()
let setupApp = XCUIApplication()
setupApp.launch()
test.runInitialSetup(withApp: setupApp)

while shouldContinue() {
    let iterationApp = XCUIApplication()
    iterationApp.launch()
    test.runIteration(withApp: iterationApp)
}
The test should support iOS 15.5 and later.
Although perf tests don't inherit from XCTestCase, it's still easy to verify that they work as expected with normal Xcode UI testing. Just add a test like the following:
import XCTest
import EMGPerfTesting

class PerfTestRunningTests: XCTestCase {
    func testAllPerfTests() throws {
        PerfTestRunner.runAllPerfTestsForBundle(ofClass: ExampleUITests.self)
    }
}
PerfTestRunner is a class in EMGPerfTesting that runs a simplified version of Emerge's actual test runner to ensure that everything works. This simplified local run doesn't include steps such as re-signing the app. Make sure your app works with a default startup test to confirm it is compatible before adding a custom UI test.
App exit
The app is automatically quit after your test runs. When the provided runIteration function returns, the app is kept running for up to 20 seconds or until all expected spans (how to specify expected spans is explained below) have been received. If not all expected spans have been received after 20 seconds, the test fails with an error.
Recording Custom metrics
Emerge allows custom metrics to be recorded from your performance tests and exposed through the /getPerfTest API endpoint. Custom metrics, if present, will be returned as-is in the userInfo field of the performanceTests array objects.
To save custom metrics from your test, there are two options:
1. Include them in the EmergeMetricEnded notification:

NotificationCenter.default.post(name: Notification.Name("EmergeMetricEnded"), object: nil, userInfo: [
    "metric": "someSpanName", // Unique span name
    "customValue": 1, // Add any serializable value
])
2. If no custom metrics are found in the notification's userInfo, you can save a JSON file called metrics.json in the app's Documents directory:

func saveMetrics() {
    guard let documentsDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).last else { return }
    let fileURL = documentsDirectory.appendingPathComponent("metrics.json")
    let customMetrics: [String: Any] = [
        "customValue": 1,
    ]
    guard let jsonData = try? JSONSerialization.data(withJSONObject: customMetrics, options: .prettyPrinted),
          let jsonString = String(data: jsonData, encoding: .utf8) else { return }
    try? jsonString.write(to: fileURL, atomically: true, encoding: .utf8)
}
The custom metrics file must be named metrics.json! Emerge will not pick up custom metrics unless the file is in the Documents directory and explicitly named metrics.json.
Emerge will automatically pick up this file and save the raw JSON in the userInfo field available in the performanceTests array from the /getPerfTest endpoint. These metrics are not exposed in Emerge's UI.
Custom metrics are supported per test. Any custom metrics found during a test will be added to all spans within the test.
Potential performance implications
Saving a file during a performance test can introduce variance, as I/O operations may affect the test duration. If saving custom metrics, Emerge suggests using custom spans as the primary measurement rather than the duration of the whole test (the default). Information about custom spans is available in the Performance testing specific spans section.
Uploading test bundles
Once everything looks good, add a top-level folder called EmergePerfTests to the uploaded .xcarchive (a sibling of the Products directory) and add all the .xctest bundles there. Also add a file at EmergePerfTests/info.yaml that contains a list of EMGPerfTest classes that should be run, and the set of spans that should be measured for each one, specified as such:
testClasses:
  - class: ExamplePerfTest1
    spans:
      - didFinishLaunching
  - class: ExamplePerfTest2
    spans:
      - didFinishLaunching
      - customSpan
When running a test, the yaml file and test bundle from the head build are used to decide which tests to run. This ensures that when a new test is added in a PR, it will run before the PR is merged.
There's also a special class name, startup, which doesn't require an actual .xctest bundle and gives you the default test that just launches the app. Also note that didFinishLaunching and didBecomeActive are spans that are automatically emitted for you to use if you want; they track the end of applicationDidFinishLaunching(_:) and applicationDidBecomeActive(_:) respectively. If you don't include any EmergePerfTests folder with your upload, this startup test with a didFinishLaunching span is used by default.
Code Signing
For details on code signing during performance tests, see CI Installation.
App installation
To test your app, it first needs to be installed on a device; to do this, the app must be re-signed.
Entitlements
We re-sign the app with a custom provisioning profile and remove certain entitlements. The following entitlements are kept; the rest are stripped:
"application-identifier"
"com.apple.developer.team-identifier"
"get-task-allow"
"com.apple.developer.siri"
The app identifier is left unchanged, but the team identifier and app ID prefix are replaced with a new team identifier. The com.apple.developer.siri entitlement is preserved, while others, including app groups, associated domains, and push notifications, are stripped from the app's entitlements.
This may change some behavior, so make sure your app doesn't crash after re-signing, for example by asserting that a particular app group is available. Additionally, security measures that rely on the app's signature may break, so these should be disabled before uploading to Emerge.
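For example, since app group entitlements are stripped, code that assumes the shared container exists should fall back gracefully rather than crash. A minimal sketch, with a hypothetical group identifier:

```swift
import Foundation

// App groups are stripped during re-signing, so containerURL(...) may
// return nil in Emerge builds. Fall back instead of force-unwrapping.
func sharedStoreURL() -> URL {
    let groupId = "group.com.example.app" // hypothetical identifier
    if let groupURL = FileManager.default
        .containerURL(forSecurityApplicationGroupIdentifier: groupId) {
        return groupURL
    }
    // Fallback: use the app's own Documents directory.
    return FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
}
```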
Bundle Id
The bundle id is modified in the application's Info.plist. This modified bundle id will show up on crash reports and may require changes to app code to expect the different bundle id. Two bundle ids are used, one for the base and one for the head app. If you need specifics on what string is used for the bundle id, contact your Emerge representative.
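To avoid depending on a specific value, read the bundle id at runtime rather than hardcoding it; for example:

```swift
import Foundation

// Reads the (possibly modified) bundle id from the Info.plist at runtime,
// so the same code works in both the original and re-signed builds.
let bundleId = Bundle.main.bundleIdentifier ?? "unknown"
```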