The Evolution of Mobile Application Performance Monitoring

Introduction to APM

How have app usage, consumer expectations, and performance monitoring changed over time?

Global smartphone use is at an all-time high. An estimated 6.8 billion people (about 85% of the world’s population) use smartphones. According to Statista, people in the US spend over five hours on their phones daily. Heavy app usage has raised the bar for performance standards — consumers now have higher expectations and lower tolerances for slow or buggy apps. Performance monitoring, already commonly used in web development, has now become essential for apps as well. 

Crash reporting tools have made it easier for developers to improve their apps' stability, a major contributor to the user experience. When a favorite app crashes on them, users are likely to uninstall it, leave a bad review, and move on to a different app. That makes it extremely important that your app isn't prone to crashes.

But stability is one facet of overall app performance. Speed also contributes to the user’s experience and perception of your app and brand. The internet is faster than ever, devices are more advanced than ever, and users expect peak performance from their mobile apps. Just a few seconds of delay at launch or between two screens can lead users to uninstall your app and switch to a competing product—of which there are many. To maintain competitiveness, an app has to be fast, stable, regularly updated, and resource-conscious. 

Overall app performance is a function of both speed and stability: how often does the app crash or freeze? How long do users have to wait between screens or after tapping actions?

That means that mobile app performance monitoring has grown beyond crash reporting.  Performance monitoring software is now sophisticated enough to track multiple metrics for app quality and hunt down specific processes or situations that cause lag. 

Let’s take a look at how app performance monitoring has evolved over the last decade and where it’s headed next.

User Expectations: Then and Now

Performance standards started with crash-free rates

Back in 2012, end users might have been okay with facing occasional crashes, but this is no longer the case. According to recent research, 80% of app users will uninstall the application on the first day if they experience a crash, meaning that even temporary issues can churn users before they get a chance to explore your app. Users also expect apps to be fast and responsive—48% of users won’t hesitate to uninstall an app that feels slow or stuttery.

Of course, app performance monitoring was focused on user experience in 2012, just as it is today, but crash rates took center stage. Users gave one-star ratings and angry reviews to apps that crashed, even when the crash was caused by the OS rather than the app itself.

Because of this, for many years, the stability of your application had the greatest influence on your app rating and growth. It was practically impossible to maintain a high app rating without a good crash-free rate. Accordingly, mobile teams prioritized the crash-free rate above other metrics.

Product stability impacted not only app store rankings, but app store optimization and discoverability as well. It was common for devs and product managers to aim for a high crash-free rate and correlate that with a higher ranking and better visibility in app stores without factoring in other causes. As smartphones and internet speeds have evolved, so have user expectations. 

People switching from cell phones to smartphones tolerated performance issues

Just 56% of Americans owned a smartphone in 2013; by 2021, that figure had grown to 85%. But in the early 2010s, people new to smartphones understandably had a higher tolerance for apps taking their time to load. Switching from a regular cell phone to a smartphone was a big step up in features and functionality, and those users had no past experiences to shape their expectations. Slow internet and poor app performance weren't a concern when most users were still fascinated by the novelty of the smartphone itself.

People didn’t use smartphones as much 

In the early 2010s, smartphones weren’t the primary source of information and entertainment they have become today. Users spent an average of just two hours and 42 minutes per day on their phones back in 2014, as opposed to five to six hours in the 2020s. Ecommerce was just over 5% of the total retail sales in 2012, compared to about 15% just 10 years later. 

Over the course of a decade, our habits transformed. People now use their smartphones in place of desktop or laptop computers to search for information, use social media, play video games, buy new products, check their email, and more. Smartphone apps also offer more features and options than ever before, which has further raised user expectations for functionality and performance.

The internet was just slower

Smartphone users were accustomed to app launches taking a while back in the day, partly because the internet itself was slower. In 2012, the average internet speed in the US was around 5 Mbps for mobile and 14.3 Mbps for broadband. By 2022, those figures had grown to 78.86 Mbps for mobile and 192.73 Mbps for broadband, roughly 13 times faster. Users' expectations for app speed have naturally risen along with it.

Smartphones had slower processors

The processing power of the latest phones is hundreds of times greater than that of phones from 2012, which translates directly into better app performance.

The Samsung Galaxy S III was one of the most popular smartphones in the early 2010s. It had a 1.4 GHz quad-core processor, 1 GB of RAM, and 32 GB of storage. In comparison, the Samsung Galaxy A53, released in 2022, has an octa-core CPU with two cores at 2.4 GHz and six cores at 2 GHz, 6 GB of RAM, and 128 GB of storage.

Smartphones also come with significantly better capabilities, faster processing, faster download and upload speeds, better graphics, and more memory than their predecessors in the 2010s.

Less competition meant a lower bar for app developers

The App Store had about half a million apps and games in 2012. That number grew to more than 1.6 million by 2022. Google Play had 3.5 million apps in 2022, as opposed to roughly 675,000 in 2012.

More apps in each category mean more competition for developers and more options for users, and more options mean a lower tolerance for slow or buggy performance. Users can simply switch to other apps if yours does not perform as expected.

Beyond Stability: App Quality Metrics

While the crash-free rate continues to be important, evolving user expectations mean there are more metrics that will take center stage in mobile app performance monitoring in the coming years. UI stutters, app hangs, user terminations, launch time, network performance, and screen loading time all affect user experience, engagement, and retention. 

All of these should be tracked closely so you can study their relationship with user behavior and optimize them as needed. Next-generation app performance monitoring software can monitor wait times and identify specific issues that impact the end-user experience.

1. Network performance

Network performance monitoring assesses the health of your network and of all devices connected to it. For mobile apps, the health and availability of the network depend on the service provider, but app developers still have to track network availability and how it affects the user's experience.

Network wait times and failures should be tracked on both the server side and the client side for developers to fully understand their app's behavior. Even if a performance issue is due to a slow or unstable network connection, users may attribute it to your app. You can monitor network performance by tracking all network requests your app makes and setting alerts for timeouts and similar issues.
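As a sketch of what client-side tracking can look like, the interceptor below times each request and flags slow calls and timeouts. It assumes the app uses OkHttp; the class name and the two-second threshold are illustrative, not part of any particular SDK.

```kotlin
import android.util.Log
import java.net.SocketTimeoutException
import okhttp3.Interceptor
import okhttp3.Response

// Minimal client-side sketch: time every request made through OkHttp and flag
// slow calls and timeouts. A real APM SDK hooks in at a similar level.
class NetworkTimingInterceptor(private val slowThresholdMs: Long = 2_000L) : Interceptor {

    override fun intercept(chain: Interceptor.Chain): Response {
        val request = chain.request()
        val startMs = System.currentTimeMillis()
        try {
            val response = chain.proceed(request)
            val durationMs = System.currentTimeMillis() - startMs
            if (durationMs > slowThresholdMs) {
                Log.w("NetworkTiming", "Slow request: ${request.url} took $durationMs ms")
            }
            return response
        } catch (e: SocketTimeoutException) {
            Log.w("NetworkTiming", "Timeout: ${request.url} after ${System.currentTimeMillis() - startMs} ms")
            throw e
        }
    }
}

// Usage: OkHttpClient.Builder().addInterceptor(NetworkTimingInterceptor()).build()
```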

In case of a slow network, you can manage user expectations with in-app messages or notifications to let the users know that their experience will be affected unless they connect to a faster network. And even if your users do not have access to a better network, they’re less likely to blame your app if it’s slow.  

2. Launch time

Launch time monitoring tracks how long your app takes to load based on the state of the app in the mobile device’s memory. 

  • Cold launch: when an app loads from scratch 
  • Warm launch: when some of the app’s processes are running in the background, but others are disabled until the app loads
  • Hot launch: when the app is already running in the background 

If your app takes too long to launch, especially during a hot launch, users will most likely switch to a faster alternative. If it launches quickly, it helps users get to the content they want and provides a frictionless experience.

To monitor your launch time, set alerts for every time the app takes more than a few seconds to launch. Users expect cold launches to take around five seconds and hot launches to take around 1.5 seconds. By tracking your launch times for different operating systems and devices, you can optimize your app to start quickly, resulting in a better experience for your users. 
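For illustration, here is a rough sketch of measuring cold launch time on Android by hand. The MyApp and LauncherActivity class names are placeholders, and this approach only captures the window from Application.onCreate to the first activity's onResume, so it slightly undercounts system work done before your code runs.

```kotlin
import android.app.Application
import android.os.SystemClock
import android.util.Log
import androidx.appcompat.app.AppCompatActivity

// Placeholder Application subclass (registered via android:name in the manifest):
// records when the process's code starts running.
class MyApp : Application() {
    override fun onCreate() {
        super.onCreate()
        appStartMs = SystemClock.elapsedRealtime()
    }
    companion object {
        var appStartMs = 0L
    }
}

// Placeholder launcher activity: reports the elapsed time once, on first resume.
class LauncherActivity : AppCompatActivity() {
    private var launchReported = false

    override fun onResume() {
        super.onResume()
        if (!launchReported) {
            launchReported = true
            val coldLaunchMs = SystemClock.elapsedRealtime() - MyApp.appStartMs
            Log.d("LaunchTime", "Cold launch took $coldLaunchMs ms")
            // Activity.reportFullyDrawn() can also be called once content is actually visible.
        }
    }
}
```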

3. Screen loading time

Screen loading time measures how long individual screens in your app take to load. If some or all screens take longer to load than expected, you will end up losing users. The acceptable loading time for each screen depends on your industry: people using a project management app will expect screens to load faster than they would in a graphics-heavy video game.

The rate at which a screen and all of its contents load depends on:

  • The quality of your code
  • The quality of your backend servers
  • Your content delivery networks (CDNs)
  • The sizes of images used on the screen
  • The number of requests the app makes to the server when moving from one screen to the next
  • Your caching rules and how you store your most commonly used assets

You can track screen loading time on Android by measuring the time an activity spends between its onCreate and onResume callbacks. On iOS, the same can be achieved by measuring the time between viewDidLoad and viewDidAppear.
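Here is a minimal sketch of that Android measurement. The activity name, layout resource, and log tag are placeholders; an APM SDK would normally record and aggregate this for you.

```kotlin
import android.os.Bundle
import android.os.SystemClock
import android.util.Log
import androidx.appcompat.app.AppCompatActivity

// Minimal sketch: measure the time a screen spends between onCreate and onResume.
class CheckoutActivity : AppCompatActivity() {

    private var createdAtMs = 0L

    override fun onCreate(savedInstanceState: Bundle?) {
        createdAtMs = SystemClock.elapsedRealtime()      // screen starts loading
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_checkout)       // placeholder layout resource
    }

    override fun onResume() {
        super.onResume()
        val loadTimeMs = SystemClock.elapsedRealtime() - createdAtMs
        Log.d("ScreenLoad", "Checkout screen loaded in $loadTimeMs ms")
    }
}
```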

Next-generation performance monitoring tools like Instabug can track your screen loading time by default and give you access to detailed logs that will help you identify the factors negatively impacting your loading time.

4. App hangs

When an app doesn't respond to a user's request for more than 0.25 seconds (250 milliseconds), it results in a negative user experience. Imagine a user taps a button to go to another screen or to fetch a response to their query, and the app hangs on them. Each time, they have to restart the app, and if it keeps hanging on the same screen, they are more likely to switch to a competitor than to report the issue. Instead of relying on users to report such issues, monitor your app and set an alert for every time it hangs.

App hangs are measured as the percentage of time the app was unresponsive while the user was on a particular screen. It's best to measure the duration of each hang, not just the number of hangs, because a user can experience multiple hangs on a single screen.
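One common way to detect hangs on Android is a main-thread watchdog, the same idea used by open source ANR watchdogs: a background thread asks the main thread to respond and reports when it doesn't within the threshold. The sketch below is illustrative and uses the 250 ms threshold mentioned above.

```kotlin
import android.os.Handler
import android.os.Looper
import android.util.Log

// Rough sketch of a main-thread watchdog: post a tiny task to the main thread and,
// from a background thread, check whether it ran within the hang threshold.
class HangWatchdog(private val thresholdMs: Long = 250L) : Thread("hang-watchdog") {

    private val mainHandler = Handler(Looper.getMainLooper())
    @Volatile private var responded = false

    override fun run() {
        while (!isInterrupted) {
            responded = false
            mainHandler.post { responded = true }   // ask the main thread to respond
            sleep(thresholdMs)
            if (!responded) {
                // The main thread has not processed our message yet: report a hang.
                Log.w("HangWatchdog", "Main thread unresponsive for > $thresholdMs ms")
            }
        }
    }
}

// Usage (e.g. in Application.onCreate): HangWatchdog().start()
```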

5. UI responsiveness

UI stutters, lags, or delays happen when the response to a user's action on a particular screen takes longer than expected. Unlike a hang, where the app becomes completely unresponsive, a lag or stutter is just a delayed response.

Let's say you enter some values in an app and tap a button to get an output. If the app lags, it will still return the intended values, but after a longer delay than usual. If it hangs, the app becomes completely unresponsive: the user has to force-quit, the OS kills it, it crashes, or it eventually becomes responsive again but never returns the expected values.

The terms ‘lag’ and ‘UI stutter’ differ slightly in that ‘stutter’ is usually associated with a lack of smoothness in graphics and animations. ‘Lag’ usually refers to overall app slowness or delayed responses.

To provide an exceptional user experience, you want your app to have zero lag or stutter. Lags and UI stutters can be as frustrating as long loading times and hangs. Your users will react negatively to such issues, especially if there are other apps out there that don’t lag. 
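On Android, one practical signal of stutter is dropped frames, which you can observe with the Choreographer API. The sketch below assumes a 60 Hz display (a 16.7 ms frame budget) and simply logs when frames appear to have been missed.

```kotlin
import android.util.Log
import android.view.Choreographer

// Minimal sketch: watch frame timestamps and log when one or more frames were skipped,
// a common symptom of UI stutter.
class FrameDropLogger : Choreographer.FrameCallback {

    private var lastFrameNanos = 0L
    private val frameBudgetNanos = 16_700_000L   // ~one frame at 60 fps

    override fun doFrame(frameTimeNanos: Long) {
        if (lastFrameNanos != 0L) {
            val elapsed = frameTimeNanos - lastFrameNanos
            val skipped = (elapsed - frameBudgetNanos) / frameBudgetNanos   // approx. frames missed
            if (skipped > 0) {
                Log.w("FrameDropLogger", "UI stutter: ~$skipped dropped frame(s)")
            }
        }
        lastFrameNanos = frameTimeNanos
        Choreographer.getInstance().postFrameCallback(this)   // keep watching
    }
}

// Usage: Choreographer.getInstance().postFrameCallback(FrameDropLogger())
```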

You can track UI stutters with the latest application monitoring tools. Instabug also gives you detailed logs of user activity and status so you can see CPU usage, memory consumption, the number of requests made, etc., when the app lagged.

6. User terminations

A user termination is when a user force-quits the app. Some users may close an app because they got what they needed, but others may close the app to force a reload as a result of app hangs or UI stutters. 

It may be tough to differentiate sessions that users terminated because they didn’t need to use the app anymore from sessions that were terminated due to negative experiences. But you can keep an eye on this variable by relating terminations to new feature and version releases. 

If you notice a spike in user terminations after a new version or feature release, you can start looking into other variables such as launch time, loading times, and app hangs to see where the problem lies. User termination reports do not carry a stack trace, so it is best to study these reports in relation to new releases.

Too many users force-quitting the app means there are major issues that should be handled immediately to avoid user churn.  

Evaluating performance with Apdex

With so many metrics gaining importance, in addition to the standard few you already track, you need a north star: a guiding metric you can use to gauge overall quality. Your Apdex score is a representation of multiple performance metrics, like app stability, wait times, and UI hangs. Why? Evaluating performance on any single metric alone can be limiting, because your app's user experience is the combination of numerous factors. It's difficult to pin the performance of an application down to one specific data point or metric.

For example, if you focus on network performance as your north star metric and you optimize it to the point where you don’t get any network failures at all, you may think you have achieved peak performance. In the meantime, your users may still be dissatisfied with your app because it’s hanging or taking too long to launch or load content. 

Therefore, your Apdex score is a valuable metric because it’s a comprehensive summary of your average user experience. Keeping an eye on your Apdex score will help you maintain high levels of mobile app performance.

Your Apdex score combines stats on crashes, UI hangs, and wait times from your app performance and assigns a single, comprehensive score from 0 to 1.

How does it work? Instabug calculates the Apdex score using the following formula:

Apdex score = (Satisfying sessions + 0.5 * Tolerable sessions) / Total Sessions

The total sessions include crashed sessions as well as frustrating, tolerable, and satisfying sessions. Each session is classified by the number of performance issues the user faced during it.

Apdex scores are on a scale of zero to one, with anything over 0.94 being a high score and anything less than 0.5 being unacceptable.
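To make the formula concrete, here is a small sketch of the calculation. The session counts and the SessionCounts type are illustrative; in practice your APM tool classifies sessions for you based on the issues they contain.

```kotlin
// Illustrative session counts for one time window.
data class SessionCounts(
    val satisfying: Int,
    val tolerable: Int,
    val frustrating: Int,
    val crashed: Int,
)

// Apdex = (satisfying + 0.5 * tolerable) / total sessions
fun apdex(counts: SessionCounts): Double {
    val total = counts.satisfying + counts.tolerable + counts.frustrating + counts.crashed
    if (total == 0) return 1.0   // no sessions yet: nothing to penalize
    return (counts.satisfying + 0.5 * counts.tolerable) / total
}

fun main() {
    val score = apdex(SessionCounts(satisfying = 850, tolerable = 100, frustrating = 30, crashed = 20))
    println("Apdex score: %.2f".format(score))   // prints "Apdex score: 0.90"
}
```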

With Instabug, you can track the overall Apdex score for your app and set alerts for when the score drops below a certain limit. You can also track the Apdex scores of different segments separately. For example, you can set different alerts for users with older operating systems or particular devices.  

Futureproof Your User Experience

Rising user expectations for app performance mean you will need a reliable app performance monitoring tool that lets you customize the metrics you want to monitor, now and in the future. An app's performance is increasingly becoming a deciding factor for users browsing app stores that are saturated with competing apps.

The number of smartphone subscriptions is projected to increase to 7.6 billion by 2027. Devices will only get faster, apps will become feature-heavy, and users will expect flawless experiences.

Instabug helps you track all performance and stability metrics. You can customize how your scores are calculated and get real-time alerts whenever your app’s performance is not where it should be. Visibility into the processes impacting performance will also allow your engineering team to quickly diagnose and troubleshoot performance issues before they impact a wider audience.

Learn more:

Instabug empowers mobile teams to maintain industry-leading apps with mobile-focused, user-centric stability and performance monitoring.

Visit our sandbox or book a demo to see how Instabug can help your app.

Seeing is Believing, Start Your 14-Day Free Trial

In less than a minute, integrate the Instabug SDK for iOS, Android, React Native, Xamarin, Cordova, Flutter, and Unity mobile apps