Memory Profiling 101 (100 Days of Google Dev)

ALEX DANILO: The Android ecosystem is booming with devices. You’ve got phones, tablets, wearables, Casts, Tangos, Cardboards, and even cars. And savvy developers know that the most important rule for creating stable, consistent performance across all of these platforms comes down to one simple thing: memory. My name is Alex Danilo, and the key to optimizing your app’s memory usage, and the performance that comes with it, has everything to do with wielding the Android SDK’s memory tools like an awesome performance ninja. So let’s begin your training. Now, it’s worth pointing out that the ins and outs of application memory are much more complex than simply allocating and releasing objects, especially in a garbage-collected language like Java. See, garbage collection events occur any time the runtime needs to free up memory, which normally is a good thing. But if your code isn’t using memory properly, then a flood of GC events can occur, eating into your performance.
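To make that concrete, here’s a minimal sketch of the kind of per-frame allocation pattern that floods the heap with short-lived objects and triggers those bursts of GC events, alongside the object-reuse version that avoids it. The BadgeView class is purely illustrative and not something from the video.

```java
import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.Rect;
import android.view.View;

// Hypothetical custom view, used only to illustrate the pattern.
public class BadgeView extends View {

    // Reusing objects across frames keeps the heap quiet.
    private final Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
    private final Rect bounds = new Rect();

    public BadgeView(Context context) {
        super(context);
        paint.setColor(Color.RED);
    }

    @Override
    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);

        // The churn-heavy version of this method would instead do:
        //   Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
        //   Rect bounds = new Rect(0, 0, getWidth(), getHeight());
        // on every single frame. That's dozens of short-lived allocations per
        // second, which is exactly what fills the heap and triggers a flood
        // of GC events.

        bounds.set(0, 0, getWidth(), getHeight());
        canvas.drawRect(bounds, paint);
    }
}
```

Hoisting the Paint and Rect into fields means they’re allocated once for the life of the view instead of once per drawn frame.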

You can get a sense of how much memory your application is using with the Memory Monitor tool inside Android Studio. This simple tool displays a graph that updates every second or so, showing how much memory your application is currently using and how much is still available to it. Each time the dark blue section of the graph dips, a garbage collection event has occurred, freeing up memory for the application. And if you see lots of these dips in a short period of time, chances are you’ve got something crazy going on in your code that’s really eating into your app’s performance. But sadly, the simplicity of Memory Monitor doesn’t help you track down where this madness is coming from.
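Memory Monitor reads these numbers for you, but if you’d like to log roughly the same figures from inside your app, a sketch like the following works, using the Runtime and android.os.Debug APIs. The MemoryLogger class name is mine, not part of the SDK.

```java
import android.os.Debug;
import android.util.Log;

// Hypothetical helper: logs roughly the same numbers Memory Monitor graphs.
public final class MemoryLogger {

    private static final String TAG = "MemoryLogger";

    private MemoryLogger() {}

    public static void logHeapState() {
        Runtime runtime = Runtime.getRuntime();
        long usedKb = (runtime.totalMemory() - runtime.freeMemory()) / 1024;
        long maxKb = runtime.maxMemory() / 1024;  // upper limit for this app's Java heap
        long nativeKb = Debug.getNativeHeapAllocatedSize() / 1024;

        Log.d(TAG, "Java heap used: " + usedKb + " KB of " + maxKb
                + " KB max, native heap: " + nativeKb + " KB");
    }
}
```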

If we want more fine-grained knowledge of the state of our memory, and which objects are actually taking up space, we can use a handy tool called Heap Viewer inside DDMS. This tool lets you view how much heap memory a process is using, which is useful for tracking down memory craziness during various parts of your application’s execution. Once you open up DDMS and have your phone connected, you can click on the Update Heap button. This turns on the profiling system and starts recording heap information. Once that’s done, click on the Heap tab, which is where that data will be displayed. You’ll now be looking at the Java heap memory information for the selected application. Notice the text at the top of the tab, the one that says heap updates will happen after every GC for this client? Well, click on Cause GC to update your data.
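DDMS drives these heap updates from the host side, but if you’d rather capture a full heap snapshot from code, at a moment you control, something like the sketch below works using Debug.dumpHprofData(). The HeapSnapshots helper and the output file name are assumptions for illustration; the resulting .hprof file can be pulled off the device and opened in the usual heap analysis tools (after converting it with the SDK’s hprof-conv, if needed).

```java
import android.content.Context;
import android.os.Debug;
import android.util.Log;

import java.io.File;
import java.io.IOException;

// Hypothetical helper: grabs a full HPROF heap snapshot on demand.
public final class HeapSnapshots {

    private static final String TAG = "HeapSnapshots";

    private HeapSnapshots() {}

    public static void dump(Context context) {
        // Nudge the VM to collect first, so the snapshot reflects live objects
        // rather than garbage that simply hasn't been reclaimed yet.
        Runtime.getRuntime().gc();

        File out = new File(context.getFilesDir(), "snapshot.hprof");
        try {
            Debug.dumpHprofData(out.getAbsolutePath());
            Log.d(TAG, "Heap dumped to " + out);
        } catch (IOException e) {
            Log.e(TAG, "Heap dump failed", e);
        }
    }
}
```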

You’ll see the data table quickly update, showing you what data is currently alive on the heap. When you select one of the data types, the bottom panel updates, showing you a histogram of the number of allocations at each specific memory size. For example, you can see here that there are over 100 one-byte arrays on the heap at the 48-byte size. Meanwhile, it looks like there’s a one-byte-array object that’s megabytes in size. This tool is helpful for seeing what types of objects your application has allocated, as well as what their sizes are on the heap.

For example, if you see three megabytes of bitmap objects still on the heap even though you’ve destroyed the activity that used them, you might have a memory leak to worry about. But sadly, that view doesn’t tell you where the data is being allocated in your code. For that, we need a different tool called Allocation Tracker. You can load up Android Studio and click on the Android tab at the bottom of the window. This effectively brings up a version of DDMS that runs docked inside your IDE.
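For reference, the kind of code that produces that symptom, bitmaps still reachable after their activity is destroyed, usually involves a reference that outlives the activity, most often a static field. This is a hypothetical sketch of that anti-pattern; GalleryActivity and its fields aren’t from the video.

```java
import android.app.Activity;
import android.graphics.Bitmap;
import android.os.Bundle;

// Hypothetical activity: the static fields are the bug.
public class GalleryActivity extends Activity {

    // A static field lives as long as the process does, so this Bitmap (and,
    // via sLeakedActivity below, the whole Activity and its view tree) stays
    // reachable after onDestroy() -- exactly the "bitmaps still on the heap
    // after the activity is gone" pattern Heap Viewer would surface.
    private static Bitmap sCachedHero;
    private static Activity sLeakedActivity;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sCachedHero = Bitmap.createBitmap(1024, 1024, Bitmap.Config.ARGB_8888); // ~4 MB
        sLeakedActivity = this; // never cleared: classic Activity leak
    }
}
```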

Once you’ve connected to your device and selected a debuggable application, you can click on the Start Allocation Tracking button, and then play around in your application for a while. When you’re done, click on the Stop Allocation Tracking button. Now note, depending on how long you’ve been doing stuff, this might take a while to complete, so just hang out for a bit. You’ll notice that a new tab appears at the top of your IDE, listing all the allocations that occurred during your sampling session. Each row in this view represents one allocation. The order column tells you what order the allocation happened in. The allocated class column tells you what type of data was allocated, whether it’s an array or a specific class object. The size column tells you how big, in bytes, the allocation was. The thread ID column tells you which thread allocated that data, and the allocation site column tells you which function was responsible for allocating that memory.
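The Allocation Tracker UI is the right tool for digging into individual call stacks, but if you just want a coarse count of how many allocations a particular code path performs, the global allocation counters in android.os.Debug can be bracketed around it. The AllocCounter helper below is a sketch of that idea (these counters are deprecated in later SDK releases); measure() and the Runnable you pass it are assumptions for illustration.

```java
import android.os.Debug;
import android.util.Log;

// Hypothetical helper: brackets a code path with the SDK's global
// allocation counters to get a rough count and size of its allocations.
public final class AllocCounter {

    private static final String TAG = "AllocCounter";

    private AllocCounter() {}

    public static void measure(Runnable codePath) {
        Debug.resetGlobalAllocCount();
        Debug.resetGlobalAllocSize();

        Debug.startAllocCounting();
        codePath.run();  // e.g. the scroll or animation you suspect
        Debug.stopAllocCounting();

        Log.d(TAG, "Allocations: " + Debug.getGlobalAllocCount()
                + " objects, " + Debug.getGlobalAllocSize() + " bytes");
    }
}
```

Wrapping a suspicious operation this way gives you a quick before-and-after number you can compare while you optimize, without leaving the device.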

When you click on an allocation, the bottom of the panel updates, showing you the full call stack for that allocation. This tool is super handy for tracking down problems like memory churn: if you see a bunch of similar objects allocated one after the other, so their allocation order is close together, then you’ve found a great place to focus your optimization. But your performance ninja training is far from complete. That’s why you need to check out the rest of the Android Performance Patterns resources. And don’t forget to join the Google+ community to get more tips and tricks from other ninjas out there in the world. So as always, keep calm, profile your code, and remember: perf matters.
