Performance of SAP Memory Analyzer

Developers are addicted to numbers, so let's look into some. The performance numbers listed here were retrieved for the following heap dump:

Productive heap dump (3.0 GB)
50,000,000 objects
150,000,000 references

The actual size doesn't matter much, as only the number of objects and object references determines the complexity. Still, as a rule of thumb, you can expect to parse heap dumps of up to 2.0 - 2.5 GB on a 32-bit box with a 1.5 GB Java heap for the SAP Memory Analyzer. Inspecting a heap dump after it has been parsed is usually possible on any decent 32-bit box.
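If parsing a large dump fails with an OutOfMemoryError, the analyzer itself can be given a bigger Java heap. A sketch, assuming an Eclipse-based launcher whose .ini file accepts standard JVM options after -vmargs (the exact file name and location depend on your installation):

```
-vmargs
-Xmx1500m
```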

Here are some performance numbers for the most critical operations applied to this heap dump, measured on a 64-bit 4 x 2.67 GHz machine with 4 GB of main memory:

Operation                                    Time
First Opening                                13 mins
Any Later Opening                            Instantly
Biggest Distinct Objects                     Instantly
Retained Size for Single Object              Instantly
Minimum Retained Size for Set of Objects     1 msec
Precise Retained Size for Set of Objects     6 secs
Single Object Keeping another Object Alive   Instantly

Note: The most important operations are implemented with a time complexity of O(1). Where we implemented O(N) algorithms, we made them scale linearly across cores whenever possible, improving performance on multi-core systems. Biggest Distinct Objects doesn't list nested objects separately within a bigger enclosing object; it accounts them to, and lists them below, the enclosing object. The Minimum Retained Size almost always matches the precise retained size, or comes very close to it - the deviation depends only on the number of objects in the inspected set, NOT on the number of objects in the heap dump. The 6 seconds above is the average time for computing the object sets of all 60,000 classes in that heap dump.
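To illustrate the last row of the table - a single object keeping another object alive - here is a minimal reachability-based check over a made-up object graph. The tool answers this instantly from precomputed indexes; this sketch only states the underlying definition: K keeps T alive if T is reachable from the GC roots, but no longer reachable once K is removed.

```java
import java.util.*;

// Toy check for "does object K single-handedly keep object T alive?"
// The object graph below (names and references) is invented for the example.
public class KeepsAliveSketch {

    static final Map<String, List<String>> refs = Map.of(
        "root", List.of("cache", "listener"),
        "cache", List.of("bigBuffer"),
        "listener", List.of());

    // Is 'target' reachable from 'from', pretending 'removed' does not exist?
    static boolean reachable(String from, String target, String removed) {
        Deque<String> stack = new ArrayDeque<>(List.of(from));
        Set<String> seen = new HashSet<>();
        while (!stack.isEmpty()) {
            String o = stack.pop();
            if (o.equals(removed) || !seen.add(o)) continue;
            if (o.equals(target)) return true;
            stack.addAll(refs.getOrDefault(o, List.of()));
        }
        return false;
    }

    static boolean keepsAlive(String k, String t) {
        return reachable("root", t, null) && !reachable("root", t, k);
    }

    public static void main(String[] args) {
        System.out.println(keepsAlive("cache", "bigBuffer"));    // true
        System.out.println(keepsAlive("listener", "bigBuffer")); // false
    }
}
```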

If you are new to these tools, you may not know whether this is good or bad. Computing the Precise Retained Size, for example, is 8x faster than in the best commercial tool we know of. However, only a few tools offer this functionality at all. Most give you just the plain shallow size, which is the memory used by the inspected objects themselves, but not the memory the GC would free if those objects were no longer referenced (e.g. the shallow size of a String may be much smaller than its retained size, which includes the char[] stored inside the String). And why bother computing the Precise Retained Size if the Minimum Retained Size is precise enough for you? Furthermore, we know of no other tool which offers the Retained Size for single objects instantly, or which even helps in finding the single object responsible for keeping the inspected object in the heap. Usually all you can do is heap walking (following references from objects back to their referrers), which is useless if you have a couple of million objects, let's say Strings, to check.
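The shallow vs. retained distinction above can be stated in a few lines of code. This is a toy sketch of the definition - not the tool's actual implementation or its O(1) indexes: the retained size of X is the total shallow size of all objects reachable from the GC roots only through X, i.e. everything the GC could free if X went away. The object names and byte sizes below are made up for the example.

```java
import java.util.*;

// Toy illustration of shallow vs. retained size.
public class RetainedSizeSketch {

    static final Map<String, List<String>> refs = new HashMap<>();
    static final Map<String, Integer> shallow = new HashMap<>();

    static {
        // A String referencing its internal char[]; only the String points to it.
        refs.put("root", List.of("myString", "otherObj"));
        refs.put("myString", List.of("charArray"));
        shallow.put("root", 16);
        shallow.put("myString", 24);  // shallow size: header + fields only
        shallow.put("charArray", 80); // the actual character data
        shallow.put("otherObj", 32);
    }

    // All objects reachable from the roots, pretending 'excluded' does not exist.
    static Set<String> reachable(Collection<String> roots, String excluded) {
        Set<String> seen = new HashSet<>();
        Deque<String> stack = new ArrayDeque<>(roots);
        while (!stack.isEmpty()) {
            String o = stack.pop();
            if (o.equals(excluded) || !seen.add(o)) continue;
            stack.addAll(refs.getOrDefault(o, List.of()));
        }
        return seen;
    }

    // Retained size of x = shallow sizes of everything only reachable via x.
    static int retainedSize(List<String> roots, String x) {
        Set<String> all = reachable(roots, null);
        Set<String> withoutX = reachable(roots, x);
        int size = 0;
        for (String o : all)
            if (!withoutX.contains(o)) size += shallow.get(o);
        return size;
    }

    public static void main(String[] args) {
        // Retained size of the String: its own 24 bytes plus the 80-byte char[].
        System.out.println(retainedSize(List.of("root"), "myString")); // 104
        // Shallow size of the String alone is just 24.
        System.out.println(shallow.get("myString"));                   // 24
    }
}
```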

That said, the performance of this tool matters only to a certain degree. It is important, we enjoyed tuning it, and the tool feels 100x faster than all the other tools we have worked with, but it is the analysis features which really make it so useful in the end and which save you from needless work.
