
How can I get an HPROF heap dump?

Are you running an AS Java instance with SAP JVM...

  • Use the ConfigTool to configure the VM parameters listed in the detail section below
  • In SAP MMC right-click on the AS Java server process and select Dump Stack Trace
  • The heap file is written to the working directory of the server, e.g. /usr/sap/<SID>/J<instance>/j2ee/cluster/server<node>/
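
The parameters added via the ConfigTool would typically be the two recommended ones described in the detail section further down, for example:

```
-XX:+HeapDumpOnOutOfMemoryError
-XX:+HeapDumpOnCtrlBreak
```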

...or a plain Java Application?

  • If you are running a SAP JVM, use ~/bin/jvmmon.exe
  • On a Sun or HP VM please see the table below.

All options to get an HPROF dump in detail...

via Java VM parameters (see ConfigTool on how to set those for AS Java installations):

  • -XX:+HeapDumpOnOutOfMemoryError writes heap dump on OutOfMemoryError (recommended)
  • -XX:+HeapDumpOnCtrlBreak writes heap dump together with thread dump on CTRL+BREAK (recommended; use SAP MMC to trigger dumps for AS Java installations)
  • -agentlib:hprof=heap=dump,format=b combines the above two settings (old way; not recommended as the VM frequently dies after CTRL+BREAK with strange errors)
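
For a standalone application, the same flags go directly on the launch command line. This is an illustrative sketch (the dump directory and application name are placeholders; -XX:HeapDumpPath is the standard HotSpot option for redirecting the dump file):

```
java -XX:+HeapDumpOnOutOfMemoryError -XX:+HeapDumpOnCtrlBreak -XX:HeapDumpPath=/tmp/dumps MyApp
```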

via tools:

  • Sun JMap: jmap.exe -dump:format=b,file=HeapDump.hprof <pid>
  • Sun JConsole: Launch jconsole.exe and invoke operation dumpHeap() on HotSpotDiagnostic MBean
  • SAP JVMMon: Launch jvmmon.exe and call menu for dumping the heap
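
The JConsole approach can also be scripted from inside the application via the same HotSpotDiagnostic MBean. This is a minimal sketch using the com.sun.management API of Sun/HotSpot-based VMs; the class and file names are arbitrary:

```java
import com.sun.management.HotSpotDiagnosticMXBean;
import java.io.File;
import java.lang.management.ManagementFactory;

public class HeapDumper {
    // Programmatic equivalent of invoking dumpHeap() on the
    // HotSpotDiagnostic MBean from JConsole.
    public static void dump(String path, boolean liveObjectsOnly) throws Exception {
        HotSpotDiagnosticMXBean bean = ManagementFactory.newPlatformMXBeanProxy(
                ManagementFactory.getPlatformMBeanServer(),
                "com.sun.management:type=HotSpotDiagnostic",
                HotSpotDiagnosticMXBean.class);
        bean.dumpHeap(path, liveObjectsOnly); // throws if the file already exists
    }

    public static void main(String[] args) throws Exception {
        File f = new File("HeapDump.hprof");
        f.delete(); // dumpHeap refuses to overwrite an existing file
        dump(f.getPath(), true);
        System.out.println(f.length() > 0); // prints "true"
    }
}
```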

Heap dump will be written to the working directory.

Vendor / Release | On out of memory     | On Ctrl+Break                | Sun Tools                    | SAP Tool
Sun, HP          | Yes (since 1.5.0_15) | Yes (only Solaris and Linux) |                              |
Any 1.5.0        |                      |                              | Yes (only Solaris and Linux) |

If you run into problems opening a heap dump with the SAP Memory Analyzer (Error message: "Unrecognized heap dump sub-record type"), please let us know. This indicates a corrupt heap dump file. In this case we would like to learn which tool/approach you have chosen to dump the heap.

When are HPROF heap dumps NOT written on OutOfMemoryError?

Heap dumps are not written on OutOfMemoryError for the following reasons:

  • Application creates and throws OutOfMemoryError on its own
  • Another resource like threads per process is exhausted
  • C heap is exhausted
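
The first case can be illustrated with a hypothetical sketch (class name and threshold are invented): an OutOfMemoryError constructed in application code is thrown like any other error, without involving the VM's allocation-failure path, so no .hprof file is produced even with -XX:+HeapDumpOnOutOfMemoryError:

```java
// Hypothetical example: the application throws OutOfMemoryError itself.
// The -XX:+HeapDumpOnOutOfMemoryError handler only reacts to allocation
// failures inside the VM, so this throw writes no heap dump.
public class AppThrownOome {
    static void reserve(long bytesWanted, long bytesAvailable) {
        if (bytesWanted > bytesAvailable) {
            throw new OutOfMemoryError("cannot reserve " + bytesWanted + " bytes");
        }
    }

    public static void main(String[] args) {
        try {
            reserve(1_000_000, 500_000);
        } catch (OutOfMemoryError e) {
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```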

As for the C heap, you can tell that no heap dump will be written when the OutOfMemoryError originates in C code (eArray.cpp in the example below):

# An unexpected error has been detected by SAP Java Virtual Machine:
# java.lang.OutOfMemoryError: requested 2048000 bytes for eArray.cpp:80: GrET*. Out of swap space or heap resource limit exceeded (check with limits or ulimit)?
# Internal Error (\\...\hotspot\src\share\vm\memory\allocation.inline.hpp, 26), pid=6000, tid=468

C heap problems may arise for different reasons, e.g. out-of-swap-space situations, exhausted process limits, or address space limitations (heavy fragmentation, or outright depletion on machines with a limited address space such as 32-bit machines). The hs_err file provides more information on this type of error. A Java heap dump would not be of any help here anyway.

Also please note that a heap dump is written only on the first OutOfMemoryError. If the application chooses to catch it and continues to run, the next OutOfMemoryError will never cause a heap dump to be written! 
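
A sketch of that pitfall (class name is illustrative): if the first OutOfMemoryError is caught and the application keeps running, a later OutOfMemoryError will not produce another .hprof file:

```java
public class FirstDumpOnly {
    // Forces an OutOfMemoryError: no VM can allocate an array of
    // Integer.MAX_VALUE longs, as the request exceeds the VM's limits.
    static boolean triggerOome() {
        try {
            long[] huge = new long[Integer.MAX_VALUE];
            return huge.length == 0; // never reached
        } catch (OutOfMemoryError e) {
            return true;
        }
    }

    public static void main(String[] args) {
        // With -XX:+HeapDumpOnOutOfMemoryError set, only the first
        // OutOfMemoryError could produce a .hprof file; once it has been
        // caught and execution continues, later ones never write a dump.
        System.out.println(triggerOome()); // prints "true"
        System.out.println(triggerOome()); // prints "true", but no second dump
    }
}
```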

How can I get an IBM system dump?

How to obtain an IBM system dump for analysis with Memory Analyzer and the DTFJ Adapter depends on the concrete IBM system/JVM. The following resources provide the necessary descriptions:

  • SAP Note 1259465 - For NetWeaver 6.40 / 7.0x on AIX
  • SAP Note 1265455 - For NetWeaver 6.40 / 7.0x on pLinux (Linux on Power)
  • SAP Note 1263258 - For NetWeaver 6.40 / 7.0x on xLinux (IBM VM for Linux on x86_64) 
  • SAP Note 1267126 - For NetWeaver 6.40 / 7.0x on IBM i
  • SAP Note 1336952 - For NetWeaver 6.40 / 7.0x on zLinux (Linux on System z)
  • The DTFJ Adapter Help available under the "Help" menu in Memory Analyzer
  • IBM's Diagnosis documentation


  1. Unknown User (100um8jeh)

    java.lang.NullPointerException when opening heap dump file

    I get the above error when opening a heap dump.
    The heap dump file (400 MB) was created with -XX:+HeapDumpOnOutOfMemoryError on the following system:

    • Java 1.5.0_14-b03 (Java HotSpot(TM) 64-Bit Server VM)
    • Linux 2.6.18-6-amd64
    • Tomcat 5.5.x

    Error Message

    Stack Trace

    An internal error occurred during: "Parsing heap dump from '...java_pid10848.hprof'".

    	at org.eclipse.mat.hprof.HprofParserHandlerImpl.beforePass2(
    	at org.eclipse.mat.hprof.HprofIndexBuilder.fill(
    	at org.eclipse.mat.parser.internal.SnapshotFactoryImpl.parse(
    	at org.eclipse.mat.parser.internal.SnapshotFactoryImpl.openSnapshot(
    	at org.eclipse.mat.snapshot.SnapshotFactory.openSnapshot(

    Is the file corrupt, or is it a problem with the Memory Analyzer?


  2. Hi Martin,

    I have created Bugzilla Bug 237030 to track your bug report. If you want to get email notifications about changes, add yourself to the CC list.

    Kind regards,


  3. Unknown User (100um8jeh)

    I just did
    THANK YOU !!

  4. Unknown User (r311ezf)


    Does this work for IBM AIX 5.3 boxes? Our OS is AIX 5.3, the DB is Oracle 10g, and SAP is NW 7.0. Can we use Memory Analyzer?


  5. Unknown User (100s0u068)


    When the system is generating the heap dump file, is it possible to write it to a different folder rather than the standard one: /usr/sap/<SID>/J<instance>/j2ee/cluster/server<node>/?

     Best regards,

     Mouro Vaz

  6. Unknown User (100s0u068)


     Yes, it's possible (OSS Note 1004255)


    Mouro Vaz

  7. Hi,
    can I add -XX:+HeapDumpOnCtrlBreak "feature"  to the jmap command?