This suggests letting the program run until it crashes (due to a memory error), which would produce a crash report.
I don't think that's true - you won't get a dump file when an OutOfMemoryError occurs, at least not by default (I would bet the author is confusing this with a JVM crash, which does produce a core dump).
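For completeness, the HotSpot JVM can be told to write an hprof dump automatically when an OutOfMemoryError is thrown. A sketch of the startup flags (`app.jar` is a placeholder for your application):

```shell
# Real HotSpot flags; app.jar and /tmp/dumps are placeholders.
java -XX:+HeapDumpOnOutOfMemoryError \
     -XX:HeapDumpPath=/tmp/dumps \
     -jar app.jar
```

The resulting file can then be analyzed with the same tools discussed below.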
The best procedure is to capture a heap dump using jmap; this writes the heap contents to a binary file (commonly known as an hprof file). That file can then be analyzed with any number of tools:
- jhat - A Sun tool that analyzes an HPROF file and launches an embedded web server, so you can browse the heap and view reports through a web browser. I found it very slow for large heaps.
- VisualVM - An excellent debugging/troubleshooting tool bundled with the JDK. Among other things, it can take a heap dump of any running process, as well as thread dumps. I found it very slow at loading large hprof files, though.
- Eclipse Memory Analyzer (MAT) - An Eclipse plugin (also available standalone) that loads and analyzes hprof files.
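The jmap capture step above is typically `jmap -dump:format=b,file=heap.hprof <pid>`. If you would rather trigger the dump from inside the process, the JDK's HotSpotDiagnosticMXBean exposes the same mechanism; a minimal sketch (the class name `HeapDumper` is my own):

```java
import java.lang.management.ManagementFactory;
import javax.management.MBeanServer;
import com.sun.management.HotSpotDiagnosticMXBean;

// Minimal sketch: programmatically write an hprof heap dump using the
// JDK's HotSpotDiagnosticMXBean (the same mechanism jmap uses).
public class HeapDumper {
    public static void dumpHeap(String filePath, boolean liveObjectsOnly) throws Exception {
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();
        HotSpotDiagnosticMXBean mxBean = ManagementFactory.newPlatformMXBeanProxy(
                server, "com.sun.management:type=HotSpotDiagnostic",
                HotSpotDiagnosticMXBean.class);
        // Fails with an IOException if filePath already exists;
        // recent JDKs require the file name to end in .hprof.
        mxBean.dumpHeap(filePath, liveObjectsOnly);
    }

    public static void main(String[] args) throws Exception {
        // live=true dumps only reachable objects (forces a GC first)
        dumpHeap("heap.hprof", true);
        System.out.println("Wrote heap.hprof");
    }
}
```

The file it produces can be opened in any of the analyzers listed above.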
I highly recommend the Eclipse plugin, since it loads large (500MB+) heap dumps very quickly (within a minute or so), produces useful reports, supports a query language (OQL) for complex analysis, etc.