Getting the error "java.lang.OutOfMemoryError: Java heap space" while running a simple MapReduce job


I have been trying to run a simple MapReduce wordcount job on RHEL 6, but I consistently get this error. Please help.

13/01/13 19:59:01 INFO mapred.MapTask: io.sort.mb = 100
13/01/13 19:59:01 WARN mapred.LocalJobRunner: job_local_0001
java.lang.OutOfMemoryError: Java heap space
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.<init>(MapTask.java:949)
    at org.apache.hadoop.mapred.MapTask$NewOutputCollector.<init>(MapTask.java:674)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:756)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:212)
13/01/13 19:59:02 INFO mapred.JobClient:  map 0% reduce 0%
13/01/13 19:59:02 INFO mapred.JobClient: Job complete: job_local_0001
13/01/13 19:59:02 INFO mapred.JobClient: Counters: 0


You probably need to increase some JVM settings: the maximum heap size, and possibly the maximum PermGen space.
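Since your stack trace shows `LocalJobRunner`, the whole job runs inside the client JVM, so that is the JVM whose heap needs raising. Also note `io.sort.mb = 100` in your log: the map output buffer alone wants ~100 MB, and the failure is in `MapOutputBuffer.<init>`, which is exactly what happens when the heap default is smaller than that buffer. A sketch of both fixes (the 1024m values are just examples; the jar/paths are placeholders for your own job):

```shell
# Local mode (LocalJobRunner): the job runs in the client JVM, so raise
# its heap, either in hadoop-env.sh or exported before launching:
export HADOOP_CLIENT_OPTS="-Xmx1024m"

# On a (pseudo-)distributed cluster, give each map/reduce child JVM more
# heap instead, e.g. via the generic -D option (works for drivers that
# use ToolRunner, like the bundled wordcount example):
hadoop jar hadoop-examples.jar wordcount \
    -D mapred.child.java.opts=-Xmx1024m \
    input output
```

Alternatively, you can shrink `io.sort.mb` (e.g. to 50) instead of growing the heap, at the cost of more spills to disk during the sort phase.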

I'd recommend attaching VisualVM while your Hadoop job is running so you can get some visibility into heap usage and see what's going on.

Are you running multiple servers? Maybe you're asking a single server to do too much.