Nikhil Mahajan <niks729@yahoo.com> wrote:

> Exception in thread "main" java.lang.OutOfMemoryError

This isn't really a Spark-specific issue. Try increasing the maximum size of your Java heap with "-Xmx1024m"; Java's maximum heap size is fixed at 64 MB by default. I presume you have swap space on top of your RAM...

Cheers,
Navin.
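If you want to confirm what limit your JVM is actually running with, a small sketch like this prints the maximum heap it will use (class name is just illustrative; the default varies by JVM version and machine):

```java
// Print the maximum heap size this JVM will attempt to use.
// Run with e.g.:  java -Xmx1024m MaxHeap
public class MaxHeap {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}
```

Running it with and without the -Xmx flag should make the difference obvious.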