
Spark fails to start: how to fix the "insufficient JVM memory" error


Spark


http://spark.apache.org/

http://spark.apache.org/docs/latest/

http://spark.apache.org/docs/2.2.0/quick-start.html

http://spark.apache.org/docs/2.2.0/spark-standalone.html

http://spark.apache.org/docs/2.2.0/running-on-yarn.html

http://spark.apache.org/docs/2.2.0/configuration.html

http://spark.apache.org/docs/2.2.0/structured-streaming-programming-guide.html

http://spark.apache.org/docs/2.2.0/streaming-programming-guide.html

Starting with 2.2.1, the Spark Streaming guide has changed:

http://spark.apache.org/docs/2.2.1/streaming-programming-guide.html

Error message:

System memory 259522560 must be at least 4.718592E8. Please use a larger heap size.

While developing a Spark project in Eclipse and trying to run the program directly against Spark, I hit the following error:


ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: System memory 468189184 must be at least 4.718592E8. Please use a larger heap size.


Clearly, the memory available to the JVM is too small, so the SparkContext cannot start. But how do we configure it?


Looking at the Spark source code, the relevant method is written like this:

/**
 * Return the total amount of memory shared between execution and storage, in bytes.
 */
private def getMaxMemory(conf: SparkConf): Long = {
  val systemMemory = conf.getLong("spark.testing.memory", Runtime.getRuntime.maxMemory)
  val reservedMemory = conf.getLong("spark.testing.reservedMemory",
    if (conf.contains("spark.testing")) 0 else RESERVED_SYSTEM_MEMORY_BYTES)
  val minSystemMemory = reservedMemory * 1.5
  if (systemMemory < minSystemMemory) {
    throw new IllegalArgumentException(s"System memory $systemMemory must " +
      s"be at least $minSystemMemory. Please use a larger heap size.")
  }
  val usableMemory = systemMemory - reservedMemory
  val memoryFraction = conf.getDouble("spark.memory.fraction", 0.75)
  (usableMemory * memoryFraction).toLong
}

  

So the key line is: val systemMemory = conf.getLong("spark.testing.memory", Runtime.getRuntime.maxMemory). If spark.testing.memory is not set, Spark falls back to the JVM's maximum heap size.

The signature and documentation of conf.getLong() are:

  getLong(key: String, defaultValue: Long): Long
  Get a parameter as a long, falling back to a default if not set
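
Where does the 4.718592E8 in the error come from? When spark.testing.memory is unset, systemMemory falls back to Runtime.getRuntime.maxMemory (the JVM heap), and reservedMemory falls back to RESERVED_SYSTEM_MEMORY_BYTES, which is 300 MB in the Spark source. The minimum is therefore 300 MB * 1.5 = 471859200 bytes, roughly 450 MB (it prints as 4.718592E8 because minSystemMemory is a Double). A quick sketch of the arithmetic (the object name is just for illustration):

// Reproduces the threshold shown in the error message.
// Assumes RESERVED_SYSTEM_MEMORY_BYTES = 300 MB, as in Spark's memory manager.
object MinHeapThreshold {
  def main(args: Array[String]): Unit = {
    val reservedSystemMemoryBytes = 300L * 1024 * 1024             // 314572800
    val minSystemMemory = (reservedSystemMemoryBytes * 1.5).toLong // 471859200 = 4.718592E8
    println(s"Spark needs a heap of at least $minSystemMemory bytes (~450 MB)")
  }
}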

So we should set spark.testing.memory in the SparkConf.

After some experimentation, there are two places where it can be set:

1. In your own source code, add the following right after creating the conf (a complete runnable sketch follows after option 2):

    val conf = new SparkConf().setAppName("word count")
    conf.set("spark.testing.memory", "2147480000") // anything larger than 512 MB is enough (the real minimum is 471859200 bytes, about 450 MB)

2. In Eclipse's Run Configuration, on the Arguments tab there is a VM arguments box; add the following line there (again, any value larger than 512 MB will do):

-Dspark.testing.memory=1073741824

Other parameters can also be set dynamically in the same place, for example -Dspark.master=spark://hostname:7077.
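
Putting option 1 together, here is a minimal runnable sketch of a driver program that sets spark.testing.memory before creating the SparkContext (the app name, master URL, and input path are placeholders, not from the original post):

import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("word count")
      .setMaster("local[*]")                     // placeholder; point this at your own master
      .set("spark.testing.memory", "2147480000") // ~2 GB, comfortably above the 471859200-byte minimum

    val sc = new SparkContext(conf)
    val counts = sc.textFile("input.txt")        // placeholder input path
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
    counts.take(10).foreach(println)
    sc.stop()
  }
}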

Run the program again and this error no longer appears.

Another way to fix it is to enlarge the JVM heap itself:

Window -> Preferences -> Java -> Installed JREs -> select a JRE and click Edit,
then add the following to Default VM arguments: -Xmx512M
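
To confirm the setting reached the JVM, you can print the heap size Spark will see. Note that Runtime.getRuntime.maxMemory typically reports a little less than the -Xmx value, so -Xmx512M lands only slightly above the 471859200-byte minimum. A small probe (the object name is just for illustration):

object HeapProbe {
  def main(args: Array[String]): Unit = {
    val maxHeap = Runtime.getRuntime.maxMemory // what Spark uses as systemMemory by default
    println(s"Max heap visible to the JVM: $maxHeap bytes")
    println(s"Above Spark's 471859200-byte minimum? ${maxHeap >= 471859200L}")
  }
}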


OK!!!