Spark on YARN - Cannot allocate containers as requested resource is greater than maximum allowed allocation

Error:

YARN application has exited unexpectedly with state FAILED! Check the YARN application logs for more details. 2021-10-12 15:15:30,201 Diagnostics message: Uncaught exception: org.apache.hadoop.yarn.exceptions.InvalidResourceRequestException: Invalid resource request! Cannot allocate containers as requested resource is greater than maximum allowed allocation. Requested resource type=[vcores], Requested resource=<memory:7296, vCores:7>, maximum allowed allocation=<memory:12288, vCores:4>, please note that maximum allowed allocation is calculated by scheduler based on maximum resource of registered NodeManagers, which might be less than configured maximum allocation=<memory:12288, vCores:128>

Error description:

I hit this issue when running the same spark-submit in an AWS environment; the identical command worked fine on-premises.
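As the diagnostics message notes, the "maximum allowed allocation" is derived from the resources the registered NodeManagers advertise, capped by the scheduler-level limits. These are the yarn-site.xml properties involved (a sketch; the values below are illustrative, not read from the failing cluster):

```xml
<!-- yarn-site.xml: properties that bound container allocations.
     Values are illustrative examples only. -->
<property>
  <!-- CPU cores each NodeManager advertises to the scheduler -->
  <name>yarn.nodemanager.resource.cpu-vcores</name>
  <value>4</value>
</property>
<property>
  <!-- Memory (MB) each NodeManager advertises -->
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>12288</value>
</property>
<property>
  <!-- Scheduler-level cap on vcores per container request -->
  <name>yarn.scheduler.maximum-allocation-vcores</name>
  <value>128</value>
</property>
<property>
  <!-- Scheduler-level cap on memory (MB) per container request -->
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>12288</value>
</property>
```

The effective maximum is the smaller of the scheduler cap and the largest registered NodeManager, which is why the message reports 4 vcores even though the configured maximum is 128.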

What I have tried:

I removed the resource configuration from the spark-submit command.

Sample spark-submit:

spark-submit --packages com.databricks:spark-csv_2.11:1.2.0,com.typesafe:config:1.3.2 --name xyz --class mainlayer.SparkSessionTest "/opt/code/ReportGenerations-1.0-SNAPSHOT.jar" "/opt/code/ReportGeneration.properties" "/opt/code/log4j.properties"

1 answer

  • answered 2021-10-12 18:54 Gabip

    The error states that the amount of resources you are trying to allocate for your executors is larger than what the nodes in your YARN cluster can provide. In your specific case, each node in the cluster is configured with 4 cores, while you are trying to allocate executors with 7 cores each.
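    The scheduler's feasibility check is simple: every dimension of the request must fit within the maximum allowed allocation. A minimal sketch of that comparison, using the numbers from the error message:

    ```python
    # Maximum allocation the scheduler computed from the registered NodeManagers
    node_max = {"memory_mb": 12288, "vcores": 4}

    # Per-executor container request from the failing spark-submit
    request = {"memory_mb": 7296, "vcores": 7}

    # A request is satisfiable only if it fits on every dimension
    fits = all(request[k] <= node_max[k] for k in request)
    print(fits)  # False: 7 requested vcores exceed the 4-vcore node maximum
    ```

    The memory dimension (7296 MB vs 12288 MB) is fine; only the vcores dimension fails, which matches `Requested resource type=[vcores]` in the diagnostics.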

    To solve it, reduce the number of cores per executor using the following flag on the spark-submit command:

    --executor-cores 2
    

    So the spark-submit will be:

    spark-submit --executor-cores 2 --packages com.databricks:spark-csv_2.11:1.2.0,com.typesafe:config:1.3.2 --name xyz --class mainlayer.SparkSessionTest "/opt/code/ReportGenerations-1.0-SNAPSHOT.jar" "/opt/code/ReportGeneration.properties" "/opt/code/log4j.properties"
    
    

    Please note that 2 is just an example; given your nodes' capacity, the maximum number of cores you can allocate per executor is 4.
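    The same limit can also be expressed as a Spark property rather than the dedicated flag, either with --conf on the command line or cluster-wide in spark-defaults.conf (a sketch; 2 remains an illustrative value, and requires a live cluster to actually run):

    ```
    # Equivalent to --executor-cores 2, expressed as a Spark property
    spark-submit --conf spark.executor.cores=2 ...

    # Or set it for all jobs in $SPARK_HOME/conf/spark-defaults.conf:
    # spark.executor.cores  2
    ```

    Command-line flags take precedence over spark-defaults.conf, so a per-job flag can still override the cluster-wide default.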
