Getting the values for the spark-submit parameters: num-executors, executor-cores and executor-memory

Problem description

I have read a lot of questions on SO about this topic, and I put together a crude bash script to quickly obtain these values.

The main sources I used to create the script are:

The script contains the following:

# Fixed values
CORES_PER_EXECUTOR=5     # (for good HDFS throughput) --executor-cores
HADOOP_DAEMONS_CORE=1    # cores per node reserved for the OS/Hadoop daemons
HADOOP_DAEMONS_RAM=1     # GB per node reserved for the OS/Hadoop daemons


# Example values
# These can be read off each worker node with `lscpu` and `free`
total_nodes_in_cluster=10    # number of worker nodes in the cluster
total_cores_per_node=16      # physical cores: `Core(s) per socket:` x `Socket(s):` from lscpu
total_ram_per_node=64        # GB, `total` column of `free -g`
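
# A sketch of how the values above could be read on a worker node (assumes the
# standard util-linux `lscpu` and procps `free` tools; adjust to your distro):
#   physical cores: lscpu | awk -F: '/^Core\(s\) per socket/ {c=$2} /^Socket\(s\)/ {s=$2} END {print c*s}'
#   RAM in GB:      free -g | awk '/^Mem:/ {print $2}'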

available_cores_per_node=$((total_cores_per_node - HADOOP_DAEMONS_CORE))

available_cores_in_cluster=$((available_cores_per_node * total_nodes_in_cluster))

available_executors=$((available_cores_in_cluster / CORES_PER_EXECUTOR))

num_of_executors=$((available_executors - 1))    # leave 1 executor for the YARN ApplicationMaster

num_of_executors_per_node=$((available_executors / total_nodes_in_cluster))

mem_per_executor=$(((total_ram_per_node - HADOOP_DAEMONS_RAM) / num_of_executors_per_node))    # GB, after reserving RAM for the daemons

# Subtract the off-heap (memoryOverhead) allocation, about 7% of mem_per_executor
# TODO the example I followed says "off-heap overhead = 7% of 21GB = 3GB" -- ???
seven_percent=$((mem_per_executor * 7 / 100))
executor_memory=$((mem_per_executor - seven_percent))
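
# Note on the 7% figure (an assumption, not part of the original script): the
# Cloudera tuning guide this heuristic comes from uses
# `spark.yarn.executor.memoryOverhead` = max(384 MB, 7% of executor memory);
# recent Spark releases call it `spark.executor.memoryOverhead` and default to
# max(384 MB, 10% of executor memory), so the factor above may need adjusting.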

echo -e "The command will contain:\n spark-submit --class <CLASS_NAME> --num-executors ${num_of_executors} --executor-cores ${CORES_PER_EXECUTOR} --executor-memory ${executor_memory}G ...."
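
For the example values above (10 nodes, 16 cores and 64 GB of RAM per node), the arithmetic works out to 15 usable cores per node, 150 in the cluster, 30 possible executors (29 after reserving one for the YARN ApplicationMaster), 3 executors per node, and (64 - 1) / 3 = 21 GB each, from which bash's integer math subtracts 1 GB of overhead. So running the script should print something like:

The command will contain:
 spark-submit --class <CLASS_NAME> --num-executors 29 --executor-cores 5 --executor-memory 20G ....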

I would like to know:

Thanks!

Tags: apache-spark

Solution

