Spark Installation

timlong 2018-11-11 20:55

Configuration files:

spark-env.sh

export SCALA_HOME=/usr/local/scala                 # Scala installation directory
export SPARK_WORKER_MEMORY=1g                      # total memory a worker may allocate to executors
export SPARK_MASTER_IP=mac                         # hostname/IP of the Spark master
export HADOOP_HOME=/tim/hadoop
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop     # lets Spark find the Hadoop/YARN configuration
export SPARK_DIST_CLASSPATH=$(hadoop classpath)    # put the Hadoop jars on Spark's classpath
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native     # Hadoop native libraries
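spark-env.sh is normally created from the template that ships with Spark and then copied to every worker node. A minimal sketch of that step, assuming Spark is installed under /usr/local/spark on all nodes (this path is an assumption, not from the original post):

cd /usr/local/spark                                 # assumed SPARK_HOME
cp conf/spark-env.sh.template conf/spark-env.sh     # then add the exports above
scp conf/spark-env.sh node1:/usr/local/spark/conf/
scp conf/spark-env.sh node2:/usr/local/spark/conf/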

slaves (conf/slaves):

node1

node2
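With conf/slaves listing node1 and node2, the standalone cluster can be started from the master node and checked with jps. A minimal sketch, again assuming /usr/local/spark as the install path:

/usr/local/spark/sbin/start-all.sh   # starts the Master here and a Worker on each host in conf/slaves
jps                                  # should show a Master process on this node
ssh node1 jps                        # should show a Worker process on node1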

spark-defaults.conf

spark.yarn.jars   hdfs://rhel01/spark/jars/*
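spark.yarn.jars points at a copy of Spark's jars in HDFS so that YARN containers do not re-upload them on every submission. The jars have to be uploaded once, after which a job can be submitted against YARN. A minimal sketch, assuming /usr/local/spark as SPARK_HOME (rhel01 is the HDFS address already used in the config above):

hdfs dfs -mkdir -p /spark/jars
hdfs dfs -put /usr/local/spark/jars/* hdfs://rhel01/spark/jars/
/usr/local/spark/bin/spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class org.apache.spark.examples.SparkPi \
  /usr/local/spark/examples/jars/spark-examples_*.jar 100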
