Execution of Scala code through the command line using spark-submit

Problem description

I have created a Scala project using IntelliJ IDEA and built a JAR file. I want to execute the code with spark-submit. I can successfully execute code that takes no arguments, but when I try to pass arguments through the command line it throws an error. My command is:

spark-submit --class BATalgo1[-10,10,100,5,4] \file:///out/artifacts/Spark_Stack_jar/Spark_Stack.

The path is correct, but it throws the following error:

java.lang.ClassNotFoundException: BATalgo1[-10,10,100,5,4]
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:238)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:810)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

My code is:

object BATalgo1 {
 def main(args: Array[String]): Unit = {
  val t1 =System.nanoTime()
  var MinVal = args(0).toInt //scala.io.StdIn.readDouble();
  var MaxVal = args(1).toInt //scala.io.StdIn.readDouble();
  var MaxIt = args(2).toInt //scala.io.StdIn.readDouble();
  val N = args(3).toInt //scala.io.StdIn.readInt();
  val d = args(4).toInt //scala.io.StdIn.readInt()
  println("arguments are= ", MinVal,MaxVal , MaxIt ,N ,d )
 }
}

Tags: scala, command-line, spark-submit

Solution


Try:

spark-submit --class BATalgo1 <your-executable-jar-file> -10 10 100 5 4 

If the main class BATalgo1 is declared inside a package, e.g. com.mypackage, replace BATalgo1 with the fully qualified name com.mypackage.BATalgo1.

Command-line arguments are passed at the end of the spark-submit command, after the application JAR. Appending them to the class name, as in --class BATalgo1[-10,10,100,5,4], makes spark-submit look for a class literally named BATalgo1[-10,10,100,5,4], which is why you get the ClassNotFoundException.
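With the arguments placed after the JAR, they arrive in args exactly as typed. As a minimal sketch, here is your object with the parsing factored into a helper (parseArgs is an illustrative name, not part of your original code) and a fail-fast check on the argument count:

```scala
object BATalgo1 {
  // Parse the five expected integer arguments, failing fast on a wrong count.
  def parseArgs(args: Array[String]): (Int, Int, Int, Int, Int) = {
    require(args.length == 5, s"expected 5 arguments, got ${args.length}")
    val ints = args.map(_.toInt)
    (ints(0), ints(1), ints(2), ints(3), ints(4))
  }

  def main(args: Array[String]): Unit = {
    val (minVal, maxVal, maxIt, n, d) = parseArgs(args)
    // String interpolation prints the values themselves, whereas the
    // original println("arguments are= ", ...) prints them as one tuple.
    println(s"arguments are = $minVal, $maxVal, $maxIt, $n, $d")
  }
}
```

Running spark-submit --class BATalgo1 <your-jar> -10 10 100 5 4 would then bind -10 to minVal, 10 to maxVal, and so on.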

Please refer to the documentation for further information.

