Cannot run program "python3"

Problem description

I installed Spark on my Windows 10 (64-bit) machine and set environment variables for SPARK_HOME, HADOOP_HOME, and JAVA_HOME, but when I try to run a Python program like this:

    spark-submit SparkRuming.py

I have tried pointing my environment variables at every location I could think of, but I'm still getting this:

    Exception in thread "main" java.io.IOException: Cannot run program "python3": CreateProcess error=2, The system cannot find the file specified
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
        at org.apache.spark.deploy.PythonRunner$.main(PythonRunner.scala:97)
        at org.apache.spark.deploy.PythonRunner.main(PythonRunner.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:951)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1030)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1039)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.io.IOException: CreateProcess error=2, The system cannot find the file specified
        at java.lang.ProcessImpl.create(Native Method)
        at java.lang.ProcessImpl.<init>(ProcessImpl.java:444)
        at java.lang.ProcessImpl.start(ProcessImpl.java:140)
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
        ... 14 more

I'm sure I'm either missing some environment variable or pointing one at the wrong place. Has anyone run into the same problem?

Tags: python, pyspark

Solution
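
The stack trace shows that spark-submit's PythonRunner is trying to launch an executable literally named python3, which a standard Windows Python installation does not provide (the interpreter is python.exe). A common fix is to tell Spark explicitly which interpreter to use via the PYSPARK_PYTHON environment variable before invoking spark-submit. The sketch below assumes a hypothetical install path of C:\Python39\python.exe; substitute your own.

    :: Minimal sketch for Windows cmd, assuming Python lives at C:\Python39\python.exe
    :: (an assumed path - check yours with "where python").
    set PYSPARK_PYTHON=C:\Python39\python.exe
    set PYSPARK_DRIVER_PYTHON=C:\Python39\python.exe
    spark-submit SparkRuming.py

The same setting can also be passed per job with --conf spark.pyspark.python=C:\Python39\python.exe, or made permanent through the Windows environment-variables dialog so that every new terminal picks it up.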

