Spark installation and configuration: "Exception: Python in worker has different version 2.7 than that in driver 3.7"

Running a .py example file:

spark-submit --master spark://node1:7077 /home/shexiaobin/spark/examples/src/main/python/pi.py 100

The following error appears:
Exception: Python in worker has different version 2.7 than that in driver 3.7, PySpark cannot run with different minor versions. Please check environment variables PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON are correctly set.
    at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.handlePythonException(PythonRunner.scala:330)
    at org.apache.spark.api.python.PythonRunner$$anon$1.read(PythonRunner.scala:470)
    at org.apache.spark.api.python.PythonRunner$$anon$1.read(PythonRunner.scala:453)
    at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.hasNext(PythonRunner.scala:284)
    at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
    at scala.collection.Iterator$GroupedIterator.fill(Iterator.scala:1126)
    at scala.collection.Iterator$GroupedIterator.hasNext(Iterator.scala:1132)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
    at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:125)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
    at org.apache.spark.scheduler.Task.run(Task.scala:109)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
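The error means the executors launched their tasks with the system's Python 2.7 while the driver was running Python 3.7; PySpark refuses to serialize data between mismatched minor versions. Conceptually, PySpark compares the two version strings and raises when they differ. A simplified sketch of that check (not the actual Spark source, just an illustration):

```python
def check_worker_version(worker_version: str, driver_version: str) -> None:
    """Raise if worker and driver Python minor versions differ,
    mirroring the check PySpark performs at task startup (simplified)."""
    if worker_version != driver_version:
        raise Exception(
            "Python in worker has different version %s than that in driver %s, "
            "PySpark cannot run with different minor versions."
            % (worker_version, driver_version)
        )


# The worker reports "2.7" while the driver runs "3.7":
# check_worker_version("2.7", "3.7")  # raises Exception
check_worker_version("3.7", "3.7")    # same versions: no error
```

The fix is therefore to point both sides at the same interpreter, which is what the PYSPARK_PYTHON setting below does.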

Solution: set PYSPARK_PYTHON

cd /home/shexiaobin/spark/conf
vi spark-env.sh
# add the following line
export PYSPARK_PYTHON=/home/shexiaobin/anaconda3/bin/python
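Since each node reads its own conf/spark-env.sh, this file needs to hold the same setting on every worker as well. Optionally you can also pin the driver side with PYSPARK_DRIVER_PYTHON. A quick way to sanity-check the setting before resubmitting (the anaconda3 path is the one used in this article; adjust it to your install):

```shell
# Point both worker and driver at the same Python 3 interpreter
export PYSPARK_PYTHON=/home/shexiaobin/anaconda3/bin/python
export PYSPARK_DRIVER_PYTHON=/home/shexiaobin/anaconda3/bin/python

# Both should report the same Python 3.x version
"$PYSPARK_PYTHON" --version
"$PYSPARK_DRIVER_PYTHON" --version
```

After restarting the cluster (so workers pick up the new spark-env.sh), the pi.py job above should run without the version mismatch.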
