‘spark-shell’ is not recognized as an internal or external command, operable program or batch file

  Tags: apache-spark, installation, java, pyspark, windows

I keep getting this error while installing Spark on Windows 10:
"'spark-shell' is not recognized as an internal or external command, operable program or batch file."

I checked several previous questions and tried everything, but I still have the same issue.
I then tried installing both the Java JRE and the JDK (I am not sure if there is any difference), but the problem persists.

Here are the paths I am using:
For HADOOP_HOME my directory is C:\Big_Data\hadoop
For SPARK_HOME my directory is C:\Big_Data\spark\spark-3.0.1-bin-hadoop2.7
For JAVA_HOME my directory is C:\Java
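
For reference, setting these from a Command Prompt would look roughly like the sketch below (assuming the directories above are correct; the same values can also be entered through the System Properties > Environment Variables dialog):

    rem Sketch: point the *_HOME variables at the install directories listed above
    setx HADOOP_HOME "C:\Big_Data\hadoop"
    setx SPARK_HOME "C:\Big_Data\spark\spark-3.0.1-bin-hadoop2.7"
    setx JAVA_HOME "C:\Java"

    rem spark-shell itself lives in %SPARK_HOME%\bin, so that folder (and
    rem %JAVA_HOME%\bin) must also be added to the user PATH variable.

Note that setx only affects newly opened Command Prompt windows, not the one it is run from.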

My user PATH variable: [screenshot of the PATH entries]
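
To narrow down where it goes wrong, a quick check from a freshly opened Command Prompt (a new window, since environment changes are not picked up by windows that were already open) would be something like:

    rem Confirm the variables are visible in this session
    echo %SPARK_HOME%
    echo %JAVA_HOME%

    rem Confirm Windows can actually find the executables on PATH
    where spark-shell
    where java

    rem If "where" fails, try launching the script directly as a sanity check
    "%SPARK_HOME%\bin\spark-shell.cmd"

If the direct call works but plain spark-shell does not, the missing piece is the %SPARK_HOME%\bin entry on PATH.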

