Hive support on Windows not working using winutils

  apache-spark, apache-spark-sql, hive, scala, windows

I am facing the below error when accessing Hive from local Windows using winutils. I followed the steps below to access Hive. Can someone help me solve this problem?

Error faced:

Exception in thread "main" org.apache.spark.sql.AnalysisException:
java.lang.RuntimeException: java.io.FileNotFoundException: File
/tmp/hive does not exist;

Step 1: created a folder as below and placed the winutils.exe file in it.

C:\Users\<>\Documents\winutils\bin\winutils.exe

Step 2: created a temp hive path as below and granted all permissions.

C:\Users\<>\Documents\winutils\tmp\hive

Command:

C:\Users\<>\Documents\winutils\bin>winutils.exe chmod 777 C:\Users\<>\Documents\winutils\tmp\hive
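
As a side note on this step: the error message refers to the literal path /tmp/hive, which Hadoop on Windows typically resolves against the root of the current drive, not against the winutils folder under Documents. A commonly suggested variant of the chmod step (a sketch, assuming the job is launched from the C: drive and that this winutils.exe build supports the -R and ls subcommands) is:

```
:: Create the scratch dir on the drive root that /tmp/hive resolves to
mkdir C:\tmp\hive

:: Grant permissions recursively with winutils
C:\Users\<>\Documents\winutils\bin\winutils.exe chmod -R 777 C:\tmp\hive

:: Verify the permissions Hadoop will see
C:\Users\<>\Documents\winutils\bin\winutils.exe ls C:\tmp\hive
```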

Step 3: wrote the Spark code below to access Hive.

import org.apache.spark.sql.SparkSession

def main(args: Array[String]): Unit = {
  // Point Hadoop at the folder containing bin\winutils.exe
  System.setProperty("hadoop.home.dir", "C:\\Users\\slingaladinne\\Documents\\winutils")
  val spark = SparkSession
    .builder
    .appName("SampleSparkPrg")
    .config("spark.master", "local")
    .enableHiveSupport()
    .getOrCreate()
  val sampleSeq = Seq((1, "spark"), (2, "Hive"))
  println("Seq Created")
  val df = spark.createDataFrame(sampleSeq).toDF("sno", "name")
  println("DF Created")
  df.show()
}
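
For completeness, one workaround sometimes suggested is to point Hive's scratch directory at the folder that was actually chmod'ed, instead of the default /tmp/hive. This is a sketch only: hive.exec.scratchdir is a standard Hive property, but whether Spark honors it at session creation time may depend on the Spark/Hive versions in use.

```scala
import org.apache.spark.sql.SparkSession

// Sketch: redirect Hive's scratch dir to the prepared, writable folder.
// Untested assumption: this property takes effect for the embedded Hive client.
val spark = SparkSession
  .builder
  .appName("SampleSparkPrg")
  .config("spark.master", "local")
  .config("hive.exec.scratchdir", "C:\\Users\\slingaladinne\\Documents\\winutils\\tmp\\hive")
  .enableHiveSupport()
  .getOrCreate()
```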

Source: Windows Questions
