Spark on Windows – UnsupportedFileSystemException Error

  apache-spark, hadoop, hdfs, java, windows

I am trying to run a jar that listens for files being transferred to HDFS. When I run the hadoop fs -put command the file is added to HDFS, but when I debug the jar on the command line I get the following exception: UnsupportedFileSystemException: No FileSystem for scheme "C". The file I transfer is on my local C: drive. (When I run hadoop fs -ls I can see that the file has been transferred.) Any idea why this might happen? I am using Spark and Hadoop on Windows 10. I reference the file in the same way as in the answers to "java.io.IOException: No FileSystem for scheme: C" and "WinError 10054: An existing connection was forcibly closed by the remote host". Even though the file is transferred to HDFS with the put command, I still get the above error on the listener jar's console.
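To illustrate what I mean, below is a minimal sketch (not my actual listener code) of how this kind of error typically arises when a bare Windows path is parsed as a URI, and how qualifying the path with an explicit scheme avoids it. The file paths, hostname, and port are placeholders, not my real setup:

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class SchemeDemo {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();

            // A bare Windows path parsed as a URI: the drive letter becomes the
            // URI scheme, so Hadoop looks for a FileSystem registered as "C".
            URI bare = new URI("C:/Users/me/data/sample.txt");
            System.out.println(bare.getScheme());  // prints "C"
            // FileSystem.get(bare, conf);         // fails: no FileSystem for scheme "C"

            // Qualifying the path with an explicit scheme resolves the intended FileSystem.
            Path local = new Path("file:///C:/Users/me/data/sample.txt");      // local file
            Path hdfs  = new Path("hdfs://localhost:9000/user/me/sample.txt"); // HDFS target

            try (FileSystem localFs = FileSystem.get(local.toUri(), conf);
                 FileSystem hdfsFs  = FileSystem.get(hdfs.toUri(), conf)) {
                System.out.println(localFs.getUri()); // file:///
                System.out.println(hdfsFs.getUri());  // hdfs://localhost:9000
            }
        }
    }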

Source: Windows Questions
