Category : apache-spark

I have a problem starting pyspark from cmd on Windows 10 (the same error appears in PyCharm when creating a SparkSession). I get the following error: C:\Users\admin>pyspark Python 3.8.2 (tags/v3.8.2:7b3ab59, Feb 25 2020, 22:45:29) [MSC v.1916 32 bit (Intel)] on win32 Type "help", "copyright", "credits" or "license" for more information. Traceback (most recent call last): File "C:\spark-3.1.2-bin-hadoop3.2\python\pyspark\shell.py", line 29, ..

Read more
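The traceback above is cut off before the actual exception, so the root cause is unknown; a frequent source of startup failures for pyspark on Windows, though, is missing environment configuration. A hedged sketch of the variables usually needed before the shell (or a PyCharm SparkSession) will start — the paths below are examples, not the asker's real layout, and `winutils.exe` is a separate download:

```python
import os
import sys

# Sketch: typical Windows prerequisites for pyspark (paths are examples;
# adjust to your installation).
os.environ.setdefault("SPARK_HOME", r"C:\spark-3.1.2-bin-hadoop3.2")
os.environ.setdefault("HADOOP_HOME", r"C:\hadoop")  # its bin\ must hold winutils.exe
os.environ.setdefault("PYSPARK_PYTHON", sys.executable)  # avoid mixing interpreters

for var in ("SPARK_HOME", "HADOOP_HOME", "PYSPARK_PYTHON"):
    print(var, "=", os.environ[var])
```

With these set (in PyCharm: Run Configuration → Environment variables), `SparkSession.builder.master("local[*]").getOrCreate()` has a known Spark home and Python to launch.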

When I try to start the Livy server from the command prompt using "bash livy-server start", I get the following error, caused by the space in "Program Files": failed to launch C:/Program Files/Java/jdk1.8.0_301/bin/java -cp /c/Users/user_name/apache-livy-0.7.1-incubating-bin/jars/*:/c/Users/user_name/apache-livy-0.7.1-incubating-bin/conf: org.apache.livy.server.LivyServer: nice: ‘C:/Program’: No such file or directory full log in /c/Users/user_name/apache-livy-0.7.1-incubating-bin/logs/livy-server.out Source: Windows..

Read more
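The failure mode above can be reproduced without Livy: when a launch script expands the Java path unquoted, the shell splits it at the space, so `nice` receives `C:/Program` as the command to run. A small demonstration of that word-splitting using Python's `shlex` (the path is the one from the question):

```python
import shlex

java_home = "C:/Program Files/Java/jdk1.8.0_301"

# Unquoted on a shell command line, the space splits the path into two words:
unquoted = shlex.split(f"nice {java_home}/bin/java -cp jars/*")
print(unquoted[1])  # C:/Program  <- exactly the token in the error message

# Quoting keeps the whole path as a single argument:
quoted = shlex.split(f'nice "{java_home}/bin/java" -cp jars/*')
print(quoted[1])    # C:/Program Files/Java/jdk1.8.0_301/bin/java
```

Since the quoting happens inside Livy's own launch script, the practical workaround is to point `JAVA_HOME` at a path without spaces (for example, a JDK copied to `C:\Java`).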

I am trying to find bad sectors or badly recorded files on a hard drive programmatically, using C++, Python, Apache Spark, or assembly; however, I couldn't find anything useful so far. Is there any way to scan a hard drive and find bad sectors programmatically? I need sample code that shows details like failure addresses, percent of ..

Read more
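There is no portable API for this, but a minimal sketch of the read-probe approach: open the disk and read it block by block, treating a failed `read()` as a suspect region. The raw device names below (`\\.\PhysicalDrive0` on Windows, `/dev/sda` on Linux, both requiring administrator rights) are assumptions and are not exercised here; the drive's own SMART counters (e.g. via `smartctl`) are generally a more reliable source than userspace probing.

```python
import os

def scan_bad_blocks(path, block_size=4096):
    """Probe `path` block by block; return (failing offsets, percent bad).

    Sketch only: on a raw device (assumed names: r"\\.\PhysicalDrive0" on
    Windows, "/dev/sda" on Linux; both need admin rights) an unreadable
    sector typically surfaces as OSError from read().
    """
    failed, total = [], 0
    with open(path, "rb", buffering=0) as f:
        f.seek(0, os.SEEK_END)
        size = f.tell()  # may not report a size for every raw device
        offset = 0
        while offset < size:
            total += 1
            try:
                f.seek(offset)
                f.read(block_size)
            except OSError:
                failed.append(offset)  # failure address (byte offset)
            offset += block_size
    return failed, (100.0 * len(failed) / total if total else 0.0)

# Self-check on an ordinary file (no failures expected):
import tempfile
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"\0" * 10000)
failed, percent = scan_bad_blocks(tmp.name)
os.unlink(tmp.name)
print(failed, percent)
```

The returned offsets are the "failure addresses" the question asks for, and the second value is the percentage of unreadable blocks.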

Use case: fetching simple text-file data from a Kafka topic using Spark with Java, and writing CSV using FileWriter, with the following installation: Kafka version: 2.4.0 (2.11); Java version: "16.0.1"; Spark version: 2.4.7 with Scala 2.11.12; Hadoop: 2.7; Java compiler: JavaSE-1.8. Background: no experience with Apache Spark, little experience with Apache Kafka. Kafka producer and ..

Read more
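One mismatch stands out in the stack above: Spark 2.4.x supports Java 8, so running under Java 16.0.1 is likely to fail before any Kafka code executes — building and running on the JavaSE-1.8 JDK already listed avoids that. The Kafka source also needs the connector artifact matching the Scala build (2.11) and the Spark version; a sketch of the Maven coordinates (adjust to your build tool):

```xml
<!-- Sketch: Kafka connector matching Spark 2.4.7 built for Scala 2.11 -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql-kafka-0-10_2.11</artifactId>
  <version>2.4.7</version>
</dependency>
```

A Scala-version mismatch in this artifact (e.g. `_2.12` against a `_2.11` Spark) is a common cause of `NoSuchMethodError` at runtime.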

C:\Users\HP>spark-shell
Exception in thread "main" java.lang.ExceptionInInitializerError
	at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit>(ByteArrayMethods.java:54)
	at org.apache.spark.internal.config.package$.<init>(package.scala:1006)
	at org.apache.spark.internal.config.package$.<clinit>(package.scala)
	at org.apache.spark.deploy.SparkSubmitArguments.$anonfun$loadEnvironmentArguments$3(SparkSubmitArguments.scala:157)
	at scala.Option.orElse(Option.scala:447)
	at org.apache.spark.deploy.SparkSubmitArguments.loadEnvironmentArguments(SparkSubmitArguments.scala:157)
	at org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:115)
	at org.apache.spark.deploy.SparkSubmit$$anon$2$$anon$3.<init>(SparkSubmit.scala:990)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.parseArguments(SparkSubmit.scala:990)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:85)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make private java.nio.DirectByteBuffer(long,int) accessible: module java.base does not "opens java.nio" to unnamed module @7bedc48a
	at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:357)
	at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:297) ..

Read more
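The `InaccessibleObjectException` above is the JDK 9+ module system refusing the reflective access that Spark's `unsafe` code needs; Spark releases before 3.3 do not support Java 16/17, so the supported fix is to point `JAVA_HOME` at a JDK 8 or 11 installation. A commonly cited workaround for newer JDKs (hedged — not guaranteed for every Spark version) is to open the needed modules before launching:

```
:: Windows cmd -- workaround sketch for JDK 16+; prefer installing JDK 8/11
set JAVA_TOOL_OPTIONS=--add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED
spark-shell
```

`JAVA_TOOL_OPTIONS` is picked up by every JVM the launcher spawns, which matters here because `spark-shell` starts the JVM through several wrapper scripts.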