By Gyan


2017-04-10 20:20:30

I am on Hortonworks Distribution 2.4 (effectively Hadoop 2.7.1 and Spark 1.6.1).


The cluster runs Spark 1.6.1, but I am packaging my own version of Spark (2.1.0) inside the uber jar. In the process, I am shipping all required libraries through a fat jar (built with Maven, following the uber-jar approach).
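For context, the fat jar is assembled with the standard maven-shade-plugin approach. A minimal sketch of that setup is below; the plugin version and the signature-file filter are placeholders rather than my exact configuration:

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>2.4.3</version> <!-- placeholder version -->
        <executions>
            <execution>
                <phase>package</phase>
                <goals>
                    <goal>shade</goal>
                </goals>
                <configuration>
                    <filters>
                        <!-- drop signature files so the merged jar is not rejected at runtime -->
                        <filter>
                            <artifact>*:*</artifact>
                            <excludes>
                                <exclude>META-INF/*.SF</exclude>
                                <exclude>META-INF/*.DSA</exclude>
                                <exclude>META-INF/*.RSA</exclude>
                            </excludes>
                        </filter>
                    </filters>
                </configuration>
            </execution>
        </executions>
    </plugin>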

However, spark-submit (through the Spark 2.1.0 client) fails with a NoClassDefFoundError on the Jersey client. When I list the contents of my uber jar, I can see the exact class file in the jar, yet Spark/YARN still can't find it.

Here is the error message:

Exception in thread "main" java.lang.NoClassDefFoundError: com/sun/jersey/api/client/config/ClientConfig
        at org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient(TimelineClient.java:55)
        at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createTimelineClient(YarnClientImpl.java:181)
        at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:168)
        at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
        at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:151)
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:156)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:509)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)

And here is my attempt to find the class in the jar file:

jar -tf uber-xxxxx-something.jar | grep jersey | grep ClientCon
com/sun/jersey/api/client/ComponentsClientConfig.class
com/sun/jersey/api/client/config/ClientConfig.class

... Other files

What could be going on here? Any suggestions or ideas, please.

EDIT: The Jersey client section of the POM is below:

    <dependency>
        <groupId>com.sun.jersey</groupId>
        <artifactId>jersey-client</artifactId>
        <version>1.19.3</version>
    </dependency>

EDIT: I also want to point out that my code is compiled with Scala 2.12, with the compatibility level set to 2.11, while the cluster is perhaps on 2.10. I say "perhaps" because I believe cluster nodes don't necessarily need Scala binaries installed; YARN just launches the components' jar/class files without using Scala binaries. I wonder whether that is playing a role here.
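For reference, the Scala version handling in my build looks roughly like the hypothetical scala-maven-plugin snippet below (version numbers are placeholders, not my exact values):

    <plugin>
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <version>3.2.2</version> <!-- placeholder version -->
        <configuration>
            <scalaVersion>2.12.1</scalaVersion>           <!-- compiler used to build the code -->
            <scalaCompatVersion>2.11</scalaCompatVersion> <!-- declared binary compatibility level -->
        </configuration>
        <executions>
            <execution>
                <goals>
                    <goal>compile</goal>
                    <goal>testCompile</goal>
                </goals>
            </execution>
        </executions>
    </plugin>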
