By NetanelRabinowitz

2017-03-19 13:50:23 8 Comments

In the Spark 2.1 docs it is mentioned that:

Spark runs on Java 7+, Python 2.6+/3.4+ and R 3.1+. For the Scala API, Spark 2.1.0 uses Scala 2.11. You will need to use a compatible Scala version (2.11.x).

In the Scala 2.12 release notes it is also mentioned that:

Although Scala 2.11 and 2.12 are mostly source compatible to facilitate cross-building, they are not binary compatible. This allows us to keep improving the Scala compiler and standard library.

But when I build an uber jar (using Scala 2.12) and run it on Spark 2.1, everything works just fine.

I know it's not an official source, but the 47 Degrees blog mentions that Spark 2.1 does support Scala 2.12.

How can one explain these (conflicting?) pieces of information?
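
For context, the "cross-building" mentioned in the Scala release notes means compiling the same sources against several Scala versions. A minimal sbt sketch, with illustrative version numbers:

    // build.sbt -- cross-building sketch; `sbt +compile` / `sbt +package`
    // runs each task once per listed Scala version.
    scalaVersion := "2.11.8"
    crossScalaVersions := Seq("2.11.8", "2.12.1")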


@user7735456 2017-03-19 14:22:09

Spark does not support Scala 2.12. You can follow SPARK-14220 (Build and test Spark against Scala 2.12) for up-to-date status.

Update: Spark 2.4 added experimental Scala 2.12 support.
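
For a project that targets Spark 2.1, the build therefore has to stay on Scala 2.11. A minimal build.sbt sketch (version numbers are illustrative; the %% operator appends the Scala binary suffix, so this resolves the spark-core_2.11 artifact):

    // build.sbt -- pin the Scala version Spark 2.1 was built against.
    scalaVersion := "2.11.8"

    // "provided" keeps Spark itself out of the uber jar, since the
    // cluster supplies it at runtime.
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0" % "provided"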

@jjj 2017-03-19 15:15:38

This could have been added as a comment.

@Maziyar 2018-11-20 11:52:55

Spark 2.4 now supports Scala 2.12, experimentally.

@George Hawkins 2019-04-07 14:11:19

2.12 support is no longer experimental - it's now GA - see Spark 2.4.1 release notes.

@ecoe 2019-12-10 21:57:13

Scala 2.12 may be supported, but as of Spark 2.4.x the pre-built binaries are compiled with Scala 2.11 (except version 2.4.2).
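
A quick way to verify which Scala version a given Spark distribution was built with is to print the Scala library version from inside it. A minimal sketch; the object name VersionCheck is illustrative:

    // VersionCheck.scala -- reports the Scala library version on the
    // classpath of the running process; submitted to a stock Spark 2.4.x
    // distribution this should print 2.11.x (2.12.x for the 2.4.2 build).
    object VersionCheck {
      def main(args: Array[String]): Unit =
        println(scala.util.Properties.versionString)
    }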

@denvercoder9 2017-09-22 18:08:46

To add to the answer, I believe the claim in the 47 Degrees blog post is a typo; the Spark documentation has no mention of Scala 2.12.

Also, if we look at the timing, Scala 2.12 was not released until November 2016, while Spark 2.0.0 was released in July 2016.

