What Scala do you need to know before using Apache Spark?
If you're not a Scala regular, this Spark Summit talk by Dean Wampler, VP of Fast Data Engineering, will teach you the core features of Scala you need to know so you can be effective with Spark!
Just Enough Scala for Spark
Apache Spark is written in Scala. Hence, many, if not most, data engineers adopting Spark are also adopting Scala, while Python and R remain popular with data scientists. Fortunately, you don't need to master Scala to use Spark effectively.
This session teaches you the core features of Scala you need to know to be effective with Spark’s Scala API. Topics include:
1) classes, methods, and functions
2) immutable vs. mutable values
3) type inference
4) pattern matching
5) Scala collections and the common operations on them (the basis of Spark’s RDD API)
6) really useful Scala types, like case classes, tuples, and options
7) effective use of the Spark shell (Scala interpreter)
8) common mistakes and how to avoid them
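To give a flavor of several of these topics together (case classes, type inference, immutable collections, pattern matching, and options), here is a minimal plain-Scala sketch; the `Person` type and values are hypothetical examples, not from the talk:

```scala
// A case class: concise, immutable data with built-in pattern-matching support
case class Person(name: String, age: Int)

object JustEnoughScala {
  def main(args: Array[String]): Unit = {
    // Type inference: the compiler infers List[Person]; `val` makes it immutable
    val people = List(Person("Alice", 35), Person("Bob", 17))

    // Collection operations like filter and map are the basis of Spark's RDD API
    val adultNames = people.filter(_.age >= 18).map(_.name)
    println(adultNames) // List(Alice)

    // Option plus pattern matching replaces null checks
    def describe(maybePerson: Option[Person]): String = maybePerson match {
      case Some(Person(name, _)) => s"Found $name"
      case None                  => "Nobody here"
    }

    println(describe(people.headOption)) // Found Alice
    println(describe(None))              // Nobody here
  }
}
```

The same `filter`/`map` style carries over almost unchanged to Spark's RDDs and Datasets, which is why fluency with Scala collections pays off quickly.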