Spark developers know the DSL is highly expressive: in most cases it covers the task at hand, and there is no need to step outside it. However, a Spark job is a program like any other, which means you can work on its design: inject dependencies, manage configuration, and control resources. Such code is easier to reuse, test, and maintain. In this talk, Dmitry will show how to write Spark code functionally in Scala, at full speed.