Anatomy of a Spark job run
A Spark application contains several components, all of which exist whether you are running Spark on a single machine or across a cluster of hundreds or thousands of nodes. The components of a Spark application are the driver, the master, the cluster manager, and the executors.
All of the Spark components, including the driver, master, and executor processes, run in Java virtual machines (JVMs). A JVM is a cross-platform runtime engine that executes instructions compiled into Java bytecode. Scala, the language Spark is written in, compiles into Java bytecode and runs on the JVM.
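To make these roles concrete, here is a minimal sketch of a Spark application in Scala. It assumes the standard `spark-sql` dependency is on the classpath; the object and application names are illustrative. The JVM process that runs `main` is the driver; in `local` mode the executor threads share the driver's JVM, so no external cluster manager is involved.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.sum

// The JVM process running this main method acts as the driver.
object AnatomyExample {
  def main(args: Array[String]): Unit = {
    // Building a SparkSession starts the driver. In local mode the
    // driver schedules tasks onto executor threads inside its own JVM;
    // on a cluster it would instead request executors from the
    // cluster manager (standalone master, YARN, Kubernetes, etc.).
    val spark = SparkSession.builder()
      .appName("anatomy-example")
      .master("local[2]") // local mode: 2 executor threads, no cluster
      .getOrCreate()

    // Work submitted here is broken into tasks and run by executors.
    val total = spark.range(1, 101)        // numbers 1..100
      .agg(sum("id"))
      .first()
      .getLong(0)
    println(total) // 5050

    spark.stop() // shuts down the driver and its executors
  }
}
```

Running the same code with `.master("spark://host:7077")` instead of `local[2]` would submit it to a standalone cluster, where the master process brokers executor resources across worker nodes while the driver itself stays unchanged.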