The Spark Submit run configuration to build and upload your Spark application to a cluster.
The Spark monitoring tool window to monitor submitted jobs, view DAG visualizations, and more. This includes jobs submitted from Spark Submit run configurations and from EMR steps. If you have the Zeppelin plugin installed, you can also open Spark jobs from Zeppelin notebooks.
Integration with other big data tools without leaving the IDE (open Spark applications from AWS EMR, navigate to Spark jobs from Hadoop YARN, and view logs in S3 storage).
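What the Spark Submit run configuration does corresponds to invoking Spark's standard `spark-submit` script yourself. As a rough sketch (the class name, master URL, JAR path, and arguments below are placeholders, not values from this document):

```shell
# Roughly equivalent spark-submit invocation.
# --class, --master, and --deploy-mode are standard spark-submit options;
# the application class, cluster URL, and JAR path are illustrative placeholders.
spark-submit \
  --class com.example.MySparkApp \
  --master yarn \
  --deploy-mode cluster \
  path/to/my-spark-app.jar arg1 arg2
```

The run configuration handles the build-and-upload part for you, so you do not have to copy the JAR to the cluster or run this command manually.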
Install the Spark plugin
This functionality relies on the Spark plugin, which you need to install and enable.
Press Control+Alt+S to open the IDE settings and then select Plugins.
Open the Marketplace tab, find the Spark plugin, and click Install (restart the IDE if prompted).
In this chapter:
If you want to monitor existing jobs, learn more about the Spark monitoring tool window.
If you want to submit a Spark application to a cluster, learn more about the Spark Submit run configuration.