The Spark new project wizard, which lets you quickly create a Spark project with needed dependencies.
The Spark Submit run configuration to build and upload your Spark application to a cluster. For Scala files, there is also a special icon in the gutter, which lets you create this configuration even faster.
The Spark monitoring tool window to monitor submitted jobs, view DAG visualizations, and more. This includes jobs submitted from the Spark Submit run configurations and EMR steps. If you have the Zeppelin plugin installed, you can also open Spark jobs from Zeppelin notebooks.
Integration with other big data tools without leaving the IDE (open Spark applications from AWS EMR, navigate to Spark jobs from Hadoop YARN, view logs in S3 storage).
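As a rough sketch of what these tools operate on: the project the wizard scaffolds, and that Spark Submit uploads to a cluster, is an ordinary Spark application. The object name, input, and output paths below are illustrative placeholders, not taken from this document:

```scala
import org.apache.spark.sql.SparkSession

// A minimal word-count job. Spark Submit (or the EMR step) supplies the
// master URL, so none is hard-coded here.
object WordCount {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("WordCount").getOrCreate()

    val counts = spark.read.textFile(args(0)) // input path, e.g. an S3 URI
      .rdd
      .flatMap(_.split("\\s+"))               // split lines into words
      .map(word => (word, 1))
      .reduceByKey(_ + _)                     // sum counts per word

    counts.saveAsTextFile(args(1))            // output path
    spark.stop()
  }
}
```

A job like this is what shows up in the Spark monitoring tool window once it has been submitted.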
Install the Spark plugin
This functionality relies on the Spark plugin, which you need to install and enable.
Press Control+Alt+S to open the IDE settings and then select Plugins.
Open the Marketplace tab, find the Spark plugin, and click Install (restart the IDE if prompted).
In this chapter:
A tutorial on how to create a Spark application using the new project wizard and upload it to an AWS EMR cluster.
If you want to monitor existing jobs, learn more about the Spark monitoring tool window.
If you want to submit a Spark application to a cluster, learn more about the Spark Submit run configuration.
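For context, the Spark Submit run configuration automates roughly what you would otherwise do by hand with Spark's `spark-submit` command-line tool. A hedged example, with placeholder class, master, and JAR values that you would replace with your own:

```shell
# Hypothetical values: substitute your main class, cluster master,
# and application JAR path.
spark-submit \
  --class com.example.WordCount \
  --master yarn \
  --deploy-mode cluster \
  target/scala-2.12/wordcount_2.12-0.1.jar
```

The run configuration builds the JAR, uploads it, and fills in these arguments for you, which is what makes the gutter icon shortcut for Scala files convenient.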