Client mode and cluster mode in Spark Standalone Mode
1. If you run into a question, turn to a search engine: look up spark on Baidu.
2. Go to the official Spark website and open the documentation for your version; in my case that is Spark 2.1.2.
3. In the Spark 2.1.2 documentation you will see the sections Running the Examples and Shell and Launching on a Cluster:
Launching on a Cluster
The Spark cluster mode overview explains the key concepts in running on a cluster. Spark can run both by itself, or over several existing cluster managers. It currently provides several options for deployment: Standalone, Apache Mesos, and Hadoop YARN.
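In practice, these deployment options mainly change the --master URL you pass to spark-submit. A minimal sketch for orientation (the host names, and the application class com.example.MyApp and jar my-app.jar, are placeholders of mine, not from the docs; 7077 and 5050 are the default standalone and Mesos master ports):

    # Standalone cluster (the mode this post is about)
    ./bin/spark-submit --master spark://master-host:7077 --class com.example.MyApp my-app.jar

    # Apache Mesos cluster
    ./bin/spark-submit --master mesos://mesos-host:5050 --class com.example.MyApp my-app.jar

    # Hadoop YARN; the cluster location is read from the Hadoop configuration
    ./bin/spark-submit --master yarn --class com.example.MyApp my-app.jar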
4. Open the Spark Standalone Mode page of the documentation.
5. The Launching Spark Applications section of that page explains client mode and cluster mode; a spark-submit sketch and the relevant passage from the docs follow below.
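As a quick sketch of how the two deploy modes are selected (hedged example: the master host is a placeholder, and the exact examples jar path may differ on your 2.1.2 installation):

    # client mode (the default): the driver runs inside the spark-submit process
    # on the machine you launch from, so the application output appears locally
    ./bin/spark-submit \
      --master spark://master-host:7077 \
      --deploy-mode client \
      --class org.apache.spark.examples.SparkPi \
      ./examples/jars/spark-examples_2.11-2.1.2.jar 100

    # cluster mode: the driver is launched on one of the Worker processes in the
    # cluster, and spark-submit returns once the application has been submitted
    ./bin/spark-submit \
      --master spark://master-host:7077 \
      --deploy-mode cluster \
      --class org.apache.spark.examples.SparkPi \
      ./examples/jars/spark-examples_2.11-2.1.2.jar 100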
Launching Spark Applications
The spark-submit script provides the most straightforward way to submit a compiled Spark application to the cluster. For standalone clusters, Spark currently supports two deploy modes: in client mode, the driver is launched in the same process as the client that submits the application; in cluster mode, the driver is launched from one of the Worker processes inside the cluster, and the client process exits as soon as it has submitted the application, without waiting for it to finish.
If your application is launched through spark-submit, then the application jar is automatically distributed to all worker nodes. For any additional jars that your application depends on, you should specify them through the --jars flag, using a comma as the delimiter (e.g. --jars jar1,jar2).
Additionally, standalone cluster mode supports restarting your application automatically if it exited with a non-zero exit code; to use this feature, pass the --supervise flag to spark-submit when launching the application. An application that is failing repeatedly can then be killed with ./bin/spark-class org.apache.spark.deploy.Client kill <master url> <driver ID>.
You can find the driver ID through the standalone Master web UI at http://<master url>:8080.
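Tying that passage together, a hedged sketch of the --supervise / kill workflow (the master host and the driver ID below are made-up placeholders; the real driver ID for your job is shown in the drivers table of the standalone Master web UI):

    # submit in cluster mode and let the master restart the driver on non-zero exit
    ./bin/spark-submit \
      --master spark://master-host:7077 \
      --deploy-mode cluster \
      --supervise \
      --class org.apache.spark.examples.SparkPi \
      ./examples/jars/spark-examples_2.11-2.1.2.jar 100

    # kill a repeatedly failing driver; take the driver ID from the Master web UI
    # at http://master-host:8080
    ./bin/spark-class org.apache.spark.deploy.Client kill spark://master-host:7077 driver-20180101120000-0000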