`docs/user-docs.md`
DCOS Spark includes:
* [Kafka][5]
* [Zeppelin][6]
<a name="quick-start"></a>
# Quick Start
1. Install DCOS Spark via the DCOS CLI:
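   The install step can be sketched with the standard DCOS CLI package command; the package name `spark` matches the upgrade instructions later in this guide:

   ```
   $ dcos package install spark
   ```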
Visit the Spark cluster dispatcher at `http://<dcos-url>/service/spark/` to view the status of your job. Also visit the Mesos UI at `http://<dcos-url>/mesos/` to see job logs.
<a name="install"></a>
# Install
To start a basic Spark cluster, run the following command from the DCOS CLI. This command installs the dispatcher and, optionally, the history server. See [Custom Installation][7] to install the history server.
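A minimal sketch of the basic install, assuming the default package name `spark`:

```
$ dcos package install spark
```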
To use a specific Spark instance from the DCOS Spark CLI:
```
$ dcos config set spark.app_id <service.name>
```
<a name="upgrade"></a>
# Upgrade
1. In the Marathon web interface, destroy the Spark instance to be updated.
```
$ dcos package install spark
```
<a name="run-a-spark-job"></a>
# Run a Spark Job
1. Before submitting your job, upload the artifact (e.g., a JAR file) to a location visible to the cluster (e.g., S3 or HDFS). [Learn more][13].
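   As a sketch, a job whose artifact has been uploaded to S3 might then be submitted with `dcos spark run`; the class name, bucket, and JAR name here are hypothetical:

   ```
   $ dcos spark run --submit-args="--class org.example.MyApp https://s3.amazonaws.com/my-bucket/my-app.jar"
   ```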
To set Spark properties with a configuration file, create a `spark-defaults.conf` file and set the environment variable `SPARK_CONF_DIR` to the containing directory. [Learn more][15].
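For example, a minimal `spark-defaults.conf` might look like this (the values are illustrative; both keys are standard Spark properties), with `SPARK_CONF_DIR` pointing at the directory that contains it:

```
spark.executor.memory  2g
spark.eventLog.enabled true
```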
The Spark dispatcher persists state in ZooKeeper, so to fully uninstall the Spark DCOS package, you must go to `http://<dcos-url>/exhibitor`, click `Explorer`, and delete the znode corresponding to your instance of Spark. By default this is `spark_mesos_Dispatcher`.
<a name="runtime-configuration-change"></a>
# Runtime Configuration Change
You can customize DCOS Spark in place while it is up and running.
5. Click `Change and deploy configuration` to apply any changes and cleanly reload Spark.
<a name="troubleshooting"></a>
# Troubleshooting
## Dispatcher
To debug authentication in a Spark job, enable Java security debug output:
```
$ dcos spark run --submit-args="-Dsun.security.krb5.debug=true..."
```
<a name="limitations"></a>
# Limitations
* DCOS Spark only supports submitting JARs. It does not support Python or R.