Notice: this article is original; please credit the source when reposting. Most posts here are notes taken while learning and my technical ability is limited, so please point out any mistakes so we can improve together. For that reason the article is revised from time to time; for the latest version see: http://blog.chinaunix.net/uid/29454152.html
1. Installing Spark Standalone to a Cluster
To install Spark in Standalone mode, you simply place a compiled version of Spark on each node of the cluster.
For how to build and deploy Spark, see the posts below:
http://blog.chinaunix.net/uid-29454152-id-5148300.html
http://blog.chinaunix.net/uid-29454152-id-5148347.html
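If the nodes can reach each other over SSH, one minimal way to distribute a build is a loop like the following (a sketch only; the hostnames worker1/worker2 and the /opt/spark path are assumptions, not from the posts above):
for host in worker1 worker2; do              # hypothetical worker hostnames
  rsync -a /opt/spark/ $host:/opt/spark/     # assumed install path, same on every node
done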
2. Starting a Cluster Manually
1) On the master
Command to start the Spark master:
sudo ./sbin/start-master.sh
The spark://HOST:PORT URL that workers connect to can be found in the web UI (http://localhost:8080 by default).
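If the web UI is not handy, the URL can usually be grepped out of the master log; the log file name below follows Spark's default naming convention and is an assumption:
grep "spark://" logs/spark-*-org.apache.spark.deploy.master.Master-*.out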
2) On each worker
Command to start a worker and connect it to the master:
./bin/spark-class org.apache.spark.deploy.worker.Worker spark://IP:PORT
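The Worker class also accepts resource flags, and once conf/slaves lists the worker hostnames the whole cluster can be started with one script from the master; the 4 cores / 8g values are examples:
./bin/spark-class org.apache.spark.deploy.worker.Worker \
  --cores 4 --memory 8g spark://IP:PORT    # cap this worker's resources
./sbin/start-all.sh    # start the master plus all workers listed in conf/slaves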
3. Connecting an Application to the Cluster
Command to run an interactive spark-shell against the cluster:
sudo ./bin/spark-shell --master spark://IP:PORT --total-executor-cores <numCores>
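A quick sanity check is to run a tiny job and confirm in the web UI that the requested cores were allocated; as a sketch (cluster URL assumed), a one-liner can be piped into the shell:
echo 'sc.parallelize(1 to 1000).count()' | \
  sudo ./bin/spark-shell --master spark://IP:PORT --total-executor-cores 2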
4. Submitting a JAR
General form (run from the Spark installation directory):
./bin/spark-submit --class path.to.your.class [options] <app jar>
Example, on a standalone cluster:
./bin/spark-submit \
--class my.main.classname \
--master spark://127.0.0.1:7077 \
--executor-memory 2G \
--total-executor-cores 4 \
/home/warrior/IdeaProjects/sparkTest/out/artifacts/sparkTest_jar/sparkTest.jar
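For a long-running job, standalone mode also supports cluster deploy mode, where the driver runs on a worker and --supervise restarts it on failure; a sketch reusing the same class and jar path (the jar must then be reachable from the worker nodes):
./bin/spark-submit \
--class my.main.classname \
--master spark://127.0.0.1:7077 \
--deploy-mode cluster \
--supervise \
/home/warrior/IdeaProjects/sparkTest/out/artifacts/sparkTest_jar/sparkTest.jar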