Spark builds on the Hadoop ecosystem, so it is best installed on a Linux-based system. The following steps show how to install Apache Spark.
Installing Java is a mandatory prerequisite for Spark. Try the following command to verify your Java version.
$ java -version
If Java is already installed on your system, you will see a response similar to the following:
java version "1.8.0_71"
Java(TM) SE Runtime Environment (build 1.8.0_71-b13)
Java HotSpot(TM) Client VM (build 25.0-b02, mixed mode)
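If the command instead reports that java is not found, install Java before proceeding. The exact steps depend on your distribution; on a Debian/Ubuntu system (an assumption, adapt to your platform) it would look something like:

$ sudo apt-get update
$ sudo apt-get install openjdk-8-jdk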
Spark requires Scala, so verify the Scala installation next with the following command:

$ scala -version
If Scala is already installed on your system, you should see the following response:
Scala code runner version 2.11.6 -- Copyright 2002-2013, LAMP/EPFL
Download the latest version of Scala by visiting the following link: Download Scala. This tutorial uses the scala-2.11.6 version. After downloading, you will find the Scala tar file in your download folder.
Extract the Scala tar file with the following command:

$ tar xvf scala-2.11.6.tgz
Use the following commands to move the Scala software files to the appropriate directory (/usr/local/scala).
$ su -
Password:
# cd /home/Hadoop/Downloads/
# mv scala-2.11.6 /usr/local/scala
# exit
Use the following command to add Scala to your PATH (note that there must be no spaces around the = sign):

$ export PATH=$PATH:/usr/local/scala/bin
After installation, it is better to verify it. Check the Scala version again:

$ scala -version

If the installation succeeded, you will see the following response:
Scala code runner version 2.11.6 -- Copyright 2002-2013, LAMP/EPFL
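As an optional extra check (not part of the original steps), the Scala runner's -e flag evaluates a one-line expression directly, which confirms that the binary on your PATH actually runs. The scala package is imported by default, so util.Properties resolves to scala.util.Properties:

$ scala -e 'println(util.Properties.versionNumberString)'
2.11.6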
Next, download Apache Spark; this tutorial uses the spark-1.3.1-bin-hadoop2.6 version. After downloading, extract the Spark tar file with the following command:

$ tar xvf spark-1.3.1-bin-hadoop2.6.tgz
Move the Spark software files to the appropriate directory (/usr/local/spark) using the following commands.
$ su -
Password:
# cd /home/Hadoop/Downloads/
# mv spark-1.3.1-bin-hadoop2.6 /usr/local/spark
# exit
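To confirm the move, list the new directory; a standard Spark distribution contains subdirectories such as bin, conf and lib:

$ ls /usr/local/spark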
Add the following line to the ~/.bashrc file. This adds the location of the Spark software files to the PATH variable.
export PATH=$PATH:/usr/local/spark/bin
Use the following command to source the ~/.bashrc file so the change takes effect in the current shell:

$ source ~/.bashrc
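To verify that the PATH update took effect, locate the spark-shell executable (the path shown assumes the /usr/local/spark location used above):

$ which spark-shell
/usr/local/spark/bin/spark-shell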
Write the following command to open the Spark shell:

$ spark-shell
Spark assembly has been built with Hive, including Datanucleus jars on classpath
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/06/04 15:25:22 INFO SecurityManager: Changing view acls to: hadoop
15/06/04 15:25:22 INFO SecurityManager: Changing modify acls to: hadoop
15/06/04 15:25:22 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
15/06/04 15:25:22 INFO HttpServer: Starting HTTP Server
15/06/04 15:25:23 INFO Utils: Successfully started service 'HTTP class server' on port 43292.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.4.0
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_71)
Type in expressions to have them evaluated.
Spark context available as sc
scala>
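With the shell running, a quick smoke test (a minimal sketch, not part of the original walkthrough) confirms that the Spark context sc works: distribute the numbers 1 through 100 as an RDD and count the even ones, which should yield 50.

scala> val data = sc.parallelize(1 to 100)   // build an RDD from a local range
scala> data.filter(_ % 2 == 0).count()       // keep the even numbers and count them
res0: Long = 50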