Hive versions 1.2 onward require Java 1.7 or newer. Hive versions 0.14 to 1.1 work with Java 1.6 as well.
Hadoop 2.x (preferred), 1.x (not supported by Hive 2.0.0 onward). Hive versions up to 0.13 also supported Hadoop 0.20.x, 0.23.x.
Hive Installation Steps
Step 1 - Create the Hive directory. Open a new terminal (Ctrl + Alt + T) and enter the following command.
Step 2 - Change the ownership and permissions of the directory /usr/local/hive. Here 'hduser' is the Ubuntu username; substitute your own.
Step 3 - Switch user. The su command executes commands with the privileges of another user account; here we switch to hduser.
Step 4 - Change to the directory that contains the download. In this example the apache-hive-2.1.0-bin.tar.gz file is in /home/hduser/Desktop; it may be in your Downloads folder instead, so check and adjust the path.
Step 5 - Untar the apache-hive-2.1.0-bin.tar.gz file.
Step 6 - Move the contents of apache-hive-2.1.0-bin folder to /usr/local/hive
Step 7 - Edit the $HOME/.bashrc file by adding the Hive path.
Add the following lines to the $HOME/.bashrc file.
Step 8 - Reload your changed $HOME/.bashrc settings
Step 9 - Change the directory to /usr/local/hive/conf
Step 10 - Copy the default hive-env.sh.template to hive-env.sh
Step 11 - Edit hive-env.sh file.
Step 12 - Add the below lines to hive-env.sh file. Save and Close.
Step 13 - Copy the default hive-default.xml.template to hive-site.xml
Step 14 - Edit hive-site.xml file.
Step 15 - Add or update below properties in hive-site.xml file. Save and Close.
Step 16 - Change the directory to /usr/local/hadoop/sbin
Step 17 - Start all hadoop daemons.
Step 18 - Use the HDFS commands below to create the /tmp and /user/hive/warehouse directories (i.e., hive.metastore.warehouse.dir) and set them to chmod g+w before you can create a table in Hive.
Step 19 - Change the directory to /usr/local/hive/bin
Step 20 - Run the schematool command below once to initialize the metastore schema; here we use "derby" as the database type. You can confirm the result afterwards with schematool -info -dbType derby.
Step 21 - Start the Hive command line interface (CLI) from the shell.
Step 22 - From the Hive CLI, list the tables (their metadata is stored in the Derby database).
Hive Installation Commands
The commands for each step above, in order:
Step 1:
$ sudo mkdir /usr/local/hive
Step 2:
$ sudo chown -R hduser /usr/local/hive
$ sudo chmod -R 755 /usr/local/hive
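As a side note, the octal mode 755 means read/write/execute for the owner and read/execute for group and others. A minimal local sketch of the effect, using a throwaway directory rather than the real /usr/local/hive:

```shell
# Throwaway directory standing in for /usr/local/hive (illustration only;
# the real step targets /usr/local/hive and needs sudo)
mkdir -p /tmp/hive_perm_demo

# Recursively set owner=rwx, group=r-x, others=r-x
chmod -R 755 /tmp/hive_perm_demo

# Print the octal mode to confirm
stat -c "%a" /tmp/hive_perm_demo   # → 755
```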
Step 3:
$ su hduser
Step 4:
$ cd /home/hduser/Desktop/
Step 5:
$ tar xzf apache-hive-2.1.0-bin.tar.gz
Step 6:
$ mv apache-hive-2.1.0-bin/* /usr/local/hive
Step 7:
$ sudo gedit $HOME/.bashrc
Add the following lines to $HOME/.bashrc:
export HIVE_HOME=/usr/local/hive
export PATH=$HIVE_HOME/bin:$HIVE_HOME/lib:$PATH
Step 8:
$ source $HOME/.bashrc
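After reloading .bashrc, it is worth confirming that the Hive bin directory actually landed on PATH. A small sketch, assuming HIVE_HOME=/usr/local/hive as set above:

```shell
# Re-create the environment the .bashrc lines set up
export HIVE_HOME=/usr/local/hive
export PATH=$HIVE_HOME/bin:$HIVE_HOME/lib:$PATH

# Check that the Hive bin directory is now searchable
echo "$PATH" | grep -q "/usr/local/hive/bin" && echo "Hive bin is on PATH"
```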
Step 9:
$ cd $HIVE_HOME/conf
Step 10:
$ cp hive-env.sh.template hive-env.sh
Step 11:
$ gedit hive-env.sh
Step 12 - add to hive-env.sh:
export HADOOP_HOME=/usr/local/hadoop
export HIVE_CONF_DIR=$HIVE_CONF_DIR
export HIVE_AUX_JARS_PATH=$HIVE_AUX_JARS_PATH
Step 13:
$ cp hive-default.xml.template hive-site.xml
Step 14:
$ gedit hive-site.xml
Step 15 - add or update the following properties in hive-site.xml:
<property>
  <name>hive.metastore.schema.verification</name>
  <value>false</value>
  <description>Disables schema verification, avoiding the error caused by an existing metastore_db</description>
</property>
<property>
  <name>hive.exec.scratchdir</name>
  <value>/tmp/hive</value>
  <description>HDFS root scratch dir for Hive jobs, created with write-all (733) permission</description>
</property>
<property>
  <name>hive.exec.local.scratchdir</name>
  <value>/tmp/${user.name}</value>
  <description>Local scratch space for Hive jobs</description>
</property>
<property>
  <name>hive.downloaded.resources.dir</name>
  <value>/tmp/${user.name}_resources</value>
  <description>Temporary local directory for added resources in the remote file system</description>
</property>
<property>
  <name>hive.scratch.dir.permission</name>
  <value>733</value>
  <description>The permission for the user-specific scratch directories that get created</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby://localhost:1527/metastore_db;create=true</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>
Step 16:
$ cd /usr/local/hadoop/sbin
Step 17:
$ start-all.sh
Step 18:
$ hdfs dfs -mkdir /tmp
$ hdfs dfs -mkdir -p /user/hive/warehouse
$ hdfs dfs -chmod g+w /tmp
$ hdfs dfs -chmod g+w /user/hive/warehouse
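What chmod g+w does can be illustrated on the local filesystem (the real commands above operate on HDFS paths; /tmp/warehouse_demo is a throwaway stand-in):

```shell
# Throwaway local directory standing in for /user/hive/warehouse
mkdir -p /tmp/warehouse_demo
chmod 755 /tmp/warehouse_demo    # start from owner-only write

# Grant the group write access, as Step 18 does on the HDFS paths
chmod g+w /tmp/warehouse_demo

stat -c "%a" /tmp/warehouse_demo   # → 775 (group gained the write bit)
```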
Step 19:
$ cd $HIVE_HOME/bin
Step 20:
$ schematool -initSchema -dbType derby
Step 21:
$ ./hive
Step 22 - at the Hive prompt:
hive> show tables;