Installation & Configuration of Hadoop 3.1 Single Node Cluster in Linux (RHEL 8) | Nehra Classes

Install & Configure Hadoop in RHEL 8: Configuration of Hadoop 3.1 Single Node Cluster in Linux == Apache Hadoop is a collection of open-source software utilities that facilitate using a network of many computers to solve problems involving massive amounts of data and computation. It provides a software framework for distributed storage and processing of big data using the MapReduce programming model. Originally designed for computer clusters built from commodity hardware—still the common use—it has also found use on clusters of higher-end hardware. All the modules in Hadoop are designed with a fundamental assumption that hardware failures are common occurrences and should be automatically handled by the framework. 1. Prerequsities Java is the primary requirement for running Hadoop on any system. yum install -y java-1. java -version 2. Create Hadoop User We recommend creating a normal (nor root) account for Hadoop working. useradd hadoop passwd hadoop Make an entry in sudoers file for Hadoop user. visudo hadoop After creating the account, it also required to set up key-based ssh to its own account. To do this use execute following commands. su - hadoop $ ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa $ cp -r ~/.ssh/id_rsa.pub ~/.ssh/authorized_keys $ chmod 0600 ~/.ssh/authorized_keys Let’s verify key based login. Below command should not ask for the password but the first time it will prompt for adding RSA to the list of known hosts. $ ssh localhost $ exit 3. Download Hadoop 3.1 Archive $ cd ~ $ wget https://archive.apache.org/dist/hadoop/core/hadoop-3.1.0/hadoop-3.1.0.tar.gz $ tar xvf hadoop-3.1.0.tar.gz $ mv hadoop-3.1.0 hadoop 4. Setup Hadoop Pseudo-Distributed Mode 4.1. Setup Hadoop Environment Variables Update Java Alternatives. $ sudo update-alternatives --config java First, we need to set environment variable uses by Hadoop. Edit ~/.bashrc file and append following values at end of file. $ vim ~/.bashrc export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.242.b08-4.el8.x86_64/jre/ export HADOOP_HOME=/home/hadoop/hadoop export HADOOP_INSTALL=$HADOOP_HOME export HADOOP_MAPRED_HOME=$HADOOP_HOME export HADOOP_COMMON_HOME=$HADOOP_HOME export HADOOP_HDFS_HOME=$HADOOP_HOME export YARN_HOME=$HADOOP_HOME export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin Now apply the changes in the current running environment $ source ~/.bashrc Now edit /home/hadoop/hadoop/etc/hadoop/hadoop-env.sh file and set JAVA_HOME environment variable. Change the JAVA path as per install on your system. This path may vary as per your operating system version and installation source. So make sure you are using correct path. $ vim /home/hadoop/hadoop/etc/hadoop/hadoop-env.sh export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.242.b08-4.el8.x86_64/jre/ 4.2. Setup Hadoop Configuration Files Hadoop has many of configuration files, which need to configure as per requirements of your Hadoop infrastructure. Let’s start with the configuration with basic Hadoop single node cluster setup. first, navigate to below location. $ cd /home/hadoop/hadoop/etc/hadoop $ vim core-site.xml $ vim hdfs-site.xml $ vim mapred-site.xml $ vim yarn-site.xml 4.3. Format Namenode Now format the namenode using the following command, make sure that Storage directory is $ hdfs namenode -format 5. Start Hadoop Cluster Just navigate to your $HADOOP_HOME/sbin directory and execute scripts one by one. $ cd /home/hadoop/hadoop/sbin/ Now run start-dfs.sh script. 
4.3. Format Namenode

Now format the namenode using the following command. Check the output for a "Storage directory ... has been successfully formatted" message to confirm the format succeeded.

$ hdfs namenode -format

5. Start Hadoop Cluster

Navigate to your $HADOOP_HOME/sbin directory and execute the scripts one by one.

$ cd /home/hadoop/hadoop/sbin/

Now run the start-dfs.sh script.

$ ./start-dfs.sh

Now run the start-yarn.sh script.

$ ./start-yarn.sh

6. Access Hadoop Services in Browser

The Hadoop NameNode web UI listens on port 9870 by default. Access your server on port 9870 in your favorite web browser.

http://192.168.1.109:9870/

Now access port 8042 (the NodeManager web UI) to get information about the node and the applications running on it.

http://192.168.1.109:8042/

Access port 9864 (the DataNode web UI) to get details about your Hadoop node.

http://192.168.1.109:9864/

You are done.

====

DNF/Yum Configuration in RHEL 8: https://www.seevid.ir/fa/w/cXqL5lLHM_o

===

Thanks for watching the video. If it helped you, please like it and share it with others as well. Feel free to post your queries and suggestions; we will be glad to answer them. If you like our hard work, subscribe to our channel and turn on the bell notification for the latest updates.

===

Contact Us:
Vikas Nehra's Twitter Handle: http://bit.ly/VikasNehraTwitterHandle
Vikas Nehra's FB Account: https://www.facebook.com/er.vikasnehra/
Vikas Nehra's Instagram Handle: https://www.instagram.com/er.vikasnehra/
Registration Form: http://bit.ly/NehraClassesRegForm
Twitter Handle: http://bit.ly/NehraClassesTwiiterHandle
Facebook Page: www.facebook.com/nehraclasses
Instagram: https://www.instagram.com/nehraclasses/
Telegram Channel: https://t.me/NehraClasses
WhatsApp Us: https://bit.ly/2Kpqp5z
Email Us: [email protected]

===

©COPYRIGHT. ALL RIGHTS RESERVED.