
A Complete Guide to Setup Hadoop on EC2 Instance

Aug 6, 2024

Using Hadoop on Amazon EC2 instances offers several benefits, including scalability, cost-efficiency, and flexibility. This article walks through the steps to set up Hadoop on an EC2 instance. As part of our AWS Support Services, we assist our customers with several AWS EC2 queries.

Overview
  1. Hadoop Setup on Amazon EC2 Instance: An Introduction
  2. Why Set Up Hadoop on an Amazon EC2 Instance?
  3. Steps to Set Up Hadoop on an Amazon EC2 Instance
  4. Important Considerations
  5. Conclusion

Hadoop Setup on Amazon EC2 Instance: An Introduction

AWS EC2: Amazon EC2 (Elastic Compute Cloud) instances are virtual machines within the Amazon Web Services (AWS) cloud that offer scalable processing capacity. They let users provision compute resources and store and run applications without having to spend money on physical hardware.

Apache Hadoop: Apache Hadoop is an open-source framework for processing large datasets across clusters of computers. It allows users to distribute data and computation across many servers using simple programming models. Hadoop has three main components:

1. HDFS (Hadoop Distributed File System)
2. MapReduce
3. YARN (Yet Another Resource Negotiator)

Some of the main features of Hadoop are as follows:

  • Open-Source: Free to use, with source code available for modification.
  • Cost-Effective: Uses inexpensive commodity hardware compared to traditional databases.
  • High Scalability: Easily scales by adding more nodes or upgrading existing ones.
  • Fault Tolerance: Data is replicated across nodes, ensuring availability even if one fails.
  • Fast Data Processing: Uses data locality to reduce network traffic by processing data where it is stored.
  • Flexibility: Handles structured, semi-structured, and unstructured data of any size.

Why Set Up Hadoop on an Amazon EC2 Instance?

The following are the primary reasons to use Hadoop on AWS EC2:

1. Cost-Effectiveness: Rather than requiring pricey dedicated hardware, Hadoop can run on affordable EC2 instances, and costs can be optimized by scaling EC2 capacity up or down as needed.

2. Scalability: Hadoop clusters running on EC2 are simple to grow or shrink by adding or removing nodes, so the cluster can be sized to process large datasets as needed.

3. Flexibility: EC2 offers a large selection of instance types, so customers can choose the ideal combination of compute, memory, and storage for their Hadoop workloads. This flexibility is crucial for Hadoop.

4. Integration with AWS Services: Hadoop running on EC2 can be linked with S3 storage, CloudWatch monitoring, and Identity and Access Management (IAM) security services, providing a complete big data processing platform.

5. Fault Tolerance: Although individual EC2 instances can fail, Hadoop’s built-in fault tolerance, achieved by replicating data between nodes, keeps data and processing highly available.

By setting up a Hadoop cluster on EC2 instances, we can quickly build and maintain a scalable big data processing environment in the cloud.

Steps to Set Up Hadoop on an Amazon EC2 Instance


1. Launch EC2 Instance

1. Log in to the AWS Management Console.

2. Navigate to the EC2 service and click on Launch Instance.

3. Choose the latest Ubuntu LTS AMI from the Quick Start section.

4. Select an appropriate instance type based on the workload’s needs.

5. Enable Auto-assign Public IP for the subnet.

6. Optionally, add additional EBS volumes for larger installations.

7. Configure the security group to allow necessary inbound and outbound traffic.

8. Create a new key pair or use an existing one for secure access.

9. Click Launch to start the instance.

10. Allocate an Elastic IP address and associate it with the instance for a static IP. (A CLI sketch of the whole launch follows this list.)
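
For repeatable setups, the same launch can be scripted with the AWS CLI. The sketch below is a minimal example; every ID and name in it is a placeholder, not a value from this guide:

# Launch an Ubuntu instance with a public IP (all IDs are placeholders)
aws ec2 run-instances \
  --image-id ami-0123456789abcdef0 \
  --instance-type t2.large \
  --key-name my-hadoop-key \
  --security-group-ids sg-0123456789abcdef0 \
  --subnet-id subnet-0123456789abcdef0 \
  --associate-public-ip-address

# Allocate an Elastic IP and attach it to the new instance
aws ec2 allocate-address --domain vpc
aws ec2 associate-address \
  --instance-id i-0123456789abcdef0 \
  --allocation-id eipalloc-0123456789abcdef0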

2. Connect to the Instance

1. Adjust permissions for the private key file:

chmod 400 <private-key.pem>

2. Connect using SSH:

ssh -i <private-key.pem> ubuntu@<instance-public-ip>

3. Update System Packages

Update the package lists and upgrade the installed packages to prepare the instance for the Hadoop setup:

sudo apt update
sudo apt upgrade

4. Change Hostname

1. Edit the hostname file:

sudo nano /etc/hostname

2. Reboot the instance:

sudo reboot
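
As an aside, on Ubuntu releases with systemd the hostname can also be changed in one step with hostnamectl (the hostname value here is just an example), though a reboot is still the simplest way to make every service pick it up:

sudo hostnamectl set-hostname hadoop-master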

5. Install OpenJDK

Install OpenJDK using the package manager:

sudo apt install openjdk-8-jdk
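
To confirm the installation and find the JDK path that the JAVA_HOME variable in step 7 points at, check the version and resolve the java binary (the path below is typical for amd64 instances):

java -version
readlink -f $(which java)
# e.g. /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java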

6. Install Hadoop

Download the Hadoop tarball (the URL placeholder stands for the download link of the chosen release) and extract it to /usr/local. The archive unpacks into a versioned directory, so rename it to the /usr/local/hadoop path that the environment variables in the next step expect:

wget <hadoop-download-URL> -P ~/Downloads
sudo tar zxvf ~/Downloads/hadoop-* -C /usr/local
sudo mv /usr/local/hadoop-* /usr/local/hadoop
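
As a concrete illustration (the release number is an assumption; pick whichever version is needed), the download might look like:

wget https://archive.apache.org/dist/hadoop/common/hadoop-2.10.2/hadoop-2.10.2.tar.gz -P ~/Downloads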

7. Set Up Environment Variables

1. Open the .bashrc file:

nano ~/.bashrc

2. Add the following lines:

export HADOOP_HOME=/usr/local/hadoop
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin

3. Apply changes:

source ~/.bashrc

4. Edit hadoop-env.sh:

sudo nano $HADOOP_CONF_DIR/hadoop-env.sh

5. Update these variables:

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
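
With the variables set and hadoop-env.sh updated, the shell should now resolve the Hadoop binaries; a quick sanity check:

hadoop version
# prints the installed Hadoop release and build details if the paths are correct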

8. Configure Hadoop

1. core-site.xml: add the following property inside the <configuration> element, replacing the placeholder with the NameNode’s hostname or private DNS name:

<property>
  <name>fs.defaultFS</name>
  <value>hdfs://<namenode-hostname>:9000</value>
</property>

2. yarn-site.xml: add these properties inside the <configuration> element, pointing yarn.resourcemanager.hostname at the ResourceManager host:

<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle</value>
</property>
<property>
  <name>yarn.resourcemanager.hostname</name>
  <value><resourcemanager-hostname></value>
</property>

3. mapred-site.xml: create it from the template, then add these properties inside the <configuration> element:

sudo cp $HADOOP_CONF_DIR/mapred-site.xml.template $HADOOP_CONF_DIR/mapred-site.xml

<property>
  <name>mapreduce.jobtracker.address</name>
  <value><namenode-hostname>:54311</value>
</property>
<property>
  <name>mapreduce.framework.name</name>
  <value>yarn</value>
</property>

4. hdfs-site.xml: add these properties inside the <configuration> element (a replication factor of 1 suits a single-node cluster):

<property>
  <name>dfs.replication</name>
  <value>1</value>
</property>
<property>
  <name>dfs.namenode.name.dir</name>
  <value>file:///usr/local/hadoop/data/hdfs/namenode</value>
</property>
<property>
  <name>dfs.datanode.data.dir</name>
  <value>file:///usr/local/hadoop/data/hdfs/datanode</value>
</property>
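
After editing the four files, it is worth confirming that each one is still well-formed XML. One way to do this, assuming the libxml2-utils package that provides xmllint:

sudo apt install libxml2-utils
xmllint --noout $HADOOP_CONF_DIR/core-site.xml $HADOOP_CONF_DIR/hdfs-site.xml $HADOOP_CONF_DIR/yarn-site.xml $HADOOP_CONF_DIR/mapred-site.xml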

9. Create Directories for Hadoop

Create the NameNode and DataNode directories referenced in hdfs-site.xml, and give the ubuntu user ownership, to complete the Hadoop setup on the EC2 instance:

sudo mkdir -p $HADOOP_HOME/data/hdfs/namenode
sudo mkdir -p $HADOOP_HOME/data/hdfs/datanode
sudo chown -R ubuntu $HADOOP_HOME

10. Start Hadoop Cluster

1. Format the HDFS NameNode:

hdfs namenode -format

2. Start HDFS and YARN services:

$HADOOP_HOME/sbin/start-dfs.sh
$HADOOP_HOME/sbin/start-yarn.sh
$HADOOP_HOME/sbin/mr-jobhistory-daemon.sh start historyserver
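
For completeness, the matching shutdown sequence, used when the cluster needs to be stopped later, reverses the start order:

$HADOOP_HOME/sbin/mr-jobhistory-daemon.sh stop historyserver
$HADOOP_HOME/sbin/stop-yarn.sh
$HADOOP_HOME/sbin/stop-dfs.sh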

11. Verify Java Processes

Use jps to ensure all Hadoop processes are running:

jps
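
On a healthy single-node cluster, jps typically reports the daemons below (the process IDs will differ):

12345 NameNode
12346 DataNode
12347 SecondaryNameNode
12348 ResourceManager
12349 NodeManager
12350 JobHistoryServer
12351 Jps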

12. Access Hadoop Web UI

Namenode Overview: http://<namenode-public-dns>:50070

Cluster Metrics Overview: http://<resourcemanager-public-dns>:8088

(On Hadoop 3.x, the NameNode web UI moved from port 50070 to 9870.)

Important Considerations

1. The NameNode URL and other properties in the core-site.xml file must be specified correctly. Errors in this file might prevent the Hadoop cluster from starting up correctly.

2. For SSH to function, the EC2 instances’ private key must have the appropriate permissions (such as chmod 400 key.pem).

3. If the legacy Hadoop EC2 launch scripts are used, the AWS access key ID and secret access key must be entered correctly in the environment script (hadoop-ec2-env.sh). If the credentials are incorrect, the cluster won’t launch.

4. Choosing a suitable AMI (Amazon Machine Image) with the preferred Hadoop version pre-installed is crucial. Issues may arise if an incompatible AMI is used.

5. The cluster may not be able to function if security groups are improperly configured and block essential ports (such as 8020 or 9000 for NameNode and HDFS traffic). A CLI sketch for opening the relevant ports follows this list.

6. For HDFS to function, the EC2 instances must have enough disk space allotted to them. Problems may arise from inadequate space.

7. In order for Hadoop to function, environment variables like JAVA_HOME and HADOOP_HOME must be specified correctly.
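
As a sketch for point 5, the required inbound rules can be added with the AWS CLI; the security group ID and CIDR range below are placeholders, not values from this guide:

# Open SSH, NameNode RPC, NameNode web UI, and YARN web UI ports
for port in 22 9000 50070 8088; do
  aws ec2 authorize-security-group-ingress \
    --group-id sg-0123456789abcdef0 \
    --protocol tcp --port "$port" \
    --cidr 203.0.113.0/24
done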


Conclusion

In conclusion, incorrect configuration files, permissions, credentials, AMI selection, networking, and environment variables are the main causes of problems. Most issues can be avoided by carefully following the setup steps and double-checking these details. Following this method from our Techs, we can set up and manage a single-node Hadoop cluster on an EC2 instance.
