How to contribute a limited/specific amount of storage as a slave to the Hadoop cluster? πŸ’» πŸ’» πŸ’»

Kaushal Soni
5 min read · Oct 21, 2020

Hola Guys ! 🀩 🀩 🀩

Hope you are all well and excited to learn more about the Hadoop cluster. So, to answer that question, I created this blog. πŸ˜„ πŸ˜„ πŸ˜„

β€œThis blog is for those who want to know how to create a partition and then use that partition for a Hadoop cluster DataNode.”

So, let’s explore the technical stuff together. 🀝 🀝 🀝

Prerequisites πŸ˜’ πŸ˜’ πŸ˜’

  1. Hadoop should already be installed on the system.
  2. Java should also be installed on the system.
  3. There should be a cluster with the NameNode already configured; we will create only a DataNode with limited storage.

To contribute a limited/specific amount of storage as a slave to the Hadoop cluster, we follow these steps: πŸ‘‡ πŸ‘‡ πŸ‘‡

  1. Create the partition
  2. Format the partition
  3. Mount the partition
  4. Configure the Hadoop DataNode
  5. Verify the setup (optional)

So, let’s perform the above steps one by one. πŸ‘£ πŸ‘£ πŸ‘£

How to Create a Partition πŸ“— πŸ“— πŸ“—

NOTE: This article covers the setup for Linux only. πŸ”’πŸ”’πŸ”’

If you are using a virtual machine, create a virtual disk as per your requirement; here I created a 10 GiB disk. (Otherwise, skip this step.)

Creation of Virtual Volume in Oracle VM VirtualBox πŸ“¦ πŸ“¦ πŸ“¦

πŸ“Œ To create the volume, we have to go to the virtual machine’s Settings. βš™οΈ

πŸ“Œ Now go to Storage and create a new volume. πŸ”¨

πŸ“Œ Now click Next through the wizard to create the volume, as shown in the screenshots. βŒ›οΈβŒ›οΈβŒ›οΈ

Steps to create a virtual volume

πŸ“Œ Now our volume is successfully created. πŸ’―πŸ’―πŸ’―

Now boot the virtual machine.

Creating the Partition with the fdisk Command β™₯️β™₯️β™₯️

Now we create the partition with the fdisk command. But first, let’s list the existing partitions.

Command: β€œfdisk -l” or β€œlsblk”

Output :

fdisk command

Now we create the partition with the command:

β€œfdisk /dev/sdb”

Output :

Partition Creation

Now we check the partitions again using the β€œlsblk” command.

Successfully Created

As the partition is successfully created, we now move ahead and format it. βœ”οΈβœ”οΈβœ”οΈ

Format the Partition πŸ–‹πŸ–‹πŸ–‹

To format the partition, we first choose a filesystem and then format accordingly. (Here I use the ext4 format.)

Command

mkfs.ext4 /dev/sdb1

Output :

Now we have successfully formatted the partition. The next step is to mount the drive. βœ”οΈβœ”οΈβœ”οΈ
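As an extra sanity check that the format really succeeded (not shown in the original article), you can look for the ext superblock magic number 0xEF53 at byte offset 1080. The sketch below formats a throwaway image file so it runs without root; on the real system you would inspect /dev/sdb1 instead.

```shell
# Make a 64 MiB file and format it as ext4 (-F skips the "not a block device" prompt).
truncate -s 64M /tmp/demo-fs.img
mkfs.ext4 -q -F /tmp/demo-fs.img

# The ext superblock starts at offset 1024; its magic field sits 56 bytes in,
# so bytes 1080-1081 should be 0x53 0xEF (little-endian 0xEF53).
dd if=/tmp/demo-fs.img bs=1 skip=1080 count=2 2>/dev/null | od -An -tx1
# expected output: 53 ef
```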

Mount the Drive with a Directory πŸ”—πŸ”—πŸ”—

First, we list all mounted filesystems with the β€œdf -h” command.

Mounting the drive is an easy task; it takes a single command. But before that, we have to create a directory on which to mount the drive.

So, first create the directory and then mount the partition using the following commands:

Command:

mkdir /PathToDirectory

mount /dev/sdb1 /PathToDirectory

Output :

Now that the disk is successfully mounted, we verify with the β€œdf -h” command. βœ”οΈβœ”οΈβœ”οΈ
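One thing the mount command does not give you is persistence: the mount disappears on reboot. To make it permanent, a common approach (my addition, not covered in the original steps) is to add a line like the following to /etc/fstab, where the device and directory mirror the examples above; adjust them to your paths:

```
/dev/sdb1    /PathToDirectory    ext4    defaults    0 0
```

After editing fstab, running β€œmount -a” applies the entry without a reboot.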

Now we configure our DataNode πŸ”₯πŸ”₯πŸ”₯

Configuration of Datanode ⚑️⚑️⚑️

As Hadoop and Java are already installed, we now have to follow these steps:

1. Configure the files:

hdfs-site.xml

core-site.xml

2. Start the Datanode

3. Check Hadoop report (to verify setup)

So, first we configure the files as follows : πŸ“‹πŸ“‹πŸ“‹

command :

cd /etc/hadoop

vi hdfs-site.xml (and set up the file as shown in the screenshot)

vi core-site.xml (and set up the file as shown in the screenshot)

ScreenShots :

Commands

Files Screenshots :

Files Configuration Code
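Since the screenshots do not reproduce well in text, here is what the two files typically contain for this kind of setup. The directory path and the NameNode address below are placeholders of my own, not values from the original screenshots; substitute your mount point and your NameNode’s IP and port. (On Hadoop 1.x the properties are dfs.data.dir and fs.default.name as shown; on 2.x/3.x the preferred names are dfs.datanode.data.dir and fs.defaultFS.)

```xml
<!-- hdfs-site.xml : store DataNode blocks on the mounted partition -->
<configuration>
  <property>
    <name>dfs.data.dir</name>
    <value>/PathToDirectory</value>
  </property>
</configuration>
```

```xml
<!-- core-site.xml : point this DataNode at the NameNode -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://NameNode_IP:9001</value>
  </property>
</configuration>
```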

Hence, the files are successfully configured.

Now, we start the DataNode with πŸ”ŽπŸ”ŽπŸ”Ž

Command :

hadoop-daemon.sh start datanode

jps (to check whether the DataNode has started)

(On Hadoop 3.x, hadoop-daemon.sh is deprecated; the equivalent is β€œhdfs --daemon start datanode”.)

Finally, we check the DataNode report. πŸ“œπŸ“œπŸ“œ

Command :

hadoop dfsadmin -report

(On newer Hadoop versions, the preferred form is β€œhdfs dfsadmin -report”.)

Hadoop Cluster Report

Hence, we have successfully completed our task. 🎯🎯🎯

That’s all ! ⭐️⭐️⭐️

So, now it’s time to say goodbye. We will meet again soon in my upcoming blog; until then, be happy and safe. πŸ€— πŸ€— πŸ€—

If you liked my blog and want more like it, follow me on Medium. πŸ‘πŸ‘πŸ‘

In the upcoming days I am going to publish lots of articles on cloud computing technologies and many case studies, so definitely follow me on Medium. πŸ‘€πŸ‘€πŸ‘€

Here is my LinkedIn profile link, and if you have any queries, definitely comment. ✍️✍️✍️


Kaushal Soni

YouTuber || Instructor & Teacher || Technical Content Writer || MLOps || DevOps || Hybrid Multi Cloud || Flutter || Python || Arth Learner || Technology Lover