Hadoop HDP 2.6.2 installation issue: “libtirpc-devel required”

 

This blog covers the “libtirpc-devel required” problem which I recently encountered while installing Hadoop (flavour: Hortonworks HDP 2.6.2) in my current enterprise environment. I am happy to share the solution and the other technical details I learnt while resolving the issue. So let’s dive in.

#echo “My first date with the issue”

I faced this error while installing Hortonworks HDP 2.6.2 on my on-premises Red Hat Enterprise Linux 7.1 (Maipo) machine, and below are some snapshots of the error messages which give a glimpse of the problem.
This problem has been faced by many customers while installing HDP 2.6.x and later versions on Red Hat Linux 7.x and later.
There is a good chance that you will also encounter these error messages if you are doing the installation (HDP 2.6+) using the Hortonworks docs provided online. But nothing to worry about, as there is a nice place to find answers to all your HDP Hadoop queries, i.e. the Hortonworks Community.

#echo “Show me the screen capture”

[Screen capture of the installation error]

#echo “Show me the error message”

Error: Package: hadoop_2_6_2_0_205-hdfs-2.7.3.2.6.2.0-205.x86_64 (HDP-2.6)
Requires: libtirpc-devel
You could try using --skip-broken to work around the problem
You could try running: rpm -Va --nofiles --nodigest
2017-10-31 18:19:46,945 - Failed to install package hadoop_2_6_2_0_205-client. Executing '/usr/bin/yum clean metadata'
2017-10-31 18:19:51,330 - Retrying to install package hadoop_2_6_2_0_205-client after 30 seconds

 

#echo “Let’s go behind the scenes and dig a little deeper”

The Scope: Installation of HDP 2.6.0 or higher versions on Red Hat/Oracle Linux 7.0 or higher.

The Reason: It’s due to the absence of the libtirpc-devel package on your Linux host machine.

The Internals: Based on the choices a user makes in the Ambari GUI console, various packages are installed. One of the services you will typically select is HDFS, which implements the basic components required for the Hadoop Distributed File System.

The HDFS package has a dependency on libtirpc-devel, which causes the installation to fail if the package is not already present on the host machine. We will explore this package at a later stage of this blog to understand why Hadoop needs it.
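
If you want to see this dependency for yourself, yum can list what the HDFS package requires. This assumes the HDP 2.6.2 repository is already configured on the host, and the exact package name may differ with your HDP build number:

#yum deplist hadoop_2_6_2_0_205-hdfs | grep -i tirpc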

By now you might have already guessed the solution to this problem. But before jumping to the solution, let’s go one more level down to learn why we even need this package and how to install it on Red Hat Linux.

 #echo “What is libtirpc-devel and why do I need this for Hadoop”

This package contains SunLib’s implementation of transport-independent RPC (TI-RPC). The library forms a piece of the base of Open Network Computing (ONC). TI-RPC is an enhanced version of the older transport-specific TS-RPC. RPC stands for Remote Procedure Call, which is widely used in distributed computing and client-server style architecture implementations.
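
If you are curious about the package itself, yum can show its summary and which repository provides it (the output will vary with your RHEL release and subscriptions):

#yum info libtirpc-devel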

#echo “What the heck is TI-RPC!!!”

TI-RPC is a powerful technique for constructing distributed, client-server based applications. It is based on extending the notion of conventional, or local, procedure calling so that the called procedure need not exist in the same address space as the calling procedure. The two processes might be on the same system, or they might be on different systems with a network connecting them.

By using RPC, programmers of distributed applications avoid the details of the interface with the network. The transport independence of RPC isolates the application from the physical and logical elements of the data communications mechanism and enables the application to use a variety of transports.
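
As a quick way to see RPC in action on a Linux box, the rpcinfo utility lists the RPC programs registered with the local rpcbind service (this assumes rpcbind is installed and running, which is typical on a RHEL 7 host):

#rpcinfo -p localhost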

[Figure: how an RPC call works]

 

To cut a long story short: as we know, HDFS is a distributed file system, and this distributed storage layer of Hadoop internally uses RPC calls and methods to carry out the actions performed by clients, for example file creation. To implement TI-RPC, Hadoop relies on this package, which installs the supporting library needed to make those calls.
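
For instance, once the cluster is up, even a trivial client action like the one below is translated into a create RPC from the client to the NameNode (the user and path here are purely illustrative):

#su - hdfs -c "hdfs dfs -touchz /tmp/rpc_demo.txt"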

The Solution: So, the solution is pretty simple: just install the package on the hosts where you are planning to install your Ambari server and agents. If you are in the middle of the installation and stuck at the step with this error, install the package and retry the setup of HDP Hadoop via the Ambari console.

 #echo “How do I install the package:”

  1. On Red Hat you can find this package in the optional repository (i.e. rhel-7-server-optional-rpms), so to install the package we first have to enable the optional repository. If it is already enabled, then you are good to go with the package installation.

         Please note: To run these commands you will need root or sudo access on your host server.

Commands for the terminal:

  • To check if the repository is enabled or not:

#yum repolist


  • To check the available subscription repositories at your site

#subscription-manager repos |grep optional


  • To enable the subscription to optional repositories

#subscription-manager repos --enable=rhel-7-server-optional-rpms


  • To verify subscription to optional repositories

#yum repolist


  • Install the package on the host:
    #yum install libtirpc-devel* -y
  • Verify the package installation:
    #yum list installed libtirpc*
  • Once the package is installed, you can go back to the Ambari UI and resume your installation.
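
If you have more than a couple of agent hosts, repeating these steps by hand gets tedious. As a rough sketch (the host names below are placeholders and this assumes password-less root SSH from the node you run it on), the same enable-and-install can be pushed to each host in one go:

#for host in node1.example.com node2.example.com node3.example.com; do ssh root@$host "subscription-manager repos --enable=rhel-7-server-optional-rpms && yum install -y libtirpc-devel"; done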

 


Author: SandeepKumarSaini

#TechLover #SASConsultant #Hadooper #Bigdata #Cloud #Linux #OpenSourceLove
