From Techopedia:
Hadoop Common refers to the collection of common utilities and libraries that support the other Hadoop modules. It is a core module of the Apache Hadoop framework, alongside the Hadoop Distributed File System (HDFS), Hadoop YARN, and Hadoop MapReduce.
Like the other modules, Hadoop Common assumes that hardware failures are common and that they should be handled automatically in software by the Hadoop framework.
Hadoop Common is also known as Hadoop Core.
The Hadoop client libraries help you load data into the cluster, submit MapReduce jobs describing how that data should be processed, and then retrieve or view the results once a job has completed; a minimal sketch follows below. Take a look at the article for more detail.
This Apache link contains a list of the dependencies of the Hadoop client library.
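Here is a minimal sketch in Java of what using the Hadoop client for the first and last steps looks like; the NameNode address (hdfs://namenode:8020) and the input/output paths are hypothetical placeholders, not values from this answer.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsClientSketch {
        public static void main(String[] args) throws Exception {
            // Configuration picks up core-site.xml etc. from the classpath;
            // fs.defaultFS is set here only as a hypothetical example.
            Configuration conf = new Configuration();
            conf.set("fs.defaultFS", "hdfs://namenode:8020"); // placeholder host

            FileSystem fs = FileSystem.get(conf);

            // Load local data into the cluster.
            fs.copyFromLocalFile(new Path("/tmp/input.txt"),
                                 new Path("/user/hadoop/input/input.txt"));

            // After a MapReduce job has written its output, view the results.
            for (FileStatus status : fs.listStatus(new Path("/user/hadoop/output"))) {
                System.out.println(status.getPath());
            }

            fs.close();
        }
    }

Submitting the MapReduce job itself goes through the org.apache.hadoop.mapreduce.Job API, which is pulled in by the same client dependency set listed at the Apache link above.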
Ravindra babu