
Hadoop 2.2.0 fails on 64-bit platforms: recompiling Hadoop


Tags: Hadoop, cmake, maven, protobuf

Problem description

On 64-bit Linux, a Hadoop installation keeps hitting warnings like "libhadoop.so.1.0.0 ... which might have disabled stack guard". The cause is that the bundled native library is 32-bit, so Hadoop has to be recompiled by hand.

Hadoop version: 2.2.0. Operating system: Oracle Linux 6.3, 64-bit.

Below is the actual error and the step-by-step fix.

The error

[hadoop@hadoop01 input]$ hadoop dfs -put ./in
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.

Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
13/10/24 04:08:55 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
put: `in': No such file or directory

 

Check the native library file:

[hadoop@hadoop01 input]$ file /app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0
/app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0: ELF 32-bit LSB shared object, Intel 80386, version 1 (SYSV), dynamically linked, not stripped
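To confirm the mismatch quickly, compare the kernel architecture with what `file` reports for the library. A small sketch (the library path is an assumption matching the installation shown above; adjust it to yours):

```shell
#!/bin/sh
# Compare the OS architecture with the architecture of Hadoop's native library.
# LIB is an assumption matching the install path used in this article.
LIB=/app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0

echo "kernel architecture: $(uname -m)"   # x86_64 on a 64-bit OS

if [ -e "$LIB" ]; then
    file "$LIB"
    if uname -m | grep -q 64 && file "$LIB" | grep -q '32-bit'; then
        echo "mismatch: 64-bit OS, 32-bit libhadoop -> recompile Hadoop"
    fi
else
    echo "library not found at $LIB (set LIB to your install path)"
fi
```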

 

This looks like a 32-bit vs. 64-bit mismatch. These threads describe the same symptom:

http://mail-archives.apache.org/mod_mbox/hadoop-user/201208.mbox/%3C19AD42E3F64F0F468A305399D0DF39D92EA4521578@winops07.win.compete.com%3E

http://www.mail-archive.com/common-issues@hadoop.apache.org/msg52576.html

The operating system is 64-bit but the shipped binaries are 32-bit, so the freshly installed cluster cannot use its native libraries as-is.

 

 

Solution: recompile Hadoop

The fix is to rebuild the Hadoop binaries from source:

Download the source code

The build machine needs Internet access: Maven downloads dependencies during the build, so even if the source was fetched elsewhere, the build itself must still be online. If the cluster hosts are offline, do the following on a connected machine of the same platform (a VM is fine) and copy the result back.

 

# svn checkout 'http://svn.apache.org/repos/asf/hadoop/common/tags/release-2.2.0'

 

The source tree after checkout:

[hadoop@hadoop01 hadoop]$ ls

BUILDING.txt hadoop-common-project hadoop-maven-plugins hadoop-tools

dev-support hadoop-dist hadoop-minicluster hadoop-yarn-project

hadoop-assemblies hadoop-hdfs-project hadoop-project pom.xml

hadoop-client hadoop-mapreduce-project hadoop-project-dist

Set up the build environment

1. Required packages

[root@hadoop01 /]# yum install svn

[root@hadoop01 ~]# yum install autoconf automake libtool cmake

[root@hadoop01 ~]# yum install ncurses-devel

[root@hadoop01 ~]# yum install openssl-devel

[root@hadoop01 ~]# yum install gcc*

2. Install Maven

Download and extract it:

http://maven.apache.org/download.cgi

 

[root@hadoop01 stable]# mv apache-maven-3.1.1 /usr/local/

Add /usr/local/apache-maven-3.1.1/bin to the PATH environment variable.
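One common way to do that (the /usr/local path below matches the mv above) is to export it in the shell profile:

```shell
# Put Maven on PATH for the current shell; to persist it, add these two
# lines to /etc/profile or ~/.bashrc. The path matches the mv command above.
export M2_HOME=/usr/local/apache-maven-3.1.1
export PATH="$M2_HOME/bin:$PATH"

# Verify (only if Maven is actually installed at that path):
if command -v mvn >/dev/null 2>&1; then mvn -version; fi
```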

3. Install protobuf

 

Without protobuf installed, the build cannot finish; it fails like this:

[INFO] --- hadoop-maven-plugins:2.2.0:protoc (compile-protoc) @ hadoop-common ---

[WARNING] [protoc, --version] failed: java.io.IOException: Cannot run program "protoc": error=2, No such file or directory

[ERROR] stdout: []

……………………

[INFO] Apache Hadoop Main………………………….. SUCCESS [5.672s]

[INFO] Apache Hadoop Project POM……………………. SUCCESS [3.682s]

[INFO] Apache Hadoop Annotations……………………. SUCCESS [8.921s]

[INFO] Apache Hadoop Assemblies…………………….. SUCCESS [0.676s]

[INFO] Apache Hadoop Project Dist POM……………….. SUCCESS [4.590s]

[INFO] Apache Hadoop Maven Plugins………………….. SUCCESS [9.172s]

[INFO] Apache Hadoop Auth………………………….. SUCCESS [10.123s]

[INFO] Apache Hadoop Auth Examples………………….. SUCCESS [5.170s]

[INFO] Apache Hadoop Common ………………………… FAILURE [1.224s]

[INFO] Apache Hadoop NFS…………………………… SKIPPED

[INFO] Apache Hadoop Common Project…………………. SKIPPED

[INFO] Apache Hadoop HDFS………………………….. SKIPPED

[INFO] Apache Hadoop HttpFS………………………… SKIPPED

[INFO] Apache Hadoop HDFS BookKeeper Journal …………. SKIPPED

[INFO] Apache Hadoop HDFS-NFS………………………. SKIPPED

[INFO] Apache Hadoop HDFS Project…………………… SKIPPED

Installing protobuf

Download: https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz

https://code.google.com/p/protobuf/downloads/list

[root@hadoop01 protobuf-2.5.0]# pwd

/soft/protobuf-2.5.0

Run the following commands in order:

./configure

make

make check

make install

[root@hadoop01 protobuf-2.5.0]# protoc --version

libprotoc 2.5.0
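One pitfall worth noting (an addition, not from the original article): protobuf's `make install` puts its libraries under /usr/local/lib, which on some RHEL-family systems is not on the runtime linker path, so `protoc` can fail with "error while loading shared libraries: libprotobuf.so". If that happens, register the directory and refresh the linker cache:

```shell
# Register /usr/local/lib with the dynamic linker (run as root).
# The file name local-lib.conf is an arbitrary choice.
if [ "$(id -u)" -eq 0 ]; then
    echo /usr/local/lib > /etc/ld.so.conf.d/local-lib.conf
    ldconfig
else
    echo "re-run as root to update the linker configuration"
fi
```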



4. cmake

cmake errors during the native build:

main:
[mkdir] Created dir: /soft/Hadoop/hadoop-tools/hadoop-pipes/target/native
[exec] -- The C compiler identification is GNU
[exec] -- The CXX compiler identification is GNU
[exec] -- Check for working C compiler: /usr/bin/gcc
[exec] -- Check for working C compiler: /usr/bin/gcc -- works
[exec] -- Detecting C compiler ABI info
[exec] -- Detecting C compiler ABI info - done
[exec] -- Check for working CXX compiler: /usr/bin/c++
[exec] -- Check for working CXX compiler: /usr/bin/c++ -- works
[exec] -- Detecting CXX compiler ABI info
[exec] -- Detecting CXX compiler ABI info - done
[exec] CMake Error at /usr/share/cmake/Modules/FindOpenSSL.cmake:66 (MESSAGE):
[exec]   Could NOT find OpenSSL
[exec] Call Stack (most recent call first):
[exec]   CMakeLists.txt:20 (find_package)
[exec]
[exec]
[exec] -- Configuring incomplete, errors occurred!

[INFO] Apache Hadoop Gridmix……………………….. SUCCESS [12.062s]

[INFO] Apache Hadoop Data Join……………………… SUCCESS [8.694s]

[INFO] Apache Hadoop Extras………………………… SUCCESS [6.877s]

[INFO] Apache Hadoop Pipes …………………………. FAILURE [5.295s]

[INFO] Apache Hadoop Tools Dist…………………….. SKIPPED

[INFO] Apache Hadoop Tools…………………………. SKIPPED

[INFO] Apache Hadoop Distribution…………………… SKIPPED

[INFO] Apache Hadoop Client………………………… SKIPPED

[INFO] Apache Hadoop Mini-Cluster…………………… SKIPPED

 

Install the missing packages:

[root@hadoop01 ~]# yum install ncurses-devel

[root@hadoop01 ~]# yum install openssl-devel

 

Build Hadoop

 

[hadoop@hadoop01 hadoop]$ pwd

/soft/hadoop

[hadoop@hadoop01 hadoop]$ ls

BUILDING.txt hadoop-client hadoop-hdfs-project hadoop-minicluster hadoop-tools

dev-support hadoop-common-project hadoop-mapreduce-project hadoop-project hadoop-yarn-project

hadoop-assemblies hadoop-dist hadoop-maven-plugins hadoop-project-dist pom.xml

[hadoop@hadoop01 hadoop]$ mvn package -Pdist,native -DskipTests -Dtar

 

The build is time-consuming (about half an hour on this machine).

 

A successful build ends like this:

[INFO] Reactor Summary:

[INFO]

[INFO] Apache Hadoop Main………………………….. SUCCESS [6.600s]

[INFO] Apache Hadoop Project POM……………………. SUCCESS [3.974s]

[INFO] Apache Hadoop Annotations……………………. SUCCESS [9.878s]

[INFO] Apache Hadoop Assemblies…………………….. SUCCESS [0.856s]

[INFO] Apache Hadoop Project Dist POM……………….. SUCCESS [4.750s]

[INFO] Apache Hadoop Maven Plugins………………….. SUCCESS [8.720s]

[INFO] Apache Hadoop Auth………………………….. SUCCESS [10.107s]

[INFO] Apache Hadoop Auth Examples………………….. SUCCESS [5.734s]

[INFO] Apache Hadoop Common………………………… SUCCESS [4:32.636s]

[INFO] Apache Hadoop NFS…………………………… SUCCESS [29.700s]

[INFO] Apache Hadoop Common Project…………………. SUCCESS [0.090s]

[INFO] Apache Hadoop HDFS………………………….. SUCCESS [6:15.394s]

[INFO] Apache Hadoop HttpFS………………………… SUCCESS [1:09.238s]

[INFO] Apache Hadoop HDFS BookKeeper Journal …………. SUCCESS [27.676s]

[INFO] Apache Hadoop HDFS-NFS………………………. SUCCESS [13.954s]

[INFO] Apache Hadoop HDFS Project…………………… SUCCESS [0.212s]

[INFO] hadoop-yarn………………………………… SUCCESS [0.962s]

[INFO] hadoop-yarn-api…………………………….. SUCCESS [1:48.066s]

[INFO] hadoop-yarn-common………………………….. SUCCESS [1:37.543s]

[INFO] hadoop-yarn-server………………………….. SUCCESS [4.301s]

[INFO] hadoop-yarn-server-common……………………. SUCCESS [29.502s]

[INFO] hadoop-yarn-server-nodemanager……………….. SUCCESS [36.593s]

[INFO] hadoop-yarn-server-web-proxy…………………. SUCCESS [13.273s]

[INFO] hadoop-yarn-server-resourcemanager……………. SUCCESS [30.612s]

[INFO] hadoop-yarn-server-tests…………………….. SUCCESS [4.374s]

[INFO] hadoop-yarn-client………………………….. SUCCESS [14.115s]

[INFO] hadoop-yarn-applications…………………….. SUCCESS [0.218s]

[INFO] hadoop-yarn-applications-distributedshell ……… SUCCESS [9.871s]

[INFO] hadoop-mapreduce-client……………………… SUCCESS [1.095s]

[INFO] hadoop-mapreduce-client-core…………………. SUCCESS [1:30.650s]

[INFO] hadoop-yarn-applications-unmanaged-am-launcher …. SUCCESS [15.089s]

[INFO] hadoop-yarn-site……………………………. SUCCESS [0.637s]

[INFO] hadoop-yarn-project…………………………. SUCCESS [25.809s]

[INFO] hadoop-mapreduce-client-common……………….. SUCCESS [45.919s]

[INFO] hadoop-mapreduce-client-shuffle………………. SUCCESS [14.693s]

[INFO] hadoop-mapreduce-client-app………………….. SUCCESS [39.562s]

[INFO] hadoop-mapreduce-client-hs…………………… SUCCESS [19.299s]

[INFO] hadoop-mapreduce-client-jobclient…………….. SUCCESS [18.549s]

[INFO] hadoop-mapreduce-client-hs-plugins……………. SUCCESS [5.134s]

[INFO] Apache Hadoop MapReduce Examples……………… SUCCESS [17.823s]

[INFO] hadoop-mapreduce……………………………. SUCCESS [12.726s]

[INFO] Apache Hadoop MapReduce Streaming…………….. SUCCESS [19.760s]

[INFO] Apache Hadoop Distributed Copy……………….. SUCCESS [33.332s]

[INFO] Apache Hadoop Archives………………………. SUCCESS [9.522s]

[INFO] Apache Hadoop Rumen…………………………. SUCCESS [15.141s]

[INFO] Apache Hadoop Gridmix……………………….. SUCCESS [15.052s]

[INFO] Apache Hadoop Data Join……………………… SUCCESS [8.621s]

[INFO] Apache Hadoop Extras………………………… SUCCESS [8.744s]

[INFO] Apache Hadoop Pipes…………………………. SUCCESS [28.645s]

[INFO] Apache Hadoop Tools Dist…………………….. SUCCESS [6.238s]

[INFO] Apache Hadoop Tools…………………………. SUCCESS [0.126s]

[INFO] Apache Hadoop Distribution…………………… SUCCESS [1:20.132s]

[INFO] Apache Hadoop Client………………………… SUCCESS [18.820s]

[INFO] Apache Hadoop Mini-Cluster…………………… SUCCESS [2.151s]

[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 29:07.811s
[INFO] Finished at: Thu Oct 24 09:43:18 CST 2013
[INFO] Final Memory: 78M/239M
[INFO] ------------------------------------------------------------------------
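Before rerunning, the freshly built 64-bit native libraries have to replace the 32-bit ones in the existing installation. A sketch of that step (both paths are assumptions based on the layout used in this article; hadoop-dist/target is where `mvn package -Pdist,native` leaves the distribution):

```shell
#!/bin/sh
# Swap the 32-bit native libraries for the freshly built 64-bit ones.
# HADOOP_HOME and DIST are assumptions; adjust to your install and source tree.
HADOOP_HOME=/app/hadoop/hadoop-2.2.0
DIST=/soft/hadoop/hadoop-dist/target/hadoop-2.2.0

if [ -d "$DIST/lib/native" ] && [ -d "$HADOOP_HOME/lib/native" ]; then
    mv "$HADOOP_HOME/lib/native" "$HADOOP_HOME/lib/native.32bit.bak"  # keep a backup
    cp -r "$DIST/lib/native" "$HADOOP_HOME/lib/"
    file "$HADOOP_HOME/lib/native/libhadoop.so.1.0.0"   # expect: ELF 64-bit
else
    echo "set DIST and HADOOP_HOME to your actual paths first"
fi
```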

Run the same command again with the recompiled binaries:

[hadoop@hadoop01 input]$ hadoop dfs -put ./in
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
13/10/24 15:12:53 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
put: `in': No such file or directory

The 64-bit platform problem (the stack guard warning) is gone. One issue remains; see the follow-up article: http://www.linuxidc.com/Linux/2013-11/93081.htm

More Hadoop-related information is collected on the Hadoop topic page: http://www.linuxidc.com/topicnews.aspx?tid=13


Copyright notice: original article by 星锅, published 2022-01-20. Unless otherwise noted, articles on this site are released under the CC 4.0 license; please credit the source when republishing.