Failure when upgrading Hadoop and HBase to CDH4.3.0

A failure occurred after upgrading Hadoop and HBase to CDH4.3.0

Cloudera published the YUM repository for CDH4.3.0, so I upgraded our Hadoop and HBase installations from CDH4.2.0. Every RPM installed without a problem, but then a failure appeared: HBase data that our batch jobs had been reading just fine under CDH4.2.0 could no longer be retrieved. The log output looked like this:

13/06/21 13:49:59 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.5-cdh4.3.0--1, built on 05/28/2013 02:01 GMT
13/06/21 13:49:59 INFO zookeeper.ZooKeeper: Client environment:host.name=***.********.***
13/06/21 13:49:59 INFO zookeeper.ZooKeeper: Client environment:java.version=1.6.0_37
13/06/21 13:49:59 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Sun Microsystems Inc.
13/06/21 13:49:59 INFO zookeeper.ZooKeeper: Client environment:java.home=/usr/java/jdk1.6.0_37/jre
13/06/21 13:49:59 INFO zookeeper.ZooKeeper: Client environment:java.class.path=/etc/hadoop/conf:/usr/lib/hadoop/lib/jersey-core-1.8.jar:/usr/lib/hadoop/lib/commons-el-1.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/commons-collections-3.2.1.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.cloudera.2.jar:/usr/lib/hadoop/lib/jets3t-0.6.1.jar:/usr/lib/hadoop/lib/commons-logging-1.1.1.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/jasper-compiler-5.5.23.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/stax-api-1.0.1.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/protobuf-java-2.4.0a.jar:/usr/lib/hadoop/lib/mockito-all-1.8.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.6.1.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.8.8.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/junit-4.8.2.jar:/usr/lib/hadoop/lib/zookeeper-3.4.5-cdh4.3.0.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/jackson-xc-1.8.8.jar:/usr/lib/hadoop/lib/jasper-runtime-5.5.23.jar:/usr/lib/hadoop/lib/slf4j-api-1.6.1.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/jsr305-1.3.9.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/jersey-json-1.8.jar:/usr/lib/hadoop/lib/jetty-6.1.26.cloudera.2.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/kfs-0.3.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.8.8.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/commons-math-2.1.jar:/usr/lib/hadoop/lib/jersey-server-1.8.jar:/usr/lib/hadoop/lib/commons-io-2.1.jar:/usr/lib/hadoop/lib/j
line-0.9.94.jar:/usr/lib/hadoop/lib/commons-lang-2.5.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.8.8.jar:/usr/lib/hadoop/.//hadoop-common-2.0.0-cdh4.3.0-tests.jar:/usr/lib/hadoop/.//hadoop-common.jar:/usr/lib/hadoop/.//hadoop-auth-2.0.0-cdh4.3.0.jar:/usr/lib/hadoop/.//hadoop-annotations-2.0.0-cdh4.3.0.jar:/usr/lib/hadoop/.//hadoop-common-2.0.0-cdh4.3.0.jar:/usr/lib/hadoop/.//hadoop-auth.jar:/usr/lib/hadoop/.//hadoop-annotations.jar:/usr/lib/hadoop-hdfs/./:/usr/lib/hadoop-hdfs/lib/jersey-core-1.8.jar:/usr/lib/hadoop-hdfs/lib/commons-el-1.0.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.cloudera.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.1.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/jsp-api-2.1.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.4.0a.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.8.8.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/zookeeper-3.4.5-cdh4.3.0.jar:/usr/lib/hadoop-hdfs/lib/jasper-runtime-5.5.23.jar:/usr/lib/hadoop-hdfs/lib/jsr305-1.3.9.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.cloudera.2.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.3.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.8.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.1.jar:/usr/lib/hadoop-hdfs/lib/jline-0.9.94.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.5.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.8.8.jar:/usr/lib/hadoop-hdfs/.//hadoop-hdfs-2.0.0-cdh4.3.0.jar:/usr/lib/hadoop-hdfs/.//hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/.//hadoop-hdfs-2.0.0-cdh4.3.0-tests.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.8.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-ya
rn/lib/paranamer-2.3.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.4.0a.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.8.8.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/netty-3.2.4.Final.jar:/usr/lib/hadoop-yarn/lib/avro-1.7.4.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.8.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.8.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.1.jar:/usr/lib/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.8.8.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-common-2.0.0-cdh4.3.0.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-server-common-2.0.0-cdh4.3.0.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-2.0.0-cdh4.3.0.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-server-tests-2.0.0-cdh4.3.0-tests.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-client-2.0.0-cdh4.3.0.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-server-tests-2.0.0-cdh4.3.0.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-site-2.0.0-cdh4.3.0.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-site.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-api-2.0.0-cdh4.3.0.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-2.0.0-cdh4.3.0.jar:/usr/l
ib/hadoop-yarn/.//hadoop-yarn-server-nodemanager-2.0.0-cdh4.3.0.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-server-web-proxy-2.0.0-cdh4.3.0.jar:/usr/lib/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-2.0.0-cdh4.3.0.jar:/usr/lib/hadoop-0.20-mapreduce/./:/usr/lib/hadoop-0.20-mapreduce/lib/jersey-core-1.8.jar:/usr/lib/hadoop-0.20-mapreduce/lib/commons-el-1.0.jar:/usr/lib/hadoop-0.20-mapreduce/lib/commons-configuration-1.6.jar:/usr/lib/hadoop-0.20-mapreduce/lib/commons-collections-3.2.1.jar:/usr/lib/hadoop-0.20-mapreduce/lib/jetty-util-6.1.26.cloudera.2.jar:/usr/lib/hadoop-0.20-mapreduce/lib/jets3t-0.6.1.jar:/usr/lib/hadoop-0.20-mapreduce/lib/avro-compiler-1.7.4.jar:/usr/lib/hadoop-0.20-mapreduce/lib/hsqldb-1.8.0.10.jar:/usr/lib/hadoop-0.20-mapreduce/lib/commons-logging-1.1.1.jar:/usr/lib/hadoop-0.20-mapreduce/lib/commons-net-3.1.jar:/usr/lib/hadoop-0.20-mapreduce/lib/commons-cli-1.2.jar:/usr/lib/hadoop-0.20-mapreduce/lib/paranamer-2.3.jar:/usr/lib/hadoop-0.20-mapreduce/lib/jasper-compiler-5.5.23.jar:/usr/lib/hadoop-0.20-mapreduce/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-0.20-mapreduce/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-0.20-mapreduce/lib/stax-api-1.0.1.jar:/usr/lib/hadoop-0.20-mapreduce/lib/jsp-api-2.1.jar:/usr/lib/hadoop-0.20-mapreduce/lib/protobuf-java-2.4.0a.jar:/usr/lib/hadoop-0.20-mapreduce/lib/mockito-all-1.8.5.jar:/usr/lib/hadoop-0.20-mapreduce/lib/jackson-core-asl-1.8.8.jar:/usr/lib/hadoop-0.20-mapreduce/lib/activation-1.1.jar:/usr/lib/hadoop-0.20-mapreduce/lib/xmlenc-0.52.jar:/usr/lib/hadoop-0.20-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-0.20-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-0.20-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-0.20-mapreduce/lib/junit-4.8.2.jar:/usr/lib/hadoop-0.20-mapreduce/lib/zookeeper-3.4.5-cdh4.3.0.jar:/usr/lib/hadoop-0.20-mapreduce/lib/jettison-1.1.jar:/usr/lib/hadoop-0.20-mapreduce/lib/jackson-xc-1.8.8.jar:/usr/lib/hadoop-0.20-mapreduce/lib/jasper-runtime-5.5.23.jar:/usr/lib/hadoop-0.20-mapreduce/lib/h
adoop-fairscheduler-2.0.0-mr1-cdh4.3.0.jar:/usr/lib/hadoop-0.20-mapreduce/lib/slf4j-api-1.6.1.jar:/usr/lib/hadoop-0.20-mapreduce/lib/kfs-0.2.2.jar:/usr/lib/hadoop-0.20-mapreduce/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-0.20-mapreduce/lib/jsr305-1.3.9.jar:/usr/lib/hadoop-0.20-mapreduce/lib/ant-contrib-1.0b3.jar:/usr/lib/hadoop-0.20-mapreduce/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop-0.20-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-0.20-mapreduce/lib/commons-digester-1.8.jar:/usr/lib/hadoop-0.20-mapreduce/lib/jersey-json-1.8.jar:/usr/lib/hadoop-0.20-mapreduce/lib/jetty-6.1.26.cloudera.2.jar:/usr/lib/hadoop-0.20-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-0.20-mapreduce/lib/kfs-0.3.jar:/usr/lib/hadoop-0.20-mapreduce/lib/jackson-jaxrs-1.8.8.jar:/usr/lib/hadoop-0.20-mapreduce/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-0.20-mapreduce/lib/servlet-api-2.5.jar:/usr/lib/hadoop-0.20-mapreduce/lib/commons-math-2.1.jar:/usr/lib/hadoop-0.20-mapreduce/lib/jersey-server-1.8.jar:/usr/lib/hadoop-0.20-mapreduce/lib/commons-io-2.1.jar:/usr/lib/hadoop-0.20-mapreduce/lib/jline-0.9.94.jar:/usr/lib/hadoop-0.20-mapreduce/lib/commons-lang-2.5.jar:/usr/lib/hadoop-0.20-mapreduce/lib/jsch-0.1.42.jar:/usr/lib/hadoop-0.20-mapreduce/lib/guava-11.0.2.jar:/usr/lib/hadoop-0.20-mapreduce/lib/commons-codec-1.4.jar:/usr/lib/hadoop-0.20-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-0.20-mapreduce/lib/jackson-mapper-asl-1.8.8.jar:/usr/lib/hadoop-0.20-mapreduce/.//hadoop-ant-2.0.0-mr1-cdh4.3.0.jar:/usr/lib/hadoop-0.20-mapreduce/.//hadoop-tools-2.0.0-mr1-cdh4.3.0.jar:/usr/lib/hadoop-0.20-mapreduce/.//hadoop-core.jar:/usr/lib/hadoop-0.20-mapreduce/.//hadoop-tools.jar:/usr/lib/hadoop-0.20-mapreduce/.//hadoop-core-2.0.0-mr1-cdh4.3.0.jar:/usr/lib/hadoop-0.20-mapreduce/.//hadoop-test-2.0.0-mr1-cdh4.3.0.jar:/usr/lib/hadoop-0.20-mapreduce/.//hadoop-examples.jar:/usr/lib/hadoop-0.20-mapreduce/.//hadoop-ant.jar:/usr/lib/hadoop-0.20-mapreduce/.//hadoop-test.jar:/usr/lib/hadoop-0.20-mapreduce/.//ha
doop-examples-2.0.0-mr1-cdh4.3.0.jar
13/06/21 13:49:59 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/usr/lib/hadoop/lib/native
13/06/21 13:49:59 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp
13/06/21 13:49:59 INFO zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
13/06/21 13:49:59 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux
13/06/21 13:49:59 INFO zookeeper.ZooKeeper: Client environment:os.arch=amd64
13/06/21 13:49:59 INFO zookeeper.ZooKeeper: Client environment:os.version=2.6.32-279.el6.x86_64
13/06/21 13:49:59 INFO zookeeper.ZooKeeper: Client environment:user.name=hdfs
13/06/21 13:49:59 INFO zookeeper.ZooKeeper: Client environment:user.home=/usr/lib/hadoop
13/06/21 13:49:59 INFO zookeeper.ZooKeeper: Client environment:user.dir=/var/app/analysis_classic
13/06/21 13:49:59 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=h***.********.***:2181 sessionTimeout=180000 watcher=hconnection
13/06/21 13:49:59 INFO zookeeper.RecoverableZooKeeper: The identifier of this process is 25163@***.********.***
13/06/21 13:49:59 INFO zookeeper.ClientCnxn: Opening socket connection to server h***.********.***/***.***.***.50:2181. Will not attempt to authenticate using SASL (unable to locate a login configuration)
13/06/21 13:49:59 INFO zookeeper.ClientCnxn: Socket connection established to h***.********.***/***.***.***.50:2181, initiating session
13/06/21 13:50:00 INFO zookeeper.ClientCnxn: Session establishment complete on server h***.********.***/***.***.***.50:2181, sessionid = 0x13f64f5dd5f0004, negotiated timeout = 40000
13/06/21 13:50:00 WARN conf.Configuration: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=10, exceptions:
Fri Jun 21 13:50:00 JST 2013, org.apache.hadoop.hbase.client.ScannerCallable@50739aa3, java.io.IOException: IPC server unable to read call parameters: Error in readFields
Fri Jun 21 13:50:01 JST 2013, org.apache.hadoop.hbase.client.ScannerCallable@50739aa3, java.io.IOException: IPC server unable to read call parameters: Error in readFields
Fri Jun 21 13:50:02 JST 2013, org.apache.hadoop.hbase.client.ScannerCallable@50739aa3, java.io.IOException: IPC server unable to read call parameters: Error in readFields
Fri Jun 21 13:50:03 JST 2013, org.apache.hadoop.hbase.client.ScannerCallable@50739aa3, java.io.IOException: IPC server unable to read call parameters: Error in readFields
Fri Jun 21 13:50:05 JST 2013, org.apache.hadoop.hbase.client.ScannerCallable@50739aa3, java.io.IOException: IPC server unable to read call parameters: Error in readFields
Fri Jun 21 13:50:07 JST 2013, org.apache.hadoop.hbase.client.ScannerCallable@50739aa3, java.io.IOException: IPC server unable to read call parameters: Error in readFields
Fri Jun 21 13:50:11 JST 2013, org.apache.hadoop.hbase.client.ScannerCallable@50739aa3, java.io.IOException: IPC server unable to read call parameters: Error in readFields
Fri Jun 21 13:50:15 JST 2013, org.apache.hadoop.hbase.client.ScannerCallable@50739aa3, java.io.IOException: IPC server unable to read call parameters: Error in readFields
Fri Jun 21 13:50:23 JST 2013, org.apache.hadoop.hbase.client.ScannerCallable@50739aa3, java.io.IOException: IPC server unable to read call parameters: Error in readFields
Fri Jun 21 13:50:39 JST 2013, org.apache.hadoop.hbase.client.ScannerCallable@50739aa3, java.io.IOException: IPC server unable to read call parameters: Error in readFields

    at org.apache.hadoop.hbase.client.ServerCallable.withRetries(ServerCallable.java:183)
    at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:205)
    at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:120)
    at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:665)
    at org.apache.hadoop.hbase.client.HTablePool$PooledHTable.getScanner(HTablePool.java:378)
    at jp.********.tpas.analysis.common.io.hbase.UserActionClient.scanTimeRangDO(UserActionClient.java:493)
    at jp.********.tpas.analysis.common.io.hbase.UserActionClient.scanAdParamTimeRangDO(UserActionClient.java:472)
    at jp.********.tpas.analysis.tpas.job.hourly.LogSearchCVHBJob$CVSearchExecutor.run(LogSearchCVHBJob.java:165)
    at java.lang.Thread.run(Thread.java:662)
org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=10, exceptions:
Fri Jun 21 13:50:00 JST 2013, org.apache.hadoop.hbase.client.ScannerCallable@7860e390, java.io.IOException: IPC server unable to read call parameters: Error in readFields
Fri Jun 21 13:50:01 JST 2013, org.apache.hadoop.hbase.client.ScannerCallable@7860e390, java.io.IOException: IPC server unable to read call parameters: Error in readFields
Fri Jun 21 13:50:02 JST 2013, org.apache.hadoop.hbase.client.ScannerCallable@7860e390, java.io.IOException: IPC server unable to read call parameters: Error in readFields
Fri Jun 21 13:50:03 JST 2013, org.apache.hadoop.hbase.client.ScannerCallable@7860e390, java.io.IOException: IPC server unable to read call parameters: Error in readFields
Fri Jun 21 13:50:05 JST 2013, org.apache.hadoop.hbase.client.ScannerCallable@7860e390, java.io.IOException: IPC server unable to read call parameters: Error in readFields
Fri Jun 21 13:50:07 JST 2013, org.apache.hadoop.hbase.client.ScannerCallable@7860e390, java.io.IOException: IPC server unable to read call parameters: Error in readFields
Fri Jun 21 13:50:11 JST 2013, org.apache.hadoop.hbase.client.ScannerCallable@7860e390, java.io.IOException: IPC server unable to read call parameters: Error in readFields
Fri Jun 21 13:50:15 JST 2013, org.apache.hadoop.hbase.client.ScannerCallable@7860e390, java.io.IOException: IPC server unable to read call parameters: Error in readFields
Fri Jun 21 13:50:23 JST 2013, org.apache.hadoop.hbase.client.ScannerCallable@7860e390, java.io.IOException: IPC server unable to read call parameters: Error in readFields
Fri Jun 21 13:50:39 JST 2013, org.apache.hadoop.hbase.client.ScannerCallable@7860e390, java.io.IOException: IPC server unable to read call parameters: Error in readFields

    at org.apache.hadoop.hbase.client.ServerCallable.withRetries(ServerCallable.java:183)
    at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:205)
    at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:120)
    at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:665)
    at org.apache.hadoop.hbase.client.HTablePool$PooledHTable.getScanner(HTablePool.java:378)
    at jp.********.tpas.analysis.common.io.hbase.UserActionClient.scanTimeRangDO(UserActionClient.java:493)
    at jp.********.tpas.analysis.common.io.hbase.UserActionClient.scanAdParamTimeRangDO(UserActionClient.java:472)
    at jp.********.tpas.analysis.tpas.job.hourly.LogSearchCVHBJob$CVSearchExecutor.run(LogSearchCVHBJob.java:165)
    at java.lang.Thread.run(Thread.java:662)
13/06/21 13:50:39 INFO hourly.LogSearchCVHBJob: buffersize : 0
org.apache.hadoop.fs.FileAlreadyExistsException: failed to create file /user/hdfs/account/4/20130603/summary/00/cv/part-r-00000 on client ***.***.***.149 because the file exists
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1865)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:1771)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1747)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:418)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:207)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44942)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1002)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1701)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1697)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1695)

    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:90)
    at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:57)
    at org.apache.hadoop.hdfs.DFSOutputStream.<init>(DFSOutputStream.java:1327)
    at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1343)
    at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1255)
    at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1212)
    at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:276)
    at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:265)
    at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:82)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:886)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:867)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:766)
    at jp.********.tpas.analysis.tpas.job.hourly.LogSearchCVHBJob.flush(LogSearchCVHBJob.java:492)
    at jp.********.tpas.analysis.tpas.job.hourly.LogSearchCVHBJob.run(LogSearchCVHBJob.java:474)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at jp.********.tpas.analysis.common.task.HadoopTask.execute(HadoopTask.java:24)
    at jp.fs.toolkit.batch.Executer.main(Executer.java:36)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.fs.FileAlreadyExistsException): failed to create file /user/hdfs/account/4/20130603/summary/00/cv/part-r-00000 on client ***.***.***.149 because the file exists
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1865)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:1771)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1747)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:418)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:207)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44942)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1002)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1701)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1697)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1695)

    at org.apache.hadoop.ipc.Client.call(Client.java:1225)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
    at $Proxy13.create(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
    at $Proxy13.create(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:193)
    at org.apache.hadoop.hdfs.DFSOutputStream.<init>(DFSOutputStream.java:1324)
    ... 20 more
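
The repeated `IPC server unable to read call parameters: Error in readFields` is the kind of error usually seen when client and server are exchanging incompatible serialized requests, for example when nodes end up on mismatched jar versions mid-upgrade. As a hypothetical sanity check (the package and jar names below are assumptions, not taken from this incident), the installed releases on each node can be compared:

```shell
# Hypothetical check: list installed Hadoop/HBase/ZooKeeper packages and
# confirm they all carry the same cdh4.3.0 version suffix on every node.
$ rpm -qa | grep -E 'hadoop|hbase|zookeeper' | sort

# Also confirm the client classpath picks up the upgraded HBase jar:
$ ls /usr/lib/hbase/hbase-*.jar
```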

Recovering HBase

Suspecting that the move from CDH4.2.0 to CDH4.3.0 had left the HBase metadata in an inconsistent state, I ran the following commands:

$ sudo -u hbase hbase org.apache.hadoop.hbase.util.hbck.OfflineMetaRepair
$ sudo -u hbase hbase hbck -fix
$ sudo -u hbase hbase hbck -repair

After running the commands above, data could once again be retrieved from HBase normally. Be careful when upgrading to CDH4.3.0.
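
One extra step worth taking after a repair like this (a suggestion, not part of the original procedure): run hbck again in its default read-only mode and confirm it reports no remaining inconsistencies before restarting the batch jobs.

```shell
# Read-only consistency check; once the repair has taken effect, the
# summary at the end should report 0 inconsistencies.
$ sudo -u hbase hbase hbck
```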