Posted: 2021-07-01 10:21:17
Today, a job on our Hadoop cluster failed. The error message was as follows:
2013-10-26 08:00:03,229 ERROR server.TThreadPoolServer (TThreadPoolServer.java:run(182)) - Error occurred during processing of message.
    at org.apache.hadoop.hive.service.HiveServer$ThriftHiveProcessorFactory.getProcessor(HiveServer.java:553)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:169)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:662)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:277)
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.init(HiveServer.java:136)
    at org.apache.hadoop.hive.service.HiveServer$ThriftHiveProcessorFactory.getProcessor(HiveServer.java:550)
    ... 4 more
    at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:199)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:272)
    ... 6 more
Caused by: java.lang.RuntimeException: java.io.FileNotFoundException: /home/hadoop/hadoop-0.20.205.0/conf/mapred-site.xml (Too many open files)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1231)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:1093)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:1037)
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:438)
    at org.apache.hadoop.hive.conf.HiveConf.setVar(HiveConf.java:762)
    at org.apache.hadoop.hive.conf.HiveConf.setVar(HiveConf.java:770)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:169)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:662)
Caused by: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: java.io.FileNotFoundException: /home/hadoop/hadoop-0.20.205.0/conf/core-site.xml (Too many open files)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:277)
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.init(HiveServer.java:136)
    at org.apache.hadoop.hive.service.HiveServer$ThriftHiveProcessorFactory.getProcessor(HiveServer.java:550)
    ... 4 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: java.io.FileNotFoundException: /home/hadoop/hadoop-0.20.205.0/conf/core-site.xml (Too many open files)
    at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:199)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:272)
    ... 6 more
The root cause is the "Too many open files" in the FileNotFoundException: the HiveServer process hit its per-process file-descriptor limit, so it could not even open its own configuration files. The fix was to raise the open-file limit:

ulimit -HSn 32768
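A minimal sketch of how one might diagnose and apply this fix. The 32768 value comes from the post above; the `lsof`/`pgrep` diagnostic and the `hadoop` user name in the persistent-config comment are assumptions, not part of the original:

```shell
# Show the current soft and hard limits on open file descriptors
ulimit -Sn
ulimit -Hn

# Raise both limits for the current session, as in the post above.
# Raising the hard limit above its current value requires root, so this
# may fail for an unprivileged user on a tightly limited system.
ulimit -HSn 32768 2>/dev/null || echo "could not raise limit (insufficient privileges?)"

# Optionally, count descriptors held by the HiveServer JVM to confirm the
# pressure (lsof availability and the process name are assumptions here):
# lsof -p "$(pgrep -f HiveServer | head -n 1)" | wc -l
```

Note that `ulimit` only affects the current shell and its children; to make the limit survive logins, a common approach is an entry such as `hadoop - nofile 32768` in `/etc/security/limits.conf`, applied to the user that runs the Hive/Hadoop daemons.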
Original article: "Handling the Hadoop too many files exception"; thanks to the original author for sharing.