Handling Hadoop's "Too many open files" exception
Today a job on our Hadoop cluster failed. The error message was as follows:
```
2013-10-26 08:00:03,229 ERROR server.TThreadPoolServer (TThreadPoolServer.java:run(182)) - Error occurred during processing of message.
    at org.apache.hadoop.hive.service.HiveServer$ThriftHiveProcessorFactory.getProcessor(HiveServer.java:553)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:169)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:662)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:277)
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.init(HiveServer.java:136)
    at org.apache.hadoop.hive.service.HiveServer$ThriftHiveProcessorFactory.getProcessor(HiveServer.java:550)
    ... 4 more
    at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:199)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:272)
    ... 6 more
Caused by: java.lang.RuntimeException: java.io.FileNotFoundException: /home/hadoop/hadoop-0.20.205.0/conf/mapred-site.xml (Too many open files)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1231)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:1093)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:1037)
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:438)
    at org.apache.hadoop.hive.conf.HiveConf.setVar(HiveConf.java:762)
    at org.apache.hadoop.hive.conf.HiveConf.setVar(HiveConf.java:770)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:169)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:662)
Caused by: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: java.io.FileNotFoundException: /home/hadoop/hadoop-0.20.205.0/conf/core-site.xml (Too many open files)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:277)
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.init(HiveServer.java:136)
    at org.apache.hadoop.hive.service.HiveServer$ThriftHiveProcessorFactory.getProcessor(HiveServer.java:550)
    ... 4 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: java.io.FileNotFoundException: /home/hadoop/hadoop-0.20.205.0/conf/core-site.xml (Too many open files)
    at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:199)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:272)
    ... 6 more
```
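Before raising any limits, it is worth confirming that the Hive server process really has exhausted its file-descriptor quota. A minimal check from the shell, assuming the server can be located with `pgrep` (the `HiveServer` process pattern below is an assumption; adjust it to your deployment):

```sh
# Soft and hard open-file limits in effect for this shell
ulimit -Sn
ulimit -Hn

# Locate the Hive server process (the pattern is an assumption)
HIVE_PID=$(pgrep -f HiveServer | head -n 1)

# Count the descriptors the process currently holds open
ls /proc/"$HIVE_PID"/fd | wc -l

# Alternatively, list them with lsof to see what they point at
lsof -p "$HIVE_PID" | wc -l
```

If the count sits at or near the soft limit (commonly 1024 by default), the `FileNotFoundException ... (Too many open files)` above is the expected symptom: the JVM cannot open even its own configuration files such as `mapred-site.xml`.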
Solution on a Debian system:
```sh
ulimit -HSn 32768
```
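Note that `ulimit -HSn 32768` raises both the hard (`-H`) and soft (`-S`) limits, but only for the current shell and processes started from it, so the Hive server has to be restarted from that shell for the change to take effect. To make the limit persist across logins and reboots, the usual approach on Debian is an entry in `/etc/security/limits.conf` (the `hadoop` user name below is an assumption based on the paths in the trace):

```sh
# /etc/security/limits.conf -- raise open-file limits for the hadoop user
hadoop  soft  nofile  32768
hadoop  hard  nofile  32768
```

The new values are applied by `pam_limits` the next time that user logs in.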