beeline connection to hiveserver2 fails: User: root is not allowed to impersonate root

迷南。 2022-05-15 12:28

While setting up an HA cluster in production recently I ran into quite a few pitfalls, which I will write up properly over the coming days; let me start with a big one. Our requirement: access the Hive data warehouse from other languages through Hive's Thrift service. But step one is getting the connection working between the server side (the node running hiveserver2) and the client side (the node running beeline).
The configuration process turned out to be anything but smooth. I first started hiveserver2 on data node 1, then started beeline on data node 3 to connect to node 1, and hit the following error.
The pitfall:

```
Error: Could not open client transport with JDBC Uri: jdbc:hive2://ha1:10000/hive: Failed to open new session: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate root(state=08S01,code=0)
```
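For context, the client side here is just a beeline invocation against the hiveserver2 node. A minimal sketch (the host `ha1` and database `hive` are taken from the error message above; the `jdbc_url` helper is our own illustration, not part of beeline):

```shell
# Build the JDBC URL for hiveserver2 (defaults: port 10000, database "hive").
# jdbc_url is a hypothetical helper used only to show the URL's shape.
jdbc_url() {
  printf 'jdbc:hive2://%s:%s/%s' "$1" "${2:-10000}" "${3:-hive}"
}

jdbc_url ha1   # -> jdbc:hive2://ha1:10000/hive

# On the client node you would then connect with:
#   beeline -u "$(jdbc_url ha1)" -n root
```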


Solution: first, the usual fix found online.

Add the following properties to Hadoop's core-site.xml:

```xml
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>
```

For the error User: root is not allowed to impersonate root(state=08S01,code=0), set the xxx in the hadoop.proxyuser.xxx.hosts and hadoop.proxyuser.xxx.groups properties above to root; that is, use whatever user name appears after "User:" in your error log. The "*" means any user, group, or host may operate on Hadoop through the superuser proxy "xxx". Then restart HDFS.
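To make the "set xxx to the user shown in the log" rule concrete, here is a small shell sketch (our own helper, not a Hadoop tool) that extracts the user name from the error line and emits the two properties to paste into core-site.xml:

```shell
#!/bin/sh
# Extract the "xxx" from the error text, then print the matching
# proxyuser properties. The error line is the one from this article.
err='User: root is not allowed to impersonate root'
user=$(printf '%s\n' "$err" | sed -n 's/^User: \([^ ]*\) is not allowed.*/\1/p')

cat <<EOF
<property>
  <name>hadoop.proxyuser.${user}.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.${user}.groups</name>
  <value>*</value>
</property>
EOF
```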
Why this change works:
Hadoop has an impersonation (proxy-user) mechanism: it does not allow an upper-layer system to pass the end user straight through to the Hadoop layer. Instead, the end user is handed to a superuser proxy, and that proxy performs the operations on Hadoop on the user's behalf, which prevents arbitrary clients from manipulating Hadoop at will.
In the canonical example of this mechanism the superuser proxy is "Oozie"; in our setup, the superuser proxy is the "xxx" configured above.
Internally, Hadoop still reuses the corresponding Linux users and permissions: whichever Linux user starts Hadoop becomes the corresponding internal Hadoop user. My Linux user is root, so the corresponding Hadoop user is also root.
If only it were that simple. After the change the error persisted; the HA cluster on my own VMs could now connect normally, but the production cluster still could not. After struggling for the better part of a day, I tried the various fixes found online, such as refreshing the superuser/proxy-user configuration for HDFS and YARN:

```shell
bin/hdfs dfsadmin -refreshSuperUserGroupsConfiguration
bin/yarn rmadmin -refreshSuperUserGroupsConfiguration
```

Neither had any effect. Eventually, a careful comparison of the logs from the VM cluster (which connected normally) and from the failing production cluster revealed some clues.
The VM cluster's log (normal startup):

```
2018-08-29T10:22:11,661 INFO [main] metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is MYSQL
2018-08-29T10:22:11,665 INFO [main] metastore.ObjectStore: Initialized ObjectStore
2018-08-29T10:22:11,813 INFO [main] metastore.HiveMetaStore: Added admin role in metastore
2018-08-29T10:22:11,814 INFO [main] metastore.HiveMetaStore: Added public role in metastore
2018-08-29T10:22:11,834 INFO [main] metastore.HiveMetaStore: No user is added in admin role, since config is empty
2018-08-29T10:22:12,032 INFO [main] metastore.HiveMetaStore: 0: get_all_functions
2018-08-29T10:22:12,036 INFO [main] HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=get_all_functions
2018-08-29T10:22:13,841 INFO [main] session.SessionState: Created HDFS directory: /tmp/hive/root/ca0849bf-7b5f-44fb-af7e-ebcdfe04d13f
2018-08-29T10:22:13,895 INFO [main] session.SessionState: Created local directory: /tmp/root/ca0849bf-7b5f-44fb-af7e-ebcdfe04d13f
2018-08-29T10:22:13,908 INFO [main] session.SessionState: Created HDFS directory: /tmp/hive/root/ca0849bf-7b5f-44fb-af7e-ebcdfe04d13f/_tmp_space.db
2018-08-29T10:22:13,936 INFO [main] sqlstd.SQLStdHiveAccessController: Created SQLStdHiveAccessController for session context : HiveAuthzSessionContext [sessionString=ca0849bf-7b5f-44fb-af7e-ebcdfe04d13f, clientType=HIVESERVER2]
2018-08-29T10:22:13,938 WARN [main] session.SessionState: METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory.
2018-08-29T10:22:13,940 INFO [main] hive.metastore: Mestastore configuration hive.metastore.filter.hook changed from org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl to org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook
2018-08-29T10:22:13,996 INFO [main] metastore.HiveMetaStore: 0: Cleaning up thread local RawStore...
2018-08-29T10:22:13,996 INFO [main] HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=Cleaning up thread local RawStore...
2018-08-29T10:22:13,996 INFO [main] metastore.HiveMetaStore: 0: Done cleaning up thread local RawStore
2018-08-29T10:22:13,997 INFO [main] HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=Done cleaning up thread local RawStore
2018-08-29T10:22:14,606 INFO [main] service.CompositeService: Operation log root directory is created: /tmp/root/operation_logs
2018-08-29T10:22:14,618 INFO [main] service.CompositeService: HiveServer2: Background operation thread pool size: 100
2018-08-29T10:22:14,624 INFO [main] service.CompositeService: HiveServer2: Background operation thread wait queue size: 100
2018-08-29T10:22:14,624 INFO [main] service.CompositeService: HiveServer2: Background operation thread keepalive time: 10 seconds
2018-08-29T10:22:14,653 INFO [main] service.AbstractService: Service:OperationManager is inited.
2018-08-29T10:22:14,653 INFO [main] service.AbstractService: Service:SessionManager is inited.
2018-08-29T10:22:14,653 INFO [main] service.AbstractService: Service:CLIService is inited.
2018-08-29T10:22:14,653 INFO [main] service.AbstractService: Service:ThriftBinaryCLIService is inited.
2018-08-29T10:22:14,653 INFO [main] service.AbstractService: Service:HiveServer2 is inited.
2018-08-29T10:22:14,654 INFO [main] server.HiveServer2: Starting Web UI on port 10002
2018-08-29T10:22:14,903 INFO [main] service.AbstractService: Service:OperationManager is started.
2018-08-29T10:22:14,903 INFO [main] service.AbstractService: Service:SessionManager is started.
2018-08-29T10:22:14,915 INFO [main] service.AbstractService: Service:CLIService is started.
2018-08-29T10:22:14,916 INFO [main] service.AbstractService: Service:ThriftBinaryCLIService is started.
2018-08-29T10:22:14,916 INFO [main] service.AbstractService: Service:HiveServer2 is started.
2018-08-29T10:22:14,918 INFO [main] server.Server: jetty-7.6.0.v20120127
2018-08-29T10:22:15,054 INFO [main] webapp.WebInfConfiguration: Extract jar:file:/root/apps/hive-2.1.1/lib/hive-service-2.1.1.jar!/hive-webapps/hiveserver2/ to /tmp/jetty-0.0.0.0-10002-hiveserver2-_-any-/webapp
2018-08-29T10:22:15,090 INFO [Thread-11] thrift.ThriftCLIService: Starting ThriftBinaryCLIService on port 10000 with 5...500 worker threads
2018-08-29T10:22:15,413 INFO [main] handler.ContextHandler: started o.e.j.w.WebAppContext{/,file:/tmp/jetty-0.0.0.0-10002-hiveserver2-_-any-/webapp/},jar:file:/root/apps/hive-2.1.1/lib/hive-service-2.1.1.jar!/hive-webapps/hiveserver2
2018-08-29T10:22:15,583 INFO [main] handler.ContextHandler: started o.e.j.s.ServletContextHandler{/static,jar:file:/root/apps/hive-2.1.1/lib/hive-service-2.1.1.jar!/hive-webapps/static}
2018-08-29T10:22:15,594 INFO [main] handler.ContextHandler: started o.e.j.s.ServletContextHandler{/logs,file:/root/apps/hive-2.1.1/logs/}
2018-08-29T10:22:15,645 INFO [main] server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:10002
2018-08-29T10:22:15,660 INFO [main] http.HttpServer: Started HttpServer[hiveserver2] on port 10002
2018-08-29T10:22:15,660 INFO [main] server.HiveServer2: Web UI has started on port 10002
2018-08-29T10:26:04,804 INFO [HiveServer2-Handler-Pool: Thread-39] thrift.ThriftCLIService: Client protocol version: HIVE_CLI_SERVICE_PROTOCOL_V9
2018-08-29T10:26:05,732 INFO [HiveServer2-Handler-Pool: Thread-39] session.SessionState: Created HDFS directory: /tmp/hive/root/f575ff1d-8cfa-4d94-beb8-bd7365a5bada
2018-08-29T10:26:05,735 INFO [HiveServer2-Handler-Pool: Thread-39] session.SessionState: Created local directory: /tmp/root/f575ff1d-8cfa-4d94-beb8-bd7365a5bada
2018-08-29T10:26:05,745 INFO [HiveServer2-Handler-Pool: Thread-39] session.SessionState: Created HDFS directory: /tmp/hive/root/f575ff1d-8cfa-4d94-beb8-bd7365a5bada/_tmp_space.db
2018-08-29T10:26:05,748 INFO [HiveServer2-Handler-Pool: Thread-39] session.HiveSessionImpl: Operation log session directory is created: /tmp/root/operation_logs/f575ff1d-8cfa-4d94-beb8-bd7365a5bada
2018-08-29T10:26:05,749 INFO [HiveServer2-Handler-Pool: Thread-39] service.CompositeService: Session opened, SessionHandle [f575ff1d-8cfa-4d94-beb8-bd7365a5bada], current sessions:1
2018-08-29T10:26:05,864 INFO [HiveServer2-Handler-Pool: Thread-39] conf.HiveConf: Using the default value passed in for log id: f575ff1d-8cfa-4d94-beb8-bd7365a5bada
2018-08-29T10:26:05,864 INFO [HiveServer2-Handler-Pool: Thread-39] session.SessionState: Updating thread name to f575ff1d-8cfa-4d94-beb8-bd7365a5bada HiveServer2-Handler-Pool: Thread-39
2018-08-29T10:26:05,865 INFO [f575ff1d-8cfa-4d94-beb8-bd7365a5bada HiveServer2-Handler-Pool: Thread-39] conf.HiveConf: Using the default value passed in for log id: f575ff1d-8cfa-4d94-beb8-bd7365a5bada
2018-08-29T10:26:05,865 INFO [HiveServer2-Handler-Pool: Thread-39] session.SessionState: Resetting thread name to HiveServer2-Handler-Pool: Thread-39
2018-08-29T10:26:05,883 INFO [HiveServer2-Handler-Pool: Thread-39] conf.HiveConf: Using the default value passed in for log id: f575ff1d-8cfa-4d94-beb8-bd7365a5bada
2018-08-29T10:26:05,883 INFO [f575ff1d-8cfa-4d94-beb8-bd7365a5bada HiveServer2-Handler-Pool: Thread-39] conf.HiveConf: Using the default value passed in for log id: f575ff1d-8cfa-4d94-beb8-bd7365a5bada
2018-08-29T10:26:05,883 INFO [HiveServer2-Handler-Pool: Thread-39] session.SessionState: Updating thread name to f575ff1d-8cfa-4d94-beb8-bd7365a5bada HiveServer2-Handler-Pool: Thread-39
2018-08-29T10:26:05,883 INFO [HiveServer2-Handler-Pool: Thread-39] session.SessionState: Resetting thread name to HiveServer2-Handler-Pool: Thread-39
```

The log from the failed startup:

```
2018-08-29T09:40:46,117 INFO [main] metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is MYSQL
2018-08-29T09:40:46,119 INFO [main] metastore.ObjectStore: Initialized ObjectStore
2018-08-29T09:40:46,215 INFO [main] metastore.HiveMetaStore: Added admin role in metastore
2018-08-29T09:40:46,217 INFO [main] metastore.HiveMetaStore: Added public role in metastore
2018-08-29T09:40:46,231 INFO [main] metastore.HiveMetaStore: No user is added in admin role, since config is empty
2018-08-29T09:40:46,368 INFO [main] metastore.HiveMetaStore: 0: get_all_functions
2018-08-29T09:40:46,371 INFO [main] HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=get_all_functions
2018-08-29T09:40:47,422 INFO [main] metastore.HiveMetaStore: 0: Cleaning up thread local RawStore...
2018-08-29T09:40:47,423 INFO [main] metastore.HiveMetaStore: 0: Done cleaning up thread local RawStore
2018-08-29T09:40:47,825 INFO [main] service.AbstractService: Service:CLIService is inited.
2018-08-29T09:40:47,826 INFO [main] server.HiveServer2: Starting Web UI on port 10002
2018-08-29T09:40:47,906 INFO [main] service.AbstractService: Service:OperationManager is started.
2018-08-29T09:40:47,906 INFO [main] service.AbstractService: Service:SessionManager is started.
2018-08-29T09:40:47,907 INFO [main] service.AbstractService: Service:CLIService is started.
2018-08-29T09:40:47,907 INFO [main] service.AbstractService: Service:ThriftBinaryCLIService is started.
2018-08-29T09:40:47,908 INFO [main] service.AbstractService: Service:HiveServer2 is started.
2018-08-29T09:40:47,910 INFO [main] server.Server: jetty-7.6.0.v20120127
2018-08-29T09:40:48,102 INFO [main] server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:10002
2018-08-29T09:40:48,105 INFO [main] server.HiveServer2: Web UI has started on port 10002
2018-08-29T09:40:48,105 INFO [main] http.HttpServer: Started HttpServer[hiveserver2] on port 10002
2018-08-29T09:41:03,343 WARN [HiveServer2-Handler-Pool: Thread-43] service.CompositeService: Failed to open session
    at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656)
    at org.apache.hive.service.cli.session.SessionManager.createSession(SessionManager.java:327)
    at org.apache.hive.service.cli.session.SessionManager.openSession(SessionManager.java:279)
    at org.apache.hive.service.cli.CLIService.openSessionWithImpersonation(CLIService.java:189)
    at org.apache.hive.service.cli.thrift.ThriftCLIService.getSessionHandle(ThriftCLIService.java:423)
    at org.apache.hive.service.cli.thrift.ThriftCLIService.OpenSession(ThriftCLIService.java:312)
    at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1377)
    at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1362)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
    at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:591)
    at org.apache.hadoop.ipc.Client.call(Client.java:1469)
    at org.apache.hadoop.ipc.Client.call(Client.java:1400)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
    at com.sun.proxy.$Proxy30.getFileInfo(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy31.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1977)
    at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1118)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1114)
    at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1400)
    at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:689)
    at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:635)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:563)
    ... 28 more
2018-08-29T09:41:03,350 WARN [HiveServer2-Handler-Pool: Thread-43] thrift.ThriftCLIService: Error opening session:
    at org.apache.hive.service.cli.session.SessionManager.createSession(SessionManager.java:336)
    at org.apache.hive.service.cli.session.SessionManager.openSession(SessionManager.java:279)
    at org.apache.hive.service.cli.CLIService.openSessionWithImpersonation(CLIService.java:189)
    at org.apache.hive.service.cli.thrift.ThriftCLIService.getSessionHandle(ThriftCLIService.java:423)
    at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1362)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
    at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate root
    at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:89)
    at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
    at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656)
    at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
    at com.sun.proxy.$Proxy37.open(Unknown Source)
    at org.apache.hive.service.cli.session.SessionManager.createSession(SessionManager.java:327)
    ... 13 more
Caused by: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate root
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:591)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:526)
    at org.apache.hive.service.cli.session.HiveSessionImpl.open(HiveSessionImpl.java:168)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
    ... 21 more
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate root
    at org.apache.hadoop.ipc.Client.call(Client.java:1469)
    at org.apache.hadoop.ipc.Client.call(Client.java:1400)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
    at com.sun.proxy.$Proxy30.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:752)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
```

The final comparison showed that the successful log contains operations on certain HDFS directories (the "Created HDFS directory" lines), while the failing log has none.

So I guessed it was a user-permission problem (not knowing the internal mechanics, guessing was all I could do). I then changed the permissions on the corresponding /tmp directory in HDFS:

```shell
hadoop fs -chmod -R 777 /tmp
```

Still no luck.
By this point I was thoroughly worn down, but the problem had to be solved, otherwise my colleague could not get on with her work. After a deep comparison of the two clusters' states (VM vs. production), I suddenly noticed that in production, the namenode on the node running hiveserver2 was in standby state. (An HA cluster has two namenodes, one active and one standby; the standby node cannot operate on HDFS, not even for reads, since only one of the two namenodes controls HDFS at a time.) In the VM cluster, the node running hiveserver2 had the active namenode. Feeling hopeful again, I killed the active namenode in production; the standby namenode then became active and regained the ability to operate on HDFS. After that, the connection finally succeeded.
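For reference, HDFS HA also provides haadmin commands to inspect and switch namenode states, which is gentler than killing the active namenode. A sketch (nn1/nn2 are hypothetical serviceIds from dfs.ha.namenodes.&lt;nameservice&gt; in hdfs-site.xml; the helper only prints the commands so it can be shown without a live cluster):

```shell
# Print the haadmin commands to check each namenode's state, plus a
# controlled failover as an alternative to killing the active namenode.
check_cmds() {
  for sid in "$@"; do
    printf 'hdfs haadmin -getServiceState %s\n' "$sid"
  done
  printf 'hdfs haadmin -failover %s %s\n' "$1" "$2"
}

check_cmds nn1 nn2
```

On a real cluster you would run the printed commands directly; `hdfs haadmin -getServiceState nn1` reports either active or standby.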

Summary:

In hindsight, the whole thing comes down to two steps: configure the superuser proxy user correctly (the first fix), and make sure the namenode on the node that starts hiveserver2 (call it ha1) is in active state. That way you can operate on HDFS as ha1's root user in the Hadoop environment, and if a permission problem still appears you can simply adjust the access permissions of the files involved. When the other namenode is active and ha1 is standby, ha1's root user cannot access HDFS at all, which is why the startup log never showed the corresponding HDFS directories being created.

Some other useful references:
https://blog.csdn.net/sunnyyoona/article/details/51648871
http://debugo.com/beeline-invalid-url/
https://blog.csdn.net/yunyexiangfeng/article/details/60867563
