Running a Spark program on Windows, I hit a "/tmp/hive" permission error. After granting permissions from a DOS prompt, it then reported "ChangeFileModeByMask error". None of the solutions I found online actually worked; after much trial and error I finally fixed it, so I'm sharing the experience here.
Running a Spark offline job on Windows 10, the following exception was thrown:
Exception in thread "main" org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwxrwx---;
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
at org.apache.spark.sql.internal.SharedState.globalTempViewManager$lzycompute(SharedState.scala:141)
...
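This excerpt cuts off before the author's actual fix. For readers hitting the same pair of errors, a commonly reported workaround looks like the following. Assumptions (not confirmed by the post): `HADOOP_HOME` points at a Hadoop distribution whose `winutils.exe` and `hadoop.dll` match your Hadoop version, and `\tmp\hive` is the drive-relative scratch path from the error message.

```bat
:: A minimal sketch of the commonly cited workaround, run from an
:: Administrator cmd prompt on the same drive as \tmp\hive.

:: 1. Make sure the scratch dir actually exists on this drive;
::    "ChangeFileModeByMask error (3)" often just means the path
::    was not found (e.g. you are on C: but Spark used D:\tmp\hive).
mkdir \tmp\hive

:: 2. Grant it the permissions Hive expects (rwxrwxrwx):
%HADOOP_HOME%\bin\winutils.exe chmod -R 777 \tmp\hive

:: 3. If chmod still fails, a mismatched or missing hadoop.dll is a
::    frequent cause; copying the one matching your winutils.exe into
::    System32 is a commonly suggested remedy.
copy %HADOOP_HOME%\bin\hadoop.dll C:\Windows\System32\

:: 4. Verify the permissions took effect:
%HADOOP_HOME%\bin\winutils.exe ls \tmp\hive
```

If `winutils.exe ls` now shows `drwxrwxrwx`, rerunning the Spark job should get past the `AnalysisException`. Note that this is a sketch of the fix most often reported for this error combination, not necessarily the exact steps the author used.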
Published: 2024-01-30 02:56:54
Original link: https://www.4u4v.net/it/170655461718736.html