Problem description: after creating an HDFS native table, queries work normally, but one day later they start failing with "starlet err Open hdfs file xxx.dat". Checking on the machine, the file does exist.

Error in the logs

Storage volume creation statement:

CREATE STORAGE VOLUME def_volume
TYPE = HDFS
LOCATIONS = ("xxx.db")
PROPERTIES
(
    "enable" = "true",
    "hadoop.security.authentication" = "kerberos",
    "hadoop.security.kerberos.ticket.cache.path" = "/tmp/krb5cc_999"
);

Current line of investigation: HDFS authentication, because the query fails exactly one day after creation, and the Kerberos ticket cache happens to be valid for one day. Could someone please take a look?
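That theory is consistent with how the Hadoop client handles Kerberos. When hadoop.security.kerberos.ticket.cache.path is set, the client only reads the TGT from that cache file and holds no key material of its own, so once the cached ticket expires (commonly after 24 hours) new HDFS opens start failing even though the file exists. A common fix is to refresh the cache externally (a periodic kinit from a keytab), or to authenticate from a keytab so the client can re-login by itself. Below is a minimal, hypothetical Java sketch of the keytab approach using Hadoop's UserGroupInformation; the principal and keytab path are placeholders, and this is not StarRocks' own code:

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosKeytabLoginSketch {
    public static void main(String[] args) throws IOException {
        // Enable Kerberos authentication for the Hadoop client.
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // Keytab login: the client holds the key material itself, so it can
        // re-acquire a TGT after the initial ticket lifetime runs out.
        // Principal and keytab path are placeholders, not real values.
        UserGroupInformation.loginUserFromKeytab(
                "starrocks/host1.example.com@EXAMPLE.COM",
                "/etc/security/keytabs/starrocks.keytab");

        // A long-running process should call this periodically; it re-logs-in
        // from the keytab when the TGT is close to expiry.
        UserGroupInformation.getLoginUser().checkTGTAndReloginFromKeytab();
    }
}

If only the ticket-cache path can be configured, the equivalent workaround is to keep /tmp/krb5cc_999 fresh, e.g. renewing it from a keytab on a schedule shorter than the ticket lifetime.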
Check cn.out/be.out or jni.log to see if there is any useful log output.