Hive

刘超 · 16 days ago

  

hdp (Apache / CDH, 2.5)
ParseException line 2:38 cannot recognize input near 'as' '"是否使用"' ',' in selection target (see the sketch after this list)
FAILED: ParseException line 2:38 character '是' not supported here line 2:39 character '否' not supported here
The maximum path component name limit of job_1547433444922_21811-1550227000794-root-create+table+temp.tmp
Could not open input file for reading
Chinese characters in COMMENTs come out garbled
A SQL statement gets no response for a long time after it is submitted
SemanticException Line 1:17 Invalid path
org.apache.hadoop.hive.serde2.typeinfo.StructTypeInfo cannot be cast to org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo
Error while compiling statement: FAILED: UDFArgumentException explode() takes an array or a map as a parameter
Principal [name=hive, type=USER] does not have following privileges for operation LOAD [ADMIN]
MetaException(message:Version information not found in metastore.)
FAILED: SemanticException Range based Window Frame can have only 1 Sort Key
Only COMPLETE mode supported for row_number function
Cannot construct key, missing provided value: actual_flag
Unable to instantiate class with 1 arguments org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator
Too many fetch failures. Failing the attempt
Adding a new advindustry column to a Hive table with a default value of 1
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
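
The two ParseException entries at the top of this list come from giving a column a Chinese alias in double quotes, which the parser rejects. A minimal sketch of the usual workaround, with a hypothetical table and column; backticks mark a quoted identifier in Hive (0.13+):

    -- fails with the ParseException above:
    --   SELECT used_flag AS "是否使用" FROM user_stats;
    -- backtick-quoted aliases parse (table/column names are hypothetical):
    SELECT used_flag AS `是否使用`
    FROM user_stats;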
Data types
java.lang.ClassCastException: org.apache.hadoop.hive.serde2.io.HiveDecimalWritable cannot be cast to org.apache.hadoop.io.DoubleWritable (state=,code=0)
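
The ClassCastException above typically appears when a function that expects DOUBLE input is fed a DECIMAL column. A minimal sketch, assuming a hypothetical orders table with a price DECIMAL(10,2) column; an explicit cast sidesteps the HiveDecimalWritable-to-DoubleWritable cast:

    -- hypothetical schema: orders(price DECIMAL(10,2));
    -- some UDAFs (e.g. histogram_numeric) expect DOUBLE and choke on DECIMAL:
    --   SELECT histogram_numeric(price, 10) FROM orders;
    -- casting explicitly avoids the writable cast error:
    SELECT histogram_numeric(CAST(price AS DOUBLE), 10) FROM orders;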
beeline
Setting the Tez job name when connecting to Hive through beeline (needs further study)
Specifying the YARN queue in beeline when the execution engine is Tez (see the sketch after this list)
Passing parameters to beeline with --hiveconf throws Exception in thread "main" java.lang.NullPointerException
Permission denied: user=anonymous, access=WRITE, inode="/user/anonymous":mapred:mapred:drwxr-xr-x  
After changing a Hive table's LOCATION: File hdfs://opera/apps/hive/warehouse/adx_app.db/r_day_stat_video/day=20191211 does not exist
 Unauthorized connection for super-user: hive from IP 172.17.28.16 (state=,code=0)  
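
For the queue entry above: when the execution engine is Tez, the YARN queue is read from tez.queue.name rather than mapreduce.job.queuename, so it can be set inside the beeline session. A minimal sketch (the queue and table names are hypothetical):

    -- inside beeline, before submitting the query:
    SET tez.queue.name=etl_queue;   -- hypothetical queue name
    SELECT COUNT(*) FROM some_db.some_table;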
orc
org.apache.hadoop.hive.ql.exec.mr.MapRedTask. GC overhead limit exceeded
Failed with exception java.io.IOException:java.io.IOException: ORC does not support type conversion from file type array (21) to reader type string (21)
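
The ORC conversion error above says the reader schema (string) no longer matches the type the files were written with (array), which usually follows an incompatible ALTER TABLE ... CHANGE COLUMN. A hedged sketch with hypothetical names; restoring the written type lets the old files be read again:

    -- assumption: files were written while tags was array<string>,
    -- and the table schema was later changed to string, breaking reads.
    -- putting the written type back makes the old files readable:
    ALTER TABLE logs CHANGE COLUMN tags tags array<string>;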
parquet
Data written to Hive from Spark in Parquet format reads back as NULL
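
For the NULL-on-read entry above, one commonly reported cause is that Spark's newer Parquet encoding for DECIMAL columns is not understood by some Hive readers, which then return NULL. A sketch of the commonly suggested Spark-side switch, run in spark-sql before the write (table names are hypothetical; only relevant when DECIMAL columns are involved):

    -- in spark-sql: write decimals in the legacy, Hive-compatible layout
    SET spark.sql.parquet.writeLegacyFormat=true;
    INSERT OVERWRITE TABLE dw.t_metrics
    SELECT * FROM staging.t_metrics;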
llap
Skipping START of Hive Server Interactive since LLAP app couldn't be STARTED.
transform
Using TRANSFORM reports No such file or directory
Using TRANSFORM, the results contain NULL
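
Both transform entries above usually trace back to script handling: the script has to be shipped with ADD FILE and invoked by its base name, otherwise tasks report No such file or directory; NULL columns in the result usually mean the script's tab-separated stdout does not line up with the declared output columns. A minimal sketch with a hypothetical script:

    -- ship the (hypothetical) script to every task node:
    ADD FILE /local/path/parse_line.py;
    -- invoke by base name; stdout must be tab-separated uid, dur pairs:
    SELECT TRANSFORM (raw_line)
           USING 'python parse_line.py'
           AS (uid STRING, dur INT)
    FROM call_logs;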
tez
  Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
TezSession has already shutdown. (see the sketch after this list)
Vertex vertex_1535783866149_221416_3_00 [Map 1] killed/failed due to:OWN_TASK_FAILURE (needs further study)
Reducer_2: ShuffleRunner failed with error org.apache.tez.runtime.library.common.shuffle.orderedgrouped.Shuffle$ShuffleError: error in shuffle in fetcher {Reducer_2} #17
FAILED: Execution Error, return code -101 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask. DEFAULT_MR_AM_ADMIN_USER_ENV  
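
For the TezSession has already shutdown entry above: the root cause is usually Tez AM memory or session timeout settings, but while that is being diagnosed a common per-session stopgap is to rerun the statement on the MapReduce engine:

    -- temporary fallback for this session only; fix the Tez AM config for real:
    SET hive.execution.engine=mr;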
cdh  
The health test result for HIVEMETASTORE_CANARY_HEALTH has become bad: The Hive metastore canary failed to create a database
Used together with other components
spark
Spark reading a Hive table whose column names start with a digit
Relative path in absolute URI:file:E:/Workspace/Eclipse_Daimler_bak/Daimler/spark-warehouse
Spark querying Hive throws cannot resolve '`advIndustry`' given input columns
org.apache.spark.SparkException: Dynamic partition strict mode requires at least one static partition column. To turn this off set hive.exec.dynamic.partition.mode=nonstrict (see the sketch after this list)
Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_SERVER_ADDRESS
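
The Dynamic partition strict mode entry above carries its own fix in the message; a minimal sketch with hypothetical table and partition names:

    -- allow fully dynamic partitioning for this session:
    SET hive.exec.dynamic.partition=true;
    SET hive.exec.dynamic.partition.mode=nonstrict;
    -- the trailing select column feeds the dynamic partition key:
    INSERT OVERWRITE TABLE dw.events PARTITION (day)
    SELECT uid, action, day FROM staging.events;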
kerberos
 
kettle: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Invalid option setting in ticket re
hbase
Creating an HBase external table from Hive drives up the ZooKeeper connection count
sqoop
Analysts report problems with call records (a field delimiter issue)
--hive-overwrite fails, reporting that files cannot be moved
hdfs
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): Unauthorized connection for super-user: hive from IP 172.17.28.16 (state=,code=0)
Operating system
mac
Hive SQL cannot be pasted into zsh on a Mac
Which way of deduplicating in Hive is better, and why? (see the sketch below)
After syncing a Hive table between two clusters, the record counts do not match
COMMENT 'from deserializer'
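
For the deduplication question above: DISTINCT and GROUP BY both drop whole-row duplicates and generally compile to similar plans, while row_number() is the option that can keep one chosen row per key. A sketch with hypothetical names:

    -- whole-row dedup (GROUP BY uid, day is equivalent here):
    SELECT DISTINCT uid, day FROM events;

    -- keep exactly one row per key, e.g. the latest by ts:
    SELECT uid, day, ts
    FROM (
      SELECT uid, day, ts,
             row_number() OVER (PARTITION BY uid ORDER BY ts DESC) AS rn
      FROM events
    ) t
    WHERE rn = 1;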

