How to Check a Table's Storage Size in Hive


1. Viewing table information in Hive

Method 1: show the table's columns

desc table_name;

Method 2: show the table's columns plus extended metadata, including the storage path

desc extended table_name;

Method 3: show the table's columns plus nicely formatted metadata, including the storage path

desc formatted table_name;

Note: when you need the table's storage path, method 3 is recommended; its output is much easier to read.

hive> desc parquet;
OK
member_id               string
name                    string
stat_date               string
province                string
add_item                string                  add new item comment
Time taken: 0.096 seconds, Fetched: 5 row(s)

hive> desc extended parquet;
OK
member_id               string
name                    string
stat_date               string
province                string
add_item                string                  add new item comment
Detailed Table Information      Table(tableName:parquet, dbName:yyz_workdb, owner:a6, createTime:1510023589, lastAccessTime:0, retention:0, sd:StorageDescriptor(cols:[FieldSchema(name:member_id, type:string, comment:null), FieldSchema(name:name, type:string, comment:null), FieldSchema(name:stat_date, type:string, comment:null), FieldSchema(name:province, type:string, comment:null), FieldSchema(name:add_item, type:string, comment:add new item comment )], location:hdfs://localhost:9002/user/hive/warehouse/yyz_workdb.db/parquet, inputFormat:org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat, outputFormat:org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat, compressed:false, numBuckets:-1, serdeInfo:SerDeInfo(name:null, serializationLib:org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe, parameters:{serialization.format= , field.delim=
Time taken: 0.15 seconds, Fetched: 7 row(s)

hive> desc formatted parquet;
OK
# col_name              data_type               comment
member_id               string
name                    string
stat_date               string
province                string
add_item                string                  add new item comment

# Detailed Table Information
Database:               yyz_workdb
Owner:                  a6
CreateTime:             Tue Nov 07 10:59:49 CST 2017
LastAccessTime:         UNKNOWN
Retention:              0
Location:               hdfs://localhost:9002/user/hive/warehouse/yyz_workdb.db/parquet
Table Type:             MANAGED_TABLE
Table Parameters:
        COLUMN_STATS_ACCURATE   {\"BASIC_STATS\":\"true\"}
        last_modified_by        a6
        last_modified_time      1526612655
        numFiles                1
        numRows                 5
        rawDataSize             20
        totalSize               792
        transient_lastDdlTime   1526612655

# Storage Information
SerDe Library:          org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe
InputFormat:            org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat
OutputFormat:           org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat
Compressed:             No
Num Buckets:            -1
Bucket Columns:         []
Sort Columns:           []
Storage Desc Params:
        field.delim             \t
        serialization.format    \t
Time taken: 0.101 seconds, Fetched: 37 row(s)
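If the table's statistics are current, the totalSize parameter in the desc formatted output above already answers the size question. As a rough sketch (assuming the yyz_workdb.parquet table from the example, and that the hive CLI is on the PATH), you can pull that one value out non-interactively:

```shell
# "hive -e" runs one statement and exits; grep/awk then pick out the
# totalSize value from the "Table Parameters" section of the output.
hive -e "desc formatted yyz_workdb.parquet;" 2>/dev/null \
  | grep totalSize | awk '{print $2}'
```

The value is in bytes (792 in the example above), and it is only as fresh as the table statistics, so methods that read HDFS directly (below) are more reliable after recent writes.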

2. Checking a table's storage size

Method 1: to get the total size of a Hive table's files (in bytes), a one-line script will do:

# size of a regular (non-partitioned) table

$ hadoop fs -ls /user/hive/warehouse/table_name | awk -F ' ' '{print $5}' | awk '{a+=$1} END {print a}'

48

This saves you from adding the sizes up by hand. The command below lists the table's files in detail:

$ hadoop fs -ls /user/hive/warehouse/table_name
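To see what that pipeline is doing, you can replay it on canned `hadoop fs -ls` output without a cluster (a sketch; the two file names and sizes below are made up to match the 48-byte total above). Field 5 of each -ls line is the file size in bytes:

```shell
# Fake "hadoop fs -ls" lines: permissions, replication, owner, group, size, date, time, path.
# The first awk extracts field 5 (the size); the second sums those sizes.
printf '%s\n' \
  '-rw-r--r--   1 a6 supergroup         20 2017-11-07 10:59 /user/hive/warehouse/table_name/f1' \
  '-rw-r--r--   1 a6 supergroup         28 2017-11-07 10:59 /user/hive/warehouse/table_name/f2' \
  | awk -F ' ' '{print $5}' | awk '{a+=$1} END {print a}'
# prints: 48
```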

# size of a single partition of a partitioned table

$ hadoop fs -ls /user/hive/warehouse/table_name/yyyymm=201601 | awk -F ' ' '{print $5}' | awk '{a+=$1} END {print a/(1024*1024*1024)}'

39.709

Again, no manual addition is needed. The command below lists the partition's files in detail:

$ hadoop fs -ls /user/hive/warehouse/table_name/yyyymm=201601

Method 2: get the table's total size in GB:

$ hadoop fs -du /user/hive/warehouse/table_name | awk '{ SUM += $1 } END { print SUM/(1024*1024*1024) }'
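The awk stage simply sums the first column of `hadoop fs -du` output and divides by 1024³, which you can check on fake input (the byte counts below are made up and total exactly 1 GB):

```shell
# Fake "hadoop fs -du" output: <bytes> <path>. 2 x 536870912 = 1073741824 bytes = 1 GB.
printf '%s\n' \
  '536870912  /user/hive/warehouse/table_name/f1' \
  '536870912  /user/hive/warehouse/table_name/f2' \
  | awk '{ SUM += $1 } END { print SUM/(1024*1024*1024) }'
# prints: 1
```

Note that `hadoop fs -du -s -h /user/hive/warehouse/table_name` prints a single human-readable total directly, which often replaces this pipeline.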

Method 3: print each entry's size with a human-readable unit:

$ hadoop fs -du /user/hive/warehouse/table_name/ | awk '{ sum=$1; dir2=$2; hum[1024**3]="Gb"; hum[1024**2]="Mb"; hum[1024]="Kb"; for (x=1024**3; x>=1024; x/=1024) { if (sum>=x) { printf "%.2f %s \t %s\n", sum/x, hum[x], dir2; break } } }'
