
SAP HANA Data Compression calculation


Hi All,

 

I have a table with a single GENDER column holding 1 billion values, with exactly 2 distinct values (MALE, FEMA).

 

create column table T (GENDER char(4))

 

According to my understanding of the Nbit dictionary encoding compression algorithm, HANA should encode each value using 1 bit, so the estimated size of my table should be:

 

1,000,000,000 bits = 125 MB, plus some extra for the dictionary (lookup table)
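Working this out explicitly (just my own rough arithmetic, assuming a plain 1-bit Nbit index vector and ignoring any block or paging overhead HANA may add):

-- 2 distinct values -> ceil(log2(2)) = 1 bit per row in the index vector
-- 1,000,000,000 rows * 1 bit = 125,000,000 bytes, i.e. roughly 119 MiB
-- dictionary: only 2 CHAR(4) entries, effectively negligible
select 1000000000 / 8 / 1024 / 1024 as estimated_index_vector_mib from dummy;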

 

However, when running the following query I get the results below:

 

select table_name, column_name,
       ceil(memory_size_in_total/1024/1024) as "memory_size_in_MB",
       ceil(uncompressed_size/1024/1024) as "Uncompressed_size_in_MB",
       compression_type
from M_CS_ALL_COLUMNS
where table_name = 'T'

 

 

TABLE_NAME  COLUMN_NAME  memory_size_in_MB  Uncompressed_size_in_MB  COMPRESSION_TYPE
T           GENDER                     215                    4,769  PREFIXED
T           $trex_udiv$              2,693                    3,815  DEFAULT
T           $rowid$                  2,534                    7,630  DEFAULT

 

Can somebody please explain why $trex_udiv$ and $rowid$ are consuming approximately 5 GB?

 

select table_name, memory_size_in_total, memory_size_in_main, memory_size_in_delta
from M_CS_TABLES
where table_name = 'T'

 

TABLE_NAME  MEMORY_SIZE_IN_TOTAL  MEMORY_SIZE_IN_MAIN  MEMORY_SIZE_IN_DELTA
T                  5,704,892,278        5,704,872,070                20,208
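Cross-checking with the column-level figures from the first query, the numbers add up to roughly the same total:

215 MB + 2,693 MB + 2,534 MB ≈ 5,442 MB ≈ 5.3 GB

which matches the 5,704,892,278 bytes reported by M_CS_TABLES.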

 

As you can see from the outputs, the table is consuming approximately 5 GB in total, whereas by my calculation it should be around 125 MB.

 

I would really appreciate it if someone could clarify the above, as I need to demonstrate HANA's compression to some clients.

 

Best Regards,

Asif.

