Channel: SCN : Discussion List - SAP HANA Developer Center

HANA Size Calculation


Hi All,

I need some help with HANA size calculation.


Normally for HANA sizing we refer to SAP Note 1514966. In brief, the note says:

  • Get the source data footprint.
  • Get the compression factor in HANA.
  • Calculate the HANA memory size from the source data footprint and the compression factor.
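The arithmetic behind those steps can be sketched as follows. This is only an illustration of the calculation, not the note itself: the numbers are invented, and the workspace multiplier of 2 (RAM sized at roughly twice the compressed data size, to leave room for intermediate results) is a common rule of thumb, not something taken from this thread.

```python
# Sketch of the sizing arithmetic described above.
# All numeric inputs below are made-up examples, not measurements.

def expected_hana_data_size_gb(source_footprint_gb, compression_factor):
    """Estimate the compressed in-memory data size in HANA."""
    return source_footprint_gb / compression_factor

def expected_hana_ram_gb(data_size_gb, workspace_multiplier=2.0):
    """RAM is often sized as a multiple of the data size to leave
    headroom for intermediate results; the multiplier is an assumption."""
    return data_size_gb * workspace_multiplier

# Hypothetical example: 1000 GB source footprint, compression factor 5.
data_gb = expected_hana_data_size_gb(source_footprint_gb=1000, compression_factor=5)
print(data_gb)                         # 200.0
print(expected_hana_ram_gb(data_gb))   # 400.0
```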


In our case the problem is that we do not have an existing source database. It's a new application that sends data to HANA via SLT, and the analysis is then done on top of that. We need to calculate the expected HANA size for a customer scenario.



Approach that we are following now:


Say the table name is PROD_HDR.

  1. I have created two tables in HANA with the same structure as PROD_HDR, say PROD_HDR_1 and PROD_HDR_2.
  2. I have generated 1 million records for PROD_HDR_1 and 2 million records for PROD_HDR_2.
  3. I then use the MEMORY_SIZE_IN_TOTAL column of the monitoring view M_CS_TABLES to get the actual memory used by each table. (I run LOAD and MERGE statements first so that each table is fully loaded and delta-merged before querying.)
  4. With the memory sizes for 1 million and 2 million records, and assuming a linear relationship, we extrapolate the size for any number of records (say 10 million).
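The extrapolation in step 4 can be sketched like this. The byte values are hypothetical placeholders, not real M_CS_TABLES readings; the point of using two measurements rather than one is that the fitted intercept absorbs fixed per-table overhead (such as column dictionaries) that a simple size-per-row ratio would miss.

```python
# Sketch of step 4: linear extrapolation from two measured table sizes.
# The measurements below are invented placeholders, not real values.

def linear_extrapolate(n1, size1, n2, size2, n_target):
    """Fit size = a*n + b through two (row_count, bytes) measurements
    and evaluate at n_target. The intercept b captures fixed overhead
    (e.g. column dictionaries) that a pure ratio would miss."""
    a = (size2 - size1) / (n2 - n1)   # marginal bytes per row
    b = size1 - a * n1                # fixed overhead
    return a * n_target + b

# Hypothetical readings of M_CS_TABLES.MEMORY_SIZE_IN_TOTAL:
size_1m = 120_000_000   # bytes at 1 million rows
size_2m = 230_000_000   # bytes at 2 million rows
print(linear_extrapolate(1_000_000, size_1m, 2_000_000, size_2m, 10_000_000))
# 1110000000.0  (about 1.11 GB for 10 million rows)
```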

Once we have the table size information, we estimate the approximate number of records the table will hold in the customer scenario and calculate the expected HANA size in the customer landscape.

Problems:

I don't think this approach is 100% correct, but it was the best we could come up with. Is there a better way to calculate the memory size of a table?

Regards,

Raja

