Channel: SCN : Discussion List - SAP HANA Developer Center

Out of memory issue during INSERT


Hello All,

 

I am trying to create test data with millions of rows. I have four tables:

 

Table 1: Supplier ID, 1 column, 25 rows

Table 2: ArtikelID, 1 column, 80 rows

Table 3: Businessdate, 1 column, 100 rows

Table 4: Sales data, 15 columns, 50 rows

 

So the cross join in the SQL below would produce 25 * 80 * 100 * 50 = 10,000,000 rows. But even when I try to load only the first 10,000 rows into the target table, I get the following error:

 

error [129]: transaction rolled back by an internal error: exception 1000002: Out of memory

My SQL is as follows:

 

INSERT INTO "Target.table"
SELECT TOP 10000
       T4.ID, T4.REFID_LFDNR, T4.REFID_POS,
       T1.Supplier AS "Supplier",
       T2.ARTIKELNUMBER AS "ARTIKEL",
       T4.ORGID, T4.MATNR, T4.SALES, T4.PIECES, T4.MARGIN, T4.STOCK,
       T4.STOREID, T4.SALES_QUANTITY,
       T3.BUSINESS_DATE AS BUSINESS_DATE,
       T4.ACTUAL_PRICE
  FROM "SCHEMA"."SUPPLIER" T1,
       "SCHEMA"."ARTIKEL" T2,
       "SCHEMA"."BUSINESSDATE2" T3,
       "SCHEMA"."SALES TABLE" T4

Well, I tried to execute it for only 100 rows, but it still fails with the same error. Any ideas how I could fix this?
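One workaround I am considering, assuming the out-of-memory error comes from the engine materializing the full 10,000,000-row cross join before TOP is applied, is to cap each source table in a subquery first so the intermediate product never grows beyond what I want to insert. This is only a sketch using the table and column names from my query above; the per-table TOP values (5, 8, 10, 25) are arbitrary picks that multiply to 10,000:

```sql
-- Sketch: limit each input before the cross join instead of limiting
-- the result afterwards, so the product is at most 5*8*10*25 = 10,000 rows.
INSERT INTO "Target.table"
SELECT T4.ID, T4.REFID_LFDNR, T4.REFID_POS,
       T1.Supplier AS "Supplier",
       T2.ARTIKELNUMBER AS "ARTIKEL",
       T4.ORGID, T4.MATNR, T4.SALES, T4.PIECES, T4.MARGIN, T4.STOCK,
       T4.STOREID, T4.SALES_QUANTITY,
       T3.BUSINESS_DATE AS BUSINESS_DATE,
       T4.ACTUAL_PRICE
  FROM (SELECT TOP 5  Supplier      FROM "SCHEMA"."SUPPLIER")      T1,
       (SELECT TOP 8  ARTIKELNUMBER FROM "SCHEMA"."ARTIKEL")       T2,
       (SELECT TOP 10 BUSINESS_DATE FROM "SCHEMA"."BUSINESSDATE2") T3,
       (SELECT TOP 25 *             FROM "SCHEMA"."SALES TABLE")   T4;
```

Would this avoid the OOM, or does the planner already push the TOP down?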

 

Thanks

