Hi there,
I have to import a large amount of data into HANA from local files (Excel files).
I have a lot of tables with 100+ columns and something like 10, 20 or 30 million rows each.
At the moment my process is:
1) Download a table from SAP as .txt files, with around 800k rows per file;
2) Use a program to split each file into many smaller files of 65k rows each (the sketch after this list shows roughly what it does);
3) Import each smaller file into Excel, keeping every column as Text type except the decimal-number columns, for which I use the General type;
4) Use the import feature of HANA Studio on each Excel file.
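To be concrete, step 2 amounts to something like this (a minimal Python sketch, assuming tab-delimited .txt exports with a header row; the file name is made up, my actual splitter is a separate program):

# Split a tab-delimited SAP export into chunks of 65,000 data rows,
# repeating the header line in every chunk so Excel imports each one the same way.
CHUNK_ROWS = 65_000

def split_file(path: str, chunk_rows: int = CHUNK_ROWS) -> None:
    with open(path, encoding="utf-8") as src:
        header = src.readline()
        part, rows, out = 1, 0, None
        for line in src:
            if out is None:
                # Start a new chunk and repeat the header.
                out = open(f"{path}.part{part:03d}.txt", "w", encoding="utf-8")
                out.write(header)
            out.write(line)
            rows += 1
            if rows == chunk_rows:
                out.close()
                out, rows, part = None, 0, part + 1
        if out is not None:
            out.close()

split_file("AFVC_01.txt")  # example file name only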
For example, the table AFVC has 34 million rows. I downloaded 44 .txt files with around 500k~800k rows each, and then split each one into roughly 16 files of 65k rows. I open Excel, import each file, and change some column types (roughly what the sketch below does in script form). Only then can I import each .xls file into HANA with HANA Studio. In total that is 44 × 16 ≈ 700 files to import into HANA, each of which first has to pass through Excel.
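In script form, the column-type step amounts to roughly this (a sketch, assuming the openpyxl library, tab-delimited input, and a comma as decimal separator; DECIMAL_COLS is a made-up example, the real decimal columns differ per table):

# Convert one 65k-row chunk into a workbook, writing the decimal columns as
# numbers and leaving everything else as text, like I do by hand in Excel.
from openpyxl import Workbook

DECIMAL_COLS = {5, 12}  # 0-based indexes of decimal columns; example only

def txt_to_xlsx(src_path: str, dst_path: str) -> None:
    wb = Workbook(write_only=True)  # write-only mode keeps memory usage low
    ws = wb.create_sheet()
    with open(src_path, encoding="utf-8") as src:
        for line in src:
            fields = line.rstrip("\n").split("\t")
            ws.append([
                float(v.replace(",", ".")) if i in DECIMAL_COLS and v else v
                for i, v in enumerate(fields)
            ])
    wb.save(dst_path)

txt_to_xlsx("AFVC_01.part001.txt", "AFVC_01_part001.xlsx")  # example names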
This is my current process; it is very slow and it's driving me crazy.
What can I do to speed up the load?
I don't have an FTP user yet (I have requested one) and I don't have SLT available.
My real problem is that, with so many columns, I can't import files with more than 65,535 rows into HANA, otherwise I get an "out of memory" error (and 65,536 rows is the hard limit of the .xls format anyway, so I can't make the Excel files any bigger).
Thanks
Alessandro