Channel: SCN : Discussion List - SAP HANA Developer Center

HANA XS Advanced Command Line - Untrusted Certificate


Hey folks,

 

I have just installed SPS11 + XS Advanced. I am now trying to connect using the xs command line tool:

 

xs login

API_URL> https://hanadev:30030

FAILED: SSL connection error (supposedly untrusted connection, check the certificates)

 

Has anyone run into this? This is just a sandbox system, so I'm not too worried about getting a valid cert - if there's a way to ignore invalid certificates, I'm OK with that for now.

 

For production, I will obviously need a valid certificate. What are the steps to install a new cert?
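
A possible client-side workaround, assuming the xs client build supports the flag (xs help login will confirm; this is an assumption, not verified against SPS11 - the xs CLI is modeled on the Cloud Foundry CLI, which offers the same option):

xs login -a https://hanadev:30030 --skip-ssl-validation

This only relaxes certificate validation in the client; for production the server certificate itself still needs to be replaced with one signed by a trusted CA.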


Clone/Copy database schemas


Hello Experts,

 

I have a question about best practices and possibilities for cloning a SAP HANA database schema.

Some information about our scenario:

- We are developing a Java application on the HANA Cloud Platform which stores its data in the SAP HANA database

- We are using Core Data Services for the database development

- The database project includes about 100 files (procedures, functions, views)

- All database objects are stored in one schema

 

 

Our problem is that we need our database schema multiple times in the same SAP HANA instance (one schema for development, one for QA, one for live demos, and so on).

So I need a smooth way to clone the database schema.

 

 

So far I have found the following ways to clone the schema (but they are not very good):

 

1. Copy all CDS Files into a second SAP HANA Repository package.

Positive:

  - When I update existing tables, the tables keep their data

Negative:

  - I have to manually rename the schema in every file.

  - I have to manually change the namespace in each file (because the package is different, and the package name has to match the namespace...)

  - (Same problem when I want to update the cloned schema)

 

The effort for cloning the schema this way is very high, and that is not really smooth.

 

 

2. Export and Import the schema

With these two commands I can clone a schema very quickly:

EXPORT "schema"."*" AS BINARY INTO '/tmp' WITH REPLACE CATALOG ONLY;

IMPORT "schema"."*" AS BINARY FROM '/tmp' WITH REPLACE CATALOG ONLY RENAME SCHEMA "schema" TO "schema_clone";

Positive:

  - Very quick and smooth to use.

Negative:

  - When I update existing tables, all data gets lost (very bad)

  - Because the SAP HANA instance is in the HCP, we don't have access to the export/import directory.

  So we run into the following problem: clone the schema with procedure X, rename procedure X to Y, clone the schema again.

  --> Both procedures will be imported, because the old procedure still exists in the folder /tmp. And we can't use a different folder, because we can't create one and can only export into existing folders.

 

Losing all data and importing deleted old objects is also not very smooth.

 

 

Does anybody know a better way to clone schemas?

 

 

Thanks for help

 

Christoph

Invalid Results during execution of SVM algorithm in HANA PAL


Dear SCN community,

 

We are building a predictive model in HANA PAL using the SVM algorithm. The sample code was taken from the HANA PAL Guide and adjusted according to our dataset. This meant changing the schema name and importing the data set manually into the input data table. Our training dataset (TRAIN) contains 798 instances and the test data (TEST) set contains 202 instances. There are 19 independent variables and 1 dependent variable.

The model was created successfully, but the predictions made on the test data are wrong: the model predicts a value of 0 for all 202 instances. We have created the same model in R and it worked fine, with a prediction accuracy of about 78%. Our questions are:

 

  1. Is the parameter description suitable for our data? Please find attached the SQL code below.
  2. Should the target variable ‘class2’ be imported as an INTEGER or a DOUBLE? The target variable has only values 0 and 1.
  3. How can we find out the probability of the target variable being 0 or 1? This is explained for other algorithms but not for SVM.

 

I have attached the training and testing data that I am using below. I would be very grateful for any help!!

 

Thank you for your time and help!

 

SQL CODE

SET SCHEMA DE001_D14_710;

--prepare input training data table type--

--DROP TYPE PAL_SVM_TRAININGSET_T;

 

CREATE TYPE PAL_SVM_TRAININGSET_T AS TABLE (ID integer, class2 double,

credit_usage integer,

own_telephone varchar (100),

existing_credits varchar (100),

other_payment_plans varchar (100),

property_magnitude varchar (100),

residence_since integer,

personal_status varchar (100),

other_parties varchar (100),

purpose varchar (100),

credit_history varchar (100),

over_draft varchar (100),

current_balance integer,

Average_Credit_Balance varchar (100),

employment varchar (100),

cc_age integer,

housing varchar (100),

job varchar (100),

num_dependents integer

);

 

--prepare argument table type--

--DROP TYPE PAL_CONTROL_T;

 

CREATE TYPE PAL_CONTROL_T AS TABLE( NAME varchar(50), INT#PAL_CONTROL_TBL

integer, DOUBLE#PAL_CONTROL_TBL double, STRING#PAL_CONTROL_TBL varchar(100));

--prepare result table type--

 

--DROP TYPE PAL_SVM_MODELPART1_T;

 

CREATE TYPE PAL_SVM_MODELPART1_T AS TABLE( ID varchar(50), VALUEE double);

--prepare result table type--

 

--DROP TYPE PAL_SVM_MODELPART2_T;

 

CREATE TYPE PAL_SVM_MODELPART2_T AS TABLE( ID integer, class2 double,

credit_usage integer,

own_telephone varchar (100),

existing_credits varchar (100),

other_payment_plans varchar (100),

property_magnitude varchar (100),

residence_since integer,

personal_status varchar (100),

other_parties varchar (100),

purpose varchar (100),

credit_history varchar (100),

over_draft varchar (100),

current_balance integer,

Average_Credit_Balance varchar (100),

employment varchar (100),

cc_age integer,

housing varchar (100),

job varchar (100),

num_dependents integer

);

 

--DROP TYPE PAL_SVM_MODELPART3_T;

 

CREATE TYPE PAL_SVM_MODELPART3_T AS TABLE( ID integer, MAPSTRING varchar(100),

MAPPOSITION integer);

 

----create PAL procedure for training----

 

--DROP TABLE PAL_SVM_PDATA_TBL;

 

CREATE TABLE PAL_SVM_PDATA_TBL("POSITION" INT, "SCHEMA_NAME" NVARCHAR(256),

"TYPE_NAME" NVARCHAR(256), "PARAMETER_TYPE" VARCHAR(7));

INSERT INTO PAL_SVM_PDATA_TBL VALUES (1,'DE001_D14_710','PAL_SVM_TRAININGSET_T','IN');

INSERT INTO PAL_SVM_PDATA_TBL VALUES (2,'DE001_D14_710','PAL_CONTROL_T','IN');

INSERT INTO PAL_SVM_PDATA_TBL VALUES (3,'DE001_D14_710','PAL_SVM_MODELPART1_T','OUT');

INSERT INTO PAL_SVM_PDATA_TBL VALUES (4,'DE001_D14_710','PAL_SVM_MODELPART2_T','OUT');

INSERT INTO PAL_SVM_PDATA_TBL VALUES (5,'DE001_D14_710','PAL_SVM_MODELPART3_T','OUT');

 

--call SYS.AFLLANG_WRAPPER_PROCEDURE_DROP('DE001_D14_710','PAL_SVM_TRAIN');

 

call SYS.AFLLANG_WRAPPER_PROCEDURE_CREATE('AFLPAL','SVMTRAIN','DE001_D14_710','PAL_SVM_TRAIN',PAL_SVM_PDATA_TBL);

 

--create input training data table--

--DROP TABLE PAL_SVM_TRAININGSET_TBL;

 

CREATE COLUMN TABLE PAL_SVM_TRAININGSET_TBL LIKE PAL_SVM_TRAININGSET_T;

 

----- We manually import the training data in the above created empty column table using the interface (TRAIN.CSV)

 

 

--DROP TABLE #PAL_CONTROL_TBL;

 

CREATE LOCAL TEMPORARY COLUMN TABLE #PAL_CONTROL_TBL (NAME varchar(50),

INT#PAL_CONTROL_TBL integer, DOUBLE#PAL_CONTROL_TBL double,

STRING#PAL_CONTROL_TBL varchar(100));

 

--create model part 1 table--

 

--DROP TABLE PAL_SVM_MODELPART1_TBL;

 

CREATE COLUMN TABLE PAL_SVM_MODELPART1_TBL( ID varchar(50), VALUEE double);

--create model part 2 table--

 

--DROP TABLE PAL_SVM_MODELPART2_TBL;

 

CREATE COLUMN TABLE PAL_SVM_MODELPART2_TBL( ID integer, ALPHA double,

credit_usage integer,

own_telephone varchar (100),

existing_credits varchar (100),

other_payment_plans varchar (100),

property_magnitude varchar (100),

residence_since integer,

personal_status varchar (100),

other_parties varchar (100),

purpose varchar (100),

credit_history varchar (100),

over_draft varchar (100),

current_balance integer,

Average_Credit_Balance varchar (100),

employment varchar (100),

cc_age integer,

housing varchar (100),

job varchar (100),

num_dependents integer

);

 

--create model part 3 table--

 

--DROP TABLE PAL_SVM_MODELPART3_TBL;

 

CREATE COLUMN TABLE PAL_SVM_MODELPART3_TBL(ID integer, MAPSTRING varchar(100), MAPPOSITION integer);

 

---insert data into input training argument---

INSERT INTO #PAL_CONTROL_TBL VALUES('THREAD_NUMBER',8,null,null);

INSERT INTO #PAL_CONTROL_TBL VALUES('KERNEL_TYPE',2,null,null);

INSERT INTO #PAL_CONTROL_TBL VALUES('TYPE',1,null,null);

INSERT INTO #PAL_CONTROL_TBL VALUES('CROSS_VALIDATION',0,null,null);

INSERT INTO #PAL_CONTROL_TBL VALUES('NR_FOLD',5,null,null);

 

CALL DE001_D14_710.PAL_SVM_TRAIN(PAL_SVM_TRAININGSET_TBL,#PAL_CONTROL_TBL,PAL_SVM_MODELPART1_TBL,PAL_SVM_MODELPART2_TBL,PAL_SVM_MODELPART3_TBL) WITH OVERVIEW;

 

--check the result--

SELECT * FROM PAL_SVM_TRAININGSET_TBL;

SELECT * FROM #PAL_CONTROL_TBL;

SELECT * FROM PAL_SVM_MODELPART1_TBL;

SELECT * FROM PAL_SVM_MODELPART2_TBL;

SELECT * FROM PAL_SVM_MODELPART3_TBL;

 

--prepare input predicting test data table type--

--DROP TYPE PAL_SVM_TESTINGSET_T;

 

CREATE TYPE PAL_SVM_TESTINGSET_T AS TABLE ( ID integer,

credit_usage integer,

own_telephone varchar (100),

existing_credits varchar (100),

other_payment_plans varchar (100),

property_magnitude varchar (100),

residence_since integer,

personal_status varchar (100),

other_parties varchar (100),

purpose varchar (100),

credit_history varchar (100),

over_draft varchar (100),

current_balance integer,

Average_Credit_Balance varchar (100),

employment varchar (100),

cc_age integer,

housing varchar (100),

job varchar (100),

num_dependents integer

);

 

 

--prepare argument table type--

--DROP TYPE PAL_CONTROL_T;

 

CREATE TYPE PAL_CONTROL_T AS TABLE( NAME varchar(50), INT#PAL_CONTROL_TBL

integer, DOUBLE#PAL_CONTROL_TBL double, STRING#PAL_CONTROL_TBL varchar(100));

 

--prepare model part 1 table type--

 

--DROP TYPE PAL_SVM_MODELPART1_T;

 

CREATE TYPE PAL_SVM_MODELPART1_T AS TABLE( ID varchar(50), VALUEE double);

--prepare model part 2 table type--

 

--DROP TYPE PAL_SVM_MODELPART2_T;

 

CREATE TYPE PAL_SVM_MODELPART2_T AS TABLE( ID integer, ALPHA double,

credit_usage integer,

own_telephone varchar (100),

existing_credits varchar (100),

other_payment_plans varchar (100),

property_magnitude varchar (100),

residence_since integer,

personal_status varchar (100),

other_parties varchar (100),

purpose varchar (100),

credit_history varchar (100),

over_draft varchar (100),

current_balance integer,

Average_Credit_Balance varchar (100),

employment varchar (100),

cc_age integer,

housing varchar (100),

job varchar (100),

num_dependents integer

);

 

--prepare model part 3 table type--

 

--DROP TYPE PAL_SVM_MODELPART3_T;

 

CREATE TYPE PAL_SVM_MODELPART3_T AS TABLE(ID integer, MAPSTRING varchar(100),

MAPPOSITION integer);

 

--prepare predicting result table type--

 

--DROP TYPE PAL_SVM_PREDICTION_T;

 

CREATE TYPE PAL_SVM_PREDICTION_T AS TABLE( ID integer, PREDICT double);

----create PAL procedure for predicting----

 

--DROP TABLE PAL_SVM_PDATA_TBL;

 

CREATE COLUMN TABLE PAL_SVM_PDATA_TBL("POSITION" INT, "SCHEMA_NAME"

NVARCHAR(256), "TYPE_NAME" NVARCHAR(256), "PARAMETER_TYPE" VARCHAR(7));

INSERT INTO PAL_SVM_PDATA_TBL VALUES (1,'DE001_D14_710','PAL_SVM_TESTINGSET_T','IN');

INSERT INTO PAL_SVM_PDATA_TBL VALUES (2,'DE001_D14_710','PAL_CONTROL_T','IN');

INSERT INTO PAL_SVM_PDATA_TBL VALUES (3,'DE001_D14_710','PAL_SVM_MODELPART1_T','IN');

INSERT INTO PAL_SVM_PDATA_TBL VALUES (4,'DE001_D14_710','PAL_SVM_MODELPART2_T','IN');

INSERT INTO PAL_SVM_PDATA_TBL VALUES (5,'DE001_D14_710','PAL_SVM_MODELPART3_T','IN');

INSERT INTO PAL_SVM_PDATA_TBL VALUES (6,'DE001_D14_710','PAL_SVM_PREDICTION_T','OUT');

 

--call SYS.AFLLANG_WRAPPER_PROCEDURE_DROP('DE001_D14_710','PAL_SVM_PREDICT');

 

call SYS.AFLLANG_WRAPPER_PROCEDURE_CREATE('AFLPAL','SVMPREDICT','DE001_D14_710','PAL_SVM_PREDICT',PAL_SVM_PDATA_TBL);

 

--create input predicting test data table--

--DROP TABLE PAL_SVM_TESTINGSET_TBL;

 

CREATE COLUMN TABLE PAL_SVM_TESTINGSET_TBL LIKE PAL_SVM_TESTINGSET_T;

 

----- We manually import the testing data in the above created empty column table using the interface (TEST.CSV)

 

--create predicting argument table--

 

--DROP TABLE #PAL_CONTROL_TBL;

 

CREATE LOCAL TEMPORARY COLUMN TABLE #PAL_CONTROL_TBL (NAME varchar(50),

INT#PAL_CONTROL_TBL integer, DOUBLE#PAL_CONTROL_TBL double,

STRING#PAL_CONTROL_TBL varchar(100));

--create predicting result table--

 

--DROP TABLE PAL_SVM_PREDICTION_TBL;

 

CREATE COLUMN TABLE PAL_SVM_PREDICTION_TBL( ID integer, PREDICT double);

 

 

---insert data into input predicting argument---

 

 

INSERT INTO #PAL_CONTROL_TBL VALUES('THREAD_NUMBER',8,null,null);

 

CALL DE001_D14_710.PAL_SVM_PREDICT(PAL_SVM_TESTINGSET_TBL,#PAL_CONTROL_TBL,PAL_SVM_MODELPART1_TBL,PAL_SVM_MODELPART2_TBL,PAL_SVM_MODELPART3_TBL,PAL_SVM_PREDICTION_TBL) WITH OVERVIEW;

 

SELECT * FROM PAL_SVM_PREDICTION_TBL;

Recover .war file off HCP server


Hi Experts,

 

I am creating an extension application and seem to have corrupted my solution. Would it be possible to recover my .war file from the HANA Cloud Platform server it was uploaded to, so that I can decompile it to get the source code back? If this is possible, how would I go about it?

 

Thanks in advance

XS Advanced: Update to Patch Level 13 failed


Hello,

I just tried to update my XS Advanced from Patch Level 9 (from initial installation of Rev. 110) to Patch Level 13 (latest available) after installing HANA Rev. 111.

 

Basically everything went OK, but in the end I got an error:

 

16:05:38.321 - INFO:   ---------------------------------------------------------
16:05:38.321 - INFO:   Calling postInstall event handler
16:05:38.321 - INFO:   ---------------------------------------------------------
16:05:38.321 - INFO:     isUpdate=1
16:05:38.321 - INFO:     -------------------------------------------------------
16:05:38.321 - INFO:     Starting system...
16:05:38.321 - INFO:     -------------------------------------------------------
16:05:38.322 - INFO:       Starting instance on host 'hdb' (worker, xs_worker)...
16:05:38.322 - INFO:       Parameters: instance number = 00, user = hdbadm
16:05:38.326 - INFO:         Instance is already running
16:05:38.326 - INFO:       Instance on host 'hdb' (worker, xs_worker) started
16:05:38.326 - INFO:     -------------------------------------------------------
16:05:38.326 - INFO:     END: Start system
16:05:38.326 - INFO:     -------------------------------------------------------
16:05:38.326 - INFO:     Configuring software...
16:05:38.326 - INFO:       6 Package(s) installed
16:05:38.326 - INFO:       baseVersion = 1.00.00.262519
16:05:38.326 - INFO:       Write parameters to secure store
16:05:38.379 - INFO:     Starting external program /hana/shared/HDB/xs/installation-scripts/installation/storeParameters
16:05:38.380 - INFO:       Command line is: /hana/shared/HDB/xs/installation-scripts/installation/storeParameters
16:05:38.584 - INFO:       Output line 1: Initially filling HANA Secure Store with parameters
16:05:38.634 - INFO:       Output line 2: RSecSSFs: Entering function "RSecSSFsGetRecord" [/bas/745_REL/src/krn/rsec/rsecssfs.c 1433]
16:05:38.634 - INFO:       Output line 3: RSecSSFs: Using explicitly set configuration data [/bas/745_REL/src/krn/rsec/rsecssfs.c 7811]
16:05:38.634 - INFO:       Output line 4: RSecSSFs: SSFS-1440: File "/usr/sap/HDB/HDB/SYS/global/security/rsecssfs/data/SSFS_HDB.DAT" cannot be opened in mode "rb": No such file or directory (errno = 2) [/bas/745_REL/src/krn/rsec/rsecssfs.c 1794]
16:05:38.634 - INFO:       Output line 5: RSecSSFs: SSFS-4182: Data file "/usr/sap/HDB/HDB/SYS/global/security/rsecssfs/data/SSFS_HDB.DAT" does not exist (this is not an error per se: a non-existing data file is a valid situation and treated equally as one without entries) [/bas/745_REL/src/krn/rsec/rsecssfs.c 4908]
16:05:38.634 - INFO:       Output line 6: RSecSSFs: SSFS-4187: Record with key "XSA/CORE/RUNTIME_DB_USER_NAME" not found in secure storage [/bas/745_REL/src/krn/rsec/rsecssfs.c 4911]
16:05:38.634 - INFO:       Output line 7: RSecSSFs: Exiting function "RSecSSFsGetRecord" with return code 1 (message: SSFS-4187: Record with key "XSA/CORE/RUNTIME_DB_USER_NAME" not found in secure storage <-- SSFS-4182: Data file "/usr/sap/HDB/HDB/SYS/global/security/rsecssfs/data/SSFS_HDB.DAT" does not exist (this is not an error per se: a non-existing data file is a valid situation and treated equally as one without entries) <-- SSFS-1440: File "/usr/sap/HDB/HDB/SYS/global/security/rsecssfs/data/SSFS_HDB.DAT" cannot be opened in mode "rb": No such file or directory (errno = 2)) [/bas/745_REL/src/krn/rsec/rsecssfs.c 1522]
16:05:40.916 - INFO:       Output line 8: Exception in thread "main" com.sap.security.nw.SecStoreFS.SecStoreFSException: SSFS-1560: Could not open lockfile; open(/usr/sap/HDB/HDB/SYS/global/security/rsecssfs/data/SSFS_HDB.LCK) == 2 (No such file or directory) [17=EEXIST,13=EACCES,2=ENOENT,22=EINVAL,20=ENOTDIR] (RC = -3)
16:05:40.916 - INFO:       Output line 9: at com.sap.security.nw.SecStoreFS.SecStoreFS.putRecord(Native Method)
16:05:40.917 - INFO:       Output line 10: at com.sap.xs2rt.installation.impl.util.InstallationParameterWriterSecureStoreImpl.initParameters(InstallationParameterWriterSecureStoreImpl.java:235)
16:05:40.917 - INFO:       Output line 11: at com.sap.xs2rt.installation.impl.util.InstallationParameterWriterSecureStoreImpl.<init>(InstallationParameterWriterSecureStoreImpl.java:213)
16:05:40.917 - INFO:       Output line 12: at com.sap.xs2rt.installation.impl.util.InstallationParameterWriterSecureStoreImpl.createFromStdIn(InstallationParameterWriterSecureStoreImpl.java:142)
16:05:40.917 - INFO:       Output line 13: at com.sap.xs2rt.installation.impl.util.InstallationParameterWriterSecureStoreImpl.main(InstallationParameterWriterSecureStoreImpl.java:121)
16:05:40.922 - INFO:       Program terminated with exit code 1
16:05:40.922 - INFO:     Failed to write parameters to secure store.
16:05:40.922 - ERR :     Custom event postInstall failed
16:05:40.922 - INFO:   ---------------------------------------------------------
16:05:40.922 - INFO:   END: postInstall event handler (start: 16:05:38.321 duration: 00:00:02.601)
16:05:40.922 - INFO:   ---------------------------------------------------------
16:05:40.923 - INFO: -----------------------------------------------------------
16:05:40.923 - INFO: END: Installing SAP HANA XS RUNTIME (start: 16:05:17.192 duration: 00:00:23.730)
16:05:40.923 - INFO: -----------------------------------------------------------
16:05:40.922 - ERR : Cannot install
16:05:40.923 - ERR : error installing
16:05:40.923 - ERR : Installation failed
16:05:40.923 - INFO: Summary of critical errors
16:05:40.923 - ERR :   Installation failed
16:05:40.923 - ERR :     error installing
16:05:40.922 - ERR :       Cannot install
16:05:40.922 - ERR :         Custom event postInstall failed

 

 

Of course the file /usr/sap/HDB/HDB/SYS/global/security/rsecssfs/data/SSFS_HDB.DAT doesn't exist, since it is at /usr/sap/HDB/SYS/global/security/rsecssfs/data/SSFS_HDB.DAT (there is one "/HDB" too many in the path).

 

It looks like the installation was nearly complete, but now xscontroller, xsexecagent and xsuaaserver are not running (status red in Studio). There are no errors related to those 3 processes in the trace folder, but I found out that the number of instances has been set to 0 in the daemon.ini configuration file.

 

What am I supposed to do now? Should I try setting the number of instances to 1? Is this an error in PL 13?
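
For reference, the entries in question would presumably look like this in daemon.ini - the section names below are assumptions based on the three affected processes and should be checked against the actual file before editing:

[xscontroller]
instances = 1

[xsexecagent]
instances = 1

[xsuaaserver]
instances = 1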

Any hints appreciated,

Fabian

Backup storage location for 1+1 instance


Hi,

 

 

 

I have a quick question. A 1+1 instance (main node plus standby) is set up. I'm referring to the SAP Fibre Channel Storage Connector Admin Guide

in the link: http://service.sap.com/sap/support/notes/1900823

 

 

 

The documentation says that the backup location must be available at the same path on each node. This makes sense where there is more than one worker node, but in a 1+1 configuration, the backup filesystem can fail over to the standby in the same way that log and data do. Is this correct?

 

Please guide me in the right direction. Thanks in advance.

 

 

 

Regards,

Apoorva

Why does hdbnameserver process exit?


Hi all,

 

I am a newbie to SAP HANA and want to run it in a Docker container.

 

I have installed HANA successfully in a Docker container, but after building this container into a new Docker image, I find I can't run HANA successfully from that new image.
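
For clarity, the workflow is roughly the following (container and image names are hypothetical):

docker commit hana-install-container my-hana-image
docker run -it --hostname hanahost my-hana-image

One variable worth controlling: a HANA installation is tied to the hostname recorded at install time, so the new container should run with the same hostname that was used during installation (the --hostname flag above).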

 

The symptom is that the hdbnameserver process exits after running for a while, and the other hdb* processes can't start.

 

Could anyone give some clues or comments about this issue? Thanks in advance!

 

Best Regards

Nan Xiao

How to use SMART DATA ACCESS in SAP Hana Cloud Platform?


Hi,


We have a SAP HANA Cloud Platform developer instance. We are trying to import MS SQL Server 2012 tables into this HANA instance using "Smart Data Access" from SAP HANA Studio.

 

But while trying to do this, we are getting the following error:

SAP DBTech JDBC: [403]: internal error: Cannot get remote source objects: [unixODBC][Driver Manager]Data source name not found, and no default driver specified

Does this mean that unixODBC needs to be installed on the Linux server where HANA is hosted, with a default path set?
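
For background, that driver-manager message generally means the DSN referenced by the remote source is not defined on the HANA host. With OS access, unixODBC would expect an odbc.ini entry along these lines - the DSN name, driver path and server below are placeholders, and the driver itself must be one that SAP supports for SDA:

[MSSQL_SOURCE]
Driver = /path/to/supported/mssql/odbc/driver.so
Server = mssqlhost
Port   = 1433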

 

We are referring to the following video for this: https://www.youtube.com/watch?v=Y2r2FjBJP9w&list=PLkzo92owKnVx_X9Qp-jonm3FCmo41Fkzm&index=2 (SDA Connecting with MSSQL).

 

Can you please help us connect to MSSQL from HANA Studio, or can we get access to the host of the SAP HANA developer edition (OS-level access)?

 

 

Thanks in Advance,

Chandrababu Katta


Java error in SAP Hana Studio installation


Hello

 

I am running Windows 8.1 on 64 bit system.

 

I installed the SAP HANA client successfully.

 

When I install SAP HANA Studio, I get the following error in a pop-up box:

 

"Cannot access java executable 'C:\ProgramData\Oracle\Java\javapath\javaw.exe'

 

I have tried/verified the following:

1) uninstalled previous version of Java and downloaded the latest one (8.25)

2) checked that the java -version command works from the command prompt, i.e. Java is recognized on the system (via an environment variable)

3) rebooted after java uninstall and install
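
A further check that may narrow this down: confirm which javaw.exe resolves on the PATH and whether the path from the error message actually exists, e.g. from a command prompt:

where javaw
dir C:\ProgramData\Oracle\Java\javapath

The javapath directory holds launcher links maintained by the Oracle installer and can become stale after an uninstall/reinstall cycle; if it is missing or empty, that would match the error above.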

 

Please help.

 

thanks

vikas

Stored Procedure need an Array as output


Hi Experts,

 

I need to use a procedure, with an array as output.

How would you do this?

 

 

PROCEDURE "TBASE_PUBLIC"."development...procedures::procGetLabordateForPatient"  (

    IN patientID integer,

    out rs_labordatum "TBASE_PUBLIC"."development...data::TBASE_TT.Procedures.ttLabordate"

   

)

  LANGUAGE SQLSCRIPT

  SQL SECURITY INVOKER

  --DEFAULT SCHEMA <default_schema_name>

  READS SQL DATA AS

BEGIN

 

 

    rs_labordatum =

        SELECT 

            L."PatientID",

            L."Datum"

        FROM "TBASE_PUBLIC"."development...data::TBASE.cds.Labor" L

        WHERE

                L."PatientID" = :patientID

        ORDER BY L."Datum" DESC

    ;

END
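
The procedure above returns a table type rather than an array. If a genuine array output is needed, one option is to aggregate the result column into an array with ARRAY_AGG - a sketch assuming a revision that supports array-typed parameters and ARRAY_AGG, that "Datum" is a DATE, and using a hypothetical procedure name:

PROCEDURE "TBASE_PUBLIC"."development...procedures::procGetLabordateArray" (
    IN patientID integer,
    OUT labordates DATE ARRAY  -- hypothetical array-typed output parameter
)
  LANGUAGE SQLSCRIPT
  SQL SECURITY INVOKER
  READS SQL DATA AS
BEGIN
    -- same selection as above, kept in an intermediate table variable
    tab = SELECT L."Datum"
          FROM "TBASE_PUBLIC"."development...data::TBASE.cds.Labor" L
          WHERE L."PatientID" = :patientID
          ORDER BY L."Datum" DESC;
    -- ARRAY_AGG converts the table column into an array value
    labordates := ARRAY_AGG(:tab."Datum");
END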

HANA SQL Script - Cumulate Function


Hi experts,

 

I have a requirement to create a new column in my calculation view that cumulates another column's values.

 

The cumulated column should cover 12 months (from 11 months back up to the current month). So I created the following SQL script:

 

     (SELECT SUM(B.SALES)

            FROM "_SYS_BIC"."LSA.ADM.STR/CVSTR_FIN_REP_ACTUAL_3" AS B

                  WHERE B.CC_FISCPER_AUX <= A.CC_FISCPER_AUX

                  AND B.SEGMENT = A.SEGMENT

                  AND B.COMP_CODE = A.COMP_CODE

                  AND B.CC_PLANT = A.CC_PLANT

                  AND B.FUNC_AREA = A.FUNC_AREA

                  AND B.GL_ACCOUNT = A.GL_ACCOUNT

                  AND B.COSTCENTER = A.COSTCENTER

                  AND B.COST_ELEM = A.COST_ELEM

                  AND B.CO_AREA = A.CO_AREA

                  AND B.PROFIT_CTR = A.PROFIT_CTR

                  AND B.VERSION = A.VERSION

                  AND B.CC_CURTYPE = A.CC_CURTYPE

                  AND B.CURRENCY = A.CURRENCY

                  AND B.UNIT = A.UNIT 

                  AND B.VTYPE = A.VTYPE

                  AND B.CC_FISCVAR = A.CC_FISCVAR

                  AND B.CC_FISCPER_AUX >= ADD_MONTHS(TO_DATE(A."CC_FISCPER_AUX", 'YYYYMM'), -11)

                  AND B.CC_FISCPER_AUX <= A.CC_FISCPER_AUX) AS "SALES_***12"

  

            FROM"_SYS_BIC"."LSA.ADM.STR/CVSTR_FIN_REP_ACTUAL_3"AS A

 

The calculation works perfectly, but it performs poorly. My calculation view is based on the FAGLFLEXT table, which has more than 5,000,000 records.

When I run a data preview on my analytic view (on top of FAGLFLEXT), it opens quickly and easily. But my cumulative SQL script takes more than 3 minutes to return the results.

 

I was referring to the HANA Business Function Library and found the CUMULATE function, but I couldn't understand the example provided there. Can you help me understand how I could use the CUMULATE function in this situation, so I can achieve better performance?
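
As a side note for anyone comparing approaches: the correlated subquery above recomputes the sum once per output row, which is typically what makes it slow. If the revision in use supports window frames, the rolling sum can be expressed directly - a sketch over the same view, shortened to a few of the partition columns, and assuming exactly one row per partition per CC_FISCPER_AUX period (otherwise an 11-row frame does not equal 11 months):

SELECT SEGMENT, COMP_CODE, CC_FISCPER_AUX,
       SUM(SALES) OVER (PARTITION BY SEGMENT, COMP_CODE
                        ORDER BY CC_FISCPER_AUX
                        ROWS BETWEEN 11 PRECEDING AND CURRENT ROW) AS "SALES_12M"
FROM "_SYS_BIC"."LSA.ADM.STR/CVSTR_FIN_REP_ACTUAL_3";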

 

Thanks in advance.

 

Adrianon Frossard.

Help: Auto-Login to HANA Web Workbench for HCP Trial not working after the update last night


Hi,

 

I have a problem in a HCP trial account with a HANA XS application after the update last night.

 

When I select the application in the HCP cockpit, there is a link named "Open in Web-based Development Workbench" which normally logs you in to the Web Workbench automatically. In my case, however, since last night I get a HANA login prompt (not an SAP Accounts SAML login prompt!), which apparently wants internal HANA credentials that I don't have on an HCP trial account with a shared HANA instance.

 

The URL

https://s9hanaxs.hanatrial.ondemand.com/sap/hana/xs/ide/editor/index.html?startURI=p1941627274trial/hc/demo

is opened, which redirects to

https://s9hanaxs.hanatrial.ondemand.com/sap/hana/xs/formLogin/login.html?x-sap-origin-location=%2Fsap%2Fhana%2Fxs%2Fide%2Feditor%2Findex.html%3FstartURI%3Dp1941627274trial%2Fhc%2Fdemo

instead of logging me in automatically.

 

What can I do to access my application again?

 

Regards,

Wolfgang

Access in abap report to hana view in SYS schema


I want to create a report in ABAP and execute this query:

     "Select PATH from M_DISK"

 

and I get this error: "M_DISK" is not defined in the ABAP Dictionary as a table, projection view, or database view.

 

M_DISK is a view in the SYS schema of HANA.

 

Thanks

HANA On-premise XS SAML Authentication: Unable to verify XML signature


We are doing SAP HANA SSO integration with our IdP. The following steps have been performed:

  1. We have created a Simple Hello World XS Application (using Create Your First HANA XS Application using HANA Studio). The application was tested with basic authentication and it worked.
  2. Following Use SAML to enable SSO for your SAP HANA XS App (SPS 09 rev 92 or later), we have configured SAML SSO (excluding step 4).
  3. In the Trust Store, we have imported IWA Root certificate and IdP's Digital Signing Certificate.
  4. Under Service Provider Configuration, we are using SHA1 as our Hash logic.
  5. The SP metadata carried the ACS URL as https://<server-name>:4300/sap/hana/xs/saml/login.xscfunc

 

After configuration, when we access our XS application it authenticates with our IdP. But when it hits the ACS URL it displays the following error: "StatusCode in ResponseMessage != OK; please refer to the database trace for more information". The trace shows:

 

e XSSession    XSSessionLifecycle.cpp(00254) : Assertion authentication failed with reason: Unable to verify XML signature(StatusCode: , StatusMessage: )

 

Amendments Tried:

  1. On the IdP end, we have tried both signature types: Assertion and Response.
  2. In the trace portal, we have set the trace level to Debug for our application as well as sap.hana.xs.saml, but we still receive only the above message.

 

Queries:

  1. Are we using the correct ACS?
  2. How can we increase the trace level to get more detail in the error message? (See the sketch after this list.)
  3. We have also implemented the solution provided in Troubleshooting Issues when implementing SAML SSO in HANA XS Engine, but did not succeed. Please let us know if there are any other options we can try.
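
For query 2, one way to raise the server-side trace beyond the XS application trace is the database trace configuration. A hedged example via SQL - the 'authentication' component is commonly raised when debugging logon issues, but verify the component name against the trace configuration of your revision:

ALTER SYSTEM ALTER CONFIGURATION ('xsengine.ini', 'SYSTEM')
    SET ('trace', 'authentication') = 'debug' WITH RECONFIGURE;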

How to Prepare for HANA Certification (C_HANAIMP151) ?


I am preparing for SAP HANA Application associate certification - C_HANAIMP151 using:


-HA100

-HA300

-HA200 (for further content).


However, I am finding the material in these manuals insufficient for understanding HANA and preparing for the certification's subtopics, e.g.


- Modelling Function

- SAP HANA Architecture

- Data provisioning

 

Hence I need guidance on which sources of material to use to prepare for HANA.


I am planning to take the SAP HANA E-Academy course (HAIMPE); I have 3 months to prepare.

 


Data type NVARCHAR larger than 5000?


Hello,

 

There is a following definition in the SAP document "SAP HANA SQL and System Views Reference - SAP HANA Platform SPS 11 (Document Version: 1.0 – 2015-11-25)“:

"The NVARCHAR(n) data type specifies a variable-length Unicode character set string, where <n> indicates the maximum length in characters and is an integer between 1 and 5000.”

 

But HANA allows NVARCHAR larger than 5000 (see test procedure 1). The allowed maximum length is 8388607 (see test procedure 2).

My question is: may I use NVARCHAR larger than 5000?

 

Regards.

Yujun Hu

 

---------------------------------------------

create procedure TEST_PROC_1

as

begin

       declare s nvarchar(20000) = '';     -- test with length 20000

      

       declare i int;

       for i in 1 .. 2000 do

             s := :s || '1234567890';

       end for;

      

       select :s as X, length(:s) as LEN from dummy;

    /* It works. Output: LEN=20000 */

end;


---------------------------------------------

createprocedure TEST_PROC_2

as

begin

       declare s nvarchar(9000000) = '';

    /* error */

end;


SAP HANA message:

specified length too long for its datatype: the identifier "S" is too long. Maximum length is 8388607

 

---------------------------------------------


HANA Live for ERP EHP4 installation after EHP7


Hi Friends,

 

We have installed HANA Live for ERP EHP7 in HANA Studio; however, it contains only one view for SAP HCM. I explored further and found that a few more HANA views for HCM are available in EHP4. My question is: can we add the EHP4 tgz file after the EHP7 installation? I tried yesterday but got errors. I just want to understand the right way to install both components in HANA Studio.

 

SAP ECC EHP7

HDB 1.0 SP10

 

Thanks,

Gaurav

Spark and SAP HANA Integration


Hi Everyone,

 

We have a project to integrate Spark data into SAP HANA.

 

1. On the HANA side, do we need anything else installed besides the HANA Spark adapter?

2. Is a Spark standalone installation sufficient, or do we need additional components such as Hadoop, Thrift Server, Ambari, etc.?

3. Does HANA - Spark integration require the storage to be HDFS, or could we leverage storage such as AWS S3?

4. Does the HANA developer edition include the Spark integration feature, or is it part of the enterprise edition only?

5. Can the latest version of Apache Spark (1.6.x) be used for integration with HANA, or are we restricted to Spark 1.4.x/1.5.x for support/compatibility reasons?

 

Your assistance is most welcome.

 

Thank you,


Eric.

Slow Performance in SAP HANA


Hi,

 

We are observing very slow responses in SAP HANA Studio when creating information views or roles.

For example:

 

I have created one simple Calculation View.

When I try to validate or activate it, it stays in Running status for a very long time and finally ends with a timeout error:

 

Repository activation lock time out;another activation is still running;try again later.


Similarly, creating roles and other activities take longer than expected.


Please let me know how we can resolve this issue and how to analyze it further.

Where can we find the traces?


Thank you in advance for the help.


Best Regards

ERP.

Use different column Name in Analytic View depending by logon session language


Hi everybody.

 

Is it possible to have different column names for one single analytic view column, depending on the client session language?

 

For instance:

- we have one (and only one) column whose name should be, say, 'fatturato' for an Italian client session and 'invoiced' for an English client session, etc.

In other words, the column name changes depending on the language, but the column itself is only one.

 

Is it possible to manage this, and how? In SAP HANA Studio?

 

Is it possible to somehow store the different names of that column and relate them to the column itself?

 

Thank you in advance

 

Best regards.

Sergio
