
ERROR SPS10 com.sap.hana.spark.conf.HanaESConfig - Hana Extened Store Configuration is missing


I'm using SAP HANA SPS10.

 

When I try to start the Spark Controller, I get the following output:

 

[root@quickstart bin]# ./hanaes start
+ export HANA_ES_HEAPSIZE=8172
+ HANA_ES_HEAPSIZE=8172
+ export HANA_ES_PIDFILE=/tmp/hana.spark.controller
+ HANA_ES_PIDFILE=/tmp/hana.spark.controller
++ dirname ./hanaes
+ bin=.
++ cd .
++ pwd
+ bin=/usr/sap/spark/controller/bin
+ DEFAULT_ESCONF_DIR=/usr/sap/spark/controller/bin/../conf
+ '[' -f /usr/sap/spark/controller/bin/../conf/hana_hadoop-env.sh ']'
+ . /usr/sap/spark/controller/bin/../conf/hana_hadoop-env.sh
++ export HADOOP_CONF_DIR=/etc/hadoop/conf
++ HADOOP_CONF_DIR=/etc/hadoop/conf
++ export HIVE_CONF_DIR=/etc/hive/conf
++ HIVE_CONF_DIR=/etc/hive/conf
+ '[' 1 -gt 1 ']'
+ '[' -e /conf/hadoop-env.sh ']'
+ DEFAULT_CONF_DIR=etc/hadoop/conf
+ export HADOOP_CONF_DIR=/etc/hadoop/conf
+ HADOOP_CONF_DIR=/etc/hadoop/conf
+ '[' -f /etc/hadoop/conf/hadoop-env.sh ']'
+ . /etc/hadoop/conf/hadoop-env.sh
+++ [[ ! /usr/lib/hadoop-mapreduce =~ CDH_MR2_HOME ]]
+++ echo /usr/lib/hadoop-mapreduce
++ export HADOOP_MAPRED_HOME=/usr/lib/hadoop-mapreduce
++ HADOOP_MAPRED_HOME=/usr/lib/hadoop-mapreduce
++ export 'YARN_OPTS=-Xms52428800 -Xmx52428800 -Djava.net.preferIPv4Stack=true '
++ YARN_OPTS='-Xms52428800 -Xmx52428800 -Djava.net.preferIPv4Stack=true '
++ export 'HADOOP_CLIENT_OPTS=-Djava.net.preferIPv4Stack=true '
++ HADOOP_CLIENT_OPTS='-Djava.net.preferIPv4Stack=true '
+ '[' -f /libexec/hdfs-config.sh ']'
+ [[ -z /usr/java/jdk1.7.0_67-cloudera ]]
+ JAVA=/usr/java/jdk1.7.0_67-cloudera/bin/java
+ '[' 8172 '!=' '' ']'
+ JAVA_HEAP_MAX=-Xmx8172m
+ CLASSPATH='/usr/sap/spark/controller/bin/../conf:/etc/hadoop/conf:/etc/hive/conf:../*:../lib/*:/*:/lib/*:/*:/lib/*'
+ CLASSPATH='/usr/jars/*:/usr/sap/spark/controller/bin/*'
+ HANAES_OUT=/var/log/hanaes/hana_controller.log
+ case $1 in
+ echo -n 'Starting HANA Spark Controller ... '
Starting HANA Spark Controller ... + '[' -f /tmp/hana.spark.controller ']'
++ cat /tmp/hana.spark.controller
+ kill -0 15035
+ echo ' Class path is /usr/jars/*:/usr/sap/spark/controller/bin/*'
Class path is /usr/jars/*:/usr/sap/spark/controller/bin/*
+ '[' 0 -eq 0 ']'
+ /bin/echo -n 20646
+ nohup /usr/java/jdk1.7.0_67-cloudera/bin/java -cp '/usr/jars/*:/usr/sap/spark/controller/bin/*' -XX:PermSize=128m -XX:MaxPermSize=256m -Xmx8172m com.sap.hana.spark.network.Launcher
+ sleep 1
+ echo STARTED
STARTED
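
One thing that stands out to me in the trace: the script first builds a CLASSPATH that includes /usr/sap/spark/controller/bin/../conf, then immediately overwrites it with '/usr/jars/*:/usr/sap/spark/controller/bin/*', so the conf directory never actually reaches the JVM. I don't know whether that second assignment is intentional, but it looks like it could explain why the configuration isn't found.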

 

### OUTPUT FROM THE LOG FILE

[root@quickstart bin]# cat /var/log/hanaes/hana_controller.log
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/jars/slf4j-simple-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/jars/spark-assembly-1.3.1-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/jars/livy-assembly-3.7.0-cdh5.4.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/jars/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/jars/avro-tools-1.7.6-cdh5.4.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/jars/pig-0.12.0-cdh5.4.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.SimpleLoggerFactory]
[main] ERROR com.sap.hana.spark.conf.HanaESConfig - Hana Extened Store Configuration is missing
[main] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.IllegalArgumentException: Can not create a Path from an empty string
        at org.apache.hadoop.fs.Path.checkPathArg(Path.java:127)
        at org.apache.hadoop.fs.Path.<init>(Path.java:135)
        at com.sap.hana.spark.network.Launcher$.setupClassPath(Launcher.scala:36)
        at com.sap.hana.spark.network.Launcher$delayedInit$body.apply(Launcher.scala:19)
        at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
        at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
        at scala.App$$anonfun$main$1.apply(App.scala:71)
        at scala.App$$anonfun$main$1.apply(App.scala:71)
        at scala.collection.immutable.List.foreach(List.scala:318)
        at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
        at scala.App$class.main(App.scala:71)
        at com.sap.hana.spark.network.Launcher$.main(Launcher.scala:17)
        at com.sap.hana.spark.network.Launcher.main(Launcher.scala)
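
If I read the log correctly, HanaESConfig fails to find the controller's configuration, and setupClassPath then blows up on an empty path. This is how I have been sanity-checking the conf directory (the hanaes-site.xml file name is my assumption from the SPS10 installation guide, so I may be looking for the wrong file):

[root@quickstart bin]# ls -l /usr/sap/spark/controller/conf/
[root@quickstart bin]# cat /usr/sap/spark/controller/conf/hanaes-site.xml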

 

 
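
In case it matters, this is roughly what I believe that configuration file should contain, pieced together from the SPS10 installation guide. The property names and values below are my own unverified reconstruction, which is partly why I am asking:

[root@quickstart bin]# cat > /usr/sap/spark/controller/conf/hanaes-site.xml <<'EOF'
<?xml version="1.0"?>
<configuration>
  <!-- Port the controller listens on for HANA; 7860 is the value I have seen documented -->
  <property>
    <name>sap.hana.es.server.port</name>
    <value>7860</value>
  </property>
  <!-- HDFS directory holding the controller/Spark libraries; if this were missing or
       empty, would that explain "Can not create a Path from an empty string"? -->
  <property>
    <name>sap.hana.es.lib.location</name>
    <value>/sap/hana/spark/libs/thirdparty</value>
  </property>
</configuration>
EOF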

 

The script reports STARTED, but the launcher dies immediately with the empty-path exception above. Any help would be greatly appreciated.

