
Errors with HANA Hadoop Integration


I'm trying out HANA Hadoop integration using SAP HANA SPS09, Hadoop 2.7.1, and Hive 1.2.1.

I have followed the YouTube videos (SAP HANA Academy - SDA: Hadoop Enhancements - 1. Creating User [SPS09]) and also read the HANA Administration Guide on this topic.

All I want to do is follow the video and run WordCount via the HANA Hadoop integration.

So I have assigned all the necessary privileges to the user and created the MR job archive, remote source, and virtual function as shown below:

I have created the MR jobs.

Then I ran the following two SQL statements successfully:

create remote source hadoop_source
adapter "hadoop"
configuration 'webhdfs_url=http://myserver:50070;webhcat_url=http://myserver:50111'
with credential type 'PASSWORD'
using 'user=hduser;password=hdpass';
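
After creating the remote source, it can be sanity-checked in the catalog (a quick sketch; REMOTE_SOURCES is a standard HANA system view, and the filter value matches the statement above):

```sql
-- The new source should appear here with adapter "hadoop"
SELECT REMOTE_SOURCE_NAME, ADAPTER_NAME
FROM SYS.REMOTE_SOURCES
WHERE REMOTE_SOURCE_NAME = 'HADOOP_SOURCE';
```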

create virtual function HADOOP_WORD_COUNT()
RETURNS TABLE ("word" NVARCHAR(60), "count" INTEGER)
PACKAGE DEV."dev.HanaShared::WordCount"
CONFIGURATION 'enable_remote_cache=true;mapred_jobchain=[{"mapred_input":"/data/mockingbird/","mapred_mapper":"sap.WordCountMapper","mapred_reducer":"sap.WordCountReducer"}]'
AT hadoop_source;




Currently I'm experiencing an issue when executing the last step.

So I wish to check the results:

select * from HADOOP_WORD_COUNT();

But then the console gives me these errors:

Could not execute 'select * from HADOOP_WORD_COUNT()' in 74 ms 384 µs .

SAP DBTech JDBC: [2048]: column store error: search table error:  [2620] executor: plan operation failed;

I went to check my HANA trace file and found these entries:

pop = executorPy.ceCustomCppPop() # pop1



pop.addViewAttribute('word', datatype=83, intDigits=60, sqlType=37, sqlLength=60)

pop.addViewAttribute('count', datatype=73, sqlType=3)



pop.addPlanDebugOpDataInfo(, scenarioName = 'DEV:_SYS_SS_CE_166142_139899986562304_3_TMP')

pop.addKeyValuePair( 'CONFIGURATION','enable_remote_cache=true;mapred_jobchain=[{"mapred_input":"/data/mockingbird","mapred_mapper":"sap.WordCountMapper","mapred_reducer":"sap.WordCountReducer"}]')

pop.addKeyValuePair( 'PACKAGE_NAME','dev.HanaShared09::WC')

pop.addKeyValuePair( 'PACKAGE_SCHEMA','DEV')

pop.addKeyValuePair( 'REMOTE','HADOOP_SOURCE')

pop.addKeyValuePair( 'RETURN_TYPE_INFO','[{"ftcType":37,"index":0,"length":60,"name":"word","scale":0},{"ftcType":3,"index":1,"length":0,"name":"count","scale":0}]')






[16093]{328130}[29/-1] 2015-10-21 14:24:36.867483 e cePlanExec       cePlanExecutor.cpp(07145) : Error during Plan execution of model DEV:_SYS_SS_CE_166142_139899986562304_3_RET (-1), reason: executor: plan operation failed;

[16060]{-1}[-1/-1] 2015-10-21 14:26:17.056104 w Logger           SavepointImpl.cpp(02149) : NOTE: BACKUP DATA needed to ensure recoverability of the database

[18169]{328227}[22/-1] 2015-10-21 14:26:35.373304 i TraceContext     TraceContext.cpp(00823) : UserName=SYSTEM, ApplicationUserName=xxxxxxxx, ApplicationName=HDBStudio, ApplicationSource=csns.admin.commands.AdministrationHandler$1$;csns.admin.commands.AdministrationHandler$1$;;java.util.concurrent.ThreadPoolExecutor.runWorker(;java.util.concurrent.ThreadPoolExecutor$;;

[18169]{328227}[22/-1] 2015-10-21 14:26:35.373288 e PlanViz : PlanVizContext is NULL!!

[18169]{328227}[22/-1] 2015-10-21 14:26:35.373324 e PlanViz : Current session context: systemWatermark=30125,slaveInitCount=-1,version=5,contextId=104427,globalSessionId=328227,anchorGlobalSessionId=328227,version=0,user=SYSTEM,schema=SYSTEM,locale=en_US,collate=BINARY,client=,curr_id_val=-1,app=HDBStudio,app_user=xxxxxxx,dateformat=,reserveprefix=true,ddlautocommit=false,checkPasswordChangeNeeded=false,abapVarcharMode=false,largeNumberOfParametersSupport=false,isFederationCallbackSession=false,associatedConnectionId=0,totalRowCount=0,enableDeferredLobOperation=0,hasStatefulCtxBitmap=4,tmpTableCount=0,transactionIsolationLevel=1

[18169]{328227}[22/-1] 2015-10-21 14:26:35.373356 e PlanViz : Stack trace:

1511995[thr=18169]: SqlExecutor at

1: 0x00007f4867900052 in Execution::ContextFunctions::dumpInfo(Execution::Context&, ltt::basic_ostream >&, bool, bool, bool, bool, bool)+0x390 at ContextFunctions.cpp:657 (

2: 0x00007f485bfd91aa in ptime::PlanVizActionParam::init(ptime::Env const&)+0x506 at (

3: 0x00007f485bf81ce2 in ptime::BuiltinProcedure_PLANVIZ_ACTION::execute(ptime::Env&) const+0x190 at PlanVizAction.h:14 (

4: 0x00007f485b53f4f6 in ptime::Proc_call::execute(ptime::Env&) const+0x3a2 at (

5: 0x00007f485b54006c in ptime::Proc_call::operator()(ptime::Env&) const+0x728 at (

6: 0x00007f485be292ae in ptime::Query::_execute(ptime::Transaction&, char const*, ptime::Query::Plan*, ptime::Query::param_t*, ptime::Query::result_t*, bool)+0x5fa at (

7: 0x00007f485be2f9db in ptime::Query::execute(ptime::Transaction&, char const*, ptime::Query::param_t*, ptime::Query::Plan*, ptime::Query::result_t*, ptime::Statement*, bool)+0x647 at (

8: 0x00007f485ab51b79 in ptime::Statement::execute_(Execution::Context&, bool, bool, bool, bool)+0x355 at (

9: 0x00007f485ab7b03c in ptime::CallableStatement::execute(Execution::Context&, bool, bool, bool, bool, ptime::Statement::BatchProcessingState, bool, bool, bool)+0x588 at (

10: 0x00007f4862863c8f in ptime::Session::executeQuery(Execution::Context&, ptime::Action&)+0xdb at (

11: 0x00007f4862858ff0 in ptime::SessionHandler::handleEvent(Execution::Context&, ptime::AppEvent*)+0x4f0 at (

12: 0x00007f486285a401 in ptime::SessionHandler::receiveMessage(Execution::Context&, ptime::CommEvent*)+0x960 at (

13: 0x00007f486287907c in ptime::TcpReceiver::doWork(Execution::Context&, ptime::CommMgr*)+0xd78 at (

14: 0x00007f4862879b0a in ptime::TcpReceiver::run(void*)+0x1d6 at (

15: 0x00007f4875feb4d4 in TrexThreads::PoolThread::run()+0x810 at PoolThread.cpp:256 (

16: 0x00007f4875fecfb0 in TrexThreads::PoolThread::run(void*&)+0x10 at PoolThread.cpp:124 (

17: 0x00007f4867958439 in Execution::Thread::staticMainImp(void**)+0x875 at Thread.cpp:488 (

18: 0x00007f4867958ffd in Execution::Thread::staticMain(void*)+0x39 at ThreadMain.cpp:26 (

Judging from the logs, it seems the plan executor fails before the MapReduce job can even be started.
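
As a cross-check, remote source connectivity can be inspected from HANA's monitoring views (a sketch; M_REMOTE_CONNECTIONS is a standard Smart Data Access monitoring view, and the REMOTE_SOURCE_NAME filter is my assumption of the relevant column):

```sql
-- Any rows here would show that HANA actually opened a connection
-- to the remote source before the plan failed
SELECT *
FROM SYS.M_REMOTE_CONNECTIONS
WHERE REMOTE_SOURCE_NAME = 'HADOOP_SOURCE';
```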


I can see controller.jar successfully deployed in Hadoop.

I have also uploaded the jars to the lib folder following the Administration Guide.

I have tried different HANA revisions, including SPS09 and SPS10, to perform the integration; all give me the same error.

So my question is: has anyone faced the same issue?

Or is there anything wrong with my steps above?

Thanks in advance.

Lib.PNG (19.1 kB)
package.PNG (7.0 kB)

1 Answer

    Former Member
    Nov 24, 2015 at 03:37 PM

    Hi Jiang,

    I am also facing an error while connecting to Hadoop, using SAP HANA SPS 10.

    I am getting "SAP DBTech JDBC: [403]: internal error: Cannot get remote source objects: Credential not found" while creating the remote source.

    I followed SAP Note 2177933. The note says to provide any user name and password, so I used "hanaes" as the user name and password.

    CREATE REMOTE SOURCE "spark_demo" ADAPTER "sparksql"
    CONFIGURATION 'port=7860;ssl_mode=disabled;server=<actual server>'
    WITH CREDENTIAL TYPE 'PASSWORD' USING 'user=hanaes;password=hanaes';
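
    One check that might help narrow down the "Credential not found" error: listing what the credential store actually contains for the user (a sketch; CREDENTIALS is a standard HANA system view, and the column list is my assumption of the relevant ones):

    ```sql
    -- Shows which stored credentials exist, and for which component/purpose
    SELECT USER_NAME, COMPONENT, PURPOSE, CREDENTIAL_TYPE
    FROM SYS.CREDENTIALS;
    ```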

    Can you please let me know what credentials I should use for the remote connection?

    Thank you.

    Best regards,



    • Former Member Ziyi Jiang

      Hello Ziyi Jiang,

      Thanks a lot for the help.

      As we are working with the Spark controller, I think this is a little different from Hive. But I will definitely check these posts.

      Thank you.

      Best regards,