
Hadoop HANA Integration using BODS Error

We are trying to connect to the Hadoop HDFS file system using BODS.

  • Hadoop is installed and configured on Linux (3 nodes)
  • BODS is installed and configured on Windows (MS SQL Server as the backend)
  • The HDFS file system is accessible from a web browser
  • Data flow attached
  • Job log attached
  • Software version information is provided below

Hadoop System Version

Node versions : 2.7.1.2.3.4.0-3485

File System : hdfs

Operating System : RHEL 6.5

BODS Details

Version : BODS 4.1 SP07

Database : Microsoft SQL Server

Operating System : Windows 2012 Enterprise

Job Log

3188 5744 JOB 8/9/2016 7:49:11 AM Reading job <52be4348_a113_4768_9c20_ef9e09934999> from the repository; Server version is <14.2.6.1082>; Repository version is <14.2.6.0000>.
3188 5744 JOB 8/9/2016 7:49:11 AM Current directory of job <52be4348_a113_4768_9c20_ef9e09934999> is <C:\Program Files (x86)\SAP BusinessObjects\Data Services\bin>.
3188 5744 JOB 8/9/2016 7:49:12 AM Starting job on job server host <SAPBODS>, port <3500>.
3188 5744 JOB 8/9/2016 7:49:12 AM Job <HDFS_Test> of runid <2016080907491231885744> is initiated by user <Administrator>.
3188 5744 JOB 8/9/2016 7:49:12 AM Processing job <HDFS_Test>.
3188 5744 JOB 8/9/2016 7:49:12 AM Optimizing job <HDFS_Test>.
3188 5744 JOB 8/9/2016 7:49:12 AM Job <HDFS_Test> is started.
3372 5516 DATAFLOW 8/9/2016 7:49:13 AM Process to execute data flow <New_DataFlow8> is started.
3372 5516 DATAFLOW 8/9/2016 7:49:14 AM The specified locale <eng_us.cp1252> has been coerced to <Unicode (UTF-16)> for data flow <New_DataFlow8> because the datastore <SAP_HANA> obtains data in <Unicode (UTF-16)> codepage.
3372 5516 JOB 8/9/2016 7:49:14 AM Initializing transcoder for datastore <SAP_HANA> to transcode between engine codepage <Unicode (UTF-16)> and datastore codepage <<DEFAULT>>
3372 5516 JOB 8/9/2016 7:49:14 AM Initializing transcoder for datastore <File_Format_313> to transcode between engine codepage <Unicode (UTF-16)> and datastore codepage <UTF-8>
3372 5516 DATAFLOW 8/9/2016 7:49:14 AM Data flow <New_DataFlow8> is started.
3372 5516 DATAFLOW 8/9/2016 7:49:14 AM Cache statistics determined that data flow <New_DataFlow8> uses 0 caches with a total size of 0 bytes, which is less than (or equal to) 3753902080 bytes available for caches in virtual memory. Data flow will use IN MEMORY cache type.
3372 5516 DATAFLOW 8/9/2016 7:49:14 AM Data flow <New_DataFlow8> using IN MEMORY Cache.
3372 5516 BLKLOAD 8/9/2016 7:49:14 AM HANA table <TESTHADOOP>, type <Column store>, commit size <10000>, auto correct load <no>, update method <update>, update rows <no>, delete rows <no>.
3372 5516 BLKREAD 8/9/2016 7:49:14 AM Starting Pig. Command line used is <pig -f "C:/ProgramData/SAP BusinessObjects/Data Services/log/hadoop/New_DataFlow8_3372_1/hdfsRead.ctl">.
3372 5516 DATAFLOW 8/9/2016 7:49:14 AM Data flow <New_DataFlow8> is terminated due to error <50616>.
3372 5516 DATAFLOW 8/9/2016 7:49:14 AM Process to execute data flow <New_DataFlow8> is completed.
3188 5744 JOB 8/9/2016 7:49:14 AM Job <HDFS_Test> is terminated due to error <50616>.

Error Information

3372 5516 SYS-050616 8/9/2016 7:49:14 AM |Data flow New_DataFlow8

3372 5516 SYS-050616 8/9/2016 7:49:14 AM Cannot find the full path for file <pig>, due to error <2>.

3188 5744 SYS-050616 8/9/2016 7:49:15 AM |Data flow New_DataFlow8

3188 5744 SYS-050616 8/9/2016 7:49:15 AM Cannot find the full path for file <pig>, due to error <2>.
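For reference, error <2> appears to be the OS-level "file not found" code, so my understanding is that the job server process on <SAPBODS> cannot locate a pig executable when it launches the command line shown in the log. A quick check from the job server host could look like the following (only a sketch; the Pig directory shown is a made-up example):

:: Run in a command prompt on the job server host (SAPBODS)
where pig
:: Show the PATH that the Data Services job server inherits
echo %PATH%
:: If a local Pig client is installed, its bin directory must be on PATH,
:: e.g. (hypothetical location) C:\hadoop\pig-0.15.0\bin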

Could anyone please help me resolve this error?

Thank you,

Karthic Sekar


2 Answers

  • Posted on Aug 09, 2016 at 05:42 PM

    Moving to Data Services Discussions.


  • Posted on Aug 10, 2016 at 09:40 AM

    Hi Karthic,

    The file should be placed on the server where the job server resides. Please refer to the link below for more details:


    Shared Directory access - Enterprise Information Management - SCN Wiki
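
    As a rough illustration only (the directories and versions below are assumptions, not a supported layout; follow the wiki above and the official Data Services Hadoop configuration guide for the exact steps), the Windows host running the job server needs a local Hadoop/Pig client, and the pig launcher must be resolvable from the job server's environment:

    :: On the Windows job server host (all paths below are hypothetical examples)
    :: 1. Extract the Hadoop and Pig client packages, e.g. under C:\hadoop
    :: 2. Make them visible system-wide, then restart the SAP Data Services job server service:
    setx HADOOP_HOME "C:\hadoop\hadoop-2.7.1" /M
    setx PIG_HOME "C:\hadoop\pig-0.15.0" /M
    setx PATH "%PATH%;C:\hadoop\hadoop-2.7.1\bin;C:\hadoop\pig-0.15.0\bin" /M
    :: (PATH can also be edited under System Properties > Environment Variables)

    After that, re-running the job should at least get past the "Cannot find the full path for file <pig>" error.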

    Regards,

    Santosh G.


    • Former Member Aasavari Bhave

      Hi Aasavari,

      Thanks for your suggestion.

      Yes, I am not able to preview the data. I have attached the error screenshot to my original post; please refer to BODS_Dataflow.png.

      Also, could you please let me know where I can download the HDFS and Pig clients, and where I should place them in order to set the path on the Windows server?

      Thank you,

      Karthic Sekar
