Oct 30, 2019 at 10:10 AM

Data Intelligence: How to export files from the "Train" operator other than as artifacts?

Last edit May 27, 2020 at 02:02 PM

Hi,

when trying to copy files from the "Train" operator environment to other storage types, we experience the following issues:

  • Using api.send to send data to the pipeline does not seem to work: the api package is unknown inside the training script.
  • Copying to the file system also fails with "no such file or directory":
from shutil import copyfile
copyfile("test.txt", "tmp/test.txt")

# copyfile("test.txt", "vrep/vflow/tmp/test.txt") did not work either
  • Using the data lake connection also fails with "socket.gaierror: [Errno -2] Name or service not known":
from hdfs import InsecureClient
client = InsecureClient('http://datalake:50070')
  • The following does not throw an error, and os.listdir(data_artifact.get_path()) actually lists the file, but it is not visible in the explorer:
import shutil
dataset_artifact_path = data_artifact.get_path() + '/test.txt'
shutil.copy2("test.txt", dataset_artifact_path)

Any help is appreciated!

Marcus