When a customer is working in Environments using Docker, has mounted their shared drive onto the EC2 instance, and has those folders in turn mounted into the container at some path (ex: /bam/aws/devel/team_qa), running code like the snippet below in the code component of Datagaps returns an empty list.

(Note: the DataOps Suite supports the glob.glob() function in the code component.)
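
For reference, a minimal sketch of the kind of snippet that produces the empty list (the customer's exact code is not shown in this article, so the pattern below is an illustrative assumption):

import glob

# Hypothetical pattern that does not match anything inside the container
directory = '/wrong/path/team_qa/*'
print(glob.glob(directory))  # glob.glob() silently returns [] when nothing matches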



The reason for this is that the user has given a wrong path in the directory field of the code component. The user should supply the correct path to the mounted folder; in this example a pattern such as directory = '/bam/aws/devel/team_qa/*' will match the mounted files and no longer return an empty list.
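
A short sketch of the corrected call, assuming the mount point from the example above (the recursive variant is an assumption for cases where files sit in nested subfolders):

import glob

# Match entries directly under the mounted folder
directory = '/bam/aws/devel/team_qa/*'
print(glob.glob(directory))

# To also pick up files in nested subdirectories, use '**' with recursive=True
print(glob.glob('/bam/aws/devel/team_qa/**', recursive=True))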


In such scenarios we also need to verify which files actually exist at the mounted paths and add every required path and file. The code below helps check this.


import os

# List the contents of the Spark configuration directory to confirm the path exists
print(os.listdir('/etc/spark/conf.dist/'))

# Alternative locations to check on Databricks clusters:
# b = open("/databricks/spark/conf/datagaps.properties")
# b = open("/databricks/spark/logs/python-utilities.log")

# Print the contents of the properties file to verify it is readable
with open("/etc/spark/conf.dist/datagaps.properties") as b:
    for line in b:
        print(line)
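
Running this inside the container confirms whether the mounted directory and the expected properties file are actually visible at the path used in the code component; if os.listdir() raises FileNotFoundError for the mounted path, the mount itself needs to be checked rather than the glob pattern.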