To use the DataOps REST API in an ADF pipeline, add two Web activities from the General section of the activity toolbox.
For example, add two activities named Web1 and Web2 (Run Dataflow by Id / Run Pipeline by Id).
For the Web1 activity, use the settings below. This activity retrieves a token; Web2 reads that token from Web1's JSON output.
If DataOps is installed in a private network, you also need to set up an integration runtime:
Create Azure integration runtime - Azure Data Factory & Azure Synapse | Microsoft Learn
Integration runtime - Azure Data Factory & Azure Synapse | Microsoft Learn
In the Web activity's advanced options, select that integration runtime.
URL: https://<host>/dataopssecurity/oauth/token
Method: POST
Headers: Name: Content-Type, Value: application/x-www-form-urlencoded
Body: username=<username>&password=<encrypted password>%3D&grant_type=password
Note: replace <username> and <encrypted password>. The encrypted password is available on the user profile screen; remove the trailing "=" character and put %3D (its URL-encoded form) in its place. For example, an encrypted password ending in "=" such as abc123= is sent as password=abc123%3D.
Authentication: Basic
Username: <Client ID> (obtain from the Datagaps team)
Password: <Client Secret> (obtain from the Datagaps team)
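To verify the token request outside ADF, here is a minimal Python sketch of the same call (the host name and all credential values are placeholders to replace with your own; it uses the third-party requests library). Basic auth carries the client ID/secret, while the form body carries the DataOps user credentials:

import requests

HOST = "dataops.example.com"       # assumption: replace with your DataOps host
CLIENT_ID = "<client id>"          # from the Datagaps team
CLIENT_SECRET = "<client secret>"  # from the Datagaps team

# Same request as Web1: Basic auth plus form-encoded user credentials.
resp = requests.post(
    f"https://{HOST}/dataopssecurity/oauth/token",
    auth=(CLIENT_ID, CLIENT_SECRET),
    headers={"Content-Type": "application/x-www-form-urlencoded"},
    data="username=<username>&password=<encrypted password>%3D&grant_type=password",
)
resp.raise_for_status()
token = resp.json()["access_token"]  # the field Web2 reads from Web1's output
print(token)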
For the Web2 (Run Dataflow by Id) activity, use the settings below.
URL: https://<host>/DataFlowService/api/v1.0/dataFlows/executeDataFlow?dataflowId=<dataflowID>&livyid=<Livy Server ID>
You can get <dataflowID> from the dataflow edit screen and <Livy Server ID> from the cluster screen
Method: POST
Headers: Name: Authorization, Value: @{concat('Bearer ', activity('Web1').output.access_token)}
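Equivalently, outside ADF, the dataflow can be triggered with the token from the previous call. A standalone sketch (host, token, and IDs are placeholders):

import requests

HOST = "dataops.example.com"                  # assumption: replace with your DataOps host
token = "<access_token from the token call>"  # from Web1 / the sketch above

# Same request as Web2 (Run Dataflow by Id): bearer token plus query parameters.
resp = requests.post(
    f"https://{HOST}/DataFlowService/api/v1.0/dataFlows/executeDataFlow",
    params={"dataflowId": "<dataflowID>", "livyid": "<Livy Server ID>"},
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()
print(resp.status_code, resp.text)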
For the Web2 (Run Pipeline by Id) activity, use the settings below.
URL: https://<host>/piper/jobs
You can get <pipelineID> from the pipeline edit screen and <engine ID> from the cluster screen
Method: POST
Body:
{
    "pipelineId": "be23b136979e4a12968ab8586650f4e6",
    "inputs": {
        "pipelineDefaultEngine": 101,
        "param": "limit 10"
    }
}
In the body, param and pipelineDefaultEngine (the default engine) are optional.
Headers: Name: Authorization, Value: @{concat('Bearer ', activity('Web1').output.access_token)}
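The same pipeline submission as a standalone Python sketch (host and token are placeholder assumptions; requests' json= parameter sets the Content-Type: application/json header automatically):

import requests

HOST = "dataops.example.com"                  # assumption: replace with your DataOps host
token = "<access_token from the token call>"  # from Web1 / the token sketch above

# Same request as Web2 (Run Pipeline by Id); the "inputs" section is optional.
payload = {
    "pipelineId": "be23b136979e4a12968ab8586650f4e6",
    "inputs": {
        "pipelineDefaultEngine": 101,  # optional: engine ID from the cluster screen
        "param": "limit 10",           # optional pipeline parameter
    },
}
resp = requests.post(
    f"https://{HOST}/piper/jobs",
    json=payload,
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()
print(resp.status_code, resp.text)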