DataOps Server and DataOps Engine with YARN: Installation and Setup



DataOps Server Installation (refer to the installation document for full details)

1. Go through the prerequisites first.
2. To install only the Server, run the command below from inside the extracted folder:
sudo ./DataOpsServer_Redhat
After installation, update the line below in /etc/systemd/system/dataops.service:
Environment='JAVA_OPTS=-Djava.awt.headless=true -Djava.security.egd=file:/dev/urandom -Djava.io.tmpdir=/opt/datagaps/DataOpsServer/temp'
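systemd does not pick up unit-file edits automatically, so after changing dataops.service reload the daemon and restart the service (standard systemd behavior; the unit name comes from the path above):
      sudo systemctl daemon-reload
      sudo systemctl restart dataops.service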
3. Relocate the PostgreSQL data directory if required by your environment (see the sketch below).
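A minimal sketch of the relocation, assuming a stock PostgreSQL service and /opt/datagaps/pgdata as an example target (the service name, source path, and target path are all assumptions; adjust them to the PostgreSQL instance bundled with DataOps):
      sudo systemctl stop postgresql                              # stop PostgreSQL before moving data
      sudo rsync -a /var/lib/pgsql/data/ /opt/datagaps/pgdata/    # copy data, preserving ownership and permissions
      sudo chown -R postgres:postgres /opt/datagaps/pgdata
      # then set data_directory = '/opt/datagaps/pgdata' in postgresql.conf and restart
      sudo systemctl start postgresql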

DataOps Engine YARN Setup
Prerequisites 
1. Passwordless SSH is required between nodes and from each node to itself (see the sketch after this list).
2. A datagaps user is required on all nodes, with /opt/datagaps as its home directory.
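A minimal sketch covering both prerequisites, assuming three nodes reachable as node1, node2, and node3 (hypothetical hostnames):
      sudo useradd -m -d /opt/datagaps datagaps           # run as root on every node
      # as the datagaps user, generate a key and push it to every node, including the local one
      ssh-keygen -t rsa -N '' -f ~/.ssh/id_rsa
      for host in node1 node2 node3; do ssh-copy-id datagaps@$host; done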
Deployment
1. Extract the deployment files into /opt/datagaps/ (see the example below).
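For example, assuming the archive is named dataops-engine.tar.gz (a hypothetical name; substitute the actual deployment archive):
      tar -xzf dataops-engine.tar.gz -C /opt/datagaps/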
2. Edit the files below according to the environment.
   a. /opt/datagaps/hadoop/etc/hadoop/core-site.xml
          set the HDFS URI to hdfs://<hostname>:9000
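In a stock Hadoop core-site.xml this URI goes in the fs.defaultFS property (standard Hadoop property name; verify against the shipped file):
         <property>
           <name>fs.defaultFS</name>
           <value>hdfs://<hostname>:9000</value>
         </property>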
   b. /opt/datagaps/hadoop/etc/hadoop/yarn-site.xml
          update the hostname and copy the remaining values from the QA environment
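The host-bearing setting in a standard yarn-site.xml is yarn.resourcemanager.hostname; the remaining values to copy from QA vary, so this is only an illustrative fragment:
         <property>
           <name>yarn.resourcemanager.hostname</name>
           <value><hostname></value>
         </property>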
   c. /opt/datagaps/livy7/conf/livy.conf
          livy.server.host = <hostname>
   d. /opt/datagaps/spark-3.0.1-bin-without-hadoop/conf/spark-defaults.conf
          update the hostname on lines 72 to 76
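The exact properties on those lines are not reproduced here, but host-bearing entries in spark-defaults.conf typically look like the following (illustrative values only, consistent with the HDFS directories created in step 4; check the shipped file for the actual list):
         spark.eventLog.dir               hdfs://<hostname>:9000/var/log/spark/apps
         spark.history.fs.logDirectory    hdfs://<hostname>:9000/var/log/spark/apps
         spark.sql.warehouse.dir          hdfs://<hostname>:9000/user/spark/warehouse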
   e. add the lines below to /opt/datagaps/.bashrc
      export HADOOP_HOME=/opt/datagaps/hadoop
      export PATH=${PATH}:${HADOOP_HOME}/bin:${HADOOP_HOME}/sbin
      export JAVA_HOME=/opt/datagaps/hadoop/jre
      export PATH=${PATH}:${JAVA_HOME}/bin
      export HADOOP_CONF_DIR=/opt/datagaps/hadoop/etc/hadoop
   f. add the line below to /opt/datagaps/.profile
      PATH=/opt/datagaps/hadoop/bin:/opt/datagaps/hadoop/sbin:$PATH
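To confirm the new environment is active, re-source the file and check that the Hadoop binaries resolve (a quick sanity check, not part of the original steps):
      source /opt/datagaps/.bashrc
      hadoop version
      echo $JAVA_HOME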
 3. Copy the hadoop folder to /opt/datagaps/ on the other two nodes and make sure its owner is datagaps (see the example below).
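For example with rsync, assuming the other two nodes are node2 and node3 (hypothetical hostnames):
      for host in node2 node3; do rsync -a /opt/datagaps/hadoop datagaps@$host:/opt/datagaps/; done
      ssh datagaps@node2 'ls -ld /opt/datagaps/hadoop'    # verify the owner is datagaps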
 4. Run the commands below as the datagaps user on the master node.
      a. hdfs namenode -format
      b. start-dfs.sh
      c. start-yarn.sh
      d. hdfs dfs -mkdir -p hdfs:///var/log/spark/apps
      e. hdfs dfs -mkdir -p hdfs:///user/spark/warehouse
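To verify that HDFS and YARN came up, the standard checks are (verification only, not part of the original steps):
      jps                       # NameNode and ResourceManager should be running on the master
      hdfs dfsadmin -report     # DataNodes registered with the NameNode
      yarn node -list           # NodeManagers registered with the ResourceManager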
 5. Copy livyserver.service to /etc/systemd/system/.
 6. systemctl daemon-reload
 7. systemctl start livyserver.service
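Optionally enable the unit so Livy starts on boot, and confirm it is running (a common follow-up, not part of the original steps):
      systemctl enable livyserver.service
      systemctl status livyserver.service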

Temp and log directory locations
DataOpsServer
   /opt/datagaps/DataOpsServer/temp
   /opt/datagaps/DataOpsServer/logs
   /tmp
DataOpsEngine
   /opt/datagaps/hadoop/tmp
   /opt/datagaps/spark-3.0.1-bin-without-hadoop/tmp
   /opt/datagaps/livy7/tmp
   /opt/datagaps/livy7/logs
   /opt/datagaps/hadoop/logs
   /opt/datagaps/spark-3.0.1-bin-without-hadoop/logs
   /tmp