## Clone HiC Repo

This may not be necessary, but running the various tests to get a feel for the inputs and outputs is nice.

```
# if using a git ssh key
git clone git@github.com:ENCODE-DCC/hic-pipeline.git
```

## Environment Setup

The HiC pipeline is controlled by the Caper job manager. Caper is available on PyPI and only requires an additional installation of the Java Development Kit to function. Java can also be installed via `conda` for ease of use, or the Java module can be loaded on Cheaha. The following commands set up an `encode` environment from an `env.yml` file (the file itself is not reproduced here; a sketch of what it might contain is included at the end of this document).

```
module load Anaconda3
conda env create -n encode -f env.yml
```

After the environment is created, you only need to activate it to run `caper`, now and in the future.

```
conda activate encode
```

## Caper Configuration

Some initial configuration is needed for Caper to run correctly. First, activate the conda environment using the command above. Then run the following:

```
caper init slurm
```

This sets up some configuration files for Caper. You will need to change the `$HOME/.caper/default.conf` file to have the following:

```
backend=slurm
slurm-partition=amd-hdr100,long
slurm-leader-job-resource-param=-t 150:00:00 --mem 8G
local-loc-dir=/scratch/<username>/caper_cache
cromwell=/home/<username>/.caper/cromwell_jar/cromwell-82.jar
womtool=/home/<username>/.caper/womtool_jar/womtool-82.jar
```

Replace `<username>` with your username before running Caper. Caper does not appear to expand shell environment variables in these paths, so `$USER_SCRATCH` or `$USER` will not work. This default configuration submits all jobs to the `amd-hdr100` or `long` partitions. All of the nodes in those partitions have enough resources to run each job in the test pipeline, but you may need to adjust the partitions and resources for your own analysis.

## Running Caper Tests

Change directory to wherever you would like the outputs to be saved. This section assumes you have cloned the pipeline repository, are in the top-level directory of the repo, and are running the general HiC test.

```
caper hpc submit hic.wdl -i tests/functional/json/test_hic.json --singularity --leader-job-name test_encode_hic
```

This will submit a leader job that manages the other jobs in the pipeline. You can monitor the status of the child jobs using `squeue -u $USER`.
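As a quick sanity check after submission, the following commands are a minimal sketch of how you might watch the pipeline and cancel it if needed. The `<jobid>` and `<leader_jobid>` placeholders are values reported by `squeue`, not anything the pipeline prints for you.

```
# List your running jobs; the leader job appears alongside the child jobs it spawns
squeue -u $USER

# Show accounting details for a specific job ID reported by squeue
sacct -j <jobid> --format=JobID,JobName,State,Elapsed

# Cancel the pipeline by cancelling the leader job, then re-check squeue
# in case any already-submitted child jobs need to be cancelled separately
scancel <leader_jobid>
```

Caper versions recent enough to provide `caper hpc submit` generally also provide `caper hpc list` and `caper hpc abort <job_id>` for the same purposes, though availability depends on the installed Caper version.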
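For reference, the `env.yml` mentioned in the Environment Setup section is not reproduced above. A minimal sketch, assuming Caper is installed from PyPI via pip and the JDK from the conda-forge channel, might look like the following; the package names and versions here are illustrative, not the exact file used.

```
name: encode
channels:
  - conda-forge
  - defaults
dependencies:
  - python=3.9
  - openjdk=11   # JDK required by Caper/Cromwell
  - pip
  - pip:
      - caper    # Caper job manager from PyPI
```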