Biowulf at the NIH
Dindel on Biowulf

Dindel is a program for calling small indels from short-read sequence data ('next generation sequence data'). It is currently designed to handle only Illumina data.

Dindel takes BAM files with mapped Illumina read data and enables researchers to detect small indels and produce a VCF file of all the variant calls. It has been written in C++ and can be used on Linux-based and Mac computers (it has not been tested on Windows operating systems).

This program was developed by Cornelis Albers together with Gerton Lunter (Wellcome Trust Centre for Human Genetics, University of Oxford) and Richard Durbin (Wellcome Trust Sanger Institute).
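Note that extracting candidate indels with '--analysis getCIGARindels' (used in the examples below) is only the first of Dindel's four stages. A sketch of the full diploid workflow, following the Dindel documentation; sample.bam, ref.fa, and the output file names are placeholders for your own files:

```shell
# Stage 1: extract candidate indels and sequencing-library information from the BAM file
dindel --analysis getCIGARindels --bamFile sample.bam \
       --outputFile sample.dindel_output --ref ref.fa

# Stage 2: group the candidate indels into realignment windows
makeWindows.py --inputVarFile sample.dindel_output.variants.txt \
               --windowFilePrefix sample.realign_windows --numWindowsPerFile 1000

# Stage 3: realign reads and genotype each window file (repeat for every window file)
dindel --analysis indels --doDiploid --bamFile sample.bam --ref ref.fa \
       --varFile sample.realign_windows.1.txt \
       --libFile sample.dindel_output.libraries.txt \
       --outputFile sample.stage3_output.1

# Stage 4: merge the per-window results into a VCF file of variant calls
mergeOutputDiploid.py --inputFiles sample.stage3_outputfiles.txt \
                      --outputFile variantCalls.VCF --ref ref.fa
```

Stages 1, 2, and 4 are quick; stage 3 is the expensive step and is the natural candidate for a swarm (one window file per swarm line).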


The appropriate environment variables need to be set first. The easiest way to do this is with the modules commands, as in the example below.

$ module avail dindel
-------------------- /usr/local/Modules/3.2.9/modulefiles ----------------------

$ module load dindel

$ module list
Currently Loaded Modulefiles:
1) dindel/1.01

$ module unload dindel

$ module load dindel/1.01

$ module show dindel
-------------------------------------------------------------------
/usr/local/Modules/3.2.9/modulefiles/dindel/1.01:

module-whatis    Sets up dindel 1.01
prepend-path     PATH /usr/local/apps/dindel/1.01/binaries
-------------------------------------------------------------------

Submitting a Single Batch Job

1. Create a script file along the following lines.

#!/bin/bash
# This file is FileName
#PBS -N RunName
#PBS -m be
#PBS -k oe

module load dindel
cd /data/user/dindel/run1 
dindel --analysis getCIGARindels --bamFile sample.bam --outputFile sample.dindel_output --ref ref.fa

2. Submit the script using the 'qsub' command on Biowulf.

$ qsub -l nodes=1 /data/$USER/theScriptFileAbove
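After submission, qsub prints the job ID. Standard PBS commands can then be used to check on the job; for example (assuming the script above, with '#PBS -N RunName' and '#PBS -k oe'):

```shell
$ qstat -u $USER                  # list your queued and running jobs
$ ls ~/RunName.o* ~/RunName.e*    # stdout/stderr files kept in your home directory by '-k oe'
```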


Submitting a Swarm of Jobs

Using the 'swarm' utility, one can submit many jobs to the cluster to run concurrently.

Set up a swarm command file (e.g., /data/$USER/cmdfile). Here is a sample file:

dindel --analysis getCIGARindels --bamFile sample1.bam --outputFile sample.dindel_output1 --ref ref.fa
dindel --analysis getCIGARindels --bamFile sample2.bam --outputFile sample.dindel_output2 --ref ref.fa
dindel --analysis getCIGARindels --bamFile sample3.bam --outputFile sample.dindel_output3 --ref ref.fa
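With many BAM files, the command file can be generated with a short shell loop instead of being written by hand (a sketch; the sample*.bam names and ref.fa are placeholders for your own files):

```shell
#!/bin/bash
# Write one dindel command per BAM file into the swarm command file.
# sample1..3.bam and ref.fa are placeholders; substitute your own files,
# e.g. use *.bam to pick up every BAM in the current directory.
for bam in sample1.bam sample2.bam sample3.bam; do
    out="${bam%.bam}.dindel_output"
    echo "dindel --analysis getCIGARindels --bamFile $bam --outputFile $out --ref ref.fa"
done > cmdfile
```

Each line of cmdfile then runs as one independent swarm job.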


Submit this job with the 'swarm' command. The relevant flags are:

-f: the swarm command file name above (required)
--module: sets up the environment variables for each dindel job
-g: GB of memory needed by each command line in the swarm file above (optional)

By default, each command line above is executed on one processor core of a node and may use 1 GB of memory. If this is not what you want, specify the '-g' flag when you submit the job on Biowulf.

For example, if each command line needs 10 GB of memory instead of the default 1 GB, make sure swarm understands this by including the '-g 10' flag:

$ swarm -g 10 --module dindel -f cmdfile

For more information about running swarm, see the swarm documentation (swarm.html).


Running an Interactive Job

Users may sometimes need to run jobs interactively. Such jobs should not be run on the Biowulf login node. Instead, allocate an interactive node as described below and run the interactive job there.

$ qsub -I -l nodes=1
$ module load dindel
$ dindel --analysis getCIGARindels --bamFile sample.bam --outputFile sample.dindel_output --ref ref.fa
$ exit
	qsub: job 2236960.biobos completed

You can add a node property to the qsub command to request a specific type of interactive node. For example, if you need a node with 8 GB of memory to run a job interactively, do this:

$ qsub -I -l nodes=1:g8