Biowulf at the NIH
Rseg on Biowulf

The RSEG software package is designed to analyze ChIP-Seq data, especially for identifying genomic regions, and their boundaries, marked by diffuse histone modification markers such as H3K36me3 and H3K27me3.


NOTE: Related rseg files have been downloaded to /fdb/rseg


The environment variable(s) need to be set properly first. The easiest way to do this is with the modules commands, as in the example below.

$ module avail rseg
-------------------- /usr/local/Modules/3.2.9/modulefiles ----------------------
rseg/0.4.4          rseg/0.4.7          rseg/0.4.8(default)

$ module load rseg

$ module list
Currently Loaded Modulefiles:
  1) rseg/0.4.8

$ module unload rseg

$ module load rseg/0.4.7

$ module show rseg
-------------------------------------------------------------------
/usr/local/Modules/3.2.9/modulefiles/rseg/0.4.8:

module-whatis    Sets up rseg 0.4.8
prepend-path     PATH /usr/local/apps/rseg/0.4.8/bin
prepend-path     LD_LIBRARY_PATH /usr/local/gsl-x86_64/lib
-------------------------------------------------------------------


Submitting a Single Batch Job

1. Create a script file along the following lines:

#!/bin/bash
# This file is FileName
#PBS -N RunName
#PBS -m be
#PBS -k oe

module load rseg
cd /data/$USER/rseg
rseg -c /fdb/rseg/XXX.bed -o /data/$USER/rseg/out -i 20 -v /data/$USER/fileName 

2. Submit the script using the 'qsub' command on Biowulf.

$ qsub -l nodes=1 /data/username/theScriptFileAbove


Submitting a Swarm of Jobs

Using the 'swarm' utility, one can submit many jobs to the cluster to run concurrently.

Set up a swarm command file (e.g. /data/$USER/cmdfile). Here is a sample file:

module load rseg; rseg -c /fdb/rseg/XXX.bed -o /data/$USER/rseg/out1 -i 20 -v /data/$USER/fileName1
module load rseg; rseg -c /fdb/rseg/XXX.bed -o /data/$USER/rseg/out2 -i 20 -v /data/$USER/fileName2
module load rseg; rseg -c /fdb/rseg/XXX.bed -o /data/$USER/rseg/out3 -i 20 -v /data/$USER/fileName3
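For many input files, the command file can be generated with a short shell loop rather than written by hand. The sketch below is illustrative: the make_cmdfile helper and the paths passed to it are assumptions, not part of swarm or rseg itself.

```shell
#!/bin/bash
# Sketch: write one rseg command per input BED file into a swarm command file.
# make_cmdfile and the example paths are illustrative assumptions.
make_cmdfile() {
    local input_dir=$1 out_dir=$2 cmdfile=$3 i=0 f
    : > "$cmdfile"                       # truncate/create the command file
    for f in "$input_dir"/*.bed; do
        [ -e "$f" ] || continue          # skip when the glob matches nothing
        i=$((i+1))
        echo "module load rseg; rseg -c /fdb/rseg/XXX.bed -o $out_dir/out$i -i 20 -v $f" >> "$cmdfile"
    done
    echo "wrote $i commands to $cmdfile"
}

# Example (paths are placeholders):
# make_cmdfile /data/$USER/inputs /data/$USER/rseg /data/$USER/cmdfile
```

The generated file can then be submitted with 'swarm -f' as usual.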

Only one swarm flag is required: '-f'.

-f: the swarm command file name above (required)

By default, each command line above will be executed on one processor core of a node and may use up to 1 GB of memory. If this is not what you want, specify the '-g' flag when you submit the job on Biowulf.

For example, if each command needs 10 GB of memory instead of the default 1 GB, tell swarm by including the '-g 10' flag:

biowulf> $ swarm -g 10 -f cmdfile

For more information regarding running swarm, see swarm.html


Running an Interactive Job

Users may sometimes need to run jobs interactively. Such jobs should not be run on the Biowulf login node. Instead, allocate an interactive node as described below and run the interactive job there.

$ qsub -I -l nodes=1
qsub: waiting for job 2236960.biobos to start
qsub: job 2236960.biobos ready

$ module load rseg
$ cd /data/$USER/somewhereWithInputfile
$ rseg -c /fdb/rseg/XXX.bed -o /data/$USER/rseg/out -i 20 -v /data/$USER/fileName
$ ...........
$ exit
qsub: job 2236960.biobos completed

You can add node properties to the qsub command to request a specific type of interactive node. For example, if you need a node with 8 GB of memory to run a job interactively, do this:

$ qsub -I -l nodes=1:g8