Biowulf at the NIH
MISO on Biowulf

MISO (Mixture-of-Isoforms) is a probabilistic framework that quantitates the expression level of alternatively spliced genes from RNA-Seq data and identifies differentially regulated isoforms or exons across samples. By modeling the generative process by which reads are produced from isoforms in RNA-Seq, the MISO model uses Bayesian inference to compute the probability that a read originated from a particular isoform.

The MISO framework is described in Katz et al., Analysis and design of RNA sequencing experiments for identifying isoform regulation, Nature Methods (2010).


Running a batch job

NOTE: Some annotations are available under /fdb/miso. If a commonly used annotation is missing, please contact staff@helix.nih.gov.
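To check which annotations are already installed before requesting a new one, simply list the directory. This is a minimal sketch; the ANNOT_DIR variable and the fallback message are illustrative additions, and the default path is the /fdb/miso location from the note above:

```shell
# List centrally installed MISO annotations.
# ANNOT_DIR defaults to /fdb/miso on Biowulf; override it when testing elsewhere.
ANNOT_DIR="${ANNOT_DIR:-/fdb/miso}"
ls -1 "$ANNOT_DIR" 2>/dev/null || echo "Directory $ANNOT_DIR not found (are you on Biowulf?)"
```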

biowulf> $ mkdir /data/$USER/miso
biowulf> $ cd /data/$USER/miso

1. Create a script file containing lines similar to those below. Modify the paths to your data and annotations before running.

#!/bin/bash
# This file is YourOwnFileName
#
#PBS -N yourownfilename
#PBS -m be
#PBS -k oe

module load miso
cd /data/$USER/miso
index_gff.py --index /fdb/miso/hg19/SE.hg19.gff3 indexed_SE_events/
run_events_analysis.py --compute-genes-psi indexed_SE_events/ my_sample1.bam --output-dir my_output1/ --read-len 36 --use-cluster 
run_events_analysis.py --compute-genes-psi indexed_SE_events/ my_sample2.bam --output-dir my_output2/ --read-len 36 --use-cluster
....
....

2. Submit the script using the 'qsub' command on Biowulf:

biowulf> $ qsub -l nodes=1:g24:c16 /data/$USER/theScriptFileAbove


Submitting a swarm of jobs

Using the 'swarm' utility, one can submit many jobs to the cluster to run concurrently.

Set up a swarm command file (eg /data/$USER/cmdfile). Here is a sample file:

cd /data/$USER/XXX-1; index_gff.py --index SE.mm9.gff indexed_dir; run_events_analysis.py ........
cd /data/$USER/XXX-2; index_gff.py --index SE.mm9.gff indexed_dir; run_events_analysis.py ........
........
cd /data/$USER/XXX-20; index_gff.py --index SE.mm9.gff indexed_dir; run_events_analysis.py ........
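Typing twenty near-identical lines by hand is error-prone, so the command file can be generated with a short shell loop instead. This is a sketch: the XXX-1 ... XXX-20 directory names come from the sample above, and the trailing run_events_analysis.py arguments are placeholders you must fill in yourself:

```shell
#!/bin/bash
# Generate a swarm command file with one line per run directory.
# XXX-$i and the trailing analysis arguments are placeholders -- adjust them.
CMDFILE="${CMDFILE:-cmdfile}"
: > "$CMDFILE"        # truncate/create the command file
for i in $(seq 1 20); do
    echo "cd /data/$USER/XXX-$i; index_gff.py --index SE.mm9.gff indexed_dir; run_events_analysis.py ..." >> "$CMDFILE"
done
wc -l "$CMDFILE"      # one line per job
```

The generated file is then submitted with swarm exactly as described below.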


There are two swarm flags to know: '-f', which is required, and '-g', which is frequently needed.

-f: the swarm command file created above (required)
-g: gigabytes of memory needed by each command line in the swarm file (optional)

By default, each command line above is executed on one processor core of a node and is allocated 1 GB of memory. If this is not what you want, specify the '-g' flag when you submit the job on Biowulf.

For example, if each command line needs 10 GB of memory instead of the default 1 GB, tell swarm by including the '-g 10' flag:

[user@biowulf]$ swarm -g 10 -f cmdfile --module miso

For more information regarding running swarm, see swarm.html


Running an interactive job

Users may occasionally need to run jobs interactively. Such jobs should not be run on the Biowulf login node; instead, allocate an interactive node as described below and run the job there.

[user@biowulf] $ qsub -I -l nodes=1
qsub: waiting for job 2236960.biobos to start
qsub: job 2236960.biobos ready

[user@p4]$ cd /data/user/myruns
[user@p4]$ module load miso
[user@p4]$ index_gff.py --index SE.mm9.gff indexed
[user@p4]$ .......
[user@p4]$ exit
qsub: job 2236960.biobos completed
[user@biowulf]$

Users may add node properties to the qsub command to request a specific type of interactive node. For example, if you need a node with 24 GB of memory to run a job interactively, do this:

[user@biowulf]$ qsub -I -l nodes=1:g24:c16


Documentation

http://genes.mit.edu/burgelab/miso/docs/