h2. Computer Clusters

Our five clusters are Xerxes, Darius2, [Darius1|#darius], [Cyrus1|#cyrus1], and [Quantum2|#quantum2]. Additional information on each appears below, along with notes on using [Gaussian at MIT|#gaussian].

{anchor:darius}
h3. Darius1

_Hostname: darius1.csbi.mit.edu_
Darius1 is a 35-node cluster, installed April 2010. Information about Darius can be found on its [dedicated page|darius].

{anchor:cyrus1}
h3. Cyrus1

_Hostname: cyrus1.csbi.mit.edu_
Cyrus1 is a 24-node cluster, installed December 2008.

Please note that the CSBi network, on which Cyrus1 is hosted, does not allow access from IP addresses external to MIT. For remote access to Cyrus1, see the [MIT IST VPN site|http://ist.mit.edu/services/network/vpn]; a connection sketch follows.
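As a minimal sketch, assuming you already have a cluster account (replace "username" with your own) and are on the MIT network or connected through the IST VPN:

{code}
# Connect to the Cyrus1 head node (from inside MIT, or over the IST VPN)
ssh username@cyrus1.csbi.mit.edu
{code}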
h6. Status notes

Node 10 (n010) is currently inaccessible.
h6. Node information

The table below lists the memory and swap file size of each node on Cyrus1, along with the amount of space used and free in each node's local /scratch/ directory. This information is current as of April 1, 2010; a sketch for checking current values follows the table.
|| node || memory (MB) || swap (MB) || scratch used || scratch free ||
| head | 7982 | 16386 | | |
| n001 | 7982 | 8197 | 147G | 52G |
| n002 | 7982 | 8197 | 51G | 148G |
| n003 | 7982 | 8197 | 50G | 149G |
| n004 | 7982 | 8197 | 102G | 97G |
| n005 | 7982 | 8197 | 29G | 170G |
| n006 | 7982 | 8197 | 11G | 188G |
| n007 | 7982 | 8197 | 47G | 152G |
| n008 | 7982 | 8197 | 66G | 133G |
| n009 | 7982 | 8197 | 84G | 115G |
| n010 | | | | |
| n011 | 7982 | 8197 | 27G | 171G |
| n012 | 7982 | 8197 | 126G | 73G |
| n013 | 7982 | 8197 | 23G | 176G |
| n014 | 7982 | 8197 | 50M | 198G |
| n015 | 7982 | 8197 | 22G | 177G |
| n016 | 7982 | 8197 | 33M | 198G |
| n017 | 7982 | 8197 | 63G | 135G |
| n018 | 7982 | 8197 | 60G | 139G |
| n019 | 7982 | 8197 | 2.1G | 196G |
| n020 | 7982 | 8197 | 61G | 138G |
| n021 | 7982 | 8197 | 33M | 198G |
| n022 | 7982 | 8197 | 90G | 109G |
| n023 | 7982 | 8197 | 33G | 166G |
| n024 | 7982 | 8197 | 29G | 170G |
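Because these figures date from April 2010, you may wish to check current values yourself. A minimal sketch, run on the node of interest and assuming the local scratch space is mounted at /scratch as described above:

{code}
# Show used/free space on this node's local scratch directory
df -h /scratch
# Show physical memory and swap in MB, as reported in the table
free -m
{code}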
{anchor:quantum2}
h3. Quantum2

_Hostname: quantum2.mit.edu_

Quantum2 is a 20-node cluster installed in October 2007, which features high-memory nodes for quantum-chemical calculations.

h6. Available Software

VMD 1.8.7 has been installed and works with a user's local X Windows server. The executable is /usr/local/vmd-1.8.7 (actually a script rather than a binary executable) and can be invoked as "vmd-1.8.7".
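As a minimal sketch of launching VMD against a local X server, assuming "vmd-1.8.7" is on the default PATH as described above and that your SSH client supports X11 forwarding ("username" is a placeholder):

{code}
# Log in with X11 forwarding so VMD can draw on your local X server
ssh -X username@quantum2.mit.edu
# Launch VMD 1.8.7 via its wrapper script
vmd-1.8.7
{code}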
A number of software packages have been installed in /home/gpw501/software/. Please contact gwood@mit.edu with any questions; a very limited usage guide is given below.

GAMESS-US quantum chemistry package.
{code}
Usage: /home/gpw501/software/gamess/rungms JOB VERNO NCPUS >& JOB.log &
JOB    is the name of the JOB.inp file to be executed
VERNO  is the current version of GAMESS (01 at the time of writing)
NCPUS  is the number of CPUs
You must create a scratch directory matching your user name on the
node you are running from, i.e. /scratch/$USER
{code}

See the [GAMESS homepage|http://www.msg.ameslab.gov/GAMESS/] for more details.
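A concrete sketch of the usage line above (the input file water.inp and the CPU count are hypothetical placeholders; 01 is the version cited above):

{code}
# One-time setup: create the required per-user scratch directory on this node
mkdir -p /scratch/$USER
# Run water.inp with GAMESS version 01 on 4 CPUs, logging to water.log
/home/gpw501/software/gamess/rungms water 01 4 >& water.log &
{code}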
[CPMD|http://www.cpmd.org/] (QMMM version) plane-wave/PP quantum CP and BO dynamics.

{code}
Usage: mpirun -n NCPUS cpmd.x JOB PATH-TO-PPs >& JOB.out &
mpirun should be set to /opt/openmpi/tcp-gnu/bin/mpirun in your .bashrc
NCPUS       is the number of CPUs
JOB         is the name of the job file to be executed
PATH-TO-PPs is the path to a pseudopotential library (see, e.g., /home/gpw501/software/cpmd/pseudos/)
JOB.out     is the name of the output file
{code}
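A concrete sketch of the usage line above (the job file job.inp and the CPU count are hypothetical; the pseudopotential path is the example library mentioned above):

{code}
# Run job.inp on 8 CPUs with the example pseudopotential library,
# writing output to job.out in the background
mpirun -n 8 cpmd.x job.inp /home/gpw501/software/cpmd/pseudos/ >& job.out &
{code}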
[Amber 10|http://ambermd.org/] molecular dynamics program. Executables are located in /home/gpw501/software/amber10/exe/.

[Gromacs|http://www.gromacs.org/] molecular dynamics program. It was installed as root, so the binaries for MD and the Gromacs tools are located in /usr/local/bin/.
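To call the Amber 10 executables by name rather than by full path, you could append their directory to your PATH, as in this bash sketch (the Gromacs binaries in /usr/local/bin are normally on the PATH already):

{code}
# e.g. in ~/.bashrc: put the Amber 10 executables on the PATH
export PATH=$PATH:/home/gpw501/software/amber10/exe
{code}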
propka: electrostatic and pKa computations for proteins. See /home/gpw501/software/propka2.0src/README_PROPKA2.0 for details.
h6. Status notes

Nodes 4 and 14 (n004 and n014) are currently inaccessible, due to apparent hard disk problems.

h6. Node information

The table below lists the memory and swap file size of each node on Quantum2, along with the amount of space used and free in each node's local /scratch/ directory. This information is current as of April 1, 2010.

|| node || memory (MB) || swap (MB) || scratch used || scratch free ||
| n001 | 7970 | 16386 | 104G | 90G |
| n002 | 7970 | 16386 | 163G | 31G |
| n003 | 7970 | 16386 | 13G | 181G |
| n004 | | | | |
| n005 | 7970 | 16386 | 150G | 44G |
| n006 | 7970 | 16386 | 138G | 56G |
| n007 | 7970 | 16386 | 194G | 0 |
| n008 | 7970 | 16386 | 182G | 12G |
| n009 | 7970 | 16386 | 155G | 39G |
| n010 | 7970 | 16386 | 133G | 61G |
| n011 | 7970 | 16386 | 143G | 51G |
| n012 | 7970 | 16386 | 122G | 72G |
| n013 | 7970 | 16386 | 153G | 41G |
| n014 | | | | |
| n015 | 7970 | 16386 | 6.8G | 187G |
| n016 | 7970 | 16386 | 15G | 180G |
| n017 | 3942 | 16386 | 22G | 172G |
| n018 | 7970 | 16386 | 16G | 178G |
| n019 | 16026 | 16386 | 39G | 155G |

{anchor:gaussian}
h3. Gaussian03 at MIT

MIT currently has a site license to run Gaussian03 on IS&T-owned computers, such as those in Athena clusters. For more information, see [Gaussian on Athena|http://ist.mit.edu/services/software/gaussian/athena]. For information about creating and submitting Gaussian jobs, see these notes from the 10.675J course taught at MIT: [Basics of Running G03|http://ocw.mit.edu/NR/rdonlyres/Chemical-Engineering/10-675JFall-2004/5F4B11D4-60DF-46BF-82CD-0F193EA0E01D/0/g03_on_win.pdf].