Budget guidance for grant submissions can be found here: Grant Budgeting Information
Research Technology offers access to:
- High-performance computing, including GPUs
- 10 TB of shared research storage
- A range of software and applications ready to use
- Assistance and advice on computing, data analytics, data visualization, and programming
- Container-based application development environments
- Quick turnaround on technical issues and problems
batch is a collection of 21 compute nodes of varying age, CPU, and memory configuration, ranging from 12 to 48 CPUs and roughly 64 to 256 GB of RAM per node. CPU instruction-set support also varies, spanning sse4_2, avx, avx2, and avx512. As a result, certain programs may run incorrectly or fail to start on some nodes, which should be taken into consideration when submitting a script. Please contact us if you are having trouble getting a program to compile or execute on a specific node.
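Because the nodes are heterogeneous, one way to keep a job off unsupported hardware is SLURM's `--constraint` flag, which restricts scheduling to nodes that administrators have tagged with a given feature. The sketch below assumes the cluster advertises a feature named `avx2`; the actual feature names on batch may differ, so check with us before relying on this.

```shell
#!/bin/bash
# Hypothetical sbatch script: only run on nodes tagged with the "avx2"
# feature, so a binary built with AVX2 instructions won't land on an
# older sse4_2-only node. The feature name is an assumption -- the real
# list of node features can be shown with: sinfo -o "%N %f"
#SBATCH --job-name=avx2-only
#SBATCH --constraint=avx2    # schedule only on nodes advertising AVX2
#SBATCH --cpus-per-task=4
#SBATCH --mem=8G

./my_avx2_program            # placeholder for your executable
```

Submit the script with `sbatch` as usual; if no node matches the constraint, the job will remain pending rather than fail on incompatible hardware.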
To get access to the computing and storage resources, send an email with your netid to firstname.lastname@example.org
batch runs the Oracle RHEL 7 Linux operating system, so all software used on this system must be compatible with Linux. All compute nodes use the same Linux image and base packages. If you are not familiar with Linux, please schedule a consultation or follow one of the many introductory Linux courses or videos available on YouTube.
Connecting to batch
Access to batch is provided by first connecting to a headnode over Secure Shell (SSH). SSH can also be used to copy data to or from a given directory on batch, using scp or sftp. Two headnodes are provided for accessing storage and computational resources: head.arcc.albany.edu and headnode7.arcc.albany.edu. The headnodes should not be used to run computations; all computations must be submitted to batch via the scheduler. On macOS or Linux, you can connect via SSH from the terminal; on Windows, you can use PuTTY or X2Go. For more information, see the How-to: Connect via SSH (PuTTY, macOS terminal, X2Go) guide.
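From a macOS or Linux terminal, connecting and transferring files looks like the following. Replace `netid` with your own University netid; the file and directory names are placeholders for illustration.

```shell
# Connect to a headnode (replace "netid" with your University netid)
ssh netid@head.arcc.albany.edu

# Copy a local file up to your home directory on batch
scp results.csv netid@head.arcc.albany.edu:~/

# Copy a directory of results back from batch to the current local directory
scp -r netid@head.arcc.albany.edu:~/my_results ./
```

`sftp netid@head.arcc.albany.edu` provides an interactive alternative to scp if you prefer browsing the remote directory tree before transferring.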
batch is only accessible from within the University at Albany network. To connect from an external network, you must use the VPN. Please read more at the VPN GlobalProtect Service page.
For a live view of cluster usage, please see: https://computing.app.arcc.albany.edu/
Each user has a $HOME (/network/rit/home/[netid]) directory, with a quota of 10GB. The $HOME directory is backed up with snapshots, taken daily for the past 21 days and hourly for the past 23 hours. If you need to restore a file, please see the How-to: Restoring a deleted file or directory guide.
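As a rough sketch of what a snapshot restore looks like: on many storage systems the snapshots appear under a hidden `.snapshot` directory, from which you can copy files back yourself. Both the `.snapshot` directory name and the snapshot naming scheme below are assumptions and may differ on batch; the How-to: Restoring a deleted file or directory guide is authoritative.

```shell
# Hypothetical restore workflow -- paths and snapshot names are assumptions,
# not the confirmed layout on batch.
ls ~/.snapshot/                                   # list available snapshots

# Copy a single file out of a daily snapshot back into $HOME
cp ~/.snapshot/daily.latest/thesis.tex ~/thesis.tex
```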
Each research faculty member can apply for a $LAB (/network/rit/lab/[lab_name]) directory. The $LAB directory has a default quota of 10TB, and can be shared with members of the university as well as external users, by request of the directory owner. The $LAB directory is also backed up with snapshots, taken daily for the past 21 days and hourly for the past 23 hours. To request a $LAB directory, please fill out the Research Storage Request form. To see your storage footprint, including your snapshot space, please visit https://storage.app.arcc.albany.edu.
If you would like to access your storage drives from off campus, please see the How-to: Mapping to a network drive guide.
Note that snapshot space counts against the overall directory quota. If you need snapshots purged, please contact email@example.com
To run code on batch, you must submit a job to the scheduler; no computations should be performed on the headnodes. batch uses the SLURM scheduler to assign resources to each job and to manage the job queue. The smallest resource request is 1 CPU and 100 MB of memory; the largest is three compute nodes. If you need help optimizing your job scheduling, please contact firstname.lastname@example.org. For more information, please see How-to: Scheduling via SLURM.
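A minimal SLURM batch script might look like the sketch below. The resource values are illustrative examples within the limits described above, not recommendations for any particular workload.

```shell
#!/bin/bash
# Minimal SLURM job script sketch -- all values here are examples.
#SBATCH --job-name=example
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=1
#SBATCH --mem=100M           # the smallest allowed memory request
#SBATCH --time=01:00:00      # wall-clock limit of 1 hour

# Replace this with your actual computation
echo "Running on $(hostname)"
```

Submit with `sbatch example.sh`, monitor your queued and running jobs with `squeue -u $USER`, and cancel a job with `scancel <jobid>`.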
Group owners can request additions to and deletions from their groups/lab folders.
Faculty can request guest accounts for researchers outside the university who need access to their datasets for collaborations. Please fill out the Guest Account For Research Collaboration Request Form.
Faculty members have access to 10 TB of storage for their lab groups. To request a research storage allocation, please fill out the Research Storage Request Form.
Base Linux environments are available for use in the classroom. To request an academic virtual machine, please fill out the Academic VM Request Form.
| How-to guide | Author | Last updated |
| --- | --- | --- |
| How-to: Scheduling via SLURM | Nicholas Schiraldi | Apr 08, 2020 |
| How-to: Using RStudio | Nicholas Schiraldi | Mar 03, 2020 |
| How-to: Connect via SSH (PuTTY, macOS terminal, X2Go) | Nicholas Schiraldi | Jan 30, 2020 |
| How-to: Mapping to a network drive | Nicholas Schiraldi | Jun 27, 2019 |
| How-to: Use Matlab on the cluster | Nicholas Schiraldi | Jun 24, 2019 |
| How-to: Using CernVM-FS | Nicholas Schiraldi | Jun 07, 2019 |
| How-to: Using screen or tmux to preserve a Linux terminal session | Nicholas Schiraldi | Apr 25, 2019 |
| How-to: Restoring a deleted file or directory | Nicholas Schiraldi | Apr 15, 2019 |
| How-to: Using Jupyterhub | Nicholas Schiraldi | Apr 15, 2019 |