Overview of IT Support for Scientific Laboratories
Information Technology provides professionally managed and secured IT facilities and services in support of UMass Chan research. Facilities include a 5,000-CPU-core computational cluster with 44 GPUs, connected by high-speed Ethernet and InfiniBand interconnects to 2 petabytes of solid-state storage and housed in a commercial data center. UMass Chan also provides cloud-based software infrastructure and tools in AWS. These systems are managed and secured by professional staff to meet federal and international regulations and standards for the protection of electronic protected health information (ePHI), including HIPAA, GDPR, SOC1 Type II, SOC2 Type II, ISO 27001, PCI DSS, and ISO 27701. Professional staff also provide consultation services as well as collaboration tools such as electronic lab notebooks, electronic data capture, and data transfer capabilities.
Dedicated Professional Staffing
- Dedicated professional IT staff who administer, operate, and secure research IT systems
- Robust support for research tools, data management, and data repositories.
Cybersecurity
- Dedicated cybersecurity team ensuring compliance with NIH and other federal regulations
Specialized Research IT Support
- Dedicated IT support staff with expertise in biomedical research applications
- Consultation services for computational biology, bioinformatics, and data science
- Custom software development and integration services for research-specific needs
- Comprehensive support for lab-owned servers and equipment
High-Performance Computing (HPC)
- Scientific Computing for Innovation (SCI) Cluster: 5,000+ CPU cores and 44+ GPUs, interconnected by both 100 Gbps Ethernet and InfiniBand
- Red Hat Enterprise Linux 8 operating system with the IBM Spectrum LSF job scheduler (an example job submission is sketched after this list)
- Specialized biomedical software suite including bioinformatics tools, molecular modeling packages, and data analysis platforms
- Open OnDemand access to the cluster for web-based and graphical applications
- Support for parallel computing and large-scale data processing
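For illustration, the sketch below assembles a simple batch job and submits it to the LSF scheduler with bsub. The job name, queue choice, resource requests, and the placeholder command are assumptions for illustration only; actual queue limits, memory units, and module names should be confirmed against the cluster documentation.

```python
"""Minimal sketch: submit a batch job to the SCI cluster's IBM Spectrum LSF
scheduler. Queue names, resource requests, and the analysis command are
illustrative placeholders."""

import subprocess

# LSF job script: #BSUB directives are read by bsub from standard input.
job_script = """#!/bin/bash
#BSUB -J align_sample01          # job name (hypothetical)
#BSUB -q short                   # target queue (see queue list in the appendix)
#BSUB -n 8                       # request 8 CPU cores
#BSUB -R "rusage[mem=4000]"      # memory request (units depend on site configuration)
#BSUB -W 4:00                    # 4-hour wall-clock limit
#BSUB -o align_sample01.%J.out   # stdout; %J expands to the LSF job ID
#BSUB -e align_sample01.%J.err   # stderr

# Placeholder command; replace with the lab's actual workflow.
echo "Running on $(hostname) with $LSB_DJOB_NUMPROC cores"
"""

# bsub accepts the job script on stdin, equivalent to `bsub < job.lsf`.
result = subprocess.run(
    ["bsub"], input=job_script, text=True, capture_output=True, check=True
)
print(result.stdout.strip())  # e.g. "Job <12345> is submitted to queue <short>."
```

The same script can also be saved to a file and submitted directly with `bsub < job.lsf`; parallel (MPI) and array workloads follow the same pattern with adjusted #BSUB directives.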
Data Storage
- Single-tiered SCI storage system with over 2 PB of flash (SSD) capacity
- High-speed access to research data via 100 Gbps network backbone
Network Infrastructure
- Campus-wide Wi-Fi 6 and 10 Gbps Ethernet to the desktop; spine-and-leaf network topology with 40 Gbps interconnects among research buildings and 100 Gbps connectivity to the data center
- Secure VPN for remote access and collaboration
- Participation in Internet2 and regional research networks for high-speed data transfer with collaborating institutions
- 100 Gbps WAN connectivity to Internet2 and direct connections to all major cloud providers
Datacenter
- Commercial data center in Lowell with a 20-year history of 100% sustained uptime and diverse, redundant power from three independent providers
- Certified compliance with GDPR, SOC1 Type II, SOC2 Type II, ISO 27001, HIPAA, PCI DSS, and ISO 27701.
- 1,750 sq ft (expandable) with 22 kW of power to each rack and 900 tons of cooling
Security
- Multi-factor authentication and encryption protocols for data protection
- Regular security audits and vulnerability assessments
- User access and audit log aggregation
- Vulnerability and attack vector scans
Cloud Computing Resources
- Support for cloud-based data analysis and machine learning workflows
- Secure data transfer between on-premises and cloud environments (an encrypted-upload sketch follows this list)
- AWS-based software infrastructure and tools, secured to HIPAA standards
- Redundant and durable storage options to help ensure compliance with NIH data retention policies
- Disaster recovery planning, including air-gapped and off-site backups in different geographic regions
- Dedicated dark-fiber interconnects from campus to cloud datacenters
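As a sketch of a secure on-premises-to-cloud copy, the snippet below uses the AWS SDK for Python (boto3) to upload a file to S3 with server-side KMS encryption. The bucket name, key alias, and file paths are hypothetical, and transfers involving ePHI must follow the institutionally approved, HIPAA-aligned workflow rather than this example.

```python
"""Minimal sketch: copy a local data file to an AWS S3 bucket with
server-side encryption. Bucket name, KMS key alias, and paths are
hypothetical placeholders."""

import boto3

s3 = boto3.client("s3")  # credentials come from the environment or AWS config

s3.upload_file(
    Filename="results/sample01_counts.tsv",            # hypothetical local file
    Bucket="example-lab-research-data",                # hypothetical bucket
    Key="project-x/sample01_counts.tsv",
    ExtraArgs={
        "ServerSideEncryption": "aws:kms",             # encrypt at rest with KMS
        "SSEKMSKeyId": "alias/example-research-key",   # hypothetical key alias
    },
)
print("Upload complete")
```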
Collaboration Tools
- High-definition video conferencing facilities for remote collaboration
- Electronic lab notebook system (LabArchives) for efficient data management and sharing
- Globus endpoints for high-speed transfer and sharing of research data (see the transfer sketch after this list)
- Project management and version control tools for coordinating research efforts
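The sketch below shows one way a lab might drive a Globus transfer programmatically with the Globus Python SDK (globus_sdk). The endpoint UUIDs, paths, and access token are hypothetical placeholders; in practice, credentials are obtained through the standard Globus login flow.

```python
"""Minimal sketch: submit a Globus transfer between two endpoints using the
Globus Python SDK. Endpoint UUIDs, paths, and the token are placeholders."""

import globus_sdk

# Assumes a Globus Transfer access token obtained via the normal login flow.
TRANSFER_TOKEN = "REPLACE_WITH_TRANSFER_ACCESS_TOKEN"
SOURCE_ENDPOINT = "11111111-2222-3333-4444-555555555555"   # e.g., campus endpoint
DEST_ENDPOINT = "66666666-7777-8888-9999-000000000000"     # e.g., collaborator endpoint

tc = globus_sdk.TransferClient(
    authorizer=globus_sdk.AccessTokenAuthorizer(TRANSFER_TOKEN)
)

# Describe the transfer: checksum verification and a recursive directory copy.
task = globus_sdk.TransferData(
    tc, SOURCE_ENDPOINT, DEST_ENDPOINT,
    label="example dataset transfer",
    sync_level="checksum",
)
task.add_item("/project/example_dataset/", "/incoming/example_dataset/", recursive=True)

submission = tc.submit_transfer(task)
print("Submitted Globus task:", submission["task_id"])
```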
UMass Chan Medical School is committed to providing and maintaining cutting-edge IT resources that enable our researchers to conduct world-class biomedical research. Our IT facilities are continuously updated to meet the evolving needs of our research community and support demanding computational and data-intensive projects in biomedical science.
Appendix
Detailed High-Performance Computing node descriptions
- 62 Dell R640 nodes, each with 40 Intel Xeon Gold 6230 CPU cores at 2.10 GHz and 380 GB of memory
  - Available to the interactive, large, long, and short queues
- 20 Dell C6525 nodes, each with 128 AMD EPYC 7702 cores at 3.9 GHz and 500 GB of memory
  - Available to the interactive, long, and short queues
- 10 Dell C4140 nodes, each with four NVIDIA Tesla V100-SXM2-32GB GPUs, 40 Intel Xeon Gold 6230 CPU cores at 2.1 GHz, and 380 GB of memory
  - Available to the gpu queue
- 1 Dell XE8545 node with four NVIDIA A100-SXM4-40GB GPUs, 128 AMD EPYC 7763 cores at 2.4 GHz, and 500 GB of memory
  - Available to the gpu queue (an example GPU job submission is sketched below)
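Building on the batch example earlier in this document, the sketch below requests a single GPU on the gpu queue. The directive values are illustrative assumptions and should be adjusted to site policy.

```python
"""Minimal sketch: request one GPU on the cluster's gpu queue. Directive
values are illustrative; the -gpu resource string follows IBM Spectrum LSF
syntax."""

import subprocess

gpu_job = """#!/bin/bash
#BSUB -J train_model                        # hypothetical job name
#BSUB -q gpu                                # gpu queue (V100 and A100 nodes above)
#BSUB -gpu "num=1:mode=exclusive_process"   # one GPU, exclusive to this job
#BSUB -n 4                                  # 4 CPU cores alongside the GPU
#BSUB -W 8:00                               # 8-hour wall-clock limit
#BSUB -o train_model.%J.out

nvidia-smi    # placeholder: confirm the allocated GPU before the real workload
"""

print(subprocess.run(["bsub"], input=gpu_job, text=True,
                     capture_output=True, check=True).stdout.strip())
```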
Rates
- No cost for cluster computation
- $100 per terabyte per year for flash (SSD) storage (a sample cost calculation follows)
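A quick worked example of the storage rate, using a hypothetical 25 TB lab allocation:

```python
"""Minimal sketch: estimate annual storage charges at the posted rate of
$100 per terabyte per year for flash (SSD) storage; cluster computation is
free of charge. The 25 TB figure is a hypothetical lab allocation."""

FLASH_RATE_PER_TB_YEAR = 100   # USD, from the posted rates
lab_allocation_tb = 25         # hypothetical allocation

annual_cost = lab_allocation_tb * FLASH_RATE_PER_TB_YEAR
print(f"{lab_allocation_tb} TB of flash storage: ${annual_cost:,}/year")  # 25 TB -> $2,500/year
```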