Internet2 Announces Selection of Two Research Teams for Final Phase of Exploring Clouds for Acceleration of Science Project
The Internet2 project, supported in part through contributions from Amazon Web Services and Google Cloud, awards researchers from the Massachusetts Institute of Technology (MIT) and the State University of New York Downstate (SUNY Downstate) a second year of support with a focus on delivering scientific results
WASHINGTON, D.C., September 3, 2020 – Internet2 today announced that an external academic review panel has selected two research teams for the second and final phase of the Exploring Clouds for Acceleration of Science (E-CAS) project, first announced in November 2018. The two research projects are:
- Investigating Heterogeneous Computing at the Large Hadron Collider, Philip Harris, MIT. Only a small fraction of the 40 million collisions per second at the Large Hadron Collider (LHC) are stored and analyzed, due to the huge volumes of data and the compute power required to process them. This project proposes redesigning the algorithms with modern machine learning techniques that can be incorporated into heterogeneous computing systems, allowing more data to be processed and thus a larger physics output and potentially foundational discoveries in the field.
- Deciphering the Brain’s Neural Code Through Large-Scale Detailed Simulation of Cortical Circuits, Salvador Dura-Bernal and William Lytton, SUNY Downstate. This project aims to help decipher the brain’s neural coding mechanisms with far-reaching applications, including developing treatments for brain disorders, advancing brain-machine interfaces for people with paralysis, and developing novel artificial intelligence algorithms. Using a software tool for brain modeling, researchers will run thousands of parallelized simulations exploring different conditions and inputs to the simulation of brain cortical circuits.
The second phase of the E-CAS project builds on lessons learned and leading practices identified by the six research teams selected in March 2019, with the goal of producing a deeper understanding of how cloud computing can accelerate scientific discoveries.
“The first phase of the E-CAS project supported six teams to develop their computational workflows and test them at scale, and the results from all teams were very impressive,” said Howard Pfeffer, president and CEO, Internet2. “Now in the second phase, the teams from MIT and SUNY Downstate have the opportunity to build on their technological achievements using the commercial cloud platforms with a focus on the scientific outcomes of their work.”
Investigating Heterogeneous Computing at the Large Hadron Collider
The research team from MIT has developed a range of new tools and code to take advantage of the newest graphics processing units (GPUs) and field programmable gate arrays (FPGAs) to perform accelerated machine learning tasks in Amazon Web Services (AWS) and Google Cloud, using remote procedure calls from their main workflows running on high-performance clusters at MIT and Fermilab.
This allows the MIT-led team to accelerate the processing of data generated by the Large Hadron Collider by offloading the deep neural network algorithms to the cloud platforms, while executing the main central processing unit (CPU)-intensive work on campus or at national research infrastructures. The use of hardware accelerators and machine learning has increased the throughput of certain algorithms by a factor of 20, and by as much as 1,000 times over CPU-based clusters, and has opened the door to integrating new and more advanced algorithms.
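The offload pattern described above can be sketched as follows. This is a minimal, hypothetical illustration: `remote_infer` is a local stand-in for the team's actual remote procedure call to a cloud-hosted GPU/FPGA inference service, and the preprocessing, scoring, and threshold are invented for the sketch.

```python
from concurrent.futures import ThreadPoolExecutor

def cpu_preprocess(raw_event):
    # CPU-intensive reconstruction stays on the local HPC cluster.
    return [x * 0.5 for x in raw_event]

def remote_infer(features):
    # Stand-in for a remote procedure call to a cloud-hosted
    # GPU/FPGA inference server; returns a classification score.
    # (A real client would serialize `features` over the network.)
    return sum(features) / len(features)

def process_events(raw_events, threshold=0.6):
    """Keep only events the neural network scores above `threshold`."""
    kept = []
    features = [cpu_preprocess(e) for e in raw_events]
    # Keep many inference requests in flight at once, mimicking the
    # asynchronous offload of deep-learning work to cloud accelerators
    # while the CPU-side workflow continues locally.
    with ThreadPoolExecutor(max_workers=8) as pool:
        for feats, score in zip(features, pool.map(remote_infer, features)):
            if score > threshold:
                kept.append(feats)
    return kept
```

The key design point is that the expensive neural-network evaluation lives behind a narrow call boundary, so the same workflow can target local CPUs, cloud GPUs, or FPGAs without restructuring the surrounding analysis code.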
Fig. A: Overview of the grid computing used to process data coming from the LHC. Collisions like the one shown in the black box, a Higgs boson identified with an advanced deep neural network, start in Geneva, Switzerland, and are then sent across the globe to high-performance computing (HPC) clusters to be processed. As part of the E-CAS project, the MIT-led team has integrated GPUs in Google Cloud and FPGAs in AWS into the existing infrastructure, allowing the HPC clusters to apply deep neural networks seamlessly at speeds orders of magnitude faster than local hardware.
Deciphering the Brain’s Neural Code Through Large-Scale Detailed Simulation of Cortical Circuits
The research team from SUNY Downstate used a cluster of more than 100,000 cores on Google Cloud to run large-scale, highly detailed models of the brain’s motor cortex and auditory cortex. This enabled them to rapidly explore large parameter spaces and employ computationally demanding evolutionary algorithms to further refine the models and reproduce experimental results.
The simulations then provided insights into the molecular, cellular, and network mechanisms leading to movement and perception, which can help develop novel treatments for brain disorders. The team also deployed an auto-scaling, multi-user Google Cloud Kubernetes cluster that provided the wider scientific community with a friendly graphical user interface-based tool to develop their own brain circuit models online: www.netpyne.org.
Fig. B: Multiscale brain circuit modeling tool (NetPyNE) graphical interface showing a simplified version of the motor cortex model. The tool is hosted on Google Cloud Kubernetes and can be accessed for free by students and researchers, who can build, simulate, and analyze brain circuit models online.
- Watch an E-CAS phase 1 workshop presentation on Deciphering Neural Code Through Large-Scale Motor Cortex Simulation by SUNY Downstate.
- Watch an E-CAS phase 1 workshop presentation on Heterogeneous Computing at the Large Hadron Collider by Massachusetts Institute of Technology.
- Read a blog article on studying the brain’s neural code by simulating cortical circuits on 100k simultaneous cores using Google Cloud.
For more information about the E-CAS project, visit www.internet2.edu/ecas.
This material is based upon work supported by the National Science Foundation under Grant No. 1904444.
Internet2® is a non-profit, member-driven advanced technology community founded by the nation’s leading higher education institutions in 1996. Internet2 serves 323 U.S. universities, 60 government agencies, and 43 regional and state education networks, and through them supports more than 100,000 community anchor institutions. It also serves over 1,000 InCommon participants, 56 leading corporations working with our community, and 70 national research and education network partners that represent more than 100 countries.
Internet2 delivers a diverse portfolio of technology solutions that leverages, integrates, and amplifies the strengths of its members and helps support their educational, research, and community service missions. Internet2’s core infrastructure components include the nation’s largest and fastest research and education network, built to deliver advanced, customized services that are accessed and secured through the community-developed trust and identity framework.
Sara Aly, Internet2