SGX3
As an NSF-funded Center of Excellence, SGX3 supports researchers and teams building or operating a science gateway. SGX3 can support Globus users by sharing technical insight on science gateway platforms where data and research products can be stored or accessed. SGX3 offers a full range of services and expertise, including building and running gateways, community resources and networking, and education and training.
Argonne National Lab
Data Science and Learning Division (DSL) scientists have undertaken a project to develop a lab service for interactive, scalable, reproducible data science, leveraging machine learning methods to reduce simulation costs and increase data quality and value for researchers.
UC San Diego
With the growth in Globus managed endpoints, researchers at UC San Diego can now reliably transfer and share large data sets, including protected data, both internally and with collaborators outside the university.
New Zealand eScience Infrastructure (NeSI)
A few years ago, NeSI adopted Globus as its de facto national data transfer platform for research.
Transportation Energy Resources from Renewable Agriculture Phenotyping Reference Platform (TERRA-REF)
TERRA-REF used Globus to build a portal that provides open access to huge datasets that guide breeding decisions, facilitate collaboration, and enable unprecedented data sharing.
University of Pittsburgh
The University of Pittsburgh, together with the Pittsburgh Supercomputing Center, is one of five funded components contributing to the infrastructure for the NIH-funded Human BioMolecular Atlas Program (HuBMAP), which relies heavily on Globus for the software infrastructure needed to build a frictionless research platform.
University of Chicago, Argonne National Laboratory
The Data and Learning Hub for Science (DLHub) serves as an automated facilitator and interconnection point for ML models and associated data transformation and analysis tools. It allows researchers to describe and publish such tools in ways that support discovery and reuse; run published tools over the network (with tools executed on a scalable hosted infrastructure); and link models, other tools, and data sources into complete ML/AI pipelines that can themselves be published, discovered, and run. DLHub relies on Globus services for data management.