A guide to running computational workflows on DesignSafe, from interactive exploration in a Jupyter notebook to production-scale simulations on TACC supercomputers.
Contents
- How It Works: the DesignSafe portal, compute environments, storage, and workflow design
- Compute Environments: JupyterHub, VMs, HPC systems, queues, and allocations
- Storage and File Management: storage areas, paths across environments, file staging, and dapi file operations
- Submitting a Job Through the Portal: step-by-step walkthrough with screenshots
- Running HPC Jobs: job submission with dapi, resource parameters, and parallel execution
- Debugging Failed Jobs: job states, output files, and common failure patterns
- Parameter Sweeps: running hundreds of independent simulations with PyLauncher
- DesignSafe Applications: catalog of 45+ available tools
- Advanced Topics: Tapis internals, execution strategies, and custom app development
Quick example
Submit and monitor an HPC job from a Jupyter notebook using dapi:
```python
from dapi import DSClient

# Authenticate and connect to DesignSafe
ds = DSClient()

# Resolve a DesignSafe path to a Tapis URI for job input
input_uri = ds.files.to_uri("/MyData/analysis/input/")

# Build the job request for the OpenSees-MP app
job_request = ds.jobs.generate(
    app_id="opensees-mp-s3",
    input_dir_uri=input_uri,
    script_filename="model.tcl",
    max_minutes=60,
    allocation="your_allocation",
)

# Submit the job and track its status until completion
job = ds.jobs.submit(job_request)
job.monitor()
```

For dapi installation, authentication, and API reference, see the dapi documentation.