Tools for developing pluvial (excess rainfall) and fluvial scenarios for probabilistic flood risk analyses

Description

pfra-hydromet is a collection of tools for developing pluvial (excess rainfall) and fluvial scenarios for input to hydraulic models.

Pluvial:

These tools (Jupyter notebooks) ingest data from the NOAA Hydrometeorological Design Studies Center (HDSC) and return unique, weighted excess rainfall events suitable for use in 2D rain-on-grid hydraulic models. This approach relies on (see the sketch after the list):

  1. Meteorological data
  2. Random sampling
  3. Hydrologic transform
  4. Convolution algorithm for grouping
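
For orientation, the sketch below shows how the sampling and transform steps (2 and 3) might look in Python using the SCS curve number (CN) runoff equation. The function name, the example hyetograph, and the CN range are illustrative assumptions, not values taken from the notebooks.

```python
import numpy as np

def cumulative_excess(cum_precip_in, cn):
    """SCS curve number runoff (inches) for a cumulative rainfall series."""
    s = 1000.0 / cn - 10.0                       # maximum potential retention (in)
    ia = 0.2 * s                                 # initial abstraction (in)
    p = np.asarray(cum_precip_in, dtype=float)
    return np.where(p > ia, (p - ia) ** 2 / (p - ia + s), 0.0)

# Step 2 (random sampling): draw a hypothetical curve number.
rng = np.random.default_rng(seed=0)
cn = rng.uniform(75.0, 85.0)

# Step 3 (hydrologic transform): convert an illustrative cumulative hyetograph
# (inches) into incremental excess rainfall.
cum_precip = np.array([0.0, 0.4, 1.1, 2.3, 3.0, 3.3, 3.4])
incremental_excess = np.diff(cumulative_excess(cum_precip, cn), prepend=0.0)
```

In the notebooks, the resulting events are then grouped and weighted by the convolution step (step 4).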

Fluvial:

These Jupyter notebooks ingest HEC-SSP .rpt files containing flow frequency data for a specific USGS stream gage, calculated at a range of confidence limits, and return a series of events, stratified by annual exceedance probability, with discharge based on the mean flow frequency curve. This approach relies on (see the sketch after the list):

  1. Bulletin 17C flow frequency analysis
  2. Mean flow frequency curve
  3. Stratified sampling
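
The sketch below illustrates the stratified-sampling idea: annual exceedance probabilities (AEPs) are selected uniformly in log space, each carries the probability mass of its bin, and discharge is interpolated from a mean flow frequency curve. The curve values, function name, and bin weighting are illustrative assumptions rather than the notebooks' exact procedure.

```python
import numpy as np

# Hypothetical mean flow frequency curve: AEP vs. discharge (cfs); the values
# are illustrative only.
mean_curve_aep = np.array([0.5, 0.1, 0.02, 0.01, 0.002])
mean_curve_q = np.array([2000.0, 5500.0, 9800.0, 12000.0, 18000.0])

def stratified_aeps(aep_min, aep_max, n_events):
    """Select AEPs uniformly in log space; weight each by its bin's probability mass."""
    edges = np.logspace(np.log10(aep_max), np.log10(aep_min), n_events + 1)
    aeps = np.sqrt(edges[:-1] * edges[1:])   # geometric midpoint of each bin
    weights = edges[:-1] - edges[1:]         # probability mass spanned by each bin
    return aeps, weights

aeps, weights = stratified_aeps(aep_min=0.002, aep_max=0.5, n_events=10)

# Discharge for each event, interpolated from the mean curve in log-log space.
discharges = 10 ** np.interp(
    np.log10(aeps),
    np.log10(mean_curve_aep[::-1]),
    np.log10(mean_curve_q[::-1]),
)
```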

NOTE: EventsTable is the primary notebook for developing excess precipitation scenarios, and SSP_to_Mean_Curve is the primary notebook for calculating the mean flow frequency curve. Both of these notebooks can be executed with Papermill, which acts as a manager to maintain consistency in computation and to ensure that cells are executed in order. Manager notebooks are designated with the PM- prefix. Executed notebooks should be saved as documentation of the inputs, outputs, and results for a given project location.
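
As a usage illustration, a manager notebook might invoke Papermill roughly as follows; the output path and parameter names are placeholders, not the actual parameter cell of EventsTable.

```python
import papermill as pm

# Execute EventsTable with project-specific inputs and keep the executed copy
# as documentation of inputs, outputs, and results for the project location.
pm.execute_notebook(
    "EventsTable.ipynb",
    "ProjectArea_ModelName_EventsTable_executed.ipynb",
    parameters={"pluvial_params": "ProjectArea_ModelName_Pluvial_Parameters.xlsx"},
)
```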


Contents

notebooks:

pluvial:
  • PrecipTable: Retrieve NOAA Atlas 14 precipitation statistics for an Area of Interest (AOI).

  • PM-EventsTable: Manager notebook that executes EventsTable and/or reEventsTable.

  • EventsTable: Calculates excess rainfall using area-averaged NOAA Atlas 14 precipitation data, temporal distributions, the curve number (CN)* transform, and a convolution reduction algorithm (grouping). The output is a set of unique, weighted excess rainfall time series.

  • EventsTable_Stratified: Calculates a stratified sample of runoff events given rainfall and maximum potential retention distributions. For each event and corresponding return interval, the event weight, runoff value, maximum potential retention value, and rainfall value are calculated.

  • reEventsTable: Calculates the reduced excess rainfall given a user-specified stormwater removal rate and capacity. Given user-specified contributing areas (stormsheds), the lateral inflow hydrographs are also calculated for each event. (A sketch of the rate/capacity reduction appears after this list of notebooks.)

  • distalEventsTable: Calculates excess rainfall using updated randomized curve numbers and the original precipitation events calculated in EventsTable.ipynb. The events are combined using the groups determined during the convolution steps in EventsTable.ipynb. The reEventsTable notebook can then be executed in order to calculate the reduced excess rainfall.

  • JSON_to_DSS: Converts the JSON files generated by PM-EventsTable and/or distalEventsTable to a single DSS file.

  • MetadataExplorer: Explores the metadata file created by PM-EventsTable or distalEventsTable during the excess rainfall calculations.

  • Convolution_Parameters: Describes the test statistic and parameters used during the convolution step in the EventsTable notebook.

  • ProjectArea_ModelName_Pluvial_Parameters.xlsx : Excel Workbook used to store the CN, stormwater removal rate and capacity, and information on lateral inflow domains for each pluvial domain within a pluvial model. This Workbook is called by EventsTable, PM-EventsTable, distalEventsTable, and reEventsTable.
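
To make the reEventsTable step above concrete, the sketch below applies a constant stormwater removal rate to each time step of an excess rainfall series until a total capacity is exhausted. The function name, the units (incremental inches per time step), and the example values are assumptions; the notebook's actual implementation may differ.

```python
import numpy as np

def reduce_excess(excess_in, removal_rate_per_step, capacity):
    """Subtract a constant stormwater removal rate from each time step of an
    excess rainfall series (incremental inches) until the cumulative removed
    depth reaches the system capacity; names and units are assumptions."""
    reduced = []
    removed_total = 0.0
    for depth in excess_in:
        allowable = max(min(removal_rate_per_step, capacity - removed_total), 0.0)
        removed = min(depth, allowable)
        removed_total += removed
        reduced.append(depth - removed)
    return np.array(reduced)

reduced = reduce_excess([0.0, 0.2, 0.6, 0.4, 0.1],
                        removal_rate_per_step=0.1, capacity=0.25)
```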

fluvial:
  • PM_Sampler_Ops: Manager notebook that executes SSP_to_Mean_Curve, Stratified_Sampler, and Make_Production_Run_List.

  • SSP_to_Mean_Curve: Calculates the mean flow frequency curve using Bulletin 17C confidence limits calculated in HEC-SSP. (A sketch of one way to approximate this averaging appears after this list of notebooks.)

  • Stratified_Sampler: Calculates the weights for a specified number of annual exceedance probabilities (recurrence intervals) selected uniformly in log space between a minimum and maximum value.

  • Make_Production_Run_List: Calculates the discharge for each annual exceedance probability (AEP) within the weights table using the mean flow frequency curve.
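
One way to picture the SSP_to_Mean_Curve calculation is to average the discharge quantiles across the exported confidence limits at each AEP. The values below and the simple column-wise mean are illustrative assumptions only; the notebook works directly from HEC-SSP .rpt output and may combine the confidence limits differently.

```python
import numpy as np

# Hypothetical HEC-SSP output: discharge (cfs) at each AEP for several
# confidence limits; both the values and the layout are illustrative only.
confidence_limits = np.array([0.05, 0.25, 0.50, 0.75, 0.95])
aeps = np.array([0.10, 0.02, 0.01])
discharge = np.array([
    [ 7200.0,  6300.0,  5800.0,  5400.0,  4900.0],  # AEP 0.10
    [11800.0, 10300.0,  9500.0,  8800.0,  8000.0],  # AEP 0.02
    [14200.0, 12400.0, 11400.0, 10500.0,  9500.0],  # AEP 0.01
])

# Approximate the mean flow frequency curve by averaging the discharge
# estimates across the confidence limits at each AEP.
mean_curve = discharge.mean(axis=1)
```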

DataRepository:

  • Temporal_Distributions: Folder containing CSV files of the NOAA-published temporal distributions of observed rainfall, broken down by volume, region, duration, and quartile. Note that the original data were compiled into CSVs for uniform formatting.

  • Temporal_Distributions_Plots: Folder containing a Jupyter Notebook for each NOAA Atlas 14 volume with the plotted temporal distributions for each region, duration, and quartile.

  • NEH630_Table_10_1.json: A formatted copy of Table 10-1 from the National Engineering Handbook [Chapter 10](https://www.wcc.nrcs.usda.gov/ftpref/wntsc/H&H/NEHhydrology/ch10.pdf), which lists the CN values for dry and wet antecedent moisture conditions.

  • NOAA_Atlas_Volume_Codes.json: Metadata that maps the NOAA Atlas 14 volume number to the volume code. Source

  • NOAA_Temporal_Areas_US.geojson: GeoJSON file containing the vector polygons of the NOAA Atlas 14 temporal distribution areas. This file was constructed using the individual vector polygons for each volume. Source

  • Temporal_Distribution_Data_Map.json: Metadata used to extract the temporal distribution data from the csv files saved within the Temporal_Distributions folder.

  • Temporal_Quartile_Ranks.xlsx: Excel Workbook that contains the percentage of precipitation events whose temporal distributions are represented by those in each quartile of a specific volume/region/duration. Source

*The CN Method is currently the only transform method in use for this project. Other transforms are available and can be adopted into the tool with minor modifications.


Documentation

Complete project documentation can be found on Read the Docs.