
Landlab Landslide Model Testing #31

Open
ChristinaB opened this issue Mar 19, 2020 · 5 comments
@ChristinaB (Contributor)

  • Synthetic
  1. Use the code in the .py script to compute the mean and standard deviation from a netCDF file and save them as a pickle.

  2. Test the code inside 20191206_netcdf_DataDriven_spatial_Depth_Synthetic_LandlabLandslide.ipynb and save the output.

  3. Load the pickle into the lognormal spatial synthetic notebook and use it as input in place of the random input data.
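Step 1 might look like the following sketch, using a NumPy array as a stand-in for the netCDF variable (the filename, variable shape, and values are hypothetical, not the notebook's actual code):

```python
import pickle

import numpy as np

# Stand-in for the depth-to-water-table variable read from the netCDF;
# in the notebook this would come from xarray, e.g. data.wt with dims
# (time, y, x). Values here are synthetic.
dtw = np.random.lognormal(mean=0.5, sigma=0.3, size=(10, 3, 3))

mean_dtw = dtw.mean(axis=0)  # mean over time, one value per grid cell
std_dtw = dtw.std(axis=0)    # standard deviation over time, per grid cell

# Save both arrays in one pickle for the downstream notebook (hypothetical name)
with open("dtw_mean_std.pickle", "wb") as f:
    pickle.dump({"mean": mean_dtw, "std": std_dtw}, f)
```

The downstream notebook can then `pickle.load` this file and use the two arrays in place of the random inputs.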

@RondaStrauch (Collaborator)

What I was thinking for this task is to:

  1. Convert the mean and stdev from Nicoleta's Step_2 notebook from an xarray to a vector or NumPy array, and test it, so it can be exported/saved as text for direct use in a lognormal notebook similar to 20191207_netcdf_Lognormal_spatial_Depth_Synthetic_LandlabLandslide.ipynb (line 13), but with real data. We just need to make sure the flattened matrix starts with the first value representing grid node 0 at the bottom-left corner, so we need to test this. The 'chunk' function may do this already, and we may just need to bring it into Nicoleta's Step_2 notebook.



RondaStrauch commented Mar 19, 2020

Think we could do this with an array, flattening from the bottom left up to the top right:

import numpy as np

array = np.array([[6, 7, 8], [3, 4, 5], [0, 1, 2]])
fliparray = np.flip(array, axis=0)   # flip only the y axis (row order)
landlabarray = fliparray.flatten()   # flattens row by row
type(landlabarray)                   # should be numpy.ndarray
landlabarray

[0 1 2 3 4 5 6 7 8]

@ChristinaB (Contributor, Author)

This is the file to continue coding in:
20191206_netcdf_DataDriven_spatial_Depth_SCL_LandlabLandslide
or
20191206_netcdf_DataDriven_spatial_Depth_Synthetic_LandlabLandslide.ipynb


ChristinaB commented Mar 31, 2020

March 20, 2020 Update

  1. I have two folders of dates in this Slippery Future Data HS resource: this is the link to the Skagit dates.
  2. I finished postprocessing the Skagit historic model, but it's a huge model folder getting zipped, still running... Is there any other processing we need to do? Streamflow? Trying not to get distracted, but I want to check.
  3. Landlab input dictionary:
    Nicoleta's code was:
    # calculate mean of all grid cells
    mean_dtw_scl = data.wt.mean("time")

From here I sorted out these four outputs, which are arrays the length of the nodes. I'm running it on both her netCDF built from the ascii AND the flipped version of my netCDF, so we can compare. [One line with xarray, or 100 lines with me hacking it... let's go with xarray.] The second block is how we will build the dictionary for the lognormal forcing. Look good? We can query and compare the two to be sure it gets flattened to an array as expected.
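A minimal sketch of the per-node dictionary build (array sizes and names are placeholders, not the notebook's actual code):

```python
import numpy as np

# Hypothetical stack of DTW grids over time, rows stored top-down as read
# from the ascii/netCDF files: shape (time, y, x)
n_time, ny, nx = 5, 3, 3
dtw = np.arange(n_time * ny * nx, dtype=float).reshape(n_time, ny, nx)

# Flip the y axis so node 0 is the bottom-left cell, then flatten each
# time slice row by row to match Landlab's node ordering
flat = np.flip(dtw, axis=1).reshape(n_time, ny * nx)

# One time series per node, keyed by node id
dtw_dict = {node: flat[:, node] for node in range(ny * nx)}
```

Querying `dtw_dict[0]` and comparing it against the bottom-left cell of the original grids is the flattening check described above.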

  • Rerun this code with Skagit dates and Skagit map files. Input: one map file, read with xarray. Calculate mean/std from the netCDF. Test the flip. Save two arrays as two pickles for each model for Landlab I/O.
    See 20200331_map2netcdf2array_lognormal_spatial_Depth_SCL_LandlabLandslide.ipynb and https://www.hydroshare.org/resource/4cac25933f6448409cab97b293129b4f/

  • Load arrays into Landlab Notebook for lognormal_spatial. Output ascii files for probability.

  • Testing Notebook to loop through 7 - check space and memory limits on HydroShare

  • Setup runs on XSEDE for I/O and data driven spatial

  4. Planning for figures and data management
  • 3-hourly climate forcings in a new HydroShare repository, either separate from or the same as the Sauk model HydroShare repository
  5. Landlab component updates
  • Test notebook testing the four synthetic examples.

  • zoom into fire case study Goodell Creek for lognormal spatial
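The "output ascii files for probability" step could be sketched like this, writing a flattened probability array back out as an ESRI ASCII grid with plain NumPy (header values and filename are illustrative; Landlab also ships its own I/O helpers for this):

```python
import numpy as np

# Hypothetical flattened probability array (node 0 = bottom-left),
# reshaped back to a 2D grid for export
ny, nx, cellsize = 3, 3, 30.0
prob = np.linspace(0.0, 0.8, ny * nx)
grid = np.flip(prob.reshape(ny, nx), axis=0)  # rows top-down for ascii output

# Minimal ESRI ASCII header (origin and NODATA value are placeholders)
header = (
    f"ncols {nx}\nnrows {ny}\n"
    "xllcorner 0.0\nyllcorner 0.0\n"
    f"cellsize {cellsize}\nNODATA_value -9999\n"
)
with open("probability.asc", "w") as f:  # hypothetical filename
    f.write(header)
    np.savetxt(f, grid, fmt="%.4f")
```

Note the second flip: since the flattened array starts at the bottom-left, the rows must be reversed again before writing, because the ascii format stores the top row first.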

@ChristinaB
Copy link
Contributor Author

ChristinaB commented Apr 1, 2020

Notebook 1.0 performs the following hydrologic (DHSVM data) processing functions to create Landlab model inputs, with visualizations that illustrate the methods:

  • Import libraries and data from HydroShare
  • Run Notebook from CUAHSI JupyterHub server (public browser access)
  • Process DHSVM model output: input DTW ascii for each model instance (a DHSVM model with a unique time series of storms, saved based on maximum annual saturation events, one per climate forcing: historic (1) and future (6))
  • Add time stamp as date not string
  • Scale/Resample Hydro grid (150m) to Landlab grid (30m)

  • Use numpy and xarray to process and analyze mean and std of single model instance (30 m)

  • Add Markdown to discuss how to design Landlab utility;

  • Add and test Topmodel approach to resampling.

  • Visualize netcdf and landlab grid maps to demonstrate spatial orientation of array outputs
  • Output mean and standard deviation DTW arrays as model instance for Landlab lognormal spatial landslide model input (format.txt)

  • Loop through time to extract one array per node

  • Output dictionary DTW pickle as model instance for Landlab data-driven spatial landslide model input (format.pickle)
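The 150 m to 30 m resampling step in the list above could be prototyped as a simple nearest-neighbour expansion (a sketch only; the notebook's actual resampling method may differ):

```python
import numpy as np

# 150 m DHSVM grid (values are illustrative)
coarse = np.array([[1.0, 2.0],
                   [3.0, 4.0]])

# 150 m / 30 m = 5: replicate each coarse cell into a 5 x 5 block of fine cells
factor = 5
fine = np.kron(coarse, np.ones((factor, factor)))
```

Each 150 m cell value is simply repeated across the 25 fine cells it covers; a smoother interpolation (or the Topmodel approach mentioned above) would replace `np.kron` here.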

Use next Slippery Future Paper (Notebook 2.0) to Process Multiple Models:

Version 1: multiple hydrologic models with various climate forcings and lognormal spatial landslides for the SCL domain

  • Import multiple models to generate multiple mean and standard deviation arrays as model instances comparing historic and multiple futures
  • Output mean & stddev.txt, format.pickle, netcdf formats
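Version 1's multi-model loop might be sketched as follows (model names, array shapes, and filenames are hypothetical):

```python
import numpy as np

# Hypothetical model instances: one historic and two futures
models = ["historic", "future_1", "future_2"]

for i, name in enumerate(models):
    # stand-in per-model DTW data with shape (time, node)
    rng = np.random.default_rng(i)
    dtw = rng.lognormal(0.5, 0.3, size=(10, 9))
    # one mean and one std array per model instance, saved as text
    np.savetxt(f"{name}_mean_dtw.txt", dtw.mean(axis=0))
    np.savetxt(f"{name}_std_dtw.txt", dtw.std(axis=0))
```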

Landlab Landslide (Notebook 3.0) for running multiple Landlab landslide model instances (uniform, lognormal, lognormal-spatial, data-driven spatial)


Notebook 3.1 Synthetic domain run historic climate data with four Landslide models

Notebook 3.2 SCL domain Lognormal Spatial Climate Forcing Comparison (1-7 model instances):
This Jupyter Notebook runs the Landlab LandslideProbability component on a Seattle City Light Landlab grid, using four depth to water table options to replace the recharge options described in the paper:

  • Import mean and standard deviation arrays as model instance for Landlab lognormal spatial landslide model input (format.txt)
  • Create a grid and data fields used to calculate landslide probability
  • Specify Depth to Water Table Distributions to compare four options
  • Run LandslideProbability function from Landlab landslide component
  • Compare the sensitivity based on seven Depth to Water Table options
  • Output Landslide probability.asc, output netcdf
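Conceptually, the lognormal spatial option draws a depth to water table per node from a lognormal distribution fit to that node's mean and standard deviation. A toy Monte-Carlo illustration of that idea (not the component's actual equations; all parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)
mean_dtw, std_dtw = 1.2, 0.4   # per-node stats from the hydrologic model (m)
critical_depth = 0.5           # hypothetical critical water table depth (m)

# Moment-match the lognormal parameters to the node's mean and std
mu = np.log(mean_dtw**2 / np.sqrt(std_dtw**2 + mean_dtw**2))
sigma = np.sqrt(np.log(1.0 + (std_dtw / mean_dtw) ** 2))

# Monte-Carlo draws of DTW; the fraction shallower than the critical
# depth stands in for a per-node probability
draws = rng.lognormal(mu, sigma, 5000)
prob_shallow = (draws < critical_depth).mean()
```

The actual component repeats draws like these per node and folds them into a factor-of-safety calculation; this sketch only shows the distribution fitting and counting step.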

Use next Slippery Future Paper (Notebook 4.0) to Visualize Hydro + Landslides:

  • Import multiple model (netcdf) instances comparing historic and multiple futures
  • Create comparison visualizations of DTW differences
  • Create comparison visualizations of Landsliding differences
  • Create comparison visualizations of Climate Forcing differences
  • Output animations and journal figures
