
Project Journal

Ronda Strauch edited this page Oct 15, 2019 · 46 revisions

SCL LANDSLIDE PROJECT JOURNAL

Project Motivation: Produce a credible scientific hazard map for Skagit stakeholders.

Publication of the improved scientific processes, data, and models in peer reviewed open source online publications will demonstrate at a regional scale the influence of fire on landslide probability given climate change predictions.

Purpose: to track progress and planning for project activities. NOTE: most recent entries are moved to the top for ease of follow-up.

Priorities by Tuesday Oct 22, 2019 (next meetings): [Link to Landlab Driver Notebooks](https://www.hydroshare.org/resource/4cac25933f6448409cab97b293129b4f/)

10/14/2019 meeting

C + N - schedule meeting before Oct 14

  • plan 3 month calendar
  • Writing tasks outlined (draft)

Short List

  1. Add CCORA conference presentation to Figshare. Add link and slide version to the HydroShare resource with visualization data. DONE - but since this has nothing to do with the landslide project, it was put on Figshare instead.
  2. Get a DOI for the HS data table resource. Consider data publication (DIB? Christina, Crystal, Ronda, Erkan?).
  3. Reply to AGU. Christina will present Lightning Talk. DONE 10/15 (including notifying of presenter change)
  4. upload sketches of AGU figure design - DONE 10/15
  5. make calendar for figure, writing and poster deadlines
  6. finish invoice updating and follow up with Jill

Long List

Book AGU flights. Confirm all AGU posters and presentations.

Christina

  1. Get fire intensity into a Google layer; get phi into a raster layer for GIS input.
  2. model instance backup on HS
  3. test hypothetical Landlab component
  4. results to Dan
  5. UW-Dan Miller contract
  6. Test notebooks and new component code; edit to read in xarray datasets (Ronda and Christina)
  7. Push new component to Landlab (Christina)

Ronda

  1. Review Dan's comments and confirm whether our approach is sound; adjust if necessary. If OK, we can get the soil unit and associated phi from GIS onto HydroShare as our final needed input. (Ronda)

Nicoleta

  1. PR to https://github.com/Freshwater-Initiative/SkagitLandslideHazards: a script extracting the rows/columns of all input data that fit our study area (matching the rows/columns we gave Nicoleta), so that we have a matching set of inputs to run the landslide model. Maybe put it into another HydroShare resource or an existing data resource.

  2. Test workflow with Skagit DHSVM outputs - get distribution of depth to water table (dtw) (Nicoleta). Where is the data? Put it on HydroShare in the pyDHSVM folder in Slippery Future Skagit Landslide Hazard Model.
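The row/column clipping described in item 1 can be sketched as below. This is a minimal illustration with NumPy, assuming the DHSVM layer is a 2-D array and the study-area row/column bounds are already known; the grid values and indices are placeholders, not the real Skagit ones.

```python
import numpy as np

# Hypothetical DHSVM output layer covering the full model domain
# (a 10x10 stand-in grid; real layers would be read from file).
dhsvm_grid = np.arange(100).reshape(10, 10)

# Placeholder row/column bounds of the study area within the grid
# (not the actual indices shared with Nicoleta).
row_min, row_max = 2, 7
col_min, col_max = 3, 8

# Keep only the rows/columns that fall inside the study area,
# giving a matching set of inputs for the landslide model.
study_area = dhsvm_grid[row_min:row_max, col_min:col_max]
print(study_area.shape)  # (5, 5)
```

The same slice bounds would be applied to every input layer so all clipped grids stay aligned cell-for-cell.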

DONE: Test visualizing results with XrViz - AWESOME!!!

Oct 14 Agenda

  • test hypothetical Landlab component (Ronda and Christina)
  • test skagit dtw forcings on Landlab component (Ronda and Christina)
  • Nicoleta priorities

To do Next

  • write up landlab component equation changes - schedule for work session after Oct 14

09/25/2019 Final model runs from hyak to hydroshare (Christina)

Worked on GIS files, making maps. Google map of Goodell Fire and surrounding HUCs is ready for interactive publications!! Created a new HS resource for GIS files for the Skagit Basin at Skagit Basin Geospatial Model Information.

Added ice and snow to the HydroShare archive - talked to Nicoleta to re-verify that depth to water table is on HydroShare at: Landslide Hazard Modeling in the Skagit Basin

Made plans for tracking end of project in Slippery Future Google Drive.

07/24/2019

Worked on finalizing the Landlab component update for recharge. Equations for high-resolution work cancel everything out. Connected the soils data to the soils feature class in GIS so we could visualize phi and texture spatially. Sent images to Dan for review of phi and soil texture from GIS.

7/10/2019

Work on phi write up. Talked to Dan Miller.

06/27/2019

Work on phi research from Pellewiter 1981 and LISA manual.

06/03/2019

  1. Edited landlab component code, push to github
  2. Uploaded SkagitDHSVM GIS files to Google Drive and HydroShare for Nicoleta

5/16/2019

  1. Study area bounds in Lat/Lon to clip inputs to DHSVM
  2. Go through Landlab component code design
  3. Derive equation updates and write up
  4. Get rasters of % sand, % clay from Skagit soils database - upload existing soil table to HydroShare. DONE
     • Reclass LULC from cohesion in paper x 100 - set ice/snow/water to 1, wetland to 2 (Ronda). DONE
     • Review landslide code (Ronda and Christina)
  5. Assign paragraph/table/figure sized writing tasks as Github Issues, that can be moved over as cards.
  6. Review paper outline

5/7/2019

Ronda completed

  1. LNIC for N. Cascades
  2. Recoded to 8 LULC
  3. Converted reclass & cohesion and put on Hydroshare

Christina completed

  1. Restart 100 year model
  2. Make new HS resource
  3. Test hypothetical model with depth
  4. Copy original data resource and finish metadata

Nicoleta completed

  1. code update for correct saturation date printing

Work Agenda 5/7/2019

  1. Set up a test notebook with the hypothetical model, linked to the new component updated for depth to water table instead of recharge.

  2. Writing tasks from below.

  3. Review figures.

Work Agenda 5/2/2019

  1. Update 'what is a storm?' code and finalize. (Nicoleta). Testing by Ronda and Christina

  2. Write up methods. Add figure to paper with description (Nicoleta).

  3. Design workflow for DHSVM - Landlab coupling: ASCII to probability distribution for 150 m and 30 m grid cells (Christina and Nicoleta). Christina will get code samples from Jim's work.

  4. Finalize GIS inputs for Landlab component (Ronda). Write up sections and data management for digital appendix. Move paper to Google doc or Word with versions on Hydroshare.

  5. Setup test notebook linked to new component updated for depth to water table instead of recharge. Test with recharge first.

  6. Review figures and plan for next meeting.
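The ASCII-to-probability-distribution workflow in item 3 could look roughly like the following sketch: stack the gridded outputs over time and compute a per-cell empirical exceedance probability. This is an illustration only - the threshold and random data are placeholders, and the real workflow would read DHSVM ASCII grids from disk (e.g. with np.loadtxt) at the 150 m or 30 m resolution.

```python
import numpy as np

# Stand-in for a stack of DHSVM ASCII grids: one 2-D array per timestep
# (random values here; real grids would be loaded from ASCII files).
rng = np.random.default_rng(0)
stack = rng.random((50, 4, 4))  # 50 timesteps, 4x4 grid cells

# Hypothetical critical threshold (e.g. a depth-to-water-table cutoff;
# not a calibrated value).
threshold = 0.8

# Per-cell empirical probability of exceeding the threshold
prob = (stack > threshold).mean(axis=0)
print(prob.shape)  # (4, 4)
```

Each cell of `prob` is the fraction of timesteps above the threshold, i.e. an empirical distribution summary that the landslide component could consume.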

Work DONE 5/2/2019:

Set up Zotero

Set up Google Team Drive Doc and reviewed Paper outline, figures, and research questions.

Set up Github fork connections to Master.

Christina delivered saturation extent dates to Nicoleta on Github, and depth to water table DHSVM outputs on HydroShare. Download dtw_DHSVMoutput_20190502.tar

Worked on items 1-4 above. To do for next meeting is #5

Updated issues: GIS input parameters, What is a Storm, DHSVM Groundwater,

Work accomplished 4/16/2019

  1. Reviewed forcing data paper for methodology input
  2. Review streamflow tools based on DHSVM modeling
  3. Identified GIS files remaining to be uploaded to hydroshare
  4. Justified GCM 3 model selection
  5. Load landslide.py code to hydroshare needed to edit for groundwater depth

Intentions for 4/16/2019

  0. Run model for last 10 years

  1. Finish GIS inputs - on HydroShare and ready to use in a notebook. Be ready to edit the Landslide component
  2. Figure out code workflow needs for May work - big.asc on Hydroshare with python code to generate animations; clip skagit row/col to landlab row/col
  3. finalize hydroshare DHSVM appendix from Skagit Report

Housekeeping:

  1. Update and Add new Github issues (realignment process, collaboration culture review,...)
  2. Add due dates and milestones to the issues.
  3. Set next sprints in calendar.
  4. Amend contract

4/5/2019

  1. look at DHSVM results DONE
  2. upload GIS files to hydroshare DONE

Additionally:

Mapped depth to water table, soil, and ice for sample outputs.

Investigated inconsistencies in Okanogan soil data

Cleaned up merged soil polygon file for mapping DHSVM parameter inputs.

Reviewed soil depth and soil parameter processes from previous work - started new Github issue.

Finished study area outline and boundary - on Hydroshare.

Discussed strategy to subset landslide model as a clip within Skagit DHSVM

4/4/2019

Notice that the check marks are faint but definitive. DONE!
Ronda and Christina cleaned up this Github repo, closed or assigned remaining issues and dropped cards into Project.

Landslide Sprint #1

10/1/2018 Nicoleta, Christina, and Ronda met at eScience. Reviewed code to print dates for DHSVM model runs. Discussed model runs on Mox.Hyak and transfer formats. Outlined paper:

Is landslide probability more sensitive to climate change (recharge) or fire (cohesion) ? Use Goodell Fire footprint 2015 August.

10/1/2018 Nicoleta and Christina met to review 9-24 decisions and plans for next steps - deliver list of peak annual saturation events in DHSVM formatted dates for all provided time periods, scenarios, and models (N) to start Skagit model runs on AWS (C).

9-24-2018 Nicoleta and Christina met to review code for selecting storms. Reviewing sediment peaks, temperature model availability, updated Github with all available saturation_extent.txt files. Later work for paper discussion and figures will look at peak events from streamflow and precipitation.

It would take 2 terabytes of space to run the Sauk temperature model for 1915-2099 for 20 models (10 GCMs; 2 scenarios). Can we change the DHSVM temperature model to print saturation_extent but not the other big Outflow.Only file? To get a continuous saturation_extent time series, we need to rerun the temperature model for the missing parts on Pogolinux.

Christina and Ronda met to review the contract. Files from the DHSVM-RBM Sauk model results for historic and future were committed to this repo under the folder 'saturation_extent_files'. 'What is a Storm?' Issue was updated.

8-30-2018 Nicoleta presented draft code and results for saturation extent - Jupyter Notebook is on this repo (see Code).

8-2018 Christina worked on Skagit and Sauk model instance updates. The temperature model must be 'On' for the saturation extent file to be printed. This also prints a very large Outflow.Only file, which gives streamflow at every timestep for each link in the network (1300 for Skagit; 400 for Sauk). The plan is to use the Sauk saturation extent output file to extract the date of the annual maximum peak saturation extent, and use this to print DHSVM model outputs. Pseudo code and plan discussed with Nicoleta. The limitation is that the Sauk report and model runs include only the 1960-2010 and 2030-2099 time series. Our goal is to have an annual time series from which to develop a distribution. TBD - do we need to rerun the Sauk model, or the Skagit model, for 1915-2099?
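The planned extraction of annual peak saturation dates can be sketched with pandas. This is a toy series with made-up dates and values, not the real saturation_extent output; the real series would be parsed from the DHSVM saturation_extent files.

```python
import pandas as pd

# Toy saturation-extent time series (made-up dates and values;
# the real data comes from DHSVM saturation_extent output files).
ts = pd.Series(
    [0.10, 0.35, 0.20, 0.15, 0.40, 0.05],
    index=pd.to_datetime(
        ["1960-01-15", "1960-11-02", "1960-12-30",
         "1961-02-01", "1961-11-20", "1961-12-05"]
    ),
)

# Date of the annual maximum saturation extent for each year;
# these dates would then drive which DHSVM outputs get printed.
peak_dates = ts.groupby(ts.index.year).idxmax()
print(peak_dates.loc[1960].date())  # 1960-11-02
```

The resulting dates would then be reformatted into DHSVM's expected date format for the model-output print list.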

Christina worked on setting up purchase order and virtual machine instance the correct size for the Skagit model runs on Amazon Web Service. The cost per run for the Skagit is estimated at $1000 (actual cost TBD).

Hyak has been overhauled with a new operating system, and the batch job submission code needs to be rewritten. Christina investigated with the Computational Hydrology lab (Bart) and has an 8/31 meeting set up with Oriana to learn the new code needed to run the Skagit model on Hyak - but currently there is not sufficient storage space to run the model for 1915-2099 (for all future runs - how many do we really need?).

7-2018 Rebudgeted project to include Nicoleta Cristea and Crystal Raymond.

6-2018 Hired Nicoleta Cristea as Freshwater-eScience research scientist

5-2018 Developed project slide show and Ronda presented to the Climate Impacts Group.

4-19-18

Added Issue for code publication with checklist of plan. PPT of project overview added to GitHub. Christina will share with CEE collaborators (Jeff, Jessica, Erkan) to see how Jeff's paper contributes (or not).

What is a metadata management plan? Who is funding? Contributing? Collaborating? So that research products are accurately annotated.

Deliverable management plan: list of research products = data, code, papers, maps, reports, Storymap (list similar to NSF prior results section "demonstration that products are available", FAIR statement

Data management plan: list of datasets - archive of inventory that has HydroSHare DOIs/hyperlink

Code management plan: Github -> collaborating Githubs e.g. PNNL -> JOSS publication(s)

Provenance/metadata management plan: funder, author, owner, contributor, institution, communication

4-16-18 Developed PowerPoint presentation for CIG brownbag presentation, given this date. Feedback from the presentation includes:

  • Some wanted more information about uncertainty and were confused about how the probability accounts for uncertainty... need to make this explicit/clearer.
  • Amy asked what am I most worried about - said just getting this done: allocation of time to do this project and some anxiety about using DHSVM as a new hydrologic model input.
  • Rob could help improve maps... consider rolling up to the HUC 12 watershed level in the end for communication.
  • How are we planning on comparing? Said possibly 'percent change'.
  • Should think about making a 'story map' for this project using Esri's app, where we can have a time slider and roll over the historical map to the future map.

Put the PowerPoint slide on GitHub.

3-6-18 Upload signed Data Use Agreement as a new HydroShare project. Add description to a new Wiki page. Document project management approach on new Wiki page. Decide that next priority is to schedule and prepare for a CIG project introduction presentation. Agenda for next week: Draft presentation. Upcoming data tasks: develop saturation time series python code

2-27-18 Uploaded DHSVM model outputs for the saturation extent time series (see Code folder in this repository). Developed Issues and next steps for linking DHSVM to the Landlab landslide component.

2-20-18 Reviewed HydroShare and GitHub project setup. Added Project in GitHub with tasks.