10 Worked Examples
This chapter focuses on reviewable cases, not just runnable scripts.
10.1 Worked Example A — Fraser Sockeye gap-fill and layer review
10.1.1 Scenario
You received updated Sockeye annual inputs and need refreshed CU outputs. The main review question is not merely “did the script run?” but:
do the trend layer, abundance layer, and documented exceptions still make sense?
10.1.3 Review before release
- Inspect DATA_TRACKING/FraserSockeye_MatchingCheck.csv.
- Review DATA_TRACKING/Sites not in CUs.csv.
- Check CU-specific prep tables in DATA_TRACKING/FraserSockeyePrep/.
- Update exception-register.csv for any suppressions, timing overrides, or manual fills used in the run.
- Confirm you can explain which stream set supports cu_trend and which supports cu_abundance.
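The matching-check review above can be backed by a small helper that lists sites with no CU assignment. This is a sketch, not the pipeline's code: the column names Site_ID and CU_ID are assumptions, and the toy data frame stands in for DATA_TRACKING/FraserSockeye_MatchingCheck.csv.

```r
# Hypothetical helper: list sites that did not match any CU.
# Column names (Site_ID, CU_ID) are assumptions, not the project's schema.
unmatched_sites <- function(match_check) {
  match_check[is.na(match_check$CU_ID), "Site_ID"]
}

# Toy demo in place of DATA_TRACKING/FraserSockeye_MatchingCheck.csv
demo <- data.frame(Site_ID = c("S1", "S2", "S3"),
                   CU_ID   = c("CU-A", NA, "CU-B"))
unmatched_sites(demo)  # "S2"
```

Any site this returns should either appear in Sites not in CUs.csv or be explained in the exception register.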
10.2 Worked Example B — Interior Fraser Coho decomposition review
10.2.1 Scenario
You need updated Coho CU and pop outputs. The main review question is:
is the split between the CU layer and the broader pop layer still intentional, and do the natural/hatchery decomposition tables support the final outputs?
10.2.3 Review before release
- Inspect DATA_PROCESSING/Calc.Nat Table for <CU>.csv for one or more key CUs.
- Review DATA_PROCESSING/All IFC CUs BY Table_EC.max=NA_infill=FALSE.csv.
- Confirm the pop output is intentionally broader than the CU output.
- Record how Final.Estimate.Type == "NO" was handled.
- Update the exception register if any naming or estimate-type patches were required.
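The Final.Estimate.Type check above can be made concrete with a quick tally of the affected rows. This is a sketch on toy data: only the Final.Estimate.Type column name comes from the pipeline; Pop_ID and the values are illustrative.

```r
# Sketch: count the rows where Final.Estimate.Type == "NO" so their
# handling can be recorded explicitly. Toy data, not the real pop output.
pop <- data.frame(Pop_ID = c("P1", "P2", "P3"),
                  Final.Estimate.Type = c("YES", "NO", "NO"))
no_rows <- pop[pop$Final.Estimate.Type == "NO", ]
nrow(no_rows)  # 2 rows whose handling should be documented
```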
10.3 Worked Example C — Lower Fraser Chum composition review
10.3.1 Scenario
You need current Chum CU and pop outputs. The main review question is:
can you explain why the CU totals and pop sums are not the same thing?
10.3.3 Review before release
- Drop any unnamed first columns before QC.
- Confirm the CU layer reflects the intended Harrison treatment.
- Confirm the pop layer retains the expected decomposition logic.
- Write a plain-language note explaining the expected CU/pop non-equality.
- Record any special handling in exception-register.csv if a manual patch was required.
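The plain-language note on CU/pop non-equality is easier to write after tabulating where the two layers actually diverge. The sketch below uses toy data; the CU_Total and Estimate column names are assumptions, and only Year/CU/pop terminology comes from the chapter.

```r
# Sketch: compare CU totals against pop sums by year to locate the
# expected divergences. Toy data; column names are illustrative.
cu  <- data.frame(Year = c(2000, 2001), CU_Total = c(100, 120))
pop <- data.frame(Year = c(2000, 2000, 2001),
                  Estimate = c(60, 40, 90))
pop_sum <- aggregate(Estimate ~ Year, data = pop, FUN = sum)
merged  <- merge(cu, pop_sum, by = "Year")
merged$Equal <- merged$CU_Total == merged$Estimate
merged  # 2000 matches; 2001 intentionally does not
```

Years where Equal is FALSE are exactly the cases the plain-language note should explain.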
10.3.4 Minimum QC
drop_unnamed <- function(d) d[, names(d) != "", drop = FALSE]
cu <- drop_unnamed(read.csv("DATA_OUT/Cleaned_FlatFile_ByCU_FraserChum.csv"))
stopifnot(!any(duplicated(cu[c("CU_ID", "Year")])))
pop <- drop_unnamed(read.csv("DATA_OUT/Cleaned_FlatFile_ByPop_FraserChum_all.csv"))
stopifnot(all(c("Pop_ID", "Year") %in% names(pop)))

10.4 Worked Example D — Fraser Pink historical transition review
10.4.1 Scenario
You need updated Pink outputs. The main review question is:
does the release still preserve the official CU series while carrying the historical context layer correctly?
10.4.3 Review before release
- Confirm the CU file still reflects the official CU series.
- Inspect the pop/context file around the pre-1993 to post-1993 transition.
- Confirm the Fraser aggregate row is present where expected.
- Confirm its trend fields remain intentionally NA where that is the method.
- Document any omitted or suppressed historical years.
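The intentional-NA check above can be automated rather than eyeballed. The sketch below is on toy data: the Trend column and the "FraserAggregate" Pop_ID value are illustrative assumptions, not the file's actual schema.

```r
# Sketch: assert that the Fraser aggregate row's trend fields are NA,
# since NA is the method there, not a bug. Toy data; names are assumed.
ctx <- data.frame(Pop_ID = c("FraserAggregate", "P1", "P1"),
                  Year   = c(1992, 1992, 1994),
                  Trend  = c(NA, 0.4, 0.5))
agg <- ctx[ctx$Pop_ID == "FraserAggregate", ]
stopifnot(nrow(agg) >= 1)        # the aggregate row is present
stopifnot(all(is.na(agg$Trend))) # its trend fields stay NA by design
```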
10.4.4 Minimum QC
drop_unnamed <- function(d) d[, names(d) != "", drop = FALSE]
cu <- drop_unnamed(read.csv("DATA_OUT/Cleaned_FlatFile_ByCU_FraserPink.csv"))
stopifnot(!any(duplicated(cu[c("CU_ID", "Year")])))
pop <- drop_unnamed(read.csv("DATA_OUT/Cleaned_FlatFile_ByPop_FraserPink_All_Nuseds.csv"))
stopifnot(all(c("Pop_ID", "CU_ID", "Year") %in% names(pop)))

10.5 Worked Example E — Final CU/WSP bundle handoff
10.5.1 Scenario
Any species run has passed species-level QC and now needs a stable handoff for WSP metrics, packaging, or a downstream consumer.
10.5.2 Assemble the bundle
- cu_timeseries.csv
- wsp_metric_specs.csv
- wsp_cyclic_benchmarks.csv (optional)
- cu_metadata.csv (optional)
- run-log.md
- decision-log.md
- qc-summary.md
- metadata_notes.md
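A minimal presence check makes the bundle assembly verifiable before handoff. The required/optional split follows the list above; the bundle_dir argument and the helper itself are a sketch, not part of the pipeline.

```r
# Sketch: verify the required handoff files exist in the bundle directory.
# The required/optional split mirrors the bundle list; bundle_dir is an
# assumed argument, not a project convention.
required <- c("cu_timeseries.csv", "wsp_metric_specs.csv",
              "run-log.md", "decision-log.md",
              "qc-summary.md", "metadata_notes.md")
optional <- c("wsp_cyclic_benchmarks.csv", "cu_metadata.csv")

check_bundle <- function(bundle_dir) {
  missing <- required[!file.exists(file.path(bundle_dir, required))]
  if (length(missing) > 0)
    stop("Bundle is missing: ", paste(missing, collapse = ", "))
  invisible(TRUE)
}
```

Optional files can be reported rather than enforced, since their absence is a legitimate state of the bundle.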
10.5.3 Review before release
- Confirm the CU IDs in the time series and metric specs match.
- Confirm intentional NA patterns are documented.
- Confirm the exception register has been reviewed, not just populated.
- Confirm key intermediate QC artifacts are linked from qc-summary.md.
- Record any downstream consumer export as a derived file.
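The CU ID match between the time series and the metric specs can be asserted directly. This is a sketch on toy stand-ins for cu_timeseries.csv and wsp_metric_specs.csv; the CU_ID column name follows the earlier examples, the rest is illustrative.

```r
# Sketch: the set of CU IDs in the time series must equal the set in the
# metric specs. Toy frames stand in for the two bundle files.
ts    <- data.frame(CU_ID = c("CU-A", "CU-B", "CU-A"),
                    Year  = c(2000, 2000, 2001))
specs <- data.frame(CU_ID  = c("CU-A", "CU-B"),
                    Metric = c("trend", "trend"))
stopifnot(setequal(unique(ts$CU_ID), unique(specs$CU_ID)))
```

Any ID present in one file but not the other should be resolved, or recorded in the exception register, before handoff.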