FSAR Data Standardization Workflow
Use this workflow when preparing salmon datasets for Fisheries Science Advisory Report (FSAR) analysis, review, and Salmon Population Summary Repository (SPSR) intake.
Canonical package specification (Markdown): Salmon data package specification
If you want a concrete walkthrough, use the companion page: FSAR to SPSR: End-to-End Example.
Step 1 — Define scope, classification, and destination
- Confirm dataset scope (what, where, when).
- Confirm data classification and sharing constraints.
- Confirm whether this package is intended for SPSR intake, external publication, or both.
- Policy guidance: Publishing Data Externally
Step 2 — Map columns to canonical ontology terms
- Use GC DFO Salmon Ontology (WIDOCO) to find canonical terms.
- Use the metasalmon reusing-standards vignette for the maintained package-side lookup and reuse workflow.
- Record full canonical IRIs for mapped terms.
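The column-to-term mapping can be kept as a small machine-readable table versioned with the package. A minimal sketch in R, where the column names, terms, and IRIs are illustrative placeholders rather than actual GC DFO Salmon Ontology entries:

```r
# Record each source column, its canonical term, and the full canonical IRI.
# All terms and IRIs below are placeholders for illustration only.
term_map <- data.frame(
  source_column  = c("esc_count", "cu_name"),
  canonical_term = c("escapementCount", "conservationUnitName"),
  canonical_iri  = c(
    "https://example.org/salmon-ontology#escapementCount",
    "https://example.org/salmon-ontology#conservationUnitName"
  ),
  stringsAsFactors = FALSE
)
# Fail fast if any mapped term is missing a full IRI (Step 2 requirement).
stopifnot(all(grepl("^https?://", term_map$canonical_iri)))
write.csv(term_map, "term_map.csv", row.names = FALSE)
```

Keeping this table under version control makes the Step 2 mapping decisions auditable during review.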
Step 3 — Create a review-ready salmon data package
For the actual package build and review loop, use the canonical metasalmon docs:
- metasalmon 5-Minute Quickstart
- metasalmon Publishing Data Packages
- metasalmon Post-review publication
Use the generated package folder as the working FSAR artifact. If you need the normative field definitions or manual file layout, use the salmon data package specification.
Starter assets: the generated package itself. Treat this package as the source of truth for both validation and any SPSR upload files you generate later.
Step 4 — Review + validate package quality
Finish the metasalmon review loop before submission:
- review the generated `metadata/*.csv` files
- review any `README-review.txt` or `semantic_suggestions.csv` files created with the package
- resolve missing placeholders or review-marked semantic values before downstream submission
- run the package-level validator before generating SPSR upload files
Minimum pre-flight check:
```r
metasalmon::validate_salmon_datapackage(
  pkg_path,
  require_iris = TRUE
)
```

where `pkg_path` is the package folder created by `create_sdp()` or your project-owned package build step.
Step 5 — Prepare route-scoped SPSR intake from the package
Use the package to prepare the current SPSR upload profile that matches your data grain.
Minimum set of supported route scopes:
- CU/composite
- SMU
- Population
Current working rules:
- start from the current SPSR templates: https://spsr.dfo-mpo.gc.ca/download_sdp_templates
- keep the salmon data package as the authoritative source, not a side spreadsheet
- if you are doing a bulk upload, use the same package-first route rather than a separate workflow
- document mapping assumptions explicitly when similarly named fields are not equivalent
- keep provenance explicit for transformed or derived values
- keep package metadata and route-scoped upload CSV(s) versioned together
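The package-first rules above can be sketched as a small derivation script. Everything below is an assumption for illustration: the output column names must be replaced with the fields from the current SPSR template, and the input would normally be read from the package's data files rather than defined inline.

```r
# Hypothetical package data for a CU/composite route; in practice this
# would be read from the salmon data package, not defined inline.
pkg_data <- data.frame(
  cu_name   = c("CU-A", "CU-A", "CU-B"),
  year      = c(2021, 2022, 2021),
  esc_count = c(1500, 1720, 430)
)
# Reshape to the upload profile. Column names are placeholders: take the
# real ones from the current SPSR template download.
upload <- data.frame(
  conservation_unit   = pkg_data$cu_name,
  return_year         = pkg_data$year,
  escapement_estimate = pkg_data$esc_count,
  # Keep provenance explicit for derived values, per the rules above.
  provenance          = "derived from salmon data package v1.0"
)
write.csv(upload, "spsr_upload_cu.csv", row.names = FALSE)
```

Regenerate this CSV from the package on every change so the upload file never drifts from the authoritative source.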
Useful SPSR references:
- Repo: https://github.com/dfo-pacific-science/salmon-population-summary-repository
- App docs: https://spsr.dfo-mpo.gc.ca/documentation/
- Upload wizard: https://spsr.dfo-mpo.gc.ca/wizard/1/
If you are preparing CU/composite escapement estimates, use the focused recipe: CU/composite escapement estimates for package-first SPSR intake.
Step 6 — Upload and review in SPSR
- Upload via SPSR wizard or the current approved bulk route.
- Resolve validation feedback.
- Re-upload if needed until intake checks are clean.
- Record accepted upload version, intake date, and any route-specific notes.
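One way to keep that record versioned alongside the package is a small intake log. The file name and fields here are a convention assumed for this sketch, not an SPSR requirement:

```r
# Append-style intake record kept next to the package metadata.
# Field names and the file name are assumptions, not SPSR-mandated.
intake_record <- data.frame(
  upload_version = "v1.0.0",
  intake_date    = as.character(Sys.Date()),
  route          = "CU/composite",
  notes          = "clean intake on first pass"
)
write.csv(intake_record, "spsr_intake_log.csv", row.names = FALSE)
```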
Step 7 — Publish externally (if in scope and approved)
- Follow policy + approvals for external release.
- Include citation/version metadata and release notes.