
Work through this page once, before trying the homepage quick example or the 5-Minute Quickstart.

It covers two things:

  1. GitHub credentials so remotes::install_github() works reliably.
  2. Optional LLM provider setup if you want llm_assess = TRUE later.

1) GitHub setup for installs and updates

metasalmon is installed from GitHub. Public installs often work without a token, but setting up a GitHub Personal Access Token (PAT) up front avoids rate limits and awkward auth failures.

Create a GitHub token

Create a PAT in GitHub (Settings → Developer settings → Personal access tokens).

For public package install/update, a read-only token is enough. If you also plan to read private GitHub repositories later, make sure the token can access those repositories too.
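If you prefer to start from R, the usethis package (an assumption here; it is not required by this page) can open the token-creation page for you with sensible scopes pre-selected:

```r
# One common route (assumes the usethis package is installed):
# opens github.com in your browser with recommended PAT scopes pre-filled.
# Guarded so it only runs in an interactive session.
if (interactive()) {
  usethis::create_github_token()
}
```

Either way, copy the token once it is generated; GitHub will not show it again.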

Store it in ~/.Renviron

file.edit("~/.Renviron")

Add:

GITHUB_PAT="paste token here"

Restart R after saving.
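After restarting, you can confirm the token was picked up without printing its value; a minimal check (the variable name `GITHUB_PAT` is the one used above):

```r
# Sanity check after restarting R: is the PAT visible to this session?
# Returns TRUE/FALSE; the token value itself is never printed.
pat_is_set <- nzchar(Sys.getenv("GITHUB_PAT"))
pat_is_set
```

If this returns `FALSE`, check that the `.Renviron` line was saved and that you restarted R.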

Install metasalmon

install.packages("remotes")
remotes::install_github("dfo-pacific-science/metasalmon")

If you also need private GitHub CSV access later

The install token and the package’s private-repo CSV helpers are related but separate setups: the PAT above covers installing the package, while reading CSVs from a private repository is verified per repository.

When you later want to use read_github_csv() against a private repository, run:

ms_setup_github(repo = "your-org/your-private-repo")

That verifies private GitHub access for the repository you actually intend to read.

2) Optional LLM provider setup

The base quick example and quickstart do not require an API key.

Only do this setup now if you want to run:

  • create_sdp(..., llm_assess = TRUE), or
  • suggest_semantics(..., llm_assess = TRUE)

If you also plan to pass PDF or Excel files through llm_context_files, install the optional readers once. HTML, DOCX, .R, .Rmd, and .qmd inputs work without extra reader packages:

install.packages(c("pdftools", "readxl"))
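Before installing, you can check which of the optional readers are already available (package names taken from the line above):

```r
# Report which optional reader packages are already installed.
# Returns a named logical vector; install.packages() any FALSE entries.
readers <- c("pdftools", "readxl")
installed <- vapply(readers, requireNamespace, logical(1), quietly = TRUE)
installed
```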

DFO internal: chapi

If you are on the DFO internal network or VPN, open:

Then:

  1. click the user icon in the bottom left,
  2. open Settings,
  3. click Show next to API Keys,
  4. copy the key value.

Store it in ~/.Renviron:

file.edit("~/.Renviron")
CHAPI_API_KEY="paste key here"

Optional overrides:

CHAPI_MODEL="ollama2.mistral:7b"
CHAPI_BASE_URL="https://chapi-dev.intra.azure.cloud.dfo-mpo.gc.ca/api"
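After saving and restarting, you can list which chapi variables the session actually sees (variable names as above; only set/unset status is shown, never the values):

```r
# Show which chapi-related variables are set, without revealing values.
vars <- c("CHAPI_API_KEY", "CHAPI_MODEL", "CHAPI_BASE_URL")
setNames(nzchar(Sys.getenv(vars)), vars)
```

`CHAPI_API_KEY` should be `TRUE`; the two overrides may legitimately be `FALSE` if you accepted the defaults.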

External users: OpenRouter

file.edit("~/.Renviron")
OPENROUTER_API_KEY="paste key here"

With llm_provider = "openrouter", the model defaults to openrouter/free.

External users: OpenAI

file.edit("~/.Renviron")
OPENAI_API_KEY="paste key here"

Then choose an explicit OpenAI model when you call the LLM review path, for example:

suggested <- suggest_semantics(
  df = your_data,
  dict = your_dict,
  llm_assess = TRUE,
  llm_provider = "openai",
  llm_model = "gpt-4.1-mini"
)

3) Next step

After this setup is done, go back to:

If you are continuing from a reviewed package later, use: