Warning
While it’s relatively easy to stay within R2’s free tier, understand the pricing before deciding to follow along. That said, at the time of writing, R2 offers free egress along with up to 1 million Class A and 10 million Class B operations per month in the free tier. Always understand pay-as-you-go pricing. You may use Amazon S3 or another alternative as well.
The Problem #
As stated previously, I wanted to keep my source and images separated due to concerns of a growing repository. I also still wanted to take advantage of Hugo’s page bundles and general content layout so that adding new images didn’t become a chore where I had to keep track of URLs and manually link to new pages.
My original inspiration for creating a photo gallery involved pulling EXIF metadata during Hugo’s build pipeline, which required page bundles similar to the following:
content
└───photos
    ├───page-bundle-1
    │   ├───3S4A1359.JPG
    │   └───index.md
    │
    ├───page-bundle-2
    │   ├───3S4A20422.JPG
    │   └───index.md
    │
    └───page-bundle-3
        ├───3S4A3477.JPG
        └───index.md
This allowed pages to be generated automatically alongside their corresponding images using the following layout:
layouts
└───photos
    └───page.html

{{ range .Resources.ByType "image" }}
  <img src="{{ .RelPermalink }}" alt="{{ .Name }}">
{{ end }}
Since the images won’t be in the same repo as the source code, they won’t be in the build pipeline either. This means I needed a different solution that runs before the build takes place.
The Solution #
In order to keep things simple, both in terms of logical groupings of resources and the ability to reference the file structures in the layout templates, I’m still creating bundles and placing their corresponding images within them.
From here, I use .gitignore to hide all images from source control and use a similar filter with rclone to only include images in R2.
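As a sketch, the two filter sets might look like the following; the extension list is an assumption, so adjust it to the formats you actually shoot in:

```
# .gitignore — keep image binaries out of source control
content/photos/**/*.jpg
content/photos/**/*.JPG
content/photos/**/*.jpeg

# images.filters — rclone filter file: sync only images, skip everything else
+ *.jpg
+ *.JPG
+ *.jpeg
- *
```

In rclone’s filter syntax, rules are evaluated top to bottom, so the trailing `- *` excludes everything the earlier `+` rules didn’t match.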
This leaves us with the following remote structures:
source-control:photos
├───page-bundle-1
│   └───index.md
│
├───page-bundle-2
│   └───index.md
│
└───page-bundle-3
    └───index.md

r2-bucket:photos
├───page-bundle-1
│   └───3S4A1359.JPG
│
├───page-bundle-2
│   └───3S4A20422.JPG
│
└───page-bundle-3
    └───3S4A3477.JPG
Using Front Matter for Filenames #
One thing we are missing in this solution is the ability to reference the file at build time. This can be mitigated by including the filename as a field in the front matter.
---
title: 'My Image'
date: 2025-09-14T23:46:38-04:00
hero: '3S4A24353.JPG'
---
It can then be combined with a base URL in the site configuration and referenced in a template.
{{ define "main" }}
  {{ $cdn := or .Params.cdnBase site.Params.cdnBase }}
  {{ $name := or .Params.hero "hero.JPG" }}
  {{ $pageDir := path.Dir .File.Path }}
  {{ $rel := path.Join $pageDir $name }}
  {{ $url := urls.JoinPath $cdn $rel }}
  <img class="artist-photo" src="{{ $url }}" alt="{{ .Title }}">
{{ end }}
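The corresponding site configuration might look like this in hugo.toml; the domain below is a placeholder for your bucket’s public development URL or custom domain:

```toml
[params]
  cdnBase = 'https://images.example.com'
```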
Automation and Exif #
Even though we lose some image processing capabilities by moving images outside of the build pipeline, some things like gathering exif metadata can be processed beforehand with exiftool.
$ErrorActionPreference = 'Stop'

$sourceDir  = Join-Path $PSScriptRoot 'content'
$includeExt = 'jpg', 'jpeg'
$tags       = 'Make', 'Model', 'FNumber', 'ExposureTime', 'ISO', 'FocalLength', 'WhiteBalance', 'Lens'
$tagArgs    = $tags | ForEach-Object { "-$_" }

Get-ChildItem -LiteralPath $sourceDir -File -Recurse |
    Where-Object { $includeExt -contains $_.Extension.TrimStart('.').ToLowerInvariant() } |
    ForEach-Object {
        $img     = $_.FullName
        $outJson = [System.IO.Path]::ChangeExtension($img, 'json')

        $raw = & exiftool -j @tagArgs -- "$img"
        if ($LASTEXITCODE -ne 0 -or -not $raw) {
            Write-Warning "Failed: $img"
            return
        }

        # Remove SourceFile and write JSON back
        $arr = $raw | ConvertFrom-Json
        foreach ($o in $arr) { $null = $o.PSObject.Properties.Remove('SourceFile') }
        $arr | ConvertTo-Json -Depth 4 | Set-Content -Path $outJson -Encoding UTF8
    }
Tip
This crawls through all of content and, next to any image it finds, writes a JSON file with the same base name containing its metadata. It can and should be further tailored into the automation for creating new bundles and importing photos.
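For reference, a generated sidecar for a single image might look like this; the values are illustrative, not pulled from a real file:

```json
{
  "Make": "Canon",
  "Model": "Canon EOS R6",
  "FNumber": 2.8,
  "ExposureTime": "1/500",
  "ISO": 100,
  "FocalLength": "70.0 mm",
  "WhiteBalance": "Auto",
  "Lens": "RF70-200mm F2.8 L IS USM"
}
```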
To reference the json within the template, we can update it.
{{ define "main" }}
  {{ $cdn := or .Params.cdnBase site.Params.cdnBase }}
  {{ $name := or .Params.hero "hero.JPG" }}
  {{ $jsonFile := replace $name (path.Ext $name) ".json" }}
  {{ $jsonRes := .Resources.GetMatch $jsonFile }}
  {{ $pageDir := path.Dir .File.Path }}
  {{ $rel := cond (in $name "/") $name (path.Join $pageDir $name) }}
  {{ $url := urls.JoinPath $cdn $rel }}
  <img class="artist-photo" src="{{ $url }}" alt="{{ .Title }}">
  {{ if $jsonRes }}
    {{ $json := $jsonRes.Content }}
    {{ $meta := transform.Unmarshal $json }}
    ...
To then reference any of the metadata, it’s as simple as {{ $meta.FocalLength }}. This works because the page bundles now include the JSON files as page resources.
content
└───photos
    ├───page-bundle-1
    │   ├───3S4A1359.json
    │   └───index.md
    │
    ├───page-bundle-2
    │   ├───3S4A20422.json
    │   └───index.md
    │
    └───page-bundle-3
        ├───3S4A3477.json
        └───index.md
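With the sidecar unmarshalled into $meta, the elided portion of the template can render whichever tags were captured; a minimal sketch, where the definition-list markup and class name are my own choices:

```
<dl class="exif">
  {{ range $tag, $value := $meta }}
    <dt>{{ $tag }}</dt>
    <dd>{{ $value }}</dd>
  {{ end }}
</dl>
```

Ranging over the unmarshalled map yields each tag name and its value, so the markup needs no changes when the exiftool tag list grows.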
Setting up R2 #
Before you can start fetching images from R2, you will have to use a Public Development URL or set up a Custom Domain; see Cloudflare’s documentation on public buckets if you need help. These instructions also assume you already have rclone installed.
API Tokens #
Once you have a public bucket, you’ll want a way to sync your images to it so you actually have items to link to. You can manage API tokens from the R2 Object Storage section of the Cloudflare dashboard.
Create a user or account API token with Object Read & Write privileges. You may also want to limit the token to the bucket you are working with. Note the values after creating the token, as you will not be able to see these again. The ones you will need for Rclone are as follows:
- Access Key ID
- Secret Access Key
- Endpoint
Configuring Rclone #
With the values from the previous section, run:
rclone config
If this is your first time using rclone, you should see the following:
No remotes found, make a new one?
n) New remote
s) Set configuration password
q) Quit config
- Create a new remote with a name you will remember
- Choose the storage option Amazon S3 Compliant Storage Providers...
- Option Provider should be Cloudflare R2 Storage
- Option env_auth is false to enter the credentials in the next step
- Enter your Access Key ID from above
- Enter your Secret Access Key
- Enter your Endpoint
From here you should be able to finish and save the config.
Note
When using rclone to sync to your bucket, you will need the name of the remote you saved and the name of the bucket you are syncing to. You can always run rclone config again to view which remotes you have configured.
Syncing with Rclone #
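Before scripting the sync, it’s worth sanity-checking the remote; the commands below assume the remote was saved as r2 and the bucket is named photos:

```
rclone lsd r2:        # list the buckets the token can see
rclone ls r2:photos   # list the objects currently in the bucket
```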
$r2Remote  = "r2"
$bucket    = "photos"
$remoteDir = ""
$localRoot = Join-Path $PSScriptRoot "content"
$filters   = Join-Path $PSScriptRoot "images.filters"
$doResync  = $true  # set to $false after the first successful run

Write-Host $PSScriptRoot
Write-Host $localRoot

if ([string]::IsNullOrEmpty($remoteDir)) {
    $remote = "$r2Remote`:$bucket"
} else {
    $remote = "$r2Remote`:$bucket/$remoteDir"
}

$extra = @()
if ($doResync) { $extra += "--resync" }

rclone bisync "$localRoot" "$remote" `
    --filter-from "$filters" `
    --create-empty-src-dirs `
    --fast-list `
    --use-server-modtime `
    --compare size,modtime `
    --max-delete 1000 `
    @extra
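If you only ever push from one machine, a one-way rclone sync is a simpler alternative to bisync, since it keeps no state and never needs --resync. A sketch reusing the same variables, with --dry-run included so you can preview any deletions before committing to them:

```
rclone sync "$localRoot" "$remote" `
    --filter-from "$filters" `
    --dry-run
```

Drop --dry-run once the preview looks right; sync makes the bucket match the local tree, deleting remote-only files.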