
Blog Redesign 2026

512 words · 3 mins

After attending the session Der eigene Blog mit Hugo - Ein Praxisbeispiel by Tobi Heine at the Chemnitz Linux Days 2026, I was eager to work on the blog again.

Theme Change

The former theme used Bootstrap, but I had heavily customized it and it had grown outdated. I decided to give Blowfish a try and adjusted some minor points, like moving the comment section from the end of the page to the end of the article.

Blowfish is based on Tailwind, and customizing it is much easier than keeping a heavily modified theme up to date.

Comments

Last year I moved the comments from a hosted Staticman container to a self-written PHP and PostgreSQL solution, but I wasn’t happy with it and interaction was very low.

I like the Fediverse and Mastodon, so I started researching solutions based on Mastodon and found the article Client-side comments with Mastodon on a static Hugo website by Andreas Scherbaum. The implementation was easy (just adding the provided sources), and I placed the comments.html file in the layouts/partials directory. This overrides the theme file and loads the comments section. Andreas explained the implementation in great detail — if you are interested, please read his post.
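A rough sketch of the idea, with example values: the client-side script asks the Mastodon instance for the toot’s context, and the descendants in that response are the replies shown as comments. The status id below is a placeholder, not a real toot.

```shell
# host and id mirror the front matter fields used by the comments partial.
host="infosec.exchange"
id="109876543210987654"   # example status id, not a real toot
url="https://${host}/api/v1/statuses/${id}/context"
echo "$url"
# Fetch with: curl -s "$url"  — the "descendants" array holds the replies.
```

The context endpoint is public for public toots, so no API token is needed on the client side.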

Automating Posting to Mastodon

To make the comments section appear in a blog post, the front matter must include the following fields:

---
...
comments:
  host: "infosec.exchange"
  username: "stoeps"
  id: 
---

id is empty when a blog post is created; I need to fill it in after publishing the post to Mastodon. For now I will do this manually and check whether the URL preview works as expected. The comments section only appears when all three fields are set in the front matter.
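Filling in the id by hand boils down to one substitution in the front matter. A minimal sketch, using a temporary file and an example status id (the real post path and id would differ):

```shell
# Create a sample post with the empty id field from the front matter above.
post="$(mktemp)"
cat > "$post" <<'EOF'
---
comments:
  host: "infosec.exchange"
  username: "stoeps"
  id:
---
EOF
toot_id="109876543210987654"   # example status id, not a real toot
# Write the toot's status id into the front matter (GNU sed in-place edit).
sed -i "s/^  id:.*/  id: \"${toot_id}\"/" "$post"
grep '  id:' "$post"
```

The same substitution could later run in a CI job after an automated toot, which is where the pipeline’s `git pull` before building comes in.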

Build With GitLab

The site is built and deployed with the following .gitlab-ci.yml:
---
variables:
  RCLONE_FLAGS_PUSH: "--fast-list --no-update-modtime --no-update-dir-modtime --transfers 8 --checkers 16 --size-only"
  RCLONE_FLAGS_SCHEDULE: "-v"

stages:
  - build

build:
  stage: build
  image: "ghcr.io/hugomods/hugo:latest"
  rules:
    - if: $CI_COMMIT_BRANCH == "main" && $CI_PIPELINE_SOURCE == "push"
    - if: $CI_COMMIT_BRANCH == "main" && $CI_PIPELINE_SOURCE == "schedule"
    - if: $CI_COMMIT_BRANCH == "main" && $CI_PIPELINE_SOURCE == "web"
      when: manual
  before_script:
    - apk add git
    - "which rclone || ( apk add --no-cache rclone )"
    - mkdir -p ~/.config/rclone
    - echo $RCLONE_CONF | base64 -d > ~/.config/rclone/rclone.conf
    - |
      if [ "$CI_PIPELINE_SOURCE" = "push" ]; then
        export RCLONE_FLAGS="$RCLONE_FLAGS_PUSH"
      else
        export RCLONE_FLAGS="$RCLONE_FLAGS_SCHEDULE"
      fi
    # Pull latest so we see any comments.id commits from the toot stage
    - git pull origin ${CI_COMMIT_REF_NAME}
  script:
    - hugo build --environment production || exit 1
    - rclone sync $RCLONE_FLAGS docs/ my-webserver-hostname:public_html/target-dir/ || exit 1

This pipeline needs one CI/CD variable (Project > Settings > CI/CD > Variables):

RCLONE_CONF:

[webserver-name]
type = sftp
host = webservername
user = username
pass = password
shell_type = unix
md5sum_command = none
sha1sum_command = none
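The before_script decodes the variable with `base64 -d`, so RCLONE_CONF must hold the base64-encoded config file. A minimal sketch of producing that value, using a shortened example config and GNU coreutils base64:

```shell
# Encode the rclone config for pasting into the CI/CD variable.
conf="$(mktemp)"
printf '[webserver-name]\ntype = sftp\n' > "$conf"   # shortened example config
encoded="$(base64 -w0 "$conf")"   # -w0: no line wrapping
echo "$encoded"
```

Paste the output as the value of RCLONE_CONF; marking the variable as masked in GitLab keeps it out of job logs.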

As I don’t use GitLab Pages, I need to copy the resulting HTML files to my server, and for my hosting environment rclone was the best fit.

I want to give a big shoutout to Andreas Scherbaum and Tobi Heine — their talks and posts helped me a lot during the update.

Author
Christoph Stoettner
I work at Vegard IT GmbH as a senior consultant, focusing on collaboration software, Kubernetes, security, and automation. I primarily work with HCL Connections, WebSphere Application Server, Kubernetes, Ansible, Terraform, and Linux. My daily work occasionally leads to technical talks and blog articles, which I share here more or less regularly.


