smol projects

automated link-sharing (curated content)

25.02.2024 · 5 min read

I came across Sophie’s post on automated weekly links posts with raindrop.io and Eleventy and thought it was a cool idea.

TL;DR: bookmark things during the week, gather everything bookmarked that week, and share it as a post on one’s personal site. I got to thinking: “how can I implement something similar?”

I use Linkding for bookmark management, and, fortunately, it comes with an API. To start making requests, you need an authorization token (found under Settings > Integrations). Every request must include an Authorization header with Token <YOUR_TOKEN> as its value. Hit /api/bookmarks, and… success! But what if we wanted a little more control over the results we’re getting?

Well, Linkding categorizes bookmarks by hashtags. Say we decide to tag these links #good-links (that’s what Sophie calls them, and I am all out of creativity). To get all “good links”, our request URL would look like the following: https://<your-linkding-domain>.example/api/bookmarks?q=%23good-links (where “%23” is the URL-encoded hash symbol). To narrow results further, we can use the limit and offset parameters that Linkding’s API also provides.

That’s fine, but at this point what we really need is a way to filter bookmarks by date (say, bookmarks from the last week). Unfortunately, there doesn’t seem to be a good way to do that through query parameters, so we’ll have to do it manually (with some pesky date parsing). After that, we can use a template: blog posts on this site are rendered from .mdx files, all of which conform to a certain format.
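To make the request shape concrete, here’s a minimal sketch in Python of building that URL and header. The domain and token are placeholders; the endpoint, header format, and q/limit/offset parameters are the ones described above.

```python
from urllib.parse import urlencode

# Placeholder values; substitute your own instance and token.
BASE_URL = "https://linkding.example/api/bookmarks"
TOKEN = "YOUR_TOKEN"

# urlencode handles the escaping: "#good-links" becomes "%23good-links".
params = urlencode({"q": "#good-links", "limit": 50, "offset": 0})
url = f"{BASE_URL}?{params}"
headers = {"Authorization": f"Token {TOKEN}"}

print(url)  # https://linkding.example/api/bookmarks?q=%23good-links&limit=50&offset=0
```

Any HTTP client can then fire this off; the only non-obvious part is remembering that the hash symbol must be percent-encoded, or the server will treat it as a fragment and drop the tag entirely.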

I love Nim (though I can’t quite say why), so that’s what we’re using here.

main.nim contains the “main” logic:

import types
import writePost
import strings # constants like LINKDING_TOKEN and LINKDING_URL are (assumed to be) defined here
import std/[httpclient, json, times, sequtils, os, strformat, options]


if not existsEnv(LINKDING_TOKEN):
  quit(fmt"Error: {LINKDING_TOKEN} needs to be set", EXIT_STATUS_ZERO)

if not existsEnv(LINKDING_URL):
  quit(fmt"Error: {LINKDING_URL} needs to be set", EXIT_STATUS_ZERO)

proc empty[T](s: seq[T]): bool =
  s.len == 0

proc main() =
  try:
    let
      url = getEnv(LINKDING_URL)
      token = getEnv(LINKDING_TOKEN)
      client =
        newHttpClient(headers = newHttpHeaders({"Authorization": fmt"Token {token}"}))
      res = client.get(url)
      json = res.body.parseJson
      optionalRes = to(json, Results)
      today = now()
      weekAgo = today - days(DAYS_IN_WEEK)
    if not optionalRes.results.isSome:
      let errDetail = to(json, Err)
      raise newException(UnpackDefect, fmt"Error: {errDetail.detail}")
    let filtered =
      optionalRes.results.get.filter do (link: Link) -> bool:
        try:
          # Linkding sometimes stores dates with fractional seconds...
          let parser = initTimeFormat("yyyy-MM-dd'T'HH:mm:ss'.'ffffff'Z'")
          let dd = parse(link.date_added, parser)
          return dd <= today and dd >= weekAgo
        except TimeParseError:
          # ...and sometimes without, so fall back to the second format
          let parser = initTimeFormat("yyyy-MM-dd'T'HH:mm:ss'Z'")
          let dd = parse(link.date_added, parser)
          return dd <= today and dd >= weekAgo
    if not filtered.empty:
      writePost(filtered)
  except UnpackDefect as e:
    echo e.msg

main()


Some notes:

Types-wise, I decided to make the results field of Results an Option type. So, if anything fails during fetching (API down, invalid token, etc.), we exit with an error. If we get Some response (isSome), we continue.

For some reason, Linkding stores dates in two different formats 😑 so we have to try parsing them two different ways. I don’t know of a better approach than catching the exception and retrying, but this will have to do for now.
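For illustration, the same try-then-fall-back approach sketched in Python (the two format strings mirror the ones used in the Nim code above):

```python
from datetime import datetime

def parse_date_added(raw: str) -> datetime:
    """Try the fractional-second format first, then the plain one."""
    for fmt in ("%Y-%m-%dT%H:%M:%S.%fZ", "%Y-%m-%dT%H:%M:%SZ"):
        try:
            return datetime.strptime(raw, fmt)
        except ValueError:
            continue  # wrong format; try the next one
    raise ValueError(f"unrecognized date format: {raw}")

print(parse_date_added("2024-02-25T10:30:00.123456Z"))  # 2024-02-25 10:30:00.123456
print(parse_date_added("2024-02-25T10:30:00Z"))         # 2024-02-25 10:30:00
```

Looping over a tuple of candidate formats keeps the fallback list in one place, which is slightly tidier than nesting try/except blocks.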

I don’t actually bookmark things that often, so a conditional skips writing if there weren’t any bookmarks that week. Otherwise, we write the post to a file in writePost.nim:

import types
import strings
import std/[os, strformat, strutils, times, sequtils, tables]


if not existsEnv(SITE_BLOG_DIR):
  quit(fmt"Error: {SITE_BLOG_DIR} needs to be set", 0)

if not existsEnv(POST_TEMPLATE_PATH):
  quit(fmt"Error: {POST_TEMPLATE_PATH} needs to be set", 0)

proc writePost*(links: seq[Link]) =
  try:
    let filePath = getEnv(SITE_BLOG_DIR) & toParseableFileName()
    let tbl = {
      "{{title}}": toReadableTitle(),
      "{{createdDate}}": now().format(YYYY_MM_DD),
      "{{links}}": links.mapIt(it.toLinkItem).join("\n\n")
    }.toTable
    var contents = readFile(getEnv(POST_TEMPLATE_PATH))
    # Swap every placeholder for its rendered value
    for toReplace, replacement in tbl:
      contents = contents.replace(toReplace, replacement)
    writeFile(filePath, contents)
  except Exception as e:
    echo fmt"Error: {e.msg}"

Here, we use a table to hold key-value pairs, where each key is the placeholder to find, and each value is its replacement.
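The same idea sketched in Python, with hypothetical placeholder values standing in for the real helpers:

```python
# A toy template using the same {{placeholder}} convention.
template = 'title: "{{title}}"\ncreatedDate: {{createdDate}}\n\n{{links}}\n'

# Each key is the target string; each value is its replacement.
replacements = {
    "{{title}}": "good links (week 8)",
    "{{createdDate}}": "2024-02-25",
    "{{links}}": "- [example](https://example.com)",
}

contents = template
for target, replacement in replacements.items():
    contents = contents.replace(target, replacement)

print(contents)
```

A plain replace loop is all this needs; a real templating engine would only earn its keep if the placeholders ever grew logic (conditionals, loops) beyond simple substitution.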

The template itself (template.mdx) looks like this:

---
title: "{{title}}"
draft: false
layout: ../../layout/Layout.astro
tags: [good-links]
createdDate: {{createdDate}}
---

import RambleCard from "../../components/RambleCard.astro";
import TagsMap from "../../components/TagsMap.astro";
import MinutesRead from "../../components/MinutesRead.astro";

<RambleCard frontmatter={{...frontmatter}}>
<TagsMap frontmatter={{...frontmatter}}/>
<MinutesRead minutesRead={frontmatter.minutesRead}/>

{{links}}

</RambleCard>
I have two other files, one that handles string utility functions, and another that holds all the types. They’re uninteresting, so I won’t detail them here.

A few deployment notes:
  • This site runs in a Docker container, hosted locally behind a reverse proxy (self-hosting has proven to be a bad idea before, but it isn’t something I care to fix ATM - if it’s down, it’s down).
  • I have an existing cronjob that updates my /now page daily, at midnight. Whenever that cronjob runs, it recreates the Docker image.
  • I created a new cronjob that runs weekly (Saturday AM). It calls a bash wrapper, which sources the necessary variables, runs main.nim, writes some logs, and sends a notification via ntfy. It doesn’t recreate the image, so the new blog post should show up at midnight that same day (when the aforementioned daily job runs).

That’s about it. Really, it’s quite a lot of work for minimal practicality, but that tends to be the case for most of my “projects”.

Built with Astro and Tailwind 🚀