utilities/auto-blog-link-preserver#20: CLI to run against URL



Issue Information

Issue Type: issue
Status: opened
Reported By: btasker
Assigned To: btasker

Milestone: vnext
Created: 13-Aug-24 12:04



Description

It's probably worth kicking together a small CLI to trigger (re)processing for a URL.

If I've published content at abcd, the cronjob will process it when it appears in my RSS feed.

But... if I later come back and update that content, any new links won't get archived because the RSS feed won't change (nor would we really want it to).
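A minimal sketch of the idea: fetch the page on demand, pull out its links, and push each one through the same archiving step the cronjob would use. The helper behaviour here (LinkCollector, the "would archive" step) is an assumption for illustration, not the project's actual API.

# Sketch only: manually (re)process a single URL outside the RSS-driven cronjob.
import sys
import requests
from html.parser import HTMLParser


class LinkCollector(HTMLParser):
    """Collect href values from anchor tags in the fetched page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def reprocess(url: str) -> None:
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    collector = LinkCollector()
    collector.feed(resp.text)
    for link in collector.links:
        # Hypothetical archive step: the real tool would submit each link
        # for preservation here.
        print(f"would archive: {link}")


if __name__ == "__main__":
    reprocess(sys.argv[1])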




Activity


assigned to @btasker


mentioned in commit 6b6dcc47f3d244089defbea76e38f9fd9752fa0f

Commit: 6b6dcc47f3d244089defbea76e38f9fd9752fa0f
Author: B Tasker
Date: 2024-08-22T08:17:42.000+01:00

Message

feat: add CLI to manually preserve a URL and sublinks (utilities/auto-blog-link-preserver#20)

There are some niceties which need to be added before release though

+58 -0 (58 lines changed)

This is a basic implementation and needs at least the following added:

  • validation/exception handling (see the sketch after this comment)
  • Support for the periodic duplication functionality (might already be there, I didn't export the env var)

I think it should also be moved under a subcommand. That way, rather than having lots of scripts dotted about, we'll have one which supports different things.
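For the validation/exception handling point, something along these lines would probably do as a starting point. This is a sketch under assumptions: validate_url and submit are illustrative names, and preserve_page is a hypothetical stand-in for the existing preservation entrypoint.

import sys
from urllib.parse import urlparse


def validate_url(url: str) -> str:
    # Reject anything that isn't an http(s) URL with a host.
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https") or not parsed.netloc:
        raise ValueError(f"not a valid http(s) URL: {url}")
    return url


def preserve_page(url: str) -> None:
    # Hypothetical stand-in for the existing preservation logic.
    print(f"would preserve {url} and its sublinks")


def submit(url: str) -> int:
    try:
        preserve_page(validate_url(url))
    except ValueError as exc:
        print(f"invalid input: {exc}", file=sys.stderr)
        return 1
    except Exception as exc:  # network / parsing failures etc.
        print(f"failed to preserve {url}: {exc}", file=sys.stderr)
        return 2
    return 0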


mentioned in commit 1c2fd76e2ad52de9f12d13ed221fe2b94294f301

Commit: 1c2fd76e2ad52de9f12d13ed221fe2b94294f301
Author: B Tasker
Date: 2024-08-23T08:15:20.000+01:00

Message

feat: move addition to a subcommand (utilities/auto-blog-link-preserver#20)

Invocation is

./preserve.py submit https://example.com

+66 -23 (89 lines changed)
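For reference, a subcommand layout that would produce the invocation above might look roughly like this. It is only a sketch: the argparse wiring is an assumption, and submit_url is a hypothetical stand-in for the preservation logic added in the earlier commit.

#!/usr/bin/env python3
import argparse
import sys


def submit_url(url: str) -> int:
    # Hypothetical stand-in for the existing "preserve this URL and its
    # sublinks" logic.
    print(f"would preserve {url}")
    return 0


def main() -> int:
    parser = argparse.ArgumentParser(prog="preserve.py")
    subparsers = parser.add_subparsers(dest="command", required=True)

    # `submit` takes a single URL, matching the invocation shown above.
    submit_parser = subparsers.add_parser(
        "submit", help="Preserve a URL and its sublinks"
    )
    submit_parser.add_argument("url")

    args = parser.parse_args()
    if args.command == "submit":
        return submit_url(args.url)
    return 1


if __name__ == "__main__":
    sys.exit(main())

Keeping everything behind one entrypoint with subcommands means future additions (for example a bulk or re-check mode) become new subparsers rather than new scripts.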