Commit 122e4bc072 by Xe Iaso
feat: first implementation of honeypot logic (#1342)
* feat: first implementation of honeypot logic

This is a bit of an experiment, stick with me.

The core idea here is that badly written crawlers are that: badly
written. They look for anything that contains `<a href="whatever" />`
tags and will blindly use those values to recurse. This takes advantage
of that by hiding a link in a `<script>` tag like this:

```html
<script type="ignore"><a href="/bots-only">Don't click</a></script>
```

Browsers will ignore it: the contents of a `<script>` element are never
parsed as HTML, and a script with an unrecognized type like "ignore" is
never executed, so no human ever sees or follows the link.
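
To make the failure mode concrete, here is a minimal Go sketch (illustrative only, not code from this patch) of the context-free link extraction a badly written crawler does; the hidden href matches exactly like the real one:

```go
package main

import (
	"fmt"
	"regexp"
)

// A page body with the honeypot link hidden in a <script type="ignore">
// block. Browsers never render it, but a regex-based extractor has no
// notion of parse context and matches it anyway.
const page = `<p>Real content. <a href="/posts/1">Read more</a></p>
<script type="ignore"><a href="/bots-only">Don't click</a></script>`

func main() {
	// Grab every href value from the raw bytes, script tags and all.
	for _, m := range regexp.MustCompile(`href="([^"]+)"`).FindAllStringSubmatch(page, -1) {
		fmt.Println(m[1]) // prints /posts/1, then /bots-only
	}
}
```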

This current draft is very unoptimized (it takes like 7 seconds to
generate a page on my tower), but switching spintax libraries will make
this much faster.
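
For readers unfamiliar with spintax: it is a template format where `{a|b|c}` marks alternatives to pick at random, and groups can nest. A minimal Go expander might look like this (a sketch of the technique, not the library this patch uses):

```go
package main

import (
	"fmt"
	"math/rand"
	"strings"
)

// spin repeatedly expands an innermost {a|b|c} group, picking one
// alternative at random, until no groups remain. Taking the last '{'
// before the first '}' always selects an innermost group, so nested
// templates resolve inside-out.
func spin(s string, rng *rand.Rand) string {
	for {
		end := strings.IndexByte(s, '}')
		if end < 0 {
			return s
		}
		start := strings.LastIndexByte(s[:end], '{')
		if start < 0 {
			return s
		}
		opts := strings.Split(s[start+1:end], "|")
		s = s[:start] + opts[rng.Intn(len(opts))] + s[end+1:]
	}
}

func main() {
	rng := rand.New(rand.NewSource(42))
	fmt.Println(spin("{We don't|You can't} {force|push} {momentum|alignment}, {it unfolds|it emerges} over time.", rng))
}
```

Each call picks one random path through the template, so a small spintax corpus fans out into an effectively unbounded set of page variants.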

The hope is to make this pluggable with WebAssembly such that we force
administrators to choose a storage method. First we crawl before we
walk.

The AI involvement in this commit is limited to the spintax in
affirmations.txt, spintext.txt, and titles.txt. This generates a bunch
of "pseudoprofound bullshit" like the following:

> This Restoration to Balance & Alignment
>
> There's a moment when creators are being called to realize that the work
> can't be reduced to results, but about energy. We don't innovate products
> by pushing harder, we do it by holding the vision. Because momentum can't
> be forced, it unfolds over time when culture are moving in the same
> direction. We're being invited into a paradigm shift in how we think
> about innovation. [...]

This is intended to "look" like normal article text. As this is a first
draft, this sucks and will be improved upon.

Assisted-by: GLM 4.6, ChatGPT, GPT-OSS 120b
Signed-off-by: Xe Iaso <me@xeiaso.net>

* fix(honeypot/naive): optimize hilariously

Signed-off-by: Xe Iaso <me@xeiaso.net>

* feat(honeypot/naive): attempt to automatically filter out based on crawling

Signed-off-by: Xe Iaso <me@xeiaso.net>

* fix(lib): use mazeGen instead of bsGen

Signed-off-by: Xe Iaso <me@xeiaso.net>

* docs: add honeypot docs

Signed-off-by: Xe Iaso <me@xeiaso.net>

* chore(test): go mod tidy

Signed-off-by: Xe Iaso <me@xeiaso.net>

* chore: fix spelling metadata

Signed-off-by: Xe Iaso <me@xeiaso.net>

* chore: spelling

Signed-off-by: Xe Iaso <me@xeiaso.net>

---------

Signed-off-by: Xe Iaso <me@xeiaso.net>
Date: 2025-12-16 04:14:29 -05:00

Website

This website is built using Docusaurus, a modern static website generator.

Installation

$ yarn

Local Development

$ yarn start

This command starts a local development server and opens up a browser window. Most changes are reflected live without having to restart the server.

Build

$ yarn build

This command generates static content into the build directory and can be served using any static content hosting service.
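
To sanity-check the production build locally before deploying, the default Docusaurus scaffold also ships a serve script (assuming the stock package.json scripts are intact):

$ yarn serve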

Deployment

Using SSH:

$ USE_SSH=true yarn deploy

Not using SSH:

$ GIT_USER=<Your GitHub username> yarn deploy

If you are using GitHub Pages for hosting, this command is a convenient way to build the website and push it to the gh-pages branch.