13 bookmarks for 2023-07-14

463.

dt: duck tape for your unix pipes

dt.plumbing

460.

Deepnight Games | RPG Map

deepnight.net/tools/rpg-map

458.

GitHub - aaronpk/Nautilus: Turn your website into an ActivityPub profile

github.com/aaronpk/Nautilus

This project is meant to run as a standalone service to deliver posts from your own website to ActivityPub followers. You can run your own website at your own domain, and this service can handle the ActivityPub-specific pieces needed to let people follow your own website from Mastodon or other compatible services.

456.

The Anti-Mac User Interface (Don Gentner and Jakob Nielsen)

www.nngroup.com/articles/anti-mac-interface

455.

Blot – A blogging platform with no interface.

blot.im

A site generator using the file system as the interface.

454.

Braille Awareness Day - my life with braille | London Vision

www.londonvision.org/blog/my-life-with-braille

453.

If you're happy with OpenBSD, probably any computer is good enough.

muezza.ca/thoughts/openbsd_imac_g4

452.

Software, source code, and photos | jenyay.net

jenyay.net

The website of the OutWiker developer.

451.

Computers are an inherently oppressive technology

www.devever.net/~hl/ruthlessness

This may seem a strange heading for someone whose career is in computers, yet I feel that this article has been a lifetime in the making. It is the product of intuitive observations and things that have stood out to me, even as a child, who even then could sense the sinister side of the most banal of technologies.

450.

The Gift of It's Your Problem Now

apenwarr.ca/log/20211229
449.

Beware Offers of "Help" with Your Projects

misc-stuff.terraaeon.com/articles/beware-help.html

Someone writing in disguise argues that accepting help with your software projects often leads to their destruction.

448.

“Low-Resource” Text Classification: A Parameter-Free Classification Method with Compressors

aclanthology.org/2023.findings-acl.426

One can use gzip to classify data.

Deep neural networks (DNNs) are often used for text classification due to their high accuracy. However, DNNs can be computationally intensive, requiring millions of parameters and large amounts of labeled data, which can make them expensive to use, to optimize, and to transfer to out-of-distribution (OOD) cases in practice. In this paper, we propose a non-parametric alternative to DNNs that’s easy, lightweight, and universal in text classification: a combination of a simple compressor like gzip with a k-nearest-neighbor classifier. Without any training parameters, our method achieves results that are competitive with non-pretrained deep learning methods on six in-distribution datasets. It even outperforms BERT on all five OOD datasets, including four low-resource languages. Our method also excels in the few-shot setting, where labeled data are too scarce to train DNNs effectively.

Our method is a simple, lightweight, and universal alternative to DNNs. It’s simple because it doesn’t require any preprocessing or training. It’s lightweight in that it classifies without the need for parameters or GPU resources. It’s universal as compressors are data-type agnostic, and non-parametric methods do not bring underlying assumptions.

Without any pre-training or fine-tuning, our method outperforms both BERT and mBERT on all five datasets.
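The core idea can be sketched in a few lines: compute the normalized compression distance (NCD) between texts using gzip, then classify a sample by the majority label among its k nearest training texts. This is a minimal illustration of the technique, not the paper's exact code; the example texts and labels are made up.

```python
import gzip
from collections import Counter

def ncd(x: str, y: str) -> float:
    """Normalized Compression Distance approximated with gzip."""
    cx = len(gzip.compress(x.encode()))
    cy = len(gzip.compress(y.encode()))
    cxy = len(gzip.compress((x + " " + y).encode()))
    return (cxy - min(cx, cy)) / max(cx, cy)

def classify(sample: str, train: list[tuple[str, str]], k: int = 3) -> str:
    """k-NN over NCD: majority label among the k nearest training texts."""
    neighbors = sorted(train, key=lambda pair: ncd(sample, pair[0]))[:k]
    return Counter(label for _, label in neighbors).most_common(1)[0][0]

# Toy training set (hypothetical data for illustration).
train = [
    ("the match went to extra time before the striker scored", "sports"),
    ("the midfielder was transferred for a record fee", "sports"),
    ("the central bank raised interest rates again", "finance"),
    ("quarterly earnings beat analyst expectations", "finance"),
]
print(classify("the match went to extra time", train, k=1))
```

Texts that share vocabulary compress well when concatenated, so their NCD is low; that single observation replaces all of the model's parameters.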

Questioned:

447.

CC0 1.0 text as plain text

creativecommons.org/publicdomain/zero/1.0/legalcode.txt

For some reason it took me a while to find. Saving it here so it's easier next time.