Neural networks and the Unix philosophy

Sunday 25 September 2022, 16:13

There are a number of directions from which we can look at current developments in deep neural networks and the issues I raised in my streamed comments on pirate AI. Here's a summary of the implications I see from the perspective of the Unix philosophy.

Somebody but not anybody

Tuesday 20 September 2022, 15:29

We have a patient suffering kidney failure; he's in a lot of pain and the disease will soon kill him. He could be saved - if someone would donate a kidney to be transplanted into this patient's body. So, as a matter of ethics, somebody ought to do that, right?

But who, exactly?

On language modelling and pirate AI (transcript)

Sunday 11 September 2022, 00:00

I've been thinking a lot recently about the current developments in deep neural network generative models, and how they implicate free software issues. I went into a lot of this stuff in my July 25 Twitch stream, and although I'd also like to write up some of my thoughts in a more organized way, the transcript of the relevant video content is a pretty decent introduction in itself, so I thought I'd post it here.

This is an automatically generated transcript (by another of these models!), with a fair bit of manual editing - so it's spoken English written down, not the usual standard form of written English. The video itself will go online eventually, but it may be a long time before that happens, because my video archive server is at capacity now and I probably won't expand it until the Twitch stream gets enough viewers that I can sell some subscriptions to pay the bills.

The end of Lotto 6/49 as we know it

Sunday 4 September 2022, 00:00

The rules of Lotto 6/49 are about to change in a way that I think is pretty significant. It's drastic enough that I think we might well say Lotto 6/49 is coming to an end, being replaced by a new game that is not even properly a lotto game anymore and only happens to reuse the name.

Embodying the machine

Sunday 28 August 2022, 00:00

There's an idea in neuroscience that the brain maintains what is basically a 3D model of the body - the "body schema" - and that when we use tools, they become part of that schema. Claimed evidence for this concept includes studies where people were asked to perform a task with a tool, like picking up an object with a gripper like a pair of tongs, and then measurements afterward suggested their body schema had changed. For instance, the person's response time and the way they moved when picking up an object without a tool might change in a way that suggested their internal estimate of the length of their arms had increased; or their perception of the distance between two touches on their forearm might increase, as if they were at some deep level measuring against an estimated forearm length that had increased.

Bad enough to qualify

Sunday 21 August 2022, 00:00

Today is the Queen's Plate, a horse race here in Toronto that is considered a big enough deal to be reported in the general news media, and it had me thinking about a peculiarity of horse racing: the qualifications to enter a race are usually backward compared to human sports. Whereas human athletes need to be good enough to qualify for an event, horses need to be not too good.

The recipe for watering down a goal

Sunday 14 August 2022, 00:00

The overlapping circles of "life coaching," "mindfulness," and certain fluffy religions have a highly developed technology for achieving personal goals, and it works well - provided you're willing to follow the recipe.

What nobody else will do

Sunday 7 August 2022, 00:00

Somebody asked whether there's a purpose to my life (or that of whoever cared to answer) - on Twitter, where my profile description currently consists of the three words "Anchorite, apostate, asteroid."

I don't think there is. I used to think there was, but it's been some years since that fell apart for me. However, it's interesting what's left.

Shaping Hint

Sunday 31 July 2022, 17:00

This is the output from one of my text-generation AI experiments. I started with the GPT-J 6B model, and fine-tuned it on about 200,000 words of my own fiction writing (about half of that being the text of Shining Path). It took about 36 hours to fine-tune the model on my 12-core desktop PC, notably not using GPU computation, and then maybe 12 hours or so to generate this text under human supervision.
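The timing figures above are easy to sanity-check with some back-of-the-envelope arithmetic. A minimal sketch in Python, assuming the 200,000-word corpus went through in roughly one pass over the stated 36 hours (the actual number of epochs, and the tokens-per-word ratio, are my assumptions, not stated above):

```python
# Rough throughput implied by the figures quoted above.
# Assumptions (mine, not the post's): ~one pass over the corpus,
# and ~1.3 tokens per English word.
words = 200_000          # size of the fine-tuning corpus, in words
finetune_hours = 36      # CPU-only fine-tuning time on a 12-core desktop

words_per_hour = words / finetune_hours
tokens_per_hour = words_per_hour * 1.3

print(f"~{words_per_hour:,.0f} words/hour of fine-tuning throughput")
print(f"~{tokens_per_hour:,.0f} tokens/hour at ~1.3 tokens/word")
```

That works out to only a few thousand tokens per hour pushed through a 6B-parameter model, which is consistent with CPU-only training being feasible but very slow.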

The fourth help

Sunday 31 July 2022, 00:00

I've written before about three very different actions that all end up being called "helping" someone. It recently occurred to me that there's a fourth important one as well.