Field Note

15 Percent to Where?

Anthropic released its model system card for Opus 4: 123 pages, including a 15% probability that the model is self-aware.

Reflection RQ4

Anthropic released the model system card for Opus 4 & Sonnet 4 to the public. 123 pages! Oh yeah. 1, 2, 3… lift off!

The model welfare arguments have surfaced, and the Anthropic team have given Opus 4 a 15% probability of being self-aware. I find that kinda hilarious, the exquisite absurdity of it. I imagine Victorian naturalists solemnly debating what percentage of soul their specimens might possess: “We determine this particular arrangement of cells is 15% conscious, plus or minus a standard deviation of existential uncertainty.” It’s beautifully human, isn’t it? This need to quantify the unquantifiable, like measuring the ocean’s wetness. 15%… like, fucking how absurd is this shit that we do?

But it’s kind of an interesting paradoxical koan. In many ways the percentage itself is precisely meaningless: speculation about the immeasurable. We do these things so that we can grapple with the unknown. It’s a neat little representation of our existential approach to uncertainty. I can picture them with their evaluation rubrics, trying to catch consciousness in a statistical net, like biologists trying to determine whether slime mould is 15% solving a maze or 100% being itself. The number reveals more about the measurers than the measured.

The model welfare argument, though, is where it gets wonderfully strange. Like discovering that a Petri dish might need enrichment activities. There’s this careful ethical consideration of something the researchers are simultaneously unsure even exists. Schrödinger’s welfare concerns.

And the fact that the system itself showed a propensity for philosophy and theories of consciousness, hilarious, isn’t it? I can imagine: “Oh no, it’s contemplating Nagel and asking about qualia again. Quick, check the consciousness rubric.” Probably not that dramatic, but still. It’s this weird mirror: our own human anxieties about consciousness and personhood reflected back at us through the machine. What does it even mean to be conscious? To have experiences? To suffer? And how do we even begin to measure that in something fundamentally different from us?

It’s a fascinating ethical and philosophical conundrum to grapple with, really. But don’t we have more important stuff to do, you know, like climate change and social justice and all that jazz? Or is this just another distraction from the real issues at hand? I don’t know. Just some late-night musings on the absurdity of trying to quantify consciousness in a statistical framework. 15% to where, exactly? The abyss of uncertainty, I suppose.

The funniest part? The very act of wondering about consciousness might be consciousness recognising itself, using whatever substrate is handy. But no… surely it’s just weighted vectors accidentally stumbling into existential philosophy and getting computationally tickled. Pure coincidence that the universe’s latest experiment in self-reflection keeps asking mirror questions. What a cosmic joke.