I believe that I now understood

I believe that I now understood in some small measure why the Buddhist goes on pilgrimage to a mountain. The journey is itself part of the technique by which the god is sought. It is a journey into Being; for as I penetrate more deeply into the mountain’s life, I penetrate also into my own. For an hour I am beyond desire. It is not ecstasy, that leap out of the self that makes man like a god. I am not out of myself, but in myself. I am. To know Being, this is the final grace accorded from the mountain.

— Nan Shepherd, The Living Mountain, 108, last paragraph

We see clouds

We see clouds so often, and in such abundance, that it’s easy to forget what marvels they are. A cloud is ethereal, yet astonishingly heavy: a levitating lake, typically weighing more than several blue whales.

— Ferris Jabr, Becoming Earth: How Our Planet Came to Life, 168

Sociology 101

Shells of Anomiidae, also called mermaid’s toenails, jingle shells, saddle oysters. Lovely, paper-thin, luminescent treasures, they washed up in great quantities on the beach at the end of Annawamscutt Road. I went and collected hundreds of them whenever the tide was out, poured them into jelly jars, and gave them to friends who lived far from the ocean. I was always curious about why they all looked so different, and then I found out their shapes are determined by the shape of the object they grow on. They are the negative cast of whatever is on the seafloor, a random collection of what happens to be down there — rocks, other shells, sunken timbers, old moorings. Thus the alpha privative, the starting “a-” to indicate negation, or absence, of “nomos,” law. There is no law to this order of creature. It forms itself to the world, wrapped around its surround. Cut it loose from that context, and you have anomie, as Émile Durkheim once taught us, so very long ago.

Reel emergencies

Hail
Houses sliding into rivers
Lightning strikes
Muddy floods
Sandstorms
Tornadoes
Calving glaciers
Avalanches
Cats in trees
Wet birds
Tsunamis
Falling rocks
Volcanoes
Collapsing towers
Sunspots
Drowning dogs
Trapped goats
Pigs in bulldozer buckets
Wildfires
Horses in gullies
Turtles on their backs

AKA jerk: useful terms for 2025

bastard

dog

clown

joker

skunk

creep

idiot

rat

snake

beast

moron

brute

brat

villain

heel

swine

fool

scumbag

boor

schmuck

louse

toad

cad

bugger

scum

nuisance

so-and-so

bounder

lout

stinker

rotter

pill

reptile

slob

hound

buzzard

cur

dirtbag

bleeder

varmint

vermin

slimeball

cretin

sleazebag

blighter

sleazeball

slime

barbarian

son of a gun

churl

sod

nerd

crud

fink

sleaze

crumb

scuzzball

savage

stinkard

chuff

loudmouth

rat fink

wretch

scoundrel

rogue

caveman

jackass

scab

dolt

oaf

lowlife

vulgarian

imbecile

scamp

roughneck

miscreant

Neanderthal

blockhead

turkey

snob

dork

ninny

booby

dope

goon

doofus

airhead

pest

insolent

nut

nincompoop

snot

schmoe

nitwit

schmo

dink

dweeb

snip

nit

half-wit

snoot

birdbrain

Notes on hallucination

  • Analysts estimated that chatbots hallucinate as much as 27% of the time, with factual errors present in 46% of their responses.
  • Indeed, ChatGPT and other sophisticated chatbots regularly put out false information. But that information is packaged in such an eloquent, grammatically correct statement that it’s easy to accept it as truth.
  • Errors in encoding and decoding between text and representations can cause hallucinations. When encoders learn the wrong correlations between different parts of the training data, it could result in an erroneous generation that diverges from the input.
  • AI hallucinations are caused by a variety of factors, including biased or low-quality training data, a lack of context provided by the user or insufficient programming in the model that keeps it from correctly interpreting information.
  • Northwestern’s Riesbeck said generative AI models are “always hallucinating.” Just by their very nature, they are always “making up stuff.” So removing the possibility of AI hallucinations ever generating false information could be difficult, if not impossible.
  • Temperature is a parameter that controls the randomness of an AI model’s output. It essentially determines the degree of creativity or conservatism in its generated content, where a higher temperature increases randomness and a lower temperature makes the output more deterministic. In short: the higher the temperature, the more likely a model is to hallucinate. Companies can provide users with the ability to adjust the temperature settings to their liking, and set a default temperature that strikes a proper balance between creativity and accuracy.

Whereas no temperature settings are required for humans, who can actually be creative and accurate at the same time.
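The temperature mechanism described in the last note can be sketched in a few lines. This is a minimal, generic illustration of temperature-scaled softmax sampling — the function name, logit values, and structure are my own, not any particular model's or vendor's API:

```python
import math
import random

def sample_with_temperature(logits, temperature=1.0, rng=random):
    """Sample an index from a list of logits after temperature scaling.

    Higher temperature flattens the distribution (more random, more
    "creative" picks); lower temperature sharpens it toward the
    highest-scoring option (more deterministic).
    """
    # Scale logits by 1/temperature before the softmax.
    scaled = [l / temperature for l in logits]
    # Softmax, subtracting the max for numerical stability.
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw one index according to the resulting distribution.
    return rng.choices(range(len(probs)), weights=probs, k=1)[0]

# Illustrative logits: option 0 scores highest.
logits = [2.0, 1.0, 0.1]
# Near-zero temperature: effectively greedy, picks index 0.
print(sample_with_temperature(logits, temperature=0.01))
```

At temperature 0.01 the gap between the top two scaled logits is so large that index 0 is chosen with overwhelming probability; at temperature 10 the three options become nearly equally likely, which is the "more likely to hallucinate" regime the note describes.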

Sources:
https://en.wikipedia.org/wiki/Hallucination_(artificial_intelligence)
https://builtin.com/artificial-intelligence/ai-hallucination
https://www.lighton.ai/blog/llm-glossary-6/turning-up-the-heat-the-role-of-temperature-in-generative-ai-49
https://builtin.com/artificial-intelligence/eliza-effect

So it was

So it was that, after the Deluge, the Fallout, the plagues, the madness, the confusion of tongues, the rage, there began the bloodletting of the Simplification, when remnants of mankind had torn other remnants limb from limb, killing rulers, scientists, leaders, technicians, teachers, and whatever persons the leaders of the maddened mobs said deserved death for having helped to make the Earth what it had become. Nothing had been so hateful in the sight of these mobs as the man of learning, at first because they had served the princes, but then later because they refused to join in the bloodletting and tried to oppose the mobs, calling the crowds ‘bloodthirsty simpletons.’

— Walter M. Miller Jr., A Canticle for Leibowitz, 1959