

To be clear, that’s a well-known L. Ron Hubbard quote originally about starting a religion; to my knowledge Altman didn’t really say that.
It’s not always easy to distinguish between existentialism and a bad mood.
Today in relevant skeets:
Skeet: If you can clock who this is meant to be instantly you are on the computer the perfect amount. You’re doing fine don’t even worry about it.
Quoted skeet: ‘Why are high fertility people always so weird?’ A weekend with the pronatalists
Image: Egghead Jr. and Miss Prissy from Looney Tunes Foghorn Leghorn shorts.
karma
Works the same on LessWrong.
sarcophagi would be the opposite of vegetarians
Unrelated slightly amusing fact: sarcophagos is still the word for carnivorous in Greek, the amusing part being that the word for vegetarian, chortophagos, is weirdly close to being a slur, since it literally means grass eater.
I am easily amused.
Mesa-optimization
Why use the perfectly fine ‘inner optimizer’ mentioned in the references when you can just ask google translate to give you the clunkiest, most pedestrian, and also wrong-part-of-speech Greek term to use in place of ‘in’ instead?
Also natural selection is totally like gradient descent brah, even though evolutionary algorithms actually modeled after natural selection used to be their own subcategory of AI before the term just came to mean lying chatbot.
The kokotajlo/scoot thing apparently made it to the New York Times.
So this is what that was about:
In slightly more relevant news, the main post is scoot asking if anyone can put him in contact with someone from a major news publication, so he can pitch an op-ed by a notable ex-OpenAI researcher, ghost-written by him (meaning siskind), on the subject of how they (the ex-researcher) opened a forecast market that predicts ASI by the end of Trump’s term. Be on the lookout for that when it materializes, I guess.
Reminds me of an SMBC comic that had a setup along the same lines, that if male birth order correlates with homosexuality and family size trends being what they are, the past must have been considerably gayer on average.
No idea where they would land on what to mock and what to take seriously from this whole mess.
Don’t know what they’re up to these days, but last time I checked I had them pegged as enlightened centrists whose style of satire is more ‘having strong beliefs about stuff is cringe’ than ever having to say anything of even accidental substance about said things.
The first prompt programming libraries start to develop, along with the first bureaucracies.
I went three layers deep in his references and his references’ references to find out what the hell prompt programming is supposed to be, ended up in a gwern footnote:
gwern wrote:
I like “prompt programming” as a description of writing GPT-3 prompts because ‘prompt’ (like ‘dynamic programming’) has almost purely positive connotations; it indicates that iteration is fast as the meta-learning avoids the need for training so you get feedback in seconds; it reminds us that GPT-3 is a “weird machine” which we have to have “mechanical sympathy” to understand effective use of (eg. how BPEs distort its understanding of text and how it is always trying to roleplay as random Internet people); implies that prompts are programs which need to be developed, tested, version-controlled, and which can be buggy & slow like any other programs, capable of great improvement (and of being hacked); that it’s an art you have to learn how to do and can do well or poorly; and cautions us against thoughtless essentializing of GPT-3 (any output is the joint outcome of the prompt, sampling processes, models, and human interpretation of said outputs).
They look like the evil twins of the Penny Arcade writers.
It is with great regret that I must inform you that all this comes with a three-hour podcast featuring Scoot in the flesh: 2027 Intelligence Explosion: Month-by-Month Model — Scott Alexander & Daniel Kokotajlo
That was a good one. Also, was he the first to break the coreweave situation? Not a bad journalistic get if that’s the case.
Imagine insecure smart people yes-anding each other into believing siskind and yud are profound thinkers.
Wish I’d found a non clunky way to work “cult incubator” in that as well.
It’s pick-me objectivism, only more overtly culty the closer you are to it irl. Imagine scientology if it was organized around AI doomerism and naive utilitarianism while posing as a get-smart-quick scheme.
Its main function (besides getting the early adopters laid) is to provide court philosophers for the technofeudalist billionaire class, while grooming young techies into a wide variety of extremist thought both old and new, mostly by fostering contempt for established epistemological authority in the same way QAnons insist people do their own research, i.e. as a euphemism for only paying attention to ingroup-approved sources.
It seems to have both a sexual harassment and a suicide problem, with a lot of irresponsible scientific racism and drug abuse in the mix.
Intelligence² didn’t seem half bad when Robert Anton Wilson was the one talking about it way back when; in retrospect all the libertarianism was a real time bomb.
SMBC using the ratsphere as comics fodder, part the manyeth:
Retrofuturistic Looking Ghost: SCROOOOOGE! I am the ghost of christmas extreme future! Why! Why did you not find a way to indicate to humans 400 generations from now where toxic waste was storrrrrrrred! Look how Tiny Tim’s cyborg descendant has to make costly RNA repaaaaaaairs!
Byline: The Longtermist version of A Christmas Carol is way better.
I tried, but no, no, I just don’t give a shit.
Not exactly, he thinks that the watermark is part of the copyrighted image and that removing it is such a transformative intervention that the result should be considered a new, non-copyrighted image.
It takes some extra IQ to act this dumb.
Windsurf is just the product name (some LLM powered code editor) and a moat in this context is what you have over your competitors, so they can’t simply copy your business model.
The weird rationalist assumption that being good at predictions is a standalone skill some people are just gifted with (see also the emphasis on superpredictors as a thing in itself, just clamoring to come out of the woodwork but for the lack of sufficient monetary incentive) tends to come off a lot like an important part of the prediction market project was for rationalists to isolate the muad’dib gene.