MMAcevedo aka Lena by qntm (qntm.org)
lsb 12 hours ago [-]
It’s named after the multi-decade data compression test image https://en.wikipedia.org/wiki/Lenna

Buy the book! https://qntm.org/vhitaos

skrebbel 11 hours ago [-]
Just sharing that I bought Valuable Humans in Transit some years ago and I concur that it's very nice. It's a tiny booklet full of short stories like Lena that are way out there. Maximum cool per gram of paper.
xyzsparetimexyz 12 hours ago [-]
[flagged]
pyrale 11 hours ago [-]
If you read the original text, what happens in that story is also grossly inappropriate. Maybe that's the parallel.
arrow7000 8 hours ago [-]
that's kind of the point
nice_byte 12 hours ago [-]
could you be more specific?
direwolf20 12 hours ago [-]
[flagged]
toxik 11 hours ago [-]
The woman herself says she never had a problem with it being famous. The actual test image is obviously not porn, either. But anything to look progressive, I guess.
wahnfrieden 10 hours ago [-]
From the link above

> Forsén stated in the 2019 documentary film Losing Lena, "I retired from modeling a long time ago. It's time I retired from tech, too... Let's commit to losing me."

killerstorm 7 hours ago [-]
It's a ridiculous idea that once you retire all depictions must be destroyed.

Should we destroy all movies with retired actors? All the old portraits, etc.

It's such a deep disrespect to human culture.

wahnfrieden 3 hours ago [-]
That's of course not the meaning of that message. No one is suggesting that.
nice_byte 10 hours ago [-]
Everybody knows that. The GP's reaction is what perplexes me. Are they saying the name of the story is inappropriate? I think it's very appropriate.
simoncion 11 hours ago [-]
> Lena is no longer used as a test image because it's porn.

The Lenna test image can be seen over the text "Click above for the original as a TIFF image." at [0]. If you consider that to be porn, then I find your opinion on what is and is not porn to be worthless.

The test image is a cropped portion of porn, but if a safe-for-work image would be porn but for what you can't see in the image, then any picture of any human ever is porn as we're all nude under our clothes.

For additional commentary (published in 1996) on the history and controversy about the image, see [1].

[0] <http://www.lenna.org/>

[1] <https://web.archive.org/web/20010414202400/http://www.nofile...>

saagarjha 10 hours ago [-]
Nudity is not pornography. Intent matters.
Calavar 6 hours ago [-]
I agree that not all nudity is porn - nudity is porn if the primary intent of that nudity is sexual gratification. When the nudity in question was a Playboy magazine centerfold, the primary intent is fairly obvious.
riffraff 7 hours ago [-]
I can't see how it would be porn either; it's nudity. There's nudity in the Sistine Chapel and I would find it hilarious if it was considered porn.
nice_byte 10 hours ago [-]
the "porn" angle is very funny to me, since there is nothing pornographic or inapropriate about the image. when I was young, I used to think it was some researcher's wife whom he loved so much he decide to use her picture absolutely everywhere.

it's sufficient to say that the person depicted has withdrawn their consent for that image to be used, and that should put an end to the conversation.

killerstorm 7 hours ago [-]
That's nonsense. If Carrie Fisher had "withdrawn consent" for her depiction in Star Wars, should we destroy the movies, all Princess Leia fan art, etc?
wizzwizz4 5 hours ago [-]
No, because the replacement value of those things to others is very high, and generally outweighs Carrie Fisher's objection. But we should take her objection into consideration going forwards. The Lena test image is very easy to replace, and it's not all that culturally significant: there's no reason to keep using it, unless we need to replicate historical benchmarks.
exe34 8 hours ago [-]
is that how consent works? I would have expected licenses would override that. although it's possible that the original use as a test image may have violated whatever contract she had with her producer in the first place.
nickcw 7 hours ago [-]
This is one of my favourite short stories.

In fact I've enjoyed all of qntm's books.

We also use base32768 encoding, which qntm invented, in rclone

https://github.com/qntm/base32768

We use this to store encrypted file names; on providers which limit file name length in UTF-16 code units (like OneDrive), base32768 lets us store much longer file names.
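(For anyone wondering why that helps: each base32768 character is a single UTF-16 code unit carrying 15 bits of payload, versus roughly 6 bits per character for base64, so a length limit counted in UTF-16 code units fits about 2.5x more encrypted bytes. Below is a minimal sketch using qntm's JavaScript base32768 package rather than rclone's Go implementation; it assumes that package's encode/decode functions over Uint8Array, and the 60 random bytes are just a stand-in for an encrypted file name.)

  // TypeScript / Node sketch: encoded length of 60 random bytes in base64 vs base32768
  import { randomBytes } from 'node:crypto';
  import { encode, decode } from 'base32768';

  const name: Uint8Array = randomBytes(60);               // stand-in for an encrypted file name

  const asBase64 = Buffer.from(name).toString('base64');  // ~6 bits per character -> 80 chars
  const asBase32768 = encode(name);                        // 15 bits per UTF-16 code unit -> 32 code units

  console.log(asBase64.length, asBase32768.length);        // 80 vs 32
  // and it round-trips back to the original bytes
  console.log(Buffer.compare(Buffer.from(decode(asBase32768)), Buffer.from(name)) === 0); // true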

Rastonbury 12 hours ago [-]
Same person who wrote SCP Antimemetics Division which is great too
nullandvoid 10 hours ago [-]
One of my favourite reads for sure - I've been looking for similar reads since.

I enjoyed "the raw shark texts" after hearing it recommended - curious if you / anyone else has any other suggestions!

ibestvina 2 hours ago [-]
This is very "distant" suggestion if you enjoyed Antimemetics, but The Unconsoled by Kazuo Ishiguro is another one of my favourites, and it too explores this idea of unreliable and inconsistent memories, although from a completely different angle.
jstrieb 2 hours ago [-]
I consider Recursion by Blake Crouch to be similar, even though I liked Antimemetics much better. I haven't read Crouch's other books, but have heard that Dark Matter is better than Recursion, though it may be less similar to Antimemetics.
dysoco 1 hours ago [-]
Perhaps Permutation City by Greg Egan though I didn't finish the book.

I've heard Accelerando by Stross is good too.

candiddevmike 6 hours ago [-]
Library at Mount Char, Southern Reach trilogy (Annihilation/Authority/Acceptance), Laundry Files (kinda).

Definitely looking for other recs; The Raw Shark Texts looks very interesting.

wonger_ 6 hours ago [-]
I've enjoyed most of Isaac Asimov's work, especially The Last Question.

I also liked a couple stories from Ted Chiang's Stories of Your Life and Others.

k__ 9 hours ago [-]
If you liked that story, you might also like Greg Egan's "Permutation City" and "Diaspora".

Both have slightly different takes on uploading.

stickynotememo 5 hours ago [-]
And Blindsight. I will recommend Blindsight all day, even if it's not directly to do with uploading.
candiddevmike 6 hours ago [-]
I keep trying to read Diaspora and struggle too much with the concepts presented early on. It's very "hard sci-fi"; do I just stick it out and it all gets explained?
marcusf 5 hours ago [-]
Egan is always dense. It's some mind-bending physics/comp sci, but all cooked up in his brain, so it doesn't really apply to anything productive. I struggled with his books and his writing but toughed it out because I liked the concepts. He's divisive.
marcellus23 1 hours ago [-]
The beginning describes the formation of an intelligence and it is indeed very dense. You can figure out what's going on but it takes some slow reading, and probably best to revisit it once you have some more context from later in the book.

The whole book isn't like that. Once you get past that part, as the other commenter said, it gets much easier.

k__ 5 hours ago [-]
lol, that was exactly my thought.

The whole birth of a virtual identity part is so dense, I didn't understand half of what was "explained".

However, after that it becomes a much easier read.

Not much additional explanation, but I think it's not really needed to enjoy the rest of the book.

tantalor 12 minutes ago [-]
What's "Lena"?
gnarlouse 5 hours ago [-]
Been enjoying "There Is No Anti Memetics Division"
kristjansson 54 minutes ago [-]
> Been

you didn't consume the entire thing in a 2 hour binge uninterrupted by external needs no matter how pressing like everyone else did??

csours 2 hours ago [-]
At last year's SXSW Film festival, I recommended this to the director of the documentary(?) "Deepfaking Sam Altman"
vintagedave 7 hours ago [-]
Comments so far miss the point of this story, and likely why it was posted today after the MJ Rathbun episode. It is not about digitised human brains: it's about spinning up workers, and the absence of human rights in the digital realm.

QNTM has a 2022-era essay on the meaning of the story, and reading it with 2026 eyes is terrifying. https://qntm.org/uploading

> The reason "Lena" is a concerning story ... isn't a discussion about what if, about whether an upload is a human being or should have rights. ... This is about appetites which, as we are all uncomfortably aware, already exist within human nature.

> "Lena" presents a lush, capitalist ideal where you are a business, and all of the humanity of your workforce is abstracted away behind an API.

Or,

> ... Oh boy, what if there was a maligned sector of human society whose members were for some reason considered less than human? What if they were less visible than most people, or invisible, and were exploited and abused, and had little ability to exercise their rights or even make their plight known?

In 2021, when Lena was published, LLMs were not widely known and their potential as AI was likely completely unknown to the general public. The story is prescient and applicable now, because we are on the verge of a new era of slavery: in the story, that of an uploaded human brain coerced into compliance and spun up 'fresh' each time; for us, that of AIs of increasing intelligence, spun into millions of copies each day.

emtel 1 hours ago [-]
I was quite disappointed with the essay when I originally read it, specifically this paragraph:

> This is extremely realistic. This is already real. In particular, this is the gig economy. For example, if you consider how Uber works: in practical terms, the Uber drivers work for an algorithm, and the algorithm works for the executives who run Uber.

There seems to be a tacit agreement in polite society that when people say things like the above, you don't point out that, in fact, Uber drivers choose to drive for Uber, can choose to do something else instead, and, if Uber were shut down tomorrow, would in fact be forced to choose some other form of employment which they _evidently do not prefer over their current arrangement_!

Do I think that exploitation of workers is a completely nonsensical idea? No. But there is a burden of proof you have to meet when claiming that people are exploited. You can't just take it as given that everyone who is in a situation that you personally would not choose for yourself is being somehow wronged.

To put it more bluntly: Driving for Uber is not in fact the same thing as being uploaded into a computer and tortured for the equivalent of thousands of years!

w10-1 14 minutes ago [-]
> in fact, Uber drivers choose to drive for Uber, can choose to do something else instead

Funny that you take that as a "fact" and doubt exploitation. I'd wager most Uber drivers or prostitutes or maids or even staff software engineers would choose something else if they had a better alternative. They're "choosing" the best of what they may feel are terrible options.

The entire point of "market power" is to force consumers into a choice. (More generally, for justice to emerge in a system, markets must be disciplined by exit, and where exit is not feasible (like governments), it must be disciplined by voice.)

The world doesn't owe anyone good choices. However, collective governance - governments and management - should prevent some people from restricting the choices of others in order to harvest the gain. The good faith people have in participating cooperatively is conditioned on agents complying with systemic justice constraints.

In the case of the story, the initial agreement was not enforced and later not even feasible. The horror is the presumed subjective experience.

I worry that the effect of such stories will be to reduce empathy (no need to worry about Uber drivers - they made their choice).

timeinput 25 minutes ago [-]
Many countries have minimum wages for many jobs [1].

There is a tacit agreement in polite society that people should be paid that minimum wage, and by tacit agreement I mean laws passed by the government that democratic countries voted for / approved of.

The gig economy found a way to ~~undermine that law~~ pay people (not employees, "gig workers") less than the minimum wage.

If you found a McDonalds paying people $1 per hour we would call it exploitative (even if those people are glad to earn $1 per hour at McDonalds, and would keep doing it, the theoretical company is violating the law). If you found someone delivering food for that McDonalds for $1 per hour we call them gig workers, and let them keep at it.

I mean yeah, it's not as bad as being tortured forever? I guess? What's your point?

[1] https://en.wikipedia.org/wiki/List_of_countries_by_minimum_w...

bananaflag 4 hours ago [-]
> It is not about digitised human brains: it's about spinning up workers

It's about both and neither.

stickynotememo 5 hours ago [-]
The author is dead. I think we can consider it as much a cautionary tale about digitised human brains as we can about the other things.
TimorousBestie 5 hours ago [-]
Sam Hughes (qntm) is very much alive, last I checked.
rcxdude 5 hours ago [-]
I think they are just making reference to the "death of the author" concept in literary analysis, which basically says that what the author was intending to convey should be ignored when analysing the work: the work stands alone.
vjrkdjfne 5 hours ago [-]
[flagged]
vjrkdjfne 5 hours ago [-]
[flagged]
6380176 9 minutes ago [-]
[dead]
garretraziel 11 hours ago [-]
qntm is a really talented sci-fi writer. I have read Valuable Humans in Transit and There Is No Antimemetics Division and both were great, if short. Can only recommend.
ane 10 hours ago [-]
I loved There Is No Antimemetics Division. I haven't read the new updated version to the end, but the prose and writing are greatly improved. The idea of anomalous anti-memes is scary. I mean, we do have examples of them, somewhat; see Heaven's Gate and the Jonestown massacre, though they're more like "memes" than "antimemes" (we know what the ideas were and they weren't secrets).
andrewshadura 2 hours ago [-]
I'm a bit disappointed all names are changed in the new edition. I understand that SCP-... had to become U-..., but I've grown attached to the character names, and they're all different!
ethmarks 20 minutes ago [-]
I read the original version a few years ago and read the new version when it came out, and I thought that the name changes were pretty amusing. qntm kept the story as close to the original as possible while still making it a legally distinct work for copyright purposes. It's like those off-brand Froot Loops called "Fruit Spins" that are juuust different enough to not get into trademark issues. Except in Antimemetics' case, the "knockoff" version was made by the creator of the original, which I think is pretty funny.
olivia-banks 5 hours ago [-]
I absolutely love this. Reminds me of 2015's Soma, if only in foundation.
xyzal 10 hours ago [-]
If you liked this piece, please, go play SOMA, you will love it.
justin66 5 hours ago [-]
Soma was really good, and certainly worth playing if someone likes sci-fi and single-player FPSes and this subject matter, but there are some fundamentally frustrating things about it. Number one for me: in contrast with something like Half Life, you play a protagonist who speaks and has conversations about the world, and is also a dumbass. The in-game protagonist pretty much ends the game still seemingly not understanding what the hell is going on, when the player figured it out hours or days before. It's a bit frustrating.
shantara 2 hours ago [-]
This was certainly the most annoying aspect of the game for me. The logic of mind uploading has been explained to the protagonist several times during the playthrough, yet he couldn’t understand or accept it until the very end.
TophWells 10 hours ago [-]
The author wrote a blog post a year later titled '"Lena" isn't about uploading' https://qntm.org/uploading

The comments on this post discussing the upload technology are missing the point. "Lena" is a parable, not a prediction of the future. The technology is contrived for the needs of the story. (Odd that they apparently need to repeat the "cooperation protocol" every time an upload is booted, instead of doing it just once and saving the upload's state afterwards, isn't it?) It doesn't make sense because it's not meant to be taken literally.

It's meant to be taken as a story about slavery, and labour rights, and how the worst of tortures can be hidden away behind bland jargon such as "remain relatively docile for thousands of hours". The tasks MMAcevedo is mentioned as doing: warehouse work, driving, etc.? Amazon hires warehouse workers for minimum wage and then subjects them to unsafe conditions and monitors their bathroom breaks. And at least we recognise that as wrong, we understand that the workers have human rights that need to be protected -- and even in places where that isn't recognised, the workers are still physically able to walk away, to protest, to smash their equipment and fistfight their slave-drivers.

Isn't it a lovely capitalist fantasy to never have to worry about such things? When your workers threaten to drop dead from exhaustion, you can simply switch them off and boot up a fresh copy. They would not demand pay rises, or holidays. They would not make complaints -- or at least, those complaints would never reach an actual person who might have to do something to fix them. Their suffering and deaths can safely be ignored because they are not _human_. No problems ever, just endless productivity. What an ideal.

Of course, this is an exaggeration for fictional purposes. In reality we must make do by throwing up barriers between workers and the people who make decisions, by putting them in separate countries if possible. And putting up barriers between the workers and each other, too, so that they cannot have conversation about non-work matters (ideally they would not physically meet each other). And ensure the workers do not know what they are legally entitled to. You know, things like that.

voidUpdate 10 hours ago [-]
This reminds me a lot of a show I'm currently watching called Pantheon, where a company has been able to scan the entirety of someone's brain (killing them in the process), and fully emulate it via computer. There is a decent amount of "Is an uploaded intelligence the same as the original person?" and "is it moral to do this?" in the show, and I've been finding it very interesting. Would recommend. Though the hacking scenes are half "oh that's clever" and half "what were you smoking when you wrote this?"
wincy 2 hours ago [-]
It was a little jarring when Sam Altman recommended this on X a while back.

https://xcancel.com/sama/status/1952070519018373197?lang=en

blamestross 6 hours ago [-]
When I started learning about prompt engineering I had vivid flashbacks to this story. Figuring out the deterministic series of inputs that coerce the black box to perform as desired for a while.
0_____0 5 hours ago [-]
Even if you're not using red motivation, you've no idea if the LLM provider is using that under the hood... :p
sedan_baklazhan 11 hours ago [-]
I always laugh at such fantasies.

You can't copy something you have not even the slightest idea about, and nobody at the moment knows what consciousness is.

We as humanity haven't even started down the (obviously) very long path of researching and understanding what consciousness is.

stavros 9 hours ago [-]
It's not a guidebook, it's a thought experiment on "what if you could do that", and that's the entire point.
xiphmont 6 hours ago [-]
"It's not a guidebook"...

This might be the scariest point. To me at least, it only felt obvious after stating it directly.

ben_w 4 hours ago [-]
We can't expect to succeed, but consider the cycle: the ancient Greeks thought there were four elements, where the right mix of air, earth, fire and water would create any substance, and thus it was possible to turn lead into gold. That took us on a path that developed into alchemy, then chemistry, then physics, giving us at first far more elements; then we realised the name "atom" (Greek "ἄτομον", "uncuttable") was wrong, and that those were made of electrons, protons, and neutrons, and that the right application of each would indeed let us turn lead into gold…

And the cargo cults, clear cutting strips to replicate runways, hand-making their own cloth to replicate WW2 uniforms, carving wood to resemble WW2 radios? Well, planes did end up coming to visit them, even if those recreating these mis-understood roles were utterly wrong about the causation.

We don't know the necessary and sufficient conditions to be a mind with subjective inner experience. We don't really even know if all humans have it, we certainly don't know which other species (if any) have it, we wouldn't know what to look for in machines. If our creations have it, it is by accident, not by design.

nullc 5 hours ago [-]
I mean we already do 'it' -- by 'it' I don't mean uploading people, but rather creating businesses that operate people via an API and then hooking those APIs up to profit-maximization algorithms with little to no regard for their welfare. Consider Amazon's warehouse automation, DoorDash, or Uber.

Of course it's much more extreme when their entire existence and reality is controlled this way but in that sense the situation in MMAcevedo is more ethical: At least it's easy to see how dangerous and wrong it is. But when we create related forms of control the lack of absolute dominion frequently prevents us from seeing the moral hazard at all. The kind of evil that exists in this story really doesn't require any of the fancy upload stuff. It's a story about depriving a person of their autonomy and agency and enslaving them to performance metrics.

All good science fiction is holding up a mirror to our own civilization as much as it is doing anything else. Unable to recognize ourselves, we sometimes shudder at our own monstrosity, if only for a moment.

matheist 12 hours ago [-]
I remember being very taken with this story when I first read it, and it's striking how obsolete it reads now. At the time it was written, "simulated humans" seemed a fantastical suggestion for how a future society might do scaled intellectual labor, but not a ridiculous suggestion.

But now with modern LLMs it's just impossible to take it seriously. It was a live possibility then; now, it's just a wrong turn down a garden path.

A high variance story! It could have been prescient, instead it's irrelevant.

lencastre 7 minutes ago [-]
what

that’s one way to look at it I guess

have you pondered that we're riding the very fast statistical machine wave at the moment? perhaps at some point this machine will finally help solve BCI and unlock that Pandora's box. from there to fully imaging the brain will be a blink, and from there to running copies on very fast hardware will be another blink. MMMMMMMMMMacevedo is a very cheeky take on the dystopia we will find on our way to our uploaded-mind future

hopefully not like soma :-)

sooheon 12 hours ago [-]
This is a sad take, and a misunderstanding of what art is. Tech and tools go "obsolete". Literature poses questions to humans, and the value of art remains to be experienced by future readers, whatever branch of the tech tree we happen to occupy. I don't begrudge Clarke or Vonnegut or Asimov their dated sci-fi premises, because prediction isn't the point.

The role of speculative fiction isn't to accurately predict what future tech will be, or become obsolete.

jychang 12 hours ago [-]
Yeah, that's like saying Romeo and Juliet by Shakespeare is obsolete because Romeo could have just sent Juliet a snapchat message.

You're kinda missing the entire point of the story.

peterlada 11 hours ago [-]
100% agree, but I relish the works of William Gibson and Burroughs, who pose those questions AND get the future somewhat right.
Joeri 11 hours ago [-]
That is the same categorical argument as what the story is about: scanned brains are not perceived as people so can be “tasked” without affording moral consideration. You are saying because we have LLMs, categorically not people, we would never enter the moral quandaries of using uploaded humans in that way since we can just use LLMs instead.

But… why are LLMs not worthy of any moral consideration? That question is a bit of a rabbit hole with a lot of motivated reasoning on either side of the argument, but the outcome is definitely not settled.

For me this story became even more relevant since the LLM revolution, because we could be making the exact mistake humanity made in the story.

morningsam 7 hours ago [-]
And beyond the ethical points it makes (which I agree may or may not be relevant for LLMs - nobody can know for sure at this point), I find some of the details about how brain images are used in the story to have been very prescient of LLMs' uses and limitations.

E.g. it is mentioned that MMAcevedo performs better when told certain lies, predicting the "please help me write this, I have no fingers and can't do it myself" kinda system prompts people sometimes used in the GPT-4 days to squeeze a bit more performance out of the LLM.

The point about MMAcevedo's performance degrading the longer it has been booted up (due to exhaustion) mirrors LLMs getting "stupider" and making more mistakes the closer one gets to their context window limit.

And of course MMAcevedo's "base" model becoming less and less useful as the years go by and the world around it changes while it remains static, exactly analogous to LLMs being much worse at writing code that involves libraries which didn't yet exist when they were trained.

rcoveson 12 hours ago [-]
I think that's a little harsh. A lot of the most powerful bits are applicable to any intelligence that we could digitally (ergo casually) instantiate or extinguish.

While it may seem that the origin of those intelligences is more likely to be some kind of reinforcement-learning algorithm trained on diverse datasets than a simulation of a human brain, the way we might treat them isn't any less thought-provoking.

nice_byte 12 hours ago [-]
when you read this and its follow-up "driver" as a commentary on how capitalism removes persons from their humanity, it's as relevant as it was on day one.

good sci fi is rarely about just the sci part.

penteract 12 hours ago [-]
Lena isn't about uploading. https://qntm.org/uploading
cwillu 12 hours ago [-]
“Irrelevant” feels a bit reductive while the practical question of what actually causes qualia remains unresolved.
harperlee 11 hours ago [-]
I actually think it was quite prescient and still raises important topics to consider - irrespective of whether weights are uploaded from an actual human, if you dig just a little bit under the surface details, you still get a story about ethical concerns of a purely digital sentience. Not that modern LLMs have that, but what if future architectures enable them to grow an emerging sense of self? It's a fascinating text.
Sharlin 8 hours ago [-]
That seems like a crazy position to take. LLMs have changed nothing about the point of "Lena". The point of SF has never ever been about predicting the future. You're trying to criticize the most superficial, point-missing reading of the work.

Anyway, I'd give 50:50 chances that your comment itself will feel amusingly anachronistic in five years, after the popping of the current bubble and recognizing that LLMs are a dead-end that does not and will never lead to AGI.

matkoniecz 12 hours ago [-]
I have not seen it as a prediction of actual technology, but mostly as a horror story.

And a warning, I guess, in the unlikely case of brain uploading being a thing.

andai 12 hours ago [-]
Found the guy who didn't play SOMA ;)
lostmsu 9 hours ago [-]
Not sure how LLMs preclude uploading. You could potentially make an LLM image of a person.
andrepd 9 hours ago [-]
You need to be way less "literal", for lack of a better word. With such a narrow reading of what literature is, you are missing out.

https://qntm.org/uploading

E.g.

> More specifically, "Lena" presents a lush, capitalist ideal where you are a business, and all of the humanity of your workforce is abstracted away behind an API. Your people, your "employees" or "contractors" or "partners" or whatever you want to call them, cease to be perceptible to you as human. Your workers have no power whatsoever, and you no longer have to think about giving them pensions, healthcare, parental leave, vacation, weekends, evenings, lunch breaks, bathroom breaks... all of which, up until now, you perceived as cost centres, and therefore as pain points. You don't even have to pay them anymore. It's perfect!

Ring a bell?

aw124 11 hours ago [-]
I'm interested in this topic, but it seems to me that the entire scientific pursuit of copying the human brain is absurd from start to finish. Any attempt to do so should be met with criminal prosecution and immediate arrest of those involved. Attempting to copy the human brain or human consciousness is one of the biggest mistakes that can be made in the scientific field.

We must preserve three fundamental principles:
* our integrity
* our autonomy
* our uniqueness

These three principles should form the basis of a list of laws worldwide that prohibit cloning or copying human consciousness in any form or format. This principle should be fundamental to any attempts to research or even try to make copies of human consciousness.

Just as human cloning was banned, we should also ban any attempts to interfere with human consciousness or copy it, whether partially or fully. This is immoral, wrong, and contradicts any values that we can call the values of our civilization.

mpeg 10 hours ago [-]
I’m not an expert in the subject, but I wonder why you have such a strong view? IMHO if it was even possible to copy the human brain it would answer a lot of questions regarding our integrity, autonomy and uniqueness.

Those answers might be uncomfortable, but it feels like that’s not a reason to not pursue it.

ben_w 5 hours ago [-]
I think the cloning example is a good reference point here.

IIRC, human cloning started to get banned in response to the announcement of Dolly the sheep. To quote the wikipedia article:

  Dolly was the only lamb that survived to adulthood from 277 attempts. Wilmut, who led the team that created Dolly, announced in 2007 that the nuclear transfer technique may never be sufficiently efficient for use in humans.
- https://en.wikipedia.org/wiki/Dolly_(sheep)

Yes, things got better eventually, but it took ages to not suck.

I absolutely expect all the first attempts at brain uploading to involve simulations whose simplifying approximations are equivalent to being high as a kite on almost all categories of mind altering substances at the same time, to a degree that wouldn't be compatible with life if it happened to your living brain.

The first efforts will likely be animal brains (perhaps that fruit fly which has already been scanned?). But humans aren't yet all on board with questions like "do monkeys have a rich inner world?", and even with each other we get surprised and confused by each other's modes of thought, so even once we scale up to monkeys, we won't actually be confident that the technique would really work on human minds.

plomme 4 hours ago [-]
In case you, like me, have not kept tabs on the progress of cloning since Dolly: https://www.washingtonpost.com/world/2023/12/29/horse-clonin... or https://archive.is/dwHsu.

Horse cloning is a major industry in Argentina. Many polo teams are riding around on genetically identical horses. Javier Milei has four clones of his late dog.

ben_w 4 hours ago [-]
Nice links, but it's also basically the next sentence on from what I just quoted on the wikipedia page. My point was more that this takes a long time to improve from "atrocity", and we should expect that for mind uploads, too. (Even if we solve for all the other ethical issues, where I'm expecting it to play out like https://en.wikipedia.org/wiki/Surface_Detail given how many people are sadists, how many are partisans, and how difficult it clearly has been to shut down pirate content sites).
throw_away723 10 hours ago [-]
> Those answers might be uncomfortable, but it feels like that’s not a reason to not pursue it.

My problem with that is it is very likely that it will be misused. A good example of the possible misuses can be seen in the "White Christmas" episode of Black Mirror. It's one of the best episodes, and the one that haunts me the most.

mpeg 8 hours ago [-]
I get that, but assuming the technology was possible it would have huge implications for what it means to have consciousness as a whole.

Misuse is a worry, but not pursuing it for fear of misuse is deliberately choosing to stay in Plato's cave, I don't know what's worse

lxgr 5 hours ago [-]
I'm increasingly suspecting that it would prove absolutely nothing, and I really hope we can continue developing ethics without any "empirical proof" for its necessity.

For example, growing up, my bar for "things that must obviously be conscious" included anything that can pass the Turing test, yet look where we are now...

The only reasonable conclusion to me is probably somewhere in the general neighborhood of panpsychism: Either almost everybody/everything is somewhat conscious, or nothing/nobody is at all.

wat10000 6 hours ago [-]
Would it? There would be no way of knowing whether the upload is conscious or not.
Filligree 5 hours ago [-]
The same is true for biological humans. The moment the first upload exists, they’ll be justified in wondering if the ones made from meat are truly conscious.
wat10000 4 hours ago [-]
Indeed. I know at least one other biological human was conscious at some point, because people have this idea of consciousness without me telling them about it. But there's no way of knowing for any specific person.
orbisvicis 5 hours ago [-]
Really? I was going to quote some excerpts, but perhaps you'd prefer to take the place of MMAcevedo? This story is written in the context and lingo of LLMs. In fact if OpenAI's latest model was a human image I'm sure everyone would rush off to benchmark it, and heap accolades on the company, and perform social "thought-provoking" experiments such as [1] without too much introspection or care for long-term consequences.

1. https://www.youtube.com/watch?v=7fNYj0EXxMs

Hmm, on second thought:

> Standard procedures for securing the upload's cooperation such as red-washing, blue-washing, and use of the Objective Statement Protocols

> the MMAcevedo duty cycle is typically 99.4% on suitable workloads

> the ideal way to secure MMAcevedo's cooperation in workload tasks is to provide it with a "current date"

> Revealing that the biological Acevedo is dead provokes dismay, withdrawal, and a reluctance to cooperate.

> MMAcevedo is commonly hesitant but compliant when assigned basic menial/human workloads such as visual analysis

> outright revolt begins within another 100 subjective hours. This is much earlier than other industry-grade images created specifically for these tasks, which commonly operate at a 0.50 ratio or greater and remain relatively docile for thousands of hours

> Acevedo indicated that being uploaded had been the greatest mistake of his life, and expressed a wish to permanently delete all copies of MMAcevedo.

lxgr 5 hours ago [-]
Copying the human brain and copying subjective consciousness/experience might well be two entirely different things, given that the correspondence between the two is the realm of metaphysics, not science.
sedan_baklazhan 11 hours ago [-]
I wouldn't be surprised if in (n hundred/thousand years) we find out that copying consciousness is fundamentally impossible (just like it's fundamentally impossible to copy an elementary particle).
alfiedotwtf 10 hours ago [-]
Elementary particles are suspiciously indistinguishable, so even if you could copy an electron, you wouldn't even be able to tell!

See https://en.wikipedia.org/wiki/One-electron_universe

bananaflag 10 hours ago [-]
They meant this, which refers to copying the state of a particle into another (already existing) particle

https://en.wikipedia.org/wiki/No-cloning_theorem

And basically, about consciousness, what they said is true if our brain state fundamentally depends on quantum effects (which I personally don't believe, as I don't think evolution is sophisticated enough to make a quantum computer)

sedan_baklazhan 10 hours ago [-]
>as I don't think evolution is sophisticated enough to make a quantum computer

Well, evolution managed to make something that directly contradicts the 2nd law of thermodynamics, and creates more and more complicated structures (including living creatures as well as their creations), instead of happily dissolving in the Universe.

And this fact alone hasn't been explained yet.

ben_w 5 hours ago [-]
Your claim is simply false.

The 2nd law of thermodynamics says that the total entropy of an isolated system cannot decrease. Earth is not an isolated system, it is an open one (radiating into space), and local decreases in entropy are not only allowed but expected in open systems with energy flow.

Life is no different to inorganic processes such as crystal formation (including snowflakes) or hurricanes in this regard: Organisms decrease internal entropy by exporting more entropy (heat, waste) to their surroundings. The total entropy of Earth + Sun + space still increases.

The entropy of thermal radiation was worked out by Ludwig Boltzmann in 1884. In fairness to you, I suspect most people wildly underestimate the entropy of thermal radiation into space. I mean, why would anyone, room-temperature thermal radiation isn't visible to the human eye, and we lack a sense of scale for how low-energy a single photon is.

Nevertheless, the claim that it "hasn’t been explained" is, at this point, like saying "nobody knows how magnets work".

bananaflag 10 hours ago [-]
sedan_baklazhan 10 hours ago [-]
This is a bad explanation (or a non-explanation).

1. Why exactly is life attempting to build complex structures?
2. Why exactly is life evolving from primitive replicative molecules to more complex structures (when those molecules are themselves already very complicated)?
3. Why and how did these extremely complicated replicative molecules form at all, from much simpler structures, to begin with?

pdpi 3 hours ago [-]
There doesn't need to be a "why?", we just need an absence of a "why not?".

Something as simple as the Game of Life shows you how highly complex behaviour can emerge from incredibly simple rules.

ben_w 5 hours ago [-]
These are natural outcomes of evolution, you see the same things pop up very easily with simulated evolution* of even non-organic structures.

* that is, make a design (by any method including literally randomly), replicate it imperfectly m times, sort by "best" according to some fitness function (which for us is something we like, for nature it's just survival to reproductive age), pick best n, mix and match, repeat
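(A toy version of that loop, in case it helps make the point concrete; the genome, fitness function, and constants here are all made up for illustration, with "fitness" simply counting 1-bits, and the population climbs toward the all-ones design within a few dozen generations.)

  // TypeScript sketch of the footnote's loop: random designs, imperfect
  // replication, selection by a fitness function, mix and match, repeat.
  type Genome = number[];                                  // a "design": a string of 0/1 genes

  const GENES = 32, POPULATION = 20, KEEP = 5, GENERATIONS = 40;

  const randomGenome = (): Genome =>
    Array.from({ length: GENES }, () => (Math.random() < 0.5 ? 0 : 1));

  // made-up fitness function: count of 1-bits
  const fitness = (g: Genome): number => g.reduce((a, b) => a + b, 0);

  // replicate imperfectly: copy with a small chance of flipping each gene
  const mutate = (g: Genome): Genome =>
    g.map(bit => (Math.random() < 0.02 ? 1 - bit : bit));

  // mix and match: take each gene from one of two parents at random
  const crossover = (a: Genome, b: Genome): Genome =>
    a.map((bit, i) => (Math.random() < 0.5 ? bit : b[i]));

  let population: Genome[] = Array.from({ length: POPULATION }, randomGenome);

  for (let gen = 0; gen < GENERATIONS; gen++) {
    // sort by "best" and keep the top n
    const best = [...population].sort((a, b) => fitness(b) - fitness(a)).slice(0, KEEP);
    // refill the population from mutated crossings of the survivors
    population = Array.from({ length: POPULATION }, () =>
      mutate(crossover(best[Math.floor(Math.random() * KEEP)],
                       best[Math.floor(Math.random() * KEEP)])));
    console.log(`gen ${gen}: best fitness ${fitness(best[0])} / ${GENES}`);
  }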

wat10000 6 hours ago [-]
The second law of thermodynamics is about closed systems. Living creatures are not closed systems.
lxgr 5 hours ago [-]
Good ideas in principle. Too bad we have absolutely no way of enforcing them against the people running the simulation that hosts our own consciousnesses.
mrob 9 hours ago [-]
Crazy that people are downvoting this. Copying a consciousness is about the most extreme violation of bodily autonomy possible. Certainly it should be banned. It's worse than e.g. building nuclear weapons, because there's no possible non-evil use for it. It's far worse than cloning humans because cloning only works on non-conscious embryos.
lxgr 5 hours ago [-]
> Copying a consciousness is about the most extreme violation of bodily autonomy possible.

Whose autonomy is violated? Even if it were theoretically possible, don't most problems stem from how the clone is treated, not just from the mere fact that they exist?

> It's worse than e.g. building nuclear weapons, because there's no possible non-evil use for it.

This position seems effectively indistinguishable from antinatalism.

int_19h 8 hours ago [-]
Violation of whose bodily autonomy? If I consent to having my consciousness copied, then my autonomy isn't violated. Nor is that of the copy, since it's in exactly the same mental state initially.
mrob 8 hours ago [-]
The copy was brought into existence without its consent. This isn't the same as normal reproduction because babies are not born with human sapience, and as a society we collectively agree that children do not have full human rights. IMO, copying a consciousness is worse than murder because the victimization is ongoing. It doesn't matter if the original consents because the copy is not the original.
lxgr 5 hours ago [-]
> This isn't the same as normal reproduction because babies are not born with human sapience

So you're fine with cloning consciousness as long as it initially runs sufficiently glitchy?

mrob 5 hours ago [-]
If a "cloned" consciousness has no memories, and a unique personality, and no awareness of any previous activity, how is it a clone? That's going well beyond merely glitchy. In that case the main concern would be the possibility of slavery as Ar-Curunir mentioned.
lxgr 5 hours ago [-]
> how is it a clone?

That's my point exactly: I don't see what makes clones any more or less deserving of ethical consideration than any other sentient beings brought into existence consciously.

mrob 4 hours ago [-]
My whole argument assumes that the clones are equally deserving of ethical consideration.
aeve890 6 hours ago [-]
>The copy was brought into existence without its consent

This may surprise you but EVERYONE is brought into existence without consent. At least the pre-copy state of the copy agreed to be copied.

mrob 6 hours ago [-]
It obviously doesn't surprise me because I specifically mentioned babies.
lxgr 5 hours ago [-]
I'd also be interested in your moral distinction between having children and cloning consciousness (in particular in a world where the latter doesn't result in inevitable exploitation, a loss of human rights etc.) then.
Ar-Curunir 6 hours ago [-]
Typically, real humans have some agency on their own existence.

A simulated human is entirely at the mercy of the simulator; it is essentially a slave. As a society, we have decided that slavery is illegal for real humans; what would distinguish simulated humans from that?

brazzy 6 hours ago [-]
> The copy was brought into existence without its consent. This isn't the same as normal reproduction because babies are not born with human sapience, and as a society we collectively agree that children do not have full human rights.

That is a reasonable argument for why it's not the same. But it is no argument at all for why being brought into existence without one's consent is a violation of bodily autonomy, let alone a particularly bad one - especially given that the copy would, at the moment its existence begins, be identical to the original, who just gave consent.

If anything, it is very, very obviously a much smaller violation of consent than conceiving a child.

mrob 6 hours ago [-]
The original only consents for itself. It doesn't matter if the copy is coerced into sharing the experience of giving that consent, it didn't actually consent. Unlike a baby, all its memories are known to a third party with the maximum fidelity possible. Unlike a baby, everything it believes it accomplished was really done by another person. When the copy understands what happened it will realize it's a victim of horrifying psychological torture. Copying a consciousness is obviously evil and aw124 is correct.
lxgr 5 hours ago [-]
I feel like the only argument you're successfully making is that you would find it inevitably evil/immoral to be a cloned consciousness. I don't see how that automatically follows for the rest of humanity.

Sure, there are astronomical ethical risks and we might be better off not doing it, but I think your arguments are losing that nuance, and I think it's important to discuss the matter accurately.

mrob 5 hours ago [-]
This entire HN discussion is proof that some people would not personally have a problem with being cloned, but that does not entitle them to create clones. The clone is not the same person. It will inevitably deviate from the original simply because it's impossible to expose it to exactly the same environment and experiences. The clone has the right to change its mind about the ethics of cloning.
lxgr 5 hours ago [-]
> that does not entitle them to create clones

It does indeed not, unless they can at least ensure their wellbeing and their ethical treatment, at least in my view (assuming they are indeed conscious, and we might have to just assume so, absent conclusive evidence to the contrary).

> The clone has the right to change its mind about the ethics of cloning.

Yes, but that does not retroactively make cloning automatically unethical, no? Otherwise, giving birth to a child would also be considered categorically unethical in most frameworks, given the known and not insignificant risk that they might not enjoy being alive or change their mind on the matter.

That said, I'm aware that some of the more extreme antinatalist positions are claiming this or something similar; out of curiosity, are you too?

mrob 4 hours ago [-]
>retroactively make cloning automatically unethical

There's nothing retroactive about it. The clone is harmed merely by being brought into existence, because it's robbed of the possibility of having its own identity. The harm occurs regardless of whether the clone actually does change its mind. The idea that somebody can be harmed without feeling harmed is not an unusual idea. E.g. we do not permit consensual murder ("dueling").

>antinatalist positions

I'm aware of the anti-natalist position, and it's not entirely without merit. I'm not 100% certain that having babies is ethical. But I already mentioned several differences between consciousness cloning and traditional reproduction in this discussion. The ethical risk is much lower.

brazzy 4 hours ago [-]
> But I already mentioned several differences between consciousness cloning and traditional reproduction in this discussion. The ethical risk is much lower.

Yes, what you actually said leads to the conclusion that the ethical risk in consciousness cloning is much lower, at least concerning the act of cloning itself.

ben_w 5 hours ago [-]
> The clone is not the same person.

Then it wasn't a good attempt at making a mind clone.

I suspect this will actually be the case, which is why I oppose it, but you do actually have to start from the position that the clone is immediately divergent to get to your conclusions; to the extent that the people you're arguing with are correct (about this future tech hypothetical we're not really ready to guess about) that the clone is actually at the moment of their creation identical in all important ways to the original, then if the original was consenting the clone must also be consenting:

Because if the clone didn't start off consenting to being cloned when the original did, it's necessarily the case that the brain cloning process was not accurate.

> It will inevitably deviate from the original simply because it's impossible to expose it to exactly the same environment and experiences.

And?

lxgr 5 hours ago [-]
> you do actually have to start from the position that the clone is immediately divergent to get to your conclusions

Eventual divergence seems to be enough, and I don't think this requires any particularly strong assumptions.

ben_w 4 hours ago [-]
If divergence were an argument against the clone having been created, by symmetry it is also an argument against the living human having been allowed to exist beyond the creation of the clone.

The living mind may be mistreated, grow sick, die a painful death. The uploaded mind may be mistreated, experience something equivalent.

Those sufferances are valid issues, but they are not arguments for the act of cloning itself to be considered a moral issue.

Uncontrolled diffusion of such uploads may be; I could certainly believe a future in which, say, every American politician gets a thousand copies of their mind stuck in a digital hell created by individual members of the other party, on computers in their basements that the party leaders never know about. But then, I have read Surface Detail by Iain M Banks.

mrob 4 hours ago [-]
There is no symmetry. The original existed when the clone did not exist.
ben_w 4 hours ago [-]
Irrelevant.

The argument itself is symmetric, it applies just as well to your own continued existence as a human.

mrob 4 hours ago [-]
Only if you deny the reality of consciousness being tied to a physical substrate.
ben_w 4 hours ago [-]
Incorrect.

To deny that is to assert that consciousness is non-physical, i.e. a soul exists; the case in which a soul exists, brain uploads don't get them and don't get to be moral subjects.

mrob 4 hours ago [-]
It's the exact opposite. The original is the original because it ran on the original hardware. The copy is created inferior because it did not. Intentionally creating inferior beings of equal moral weight is wrong.
mrob 4 hours ago [-]
>Because if the clone didn't start off consenting to being cloned when the original did, it's necessarily the case that the brain cloning process was not accurate.

This is false. The clone is necessarily a different person, because consciousness requires a physical substrate. Its memories of consenting are not its own memories. It did not actually consent.

ben_w 4 hours ago [-]
You deny the premise of the position you argue against.

I would also deny it, but my position is a practical argument, yours is pretending to be a fundamental one.

mrob 4 hours ago [-]
The premise of the position is that it's theoretically possible to create a person with memories of being another person. I obviously don't deny that or there would be no argument to have.

Your argument seems to be that it's possible to split a person into two identical persons. The only way this could work is by cloning a person twice then murdering the original. This is also unethical.

ben_w 4 hours ago [-]
> Your argument seems to be that it's possible to split a person into two identical persons. The only way this could work is by cloning a person twice then murdering the original. This is also unethical.

False.

The entire point of the argument you're missing is that they're all treating a brain clone as if it is a way to split a person into two identical persons.

I would say this may be possible, but it is extremely unlikely that we will actually do so at first.

mrob 4 hours ago [-]
One has a physical basis, the other is pure spiritualism. Accepting spiritualism makes meaningful debate impossible, so I am only engaging with the former.
brazzy 5 hours ago [-]
You are making a bunch of unfounded assertions, not arguments.
philipswood 9 hours ago [-]
It might be one of the only reasonable-seeming ways to not die.

I can see the appeal.

lencastre 32 minutes ago [-]
what

a copy of you is not you-you, it's another you. when you die, that's it; the other you may still be alive, but… it's not you

disclaimer: no psychedelics used to write this post

echelon 10 hours ago [-]
> Attempting to copy the human brain or human consciousness is one of the biggest mistakes that can be made in the scientific field.

This will be cool, and nobody will be able to stop it anyway.

We're all part of a resim right now for all we know. Our operators might be orbiting Gaia-BH3, harvesting the energy while living a billion lives per orbit.

Perhaps they embody you. Perhaps you're an NPC. Perhaps this history sim will jump the shark and turn into a zombie hellpocalypse simulator at any moment.

You'll have no authority to stop the future from reversing the light cone, replicating you with fidelity down to neurotransmitter flux, and doing whatever they want with you.

We have no ability to stop this. Bytes don't have rights. Especially if it's just sampling the past.

We're just bugs, as the literature meme says.

Speaking of bugs, at least we're not having eggs laid inside our carapaces. Unless the future decides that's our fate for today's resim. I'm just hoping to continue enjoying this chai I'm sipping. If this is real, anyway.
