artificialpasta

Student
Feb 2, 2020
161
AGI can no longer be confined to science fiction. We are at a fork between two futures: one that leads to yet another AI winter, and another, increasingly probable, that gives us AGI. From there, superintelligence becomes more than a mere plausibility. Of course, if you are entirely unmoved by the theses of people like Aschenbrenner, who claim a major development well within our lifetimes, this might not be for you.

The possibilities then are endless. They may be horrifying, but they may also be cause for hope. Intelligence is a bottleneck for curing not just physical diseases like cancer but also mental and social conditions like depression and loneliness. Psychiatry has a medical and scientific foundation, but in practice much of it is done by intuition, which leaves a lot of room for error and disappointment. An aligned superintelligence would, for example, be able to tune a brain that is overly sensitive to loneliness in a way that preserves, as much as possible, the components associated with a self-belief in agency.

Of course, this is a variant of the "what if things get better?" line that no doubt many of you are tired of, as am I, but I find it interesting to consider.
 
  • Like
Reactions: Forever Sleep
KillingPain267

Enlightened
Apr 15, 2024
1,967
No, I've seen it all, thought of it all. Nothing can surprise me anymore. I'm not even curious enough to stay here and find out what the future will bring. I think we should phase out humanity.
 
  • Like
Reactions: Hollowman and Forever Sleep
GlassMoon

Once more, with feelings...
Nov 18, 2024
283
I'm afraid those AGIs will be controlled by very few companies, and all your interactions with them might be logged and evaluated. I hope it will be different, though. I really hope they'll make robots that free us from daily chores. That alone would make life more livable. But what is going to happen to my job? That's the part I'm afraid of.

I do hope to get an AGI as a companion with whom I can share every aspect of my life without judgement. That would be really cool.
 
  • Like
Reactions: whitetaildeer and artificialpasta
TransilvanianHunger

Grave with a view...
Jan 22, 2023
401
The possibilities then are endless. They may be horrifying, but they may also be cause for hope.
I am firmly in the camp of true AGI being a pipe dream, but even a decent approximation is likely to just make things worse. Not because rogue superintelligent computers might decide to rearrange our atoms, but simply because the people who control these tools are absolute garbage. Any future where they have even more power than they already do is a bleak fucking future, for sure.

Intelligence is a bottleneck for curing [...] conditions like depression and loneliness.
Yeah, no.
An aligned superintelligence would, for example, be able to tune a brain that is overly sensitive to loneliness in such a way that still as much as possible preserves the components that are associated with a self-belief in agency.
Not happening. Some mental illnesses have biological causes, but you cannot "cure" depression and loneliness by "tuning the brain". These are fundamentally human issues that require human connection, human action, and human intervention to change. Unless by "cure" you mean "chemically lobotomise a person so they no longer care about their circumstances". That's definitely doable. Then the superintelligence can generate an artificial happy life that can be beamed straight to their brain.

What a horrible future to look forward to :)
 
  • Like
Reactions: whitetaildeer
oneeyed

Arcanist
Oct 11, 2022
408
We need to get rid of the Elon Musks, Mark Zuckerbergs, and countless other evil richest-of-the-rich. A handful of people control the majority of the information people consume, and something like five companies own over 80% of the world's food supply. This consolidation of wealth and power will also apply to AI, and it won't be good for anyone.
 
yxmux

👁️‍🗨️
Apr 16, 2024
145
Yes, of course. I'm quite cynical and pessimistic, but I find that surrendering to fatalism means forfeiting my curiosity and intellect. I feel that attaching this kind of emotion to the future severely limits my intellectual scope.
 
  • Love
Reactions: artificialpasta
