• Hey Guest,

    We wanted to share a quick update with the community.

    Our public expense ledger is now live, allowing anyone to see how donations are used to support the ongoing operation of the site.

    👉 View the ledger here

    Over the past year, increased regulatory pressure in multiple regions, such as from the UK's Ofcom and Australia's eSafety Commissioner, has led to higher operational costs, including infrastructure, security, and the need to work with more specialized service providers to keep the site online and stable.

    If you value the community and would like to help support its continued operation, donations are greatly appreciated. If you wish to donate via Bank Transfer or other options, please open a ticket.

    Donate via cryptocurrency:

    Bitcoin (BTC):
    Ethereum (ETH):
    Monero (XMR):
sensenmann

this will be the end of me
Jun 14, 2023
141
Since nothing is forever and the site could be gone any day (hope not), I decided to archive some threads. I didn't find any guide here on how to do it, so I am posting my method.

There is probably a faster and more "automated" method, but this one has worked well for me.

Here are the things you need:
  • The browser add-on SingleFile or SingleFileZ. I use Firefox, but both are also available for Chrome.
  • Python, to create and execute the script.
Now, you can use SingleFile to archive a single page, but if you want to archive a whole thread at once you will need some kind of script that generates the URLs for every page, which you then paste into SingleFile.

This Python script generates the URLs and writes them to a text file: open Notepad and paste in the script, put the template of the URL in base_text = "url-here", and, last thing, put the number of pages in num_strings = 123.
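The original script is not reproduced in this capture, so what follows is only a minimal sketch of what it might look like. The base_text and num_strings names come from the post; the {} page-number placeholder and the urls.txt output name are assumptions.

    # Sketch of a URL generator (not the original attached script).
    # base_text: template of the thread URL; {} stands in for the page number.
    base_text = "url-here/page-{}"
    # num_strings: number of pages in the thread.
    num_strings = 123

    # Write one URL per line to a text file in the same directory.
    with open("urls.txt", "w") as f:
        for page in range(1, num_strings + 1):
            f.write(base_text.format(page) + "\n")

    print(f"Wrote {num_strings} URLs to urls.txt")

Adjust the template to match however the forum numbers its thread pages.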

Save the script with a .py extension, then either double-click it or open a terminal in the script's directory and run scriptname.py; you will get a .txt file in the same directory with all the URLs.

The last thing is to copy all the URLs from the .txt file and paste them into SingleFile/SingleFileZ using the "Batch save URLs" option; it will download every page into its own .html file containing the pictures, text, etc.

EDIT: removed Selenium since it is actually not needed; it is only for automation.

Examples:
  • Url
  • Pages
  • Script
  • Text
  • Batch
Reactions: donxtwait, Praestat_Mori, ztem and 2 others
アホペンギン

…
Jul 10, 2023
2,191
This is quite interesting. It would be very useful in the case of SS being taken down. Unfortunately, though, you can't do shit on a phone so I screenshot things that I want to keep, lol.
 
Reactions: Praestat_Mori, Lost in a Dream and sensenmann
Darkover

Archangel
Jul 29, 2021
5,649
Someone already did a backup of sasu on MediaFire about a year ago; they used a web crawler. I can't find it now, even though I've searched high and low for it. I have it on an old laptop that broke down.
 
Reactions: アホペンギン and Praestat_Mori
アホペンギン

…
Jul 10, 2023
2,191
Darkover said:
Someone already did a backup of sasu on MediaFire about a year ago; they used a web crawler. I can't find it now, even though I've searched high and low for it. I have it on an old laptop that broke down.
That's awful… I hope someone else makes a backup of sasu too, because you lost the one you had.
 
