Does Lemmy have any communities dedicated to archiving/hoarding data?
This post foreshadowed today’s AWS outage.
👀
I have been archiving Linux builds for the last 20 years, so I can effectively install Linux on almost any hardware from 1998-ish onward.
I have been archiving Docker images to my locally hosted GitLab server for the past 3–5 years (not sure when I started, tbh). I’ve got around 100 GB of images, ranging from base OS images to full app images like Plex, ffmpeg, etc.
I also have been archiving FOSS projects into my GitLab and have been using pipelines to ensure they remain up to date.
The only thing I lack is packages from package managers like pip, bundler, npm, yum/dnf, and apt. There’s just so much to cache that it’s nigh impossible to get everything archived.
I have even set up my own local CDN for JS imports in HTML. I use rewrite rules in nginx to redirect them to my local sources.
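In case anyone wants to copy the idea, here’s a minimal nginx sketch; the mirror path /srv/cdn and the CDN hostname are placeholders for illustration, not my actual config:

```nginx
# Hypothetical sketch: answer requests for a public JS CDN from a local mirror.
# Assumes local DNS points cdn.example-cdn.net at this box and mirrored
# files live under /srv/cdn using the original URL paths.
server {
    listen 80;
    server_name cdn.example-cdn.net;

    root /srv/cdn;

    location / {
        # Serve the local copy; 404 instead of leaking out to the internet.
        try_files $uri =404;
    }
}
```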
my goal is to be as self-sustaining on local hosting as possible.
You’re awesome. Keep up the good work.
Welcome to datahoarders.
We’ve been here for decades.
Also, follow 3-2-1, people: 3 copies of your data, on 2 different storage media, 1 of them offsite.
For Wikipedia you’ll want to use Kiwix. A full backup of Wikipedia is only like 100 GB, and I think that includes pictures too.
Last time I updated, it was closer to 120 GB, but if you’re not sweating 100 GB then an extra 20 isn’t going to bother anyone these days.
Also, thanks for reminding me that I need to check my dates and update.
EDIT: you can also easily configure an SBC like a Raspberry Pi (or any of the clones) to boot, put its Wi-Fi into access point mode, and serve Kiwix as a website that anyone on the local AP Wi-Fi network can connect to and query… and it’ll run off a USB battery pack. I have one kicking around the house somewhere.
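The serving half of that setup is basically one command; the port and ZIM directory below are assumptions, so adjust to taste:

```shell
# Hypothetical sketch: serve every ZIM in /data/zim over HTTP on port 80.
# Assumes kiwix-tools is installed (e.g. apt install kiwix-tools).
kiwix-serve --port 80 /data/zim/*.zim
```

Pair that with your distro’s access point mode (hostapd/dnsmasq or similar) and any phone on the AP can browse it.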
Just built one of those using DietPi as the OS and an NVMe M.2 drive for storage. I have many different ZIMs and several services running, and I’m only using about 270 GB.
Works great for offline use. Probably should add an ISO or two as well.
What other services are you running?
@fmstrat@lemmy.world asked what else I was running in a sibling comment to yours and I didn’t have an answer because I’m not… yet : )
DietPi makes it dead simple to run most of these things, as their “software suite” is pretty robust and simple to set up.
For “user facing” applications:
- Homer Dashboard as the landing page when going to the .local address in a browser
- Kiwix for the ZIMs
- Hedgedoc for personal note taking/wiki
- Lychee, a very lightweight photo album maker/viewer, for keepsake photos
For “admin side” stuff:
- Portainer to manage the containers/stacks
- Watchtower to auto-update the containers while they’re still network connected
- Transmission daemonized to download and seed the ZIMs or anything else non-pirate related
- jojo2357’s ZIM updater on a cron job to auto-update ZIMs while they’re still network connected
- DietPi-Dashboard as an all-in-one dashboard to monitor and control the RPi from a web interface. (Yeah I know I can do everything SSH’ing in but I’m lazy.)
- File Browser, just in case I want other people to have access to files; but since it’s in maintenance mode and I’m unsure I want others to have access, I might strip it out
I try to use containers from LinuxServer.io whenever possible. Mostly just cause it’s what I do on my main server.
I’m still looking at adding/removing things as I get more time to sit down, but I’m pretty happy with its current state.
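If anyone wants to replicate the Kiwix piece of that stack, here’s a minimal compose sketch; the image tag, host path, and port mapping are assumptions rather than my exact setup:

```yaml
# Hypothetical sketch: kiwix-serve as a container, manageable from Portainer.
services:
  kiwix:
    image: ghcr.io/kiwix/kiwix-serve:latest
    command: "*.zim"          # serve every ZIM found in /data
    volumes:
      - /srv/zims:/data       # host directory holding the ZIM files
    ports:
      - "8080:8080"           # kiwix-serve's default HTTP port
    restart: unless-stopped
```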
Do you recommend adding anything else to it?
For instance, OSM maps?
I’ve been thinking about running the Kiwix app + OSMAnd on an old Android phone and auto updating it once a year.
That’s a good question (and a good idea) that I hadn’t really thought about past a collection of ZIMs. The one I built advertises its own AP SSID that anyone can connect to, and the ZIMs are served via kiwix-serve on HTTP/80. That is, I wanted a single, low-power, headless device that multiple people could use simultaneously via Wi-Fi and a browser, rather than a personal device. I hadn’t really thought about other helpful services past that. I mean, we’ve got a (wee) server, so why not use it? I like the idea of OSM, and their website is open source, but it has a lot of dependencies:
openstreetmap-website is a Ruby on Rails application that uses PostgreSQL as its database, and has a large number of dependencies for installation
A fully-functional openstreetmap-website installation depends on other services, including map tile servers and geocoding services, that are provided by other software. The default installation uses publicly-available services to help with development and testing.
I wonder how hard it would be to host everything it needs locally/offline… and what that would do to power consumption : )
Thanks for the idea - something to look into, for sure.
Saw your comment on mine and finally saw this one.
I’m gonna take a look at openstreetmap-tile-server and see about running that, since if all has gone to shit, who knows if GPS will work. At least it’s almost like a paper map and can be auto-updated as long as we still have internet. Quick Gist someone wrote here.
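If it helps, the overv/openstreetmap-tile-server image is roughly a two-step affair, import then serve; the region file path below is a placeholder, and it’s worth double-checking the project README since flags change between releases:

```shell
# Hypothetical sketch: local raster tile server from a region extract.
# 1) One-time import of a .osm.pbf extract (e.g. from Geofabrik) into a volume.
docker volume create osm-data
docker run \
  -v /absolute/path/region.osm.pbf:/data/region.osm.pbf \
  -v osm-data:/data/database \
  overv/openstreetmap-tile-server import

# 2) Serve tiles; a demo slippy map appears at http://localhost:8080
docker run -p 8080:80 -v osm-data:/data/database -d overv/openstreetmap-tile-server run
```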
Yeah, I feel the same in that it’s assuredly doable, but how hard is it?
If you’re able to dig in and make some progress, please tag me, because I’m interested but don’t have much time these days.
I might beat you to it. I’ve got Kiwix running in Docker, and I just did a PR to kiwix-zim-updater so it can run in Docker on a cron schedule next to the server. I’ve spun those up with Karakeep (a self-hosted web archive I use for bookmarking). Right now I’m adding a ZIM list feature to the updater to list available ZIMs by language, and then I’ll move on to OSM.
You’ll definitely beat me to it : D
Do me a favor and tag me when you post your how to?
I will do my best to remember hah
Yeah, also if you make a Zim wiki or convert a website into a ZIM, then you can run that stuff too. If you use Emacs, it’s easy to convert some pages to wikitext for Zim, too.
120GB not including Wikimedia 😉
Also, I wish they included OSM maps, not just the wiki.
You can also offline the whole of Project Gutenberg with Kiwix, it’s about 70GB IIRC.
I wonder if there’s any way to edit these files afterwards? They tend to be read-only, right? I must confess, I don’t have too much experience with this myself.
I would add in some ROM collections and book repositories as well. The whole library of Nintendo games is under a gig and would go a long way toward entertaining people.
I also recommend downloading “Flashpoint archive” to have flash games and animations to stay entertained.
There is a 4 GB version and a 2.3 TB version.
Is that Flash exclusive or do they accept other games from that era?
I’m not sure, but I do think it’s just flash
This is just minor datahoarding. I do it, on an extreme level.
Neither are that bad honestly. I have jigdo scripts I run with every point release of Debian and have a copy of English Wikipedia on a Kiwix mirror I also host. Wikipedia is a tad over 100 GB. The source, arm64 and amd64 complete repos (DVD images) for Debian Trixie, including the network installer and a couple live boot images, are 353 GB.
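For anyone who hasn’t used jigdo, rebuilding a Debian DVD image is basically one command against the release’s .jigdo file; the URL below is an illustrative placeholder, so grab the real one from cdimage.debian.org:

```shell
# Hypothetical sketch: reassemble a Debian DVD ISO with jigdo-lite.
# jigdo-lite downloads the .template plus the individual packages from a
# mirror, rebuilds the ISO, and verifies its checksum.
jigdo-lite https://cdimage.debian.org/debian-cd/current/amd64/jigdo-dvd/debian-VERSION-amd64-DVD-1.jigdo
```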
Kiwix has copies of a LOT of stuff, including Wikipedia on their website. You can view their zim files with a desktop application or host your own web version. Their website is: https://kiwix.org/
If you want (or if Wikipedia is censored for you) you can also look at my mirror to see what a web hosted version looks like: https://kiwix.marcusadams.me/
Note: I use Anubis to help block scrapers. You should have no issues as a human, other than you may see a little anime girl for a second on first load, but every once in a while Brave has a disagreement with her and a page won’t load correctly. I’ve only seen it in Brave, and only rarely, but I’ve seen it once or twice, so I thought I’d mention it.
I thought the whole point of torrenting was to decentralise distribution. I use torrents to get my distros.
In my own little bubble, I thought that’s how most people got their distro.
What happens when they just cut the underwater cables? Torrenting a Linux distro over carrier pigeon would take ages.
Sneakernet to the rescue. Some of you are too young to know about walking around with boxes full of disks.
A wise man once said
Never underestimate the bandwidth of a station wagon full of tapes hurtling down the highway.
Pigeon latency is horrible, but the bandwidth is pretty great. You could probably load up an adult pigeon with at least 12 TB of media.
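For fun, the back-of-the-envelope math; the payload, distance, and airspeed are all rough assumptions (12 TB of microSD cards, a 50 km trip at 80 km/h):

```shell
# Rough effective bandwidth of a pigeon-based sneakernet link.
PAYLOAD_TB=12      # assumed payload in terabytes
DISTANCE_KM=50     # assumed one-way trip
SPEED_KMH=80       # assumed cruising speed
FLIGHT_SECONDS=$(( DISTANCE_KM * 3600 / SPEED_KMH ))   # 2250 s
PAYLOAD_MBIT=$(( PAYLOAD_TB * 8 * 1000000 ))           # 96,000,000 Mbit
echo "$(( PAYLOAD_MBIT / FLIGHT_SECONDS )) Mbit/s"
```

That works out to roughly 42 Gbit/s of effective throughput, with a ping time of about 37 minutes.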
https://en.wikipedia.org/wiki/IP_over_Avian_Carriers
Just gonna leave this here for whoever wants to read more on the methodology and potential risks.
A good way to see what the future of places like the U.S. might look like is to look at places like North Korea, where they do exactly this: move files around on flash media to avoid the state censors.
Tiny jump drives on pigeons is low key excellent imo
Okay so where do I find some cheap hard drives? Europe if possible :-)
Look for DVRs; they have huge HDDs in them, and you can find them at thrift stores for cheap.
Is there a context to this or just random thought?
You can ignore politics, but politics will not ignore you.
gestures at everything
I stumbled across this sort of fascinating area of doomsday prepping a few weeks back.
A nice addition to that: don’t just make it a USB stick, but a Raspberry Pi. So you’d have a reasonably low-powered computer you could easily take with you.
Not suggesting this one as it seems a bit expensive to me, but https://www.prepperdisk.com/products/prepper-disk-premium-over-512gb-of-survival-content?view=sl-8978CA41
Just built one of these myself. I went NVMe M.2 instead of an SD card to avoid data corruption. I know SD cards are fine if you don’t write to them a lot, but if you want to update or add your own stuff, that scares me. Plus NVMe is just so much faster.
deleted by creator
At this point, why not just use a phone running postmarketOS?
Last I checked (3 years ago) postmarketOS drained the pinephone battery in record time :(
Cause if ya wanna go overboard like I did: 1 TB of NVMe storage (can add more with an SD card if necessary) and 16 GB of RAM. Very little learning curve on my part, as I use SBCs often. Plus almost every Docker container and program I want works on RPi without any hassle.
There’s also more robust guides and community for RPi.
Just my thoughts.
Old PCs off Amazon usually come with a good, reliable 1–2 TB hard drive.
Years ago I bought a physical encyclopedia. I remember having one as a kid and using it for school reports. Also just looking through it can be cool. Learning about something you never knew existed is just a unique experience and doing it through a physical book just deepens the whole experience.
I also learned the practice of printing a physical encyclopedia is going out of fashion. I think there is only one company that still prints a yearly encyclopedia, and it’s not Encyclopedia Britannica of all things. Might have changed since I bought my copy, but go give some physical media some love if you can.
You’ll need about 500 GB of free space. Not too much of an ask, tbh.
It makes me really happy that people can say “500 GB … not too much of an ask” these days.
I know this because I actually do this. It’s more like ~300 GB of space, but it’s better to have even more, just in case.