• 2 Posts
  • 426 Comments
Joined 4 years ago
Cake day: January 17th, 2022


  • Worthwhile yet tricky. Companies like OpenAI, Google, Meta, etc. are full of experts in statistics, and they have access to a lot of storage space. Say you use a service from those companies 4 hrs per day between 7am and 9pm, at a certain frequency, e.g. 10 requests/hour. Then suddenly, when you realize you actually do not trust them with your data, you make 10,000 req/hr for 1 hr: that’s a suspect pattern. They might be able to automatically roll back to before that “freak” event. They might still present your data to you, as a user, with the changes applied, but not in their internal databases.

    So… I’m not saying it’s not a good idea, nor that it’s not useful, but I bet doing it properly is hard. It’s probably MUCH harder than doing a GDPR (or equivalent) takeout request, then a deletion request, AND avoiding all services that might leverage your data from these providers.
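    The “suspect pattern” above can be sketched with basic statistics: compare the current request rate against the user’s history and flag large deviations. This is a hypothetical illustration (function name, threshold, and numbers are all made up), not how any provider actually does it:

```python
# Hypothetical sketch: flagging a deletion-driven traffic burst.
# `history` is past hourly request counts; a burst stands out as a
# large z-score against the historical mean.
from statistics import mean, stdev

def is_suspect(history, current, threshold=3.0):
    """Flag `current` if it deviates from `history` by > `threshold` sigmas."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return current != mu  # perfectly flat history: any change stands out
    return (current - mu) / sigma > threshold

usual = [9, 10, 11] * 20  # ~10 req/hr over a few weeks of use
print(is_suspect(usual, 10))      # normal hour -> False
print(is_suspect(usual, 10_000))  # the 10,000 req/hr "freak" hour -> True
```

    A real system would be far more elaborate (per-endpoint baselines, seasonality, etc.), which is part of why doing the original idea properly is hard.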







  • Agreed, but nobody forces you to use anything except ProtonMail or ProtonVPN. In fact I have a Visionary account and I mostly just use ProtonMail. I do use ProtonVPN, but I also have WireGuard. Also, my ProtonMail addresses are behind domains I host, so if tomorrow I decide to switch away from Proton, I can.

    So… sure, Proton is not perfect and centralization is bad, but IMHO that’s like saying Firefox is imperfect so it’s fine to use Chrome or Chromium browsers. An imperfect alternative to BigTech and surveillance capitalism is better than relying on the things you hate while waiting for something “perfect” that never comes.
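    The custom-domain point above is what makes switching cheap: the domain just points its mail records at the provider. A hypothetical zone-file fragment (host names per Proton’s custom-domain setup docs; TTLs and domain illustrative):

```
; example.org zone — mail delivered to Proton, addresses stay yours
@   3600  IN  MX  10  mail.protonmail.ch.
@   3600  IN  MX  20  mailsec.protonmail.ch.
; Switching providers later = swapping these two records;
; me@example.org keeps working.
```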


  • something watching, logging connections to everyone connected to that torrent

    Might be. FWIW there are quite a few ways to torrent in a rather private way: require encrypted connections, use a blocklist, stay behind a VPN, etc. But in the end you still share data with strangers; that’s the core premise. The whole point is to facilitate the sharing of data reliably, but who joins the pool is outside of the protocol itself.
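    One of those mitigations, the blocklist, is easy to sketch: refuse peers whose IP falls in a known-bad range. A minimal illustration using Python’s stdlib `ipaddress` module (ranges and addresses below are documentation/example ranges, not a real blocklist):

```python
# Toy peer blocklist check; real clients load large published lists.
from ipaddress import ip_address, ip_network

BLOCKLIST = [ip_network(n) for n in ("203.0.113.0/24", "198.51.100.0/25")]

def allowed(peer_ip: str) -> bool:
    """Return False if the peer falls inside any blocked range."""
    addr = ip_address(peer_ip)
    return not any(addr in net for net in BLOCKLIST)

print(allowed("198.51.100.7"))  # inside a blocked /25 -> False
print(allowed("192.0.2.44"))    # not listed -> True
```

    Of course this only filters peers you already know about; it doesn’t change the premise that anyone can join the swarm and observe it.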





  • Can’t talk about AMD, but I’m on NVIDIA, and I always followed https://wiki.debian.org/NvidiaGraphicsDrivers and never had the issues others seem to be having. I typically hear good things about AMD GPU support, on Debian and elsewhere, so I’m surprised.

    Now in practice, IMHO GPU support doesn’t matter much for a NAS, as you’re probably going headless (no monitor, mouse, or keyboard). You probably do want GPU support for transcoding, though, but here again I can’t advise for this brand of GPU. It should just rely on e.g. https://trac.ffmpeg.org/wiki/Hardware/AMF

    Finally, I’m a Debian user and I’m quite familiar with setting it up, locally or remotely. I also made ISOs for RPi based on Raspbian, so this post made me realize I never (at least I don’t remember) installed Debian headlessly; by that I mean booting a computer with no OS all the way to getting a working ssh connection established over LAN or WiFi. I relied on Imager for RPi configuration, or on making my own ISO via a microSD card (using dd), but it made me curious about preseeding (https://wiki.debian.org/DebianInstaller/Preseed), so I might tinker with it via QEMU. Advice welcome.
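    For anyone else curious, a minimal preseed.cfg for that headless scenario might look like this (key names are from the Debian preseed docs; hostname, username, and hash are placeholders):

```
# Hypothetical minimal preseed.cfg for an unattended headless Debian install
d-i debian-installer/locale string en_US.UTF-8
d-i keyboard-configuration/xkb-keymap select us
d-i netcfg/get_hostname string nas
d-i netcfg/get_domain string lan
# Regular user only; root login stays disabled
d-i passwd/root-login boolean false
d-i passwd/user-fullname string NAS Admin
d-i passwd/username string nasadmin
d-i passwd/user-password-crypted password [crypt(3) hash here]
# The one package that matters headless: sshd reachable after reboot
d-i pkgsel/include string openssh-server
d-i grub-installer/only_debian boolean true
d-i finish-install/reboot_in_progress note
```

    The installer can fetch it over the network by booting with something like `auto=true priority=critical url=http://…/preseed.cfg` on the kernel command line, which is exactly the kind of thing QEMU is handy for testing.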

    PS: based on a few other comments, consider minidlna over more complex setups. Consider WireGuard over Tailscale (or at least Headscale, for a version relying solely on your infrastructure), with e.g. wg-easy if you want to manage everything without 3rd parties.
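    For scale, the WireGuard route without any manager is just one small file on the NAS; keys and addresses below are placeholders (wg-easy generates the equivalent for you):

```
# Hypothetical /etc/wireguard/wg0.conf on the NAS
[Interface]
Address = 10.8.0.1/24
ListenPort = 51820
PrivateKey = <server-private-key>

[Peer]
# One block per client device
PublicKey = <client-public-key>
AllowedIPs = 10.8.0.2/32
```

    Bring it up with `wg-quick up wg0`; no third party ever sees the tunnel.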





  • Sounds absolutely stupid… and yet my (gaming) desktop (a CORSAIR ONE i180) remains untouched after nearly 6 years. I still play everything from indies to AAA to VR with it. I still work with it too, specifically VR prototyping, so dev.

    If I were to give it away, or use it as a self-hosted server with the GPU put to work on e.g. Immich or video transcoding, it would still do pretty well.

    So… IMHO it’s not a bad take, but damn, I remember I paid a LOT of money back then. As others pointed out: if you can afford it, sure. If you are not a professional, then probably not.