• 0 Posts
  • 10 Comments
Joined 1 year ago
Cake day: June 14th, 2023

  • While it sounds a bit hacky, I think this is an underrated solution. It’s actually quite a clever way to bypass the whole problem. Physics is your enemy here, not economics.

    This is kind of like trying to find an electric motor with the highest efficiency and torque at 1 RPM. It’s not theoretically impossible, but it’s not just a matter of price or design: you’re asking the equipment to do something it’s simply not good at, and asking it to do that thing really well. It can’t, certainly not affordably or without significant compromises in other areas. In the case of a motor, you’d be better off letting it spin at its much higher optimal RPM and gearing it down. Even though there’s a little loss in the geartrain, it’s still a much better solution overall, and that’s why essentially every low-speed motor is designed this way.

    In the case of an ammeter, it seems totally reasonable to bring it up into a more ideal operating range by adding a constant artificial load. In fact, high-precision/low-range multimeters and oscilloscopes usually do almost exactly the same thing internally with their probes, just in a somewhat more complex way behind the scenes.
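
    For example, here’s a rough sketch with made-up numbers, just to show the arithmetic: say your meter is only accurate well above 100 mA, but the device you care about draws a few mA. Put a known resistor across the supply as the artificial load, take the combined reading in the meter’s comfortable range, then subtract the load’s share.

    ```python
    # Made-up example values: recover a small current by adding a known
    # artificial load, then subtracting it back out of the total reading.
    SUPPLY_V = 12.0                   # supply voltage, assumed stable and known
    LOAD_R = 60.0                     # artificial load resistor (ohms) -> ~200 mA at 12 V
    load_current = SUPPLY_V / LOAD_R  # current the artificial load draws

    total_reading = 0.2051            # amperes, read where the meter is accurate
    device_current = total_reading - load_current

    print(f"artificial load:   {load_current * 1000:.1f} mA")
    print(f"device under test: {device_current * 1000:.1f} mA")
    ```

    The usual caveat applies: you’re subtracting two similar numbers, so the artificial load needs to be stable and known at least as precisely as the accuracy you want in the result.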


  • The end result is exactly the same.

    The difference is that the full iso can be installed on a computer without an internet connection. The normal iso contains copies of most or all of the relevant packages; they may not all be the latest and most up-to-date versions, but the bulk are enough to get you started. The net install, as the name suggests, requires an internet connection to download packages for anything beyond the most minimal, bare-bones configuration. That connection will usually be nearly as fast as the iso, if not faster, and it is guaranteed to have the latest updates available, which the iso may not. But while a fast connection is usually taken for granted nowadays, it is not always available in some situations and locations, it is not always convenient, and some hardware has trouble with the network stack that can be difficult to resolve before a full system is installed, or that requires specialized tools to configure or diagnose which are only available as packages.

    In almost all cases, the netinst works great and is the more efficient and sensible way to install. However, if it doesn’t work well in your particular situation, the full iso will be more reliable, at the cost of some redundancy that wastes disk space and time.

    Things like Windows updates and some large, complex software packages often come with similar “web” and “offline” installers that make the same distinction for the same reasons. The tradeoff is the same, and both options have valid use cases.


  • To be fair, in the case of something like a Linux ISO, you only need to be a tiny fraction of the target, or you may not need to be the target at all to become collateral damage. You only need to be worth $1 to the attacker if there are 99,999 other people downloading it too, or if there’s one other guy who is worth $99,999; and you don’t need to be worth anything if the person or organization they’re actually targeting is worth $10 million. Obviously there are other challenges involved in attacking a torrent swarm, like the fact that you’re unlikely to have a sole seeder with corrupted checksums, and a naive implementation will almost certainly end up with a corrupted file instead of a working attack. But to someone with the resources and motivation to plan something like this, it could get dangerous pretty quickly.

    Supply chain attacks are becoming an increasingly serious risk, and we do need to start looking at upgrading the security of things like the checksums we rely on, to harden them against attackers who are realizing that this can be a very effective and relatively cheap way to distribute malware widely.
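
    As a concrete example of the kind of check that matters here (a minimal sketch; the filename and expected digest are placeholders): verify the download against the checksum published on the project’s official site, fetched separately from the torrent itself.

    ```python
    # Sketch of verifying a downloaded ISO against a published SHA-256 checksum.
    # "distro.iso" and EXPECTED_SHA256 are placeholders, not real values.
    import hashlib

    EXPECTED_SHA256 = "0123456789abcdef..."  # copy from the project's official site

    def sha256_of(path, chunk_size=1 << 20):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    digest = sha256_of("distro.iso")
    print("OK" if digest == EXPECTED_SHA256 else f"MISMATCH: {digest}")
    ```

    A checksum only helps if you get it from somewhere the attacker doesn’t control, which is why projects that sign their checksum files (e.g. with GPG) make this kind of attack much harder.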


  • I still use Nextcloud for syncing documents and other basic stuff that is relatively simple. But as more and more got added, I started getting glacial sync times, heavy CPU use, and lots of conflicts. For higher-performance, more demanding sync tasks involving huge numbers of files, large file sizes, and rapid changes, I’ve started using Syncthing and am much, much happier with it. Nextcloud sync seems to be a jack of all trades, master of none, kind of thing, whereas Syncthing is a one-trick pony that does that trick very, very well.


  • I feel like you are the one confusing a “NAS device” or “NAS appliance”, meaning a device that is specifically designed and primarily intended to provide NAS services (i.e., its main attribute is large disks, with little design weight given to processing, RAM, or other components beyond what’s needed to serve files), with the NAS service itself, which can be provided by any generic device capable of both storage and networking, although often quite poorly.

    You are asserting that the term “NAS” in this thread refers exclusively to the former, the device/appliance; everyone else is assuming the latter. In fact, both are correct, and context suggests the latter, although given your behavior in this thread I’m sure you will promptly reply that only your interpretation is correct and everyone else is wrong. If you want to assert that, go right ahead and make yourself look foolish.



  • It is mostly a myth (and a scare tactic invented by copyright trolls and encouraged by overzealous virus scanners) that pirated games are always riddled with viruses. They certainly can be, if you download them from untrustworthy sources. But if you’re familiar with the actual piracy scene, you have to understand that trust is, and always will be, a huge part of it, and ways to build trust are built into the community; that’s why trust and reputation are valued even higher than the software itself. The names embedded in the torrents, the people and release groups they come from, and the sources where they’re distributed all have meaning to the community, and this is why. Nobody’s going to blow 20 years of reputation to try to sneak a virus into their keygen.

    All those virus scans that say “Virus detected! ALARM! ALARM!” on every keygen you download? If you look at the actual detection information about what was found, and dig deep enough through the obfuscated scary-severity wall of text, you’ll find that in almost all cases it’s just a generic, non-specific detection of “tools associated with piracy or hacking” or something along those lines. They all have their own ways of spinning it, but in every case it’s literally detecting the fact that it’s a keygen and saying “that’s scary! You don’t want pirated illegal software on your computer, right?! Don’t worry, I, your noble antivirus program, will helpfully delete it for you!”

    It’s not as scary as they want you to think; the fear just helps drive people back to paying for their software. It’s classic FUD tactics, and the antivirus companies are part of the same racket: they want you paying for their software too.


  • Owncloud is not fully open source; Nextcloud is. They have developed in different directions since the fork, but that remains the fundamental difference that split them apart in the first place. If that matters to you, Nextcloud is the right choice. If it doesn’t, use whichever you prefer and whichever has the features you need.


  • There’s going to be a bunch of caveats here, but basically…

    Assuming you’re using a NAT router to connect to the internet (basically everyone is nowadays): if you’re using a local LAN IP address (10.x.x.x, 192.168.x.x, or 172.16.x.x through 172.31.x.x), then nobody on the internet can access any services on that IP unless you specifically port forward them through your router. Assuming there’s nobody dangerous on your local network (and nobody gets a remote-access virus), and your router itself is not hackable, then yes, it’s entirely safe.
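
    If you ever want to double-check whether an address falls inside those private ranges, Python’s standard ipaddress module can tell you (the addresses below are just arbitrary examples):

    ```python
    # Quick check of whether an address is in a private (not internet-routable) range.
    import ipaddress

    for addr in ["192.168.1.50", "172.20.0.12", "8.8.8.8"]:
        ip = ipaddress.ip_address(addr)
        print(addr, "private" if ip.is_private else "public")
    ```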

    You don’t technically need a public domain name to set up an SSL certificate, but to streamline the process in a way that modern software trusts, you do. A self-signed certificate can be created for any IP address, and it will provide full encryption and prevent interception of traffic between established clients, but you will get a scary warning that the certificate is self-signed every time you connect a new client or browser, because it cannot be verified against a trusted authority. It still works; it’s just (intentionally) scary, because the software doesn’t know what you’re doing with it and has no way to establish trust. You probably don’t need this, but it is an option. Guides for setting up a self-signed certificate vary in complexity depending on which web server you’re using, so if you go that route, I’d recommend finding the simplest guide you can for your web server; you don’t need anything complex for this. The keywords you’re looking for are “self-signed certificate”.
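
    If you’re curious what’s actually involved, here’s a minimal sketch of generating a self-signed certificate with Python’s cryptography package (the LAN IP, filenames, and validity period are made-up examples; most guides do the same thing with a single openssl req -x509 command instead):

    ```python
    # Minimal self-signed certificate for a LAN IP address.
    # The address, filenames, and validity period are placeholder choices.
    import datetime
    import ipaddress
    from cryptography import x509
    from cryptography.x509.oid import NameOID
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import rsa

    lan_ip = ipaddress.ip_address("192.168.1.50")    # placeholder LAN address
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, str(lan_ip))])
    now = datetime.datetime.now(datetime.timezone.utc)
    cert = (
        x509.CertificateBuilder()
        .subject_name(name)
        .issuer_name(name)                           # self-signed: issuer == subject
        .public_key(key.public_key())
        .serial_number(x509.random_serial_number())
        .not_valid_before(now)
        .not_valid_after(now + datetime.timedelta(days=365))
        .add_extension(x509.SubjectAlternativeName([x509.IPAddress(lan_ip)]), critical=False)
        .sign(key, hashes.SHA256())
    )

    with open("key.pem", "wb") as f:
        f.write(key.private_bytes(
            encoding=serialization.Encoding.PEM,
            format=serialization.PrivateFormat.TraditionalOpenSSL,
            encryption_algorithm=serialization.NoEncryption(),
        ))
    with open("cert.pem", "wb") as f:
        f.write(cert.public_bytes(serialization.Encoding.PEM))
    ```

    Point your web server at cert.pem and key.pem and you get encryption right away; the “untrusted certificate” warning only goes away if you import cert.pem into each client that needs to trust it.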

    Welcome to self-hosting. Nextcloud is a great thing to self-host, too. Hope you enjoy.