This should be illegal; companies should be forced to open-source games (or at least provide the code to people who bought them) if they decide to discontinue them, so people can preserve them on their own.

  • @chicken@lemmy.dbzer0.com
    1 year ago

    experiments where YouTubers downloaded and reuploaded their own video 100 times, it very quickly degrades

    That just means YouTube’s software uses lossy compression; that is a YouTube problem, not a digital media problem. Are you familiar with the concept of file hashing? A short string can be derived from a file such that if any bit of the file is altered, it will produce a different hash. This can be used in combination with other methods to ensure perfect data consistency: for example, a torrent that remains well seeded won’t degrade, because the software checks each piece against its hash, so if anyone’s copy changes at all, whether from physical degradation of a hard drive or any other reason, the error will be recognized and routed around. If you don’t want to rely on other people to preserve something, there is always RAID, a decades-old technology that likewise prevents data from changing or being lost, assuming you maintain your hardware and replace disks as they break.
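
    To make the hashing idea concrete, here’s a minimal sketch using Python’s standard hashlib (the file name is made up for illustration):

    ```python
    import hashlib

    def sha256_of(path: str) -> str:
        """Return the SHA-256 hex digest of a file, read in 1 MiB chunks."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    # Flipping even a single bit of the file produces a completely different digest.
    print(sha256_of("game_backup.iso"))  # hypothetical file name
    ```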

    Here’s the fundamental reason you’re wrong about this: computers are capable of accounting for every bit, conclusively determining whether even one of them has changed, and restoring from redundant backup. If someone wants to perfectly preserve a digital file and has the necessary resources and knowledge, they can easily do so. No offense, but what you are saying ignores a basic property of how computers work and what they are capable of.
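
    As a simplified sketch of that detect-and-restore loop, assuming several copies of a file and a digest recorded when it was first archived (the paths and digest below are hypothetical placeholders):

    ```python
    import hashlib
    import shutil

    def digest(path: str) -> str:
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    KNOWN_GOOD = "ab3f..."  # placeholder: digest recorded at archival time
    copies = ["/mnt/disk1/file.bin", "/mnt/disk2/file.bin", "/mnt/disk3/file.bin"]

    good = [p for p in copies if digest(p) == KNOWN_GOOD]
    assert good, "all copies corrupted at once -- only then is data actually lost"
    for p in copies:
        if p not in good:
            shutil.copyfile(good[0], p)  # restore each corrupted copy from a good one
    ```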

      • @chicken@lemmy.dbzer0.com
        1 year ago

        Computers might be able to account for every bit with the use of parity files and backups with frequent parity checks

        Yes, and this can be done through mostly automatic or distributed processes.

        even the most complex system of data storage can fail or degrade eventually.

        I wouldn’t describe it as complex, just the bare minimum of what is required to actually preserve data with no loss. All physical media may degrade through physical processes, but redundant systems can do better.
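
        As a toy illustration of how RAID-style redundancy does better than any single medium (real RAID works at the block-device level, but the XOR parity idea is the same):

        ```python
        # Three data blocks plus one XOR parity block (simplified RAID-4/5 parity).
        blocks = [b"\x01\x02", b"\x10\x20", b"\xff\x00"]
        parity = bytes(a ^ b ^ c for a, b, c in zip(*blocks))

        # Suppose blocks[1] is lost to a disk failure: XOR the survivors with parity.
        recovered = bytes(a ^ c ^ p for a, c, p in zip(blocks[0], blocks[2], parity))
        assert recovered == blocks[1]  # the missing block is reconstructed exactly
        ```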

        but the fact is most people aren’t running a server with 4 separately powered and monitored drives as their home computer

        It isn’t hard to seed a torrent. If a group of people want to preserve a file, they can do it this way, perfectly, forever, so long as there remain people willing to devote space and bandwidth.

        We live in a world of problems, like the YouTube problem, compression problems, encoding problems, etc. We do because we chose efficiency and ease of use over permanency.

        All of these problems boil down to intent. Do people intend to preserve a file, do they not care, or do they actively favor degradation? In the case of the OP game, it seems the last must be true. Same with YouTube, same with all those media companies removing shows and movies entirely from public availability, same with a lot of companies. If someone wants to preserve something, they choose the correct algorithms; simple as that. There isn’t necessarily much of a tradeoff in efficiency or ease of use: disk space is cheap, bandwidth is cheap, and the technology is mature and not complicated to use. Long-term physical storage can be part of that, but it isn’t a replacement for intent or process.

          • @chicken@lemmy.dbzer0.com
            1 year ago

            I am saying the most complex system will fail.

            And I am saying that complexity has little to do with it, and that a system can exist which will not fail.

            it’s not going to last for a thousand years

            Specifically why not? What is unrealistic about this scenario, assuming enough people care to continue the preservation effort? For any data to be lost, all nodes must fail simultaneously. If each node fails at any given time with some fixed, independent probability P, then the probability of all N nodes failing at once is P^N, which shrinks exponentially with N. You very quickly reach astronomically low probabilities; 1000 years is nothing, and could be safely accomplished with a relatively low number of peers. Maybe there are external factors that make it less realistic, like whether new generations will even care about preserving the data, but considering only the system itself, it is entirely realistic.
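
            To put rough numbers on it (the 5% annual per-node failure rate is an assumption for illustration, not data):

            ```python
            # Probability that ALL N independent nodes fail in the same year.
            p = 0.05  # assumed annual failure probability per node
            for n in (5, 10, 20):
                print(n, p ** n)
            # 5  -> ~3.1e-07
            # 10 -> ~9.8e-14
            # 20 -> ~9.5e-27  (still negligible even summed over 1000 years)
            ```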