I had a bunch of personal scripts to manage my music database. Maybe 10-ish scripts, a few hundred lines long at most, nothing too big. A while ago I wrapped them into a big Emacs org file for literate programming and tangling, so I could easily edit them in one place. I backed them up to at least three servers, both locally and in another building. I also have Cronopete running (a Linux implementation of macOS Time Machine), so everything is safe, right? Right?!

I didn't need the scripts for about 3 months, but today I wanted to use them and couldn't find them anywhere. Not on any backup server, not on the Cronopete drive. The only explanation I can think of is that I must have saved that org file on the backup server and then backed up over it (and it never got pulled by Cronopete, which of course doesn't look at the backup server). I will have to rewrite those scripts from scratch. FML.

  • thequickben@beehaw.org · 1 year ago

    I’ve made too many mistakes like this, so I check anything important into git. GitLab is easy to run locally.
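
    A minimal sketch of that workflow, for anyone starting from zero (the paths and remote URL below are made up for illustration):

    ```shell
    # Put the scripts under version control (paths are illustrative):
    cd ~/scripts
    git init
    git add .
    git commit -m "Initial import of music scripts"

    # Add any remote you trust and push -- a self-hosted GitLab,
    # or any machine you can reach over ssh:
    git remote add origin git@gitlab.local:me/music-scripts.git
    git push -u origin master   # or main, depending on your default branch
    ```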

    • tietze111@feddit.de · 1 year ago

      If it’s not in git, it’s not safe. I learned that the hard way as well… I guess we all do at some point.

      • Dunstabzugshaubitze@feddit.de · 1 year ago

        I guess some lessons need to be learned through pain.

        • Committing regularly.
        • Following the branch rules.
        • Writing tests.
        • Writing tests that test the desired behaviour, not the current behaviour.
        • Refactoring your code.
        • Not refactoring code you neither understand nor have tests for.
        • Actually reading code before merging a PR.
        • Not pulling in 23 unmaintained libraries to solve a simple problem.
        • Keeping your dependencies up to date.
        • That dirty hack will make your life harder.

        Yes, all those hurt. They sometimes still do. Most of us are not machines that turn caffeine into code, and we are never as clever as we think we are.

        • steph@lemmy.clueware.org · 1 year ago

          On a side note, w.r.t. keeping dependencies up to date, have a look at renovatebot. It creates a merge request for each and every dependency update, thus triggering a build to check that everything is still OK.
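
          If anyone wants to try it, getting started is roughly a single `renovate.json` at the repo root (this is the standard base preset; check the Renovate docs for the current preset names):

          ```json
          {
            "$schema": "https://docs.renovatebot.com/renovate-schema.json",
            "extends": ["config:base"]
          }
          ```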

    • DeltaWhy@beehaw.org · 1 year ago

      GitLab is pretty resource-heavy. If you want to self-host something, I prefer Gitea: very easy to set up, doesn’t require Docker, just a single binary.

      • donio@beehaw.org · 1 year ago (edited)

        Why bother with either of those for private personal repos though? Why not just regular remote repos over ssh?
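
        For anyone who hasn’t done this before, a plain ssh-reachable box is all it takes (hostname and paths below are made up):

        ```shell
        # On the remote machine, create a bare repository:
        ssh me@backupbox 'git init --bare repos/music-scripts.git'

        # On your workstation, point an existing repo at it and push:
        git remote add backup me@backupbox:repos/music-scripts.git
        git push -u backup master   # or main
        ```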

        • DeltaWhy@beehaw.org · 1 year ago

          That’s also an option - I’ve used gitolite before to set that up. In my case though I wanted to mirror repos from gitlab.com and github, and I might want to hook up CI and webhooks later on.

    • jsveiga@vlemmy.net · 1 year ago

      If you’re running GitLab locally just for yourself, it’s easier to simply create a network-shared directory and use it as a git repository. Git on your local machines can push/pull/clone to/from a directory (local or remote) just like to/from a git server.
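
      Concretely, that looks something like this (the mount point is just an example):

      ```shell
      # Create a bare repository on the shared directory:
      git init --bare /mnt/share/music-scripts.git

      # Any machine that mounts the share can clone, pull and push:
      git clone /mnt/share/music-scripts.git
      cd music-scripts
      # ...edit files, git add, git commit...
      git push origin master   # or main
      ```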