• 0 Posts
  • 75 Comments
Joined 1 year ago
Cake day: August 8th, 2023

  • We tried a game called Icarus last weekend because of a free weekend. It was okay for me, but I also have a pretty high-end PC driving the 1080p monitor connected to it. Even for me the game was quite janky, and for my friends with older hardware it wasn’t a good time at all. One friend’s microphone randomly turned into a max-volume noise generator on multiple occasions while playing, something that had never happened before. Another (who plays on Linux) experienced constant crashes and weird behaviour.

    After that disappointment we went back in time to The Showdown Effect for the first time in years, which was still as hilarious as ever. Apparently there’s an updated free-to-play version now (called Reloaded or something?), so we’ll have to check that out. I’d recommend it if you’re looking to cause some mayhem with friends.

    Edit: Oh yeah, I also bought Grid Legends because it was on a big sale and I like racing games. The driving physics don’t annoy me like the ones in The Crew or Forza, so I’m having a good time with it so far.


  • I was just about to post the same thing. I’ve been using Linux for almost 10 years, and I never really understood the folder layout in this much detail anyway. My reasoning was always that /lib was more system-wide and /usr/lib was for stuff installed just for me. That never made sense though, since there is only one /usr and not one per user. But I never thought about it any further, I just let it be.
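    For what it’s worth, on many modern distros the usr merge has erased the distinction entirely: the classic top-level directories are just symlinks into /usr. A quick sketch to check (the path list is an assumption about a typical Linux layout, and the output will vary per system):

    ```python
    import os

    # Report whether the classic top-level directories are real directories
    # or (post usr-merge) just symlinks into /usr.
    for path in ["/bin", "/sbin", "/lib", "/lib64"]:
        if os.path.islink(path):
            print(f"{path} -> {os.readlink(path)}")
        elif os.path.isdir(path):
            print(f"{path} is a real directory")
    ```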




  • Imo it’s both overblown and very impressive. Deep neural networks are capable of many things that we didn’t even imagine 10 years ago. We’ve made huge leaps.

    The problem is that every company is putting “AI” in everything, and techbros and managers are heavily overvaluing the technology. Most companies don’t need AI; in many cases there are far better methods to do what they want to do. A fridge or washing machine doesn’t need AI, the website of whatever company doesn’t need an AI assistant, and most people don’t need an AI accelerator in their laptop or phone.



  • Machine learning and compression have always been closely tied together. A model tries to learn the “rules” that describe the data rather than memorizing all of it.

    I remember implementing a paper older than me in our “Information Theory” course at university that treated the creation of a decision tree as compression. Their algorithm considered the cost of sending both the tree itself and all the exceptions to it. If a node in the tree increased the overall message size, it would simply be pruned. This way they ensured that you wouldn’t draw conclusions from very little data and would only encode the big patterns in the data (rough sketch of the idea below).

    Fundamentally it is compression, just a far better method of it than all the models we had before.

    EDIT: The paper I’m talking about is “Inferring Decision Trees Using the Minimum Description Length Principle” by J. Ross Quinlan & Ronald L. Rivest.
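    A minimal sketch of that MDL trade-off, assuming an illustrative cost model rather than the paper’s exact encoding (the function names and the binomial coding are my own simplification):

    ```python
    import math

    def exception_cost(n_errors: int, n_examples: int) -> float:
        # Bits needed to say which of the n_examples items are exceptions:
        # log2 of the number of possible error sets of that size.
        return math.log2(math.comb(n_examples, n_errors)) if n_examples else 0.0

    def keep_split(subtree_bits: float, errors_as_leaf: int,
                   errors_as_subtree: int, n_examples: int) -> bool:
        # Keep a subtree only if transmitting it plus its remaining exceptions
        # is cheaper than collapsing it to a leaf and sending the leaf's
        # exceptions instead; otherwise the node gets pruned.
        cost_leaf = exception_cost(errors_as_leaf, n_examples)
        cost_subtree = subtree_bits + exception_cost(errors_as_subtree, n_examples)
        return cost_subtree < cost_leaf
    ```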


  • I’m on Arch (actually a converted Antergos) and I have an NVIDIA card as well. My first attempt a few months ago was horrible, bricking my system and requiring a bootable USB and a whole evening to get Linux working again.

    My second attempt was recently and went a lot better. X11 no longer seems to work, so I’m kinda stuck with Wayland, but it feels snappy as long as my second monitor is disconnected. I’ve yet to try some gaming. My main monitor is a VRR 144Hz panel with garbage-tier HDR. The HDR worked out of the box on KDE Plasma, with the same shitty quality as on Windows, so I immediately turned it off again. When my second monitor is connected I get terrible hitching: every second or so the screen just freezes for hundreds of milliseconds. Something about that monitor (1280x1024, 75Hz, DVI) must not make Wayland happy. No settings seem to change anything; only physically disconnecting the monitor helps.


  • I might misunderstand what you mean by “implementing” an LLM, but unless you have a good understanding of deep learning and math I wouldn’t recommend implementing one from scratch. There’s a lot of complex math involved in these kinds of topics. If you mean implementing an application around an existing LLM, for example writing a chat website that interfaces with ChatGPT or a local LLM, then it’s doable (depending on your current skills).
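    To give an idea of the application route: a minimal sketch of a single chat turn against an OpenAI-compatible HTTP endpoint. The URL, model name, and message contents are placeholder assumptions; many local servers (llama.cpp, Ollama, etc.) expose this same /v1/chat/completions shape:

    ```python
    import requests

    API_URL = "http://localhost:8080/v1/chat/completions"  # assumed local server
    payload = {
        "model": "local-model",  # assumption: whatever model the server hosts
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Hello!"},
        ],
    }
    resp = requests.post(API_URL, json=payload, timeout=60)
    resp.raise_for_status()
    # The reply text lives in the first choice's message content.
    print(resp.json()["choices"][0]["message"]["content"])
    ```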


  • I bought a ThinkPad new in 2014 for my studies, for like 1200 euros. She’s still happily purring today. Around 2019 I made the mistake of accidentally emptying a cup of tea into the ThinkPad and then holding it upside down to get the liquid out. I think I should’ve just let it leak out of the bottom, since the laptop has drain holes for that, but I panicked. This broke the keyboard, but not the rest of the laptop. I got an official new keyboard for like 100 euros, which came with a tool and simple instructions, and since then everything has been working flawlessly.

    So I recommend ThinkPads, although I can’t really say anything about the compatibility of newer models.



  • I do, but I’ve gotten better at it. More often than not I just struggle to get started, so simply forcing myself to start results in actually doing what I wanted to do. Sometimes I’m just exhausted, and I accept that I’m going to “waste” the evening on video games or something. I’d rather have some enjoyment than nothing.

    That being said, I’m still learning to do better. I’m still too judgemental and unrealistic toward myself.


  • My first experience with The Sims was jumping behind a random computer at some kind of event; it was running The Sims 1. Most of the family had just died because the previous person at the PC had let the house burn down. Needless to say, I was a bit confused. I’ve played The Sims quite a bit since then, and I honestly like messing around with it.

    I don’t think I’ve ever played a game without cheating in a lot of money. I don’t like that the Sims I made have to go off to work or school, so usually I just build a big fence around the property to keep them all there. From there it used to devolve into chaos when I was younger: building huge mazes to access basic necessities, launching fireworks indoors, etc. Nowadays I’m a bit more well-behaved though.

    Imo The Sims 4 is the best nowadays; the older ones are showing their age. That being said, The Sims 4 is definitely in need of some competition. It’s inexcusably buggy at times, and I personally think there’s a lot more that can be done with a game like this. Hopefully the upcoming competitors can spark some life into this genre.



  • I’m the same. I hate dark rooms; they make me sleepy and downbeat. I prefer natural light, but any light is better than sitting in a dark room. I also prefer to sleep in a room that’s not totally dark and have no issues sleeping in a moderately lit room. I don’t like pitch-black darkness; it makes me feel uneasy.



  • Damn, it’s so weird hearing that kind of rebellious teen mindset in a voice and culture from so long ago. She sounds so mature, so aware of what’s going on. I remember thinking how hypocritical adults were, wanting to break free of all those stupid rules yet also wanting guidance. But I never put it this eloquently. And I suspect her rebellion wasn’t fully positively received, despite how nicely it was actually put.


  • I’m not a hundred percent sure, but afaik it has to do with how random the output of the GPT model will be. At a temperature of 0 it will always pick the most probable next continuation of a piece of text according to its own prediction. The higher the temperature, the more chance there is for less probable outputs to be picked. So it’s most likely to pick 42, but as the temperature increases you see the chance of (according to the model) less likely numbers increase.

    This is how temperature works in the softmax function, which is often used in deep learning.
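    A minimal sketch of temperature applied to a softmax over next-token logits (the logit values below are made up for illustration):

    ```python
    import numpy as np

    def softmax_with_temperature(logits: np.ndarray, temperature: float) -> np.ndarray:
        # Dividing the logits by the temperature flattens the distribution
        # for temperature > 1 and sharpens it as temperature approaches 0.
        scaled = logits / temperature
        exp = np.exp(scaled - scaled.max())  # subtract max for numerical stability
        return exp / exp.sum()

    logits = np.array([4.0, 2.0, 1.0])  # e.g. scores for tokens "42", "7", "100"
    for t in (0.1, 1.0, 2.0):
        print(t, softmax_with_temperature(logits, t).round(3))
    ```

    At a low temperature almost all of the probability mass lands on “42”; at a higher one the less likely numbers start showing up too.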