Is it to teach AI more properly?
Spending a short lifetime on development and boring things.
I love ligatures, and Fira Code (Retina) is the best and the most comfortable for me.
Seems to be my mistake. Your question is about CI/CD services that support Pijul. So yes, almost zero. But it’s an ouroboros: just use Pijul more than Git and talk about it, and services will support it soon.
CI/CD
Pijul, like git or hg or any other, is a VCS, so what are you talking about? If you mean a web service like GitHub, with social features and CI/CD, then yes, the Nest has CI/CD with Nix. But you mostly shouldn’t host your huge project on the Nest, because, as I’m absolutely sure, you, like everyone else, should create your own host (public or private) to support decentralization and prevent a GitHub-like centralization situation. Pijul was created with decentralization in mind in the first place.
Not tested with big projects in production
Not publicly. Many private projects, personal and in-company, that use Pijul do exist. Personally, I have one HUGE personal one. I’ve also worked for two companies where Pijul is used.
Ah! 😣 Why not the Nest or self-hosted Pijul!?
Kind of destiny 🤦🏻‍♂️
I started programming when I was in primary school. I liked it very much, even though I didn’t understand much back then. But it was impossible to stop, and here I am writing this about 30+ years later.
What exactly attracted me: my father soldered together a Russian clone of the Sinclair ZX Spectrum and showed me a couple of games loaded from a tape cassette, and I was curious how it worked.
More than ten years ago, I was a team lead at a game development company, and during yet another crunch I was badly overloaded with tasks, including code review. And of course, I missed some things. We were optimizing a game that, in general, couldn’t even hold 30 fps on the target machine with minimum requirements, while the target was at least 40 fps. Over a month, we jointly optimized everything and managed to reach an almost stable 40 frames, but it wasn’t enough: there were periodic drops down to 15-20 fps. Everyone panicked, there were no ideas left, we had optimized everything.
No, I understand that a priori there is no limit to optimization, but I mean adequate optimizations, without rewriting everything in assembler specifically for every supported architecture.
So, one night I looked into the event loop initialization and found that the initialization happened twice. Twice. Two parallel event loops. That is, two parallel cycles of logic and state updates. That night, by deleting one line, I improved the game’s performance by more than 100%. 🤦🏻‍♂️
I investigated and found out that this line of secondary initialization had been left by a junior “for debugging” and forgotten in the crunch. And I missed it in review.
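To illustrate the pattern (a minimal hypothetical sketch in Python, nothing like the actual engine code; `start_game_loop` and the tick rate are invented): the forgotten second call spawns a second, parallel update loop, so every tick the game does all of its logic twice.

```python
import threading
import time

def start_game_loop(state: dict) -> None:
    """Spawn a loop that runs the logic/state update every tick."""
    def loop():
        while state["running"]:
            state["updates"] += 1  # stand-in for the real per-tick work
            time.sleep(1 / 60)     # nominal 60 Hz tick
    threading.Thread(target=loop, daemon=True).start()

state = {"running": True, "updates": 0}
start_game_loop(state)
start_game_loop(state)  # the forgotten "for debugging" line: a second,
                        # parallel loop that doubles all update work

time.sleep(1.0)
state["running"] = False
print(state["updates"])  # roughly twice the tick count of a single loop
```

Deleting the duplicated call halves the per-frame work, which is exactly where a “more than 100%” win comes from once the contention disappears too.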
I’ve kept this story a secret all these years. And even now I’m not revealing names; otherwise, it could have a dramatic impact on the careers of many.
I’m sure we need a bot for Lemmy or the entire fediverse that would search for posts like this 👆 and reply with a “normal” link. That would be great. I saw something like that on Masto for YouTube.
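The link-cleaning core of such a bot could be as simple as this sketch (assuming “normal” means a link with tracking parameters stripped; the parameter blocklist and the function names here are my own guesses, not any existing bot’s code):

```python
import re
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

URL_RE = re.compile(r"https?://\S+")
TRACKING_KEYS = {"si", "fbclid", "gclid"}  # assumed blocklist

def normalize(url: str) -> str:
    """Drop common tracking parameters, keep everything else."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if not k.startswith("utm_") and k not in TRACKING_KEYS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

def reply_for(post_text: str) -> str | None:
    """Return the bot's comment text if the post contains a fixable link."""
    for url in URL_RE.findall(post_text):
        clean = normalize(url)
        if clean != url:
            return f"Normal link: {clean}"
    return None

print(reply_for("look 👆 https://example.com/watch?v=abc&utm_source=share"))
# -> Normal link: https://example.com/watch?v=abc
```

The rest is plumbing: subscribe to the instance’s post stream and post the returned text as a comment.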
I’d suggest trying benchmark games like this:
Testing an interface isn’t about the interface as a language feature. It’s about testing the outer public interface (the public API) of your code, looking at it as a black box; what happens inside doesn’t matter. That’s all it means.
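A minimal sketch of what I mean, in Python (`slugify` is an invented example, not from any real project): the tests talk only to the public API and assert on inputs and outputs, so they’d survive a complete rewrite of the internals.

```python
import unittest

def slugify(title: str) -> str:
    """The public API under test: turn a title into a URL slug."""
    return "-".join(title.lower().split())

class SlugifyPublicApiTest(unittest.TestCase):
    # Black-box tests: only inputs and outputs, never internals.
    def test_lowercases_and_joins_words(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_collapses_whitespace(self):
        self.assertEqual(slugify("  a   b "), "a-b")

if __name__ == "__main__":
    unittest.main()
```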
🫣🤨