• 0 Posts
  • 68 Comments
Joined 9 months ago
Cake day: March 3rd, 2024

  • chrash0@lemmy.world to Technology@lemmy.world · *Permanently Deleted* · 17 days ago

    you have to do a lot of squinting to accept this take.

    so his wins were copying competitors, and even those products didn’t see success until they were completely revolutionized (Bing in 2024 is a Ballmer success? .NET becoming widespread is his doing?). one thing Nadella did was embrace the competitive landscape and open source, with key moves like acquiring GitHub and open sourcing .NET, and i honestly don’t have the time to fully rebut this hot take. but i don’t think the Ballmer haters are totally off base here. even if some of the products started under Ballmer are now successful, it feels disingenuous to attribute their success to him. it’s like an alcoholic dad taking credit for his kid becoming an actor. Microsoft is successful despite him.


  • these days Hyprland, but previously i3.

    i basically live in the terminal unless i’m playing games or in the browser. these days i use most apps full screen and switch between desktops, and i launch apps using wofi/rofi. this has all become very specialized over the past decade, and it almost has a “security by obscurity” effect where it’s not obvious how to do anything on my machines unless you have my muscle memory.

    not that i necessarily recommend this approach generally, but i find value in mostly using a keyboard to control my machines and minimizing visual clutter. i don’t even have desktop icons or a wallpaper.




  • All programs were developed in Python language (3.7.6). In addition, freely available Python libraries of NumPy (1.18.1) and Pandas (1.0.1) were used to manipulate data, cv2 (4.4.0) and matplotlib (3.1.3) were used to visualize, and scikit-learn (0.24.2) was used to implement RF. SqueezeNet and Grad-CAM were realized using the neural network library PyTorch (1.7.0). The DL network was trained and tested using a DL server mounted with an NVIDIA GeForce RTX 3090 GPU, 24 Intel Xeon CPUs, and 24 GB main memory

    it’s interesting that they’re using pretty modest hardware (i assume they mean 24 cores, not CPUs) and fairly outdated dependencies. also, having their dependencies listed out like this is pretty adorable. it has academic-out-of-touch-not-a-software-dev vibes. makes you wonder how much further a project like this could go with decent technical support. like, all these talented engineers are using 10k times the power to work on generalist models like GPT that struggle at these kinds of tasks, while promising they’ll work someday and trivializing them as “downstream tasks”. i think there’s definitely still room in machine learning for expert models; it sucks that they struggle to get proper support.
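    to be concrete about what i mean by an “expert model”, here’s a rough sketch (mine, not from the paper) of fine-tuning a small pretrained CNN like torchvision’s SqueezeNet for a narrow classification task. the class count and data are made up for illustration, and it assumes a recent torchvision rather than the versions they pinned:

    ```python
    import torch
    import torch.nn as nn
    from torchvision import models

    NUM_CLASSES = 3  # hypothetical task-specific class count

    # ImageNet-pretrained SqueezeNet (assumes a torchvision new enough for the weights API)
    model = models.squeezenet1_1(weights=models.SqueezeNet1_1_Weights.DEFAULT)

    # SqueezeNet's classifier head ends in a 1x1 conv; swap it out for our class count
    model.classifier[1] = nn.Conv2d(512, NUM_CLASSES, kernel_size=1)

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    criterion = nn.CrossEntropyLoss()

    def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
        """one optimization step on a batch of (N, 3, 224, 224) images."""
        model.train()
        optimizer.zero_grad()
        logits = model(images)            # (N, NUM_CLASSES)
        loss = criterion(logits, labels)
        loss.backward()
        optimizer.step()
        return loss.item()

    if __name__ == "__main__":
        # smoke test with random tensors just to show the shapes line up
        x = torch.randn(4, 3, 224, 224)
        y = torch.randint(0, NUM_CLASSES, (4,))
        print(train_step(x, y))
    ```

    the whole thing trains comfortably on one consumer GPU, which is kind of the point.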




  • i feel like if you’re not sat stationary at a workstation (who is these days?), what you want is a laptop that’s good at being a laptop. 99% of the software developers i work with (not a small number) use MacBook Pros. they are well built, have good components, have best-in-class battery life (we’ll see how things shake out with Qualcomm), and are BSD based and therefore Unix compatible. my servers and gaming/CUDA PC? Linux all day. my laptop? MacBook. i’m not ideological enough to have range anxiety every time i step away from my desk. plus, any decent-sized org is going to have to administer these machines for everyone from scientists to administrators, and catering to 0.4% of your users is not a good ROI if your software vendors struggled for 8 years to get their Windows 98-based specialty sensor software to run on Mac.

    that 0.4% is likely not 0 because they are nerds.

    seriously tho, if Qualcomm chips can make a Linux book that lasts all day, i would happily make the switch.





  • as you might have guessed, i haven’t really tried it, but i have been reading about it. that said, i have used “drop in replacement” tools like this (we use pnpm at work), and a drop-in replacement is not without quirks. they wouldn’t have made a different tool altogether if it were really a 1:1 replacement. just because the commands are the same doesn’t mean it behaves the same. for instance, i doubt one person on the team could be using uv while everyone else sticks to pip.


  • definitely not the real reason for a project like this to exist. Python package management can be nightmarish at times depending on what you’re doing. between barebones requirements.txt, Poetry, and the different condas, there’s a ton of fragmentation, and none of them do everything you’d want in an ideal way. above and beyond speed, i think uv is another attempt at solving that. but it could just be another classic xkcd moment where now there’s just one more standard to deal with.



  • i’ve used Chezmoi for years now pretty successfully. works on my Mac and Linux machines. it probably could be made to work on Windows. i am transitioning to NixOS, but i’ll probably keep using it anyway, since i still have Macs for work (and because they’re great laptops, don’t @ me). the only real downside is that it only works for the home folder, so i have to manually manage stuff in /etc, but i generally prefer user configuration for most tools anyway.

    i had messed around with Ansible for this in the past, but i didn’t really like it for this use case. it’s been a while tho so it’s hard to say why.

    not to pile on, but you might also look at GNU Stow. i decided against it, but it’s there.

    obligatory i s’pose: https://github.com/covercash2/dotfiles




  • i guess i liked this theory when i was in college eating mushrooms on the regular, but isn’t it kind of weird? like, is a dog not conscious? or did dogs suddenly become conscious from mushrooms too? to me it feels like tool use that enables written language is by far the biggest differentiator between humans and “lower” species. i mean, dolphins may be as smart as humans, but they have no fuckin clue what their great-great-grandmother’s name was and have little hope of solving differential equations by trying to draw in the sand with their flippers.

    maybe this is just my belief system, but i don’t think eating a mushroom gave anyone a “soul”. i know the feeling of coming down and feeling like you’ve left the cave and everyone else is just looking at shadows on the wall, but those people are conscious of the shadows at least.


  • it’s super weird that people think LLMs are so fundamentally different from neural networks, the underlying technology. neural network architectures are constantly improving, and LLMs are just the product of a ton of research that emerged after the discovery of the transformer architecture. what LLMs have shown us is that we’re definitely on the right track using neural networks to solve a wide range of problems classified as “AI”.
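    for anyone curious what the “transformer” part actually is under the hood: the core op is ordinary neural network math, a few linear layers plus a softmax. here’s a bare-bones single-head self-attention sketch (illustrative only, not any particular model’s code):

    ```python
    import torch
    import torch.nn as nn

    class SelfAttention(nn.Module):
        """bare-bones single-head self-attention, the core op in a transformer block."""

        def __init__(self, d_model: int):
            super().__init__()
            self.d_model = d_model
            # learned projections for queries, keys, and values
            self.q = nn.Linear(d_model, d_model)
            self.k = nn.Linear(d_model, d_model)
            self.v = nn.Linear(d_model, d_model)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, seq_len, d_model)
            q, k, v = self.q(x), self.k(x), self.v(x)
            # scaled dot-product attention weights: (batch, seq_len, seq_len)
            scores = q @ k.transpose(-2, -1) / self.d_model ** 0.5
            weights = torch.softmax(scores, dim=-1)
            # each token's output is a weighted mix of every token's value vector
            return weights @ v

    if __name__ == "__main__":
        attn = SelfAttention(d_model=64)
        tokens = torch.randn(2, 10, 64)   # batch of 2 sequences, 10 tokens each
        print(attn(tokens).shape)         # torch.Size([2, 10, 64])
    ```

    real LLMs stack dozens of these blocks with multiple heads, residual connections, layer norm, and MLPs, but it’s the same machinery as any other neural network.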