it is fucking priceless that an innovation that contained such simplicities as “don’t use 32-bit weights when tokenizing petabytes of data” and “compress your hash tables” sent the stock exchange into ‘the west has fallen’ mode. I don’t intend to take away from that, it’s so fucking funny peltier-laugh

This is not the rights issue, this is not the labor issue, this is not the merits issue, this is not even the philosophical issue. This is the cognitive issue. When not exercised, parts of your brain will atrophy. You will start to outsource your thinking to the black box. You are not built different. It is the expected effect.

I am not saying this is happening on this forum, or even that there are tendencies close to this here, but I preemptively want to make sure it gets across because it fucked me up for a good bit. Through late 2023 and early 2024 I found myself leaning into both AI images for character conceptualization and AI coding for my general workflow. I do not recommend this in the slightest.

For the former, I found in retrospect that the AI image generation reified elements into the characters that I did not intend and later regretted. For the latter, it essentially kneecapped my ability to produce code for myself until I began to wean off of it. I am a college student. I was in multiple classes where I was supposed to be actively learning these things. Deferring to AI essentially nullified that while also regressing my abilities. If you don’t keep yourself sharp, you will go dull.

If you don’t mind that, or don’t feel it is personally worth it to learn these skills beyond the very, very basics and shallows, go ahead; that’s a different conversation and this one does not apply to you. I just want to warn those who did not develop their position on AI beyond “the most annoying people in the world are in charge of it and/or pushing it” (a position that, when deployed by otherwise-knowledgeable communists, is correct 95% of the time) that this is something you will have to be cognizant of. The brain responds to the unknowable cube by deferring to it. Stay vigilant.

      • NaevaTheRat [she/her]@vegantheoryclub.org · 3 days ago
        I’ve been wondering about dicking with this. Assume I’m somewhat lazy, approximately technically competent but a slow worker, and prone to dropping projects that take more than a week.

        Would I be able to make something for the kitchen which can like:

        • Set multiple labelled timers, and announce milestones (e.g. ‘time one hour for potatoes, notify at half an hour’, ‘time 10 minutes for grilling’)
        • Be fed a recipe and recite steps navigating forward and backwards when prompted e.g. ‘next’ ‘last’ ‘next 3’ etc
        • Automatically convert from barbarian units
        • email me notes I make at the end of the day (like accumulate them all like a mailing list)

        Or is this still a pipe dream?

        • KnilAdlez [none/use name]@hexbear.net · 3 days ago

          The answer is that it will take work, but not as much as you may think. And some of the especially niche things may not be possible; I’ll address them one by one.

          Multiple labelled timers

          Yes!

          Announce milestones

          I don’t think so, but you can just set two timers. Timers are new, so that may be a feature in the future.

          Be fed a recipe

          I have never tried this, but it does integrate with Grocy, so maybe.

          Automatically convert units

          Officially, probably not, but I have done this before and it has worked just by asking the LLM assistant.

          Email me notes at the end of the day.

          I have never done this, and it would take some scripting, but I am willing to bet that it can be done. Someone might have a script for it in the forums.
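          For what it’s worth, my guess is it would look roughly like the sketch below: an SMTP notifier plus a time-triggered automation that mails out whatever you’ve dictated into a text helper. I have not tested this, and the notifier name, the input_text.kitchen_notes helper, and the addresses are all placeholders (one catch: an input_text helper tops out at 255 characters, so long notes would need a different store).

          # configuration.yaml — untested sketch; names and addresses are placeholders
          notify:
            - platform: smtp
              name: daily_notes
              sender: "homeassistant@example.com"
              recipient: "you@example.com"
              server: "smtp.example.com"
              port: 587
              username: "homeassistant@example.com"
              password: !secret smtp_password

          automation:
            - alias: "Email kitchen notes at end of day"
              trigger:
                - platform: time
                  at: "21:00:00"
              condition:
                # skip the email if the helper is empty
                - condition: template
                  value_template: "{{ states('input_text.kitchen_notes') | length > 0 }}"
              action:
                # send the accumulated notes, then clear the helper for tomorrow
                - service: notify.daily_notes
                  data:
                    title: "Kitchen notes"
                    message: "{{ states('input_text.kitchen_notes') }}"
                - service: input_text.set_value
                  target:
                    entity_id: input_text.kitchen_notes
                  data:
                    value: ""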

          Ultimately, Home Assistant is not an all-in-one solution. It is a unified front end to connect smart home devices and control them. Everything else requires integrations and add-ons, of which there are many. There are lots of tools for automation that don’t require scripting at all, and if you’re willing to code a little it becomes exponentially more powerful. I love it, it helps with my disability, and I build my own devices to connect to it. Give it a shot if you have some spare hardware. To do what you want, you will need a computer, a GPU (at least an RTX 3060), and a speakerphone of some kind.

          • NaevaTheRat [she/her]@vegantheoryclub.org · 3 days ago

            I can code at a basic level. Like I can make and ship a webapp with some tears.

            Running a server with a GPU is probably a bit high cost, between the build and the electricity (~50c a kWh).

            I wasn’t aware running voice recognition cost so much in ?vector calculations? I remember trying some Google thing at some point and finding it utterly hilarious that they thought it was ready for home use, since you couldn’t script cues yourself and they seemed to have a very 20-something American man view of what working in the kitchen entails: no real capacity for making multiple things at once, tweaking recipes and saving them, etc.

            • KnilAdlez [none/use name]@hexbear.net · 3 days ago

              The GPU is for the LLM; voice recognition is easy for any modern PC. You don’t need to use an LLM, but it does give you some flexibility in the commands you can give it. Without an LLM, Home Assistant can only use sentence matching. Sentence matching in HA is pretty good, don’t get me wrong, but LLMs are a level above.

              The Home Assistant scripting language is just YAML, with really easy syntax.
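              Just to give a sense of scale, a custom voice command that starts a labelled timer and announces when it’s done is about this much YAML. This is a sketch from memory, not copied out of a working config: the timer.potatoes helper is assumed to exist, and notify.notify stands in for whatever TTS or speaker service you end up using.

              # configuration.yaml-style sketch — assumes a timer helper named timer.potatoes
              automation:
                - alias: "Start the potato timer by voice"
                  trigger:
                    # sentence matching: these phrases trigger the automation via Assist
                    - platform: conversation
                      command:
                        - "start the potato timer"
                        - "time one hour for potatoes"
                  action:
                    - service: timer.start
                      target:
                        entity_id: timer.potatoes
                      data:
                        duration: "01:00:00"

                - alias: "Announce when the potatoes are done"
                  trigger:
                    - platform: event
                      event_type: timer.finished
                      event_data:
                        entity_id: timer.potatoes
                  action:
                    # placeholder — swap in a TTS/media_player service if you have a speaker
                    - service: notify.notify
                      data:
                        message: "Potato timer is done."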

              • Lyudmila [she/her, comrade/them]@hexbear.net · 2 days ago

                Is there any way to use something like a Coral processor instead of a whole GPU to run that LLM?

                If so, I think that would make it very nearly energy- and cost-effective enough to run at home. If not, sentence matching sounds best for now.

                • KnilAdlez [none/use name]@hexbear.net · 2 days ago

                  Unfortunately no; LLMs are too big for a device with no onboard memory. That being said, you can try a very small LLM on CPU and see how you like it. That is what I am doing currently. You can also use a hybrid option where it tries sentence matching first, then falls back on the slower LLM. I am going to write a beginner’s guide about Home Assistant tomorrow and go into all of this, but it’s pretty easy to get up and running on your own.

                • KnilAdlez [none/use name]@hexbear.net · 3 days ago

                  No problem! I love Home Assistant and private IoT, so I’m always happy to talk about it. I’ll probably write a beginner’s guide in a day or two since people seem interested.

                  • NaevaTheRat [she/her]@vegantheoryclub.org · 3 days ago

                    Oh that would be fantastic, please contact me with a link if you do.

                    I looked into it for some ESP32 stuff (weather station + LED cube link maybe) and didn’t really grok most of its capabilities or the ecosystem.