• 1 Post
  • 49 Comments
Joined 11 months ago
Cake day: July 3rd, 2023






  • The Disney thing is what did him in. It was such a stupid fight to pick. Even if he had, in any form, won that fight, he still would’ve lost, because he would’ve harmed his state’s largest employer, and therefore Florida’s economy (and a major donor to his campaign).

    It was a thoroughly bad political move, and it undercuts his “Trump, but competent” image. Donald Trump, or more importantly a competent version of Donald Trump, would’ve moved the goalposts to somehow declare victory and given up on that fight.

    He only built the “competent” image because his party controls the state legislature. It’s easy to pass stuff when you control the whole government. It’s not an impressive political accomplishment.


  • Aidan@lemm.ee to Programming@programming.dev · The Fall of Stack Overflow

    You can have it generate shitty code and then compare it against examples it finds online to iterate on that code. Also, it was trained on the whole internet, including those good solutions, and can often reproduce them on its own. But you have to tell it, explicitly, to do all of this to get better code, rather than just asking for the code.
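    The "generate, then explicitly ask it to improve" loop described above can be sketched as a small driver. This is a hypothetical illustration, not OpenAI's API: `ask_model` is a stub standing in for a real chat-completion call, so the control flow runs on its own.

```python
# Hypothetical sketch of the generate-then-iterate loop. ask_model is a
# stand-in for a real LLM call (e.g. a chat-completion request); it is
# stubbed here so the example is self-contained and runnable.

def ask_model(prompt: str) -> str:
    """Stub for a real model call; returns canned drafts for illustration."""
    if "Improve" in prompt:
        return "def add(a, b):\n    return a + b  # refined draft"
    return "def add(a,b): return a+b  # first draft"

def iterate_code(task: str, rounds: int = 2) -> str:
    """Ask for code, then explicitly ask the model to critique and refine it."""
    code = ask_model(f"Write Python code to {task}.")
    for _ in range(rounds):
        # The key step the comment describes: you must explicitly instruct
        # the model to compare its draft against good examples and fix it,
        # rather than accepting the first answer.
        code = ask_model(
            "Improve this code. Compare it against idiomatic examples "
            f"and fix any problems:\n{code}"
        )
    return code

print(iterate_code("add two numbers"))
```

    The point is that the refinement only happens because the driver asks for it on every round; nothing in the first request triggers it.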





  • I don’t agree that ChatGPT has gotten dumber, but I do think I’ve noticed small differences in how it’s engineered.

    I’ve experimented with writing apps that use the OpenAI api to use the GPT model, and this is the biggest non-obvious problem you have to deal with that can cause it to seem significantly smarter or dumber.

    The version of GPT-3.5 and GPT-4 used in ChatGPT can only “remember” 4096 tokens at once. That limit covers everything: its output, the user’s input, and the “system messages” the software sends to give GPT the context it needs. The standard one is “You are ChatGPT, a large language model developed by OpenAI. Knowledge Cutoff: 2021-09. Current date: YYYY-MM-DD.” The iOS app sends an even longer one. If you enable the new Custom Instructions feature, those count against the token limit too.

    It needs token space to remember your conversation, or else it gets a goldfish-memory problem. But if you program it to spend too much of the token budget remembering what you said earlier, it has fewer tokens left for each new response, so replies have to be shorter and less detailed, and less of the budget goes toward keeping them logically consistent.

    The model itself is definitely getting smarter as time goes on, but I think we’ve seen them experiment with different ways of engineering around the token limits when employing GPT in ChatGPT. That’s the difference people are noticing.
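    The budgeting trade-off described above can be sketched in a few lines. This is a rough illustration under stated assumptions: the ~4-characters-per-token estimate is a crude heuristic (a real client would use an actual tokenizer), and all names here are made up for the example, not part of any OpenAI API.

```python
# Sketch of how a client app must budget a fixed context window.
# Assumption: ~4 characters per token, a crude heuristic; real apps
# use a proper tokenizer. All names below are illustrative.

CONTEXT_LIMIT = 4096      # total tokens shared by system msg, history, and reply
RESPONSE_RESERVE = 1024   # tokens held back for the model's next answer

def estimate_tokens(text: str) -> int:
    # Crude heuristic: roughly 4 characters per English token.
    return max(1, len(text) // 4)

def trim_history(system_msg: str, history: list[str]) -> list[str]:
    """Drop the oldest messages until the prompt fits the remaining budget."""
    budget = CONTEXT_LIMIT - RESPONSE_RESERVE - estimate_tokens(system_msg)
    kept: list[str] = []
    used = 0
    # Walk backwards so the most recent messages survive.
    for msg in reversed(history):
        cost = estimate_tokens(msg)
        if used + cost > budget:
            break  # older messages are forgotten: the "goldfish" effect
        kept.append(msg)
        used += cost
    return list(reversed(kept))
```

    Raising `RESPONSE_RESERVE` gives longer, more detailed replies but forces more history to be dropped; lowering it keeps more of the conversation but cramps each answer. That is exactly the engineering dial the comment suggests OpenAI has been experimenting with.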




  • Someone close to me is a HS teacher. During covid, the schools changed their policy from “no phones in class ever” to “you can have your phone in class but you’d better only use it to help with classwork or in an emergency.”

    They’ve been trying to reverse the policy back to how it was, but it’s hard to get all the kids to believe that they can’t do this anymore. They don’t take the threat of punishment seriously because everyone is doing it now.

    Even if you manage to deal with the phone issue, the school now gives kids Chromebooks to do their work on. The student wifi network seemingly has no restrictions, since teachers sometimes need students to watch something on YouTube or Netflix.

    So kids, during class, watch Netflix on their Chromebook instead of paying attention.