• 0 Posts
  • 31 Comments
Joined 1 year ago
Cake day: July 1st, 2023



  • ThoughtGoblin@lemm.ee to 196@lemmy.blahaj.zone · number rule · 1 point · edited 1 year ago

    I’m not sure what you’re getting at. Dark matter has been confirmed by numerous independent observations, it’s a predictive model, and it’s the only explanation that has held up to scrutiny. It’s very clearly the right explanation; we know how dark matter generally behaves, we just don’t know specifically what it is.

    See, for example, the behavior of the Bullet Cluster merger.




  • How is Xorg a “direct competitor” to Microsoft? Especially when Microsoft’s trademark on X is in the gaming market, where they own the Xbox and Xorg doesn’t participate at all?

    Trademarks protect consumers by preventing fraud and misleading naming. It makes perfect sense that Microsoft owns X in that market space, given the enormous prevalence of the Xbox. Their first console was literally X-shaped, and it would be bad for consumers if anyone could release an “X-station” or “X-cube” or some such.



  • Every time Firefox updates I have to restart the entire browser or it won’t let me open a new tab. This has been going on for years. As a dev, I haven’t been able to dynamically edit source during runtime ever since the Quantum update. It’s noticeably slower these days, which is especially bad on mobile/laptops due to battery life. And if you’re on Windows, you don’t get video super sampling (NVIDIA) or HDR videos.

    I wouldn’t call it a buggy mess that crashes frequently, but it does constantly get on my nerves.


  • ThoughtGoblin@lemm.ee to Technology@lemmy.world · *Permanently Deleted* · 18 points · edited 1 year ago

    It’s mid-way through 2023, so 3.5 years, right? That seems a little generous, but reasonable. Products for the next year are likely already designed and finished. Then it’ll take time for companies to redesign their devices now that they have to totally change how their chassis are designed and how they achieve their ingress-protection (IP) ratings, to source the new part, etc.








  • Not really, though it’s hard to know exactly what is or is not encoded in the network. It likely retains the more salient and highly referenced content, since those things would come up in its training set more often. But memorizing entire works is basically impossible, just because of the sheer ratio between the size of the training data and the size of the resulting model. Not to mention that GPT’s mode of operation mostly discourages long-form rote memorization: it’s a statistical model, after all, which works against storing exact, “objective” state.
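
    To put rough numbers on that ratio, here’s a quick back-of-envelope sketch. The figures are commonly cited approximations for a GPT-3-scale model (175B parameters, roughly 300B training tokens), not exact values, so treat them as assumptions:

        # Back-of-envelope: could a GPT-3-scale model store its training text verbatim?
        # All figures are rough, commonly cited approximations (assumptions, not exact).
        PARAMS = 175e9           # parameters (GPT-3 scale)
        BYTES_PER_PARAM = 2      # fp16 weights
        TRAIN_TOKENS = 300e9     # approximate number of training tokens
        BYTES_PER_TOKEN = 4      # rough average for English text

        model_bytes = PARAMS * BYTES_PER_PARAM
        text_bytes = TRAIN_TOKENS * BYTES_PER_TOKEN

        print(f"model weights: ~{model_bytes / 1e9:.0f} GB")
        print(f"training text: ~{text_bytes / 1e12:.1f} TB")
        print(f"weights/text ratio: {model_bytes / text_bytes:.2f}")
        # Even if every bit of the weights were used as raw storage, they would cover
        # well under half of the training text, and the weights are busy encoding
        # statistics, not an archive.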

    Furthermore, GPT isn’t coherent enough for long-form content. With its small context window, it just has trouble keeping track of big things like books. And since it doesn’t have access to any “senses” beyond text broken into tokens, concepts like pages or “how many” give it issues.
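
    As a toy illustration of what a fixed context window means for a book-length input (the window size and word counts below are arbitrary stand-ins; real models count subword tokens, not words):

        # Toy illustration of a fixed context window, using words as a stand-in for tokens.
        # The window size and book length are arbitrary assumptions for illustration.
        CONTEXT_WINDOW = 4096

        def visible_context(document: str, window: int = CONTEXT_WINDOW) -> list[str]:
            """Return only the trailing `window` words the model could attend to."""
            words = document.split()
            return words[-window:]

        book = "word " * 120_000   # a short novel is on the order of 100k words
        context = visible_context(book)
        print(f"book: 120000 words, visible to the model: {len(context)} words")
        # Everything before the window simply isn't available at generation time,
        # which is why the model can't hold a whole book "in mind" at once.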

    None of the leaked prompts really mention “don’t reveal copyrighted information” either, so it seems the creators aren’t really concerned, which you’d think they would be if it did have this tendency. It’s more likely to make up entire pieces of content from the summaries it does remember.