• @TheAnonymouseJokerM
    11 months ago

    With tools like GPT it’s now much easier for anyone to put out their message and to make it look polished.

    The problem is there is no end to this kind of escalation. It may sound too moralistic, but it is the truth. Instead of trying to de-escalate the consequences of the mastered propaganda matrix that the USA throws at the world, we are now in a position of battling it out.

    As for artistry and photography, you are missing the issue here. OpenAI (GPT-3, GPT-4), Google (Bard), and Microsoft (Clearview image database + current OpenAI partnership) have abused the lack of regulation and enjoyed protection under the USA government. This has allowed them to scrape almost all of the internet we interact with in general, down to Reddit. Artists not being paid anything for their scraped art, and in turn facing an optimised database machine that can churn out art in seconds from prompts, is not the same as photographers and artists who mutually gave each other respect for their craft and did not interfere in each other’s space like this. There is a huge distinction.

    I am very conflicted over what this colossal data scraping and manipulation of this data will lead to. I can say that it is not as progressive for technology in terms of ethics in the long term. And the existing barriers regarding creativity have a purpose – everyone is not supposed to do everything. A society of jacks of all trades will never progress like one with a few quality experts or craft masters. It only elevates the expertise baseline of society for a temporary phase, ultimately leading to stagnation, unless the issue of craft expertise is solved. Humans have a fundamental problem of ego, and instead of no knowledge, now people will argue with half knowledge. Look at social media in the last decade, with just as many liberals as conservatives armchairing it out on the internet or in podcasts.

    /rant

    • ☆ Yσɠƚԋσʂ ☆OP
      11 months ago

      I think there is a plateau to how far things can escalate in practice. The limitation is ultimately how much information humans can process throughout the day. It’s also worth noting that we’re already drowning in propaganda and junk information right now. I’m not sure that additional saturation of the information space will fundamentally change things.

      And I agree with the unethical side of how models get trained, especially in the case of proprietary models. I would argue this is more of a capitalism problem than a technology problem though.

      I imagine that in the end people are still going to specialize, and they will leverage these tools to automate a lot of tedious work in their professions. It’s never been the case that new sorts of technology and automation led to stagnation. The opposite is generally the case where there is a huge explosion in creativity and invention.

      I do think that societies like China will make much better use of this tech than the west will though because there is a central direction over how the tech is used and what it’s directed towards.