To be honest, this development is simply the logical consequence of applying the principle of profit maximization.

Unfortunately, technology has now reached such a scale that the people behind the spreadsheets are willing to sacrifice humanity itself.

That, too, is not surprising, because they will only realize their mistake when it is already too late.

Another example of the same principle is climate change.

  • Natal@lemmy.world · 4 days ago

    I’m currently reading and gathering studies about language and cognition to see whether LLMs could do anything to our culture. So far I’d lean toward saying that, yes, they could curate our words and thus alter the way we speak, write and think. Words are somehow linked to our perception of reality, and given enough time our overlords could curate the words they like, shadowban others, and in that way shape what we can communicate and think about.

    Obviously there is a large group of people rejecting AI so they wouldn’t suffer from this.

  • daniskarma@lemmy.dbzer0.com · 4 days ago

    People can still do culture, and art, and whatever they want.

    Go draw a picture, write a book, compose a song. It’s not like ChatGPT is standing in your doorway to stop you.

    • DandomRude@lemmy.world (OP) · 4 days ago

      That is true, of course, but LLMs, image and video generation, and so on will, in my view, lead to fewer and fewer people being willing to publish their creative works: the staggering output of AI models makes it increasingly unlikely not only that they will receive compensation for their work, but also that they will receive recognition for it.

      As a result, I think there will likely be fewer and fewer people willing to accept that their work is being used for free to train precisely those models from which only the people who steal their work benefit; without this theft, the business model of OpenAI and the like simply cannot function.

      In my view, this will sooner or later lead to a vicious cycle in which the models are trained predominantly only with content they have generated themselves. This will then lead to a stagnation of what we understand as culture - for these models are neither creative nor intelligent: they can merely combine existing content to create something that appears new; however, they cannot produce anything truly new. Nevertheless, given its ever-expanding reach, it will likely be this repetitive AI output that has a significant influence on popular culture, at the very least.

      • insomniac_lemon@lemmy.cafe · 4 days ago

        because the staggering output of AI models will not only make it increasingly unlikely that they will receive compensation for their work, but also that they will receive recognition for it

        Don’t forget scraping. GitHub is already there with public repos, and even if you take a chance and go elsewhere (or don’t release source code), some tools are locked in (too small to move, or too large to move), so it has a chilling effect on growth/contribution.

        I’m there already. Albeit I’m just starting out, and the thing I haven’t shared yet is just a simple sweeper clone (in a somewhat niche language + Godot bindings).

  • wraekscadu@vargar.org · 5 days ago

    AI is a new means of production. The fight mustn’t be against this new means of production. Rather, it must be to seize these new means of production.

    “AI bad” is not a coherent argument.

    • DandomRude@lemmy.world (OP) · 5 days ago

      AI is a new way of combining what already exists. Moreover, these models serve as a tool of power for those who are already powerful.

      “AI is good” is not a coherent argument.

  • Rhynoplaz@lemmy.world · 6 days ago

    This exact same thing has been said about every technological advancement since the industrial revolution.

    I’m not saying we shouldn’t have concerns, but all this AI doomer bullshit is just a circle jerk blown out of proportion.

    • DandomRude@lemmy.world (OP) · 6 days ago

      I’m afraid it’s true this time. You do know how LLMs work, don’t you?

      Edit: And you do have at least some idea of how the entertainment industry works, right?

      • Rhynoplaz@lemmy.world · 6 days ago

        So did the people who thought a woman’s uterus would fly out of her body if she traveled on a train at 50 km/h.

        I’m more than willing to admit I was wrong when AI destroys the world, but I think we’ll find many other things to blame it on before that happens.

        • lyralycan@sh.itjust.works · 6 days ago

          Well, in one respect it’s humans - always humans - that cause destruction one way or another. Even if some of us try, like backing a carbon-offset program that plants trees (in the right places, ones that historically support trees), the rich and powerful create some other hyperfixation like fast fashion, space races, greed wars, religious wars, or witch hunts like the one against AI, and cock it up again.

          On the other hand, I as a builder of tech fucking hate what the obsession with AI has done to the current global economy, and how generative AI is a complete waste of existence as its meaningful use is dwarfed by its cost…

          But I do see value in developmental AI, and while it’s little more than a standard algorithmic program with memory, parameters and developer bias, it is useful at doing some work better than us, and much faster. And since the dawn of humanity we’ve been inventing things to make life tasks easier. I do believe that form of AI will persist. In a way, vehemently opposing the AI programs that make calculations or accurate code is about as righteous as refusing to use a hammer to nail together some wood, or making fearmongering pamphlets about the advent of electrified cities.

        • DandomRude@lemmy.world (OP) · 6 days ago (edited)

          I have no idea what you’re getting at, but it’s a fact that every minute - if not every second - more AI-generated content is published than you could read in your entire lifetime.

          That’s the scale we’re talking about.

          Try to imagine what that might mean for people who make a living doing anything that could even remotely be called creative work.

          • Rhynoplaz@lemmy.world · 6 days ago

            Like I said, when the world falls apart, I’ll give you all the credit for being right. Just let me know when it happens.

            • yermaw@sh.itjust.works · 6 days ago

              It’s not the world falling apart, just what we understand to be culture.

              We thought the machines would take over the jobs to give us time enough to create great works of art of all sorts, and then we find out right at the last minute that actually the machines will be taking care of art because we’re too busy working multiple jobs.

              • Rhynoplaz@lemmy.world · 6 days ago

                What we understand to be culture changes every fifty years or so. Have you waltzed to a harpsichord or taken the family to a public execution lately? Now, THAT was culture.

                • yermaw@sh.itjust.works · 5 days ago

                  I haven’t seen a harpsichord or a lion fight lately, but even then it was humans doing it. Humans invented something; other humans participated.

                  Even CGI had humans doing it just with increasingly easy/powerful tools. Even dubstep had a human composer.

                  Yesterday we used the machines to create. Today we ask the machines to create. Tomorrow the machines just create for us.