Elon Musk’s xAI has lost its bid for a preliminary injunction that would have temporarily blocked California from enforcing a law that requires AI firms to publicly share information about their training data.

xAI had tried to argue that California’s Assembly Bill 2013 (AB 2013) forced AI firms to disclose carefully guarded trade secrets.

The law requires AI developers whose models are accessible in the state to clearly explain which dataset sources were used to train models, when the data was collected, whether the collection is ongoing, and whether the datasets include any data protected by copyrights, trademarks, or patents. Disclosures would also clarify whether companies licensed or purchased training data and whether the training data included any personal information. They would also help consumers assess how much synthetic data was used to train the model, which could serve as a measure of quality.
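The disclosure requirements above amount to a small structured record per model. As a rough illustration only (the field names below are assumptions for the sketch, not the statute's wording), an AB 2013-style disclosure might be modeled like this:

```python
from dataclasses import dataclass, asdict


@dataclass
class TrainingDataDisclosure:
    """Illustrative sketch of the information AB 2013 asks developers
    to publish. Field names are hypothetical, chosen to mirror the
    categories described in the article, not the law's actual schema."""
    dataset_sources: list[str]        # which dataset sources were used
    collection_period: str            # when the data was collected
    collection_ongoing: bool          # whether collection continues
    contains_ip_protected_data: bool  # copyrighted/trademarked/patented material
    data_licensed_or_purchased: bool  # licensed or bought vs. scraped
    contains_personal_info: bool      # any personal information included
    synthetic_data_fraction: float    # share of synthetic training data


# Example record (values are made up for illustration)
disclosure = TrainingDataDisclosure(
    dataset_sources=["Common Crawl"],
    collection_period="2020-2023",
    collection_ongoing=False,
    contains_ip_protected_data=True,
    data_licensed_or_purchased=False,
    contains_personal_info=True,
    synthetic_data_fraction=0.1,
)

print(asdict(disclosure))
```

The point of the sketch is that every required item is a simple declarative fact a developer can self-report, which is exactly what the enforcement debate in the comments below turns on.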

  • MonkderVierte@lemmy.zip · 57 minutes ago

    Ah, finally, someone starts it. A person is allowed to say publicly what they want, but an LLM is not a person, it’s a commercial interest. LLMs should never have been allowed to go public from the start, given the misinformation machines they currently are. EU, please follow up.

      • chaogomu@lemmy.world · 15 hours ago

        Don’t forget his child porn image generator. What sort of training data did he use to get that result?

        • NihilsineNefas@slrpnk.net · 5 hours ago

          Well, we all know Musk has desperately wanted to get on the island. Who knows, maybe he’s got some of those unredacted files in his personal collection.

        • zurohki@aussie.zone · 13 hours ago

          If you ask an AI image generator for a bed shaped like a pineapple, it’ll give you one without having a single pineapple-shaped bed in the training data. It has beds and pineapples and it can mash the two together.

          If you’ve got naked adults in the training data and you’ve got children in the training data, it’s going to be able to generate child porn.

          • Prime@lemmy.sdf.org · 3 hours ago

            Why is this downvoted? It is essentially true. Source: I work in ML/HPC.

  • ageedizzle@piefed.ca · 14 hours ago

    Elon Musk’s xAI has lost its bid for a preliminary injunction that would have temporarily blocked California from enforcing a law that requires AI firms to publicly share information about their training data.

    How do you actually enforce this? What’s stopping these companies from just lying about what training data they use?

    • Pup Biru@aussie.zone · 11 hours ago

      what’s stopping these companies from lying about their financial data to tax authorities?

      there are lots of self-report mechanisms that we use… it’s just not worth the blowback of non-disclosure to lie about it. some people do, and sometimes they get caught; not always, but overall it’s a net benefit to transparency

      • ageedizzle@piefed.ca · 10 hours ago

        I don’t know anything about accounting, but at first blush it seems like tax evasion and so forth would be easier to detect, because the government can look at their bank activity, perform random audits, and so on. In contrast, I don’t really know what tools we’d use to catch people lying about their training data.

        • Pup Biru@aussie.zone · 7 hours ago

          for large companies, i think you’re probably right… but there are plenty of transactions that happen cash. i think it’s a case of not letting perfect be the enemy of better. some people might lie, and if they get caught that should have some punishment… but we hope that most people don’t lie, because the risk just isn’t worth it

    • UnderpantsWeevil@lemmy.world · 16 hours ago

      Who is going to remove it? Trump’s friend Tim Cook? Trump’s friend Jeff Bezos? Trump’s friend Sundar Pichai? Or Trump’s friend Satya Nadella?

        • tempest@lemmy.ca · 14 hours ago

          Which is why they are all Trump’s friends. They are the friends of whoever is in power and doesn’t get in the way of (or helps with) making the stock price go up.

          If Trump disappeared tomorrow and was replaced with a progressive they would change their tune immediately.

          Corporations don’t have morals and have no qualms about being hypocritical. If they are publicly traded, the only language they speak is “stock go up” and “stock go down”.

  • BigMacHole@sopuli.xyz · 18 hours ago

    I DONT care WHERE they got the Data ALL I want is to be Able to make ACCURATE CSAM!

    -Elon Musk and Republicans!

  • CosmoNova@lemmy.world · 18 hours ago

    Xitter or xAI shouldn’t be allowed anyway and California cyberlaws are bonkers. I would hope both instances tear each other apart.