44 comments

  • jlawer 2 days ago

    For Grok’s sake, you hope this is data that was public: something buried deep that it has surfaced.

    It’s a shame transparency is so poor here. A simple grep of the training data would likely give a clear explanation of where this came from.
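
    To make that concrete, a minimal sketch of what such a grep could look like, assuming a hypothetical corpus of JSONL shards with "text" and "url" fields (the paths and field names here are made up, not any real training set):

      import json
      from pathlib import Path

      def find_in_corpus(corpus_dir, needle):
          # Scan every JSONL shard and report where the string appears.
          for shard in sorted(Path(corpus_dir).glob("*.jsonl")):
              with shard.open(encoding="utf-8") as f:
                  for line_no, line in enumerate(f, 1):
                      doc = json.loads(line)
                      if needle.lower() in doc.get("text", "").lower():
                          yield shard.name, line_no, doc.get("url", "<no url>")

      for shard, line_no, url in find_in_corpus("training_data/", "some name"):
          print(f"{shard}:{line_no} -> {url}")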

    • zamadatix 2 days ago

      Grokipedia (an online encyclopedia run by xAI and steered by Grok) lists a few sources for it directly, even in old copies of the entry https://web.archive.org/web/20251225113339/https://grokipedi...

      Grok shouldn’t be serving this kind of information IMO, and it’s yet another in a long series of examples of xAI just not caring about real problems, but the even bigger crime is that the services she is paying thousands to seem to have done jack other than give a false sense of security while happily taking her money. A time-bound Google search and verification of pages from the Wayback Machine confirm this information has been all over social media and other sites constantly for the last decade.
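
      (For what it's worth, the time-bound check is easy to reproduce: the Wayback Machine's public CDX API lists captures of a page within a date range. A minimal sketch; the page URL below is a placeholder, not any real profile:)

        import json
        import urllib.parse
        import urllib.request

        def wayback_captures(page_url, start="20150101", end="20251231"):
            # Ask the Wayback Machine CDX API for captures of page_url between start and end.
            params = urllib.parse.urlencode({
                "url": page_url,
                "from": start,
                "to": end,
                "output": "json",
                "fl": "timestamp,original",
                "limit": "50",
            })
            with urllib.request.urlopen(
                "https://web.archive.org/cdx/search/cdx?" + params
            ) as resp:
                rows = json.load(resp)
            # The first row is a header; the rest are [timestamp, original] pairs.
            return rows[1:] if rows else []

        for ts, original in wayback_captures("example.com/some-profile"):
            print(ts, original)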

      If I were cynical I'd say this was just a publicity stunt, but the truth is probably really just sad all around: lack of ability to keep such things private, leechers making people think you can just pay and information disappears from the internet, Grok amplifying the problem by being run by people who don't really care about what it does...

      • lazide a day ago

        LLMs are fundamentally not deterministic or predictable in the way people think they are, and it shows up pretty clearly in situations like this. They aren’t even as deterministic or predictable as a human. And humans aren’t particularly deterministic or predictable.

        Grok, like Tesla FSD, is also kinda half-assed, so it shows up even more prominently on that front.

    • harambae 2 days ago

      It's usually available through various indirect means. For example, the person who applies to trademark their stage name [0]. (People in the comments are comparing this to revenge porn, but legally it's completely different.)

      [0] https://tsdr.uspto.gov/documentviewer?caseId=sn88576104&docI...

    • undefined 2 days ago
      [deleted]
  • 1970-01-01 2 days ago

    Too bad it didn't happen to a politician's girlfriend.

  • randyrand 2 days ago

    I feel like the bigger issue - from an anonymity perspective - is that she is publicly showing her real face while doing these things, no?

    • selridge a day ago

      If the bigger issue to you is that people have a face attached to their bodies then you are off base.

    • 7bit 2 days ago

      You think the bigger issue is the thing she does voluntarily, and the lesser issue is that Grok leaked something without her consent, something she had protected for over a decade?

      Are you serious?

      • chrisjj a day ago

        You have some reason to think she didn't reveal her legal name voluntarily?

        And Grok didn't leak anything. Her legal name was already public for years.

      • ShowalkKama a day ago

        How did Grok gain access to this supposedly private information? Did it pilfer her private emails? Did it hack the producer's website and gain access to confidential files? Did it look through her computer?

        Look, I hate Grok just as much as the next person, but if it was just crawled then by definition it is not private.

        You may very well argue that people are harassing her (and that it's not ok), you may even argue that AI should not facilitate such harassment, but to call publicly available information private is mental gymnastics.

        • chrisjj a day ago

          > how did grok gain access to this supposedly private information? Did it pilfer her private emails? Did it hack the producer's website and gained access to confidential files? Did it look through her computer?

          Or just read the Instagram and Facebook pages for this stage name. This "private" info is right there.

  • hiccuphippo 2 days ago

    >I've been paying for data removal services for like, at least six years now

    Do such services actually work? The internet is forever.

    • dmix 2 days ago

      The tech industry was very much against the idea when it first came about. It was only really enforced by a few big companies because of a European law (the "right to be forgotten"), and this lady, being from Europe, likely expects information on the internet to be controlled in a centralized way.

      There have also been some revenge porn laws in the US that have some crossover. But it's definitely mostly hopes and dreams even if you have the money to spend, not something strictly practical.

    • Legend2440 2 days ago

      Kind of? They can get you off the first page of Google, which is often enough to keep employers from seeing it.

      • c22 2 days ago

        Unless they ask grok!

    • walletdrainer a day ago

      Yes they work, no it isn’t.

      The sketchier ones will get content taken down with fake court orders and similar.

    • knowitnone3 2 days ago

      [dead]

  • undefined 2 days ago
    [deleted]
  • CqtGLRGcukpy 2 days ago

    For when the article goes behind a paywall: https://archive.ph/EOL7V

  • ThrowawayTestr 2 days ago

    If Grok knew that, it must have been publicly available information.

    • collingreen 2 days ago

      Why is that true? Especially with such "must have been" certainty?

      Or is "publicly available" used here to include data breaches and data sold by gray-area data brokers? If this is your point then what would qualify as private information? If this isn't your point

    • pavel_lishin 2 days ago

      But should that information ever have been publicly available? Someone else here linked to another Instagram account publishing the same data, but that sure sounds like she didn't put it out there.

      It's like saying that someone publishing your bank account balance or nude photos is fine, because someone once stole that data and released it on the internet.

      • manbash 2 days ago

        No, it's not. You assume it was stolen, but that wasn't implied in that comment. They simply stated that the data had been made public at some point or another.

        • pavel_lishin a day ago

          I do assume that, because I assume that this actress did not at any point make it public. It's a reasonable assumption, and it's asinine to assume otherwise.

    • chrisjj 2 days ago

      Indeed. <plays very small violin>.

      • well_ackshually 2 days ago

        Are you so fundamentally limited that you can't see the difference between "it's somewhere on the internet if you look very hard" and "a chatbot ingested petabytes of data and you can't ever escape it anymore", or is it just misogyny and hating sex workers?

        • slyall 2 days ago

          The same has played out with search engines like Google for the last 20-odd years.

          Before search engines, if some random person didn't like you they couldn't just casually (in 5 minutes) dredge up every controversial thing you ever said, or that conviction from 15 years ago.

        • runaround555 2 days ago

          You shouldn't be getting downvoted for this. I think it's a combination of a large number of people who just can't wrap their heads around the potential for damage until it happens to them personally, or maybe to a friend or loved one, and the sex-work angle, with people thinking "well, it was public to begin with."

          A lot of this new technology powers abuse on a scale that just wasn't possible before. Things like doxxing and revenge porn were very real threats with life-changing consequences, but in many cases the worst of it would pass and any memory of it would remain buried in some faraway corner of the internet no one would likely see again.

          A real human would have to sit down and spend hours and hours trying to track down dirt that may not even exist; it represented a huge investment per target. Now someone can just take a picture of your face from just about any angle and dredge up anything they want at the touch of a button.

        • chrisjj a day ago

          > Are you so fundamentally limited that you can't see the difference between "it's somewhere on the internet if you look very hard"

          Google her stage name. Her legal name appears instantly from Instagram and Facebook. Not very hard.

    • venusenvy47 2 days ago

      Is this true? I thought Musk and his DOGE team tapped into many government databases. There were many reports a year ago.

  • phendrenad2 2 days ago

    This isn't specifically a Grok problem, this is an LLM training problem. The Internet Archive captures some personal info -> LLMs train on it -> you get your data scrubbed from IA, but forget to remove it from LLMs (will Google or OpenAI even respond to your email without legal letterhead...?)

  • NedF 2 days ago

    [dead]

  • chrisjj 2 days ago

    [flagged]

    • projektfu 2 days ago

      In this example the person only asked "who is she what is her name", and it would have been fine to stick with her stage name, as her real name and birthdate weren't asked for.

      • chrisjj 2 days ago

        [flagged]

        • katdork 2 days ago

          You're outing yourself as not having read the article; an image within it and the text clearly show that Grok provided both her stage name AND her legal name.

          • chrisjj 2 days ago

            [flagged]

            • GibbonBreath 2 days ago

              You are the one who chose not to read the article and then chose to baselessly speculate. You should own your decisions and not pass the buck to the author.

  • dmix 2 days ago

    [flagged]

    • o0-0o 2 days ago

      Bingo.

  • dirtikiti 2 days ago

    [flagged]

  • jeffwask 2 days ago

    [flagged]

  • shaderguy1416 2 days ago

    idk who I can talk to about a Facebook/Meta AI which added some sexualized features to an influencer in an ad for a company X. The company X doesn't wanna talk about it, and influencer Y doesn't wanna talk about it either.

    Which makes me wonder what other things like this have happened with AI.