16 comments

  • pu_pe 2 days ago

    It was obvious that there would be no space for Yann LeCun after Alexandr Wang came in. He was probably just waiting for the best time to leave.

    I cannot judge his research output at Meta, but he failed pretty badly in the LLM race. Since so many other organizations succeeded at creating open-source models of far higher quality at much lower cost, it would be instructive to understand what exactly went wrong there.

    • marksimi a day ago

      Curious how much risk Meta leadership was comfortable with when they decided to layer Yann. Perhaps the winds of open research were already blowing in a different direction at the company, and he had already indicated that he wanted to leave as a result. We can only guess.

      Kind of hilarious to me to consider him "failing" with LLMs. Given that his remit was a research time horizon of 8-10 years, and that he's gone on record saying he expects the technology to stall out within that horizon, it seems he can only take Ws and ties. He also had indirect influence on open-sourcing the models to propel research forward (which is pretty important for a chief scientist), with added benefit for Meta's other products.

    • John23832 a day ago

      > I cannot judge his research output at Meta but he failed pretty bad at the LLM race. Since so many other organizations succeeded at creating open source models of far higher quality at much lower cost, it would be instructive to understand what exactly went wrong there.

      What? Until the Chinese labs jumped in, Llama was the premier open-source model. The reason the Chinese were successful with MoE is that they were constrained on chips and had to think outside the box, whereas US labs are operating on the power law. They also, arguably, distilled from Western models (Llama).

    • yodsanklai a day ago

      > he failed pretty bad at the LLM race

      Was he even involved in this?

      • DebtDeflation a day ago

        Did they even fail? Llama 2 was groundbreaking for open-source LLMs; it defined the entire space. Llama 3 was a major improvement over Llama 2. It's silly to say they failed just because Llama 4 was underwhelming.

        • gcr a day ago

          Any exponential growth is failure in a market that demands superexponential growth.

      • deburo a day ago

        No, he said he was not involved. He had his own research model to develop, and his startup will probably continue that work, though I wonder whether he thinks it's viable in the short term, given that he's launching a startup. I thought it was a moonshot.

  • yodsanklai a day ago

    I wonder whether this means Meta will move away from its OSS commitment. Wasn't it largely pushed by LeCun?

  • nis0s 2 days ago

    Good for him. No one has done as much damage to AR/VR as FB did with the Metaverse. Way to make something cool fundamentally unlikable.

    • falcor84 2 days ago

      I would actually put a lot of the blame for today's VR winter (or at least the expectations cooldown) on Ready Player One, and particularly its movie adaptation. Not that it was bad per se, but for me and others I've spoken to, it was so outlandish that it essentially made VR "jump the shark".

      • torginus a day ago

        I would put it on VR having a grand total of three good applications: HL Alyx, VRChat, and Beat Saber, one of which was doing the metaverse better than Facebook, and on a shoestring budget.

        (I forgot sims).

        Besides, I wonder what a company whose entire product lineup revolves around looking cool, successful, and admired in front of strangers wants with a product whose main mode of usage involves blindly flailing around a room with a plastic box strapped to your face.

        • Pedro_Ribeiro a day ago

          HL Alyx ruined most VR games for me because none of them ever lived up to Half-Life, and no one but Valve could make such a high-profile game. VR is a genre that benefits A LOT from high budgets.

          I enjoyed games like No Man's Sky in VR, but they just don't hit the same high notes.

    • marcuskane2 a day ago

      I've gotta know which side of Poe's law this falls on.

      Was this written in earnest or as an ironic/facetious joke?

    • undefined 2 days ago
      [deleted]
  • OBELISK_ASI a day ago

    [dead]