• 800XL@lemmy.world · +18/-7 · 17 days ago

    You just fucking wait. Trump is bringing manufacturing to the US. And when that plant opens someday you’ll be so sorry you doubted.

    • BobSentMe@lemm.ee · +8/-2 · 16 days ago

      I’m sure the Foxconn plant in Wisconsin will fire up ANY DAY NOW! drums fingers

      • 800XL@lemmy.world · +6/-1 · 16 days ago

        I talked to like 50 people today and all of the people said they were starting manufacturing plants tomorrow and they’ll be fully functional Tuesday around 3:15.

        I started mine earlier and I’ve already done manufacturing 3 times today. It’s really easy. By this time tomorrow I’ll have a couple more and they’ll all be winning manufacturing.

        Tariffs gave me the ability to finally believe in myself. Tariffs have increased my stamina in bed and given me a full head of hair again, and since I started my manufacturing plant yesterday I’ve dropped 50 pounds.

    • WereCat@lemmy.world · +5/-4 · 16 days ago

      Why? If they looked at how current tech works, then they could easily develop the same tech 10,000x faster.

    • LuckyPierre@lemm.ee · +2/-4 · 16 days ago

      No? Oh, that’s a shame. I was hoping for some improvement in the world, but a random person on the internet said it wasn’t possible without giving any reasons at all. Oh well.

      • Kairos@lemmy.today · +7 · 16 days ago

        No, it’s literally impossible without bypassing the speed of light and/or the size of atoms.
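        A rough back-of-the-envelope sketch of the light-speed limit mentioned above (my own illustrative numbers, not from the thread): even at the speed of light, a signal can only cross a few centimetres per clock cycle, which bounds how large and how fast a chip can get.

```python
# Sketch: how far a signal can travel in one clock cycle, assuming it
# propagates at (at best) the speed of light in vacuum.
C = 299_792_458   # speed of light, m/s
CLOCK_HZ = 5e9    # an illustrative 5 GHz clock

distance_m = C / CLOCK_HZ  # distance covered in one cycle
print(f"{distance_m * 100:.1f} cm per cycle")  # ~6.0 cm
```

        Real on-chip signals propagate well below light speed, so the actual budget is even tighter.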

    • CosmoNova@lemmy.world · +25/-4 · edited · 18 days ago

      It’s likely BS anyway. Maybe it’s just me, but reading about another crazy breakthrough from China every single day during this trade war smells fishy, because I’ve seen the exact same propaganda strategy during the pandemic, when relations between China and the rest of the world weren’t exactly the best. A lot of those headlines coming from there are just claims about flashy topics with very little substance or second-guessing. And the papers releasing the stories aren’t exactly the most renowned either.

      • LadyAutumn@lemmy.blahaj.zone · +6/-2 · 17 days ago

        It’s definitely possible they’re amplifying these developments to maintain confidence in the Chinese market, but I doubt they’re outright lying about the discoveries. I think it’s also likely that some of what they’ve been talking about has been in development for a while and that China is choosing now to make big reveals about them.

  • WorldsDumbestMan@lemmy.today · +17/-1 · 17 days ago

    Whenever they say “X times” whatever, I doubt it right away, because they always interpret the statistics in the dumbest ways possible. You have a solar panel that is 28% efficient. There is no way it can be 20x as efficient; that’s just clickbait.
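    The arithmetic behind that doubt is simple (a minimal sketch using the 28% figure above):

```python
# Sanity check on a hypothetical "20x as efficient" solar panel claim.
base_efficiency = 0.28           # a 28%-efficient panel
claimed = base_efficiency * 20   # what "20x" would have to mean

print(f"{claimed:.0%}")          # 560% -- more energy out than sunlight in
assert claimed > 1.0             # efficiency cannot exceed 100%
```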

  • MTK@lemmy.world · +20 · 16 days ago

    Yeah… At best clickbaity as fuck, at worst a complete scam.

    Any time there is a 10x or more in a headline you are 10x or more likely to be right by calling it BS.

    • jj4211@lemmy.world · +11 · 18 days ago

      Note that this, in theory, speaks to the performance of a non-volatile memory. It does not speak to cost.

      We already have non-volatile storage faster than NAND: phase-change memory. It failed due to expense.

      If this thing is significantly more expensive even than RAM, it may fail even if it is everything it claims to be. If it is at least as cheap as RAM, it’ll be huge, since it is faster than RAM and non-volatile.

      Swap is indicated by cost, not by non-volatile characteristics.
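      The cost argument can be sketched with ballpark figures (illustrative orders of magnitude, not measurements):

```python
# Why swap lands on the cheap tier: swap exists to extend capacity,
# so $/GB dominates the choice, not latency or volatility.
tiers = {
    #            $/GB   latency_ns   (rough ballpark values)
    "DRAM":     (3.00,  100),
    "NAND SSD": (0.05,  100_000),
}

swap_tier = min(tiers, key=lambda t: tiers[t][0])  # cheapest per GB wins
print(swap_tier)  # NAND SSD
```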

  • tetris11@lemmy.ml · +12/-1 · 18 days ago

    Wow, graphene has finally been cracked. Exciting times for portable, low-energy computing.

  • CouncilOfFriends@slrpnk.net · +205/-2 · 18 days ago

    By tuning the “Gaussian length” of the channel, the team achieved two‑dimensional super‑injection, which is an effectively limitless charge surge into the storage layer that bypasses the classical injection bottleneck.

  • Phoenixz@lemmy.ca · +26 · 17 days ago

    Clickbait article with some half-truths. A discovery was made, but it has little to do with AI, and real-world applications will be much, MUCH more limited than what’s being talked about here. It will also likely still take years to come out.

    • 𝓔𝓶𝓶𝓲𝓮@lemm.ee · +3/-5 · edited · 17 days ago

      The key word is China, let us not kid ourselves. Otherwise it would be just another pop-sci click, but now it can be ammunition in the fight with the imperialist, degenerated West, or some BS like that.

  • boonhet@lemm.ee · +87/-9 · edited · 18 days ago

    AI AI AI AI

    Yawn

    Wake me up if they figure out how to make this cheap enough to put in a normal person’s server.

    • gravitas_deficiency@sh.itjust.works · +7/-2 · 17 days ago

      You can get a Coral TPU for 40 bucks or so.

      You can get an AMD APU with a NN-inference-optimized tile for under 200.

      Training can be done with any relatively modern GPU, with varying efficiency and capacity depending on how much you want to spend.

      What price point are you trying to hit?

      • boonhet@lemm.ee · +8 · 17 days ago

        What price point are you trying to hit?

        With regards to AI? None, tbh.

        With this super fast storage I have other cool ideas but I don’t think I can get enough bandwidth to saturate it.

        • barsoap@lemm.ee · +1/-1 · 17 days ago

          With regards to AI? None, tbh.

          TBH, that might be enough. Stuff like SDXL runs on 4 GB cards (the trick is using ComfyUI; think 5-10 s/it), and reportedly smaller LLMs too (haven’t tried, not interested). And the reason I’m eyeing a 9070 XT isn’t AI, it’s finally upgrading my GPU, but it would still be a massive fucking boost for AI workloads.

        • gravitas_deficiency@sh.itjust.works · +1/-3 · 17 days ago

          You’re willing to pay $none to have hardware ML support for local training and inference?

          Well, I’ll just say that you’re gonna get what you pay for.

          • bassomitron@lemmy.world · +9 · 17 days ago

            No, I think they’re saying they’re not interested in ML/AI. They want this super fast memory available for regular servers for other use cases.

            • boonhet@lemm.ee · +3 · 16 days ago

              I mean, the image generators can be cool, and LLMs are great for bouncing ideas off at 4 AM when everyone else is sleeping. But I can’t imagine paying for AI: I don’t want it integrated into most products, and I don’t want to put a lot of effort into hosting a low-parameter model that performs way worse than ChatGPT without a paid plan. So you’re exactly right: it’s not being sold to me in a way that would make me want to pay for it or invest hardware resources in hosting better models.

      • WorldsDumbestMan@lemmy.today · +1 · 17 days ago

        I just use pre-made AIs and write some detailed instructions for them, then watch them churn out basic documents over hours… I need a better laptop.

    • Zip2@feddit.uk · +131/-13 · 18 days ago

      normal person’s server.

      I’m pretty sure I speak for the majority of normal people here: we don’t have servers.

      • fmstrat@lemmy.nowsci.com · +4 · 17 days ago

        “Normal person” is a modifier of server. It does not state any expectation of every normal person having a server. Instead, it sets expectation that they are talking about servers owned by normal people. I have a server. I am norm… crap.

      • notabot@lemm.ee · +17 · 18 days ago

        You… you don’t? Surely there’s some mistake; have you checked down the back of your cupboard? Sometimes they fall down there. Where else do you keep your internet?

        Apologies, I’m tired and that made more sense in my head.

      • Rose@slrpnk.net · +45 · 18 days ago

        Yeah, when you’re a technology enthusiast, it’s easy to forget that your average user doesn’t have a home server - perhaps they just have a NAS or two.

        (Kidding aside, I wish more people had NAS boxes. It’s pretty disheartening to help someone find old media and they show you a giant box of USB sticks and hard drives. On a good day. I do have a USB floppy drive and a DVD drive, just in case.)

        • KnightontheSun@lemmy.world · +15 · 18 days ago

          Hello, fellow home labber! I have a home-built XPEnology box, a Proxmox server with a dozen VMs, a Hackintosh, and a workstation with 44 cores running Linux. Oh, and a USB floppy drive. We are out here.

          I also like long walks in Oblivion.

          • MrPistachios@lemmy.today · +7 · 17 days ago

            Man, Oblivion walks are the best, until a crazy woman comes at you trying to steal your soul with a fancy sword.

        • pivot_root@lemmy.world · +4 · 17 days ago

          It’s pretty disheartening to help someone find old media and they show a giant box of USB sticks and hard drives.

          Equally disheartening is knowing that both of those have a shelf life. Old USB flash drives are more durable than the TLC/QLC cells we use today, but 15 years sitting unpowered in a box doesn’t leave very good prospects.