• Reygle@lemmy.world (OP) · 9 hours ago

    A phone can’t do anything. It only sends and receives; the data center does the work. Surely everyone understands this.

    Modern AI data centers have already shot right past 200 terawatt-hours and are on track to double again in the next two years.

    People can’t be this blind.

    • AeonFelis@lemmy.world · 8 hours ago (edited)

      A phone can do a lot. Much, much more than an ENIAC-era supercomputer (I think you’d have to get pretty close to the end of the previous century to find a supercomputer more powerful than a modern smartphone).

      What a phone can’t do is run an LLM. Even powerful gaming PCs struggle with that: they can only run the less capable models, and queries that feel instant on service-based LLMs would take minutes, or at least tens of seconds, on a single consumer GPU. Phones certainly can’t handle that, but that doesn’t mean they “can’t do anything”.
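
      A rough back-of-envelope (the decode rates here are assumptions for illustration, not benchmarks) shows where the “tens of seconds” comes from:

      ```python
      # Back-of-envelope generation-time estimate. The decode rates below are
      # illustrative assumptions, not measured numbers.

      def generation_time(output_tokens: int, tokens_per_second: float) -> float:
          """Seconds needed to produce a reply at a given decode rate."""
          return output_tokens / tokens_per_second

      reply_tokens = 500  # a medium-length answer
      for label, rate in [("hosted service (assumed)", 80.0),
                          ("large model on one consumer GPU (assumed)", 8.0)]:
          print(f"{label}: ~{generation_time(reply_tokens, rate):.0f} s")
      ```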

        • bandwidthcrisis@lemmy.world · 2 hours ago

        I’ve run small models (a few GB in size) on my Steam Deck. It gives reasonably fast responses (faster than a person would type).

        I know they’re far from state-of-the-art, but they do work, and I know the Steam Deck isn’t using much power.
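
        For context, running one of these small models locally can be as simple as the sketch below. It assumes the llama-cpp-python bindings and a small quantized GGUF file; the model path is a placeholder, not a specific release.

        ```python
        # Minimal sketch of loading a small quantized model with llama-cpp-python.
        # The model path is hypothetical; any GGUF file of a few GB would fit the
        # hardware described above.
        from llama_cpp import Llama

        llm = Llama(
            model_path="models/small-chat-q4.gguf",  # placeholder path
            n_ctx=2048,   # modest context window keeps RAM use down
            n_threads=4,  # the Steam Deck's APU has 4 cores / 8 threads
        )

        out = llm("Summarise why small local models are cheap to run.", max_tokens=64)
        print(out["choices"][0]["text"])
        ```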

    • surph_ninja@lemmy.world · 9 hours ago (edited)

      LoL. Guess I can just get rid of my phone’s processor then, huh?

      And again, you link an image from an outdated study. The newer data shows usage declining, so it wouldn’t help your fear-mongering.

        • surph_ninja@lemmy.world · 9 hours ago

          If it were reality, you’d have some recent data. You might as well make projections of computer power use starting from ENIAC, and then you could claim computers are consuming more than our entire current energy output.