Despite its name, the infrastructure used by the “cloud” accounts for more global greenhouse gas emissions than commercial flights. In 2018, for instance, the 5bn YouTube views of the viral song Despacito used roughly the same amount of energy it would take to heat 40,000 US homes for a year.

Large language models such as the one behind ChatGPT are some of the most energy-guzzling technologies of all. Research suggests, for instance, that about 700,000 litres of water could have been used to cool the machines that trained GPT-3 at Microsoft’s data facilities.

Additionally, as these companies aim to reduce their reliance on fossil fuels, they may opt to base their datacentres in regions with cheaper electricity, such as the southern US, potentially exacerbating water consumption issues in drier parts of the world.

Furthermore, while minerals such as lithium and cobalt are most commonly associated with batteries in the motor sector, they are also crucial for the batteries used in datacentres. The extraction process often involves significant water usage and can lead to pollution, undermining water security. The extraction of these minerals is also often linked to human rights violations and poor labour standards. Trying to achieve one climate goal, limiting our dependence on fossil fuels, can thus compromise another: ensuring that everyone has a safe and accessible water supply.

Moreover, when significant energy resources are allocated to tech-related endeavours, it can lead to energy shortages for essential needs such as residential power supply. Recent data from the UK shows that the country’s outdated electricity network is holding back affordable housing projects.

In other words, policy needs to be designed not to pick sectors or technologies as “winners”, but to pick the willing by providing support that is conditional on companies moving in the right direction. Making disclosure of environmental practices and impacts a condition for government support could ensure greater transparency and accountability.

  • QuadratureSurfer@lemmy.world · 7 months ago

    I think you’re confusing “AI” with “AGI”.

    “AI” doesn’t mean what it used to, and as the term is used today it encompasses a very wide range of tech, including machine learning models:

    Speech to text (STT), text to speech (TTS), generative AI for text (LLMs), images (Midjourney/Stable Diffusion), and audio (Suno), plus upscaling and computer vision (object detection, etc.).

    But since you’re looking for AGI there’s nothing specific to really point at since this doesn’t exist.

    Edit: typo
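
To give a concrete sense of how broad that umbrella is, here is a minimal sketch, assuming the Hugging Face transformers library is available; the model names are illustrative choices, not anything endorsed in this thread. Each of these very different systems counts as “AI” in today’s usage, and all of them are simply machine learning models behind a common API.

```python
# A minimal sketch of how broad the "AI" umbrella is in practice.
# Assumes `pip install transformers torch`; model names are illustrative.
from transformers import pipeline

# Speech to text (STT): transcribe audio with a small Whisper model.
stt = pipeline("automatic-speech-recognition", model="openai/whisper-tiny")

# Generative AI for text: a small GPT-style language model.
textgen = pipeline("text-generation", model="gpt2")

# Computer vision: object detection with a DETR model.
detector = pipeline("object-detection", model="facebook/detr-resnet-50")

# All three are "AI" in the current sense, yet none of them is AGI.
print(textgen("AI doesn't mean what it used to", max_new_tokens=20)[0]["generated_text"])
```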

    • technocrit@lemmy.dbzer0.com · 7 months ago

      Speech to text (STT), text to speech (TTS), generative AI for text (LLMs), images (Midjourney/Stable Diffusion), and audio (Suno), plus upscaling and computer vision (object detection, etc.).

      Yes, this is exactly what I meant. Anything except actual intelligence. Do bosses from video games count?

      I think it’s smart to shift the conversation away from AI to ML, but that’s part of my point. There is a huge gulf between ML and AGI that “AI” purports to fill, but it doesn’t. “AI” is precisely that hype.

      If “AI doesn’t mean what it used to”, what does it mean now? What are the scientific criteria for this classification? Or is it just a profitable buzzword that can be attached to almost anything?

      But since you’re looking for AGI there’s nothing specific to really point at since this doesn’t exist.

      Yes, it doesn’t exist.

      • QuadratureSurfer@lemmy.world · 7 months ago

        Edit: OK, it really doesn’t help when you edit your comment to add clarification based on my reply, as well as including additional remarks.

        I mean, that’s kind of the whole point of why I was trying to nail down what the other user meant when they said “AI doesn’t provide much benefit yet”.

        The definition of “AI” today is way too broad for anyone to make statements like that now.

        And to make sure I understand your question, are you asking me to provide you with the definition of “AI”? Or are you asking for the definition of “AGI”?

        Do bosses from video games count?

        Count under the broad definition of “AI”? Yes: when we talk about bosses from video games, we talk about “AI” for NPCs. And no, this should not be lumped in with any machine learning models unless the game devs created a model for controlling that NPC’s behaviour.

        In either case, our current NPC AI logic should not be classified as AGI by any means (which should go without saying, since AGI does not exist as far as we know).
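
As a rough illustration of that distinction, here is a minimal sketch (the class name and thresholds are hypothetical, not taken from any actual game) of classic video-game boss “AI”: a hand-written state machine whose behaviour is entirely scripted, with no trained model involved.

```python
# A hypothetical, hand-scripted boss "AI": fixed rules, no machine learning.
from dataclasses import dataclass

@dataclass
class BossAI:
    health: int = 100
    state: str = "patrol"

    def update(self, player_distance: float) -> str:
        """Choose the next behaviour from hard-coded rules."""
        if self.health < 30:
            self.state = "enrage"      # phase change is scripted, not learned
        elif player_distance < 5:
            self.state = "attack"
        elif player_distance < 15:
            self.state = "chase"
        else:
            self.state = "patrol"
        return self.state

boss = BossAI()
print(boss.update(player_distance=3.0))   # -> "attack"
```

If, on the other hand, the devs actually trained a model to control the NPC (for example via reinforcement learning), that would fall on the machine-learning side of the umbrella rather than scripted logic.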