• SHITPOSTING_ACCOUNT@feddit.de · 1 year ago

    Bullshit article/study:

    "These numbers are estimates based on the assumption that the Bitcoin mines run on water-dependent cooling systems typical in large data centers."

    So they took typical data-center water consumption per MW, applied it to some estimate of Bitcoin's power consumption (I wouldn't be surprised if they did the usual trick of taking current output rates and multiplying by the power-per-output figures of long-obsolete hardware, as so often seen in "studies" "showing" how tech X is horrible for the environment), and assumed that was the whole story.
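
    For illustration, here's a minimal sketch of what that methodology boils down to. This is my reconstruction, not the study's code, and every number is a placeholder; the point is that the whole result rides on the water-intensity factor, which is roughly zero for a free-cooled site:

    ```python
    # Rough reconstruction of the study's apparent method (my sketch, not theirs):
    # annual water use = assumed network power draw x typical DC water intensity.
    # All inputs are placeholders, not figures from the study.

    btc_power_gw = 15.0        # assumed Bitcoin network power draw, GW (placeholder)
    water_l_per_mwh = 1800.0   # "typical" evaporatively cooled DC figure (placeholder)
    hours_per_year = 24 * 365

    annual_mwh = btc_power_gw * 1_000 * hours_per_year
    annual_water_gl = annual_mwh * water_l_per_mwh / 1e9   # litres -> gigalitres

    print(f"Implied water use: {annual_water_gl:,.0f} GL/year")
    print("Same power draw with (direct) free cooling: ~0 GL/year")
    ```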

    Every picture of a Bitcoin mine I've seen used (direct) free cooling, which doesn't consume water. That has changed by now, but simply assuming mines behave like normal data centers is an obviously questionable assumption.

    Fun experiment: look up the CO2 intensity of electricity, look up industrial electricity prices, look up the claimed "CO2 emissions per Netflix movie streamed", then compare the implied electricity cost with the price of your Netflix subscription and ask yourself whether Netflix could really be profitable if streaming were that power-hungry. Here's that arithmetic as a quick sketch:
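
    Every number below is an illustrative placeholder you'd substitute with whatever you look up; none of them come from the article:

    ```python
    # Back-of-envelope check of a "CO2 per Netflix movie" claim.
    # All inputs are placeholders -- plug in the values you look up yourself.

    claimed_co2_per_movie_kg = 3.2   # hypothetical claim (e.g. ~1.6 kg/h x 2 h movie)
    grid_kg_co2_per_kwh = 0.4        # rough average grid carbon intensity
    industrial_usd_per_kwh = 0.10    # rough industrial electricity price
    movies_per_month = 15            # assumed viewing per subscriber
    subscription_usd = 15.0          # rough monthly subscription price

    # If the emissions claim were true, how much electricity would each stream use?
    implied_kwh = claimed_co2_per_movie_kg / grid_kg_co2_per_kwh

    # ...and what would that electricity cost per subscriber per month?
    implied_usd = implied_kwh * industrial_usd_per_kwh * movies_per_month

    print(f"Implied energy per movie: {implied_kwh:.1f} kWh")
    print(f"Implied power bill per subscriber: ${implied_usd:.2f}/month "
          f"of a ${subscription_usd:.2f} subscription")
    # With these placeholders, electricity alone would eat most of the
    # subscription -- a strong hint that the emissions claim is inflated.
    ```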

    (Also, the author misunderstood how that system works: "However, some data centers and crypto mines use a different system that keeps computers cool and cuts down water consumption by immersing them in a non-conductive liquid." Immersion just moves heat from the chips into a liquid; the data center then has a hot liquid it can cool in a number of different ways, some using water and some not. How the heat gets from the chip to the cooling system doesn't decide water consumption; whether the final heat rejection uses water or free cooling does.)