Demand For Power

The Semiconductor Industry Association warns that our massive computer usage could outpace the world's electricity supply by 2040. Or at least, by that point, we won't be able to generate enough electricity to meet our computing needs.


The report, titled International Technology Roadmap for Semiconductors 2.0 (ITRS for short), notes that the world's biggest computing infrastructure already consumes a huge chunk of the world's power, and that its current trajectory is self-limiting. In short, we may soon need more electricity than we can produce.

And as a previous Futurism article notes, the ITRS also predicts that transistors will stop shrinking after 2021, when further miniaturization will no longer be economically viable. This poses a dual challenge to tech firms: find new ways to make computers powerful enough to keep up with demand, and develop electricity sources capable of supplying the power those computers require.

Electricity Innovation

“Computing will not be sustainable by 2040, when the energy required for computing will exceed the estimated world’s energy production,” the report unabashedly asserts. To that end, we may need to turn to new forms of production, such as nuclear fusion. Fusion is the long-sought “holy grail” of energy research: it could supply a nearly limitless source of energy that is both clean and safe…and could power our computers.

If this report is accurate, then it seems we will need to accelerate the coming age of fusion. This is especially true because the year 2040 holds particular significance: many experts believe it will be the year when artificial intelligence finally becomes as clever as humans. Such systems would be remarkably transformative and usher in a new kind of society…but only if we can power them.
