A little late to the party with this one, but I came across this study today and thought it was very interesting. Self-driving electric cars are often touted as a solution to fossil-fueled transit, but they come with potential downsides that aren’t always recognized. For reference, data centers currently account for roughly 0.3 percent of global greenhouse gas emissions, per the IEA figure cited in the article.
In the future, the energy needed to run the powerful computers on board a global fleet of autonomous vehicles could generate as many greenhouse gas emissions as all the data centers in the world today.
That is one key finding of a new study from MIT researchers that explored the potential energy consumption and related carbon emissions if autonomous vehicles are widely adopted.
The data centers that house the physical computing infrastructure used for running applications are widely known for their large carbon footprint: They currently account for about 0.3 percent of global greenhouse gas emissions, according to the International Energy Agency. Realizing that less attention has been paid to the potential footprint of autonomous vehicles, the MIT researchers built a statistical model to study the problem. They determined that 1 billion autonomous vehicles, each driving for one hour per day with a computer consuming 840 watts, would consume enough energy to generate about the same amount of emissions as data centers currently do.
The researchers also found that in over 90 percent of modeled scenarios, to keep autonomous vehicle emissions from zooming past current data center emissions, each vehicle must use less than 1.2 kilowatts of power for computing, which would require more efficient hardware. In one scenario — where 95 percent of the global fleet of vehicles is autonomous in 2050, computational workloads double every three years, and the world continues to decarbonize at the current rate — they found that hardware efficiency would need to double faster than every 1.1 years to keep emissions under those levels.
“If we just keep the business-as-usual trends in decarbonization and the current rate of hardware efficiency improvements, it doesn’t seem like it is going to be enough to constrain the emissions from computing onboard autonomous vehicles. This has the potential to become an enormous problem. But if we get ahead of it, we could design more efficient autonomous vehicles that have a smaller carbon footprint from the start,” says first author Soumya Sudhakar, a graduate student in aeronautics and astronautics.
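Two quick back-of-envelope checks of the figures in the excerpt (a sketch only: the grid carbon intensity and the simplified efficiency model are my own assumptions, not the paper’s, and the paper’s 1.1-year threshold also folds in fleet growth and grid decarbonization, which this toy model ignores):

```python
# Back-of-envelope check of the study's headline figures (not the paper's full model).

# 1) Fleet-level energy from onboard computing: 1 billion vehicles,
#    1 hour of driving per day, 840 W per onboard computer.
FLEET_SIZE = 1_000_000_000
HOURS_PER_DAY = 1
COMPUTE_POWER_W = 840

daily_energy_kwh = FLEET_SIZE * HOURS_PER_DAY * COMPUTE_POWER_W / 1_000
annual_energy_twh = daily_energy_kwh * 365 / 1e9

# Assumed average grid carbon intensity (kg CO2 per kWh) -- illustrative, not from the study.
GRID_INTENSITY = 0.4
annual_emissions_mt = daily_energy_kwh * 365 * GRID_INTENSITY / 1e9  # megatonnes CO2

print(f"fleet computing energy: {annual_energy_twh:.0f} TWh/year")
print(f"implied emissions at {GRID_INTENSITY} kg CO2/kWh: {annual_emissions_mt:.0f} Mt CO2/year")

# 2) Toy per-vehicle model of the race between workload growth and hardware
#    efficiency: workload doubles every 3 years, efficiency (ops per joule)
#    doubles every t_eff years, so power scales as workload / efficiency.
def compute_power_w(years, p0=COMPUTE_POWER_W, workload_doubling=3.0, eff_doubling=3.0):
    return p0 * 2 ** (years / workload_doubling) / 2 ** (years / eff_doubling)

for t_eff in (2.0, 3.0, 4.0):
    print(f"efficiency doubling every {t_eff} yr -> "
          f"{compute_power_w(25, eff_doubling=t_eff):,.0f} W per vehicle after 25 years")
```

That first number lands at a few hundred TWh/year, which is in the same ballpark as current global data center electricity use, and the toy model shows how quickly per-vehicle power blows up once workloads double faster than hardware efficiency does.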
840 W seems high for a self-driving computer?
Still don’t know why autonomous trains aren’t more common.
For one, they are, and for two, there’s no point.
What I mean is self-driving trains are very common, if you live in a country with trains from this century. Train autopilots are super easy to make and they are in wide use.
Driverless trains on the other hand are a rarity, because there’s no point. In pretty much any use case for a train you want to have a skilled human around the train anyway, so that you have someone in the exceptional cases where you need a human (e.g. any security related stuff like people train surfing or people causing trouble on a passenger train).
A train driver doesn’t cost that much more than a security guard with enough technical training to patch up and operate a train in an emergency (which, btw, does require a qualified train driver), so in most cases it’s quite prudent to have a train driver on board.
This is what you get in the end: autonomously driving trains with a train driver for PA announcements, security stuff and emergencies.
each driving for one hour per day with a computer consuming 840 watts
This entirely depends on what energy source we end up using in 2050.
If you assume that by 2050 home solar and batteries are common household items, and consumer electric vehicles are predominantly charged at home from those sources, then claims of emissions becoming a concern are moot. Seeing that home solar and batteries are becoming more common now, with 25 years to go, this is not a huge stretch of the imagination.
Each individual vehicle has daily energy requirements that can be met relatively easily by local renewables, unlike data centres, whose huge energy demands have to be piped in from sources elsewhere.
Apart from that, the ~0.8 kWh/day used by the computer hardware is dwarfed by the (hand-wave guess) ~20 kWh/day used by the actual electric drive system, where trivial improvements in efficiency can compensate for the computers’ draw. Hell, efficiency gains from autonomous driving replacing leadfoot humans at the wheel might end up making all of this a net positive.
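Rough numbers behind that comparison (the daily distance and per-km consumption are hand-wave assumptions of mine, not figures from the article):

```python
# Per-vehicle daily energy: onboard computer vs. the drive system itself.
COMPUTE_POWER_W = 840            # power figure used in the study
DRIVING_HOURS_PER_DAY = 1.0

compute_kwh = COMPUTE_POWER_W * DRIVING_HOURS_PER_DAY / 1000   # ~0.84 kWh/day

# Assumed usage: ~60 km/day (one hour of driving) at ~0.18 kWh/km for a mid-size EV.
KM_PER_DAY = 60
KWH_PER_KM = 0.18
drive_kwh = KM_PER_DAY * KWH_PER_KM                            # ~10.8 kWh/day

print(f"computer:   {compute_kwh:.2f} kWh/day")
print(f"drivetrain: {drive_kwh:.1f} kWh/day")
print(f"computing overhead: {compute_kwh / drive_kwh:.0%} of drive energy")
```

So even at the study’s 840 W, the computer adds something like 5–10 percent on top of a typical EV’s daily drive energy; a modest gain in drivetrain or routing efficiency would cover it.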
I bet heated seats cost more in terms of power usage. Pointless worrying about the computer…
That 840 W consumption figure is also bullshit.
You know what consumes that much power?
A fully fledged gaming PC with all peripherals included.
The onboard computers for self-driving cars are quite optimised; I doubt that even with all the peripherals, cameras, etc., they get anywhere near 800 W. Tesla’s HW3, for example, draws under 100 W.
The article seems to assume a much deeper capability than current systems provide, running over 10 cameras, while assuming that such capability will not become more efficient over time with dedicated hardware.
They also seem to presume that power usage scales linearly with the number of cameras, when it couldn’t be further from the truth…
Spending energy to keep humans in separate, less space-efficient metal transport boxes
I didn’t see anywhere in the article where they subtracted this emissions value from the current petrol/diesel-dominated fleet’s carbon emissions to work out the huge net drop it would bring regardless?
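For a sense of scale (the per-car figure is the commonly cited US EPA average of roughly 4.6 t CO2 per year for a typical petrol passenger car; the fleet size mirrors the study’s 1 billion vehicles):

```python
# Rough comparison: tailpipe emissions of an equivalent petrol fleet vs. the
# onboard-computing emissions the study worries about.
FLEET_SIZE = 1_000_000_000
PETROL_CAR_T_CO2_PER_YEAR = 4.6        # ~EPA average for a typical passenger car

petrol_fleet_gt = FLEET_SIZE * PETROL_CAR_T_CO2_PER_YEAR / 1e9   # gigatonnes CO2/year
computing_gt = 0.12                     # ballpark from the back-of-envelope sketch above

print(f"petrol fleet:      ~{petrol_fleet_gt:.1f} Gt CO2/year")
print(f"onboard computing: ~{computing_gt:.2f} Gt CO2/year")
```

Even if the computing estimate is on the money, it is a small fraction of what the equivalent petrol fleet emits today; the study’s concern is where computing emissions are headed, not that they outweigh tailpipes.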
When you can be dropped off at the front door of the mall and let your self-driving car circle the block empty multiple times, because paying for or finding parking is too much of a hassle.





