AI is typically associated with virtuality and the cloud, yet these systems rely on vast physical infrastructures that span the globe and consume tremendous amounts of natural resources, including energy, water, and rare earth minerals. A 2019 study found that training a single large language model "can emit more than 626,000 pounds of carbon dioxide equivalent—nearly five times the lifetime emissions of the average American car (and that includes manufacture of the car itself)" (as reported in MIT Technology Review).