… in the United States, public investment in science seems to be redirected and concentrated on AI at the expense of other disciplines. And Big Tech companies are consolidating their control over the AI ecosystem. In these ways and others, AI seems to be making everything worse.
This is not the whole story. We should not resign ourselves to AI being harmful to humanity. None of us should accept this as inevitable, especially those in a position to influence science, government, and society. Scientists and engineers can push AI towards a beneficial path. Here’s how.
The essential point is that, as with the climate crisis, a vision of what positive future outcomes look like is necessary to actually get things done: things the technology could do that would make life better. The authors give a handful of examples and provide broad categories of activities that can help steer what is done.

Success would lead to AI use that properly accounted for its environmental impact and had to justify its costs. That likely means much AI use stopping, and broader reuse of the models we’ve already invested in (less competition in the space, please).
The main suggestion in the article is regulation, so I don’t feel particularly understood atm. The practical problem is that, like oil, LLM use can happen locally at a variety of scales. It also provides something that some people want a lot.
It’s thus extremely difficult to regulate into non-existence globally (and would probably be bad if we did). So effective regulation must include persuasion and support for the folks who would most benefit from using it (or you need a huge enforcement effort, which I think has its own downsides).
The problem is that even if everyone else leaves the hole, there will still be these users. As with drug use, piracy, or gambling, it’s easier to regulate when there is a central, easy-to-access service and a harm-reduction approach. To do that, you need a product that meets the needs and mitigates the harms.
Persuading me I’m directionally wrong would require evidence such as: