Highlights

- Some superforecasters start a Substack, as does Dominic Cummings.
- Alex Lawsen and I published *Alignment Problems With Current Forecasting Platforms* on the arXiv.
- *What if Military AI is a Washout?* considers a future in which AI ends up affecting war not through overwhelming dominance, but by changing war's tradeoffs and best practices.
> Result: fewer people in tooling, more people in model quality assurance.
Once AGI arrives, everyone alive will be working on alignment. (All 0 of them.)