This article addresses a dilemma about autonomous vehicles: how should they respond to trade-off scenarios in which every possible response involves the loss of life, but there is a choice about whose life or lives are lost? I consider four options: kill fewer people, protect passengers, equal concern for survival, and recognize everyone’s interests. I resolve this dilemma via what I call the new trolley problem, which seeks a rationale for the intuition that it is unethical to kill a smaller number of people in order to avoid killing a greater number based on numbers alone. I argue that such killing is unethical because it disrespects the humanity of the individuals in the smaller-numbered group. I defend the recognize-everyone’s-interests algorithm, which will probably kill fewer people but will not do so based on numbers alone.
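
To make the contrast between the kill-fewer-people and recognize-everyone’s-interests algorithms concrete, here is a minimal sketch in Python. It assumes, purely for illustration, that recognizing everyone’s interests amounts to a survival lottery in which each individual’s claim counts equally; the function names and the scenario are hypothetical, not the article’s own formalization. Under that assumption, the larger group will probably be spared, but not based on numbers alone.

```python
import random

def kill_fewer(groups):
    """Deterministic policy: spare the largest group, based on numbers alone."""
    return max(groups, key=len)

def recognize_everyones_interests(groups):
    """Illustrative weighted lottery (one possible reading of the policy):
    each individual's survival interest counts equally, so the chance of
    sparing a group is proportional to its size. The larger group will
    probably be spared, but because each person's claim is weighed
    individually, not because of numbers alone."""
    weights = [len(g) for g in groups]
    return random.choices(groups, weights=weights, k=1)[0]

# Hypothetical scenario: a group of three pedestrians and a group of one.
pedestrians = [["p1", "p2", "p3"], ["p4"]]
print(kill_fewer(pedestrians))                     # always spares the three
print(recognize_everyones_interests(pedestrians))  # spares the three with probability 3/4
```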