If we are to plan conservation strategies that minimize the loss of evolutionary history through human-caused extinctions, we must understand how this loss is related to phylogenetic patterns in current extinction risks and past speciation rates. Nee & May (1997, Science 278, 692–694) showed that for a randomly evolving clade (i) a single round of random extinction removed relatively little evolutionary history, and (ii) extinction management (choosing which taxa to sacrifice) offered only marginal improvement. However, both speciation rates and extinction risks vary across lineages within real clades. We simulated evolutionary trees with phylogenetically patterned speciation rates and extinction risks (closely related lineages having similar rates and risks) and then subjected them to several biologically informed models of extinction. Increasing speciation rate variation increases the extinction-management pay-off. When extinction risks vary among lineages but are uncorrelated with speciation rates, extinction removes more history (compared with random trees), but the difference is small. When extinction risks vary and are correlated with speciation rates, history loss can dramatically increase (negative correlation) or decrease (positive correlation) with speciation rate variation. The loss of evolutionary history via human-caused extinctions may therefore be more severe, yet more manageable, than first suggested.
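The simulation approach described above can be illustrated with a minimal sketch (not the authors' actual code): grow a pure-birth tree in which daughter lineages inherit their parent's speciation rate with lognormal noise (so close relatives have similar rates, as in the phylogenetically patterned trees described here), measure evolutionary history as total branch length, and compare the history retained after a round of random extinction. The tip count, base rate, and noise level are illustrative choices, not values from the study.

```python
import random
import math

random.seed(1)

class Node:
    def __init__(self, parent, birth_time, rate):
        self.parent = parent
        self.birth_time = birth_time
        self.rate = rate       # this lineage's speciation rate
        self.length = None     # branch length, set when the lineage ends

def simulate_tree(n_tips, base_rate=1.0, rate_sd=0.3):
    """Grow a pure-birth tree until n_tips lineages are alive.
    Daughter rates = parent rate x lognormal noise, giving the
    phylogenetic patterning of speciation rates described above."""
    root = Node(None, 0.0, base_rate)
    alive, t = [root], 0.0
    while len(alive) < n_tips:
        total = sum(nd.rate for nd in alive)
        t += random.expovariate(total)          # Gillespie waiting time
        r, acc = random.uniform(0, total), 0.0
        for i, nd in enumerate(alive):          # pick lineage to speciate,
            acc += nd.rate                      # weighted by its rate
            if acc >= r:
                break
        parent = alive.pop(i)
        parent.length = t - parent.birth_time
        for _ in range(2):
            child_rate = parent.rate * math.exp(random.gauss(0, rate_sd))
            alive.append(Node(parent, t, child_rate))
    for nd in alive:                            # close surviving tip branches
        nd.length = t - nd.birth_time
    return root, alive

def history(tips):
    """Total branch length on paths from the root to the given tips."""
    seen, total = set(), 0.0
    for tip in tips:
        nd = tip
        while nd is not None and id(nd) not in seen:
            seen.add(id(nd))
            if nd.length is not None:
                total += nd.length
            nd = nd.parent
    return total

root, tips = simulate_tree(64)
full = history(tips)
survivors = random.sample(tips, len(tips) // 2)  # one round of random 50% extinction
print(f"fraction of history retained: {history(survivors) / full:.2f}")
```

Because survivors share interior branches, the retained fraction of history typically exceeds the retained fraction of tips, which is the Nee & May result; extending the sketch with heritable, rate-correlated extinction risks reproduces the comparisons made in the study.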