Policy decisions have vast consequences, but there is little empirical research on how best to communicate the underlying evidence to decision-makers. Groups in diverse fields (e.g., education, medicine, crime) use brief, graphical displays to list policy options, expected outcomes and evidence quality in order to make such evidence easy to assess. However, how well audiences understand these representations is rarely studied. We surveyed experts and non-experts on what information they wanted and tested their objective comprehension of commonly used graphics. A total of 252 UK residents from Prolific and 452 UK What Works Centre users interpreted the meaning of graphics shown without labels. Comprehension was low (often below 50%). The best-performing graphics combined unambiguous metaphorical shapes with colour cues and indications of quantity. Participants also reported what types of evidence they wanted and at what level of detail (e.g., subgroups, different outcomes). Users particularly wanted to see intervention effectiveness and quality, and policymakers also wanted to know the financial costs and negative consequences. Comprehension and preferences were remarkably consistent between the two samples. Groups communicating evidence about policy options can use these results to design summaries, toolkits and reports for expert and non-expert audiences.