n applicants of similar qualification are on an interview list, and their salary demands are drawn from a known continuous distribution. Two managers, I and II, interview them one at a time. After each interview, manager I has the first opportunity to decide whether to hire the current applicant, unless he has already hired one. If manager I declines, manager II may then decide whether to hire the applicant, unless he has already hired one. If neither manager hires the current applicant, they move on to the next applicant and lose the chance of hiring the current one. If one manager does hire the current applicant, the interviews continue until the other manager has also hired an applicant; by the end of the process, each manager must have hired exactly one applicant. In this paper, we first derive the optimal strategy for each manager, namely the strategy that maximizes the probability that the applicant he hires demands a lower salary than the applicant hired by the other manager. We then derive an algorithm for computing manager II's winning probability when both managers play optimally. Finally, we show that manager II's winning probability is strictly increasing in n, is always less than c, and converges to c as n → ∞, where c = 0.3275624139… is a solution of the equation ln(2) + x ln(x) = x.
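As a quick numerical check (not part of the paper's algorithm), the constant c can be recovered by finding the root of ln(2) + x ln(x) − x on (0, 1); the sketch below uses plain bisection, and the function name f is purely illustrative.

```python
import math

def f(x):
    # f(x) = ln(2) + x*ln(x) - x; c is the unique root of f in (0, 1),
    # since f is strictly decreasing there with a sign change on [0.1, 1].
    return math.log(2) + x * math.log(x) - x

# Bisection on [0.1, 1.0], where f(0.1) > 0 and f(1.0) < 0.
lo, hi = 0.1, 1.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if f(lo) * f(mid) <= 0:
        hi = mid   # root lies in [lo, mid]
    else:
        lo = mid   # root lies in [mid, hi]

c = 0.5 * (lo + hi)
print(c)  # approximately 0.3275624139...
```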