In this paper we consider the problem of deciding which of two populations \pi_1 and \pi_2 has the larger location parameter. We base this decision -- a choice among "\pi_1", "\pi_2", and "\pi_1 or \pi_2" -- on summary statistics X_1 and X_2 obtained from independent samples from the two populations. Our loss function penalizes the absence of a "good" population as well as the presence of a "bad" one among those chosen. We show that, within our class of decision rules (see (1.2)), the rule that chooses the population with the larger observed value of X_i minimizes the expected loss; it also, obviously, minimizes the expected number of chosen populations. We give conditions under which the expected loss has a unique maximum and, for several examples satisfying these conditions, we also show that the expected loss is, for each (\theta_1, \theta_2), strictly decreasing in the (common) sample size n. For the case of normal populations, Bechhofer (1954) proposed and studied this decision rule, choosing n to guarantee a lower bound on the probability of a correct selection. Several new results on distributions having increasing failure rate, needed for our results, are of independent interest, as are new results on the peakedness of location estimators.
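As a hedged illustration of the "choose the larger X_i" rule in the normal case studied by Bechhofer (1954), the sketch below uses sample means as the summary statistics and Monte Carlo simulation to estimate the probability of a correct selection; the specific parameter values, the use of a common known standard deviation, and the helper name are illustrative assumptions, and the paper's actual loss function (1.2) is not reproduced here.

```python
import random
import statistics

def prob_correct_selection(theta1, theta2, n, sigma=1.0, reps=20000, seed=0):
    """Monte Carlo estimate of the probability that the rule which
    selects the population with the larger sample mean picks the
    population with the larger location parameter.

    Assumes two normal populations with means theta1 < theta2 and a
    common known standard deviation sigma (an illustrative setup, not
    the paper's general framework)."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(reps):
        # Summary statistics X_1, X_2: sample means of n observations each.
        x1 = statistics.fmean(rng.gauss(theta1, sigma) for _ in range(n))
        x2 = statistics.fmean(rng.gauss(theta2, sigma) for _ in range(n))
        if x2 > x1:  # rule correctly selects the better population
            correct += 1
    return correct / reps
```

Under this setup the estimated probability of a correct selection increases with the common sample size n, consistent with choosing n to guarantee a lower bound on that probability.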