We investigate the estimation of the perimeter of a set by the graph cut of a random geometric graph. For Ω ⊆ D = (0, 1)^d with d ≥ 2, we are given n independent and identically distributed random points in D whose membership in Ω is known. We consider the sample as a random geometric graph with connection distance ε > 0. We estimate the perimeter of Ω (relative to D) by the appropriately rescaled graph cut between the vertices in Ω and the vertices in D ∖ Ω. We obtain bias and variance estimates on the error that are optimal in their scaling with respect to n and ε. We consider two scaling regimes: the dense one (in which the average degree of the vertices goes to ∞) and the sparse one (in which it goes to 0). In the dense regime, there is a crossover in the nature of the approximation at dimension d = 5: we show that in low dimensions d = 2, 3, 4 one can obtain confidence intervals for the approximation error, while in higher dimensions one can obtain only error estimates for testing the hypothesis that the perimeter is less than a given number.
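To make the estimator concrete, the following is a minimal sketch (not the paper's implementation): count the graph edges between the vertices in Ω and those in D ∖ Ω, then rescale by σ n² ε^(d+1). The choice of the indicator kernel 1{|x − y| ≤ ε} and the normalizing constant σ = (1/2)∫_{B₁} |z₁| dz (which equals 2/3 in d = 2) are assumptions of this sketch, as are all function and parameter names; the paper's exact normalization may differ.

```python
# Sketch of a rescaled-graph-cut perimeter estimator (illustrative, assumed
# indicator kernel and normalization; not taken from the paper).
import numpy as np

def perimeter_estimate(points, in_omega, eps, sigma):
    """Rescaled graph cut between the vertices in Omega and its complement.

    points   -- (n, d) array of i.i.d. uniform samples in D = (0, 1)^d
    in_omega -- boolean array, True where the sample point lies in Omega
    eps      -- connection distance of the random geometric graph
    sigma    -- kernel- and dimension-dependent normalizing constant (assumed)
    """
    n, d = points.shape
    inside, outside = points[in_omega], points[~in_omega]
    # Graph cut: number of edges with one endpoint in Omega and one outside.
    dists = np.linalg.norm(inside[:, None, :] - outside[None, :, :], axis=2)
    cut = np.count_nonzero(dists <= eps)
    # Each unit of perimeter contributes ~ sigma * n^2 * eps^(d+1) cut edges
    # in expectation, so rescaling the cut recovers the perimeter.
    return cut / (sigma * n**2 * eps ** (d + 1))

# Example: Omega = {x_1 < 1/2} in d = 2 has perimeter 1 relative to D, and the
# estimate should be close to 1 up to sampling noise and boundary effects.
rng = np.random.default_rng(0)
pts = rng.random((4000, 2))
print(perimeter_estimate(pts, pts[:, 0] < 0.5, eps=0.05, sigma=2 / 3))
```

The dense and sparse regimes discussed above correspond to how eps is scaled with n: the average vertex degree is of order n·ε^d, so it tends to ∞ or to 0 depending on whether ε shrinks slower or faster than n^(−1/d).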