The short-term stability of passive atomic frequency standards, especially in pulsed operation, is often limited by local oscillator noise via intermodulation effects. We present an experimental demonstration of the intermodulation effect on the frequency stability of a
continuous atomic fountain clock, where it is too small to observe under normal operating conditions. To achieve this, we deliberately degrade the phase stability of the microwave
field interrogating the clock transition. We measure the frequency stability of the locked commercial-grade local oscillator for two modulation schemes of the microwave field: square-wave phase modulation and square-wave frequency modulation. We observe a degradation of the stability
whose dependence on the modulation frequency reproduces the theoretical predictions for the intermodulation effect. In particular, no observable degradation occurs when this frequency equals the Ramsey linewidth. Additionally, we show that, without added phase noise, the frequency
instability, presently equal to 2×10⁻¹³ at 1 s, is limited by atomic shot noise and could therefore be reduced were the atomic flux increased.
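The shot-noise-limited scaling behind the final claim can be sketched numerically. This is an illustrative model only, not the authors' analysis: it assumes the white-frequency-noise floor quoted above (σ_y(1 s) = 2×10⁻¹³), the standard σ_y(τ) ∝ τ^(-1/2) averaging law for a shot-noise-limited passive standard, and the usual 1/√N dependence on detected atomic flux; the `flux_ratio` parameter (flux relative to present operation) is a hypothetical knob introduced here for illustration.

```python
import math

def shot_noise_adev(tau_s, flux_ratio=1.0, sigma_1s=2e-13):
    """Shot-noise-limited Allan deviation (illustrative model).

    tau_s      : averaging time in seconds
    flux_ratio : atomic flux relative to present operation (assumed knob)
    sigma_1s   : instability at 1 s and unit flux, taken from the abstract

    Assumes white frequency noise, sigma_y(tau) ~ tau^(-1/2),
    and the usual 1/sqrt(N) scaling with detected atom number.
    """
    return sigma_1s / math.sqrt(flux_ratio) / math.sqrt(tau_s)

# Present performance at 1 s:
print(shot_noise_adev(1.0))                    # 2e-13
# Under this model, quadrupling the flux would halve the instability:
print(shot_noise_adev(1.0, flux_ratio=4.0))    # 1e-13
```

In this picture, the intermodulation degradation adds on top of the shot-noise floor, which is why removing the added phase noise recovers the τ^(-1/2) atomic limit.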