In this paper, we present measurements and simulations of random telegraph signal (RTS) noise in n-channel MOSFETs under periodic large-signal gate-source excitation (switched bias conditions). This is particularly relevant to analog CMOS circuit design, where large signal swings occur and where low-frequency (LF) noise is often a limiting factor in circuit performance. Measurements show that, compared to steady-state bias conditions, RTS noise can decrease but also increase when the device is subjected to switched bias. We show that a simple model of a stationary noise-generating process whose output is modulated by the bias voltage is not sufficient to explain the switched bias measurement results. Instead, we propose a model based on cyclostationary RTS noise generation. Using this model, we correctly reproduce the variety of LF noise behaviors that different MOSFETs exhibit under switched bias conditions, and we show that the measurement results can be explained using realistic values for the bias dependence of the capture and emission time constants τc and τe.
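To make the cyclostationary picture concrete, the following is a minimal sketch (not the paper's actual simulator) of a single oxide trap modeled as a two-state Markov process whose capture and emission time constants τc and τe switch with the gate bias phase. All numerical values (time step, period, time constants) are illustrative assumptions, not measured data.

```python
import random

def simulate_rts(n_steps, dt, tau_c, tau_e, seed=0):
    """Simulate a two-state RTS trap with time-varying time constants.

    tau_c(k) and tau_e(k) are callables returning the (bias-dependent)
    capture and emission time constants at step k; this makes the
    generating process cyclostationary when the bias is periodic.
    Returns the trap occupancy trace (0 = empty, 1 = occupied).
    """
    rng = random.Random(seed)
    occupied = 0
    trace = []
    for k in range(n_steps):
        if occupied:
            # emission occurs with probability dt / tau_e per step
            if rng.random() < dt / tau_e(k):
                occupied = 0
        else:
            # capture occurs with probability dt / tau_c per step
            if rng.random() < dt / tau_c(k):
                occupied = 1
        trace.append(occupied)
    return trace

# Hypothetical switched bias: the gate toggles every half period, and we
# assume the "off" phase shortens tau_e and lengthens tau_c.
period = 200  # steps per bias cycle (illustrative)

def on_phase(k):
    return (k % period) < period // 2

tau_c = lambda k: 1e-4 if on_phase(k) else 1e-2  # seconds (assumed)
tau_e = lambda k: 5e-4 if on_phase(k) else 1e-5  # seconds (assumed)

trace = simulate_rts(100_000, 1e-6, tau_c, tau_e)
duty = sum(trace) / len(trace)  # fraction of time the trap is occupied
```

Because the transition probabilities are tied to the bias phase rather than fixed, the occupancy statistics (and hence the RTS noise power) depend on the switching waveform itself, which is the qualitative behavior a stationary-source-with-modulated-output model cannot capture.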