In this paper we develop the gamma neural model, a new neural net architecture for processing temporal patterns. Time-varying patterns are normally segmented into a sequence of static patterns that are successively presented to a neural net. In the approach presented here, segmentation is avoided: only current signal values are presented to the neural net, which adapts its own internal memory to store the past. Thus, in the gamma neural net, an adaptive short-term memory mechanism obviates a priori signal segmentation. We evaluate the relation between the gamma net and competing dynamic neural models. Interestingly, the gamma model brings many popular dynamic net architectures, such as the time-delay neural net and the concentration-in-time neural net, into a unifying framework. In fact, the gamma memory structure appears as general as a temporal convolution memory structure with an arbitrary time-varying weight kernel w(t). Yet the gamma model remains mathematically equivalent to the additive (Grossberg) model with constant weights. We present a backpropagation procedure to adapt the weights in a particular feedforward structure, the focused gamma net.
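To make the memory structure concrete, the following is a minimal sketch, assuming the standard discrete-time form of a gamma memory: a cascade of identical first-order leaky integrators sharing a single parameter mu (the parameter that would be adapted alongside the weights). The function name, tap count, and impulse input are illustrative, not taken from the paper.

```python
# Sketch (assumption, not the paper's code): discrete-time gamma memory.
# Tap k follows x_k(t) = (1 - mu) * x_k(t-1) + mu * x_{k-1}(t-1),
# with x_0(t) = u(t), so only the current signal value is presented
# and the cascade itself stores the past.

def gamma_memory(signal, mu, num_taps):
    """Return the tap activations over time for a gamma memory cascade."""
    taps = [0.0] * (num_taps + 1)  # taps[0] holds the current input x_0
    history = []
    for u in signal:
        prev = taps[:]             # state at t-1
        taps[0] = u
        for k in range(1, num_taps + 1):
            taps[k] = (1.0 - mu) * prev[k] + mu * prev[k - 1]
        history.append(taps[1:])   # memory taps exposed to the net
    return history

# A unit impulse traces out the discrete gamma kernels at each tap:
# deeper taps respond later and more smoothly, with depth set by mu.
impulse = [1.0] + [0.0] * 9
out = gamma_memory(impulse, mu=0.5, num_taps=3)
```

With mu = 1 each stage reduces to a pure unit delay, recovering a tapped delay line as in the time-delay neural net; smaller mu trades temporal resolution for longer memory, which is the sense in which the gamma memory unifies those architectures.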