Signal propagation in an optical fiber can be described by the nonlinear Schrödinger equation (NLSE). The NLSE has no known closed-form solution, mainly due to the interplay of dispersion and nonlinearity. In this paper, we present a novel closed-form approximate model for the nonlinear optical channel, with applications to passive optical networks. The proposed model is derived using logarithmic perturbation in the frequency domain on the group-velocity dispersion (GVD) parameter of the NLSE. The model can be seen as an improvement of the recently proposed regular perturbation (RP) on the GVD parameter. RP and logarithmic perturbation (LP) on the nonlinear coefficient have already been studied in the literature, and are compared here with RP on the GVD parameter and the proposed LP model. As an application of the model, we focus on passive optical networks. For a 20 km PON at 10 Gbaud, the proposed model improves upon LP on the nonlinear coefficient by 1.5 dB. For the same system, a detector based on the proposed LP model reduces the uncoded bit error rate by a factor of up to 5.4 at the same input power, or reduces the input power by 0.4 dB at the same information rate.
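Since the NLSE admits no closed-form solution, fiber propagation is typically simulated numerically, e.g. with the split-step Fourier method, which alternates the linear (GVD) and nonlinear steps that the perturbation models above treat analytically. The following is a minimal, illustrative sketch of that method; all parameter values (a generic standard single-mode fiber at 1550 nm, a Gaussian input pulse) are assumptions for demonstration and are not taken from the paper.

```python
import numpy as np

# Illustrative split-step Fourier propagation of the lossless NLSE:
#   dA/dz = -i*(beta2/2) * d^2A/dt^2 + i*gamma*|A|^2 * A
# Parameter values below are assumptions, not the paper's system.
beta2 = -21.7e-27   # GVD parameter [s^2/m], typical SMF near 1550 nm
gamma = 1.3e-3      # nonlinear coefficient [1/(W*m)]
L = 20e3            # fiber length [m] (20 km, as in the PON example)
nz = 2000           # number of split steps
dz = L / nz

nt = 1024
T = 1e-9                                 # time window [s]
t = np.linspace(-T / 2, T / 2, nt, endpoint=False)
w = 2 * np.pi * np.fft.fftfreq(nt, d=T / nt)  # angular frequency grid

# Gaussian input pulse, 10 mW peak power, 50 ps width (assumed)
A = np.sqrt(10e-3) * np.exp(-(t / 50e-12) ** 2)

# Symmetric split step: half dispersion, full nonlinearity, half dispersion.
half_disp = np.exp(1j * beta2 * w**2 * dz / 4)  # half-step linear operator
for _ in range(nz):
    A = np.fft.ifft(half_disp * np.fft.fft(A))
    A = A * np.exp(1j * gamma * np.abs(A) ** 2 * dz)  # nonlinear phase rotation
    A = np.fft.ifft(half_disp * np.fft.fft(A))

# Both steps are pure phase operations, so pulse energy is conserved.
print(np.sum(np.abs(A) ** 2))
```

Both sub-steps only rotate phases (the linear step in the frequency domain, the nonlinear step in the time domain), so in this lossless setting the discrete pulse energy is preserved to numerical precision, which is a convenient sanity check for the simulation.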
Bibliographical note: 11 pages, 9 figures, 2 tables