Research output: Contribution to journal › Article › peer-review
Log-log growth of channel capacity for nondispersive nonlinear optical fiber channel in intermediate power range. / Terekhov, I. S.; Reznichenko, A. V.; Kharkov, Ya. A. et al.
In: Physical Review E, Vol. 95, No. 6, 062133, 26.06.2017, p. 062133.
TY - JOUR
T1 - Log-log growth of channel capacity for nondispersive nonlinear optical fiber channel in intermediate power range
AU - Terekhov, I. S.
AU - Reznichenko, A. V.
AU - Kharkov, Ya. A.
AU - Turitsyn, S. K.
N1 - Publisher Copyright: © 2017 American Physical Society.
PY - 2017/6/26
Y1 - 2017/6/26
N2 - We consider a model nondispersive nonlinear optical fiber channel with additive Gaussian noise. Using the Feynman path-integral technique, we find the optimal input signal distribution that maximizes the channel's per-sample mutual information at large signal-to-noise ratio in the intermediate power range. The optimal input signal distribution allows us to improve previously known estimates of the channel capacity. We calculate the output signal entropy, the conditional entropy, and the per-sample mutual information for Gaussian, half-Gaussian, and modified Gaussian input signal distributions. We demonstrate that in the intermediate power range the capacity (the per-sample mutual information for the optimal input signal distribution) is greater than the per-sample mutual information for the half-Gaussian input signal distribution previously considered to be optimal. We also show that the capacity grows as log log P in the intermediate power range, where P is the signal power.
AB - We consider a model nondispersive nonlinear optical fiber channel with additive Gaussian noise. Using the Feynman path-integral technique, we find the optimal input signal distribution that maximizes the channel's per-sample mutual information at large signal-to-noise ratio in the intermediate power range. The optimal input signal distribution allows us to improve previously known estimates of the channel capacity. We calculate the output signal entropy, the conditional entropy, and the per-sample mutual information for Gaussian, half-Gaussian, and modified Gaussian input signal distributions. We demonstrate that in the intermediate power range the capacity (the per-sample mutual information for the optimal input signal distribution) is greater than the per-sample mutual information for the half-Gaussian input signal distribution previously considered to be optimal. We also show that the capacity grows as log log P in the intermediate power range, where P is the signal power.
KW - LIMITS
KW - COMMUNICATION
KW - TRANSMISSION
KW - AMPLIFIERS
KW - SYSTEMS
UR - http://www.scopus.com/inward/record.url?scp=85021408281&partnerID=8YFLogxK
U2 - 10.1103/PhysRevE.95.062133
DO - 10.1103/PhysRevE.95.062133
M3 - Article
C2 - 28709237
AN - SCOPUS:85021408281
VL - 95
SP - 062133
JO - Physical Review E
JF - Physical Review E
SN - 2470-0045
IS - 6
M1 - 062133
ER -