Calculating the capacity of additive signal-plus-noise channels is one of the central problems in information theory. We study the maximum achievable differential entropy at the output of a system that assigns to each input X the sum X + N, where N is a given noise whose probability law is absolutely continuous with respect to the Lebesgue measure, and where the input and the noise are allowed to be dependent. We consider fairly general average cost constraints on the input, as well as amplitude constraints. Under mild regularity requirements on the noise density, it is shown that the corresponding search for the optimum may be restricted to joint distributions for the input and the noise concentrated on lower-dimensional geometric objects, namely graphs of sufficiently regular functions in the associated noise-input plane, and a full characterization of the optimal curve is provided in terms of the equation and boundary conditions it satisfies. The results are then applied to characterize the case of independent input and noise, thereby providing lower bounds on channel capacity. Achievable bounds and the associated capacity-achieving input distributions are also analyzed.
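In symbols, the optimization described above can be sketched as follows; the notation (cost function c, budget Γ, amplitude bound A) is ours and is meant only as an illustrative formalization, not the paper's exact formulation:

```latex
% Sketch of the entropy-maximization problem (our notation):
% maximize the output differential entropy over joint laws of (X, N),
% with the noise marginal P_N fixed and constraints on the input.
\[
  \sup_{P_{X,N}\,:\;P_N \text{ fixed}} \; h(X + N)
  \quad \text{subject to} \quad
  \mathbb{E}[c(X)] \le \Gamma, \qquad |X| \le A.
\]
% When X and N are independent, I(X; X+N) = h(X+N) - h(N),
% so any feasible input law yields the capacity lower bound
\[
  C \;\ge\; h(X + N) - h(N).
\]
```

The second display records why maximizing the output entropy in the independent case directly yields the capacity lower bounds mentioned above.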
General conditions for the existence of a maximum achievable entropy at the output are provided, thereby establishing conditions under which capacity is achievable. The laws of the capacity-achieving input distributions are also uniquely specified, and a general capacity-approximation scheme is given. Finally, the relationship between our results for this dependent input-and-noise setting and the notion of feedback capacity is explored.