In 1985, Costa provided an extension of Shannon’s entropy power inequality for the case in which one of the summands is Gaussian, a result now known as Costa’s concavity inequality. We consider the additive Gaussian noise channel under the more realistic assumption that the input and noise components are dependent, with their dependence structure following the well-known multivariate Gaussian copula. Two generalizations of the first- and second-order derivatives of the differential entropy of the output signal are derived for dependent multivariate random variables. It is shown that several previous results in the literature are special cases of our results. Using these derivatives, concavity of the entropy power is proved under certain mild conditions. Finally, one-dimensional versions of our general results are described, which extend the one-dimensional case of Costa’s concavity inequality to the dependent setting. An illustrative example is also presented.
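For reference, Costa's concavity inequality in the independent case can be stated as follows; this is a standard textbook formulation, and the notation $h$, $N$, and $Z_t$ below is introduced here for illustration rather than taken from the paper.

% Costa (1985): concavity of the entropy power along the heat flow.
% X is an R^n-valued random vector with a density, independent of Z_t ~ N(0, t I_n);
% h denotes differential entropy and N the entropy power.
\[
  N(X + Z_t) \;=\; \frac{1}{2\pi e}\, \exp\!\Big(\tfrac{2}{n}\, h(X + Z_t)\Big)
  \quad\text{is concave in } t \ge 0,
  \qquad\text{i.e.}\quad
  \frac{d^2}{dt^2}\, N(X + Z_t) \;\le\; 0 .
\]

The present work replaces the independence assumption between $X$ and the Gaussian noise with dependence governed by a multivariate Gaussian copula.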