We provide a novel definition of quantitative information flow, called transmissible information, that is suitable for reasoning about information-theoretically secure (or non-cryptographic) systems, as well as about cryptographic systems with their polynomially bounded adversaries, error probabilities, etc. Transmissible information captures deliberate communication between two processes, and it safely over-approximates the quantity of information that a process unintentionally leaks to another process.
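To fix intuitions (the notation here is illustrative, not the paper's formal definition), the baseline measure in quantitative information flow is the Shannon mutual information between a secret input $X$ and an observer's view $Y$,
\[
  I(X;Y) \;=\; H(X) - H(X \mid Y),
\]
where $H$ denotes Shannon entropy. Transmissible information plays an analogous role, but is defined so that it remains meaningful in the presence of computationally bounded adversaries and non-zero error probabilities.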
We show that transmissible information is preserved under universal composability, the prevalent cryptographic notion of a secure implementation. This result enables us to lift quantitative bounds on transmissible information from simple ideal functionalities of cryptographic tasks to actual cryptographic systems.
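As a hedged sketch of the shape of this preservation result (the symbols $\pi$, $\mathcal{F}$, $\mathrm{TI}$, and $\varepsilon$ are illustrative notation, not taken from the paper): if a protocol $\pi$ UC-emulates an ideal functionality $\mathcal{F}$, then a bound established for the ideal functionality carries over,
\[
  \mathrm{TI}(\pi) \;\le\; \mathrm{TI}(\mathcal{F}) + \varepsilon,
\]
where $\mathrm{TI}$ denotes transmissible information and $\varepsilon$ is a slack term (e.g., negligible in the security parameter in the computational setting) whose exact form depends on the paper's definitions.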
We furthermore prove a connection between transmissible information in the unconditional setting and channel capacity, based on the weak converse of Shannon's coding theorem. This connection enables us to compute an upper bound on the transmissible information for a restricted class of protocols, using existing techniques from quantitative information flow.
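The weak converse argument underlying this connection is standard and can be sketched as follows (only its link to transmissible information is specific to the paper): let a message $W$, uniform on $\{1,\dots,2^{nR}\}$, be sent over $n$ uses of a memoryless channel of capacity $C$ and decoded as $\hat{W}$ with error probability $P_e^{(n)}$. Fano's inequality and the data-processing inequality yield
\[
  nR \;=\; H(W) \;\le\; I(W;\hat{W}) + 1 + P_e^{(n)}\, nR \;\le\; nC + 1 + P_e^{(n)}\, nR,
\]
so reliable communication ($P_e^{(n)} \to 0$ as $n \to \infty$) forces the rate to satisfy $R \le C$. Consequently, an upper bound on channel capacity, computable with existing quantitative information flow techniques, translates into an upper bound on the rate at which information can be transmitted through the system.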