1. Introduction
Let f be a transcendental entire function of the form
\begin{equation*} f(z)=\sum\limits_{j=0}^{\infty} a_j z^j, \tag{1.1} \end{equation*}
where $z, a_j \in \mathbb{C}$.
Let $(\Omega, \mathcal{F}, \mu)$ be a probability space, where $\mathcal{F}$ is a σ-algebra of subsets of Ω and $\mu$ is a probability measure on $(\Omega,\, \mathcal{F})$. Along with the function (1.1), we consider random functions on the probability space $(\Omega,\, \mathcal{F}, \, \mu)$ as follows:
\begin{equation*} f_\omega(z)=\sum\limits_{j=0}^{\infty} a_j \chi_j(\omega) z^j, \tag{1.2} \end{equation*}
where $z, a_j \in \mathbb{C}$, $\omega \in \Omega$, and $\chi_j(\omega)$ ($j=0, 1, 2, \ldots$) are independent and identically distributed complex-valued random variables. Further, we assume that the expectation and variance of $\chi_j$ are zero and one, respectively. It is clear that $f_\omega(z)$ is an entire function for almost all $\omega \in \Omega$ (see [Reference Kahane6]).
In general, we consider three cases for $\chi_j(\omega)$. Gaussian entire functions: $\chi_j$ ($j=0, 1, \ldots$) are complex-valued random variables with the standard complex Gaussian distribution; Rademacher entire functions: $\chi_j$ ($j=0, 1, \ldots$) are Rademacher random variables, which take the values ±1 with probability 1/2 each; Steinhaus entire functions: $\chi_j={\rm e}^{2\pi i\theta_j}$ ($j=0, 1, \ldots$) are Steinhaus random variables, where $\theta_j$ ($j=0, 1, \ldots$) are independent real-valued random variables uniformly distributed in the interval [0,1].
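As a quick numerical illustration (not part of the paper), one can sample the three types of coefficients and verify that each has mean zero and $\mathbb{E}|\chi|^2=1$; the sample size and tolerances below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Standard complex Gaussian: density e^{-|z|^2}/pi, i.e. Re and Im are
# independent N(0, 1/2) variables.
gaussian = rng.normal(0, np.sqrt(0.5), n) + 1j * rng.normal(0, np.sqrt(0.5), n)

# Rademacher: +1 or -1, each with probability 1/2.
rademacher = rng.choice([-1.0, 1.0], n)

# Steinhaus: e^{2 pi i theta} with theta uniform on [0, 1).
steinhaus = np.exp(2j * np.pi * rng.uniform(0, 1, n))

for chi in (gaussian, rademacher, steinhaus):
    assert abs(chi.mean()) < 0.02                      # E(chi) = 0
    assert abs((np.abs(chi) ** 2).mean() - 1) < 0.02   # E|chi|^2 = 1
```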
The study of random polynomials was initiated by Bloch and Pólya in 1932. Since then, many papers on random polynomials have been published. Moreover, the research on random transcendental entire functions, especially on Gaussian, Rademacher and Steinhaus entire functions, has also drawn much attention (e.g. [Reference Buhovsky, Glücksam and Sodin1, Reference Hough, Krishnapur, Peres and Virág4, Reference Kabluchko and Zaporozhets5, Reference Mahola and Filevych9, Reference Mahola and Filevych10, Reference Nazarov, Nishry and Sodin13, Reference Nazarov, Nishry and Sodin14, Reference Sodin16, Reference Sun and Chen17]). Recently, Nazarov et al. [Reference Nazarov, Nishry and Sodin13, Reference Nazarov, Nishry and Sodin14] made a breakthrough on the logarithmic integrability of Rademacher Fourier series and obtained several important results on the distribution of zeros of Rademacher entire functions. Their results extended earlier work of Littlewood and Offord [Reference Littlewood and Offord7, Reference Littlewood and Offord8]. Also, in 1982, Murai [Reference Murai12] proved the Nevanlinna defect identity for Rademacher entire functions. In 2000, Sun and Liu [Reference Sun and Liu18] obtained the Nevanlinna defect identity for $f(z)+X(\omega)g(z)$, where $f, g$ are entire, g is a small function of f and $X(\omega)$ is a non-degenerate complex-valued random variable. Later, Mahola and Filevych [Reference Mahola and Filevych9, Reference Mahola and Filevych10] obtained Nevanlinna’s second main theorem for Steinhaus entire functions.
In this paper, we first define a family $\mathcal{Y}$ of random entire functions, which includes the Gaussian, Rademacher and Steinhaus entire functions; thus, we can treat these three well-known classes of random entire functions together. Then, we prove several inequalities relating the maximum modulus $M(r, f)$, $\sigma(r, f)$ and the integrated counting function $N(r, a, f_\omega)$ for the random entire functions in the family $\mathcal{Y}$. These inequalities show that the zero-counting functions of almost all randomly perturbed functions fω are close to the logarithm of the maximum modulus of f, up to an error term, and we treat the error terms in these inequalities carefully. Our Lemma 4.3 verifies that the family $\mathcal{Y}$ includes the Gaussian, Rademacher and Steinhaus entire functions. The ingredients of our proofs involve techniques used by Nazarov–Nishry–Sodin, Mahola–Filevych and Offord. As a by-product of our results, we also establish Nevanlinna’s second main theorems for random entire functions with a careful treatment of their error terms. Thus, we obtain that the characteristic function of almost all functions in the family is bounded above by a single integrated counting function, rather than by two integrated counting functions as in the classical Nevanlinna theory.
The paper is organized as follows. We devote $\S$ 2 to some preliminaries and previous results. In $\S$ 3, we state our main results and Nevanlinna’s second main theorems for random entire functions. In $\S$ 4, we give some lemmas needed in the proofs of our results; Lemma 4.3 is one of the key lemmas of that section. In $\S$ 5, we first prove Theorem 3.1, with which we then prove a lemma that is of independent interest and is needed in the proof of Theorem 3.2. All corollaries are proved in that section, too.
2. Preliminaries
Let X be a complex-valued random variable. We denote the expectation and the variance of X by $\mathbb{E}(X)$ and $\mathbb{V}(X)$, respectively. In particular, if X is either a standard complex-valued Gaussian random variable (its probability density function is $ {\rm e}^{-|z|^2}/\pi$ with respect to Lebesgue measure m on the complex plane), a Rademacher random variable or a Steinhaus random variable, then $\mathbb{E}(X)=0$ and $\mathbb{V}(X)=\mathbb{E}(|X|^2)=1$. We also denote the probability of an event A by $\mathbb{P}(A)$. For a set $E\subset[1,+\infty)$, we say that E has finite logarithmic measure if $\int_E1/t\,\textrm{d}t \lt +\infty$.
For the reader’s convenience, we recall some standard notation in function theory and state some important theorems of Nevanlinna theory for meromorphic functions g in the complex plane $\mathbb{C}$. This notation and these theorems will be used to prove new theorems in Nevanlinna theory as corollaries of our main results for random entire functions. In the sequel, the values of constants such as $C, C_1, r_0$ and $r_1$ may differ at each appearance.
We define the proximity function of g by
and for any $a\in \mathbb{C}$, we define
and the (integrated) counting function of a-value of g by
where $n(t, a, g)$ is the number of zeros of $g-a$ in the disk $D(0, t)$. For $a=\infty$, $N(r,\infty,g)$, sometimes written as $N(r,g)$, is called the counting function of the poles of g. We denote the Nevanlinna characteristic function of g by
and the maximum modulus of g by
Theorem 2.1. (Jensen Formula, e.g. [Reference Cherry and Ye2, Reference Hayman3])
If g is a meromorphic function, then
where $c_{g}(0)$ is the first non-zero coefficient of the Laurent series of g(z) in a neighbourhood of the point z = 0.
The Jensen formula implies Nevanlinna’s first main theorem.
Theorem 2.2. (First Main Theorem, e.g. [Reference Cherry and Ye2, Reference Hayman3])
Let g be a meromorphic function in the complex plane and $a\in \mathbb{C}$. Then
where $ |\epsilon(a, r)| \le \log^+|a| +\log 2.$
There are many versions of the second main theorem in Nevanlinna theory. Here, when g is an entire function, we use the one with a better error term.
Theorem 2.3. (Second Main Theorem, e.g. [Reference Cherry and Ye2, Reference Hayman3])
Let g be an entire function in the complex plane and let $d_j$ ($j=1, 2$) be two distinct complex numbers. Then
for all large r outside a set E of finite Lebesgue measure, where the error term is
It is known (e.g. [Reference Cherry and Ye2, Reference Ye19]) that the coefficient 1 in front of $\log T(r, g)$ in the inequality is best possible, and, clearly, the term O(1) depends on $c_{g}(0)$ and the $d_j$.
We say that the functions defined in Equation (1.2) have a certain property almost surely (a.s.) if there is a set $F\subset \Omega$ with $\mu(F)=0$ such that the functions with $\omega \in \Omega \setminus F$ possess the said property.
For $\omega\in\Omega$, we define
and $ \sigma^2(r, f)=\sum\limits_{j=0}^{\infty}|a_j|^2r^{2j}$. Further, if $\mathbb{E}(\chi_j)=0$ and $\mathbb{V}(\chi_j)=1$, then
Set
where $ \sum_{j=0}^{\infty}|\widehat{a_j}(r)|^2=1$ for all r. Let
Definition 2.1. Let f and fω be defined as in Equations (1.1) and (1.2), respectively. Then, the random entire function fω belongs to the family $\mathcal{Y}$ if and only if fω satisfies Condition Y, i.e., there are three positive constants A, B and C such that for all r > 0,
In § 4, we will prove that all Gaussian, Rademacher and Steinhaus entire functions are in the family $\mathcal{Y}$. Indeed, if fω is Gaussian, Rademacher or Steinhaus, then fω satisfies Condition Y when we choose $A\in (0, 2)$ and B = 1; A close to zero and $B=1/6$; or $A\in (0,1/3)$ and B = 1, respectively.
Observe that if $\chi_j$ ($j=0, 1, 2, \ldots$) are standard complex-valued Gaussian random variables, then $\mathbb{E}(X_r)$ is a positive constant. Therefore, for any Gaussian entire function fω,
where C is a constant.
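Since the $\chi_j$ have mean zero and unit variance, $\mathbb{E}|f_\omega(r{\rm e}^{i\theta})|^2=\sigma^2(r,f)$. A Monte Carlo sketch of this identity (the coefficients $a_j$ below are hypothetical and the series is truncated for the experiment; this is an illustration, not part of the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
a = np.array([1.0, 0.5, 0.25, 0.125])   # hypothetical coefficients a_j
r, theta = 0.9, 0.7
z = r * np.exp(1j * theta)
powers = z ** np.arange(len(a))

n = 100_000
# Steinhaus coefficients chi_j; Gaussian or Rademacher work equally well here.
chi = np.exp(2j * np.pi * rng.uniform(0, 1, (n, len(a))))
f_omega = (a * chi * powers).sum(axis=1)

# sigma^2(r, f) = sum |a_j|^2 r^{2j}
sigma2 = np.sum(np.abs(a) ** 2 * r ** (2 * np.arange(len(a))))
emp = (np.abs(f_omega) ** 2).mean()
assert abs(emp - sigma2) / sigma2 < 0.05
```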
In 2010 and 2012, Mahola and Filevych proved the following result, which can be regarded as a version of Nevanlinna’s second main theorem.
Theorem 2.4. ([Reference Mahola and Filevych9, Reference Mahola and Filevych10], Theorem 1)
Let f be an entire function as defined in Equation (1.1) and let $f_\omega(z)$ be a Steinhaus or a Gaussian entire function on $(\Omega,\, \mathcal{F}, \, \mu)$ of the form (1.2). Then, there is a set E of finite logarithmic measure on $(0, \infty)$ such that for every $a\in \mathbb{C}$, the inequality
holds, where $C_1 \gt 0$ is an absolute constant.
Remark 1. Mahola and Filevych [Reference Mahola and Filevych9] proved a similar inequality to that in Theorem 2.4 for the Steinhaus entire functions. In 2012, they proved Theorem 2.4 and other interesting results in [Reference Mahola and Filevych10] for the Steinhaus entire functions. Further, in 2012, Filevych stated that the inequality in Theorem 2.4 is also true for the Gaussian entire functions. Recently, Filevych told one of the authors that although the proof of the statement has not been published, it is essentially a repetition of the considerations from Mahola’s Ph.D. dissertation [Reference Mahola11].
Nazarov, Nishry and Sodin proved the following.
Theorem 2.5. ([Reference Nazarov, Nishry and Sodin14], Theorem 1.1)
Let fω be a Rademacher entire function. There exists a set $E\subset [1, \infty)$ (depending on $|a_k|$ only) of finite logarithmic length such that
(i) for almost every $\omega\in \Omega$, there exists $r_0(\omega) \in [1, \infty)$ such that for every $r\in [r_0(\omega), \infty)\setminus E$ and every $\gamma \gt 1/2$,
\begin{equation*} \left|n(r,0,f_\omega) - r\,\frac{\mathrm{d}}{\mathrm{d}r}\log \sigma (r,f)\right| \le C(\gamma)\left( r\,\frac{\mathrm{d}}{\mathrm{d}r}\log \sigma (r,f) \right)^{\gamma}; \end{equation*}(ii) for every $r\in [1, \infty)\setminus E$ and every $\gamma \gt 1/2$,
\begin{equation*} \mathbb{E}\left|n(r,0,f_\omega) - r\,\frac{\mathrm{d}}{\mathrm{d}r}\log \sigma (r,f)\right| \le C(\gamma)\left( r\,\frac{\mathrm{d}}{\mathrm{d}r}\log \sigma (r,f) \right)^{\gamma}. \end{equation*}
3. Our results
In this section, we state several inequalities concerning the maximum modulus $M(r, f)$, $\sigma(r, f)$ and the integrated counting function $N(r, 0, f_\omega)$ for the random entire functions in the family $\mathcal{Y}$ with careful treatment of their error terms. A relationship between $\log \sigma(r, f_\omega)$ and $\log \sigma(r, f)$ is stated and proved in $\S$ 5.
Theorem 3.1. If $f_\omega \in \mathcal{Y}$, then, for any constant C > 1, there exists a constant $r_0=r_0(\omega)$ such that, for $r \gt r_0$,
where the constants A and B are from Condition Y.
Remark 2. Theorem 3.1 tells us that the number of zeros of almost all fω can be controlled from above and below by $\log \sigma(r, f)$ and an error term, which are independent of ω.
Sometimes it is easier to calculate $M(r, f)$ than $\sigma(r,f)$. By Lemma 4.6, we obtain:
Corollary 3.1. If $f_\omega \in \mathcal{Y}$, then, for any constant C > 1, there are a constant $r_0=r_0(\omega)$ and a set $E\subset [e, \infty)$ of finite logarithmic measure such that, for $r \gt r_0$ and $r\notin E$,
where the constants A and B are from Condition Y.
Example. Let $f(z)={\rm e}^z$ and let fω be its random perturbation in the family $\mathcal{Y}$. Then the corollary tells us that, for almost all fω, the integrated zero-counting function in the disk $D(0, r)$ is close to r, although ${\rm e}^z$ itself never takes the value zero.
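A numerical sketch of this example (not from the paper; the degree of truncation and the Steinhaus choice of coefficients are assumptions made for the experiment): truncate ${\rm e}^z$ at degree 30 and count the zeros of the random perturbation in $|z| \lt 8$. The average count should be close to r = 8, even though ${\rm e}^z$ is zero-free.

```python
from math import factorial
import numpy as np

rng = np.random.default_rng(4)
deg, r = 30, 8.0
a = np.array([1 / factorial(j) for j in range(deg + 1)])  # coefficients of e^z

counts = []
for _ in range(30):
    chi = np.exp(2j * np.pi * rng.uniform(0, 1, deg + 1))  # Steinhaus chi_j
    roots = np.roots((a * chi)[::-1])    # np.roots expects highest degree first
    counts.append(int(np.sum(np.abs(roots) < r)))

mean_count = np.mean(counts)
# e^z never vanishes, yet the perturbed functions have about r zeros in D(0, r).
assert 6 < mean_count < 10
```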
Now, we state Nevanlinna’s second main theorem (involving the integrated zero-counting function only) for random entire functions as corollaries of the above results.
Corollary 3.2. If $f_\omega \in \mathcal{Y}$, then, for any constant C > 1, there exists a constant $r_0=r_0(\omega)$ such that, for $r \gt r_0$,
and
where the constants A and B are from Condition Y.
When fω is a Gaussian, or Rademacher or Steinhaus entire function, we have the following corollary.
Corollary 3.3. Let f and fω be defined as in Equations (1.1) and (1.2), respectively. Then, for any ϵ > 0, there exists $r_0=r_0(\omega, \epsilon)$ such that, for $r \gt r_0$,
(i) if fω is a Gaussian entire function, then
\begin{equation*} T(r, f) \leq N(r,0, f_\omega)+\frac{1+\epsilon}{2}\, \log T(r,f) \qquad \mbox{a.s.} \end{equation*}(ii) if fω is a Rademacher entire function, then
\begin{equation*} T(r, f) \leq N(r,0, f_\omega)+ \left(\left(\frac{eC_0}6\right)^6 +\epsilon \right) \log^6 T(r,f)\qquad \mbox{a.s.}, \end{equation*}where the constant $C_0$ is from Lemma 4.1. (iii) if fω is a Steinhaus entire function, then
\begin{equation*} T(r, f) \leq N(r,0, f_\omega)+(3+\epsilon)\, \log T(r,f) \qquad \mbox{a.s.} \end{equation*}
Now we consider the case when fω takes any value $a\in \mathbb{C}$.
Theorem 3.2. Let $f_\omega \in \mathcal{Y}$ and define
If $f^*_\omega$ satisfies Condition Y (maybe with different constants A and B), then, for any constant C > 1, there exists a set E of finite logarithmic measure such that, for every $a\in \mathbb{C}$, there is $r_1=r_1(\omega,a)$ such that, for $r \gt r_1$ and $r\not\in E$,
where the constants A and B are from Condition Y.
Remark 3. The first error term of the above inequality appears only in the lower bound for $N(r,a, f_\omega)$. In addition, if fω is a Gaussian, Rademacher or Steinhaus entire function, then it follows from Lemma 4.3 that both fω and $f^*_\omega$ satisfy Condition Y.
The following corollary is a straightforward consequence of the above theorem and Lemma 4.6.
Corollary 3.4. Under the assumptions of Theorem 3.2, we have that, for any constant C > 1, there exists a set E of finite logarithmic measure such that, for every $a\in \mathbb{C}$, there is $r_1=r_1(\omega,a)$ such that, for $r \gt r_1$ and $r\not\in E$,
where the constants A and B are from Condition Y.
When fω is Gaussian, Rademacher or Steinhaus, Theorem 3.2 and Lemma 4.3 give:
Corollary 3.5. Let f and fω be defined as in Equations (1.1) and (1.2), respectively. Then, for any ϵ > 0, there exists a set E of finite logarithmic measure such that, for every $a\in \mathbb{C}$, there exists $r_0=r_0(\omega, \epsilon, a)$ such that, for $r \gt r_0$ and $r\not\in E$, we have:
(i) if fω is a Gaussian entire function, then
\begin{equation*} \log \sigma (r, f) \leq N(r,a, f_\omega)+(3/2+\epsilon) \log\, \log\sigma (r, f) \qquad \mbox{a.s.} \end{equation*}(ii) if fω is a Rademacher entire function, then
\begin{equation*} \log \sigma (r, f) \leq N(r,a, f_\omega)+ \left(\left(\frac{eC_0}6\right)^6 +\epsilon \right) \log^6 \,\log \sigma(r,f)\qquad \mbox{a.s.}, \end{equation*}where the constant $C_0$ is from Lemma 4.1. (iii) if fω is a Steinhaus entire function, then
\begin{equation*} \log\sigma(r, f) \leq N(r,a, f_\omega)+(4+\epsilon) \log\, \log\sigma(r,f) \qquad \mbox{a.s.} \end{equation*}
Remark 4. Corollary 3.5 shows that the constants in the error terms are $3/2 +\epsilon$ and $4+ \epsilon$ in the Gaussian and Steinhaus cases, respectively, rather than an unspecified constant $C_1 \gt 0$ as in Theorem 2.4. It would be interesting to know whether these coefficients are best possible.
The following is Nevanlinna’s second main theorem for random entire functions. It shows that the characteristic function of almost all random entire functions can be bounded above by one integrated counting function, rather than two integrated counting functions as in the classical case (e.g. Theorem 2.3). The corollary is a straightforward consequence of Theorem 3.2 and Lemma 4.4, as in the proof of Corollary 3.2.
Corollary 3.6. If fω and $f^*_\omega$ satisfy Condition Y, then, for any constant C > 1, there exists a set E of finite logarithmic measure such that, for every $a\in \mathbb{C}$, there is $r_1=r_1(\omega,a)$ such that, for $r \gt r_1$ and $r\not\in E$,
and
where the constants A and B are from Condition Y.
4. Some lemmas
In this section, we give several lemmas needed in the proofs of our main results.
Lemma 4.1. (Log-integrability, [Reference Nazarov, Nishry and Sodin13])
Let fω be a Rademacher entire function. Then, for any $p\geq1$,
where $C_0$ is an absolute constant.
Lemma 4.2. (Offord [Reference Offord15])
Let $f_\omega(z)$ be a Steinhaus entire function on $(\Omega,\, \mathcal{F}, \, \mu)$ of the form (1.2), and let $\hat{f_\omega}$ be of the form (2.1). For all $t\geq0$, $\phi\in[0,2\pi)$, set
Then,
where C is an absolute constant.
Lemma 4.3. Let $f_\omega(z)\in \mathcal{Y}$ and let $\hat{f_\omega}(re^{i\theta})$ be defined by Equation (2.1). Then, for any positive constant C and all x > 1, there is a positive constant $C_1$ such that
In particular, we have the following:
(i) If fω is a Gaussian entire function, then for any τ > 0, there is a constant $C_1=C_1(\tau)$ such that
\begin{equation*} \mathbb{P}\left(X_r\ge \frac{1+2\tau}{2}\,\log x\right)\le \frac{C_1}{x^{(1+2\tau)/(1+\tau)}}. \end{equation*}(ii) If fω is a Rademacher entire function, then for any τ > 0 and $\epsilon\in(0, (6/(eC_0))^6)$ ($C_0$ is from Lemma 4.1), there is a constant $C_1=C_1(\epsilon)$ such that
\begin{equation*} \mathbb{P}\left(X_r\ge \left(\frac{1+\tau}{\epsilon}\right)^6\log^6 x\right)\le \frac{C_1}{x^{1+\tau}}. \end{equation*}(iii) If fω is a Steinhaus entire function, then for any τ > 0, there is a constant $C_1=C_1(\tau)$ such that
\begin{equation*} \mathbb{P}\big(X_r\ge 3(1+\tau)^2\log x\big)\le \frac{C_1}{x^{1+\tau}}. \end{equation*}
Proof. By Markov’s inequality, we obtain
Now, we give the proof of (i). Since the $\chi_j$ are independent standard complex-valued Gaussian random variables,
It follows that $\hat{f}_\omega$ is a standard complex-valued Gaussian random variable. For any x > 0,
Consequently, the probability density function of $ |\log|\hat{f_\omega}||$ is $2\,{\rm e}^{-{\rm e}^{-2x}}\,{\rm e}^{-2x}+2{\rm e}^{-{\rm e}^{2x}}{\rm e}^{2x}$ for x > 0 and is 0 for $x\le 0$. It follows that the expected value $\mathbb{E}|\log |\hat{f_\omega}||$ is independent of θ. Thus, we have
where $C_1$ is a positive constant. It follows that Gaussian entire functions are in the family $\mathcal{Y}$ by taking $A=2/(1+\tau)$ and B = 1. Set $C=(1+2\tau)/(1+\tau)$. Then, by Equation (4.1) and for $x\ge 1$, we get
This completes the proof of (i).
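The density of $|\log|\hat{f_\omega}||$ computed in the proof of (i) can be sanity-checked by simulation (an illustration, not part of the proof): the empirical mean of $|\log|\hat{f_\omega}||$ for standard complex Gaussian samples should match the mean computed from the stated density.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400_000
# Standard complex Gaussian samples: density e^{-|z|^2}/pi.
g = rng.normal(0, np.sqrt(0.5), n) + 1j * rng.normal(0, np.sqrt(0.5), n)
x_emp = np.abs(np.log(np.abs(g)))

# Density of |log|g|| claimed in the text, valid for x > 0.
def p(x):
    return 2*np.exp(-np.exp(-2*x))*np.exp(-2*x) + 2*np.exp(-np.exp(2*x))*np.exp(2*x)

xs = np.linspace(1e-6, 20.0, 400_000)
dx = xs[1] - xs[0]
mean_theory = np.sum(xs * p(xs)) * dx   # Riemann sum for the theoretical mean

assert abs(x_emp.mean() - mean_theory) < 0.01
```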
Next, we prove (ii). By Lemma 4.1, we have, for any positive integer $n\ge 6$,
where $C_0$ is the constant from Lemma 4.1. Thus, when $\epsilon \in (0, (6/(eC_0))^6)$,
Therefore, Rademacher entire functions satisfy Condition Y by choosing $A=\epsilon$ and $B=1/6$. Using the inequality (4.1) with $C=1+\tau$, we get
Now, we prove (iii).
For any non-negative integer j and any $\varphi\in[0,2\pi)$, set
Thus, $\hat{f_\omega}=\sum\limits_{j=0}^\infty b_j$, and $B_j$ is a real random variable. Further, we deduce that $ B_j=u\,\cos(2\pi\theta_j)+v\,\sin(2\pi \theta_j), $ where
and $u^2+v^2=|b_j|^2$. The characteristic function of Bj is
which depends only on $|b_j|$. Similarly, we obtain that, for $\varphi\in[0,2\pi)$,
is independent of φ for any non-negative integer n.
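The claim that the characteristic function of $B_j=u\,\cos(2\pi\theta_j)+v\,\sin(2\pi\theta_j)$ depends only on $|b_j|=\sqrt{u^2+v^2}$ can be illustrated numerically; the values of u, v and t below are arbitrary, and the sketch is not part of the proof.

```python
import numpy as np

def char_fn(u, v, t, n=1_000_000, seed=3):
    """Monte Carlo estimate of E[e^{itB}] for B = u cos(2 pi theta) + v sin(2 pi theta)."""
    theta = np.random.default_rng(seed).uniform(0, 1, n)
    B = u * np.cos(2 * np.pi * theta) + v * np.sin(2 * np.pi * theta)
    return np.exp(1j * t * B).mean()

t = 1.7
c1 = char_fn(3.0, 4.0, t)   # |b| = 5
c2 = char_fn(5.0, 0.0, t)   # same |b| = 5, different (u, v)

# Reference value by quadrature: (1/2 pi) * integral of e^{i t |b| cos(phi)} d phi.
phi = np.linspace(0.0, 2 * np.pi, 200_000, endpoint=False)
c_exact = np.exp(1j * t * 5.0 * np.cos(phi)).mean()

assert abs(c1 - c2) < 0.01
assert abs(c1 - c_exact) < 0.01
```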
Since
(where $\sin\varphi_0=\mbox{Re} (\hat{f_\omega})/|\hat{f_\omega}|$ and $\cos\varphi_0=\mbox{Im} (\hat{f_\omega})/|\hat{f_\omega}|$), it follows that
This together with the Jensen inequality gives, for any A > 0,
where $C_0$ is a positive constant. For fixed $\theta\in(0,2\pi]$, let
and
Since $ \mathbb{E}(|\hat{f_\omega}|^2)=1, $ it follows that, for $0 \lt A \lt 2$,
On the set $Y_2$, by using Lemma 4.2, we have
Thus, when $0 \lt A \lt 1/3$, we have
Therefore, setting $A=1/(3(1+\tau))$ as before, we obtain
It follows that
This completes the proof of the lemma.
Lemma 4.4. Let f and fω be entire functions of the forms (1.1) and (1.2), respectively. Then, there is a constant $r_1 \gt 0$ such that, for $r \gt r_1$,
Proof. By the Parseval identity and the Jensen inequality, we obtain
The other inequality in the lemma can be proved in the same manner.
Lemma 4.5. (Plane Growth Lemma, e.g. [Reference Cherry and Ye2], p. 100)
Let F(r) be a positive, non-decreasing continuous function satisfying $F(r)\geq e$ for $e \lt r_0 \lt r \lt \infty$. Let $\psi(r) \ge 1$ be a real-valued, continuous, non-decreasing function on the interval $[e, \infty)$ with $ \int_e^{\infty}\textrm{d}r/(r\psi(r)) \lt \infty$. Let $\phi(r)$ be a positive, non-decreasing function defined for $r_0\leq r \lt \infty$. Set $R=r+\phi(r)/\psi(F(r))$. If $\phi(r)\le r$ for all $r\ge r_0$, then there exists a closed set $E\subset [r_0,\infty)$ with $ \int_E\textrm{d}r/\phi(r) \lt \infty$ such that for all $r \gt r_0$, $r\not\in E$, we have
and
Lemma 4.6. Let f be an entire function defined as in Equation (1.1). There is a set E of finite logarithmic measure such that, for all large $r\notin E$,
Proof. For any R > r, by the Cauchy–Schwarz inequality, we get
Applying Lemma 4.5 to $F(r)=\sigma(r,f)$, $\phi(r)=r$, $\psi(x)=(\log x)^2$ and $ R=r+\frac{r}{\psi(F(r))}$ gives
for all large $r \notin E$. The lemma is proved.
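The Cauchy–Schwarz step above can be made concrete: presumably it yields the standard bound $M(r,f)\le \sigma(R,f)\,(1-(r/R)^2)^{-1/2}$ for $R \gt r$. The following sketch (with $f(z)={\rm e}^z$ truncated at degree 80, an assumption for the numerics) checks this bound for a few radii; it is an illustration, not part of the proof.

```python
from math import factorial, exp, sqrt
import numpy as np

deg = 80
j = np.arange(deg + 1)
a = np.array([1 / factorial(k) for k in range(deg + 1)])  # coefficients of e^z

def sigma(r):
    # sigma(r, f) = (sum |a_j|^2 r^{2j})^{1/2}
    return sqrt(np.sum(np.abs(a) ** 2 * r ** (2 * j)))

# For f(z) = e^z, M(r, f) = e^r (|e^z| is maximized on the positive axis).
for r in (1.0, 5.0, 10.0):
    R = 1.5 * r
    assert exp(r) <= sigma(R) / sqrt(1 - (r / R) ** 2)
```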
Remark 5. It is straightforward to show that $\sigma(r, f)\le M(r, f)$ for all r > 0.
Now we recall a generalized logarithmic derivative estimate of Gol’dberg–Grinshtein type. To state this result, we introduce some notation. Given a non-constant meromorphic function g and $a\in\mathbb{C}$, we can always write $g(z)=(z-a)^m h(z)$ with $h(a)\neq 0, \infty$, where the integer m is called the order of g at the point a and is denoted by ${ord}_a g$. The first non-zero coefficient of the Laurent series of g(z) in a neighbourhood of the point z = a is denoted by $c_{g}(a)$.
Lemma 4.7. ([Reference Cherry and Ye2], p. 96)
Let g be a meromorphic function in the complex plane and let $0 \lt \alpha \lt 1$. There exists a constant r 0 such that, for all $r_0 \lt r \lt R \lt \infty$,
where
and
5. Proofs of our main theorems
5.1. Proof of Theorem 3.1
Let fω be a random entire function on $(\Omega,\, \mathcal{F}, \, \mu)$ of the form (1.2). For $\omega\in\Omega$, by the Jensen formula,
It follows that, for any r > 0,
Since $\log \sigma(r,f)$ is increasing and unbounded, for any positive integer n, there is $r_n$ such that $\log \sigma(r_n, f)=n$, and the sequence $\{r_n\}$ is increasing. Since $f_\omega \in \mathcal{Y}$, there are positive constants A and B such that $\mathbb{E}(\exp(AX_r^B))=C_1 \lt +\infty$. For any C > 1, set
Therefore, by Equation (4.1) in Lemma 4.3,
Consequently, $ \sum\mathbb{P}(A_n) \lt \infty$ and, by the Borel–Cantelli lemma,
Thus, for $\omega \in \Omega\setminus A$, there exists $j_0$ such that for all $n \gt j_0$, we have
It follows that, for $r\in (r_n, r_{n+1}]$ with $n \gt j_0$ and almost all $\omega\in\Omega$, we have
and
Now we estimate the term $\log|c_{f_\omega}(0)|$. Since $f_\omega(z)=\sum\limits_{j=0}^{\infty} a_j \chi_j(\omega) z^j$, we denote by $\{j_k\}_{k=0}^{\infty}$ the increasing sequence of all indices j with $a_j\neq0$. It suffices to estimate $|\chi_{j_k}(\omega)|$. Define
and
It is trivial to see that
Thus, for almost every $\omega\in\Omega$, there exist unique k and m such that $\omega\in B^{\prime}_{km}$. Therefore,
This together with Equations (5.1) and (5.2) gives, for r sufficiently large,
The proof of Theorem 3.1 is complete.
5.2. Proof of Theorem 3.2
To prove Theorem 3.2, we need the following lemma, whose proof is based on the result of our Theorem 3.1.
Lemma 5.1. Let $f_\omega(z) \in \mathcal{Y}$. Then there exists a constant $r_0=r_0(\omega)$ such that, for $r \gt r_0$, we have
and
where C > 1 is any constant, and constants A and B are from Condition Y.
Proof. Let φ be a non-negative increasing function. Since
by Markov’s inequality, we have
For any positive integer n, there is rn such that $\sigma(r_n, f)=e^n$ and the sequence $\{r_n\}$ is increasing. Set
Thus, by taking $\varphi(x)=(\log x)^2$, we have
Consequently, $ \sum\mathbb{P}(B_n) \lt +\infty$. Thus, by the Borel–Cantelli lemma, for almost all $\omega \in\Omega $, there is $j_1=j_1(\omega)$ such that, for $n \gt j_1$ and $r\in (r_n, r_{n+1}]$, we have
For $r \gt r_0$ sufficiently large, we get
On the other hand, by Theorems 3.1 and 2.2 and Lemma 4.4, for any C > 1, there is a constant $r_0=r_0(\omega)$ such that, for $r \gt r_0(\omega)$,
This completes the proof of this lemma.
We are now ready to prove Theorem 3.2.
Proof of Theorem 3.2
By applying Theorem 3.1 to $f^*_\omega$, we obtain two positive constants A and B such that, for any constant C > 1, there exists a constant $r_0=r_0(\omega) \gt 0$ such that, for $r \gt r_0$,
Since $\sigma(r,f^*)\geq\sigma(r,f)\geq e$ for all large r, say, $r \gt r_0$, and the function $y(x)=x-C_0\,\log x $ is increasing on $[x_0,+\infty)$, we have
By the Jensen formula and Theorem 2.2, we have, for any r < R and $0 \lt \alpha \lt 1$,
Thus, by using Lemma 4.7 and the estimate of $\log|c_{f_\omega}(0)|$ as in the proof of Theorem 3.1, we obtain
where $\beta_1$ is a random constant related to $\log|c_{f_\omega}(0)|$, and $C_1$ is an absolute constant.
It follows from Lemma 4.4 and Lemma 5.1 that, for any r < R,
Applying Lemma 4.5 to the functions $F(r)=\log \sigma(r,f)$, $\phi(r)=r$, $\psi(r)=\log^2r$ and $ R=r+\frac{r}{\psi(F(r))}$, we get a set $E\subset[r_0,+\infty)$ of finite logarithmic measure, so that for all large r, say, $r \gt r_0$, and $r\not\in E$,
and
Thus, plugging the above two estimates into Equation (5.4) gives
for $r \gt r_0$ and $r \notin E$.
It follows from Equations (5.3) and (5.5) that there is $r_1=r_1(a, r_0)$ such that
for $r \gt r_1$ and $r \notin E$.
On the other hand, by Nevanlinna’s first main theorem, Lemma 4.4 and Lemma 5.1,
Combining this with Equation (5.6) completes the proof of the theorem.
5.3. Proof of Corollaries
Proof of Corollary 3.2
Since the function $x-((C/A)\log x)^{1/B}$ is increasing for all large x, using Lemma 4.4 we have
The rest is a straightforward consequence of Theorem 3.1.
Proof of Corollary 3.3
Let fω be a Gaussian entire function. By Equation (4.2) in the proof of Lemma 4.3, we have, for any τ > 0, r > 0,
Thus, $f_\omega \in \mathcal{Y}$ by choosing $A=2/(1+\tau)$ and B = 1. Applying Corollary 3.2 with $A=2/(1+\tau)$, B = 1 and $C=1+\tau$, we finish the proof of the corollary in this case.
Similarly, if fω is a Steinhaus or a Rademacher entire function, we can choose $A=1/(3(1+\tau))$, B = 1, $C=1+\tau$ or $A=\epsilon\in(0, (6/(eC_0))^6)$, $B=1/6$, $C=1+\tau$, respectively. This completes the proof of Corollary 3.3.
Acknowledgements
The authors would like to thank Professors P. V. Filevych, I. Laine, A. Nishry and M. Sodin for their help during the preparation of the manuscript and the referee for her/his valuable suggestions and comments.
Funding Statement
Hui Li would like to thank the University of North Carolina Wilmington for its hospitality during her visit from 2019 to 2020 and was supported by China Scholarship Council (no. 201906470026) and National Natural Science Foundation of China (nos. 12071047 and 12301096). Jun Wang was supported by National Natural Science Foundation of China (no. 11771090). Xiao Yao was supported by National Natural Science Foundation of China (nos. 11901311 and 12371074) and National Key R&D Program of China (no. 2020YFA0713300).