Random versus deterministic exponents in a rich family of diffeomorphisms

Fran\c{c}ois Ledrappier(1), Michael Shub(2), Carles Sim\'o(3), Amie Wilkinson(4)

(1) Centre de Math\'ematiques, \'Ecole Polytechnique, 91128 Palaiseau Cedex,
    France
(2) IBM T. J. Watson Research Center,  P. O. Box 218, Yorktown Heights, 
    NY 10598, USA
(3) Dept. de Matem\`atica Aplicada i An\`alisi, Univ. de Barcelona, 08071 
    Barcelona, Spain
(4) Mathematics Department, Northwestern University, Evanston, IL 60208-273, USA
E-mail: ledrappi@math.polytechnique.fr, mshub@us.ibm.com, carles@maia.ub.es,
        wilkinso@math.northwestern.edu

Abstract

We study, both numerically and theoretically, the relationship between the
random Lyapunov exponent of a family of area preserving diffeomorphisms of 
the 2-sphere and the mean of the Lyapunov exponents of the individual members. 
The motivation for this study is the hope that a rich enough family of
diffeomorphisms will always have members with positive Lyapunov exponents, that
is to say, positive entropy. At issue is what sort of notion of richness 
would make such a conclusion valid. One type of richness of a family 
- invariance under the left action of $SO(n+1)$ - occurs naturally in the 
context of volume preserving  diffeomorphisms of the $n$-sphere. 
Based on some positive results for families of linear maps obtained by Dedieu 
and Shub, we investigate the exponents of such a family on the 2-sphere. Again
motivated by the linear case, we investigate whether there is in fact a lower
bound for the mean of the Lyapunov exponents in terms of the random exponents 
(with respect to the push-forward of Haar measure on $SO(3)$) in such a family. 
The family ${\cal F}_{\varepsilon}$ that we study contains a twist map with 
stretching parameter ${\varepsilon}$.
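The linear analogue that motivates this study can be sketched numerically. The
following minimal Python sketch (ours, not the authors' computation) estimates
the random Lyapunov exponent of products of a fixed shear $S_{\varepsilon}$
with Haar-random rotations, using $SO(2)$ in place of $SO(3)$ purely for
illustration; the function name and parameter choices are hypothetical.

```python
import math
import random

def random_exponent(eps, n=100000, seed=0):
    """Estimate the random Lyapunov exponent of the products
    R_{theta_n} S_eps ... R_{theta_1} S_eps, where S_eps = [[1, eps], [0, 1]]
    is a shear and each theta_i is uniform on [0, 2*pi) (Haar on SO(2)).
    The exponent is the average exponential growth rate of a tangent vector."""
    rng = random.Random(seed)
    vx, vy = 1.0, 0.0   # unit tangent vector
    total = 0.0
    for _ in range(n):
        # apply the shear S_eps
        vx += eps * vy
        # apply a Haar-random rotation (norm-preserving)
        t = rng.uniform(0.0, 2.0 * math.pi)
        c, s = math.cos(t), math.sin(t)
        vx, vy = c * vx - s * vy, s * vx + c * vy
        # accumulate the log of the growth, then renormalize
        norm = math.hypot(vx, vy)
        total += math.log(norm)
        vx /= norm
        vy /= norm
    return total / n
```

For ${\varepsilon}=0$ the product is a random rotation and the estimate is
exactly zero; for moderate ${\varepsilon}$ the estimate is strictly positive,
consistent with the positive results for linear families cited above.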

In the family ${\cal F}_{\varepsilon}$, we find strong numerical evidence for
the existence of such a lower bound on mean Lyapunov exponents, when the values
of the stretching parameter ${\varepsilon}$ are not too small. Even moderate
values, such as ${\varepsilon}\ge 10$, suffice to make the average metric
entropy larger than that of the random map. For small ${\varepsilon}$ the
estimated average entropy appears positive but is definitely much smaller than
that of the random map. The numerical evidence favors the existence of
exponentially small lower and upper bounds (in the present example, an
analytic family).

Finally, the effect of a small randomization of fixed size $\delta$ of the 
individual elements of the family ${\cal F}_{\varepsilon}$ is considered. 
With this randomization, the mean of the local random exponents of the family
is indeed asymptotic to the random exponent of the entire family as
${\varepsilon}$ tends to infinity. 
