We examine a recursive algorithm for learning steady states and cycles in stochastic nonlinear models. Necessary and sufficient conditions for local convergence are shown to be equivalent to easily computable expectational-stability conditions, which depend on the distribution of the random shocks. For small noise, it is shown that stochastic cycles exist near their nonstochastic counterparts and that a projection facility in the algorithm is not required for convergence with probability one to stable steady states and cycles. The results are applied to an overlapping generations model with productivity shocks.
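To illustrate the kind of recursive algorithm studied here, the sketch below implements decreasing-gain learning of a steady state in a hypothetical scalar model: agents hold an estimate theta of the steady state, the realized outcome is G(theta) plus a mean-zero shock, and the estimate is updated toward each realization with gain 1/t. The map G, its slope, and the noise level are assumptions chosen only so that the expectational-stability condition (here, G'(theta*) < 1 at the fixed point) holds; this is not the paper's model.

```python
import random


def simulate_learning(G, theta0, T, noise_sd, seed=0):
    """Decreasing-gain recursive learning of a steady state.

    Each period the realized outcome is G(theta) plus a mean-zero
    Gaussian shock; the estimate theta is updated toward the
    realization with gain 1/t (a stochastic-approximation scheme).
    """
    rng = random.Random(seed)
    theta = theta0
    for t in range(1, T + 1):
        y = G(theta) + rng.gauss(0.0, noise_sd)
        theta += (y - theta) / t
    return theta


# Hypothetical linear map with fixed point theta* = 1 and slope 0.5 < 1,
# so the stability condition for the learned steady state is satisfied.
G = lambda theta: 0.5 * theta + 0.5
est = simulate_learning(G, theta0=0.2, T=200_000, noise_sd=0.1)
print(est)  # close to the fixed point theta* = 1
```

With a stable map, the iterates converge to the fixed point without any projection facility; replacing G by an unstable map (slope above one at the fixed point) makes the recursion diverge, mirroring the role of the E-stability conditions described above.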