PAC-Bayes techniques, inspired by Bayesian analysis, provide frequentist bounds on the generalisation error of learning systems. We review earlier work and go on to describe extensions of the technology that enable two new applications. The first is to maximum entropy classification: a thresholded linear classifier that regularises by maximising the entropy of the distribution over the weights. The second application is to fitting non-linear stochastic differential equation models to observations. The analysis is inspired by a variational Bayesian approximate inference algorithm that models the posterior distribution by a time-varying linear stochastic differential equation. The approach provides a lower bound on the expected fit of new data under the posterior marginal distribution.
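For readers unfamiliar with the setting, the flavour of result being extended can be illustrated by the standard PAC-Bayes theorem (in the Langford–Seeger form; this specific statement is background, not a bound from the present work). For a fixed prior \(P\) over classifiers, with probability at least \(1-\delta\) over an i.i.d. sample \(S\) of size \(m\), simultaneously for all posteriors \(Q\):

```latex
\mathrm{KL}\bigl(\hat{e}_S(Q) \,\big\|\, e(Q)\bigr)
\;\le\;
\frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{m+1}{\delta}}{m},
```

where \(\hat{e}_S(Q)\) and \(e(Q)\) denote the empirical and true error rates of the Gibbs classifier drawn from \(Q\), and \(\mathrm{KL}(\cdot\,\|\,\cdot)\) is the Kullback–Leibler divergence (applied to Bernoulli distributions on the left-hand side).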