The Maximum Likelihood Ensemble Filter as a non-differentiable minimization algorithm
Milija Zupanski, I. Michael Navon, Dusanka Zupanski
The Maximum Likelihood Ensemble Filter (MLEF) equations are derived without a differentiability requirement on the prediction model or the observation operators. The derivation reveals that a new non-differentiable minimization method can be defined as a generalization of gradient-based unconstrained methods, such as the preconditioned conjugate-gradient and quasi-Newton methods. In the new minimization algorithm, the vector of first-order increments of the cost function serves as a generalized gradient, while the symmetric matrix of second-order increments of the cost function serves as a generalized Hessian. For differentiable observation operators, the minimization algorithm reduces to the standard gradient-based form. The non-differentiable aspect of the MLEF algorithm is illustrated in an example with a one-dimensional Burgers model and simulated observations. The MLEF algorithm performs robustly, producing satisfactory results for the tested non-differentiable observation operators.
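The core idea, replacing the gradient and Hessian with first- and second-order increments of the cost function along ensemble perturbation directions, can be sketched in a few lines. The following is a minimal illustrative sketch, not the MLEF algorithm itself: the toy state dimension, perturbation matrix, cost function, and the absolute-value observation operator are all assumptions introduced here for illustration.

```python
import numpy as np

# Hypothetical toy setup; all sizes and values are illustrative only.
n = 3                               # state dimension
x_b = np.zeros(n)                   # reference (background) state
P = 0.1 * np.eye(n)                 # columns are ensemble perturbations p_i
y = np.array([0.5, -0.2, 0.3])      # simulated observations
R_inv = np.eye(n)                   # inverse observation-error covariance

def H(x):
    # A non-differentiable observation operator (absolute value),
    # standing in for the class of operators tested in the paper.
    return np.abs(x)

def cost(x):
    # Simplified cost: background term plus observation misfit term.
    d = H(x) - y
    return 0.5 * x @ x + 0.5 * d @ R_inv @ d

def generalized_grad_hess(x, P):
    # First-order increments of the cost along each perturbation direction
    # define a generalized gradient; symmetric second-order increments
    # define a generalized Hessian. No derivatives of H are needed.
    J0 = cost(x)
    m = P.shape[1]
    g = np.zeros(m)
    Hmat = np.zeros((m, m))
    for i in range(m):
        g[i] = cost(x + P[:, i]) - J0
    for i in range(m):
        for j in range(m):
            Hmat[i, j] = (cost(x + P[:, i] + P[:, j])
                          - cost(x + P[:, i])
                          - cost(x + P[:, j]) + J0)
    return g, 0.5 * (Hmat + Hmat.T)   # enforce exact symmetry

g, Hm = generalized_grad_hess(x_b, P)
```

For a smooth cost function these increments approximate the directional derivatives p_i^T grad J and the quadratic forms p_i^T Hess(J) p_j, which is the sense in which the algorithm reduces to the standard gradient-based form in the differentiable case.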