From: Hans Munthe-Kaas
Date: Tue, 9 Apr 1996 10:25:07 +0200
Subject: Report on "State of the Art in Numerical Analysis" Conference
State of the art in Numerical Analysis?
What has happened in Numerical Analysis during the last 10 years, and
what are the most interesting future developments in the field? These
questions were the topics of the conference 'State of the Art in Numerical
Analysis', held April 1-4 in York, England. The conference has been
held every ten years since 1966 by the IMA (The Institute of Mathematics
and its Applications), and this year it gathered 112 mathematicians from
about 16 countries. Gene Golub asked me to summarize the conference for
the NA-Net readers:
The conference covered the central themes of numerical analysis:
- Linear algebra
- Ordinary differential equations
- Integral equations
- Approximation
- Optimization
- Partial differential equations
In addition there was a session on new applications. I will give
a brief personal summary of the various topics, which is of course
colored to some extent by my own interests. All the talks will appear in
the conference proceedings, published by the IMA. Judging from the quality
of the talks, this will be a very valuable reference source.
Linear Algebra:
Talks were given by Nick Higham (dense linear algebra), Iain Duff (sparse
direct methods), Gene Golub (iterative methods for linear systems) and
Henk van der Vorst (sparse eigenproblems). The main development in dense
linear algebra during the last 10 years has been the work centered around
the LAPACK project. Parallel computers have been
around for only about a decade, so most of the work on parallel linear
algebra has been done in this period. This is now seen in the organization
of algorithms around block formulations via the BLAS 2 and 3 routines.
Interestingly, sequential computers also gain speed from this organization.
Also in sparse computations, much of the activity has been inspired by
parallel computers. For iterative methods, the main contribution of
the last decade is perhaps the development of Krylov subspace techniques
for unsymmetric systems (GMRES, CGS, Bi-CGSTAB, QMR). In eigenproblems
there has been significant development of Lanczos- and Arnoldi-type
methods. Some old methods have gained new significance (Jacobi), some
new ideas have been introduced due to parallel computers (divide and
conquer algorithms), and some basic mathematical tools have gained
significant importance (pseudospectra).
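As an illustration of the Krylov subspace idea, here is a minimal, unpreconditioned full GMRES sketch in Python. The formulation (Arnoldi process plus a small NumPy least-squares solve) and all names are my own choices for exposition; production codes add restarts, Givens rotations and preconditioning.

```python
import numpy as np

def gmres(A, b, tol=1e-10, max_iter=50):
    """Illustrative full GMRES: build an orthonormal Krylov basis Q
    by the Arnoldi process, then minimize the residual over that
    subspace via a small least-squares problem."""
    n = len(b)
    Q = np.zeros((n, max_iter + 1))
    H = np.zeros((max_iter + 1, max_iter))
    beta = np.linalg.norm(b)
    Q[:, 0] = b / beta
    for k in range(max_iter):
        # Arnoldi step: expand the basis and orthogonalize against it
        w = A @ Q[:, k]
        for j in range(k + 1):
            H[j, k] = Q[:, j] @ w
            w -= H[j, k] * Q[:, j]
        H[k + 1, k] = np.linalg.norm(w)
        # Least-squares problem: min || beta*e1 - H y || over the subspace
        e1 = np.zeros(k + 2)
        e1[0] = beta
        y = np.linalg.lstsq(H[:k + 2, :k + 1], e1, rcond=None)[0]
        resid = np.linalg.norm(H[:k + 2, :k + 1] @ y - e1)
        if H[k + 1, k] < 1e-14 or resid < tol * beta:
            return Q[:, :k + 1] @ y
        Q[:, k + 1] = w / H[k + 1, k]
    return Q[:, :max_iter] @ y
```

Unlike CG, nothing here requires symmetry of A, which is exactly why these methods opened up the unsymmetric case.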
Where are we going now? It seems that parallelism per se is no longer a
topic of major popularity, but it will of course remain constantly in our
heads when we contemplate new algorithms. Since the 'black box' software
packages in linear algebra are now so excellent, much future work will
center on exploiting the structures which arise in various application
areas. This was pointed out by Gene, who said that he 'just late in life'
realized the importance of exploiting all the information which comes from
knowing the structure of the underlying problem. A lot of this
information is lost if we regard our problems as 'purely' linear
algebra.
Ordinary differential equations:
In this area, the need for alternatives to the 'black box' software was
even more strongly emphasized than in linear algebra. All three speakers,
Chus Sanz-Serna (geometric integrators), Andrew Stuart (dynamical systems)
and Arieh Iserles (beyond the classical theory of ODEs), pointed out that
there is a major need for understanding how to conserve various properties
of equations that are essential mathematically but have not been given
enough consideration numerically. Chus summarized the work done during the
last 10 years on preserving symplecticity, and Andrew Stuart talked about
recovering the correct asymptotic properties of dissipative dynamical
systems (limit sets and attractors). Here the classical notion of measuring
quality by the global error is not relevant. Arieh's talk summarized the work done
on delay differential equations and differential algebraic systems during the
last decade. He pointed out some areas of significant current research,
where we may gain major insight in the next decade. This includes the work
currently undertaken to understand the integration of equations where the
solution is known to sit on a specific manifold or on a Lie group. In the
discussion someone pointed out that "whereas 20 years ago one didn't need
to know much about differential equations to work with their numerical
solution, this is no longer the case".
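To illustrate what a structure-preserving integrator buys, here is a sketch of the classical Stormer-Verlet (leapfrog) scheme, which is symplectic, applied to the harmonic oscillator. The setup is my own, not taken from any of the talks: the point is that the energy stays bounded for arbitrarily long times, whereas forward Euler would spiral outward, even though the global error of both methods grows.

```python
def leapfrog(q, p, force, h, steps):
    """Stormer-Verlet (leapfrog): a symplectic integrator for
    separable Hamiltonians H(q, p) = p^2/2 + V(q)."""
    for _ in range(steps):
        p = p + 0.5 * h * force(q)   # half kick
        q = q + h * p                # drift
        p = p + 0.5 * h * force(q)   # half kick
    return q, p

# Harmonic oscillator: V(q) = q^2/2, force = -q, energy H = (p^2 + q^2)/2
q, p = leapfrog(1.0, 0.0, lambda q: -q, h=0.1, steps=10000)
energy = 0.5 * (p**2 + q**2)
# After 10000 steps the energy is still close to its initial value 0.5.
```

This is the simplest instance of the 'geometric' viewpoint in Sanz-Serna's talk: judge a method by the qualitative properties it preserves, not only by its global error.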
Integral equations:
Two talks were given on integral equations: Christopher Baker (Volterra
functional and integral equations) and Kendall Atkinson (Boundary integral
equations). In these areas too the last decade has been very fruitful. For
boundary integral equations, much of the understanding of the numerical
analysis of corner singularities has been gained in this period. For me,
as an outside observer of this field, the most fascinating developments
have perhaps been the various fast algorithms for solving the dense matrix
problems arising in these fields (fast multipole algorithms, and algorithms
based on wavelet compression and multiresolution analysis). The solution
techniques for these dense linear algebra problems have now become so fast
that it is important not to form the coefficient matrix explicitly: the
complexity of solving the linear system is smaller than the complexity of
assembling the coefficient matrix!
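The point about not forming the matrix can be illustrated with a matrix-free iterative solver: the iteration only ever needs a routine that applies the operator to a vector. The sketch below (all names mine) uses conjugate gradients with a 1-D discrete Laplacian applied stencil-wise, as a simple stand-in for the fast multipole or wavelet-compressed matrix-vector products mentioned above.

```python
import numpy as np

def cg_matrix_free(matvec, b, tol=1e-10, max_iter=500):
    """Conjugate gradients driven only by a matrix-vector product,
    so the (possibly dense) coefficient matrix is never assembled."""
    x = np.zeros_like(b)
    r = b - matvec(x)
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Apply the 1-D discrete Laplacian (Dirichlet ends) without ever
# storing the tridiagonal matrix.
def laplacian(v):
    w = 2.0 * v
    w[:-1] -= v[1:]
    w[1:] -= v[:-1]
    return w

b = np.ones(100)
x = cg_matrix_free(laplacian, b)
```

The same pattern, with the matvec replaced by a fast multipole evaluation, is what makes the O(n) and O(n log n) boundary-integral solvers possible.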
Approximation:
Talks were given by Alistair Watson (emphasis on the univariate case),
Mike Powell (multivariate interpolation) and David Broomhead (neural net
approximations). The most important development in approximation has
probably been the field of wavelets, briefly summarized in Watson's talk.
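For readers who have not met wavelets, one level of the Haar transform (the simplest wavelet) splits a signal into coarse averages and fine-scale details. This toy sketch is my own illustration, not from the talk; the energy-preserving orthogonality it exhibits is the property that multiresolution analysis builds on.

```python
import numpy as np

def haar_step(x):
    """One level of the orthonormal Haar wavelet transform:
    pairwise averages (coarse part) and differences (details)."""
    x = np.asarray(x, dtype=float)
    avg = (x[0::2] + x[1::2]) / np.sqrt(2)
    det = (x[0::2] - x[1::2]) / np.sqrt(2)
    return avg, det

avg, det = haar_step([4.0, 4.0, 2.0, 0.0])
# For a smooth signal most of the energy lands in `avg`;
# recursing on `avg` gives the full multiresolution decomposition.
```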
Optimization:
Three talks in this field: Jorge Nocedal (unconstrained optimization),
David Shanno (interior point methods), Nick Gould (nonlinear constraints).
There has been a tremendous amount of work on interior point methods this
decade, and Shanno referred to large industrial optimization problems where
interior point methods beat the simplex method by a factor of 50 in speed.
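The interior point idea can be shown on a deliberately trivial example of my own: minimize x subject to x >= 1 by replacing the constraint with a logarithmic barrier and following the 'central path' of minimizers as the barrier parameter t grows.

```python
def barrier_minimizer(t):
    """Central-path point for: minimize x subject to x >= 1.
    The barrier problem is min_x  t*x - log(x - 1); setting the
    derivative t - 1/(x - 1) to zero gives x = 1 + 1/t."""
    return 1.0 + 1.0 / t

# As t grows, the iterates approach the constrained optimum x = 1
# from the interior of the feasible region, never touching the boundary.
path = [barrier_minimizer(t) for t in (1, 10, 100, 1000)]
```

Real interior point codes do this for linear and nonlinear programs in many variables, taking Newton steps on the barrier problem while increasing t.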
Partial differential equations:
The talks on PDEs were: Franco Brezzi (Stabilization techniques and
subgrid scale capturing), Charlie Elliott (Approximation of
curvature-dependent interface motion), Endre Suli (Finite element methods
for hyperbolic problems: stability, accuracy, adaptivity) and Bill Morton
(Approximation of multi-dimensional hyperbolic PDEs). Some keywords from
these talks are error control and adaptivity.
New applications:
There were two talks on applications: Frank Natterer (tomography) and
Jean-Michel Morel (nonlinear filtering and PDEs). Morel's talk about the
connection between filtering techniques in computer vision and partial
differential evolution equations was highly inspiring. The idea is to
classify various families of discrete image filters via the PDEs they
approximate. In some sense, the work in this field resembles the early work
on statistical mechanics, transport theory and continuum mechanics in the
last century. This is an area in its infancy, where the basic
understanding of the processes involved is being developed in the language
of PDEs.
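The simplest instance of this filter/PDE correspondence (my own illustration, not from the talk) is linear smoothing: one explicit finite-difference step of the heat equation u_t = u_xx with mesh ratio 1/4 is exactly convolution with the averaging filter [1/4, 1/2, 1/4], so repeated local averaging of an image is a discrete solver for a diffusion PDE.

```python
import numpy as np

def heat_step(u, mu=0.25):
    """One explicit finite-difference step of u_t = u_xx with
    periodic boundaries: u_new = u + mu*(u_left - 2u + u_right).
    For mu = 0.25 this equals convolution with [1/4, 1/2, 1/4]."""
    return u + mu * (np.roll(u, 1) - 2 * u + np.roll(u, -1))

u = np.zeros(64)
u[32] = 1.0          # a single bright pixel
for _ in range(50):  # repeated filtering = running the heat equation
    u = heat_step(u)
# The total intensity is conserved while the spike spreads out (blurs).
```

Morel's program classifies much richer, nonlinear and edge-preserving filters in the same way, via the evolution equations they approximate.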
Concluding remarks:
It is hard to summarize all the developments that have been going on in
numerical analysis during the last decade. It has been an immensely
fruitful period, and the subject is truly alive and developing.
It is also a pleasure to remark that the numerical analysis community
consists of a bunch of cheerful people, and that the friendly spirit of the
'late hours' is also part of the 'state of the art' in our field. This was
evident in the hilarious dinner speech by John C. Mason. Thanks to the
organizing committee chaired by Alistair Watson, and to Pamela Bye for
arranging all the practical details.
Info is also found at:
http://www.amtp.cam.ac.uk/user/na/SotANA/SotANA.html
Hans Munthe-Kaas
University of Bergen
Norway