Plenary Speakers / Conférenciers principaux


OPENING ADDRESS

TOM ARCHIBALD, Acadia University, Wolfville, Nova Scotia
France, Germany, and the making of modern mathematics

The transformation of mathematical research practice over the course of the nineteenth century culminated in an international mathematics research community which was incipiently modern. We use the term "modern" here in much the same way it would be used by historians of art or literature, for example in the sense that mathematics became less focussed on the representation of a supposed natural world and concentrated more on problems generated within mathematics itself. Much more mathematical research came to have a decidedly abstract character, and philosophical, even metaphysical, issues came to be of central importance to many leading practitioners. Corresponding to this shift, mathematics came to be centred in the universities, and the production of pure mathematical research was professionalized both via its professorial context and through other means, such as the rise of national and international mathematical associations. In these and other developments, the French and German mathematical communities were the leaders, though each of these communities had its own internal tensions.

In this paper I discuss how the interaction between the two national groups played a central role in establishing the leading lines of development for mathematics internationally by around 1900. In so doing, I will look at the transition from an older, pre-modern form of mathematical endeavour, exemplified by such figures as C. G. J. Jacobi and Charles Hermite, to the more abstract and structurally oriented work of the early twentieth century. In particular, I will outline the leading role of Hermite as interpreter of, and enthusiast for, German mathematical work (in particular that of Weierstrass and Kronecker). The resulting stresses between modern tendencies and more classical mathematical approaches did a great deal to establish a hierarchy of values for mathematical research, and we shall also look briefly at how these developments played a role in setting the agenda for the twentieth century.

DEBORAH BALL AND HYMAN BASS, University of Michigan
The role of definitions in teaching and learning mathematics

Mathematicians agree that precise use of terms is a cornerstone of mathematical practice, and yet efforts to help students develop this sensibility and skill are not always successful. How can a need for definitions be developed, and how might definitions emerge? Our presentation will span examples from primary school through university level, examining the nature, role, and development of mathematical definitions in learning and teaching mathematics.

ROBERT CALDERBANK, Princeton University, Princeton, New Jersey  08540, USA
Quantum computers and cellular phones

We explore the connection between quantum error correction and wireless systems that employ multiple antennas at the base station and the mobile terminal. These subjects share a common mathematical foundation: the combinatorics of binary quadratic forms, that is to say, orthogonal geometry. We shall describe how the wireless industry is making use of a mathematical framework developed by Radon and Hurwitz about a hundred years ago.
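As one concrete illustration (my own gloss, not part of the abstract), one of the simplest orthogonal designs arising from the Hurwitz-Radon framework is the 2-by-2 scheme known in the wireless literature as the Alamouti code: two complex symbols x_1, x_2 are sent from two antennas over two time slots as

\[
X = \begin{pmatrix} x_1 & x_2 \\ -\overline{x_2} & \overline{x_1} \end{pmatrix},
\qquad
X X^{*} = \bigl(|x_1|^2 + |x_2|^2\bigr) I_2 ,
\]

and the orthogonality of the rows is what lets the receiver separate the two symbols by simple linear processing.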

ANDREW GRANVILLE, Université de Montréal
Uncertainty principles in arithmetic

Try to pick a set A containing roughly half the integers up to (large) x, so that the integers in the set are "as well-distributed as possible"; by this I mean that the number of elements of A which are $b \pmod{q}$ should be as close to $x/2q$ as possible, for all b and q. In 1964 Roth proved the astounding result that one cannot do this particularly well, in that there will always exist an arithmetic progression $b \pmod{q}$, with $q < x^{1/2}$, which contains either $(x/100q)^{1/2}$ more elements of A than expected or $(x/100q)^{1/2}$ fewer elements of A than expected.
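As a purely illustrative sketch (the brute-force search and all names below are mine, not the speaker's), one can measure this notion of discrepancy directly for a candidate set A:

import random

x = 10_000
# one attempt at a "well-distributed" set: keep each integer up to x with probability 1/2
A = {n for n in range(1, x + 1) if random.random() < 0.5}

worst = 0.0
for q in range(1, int(x ** 0.5) + 1):
    counts = [0] * q                      # counts[b] = #{n in A : n == b (mod q)}
    for n in A:
        counts[n % q] += 1
    for b in range(q):
        # deviation from the expected count x/(2q) for the progression b (mod q)
        worst = max(worst, abs(counts[b] - x / (2 * q)))

print("largest deviation over progressions b (mod q) with q <= sqrt(x):", worst)
# Roth's theorem: however A is chosen, some progression with q < x^(1/2)
# deviates from its expected count by at least (x/100q)^(1/2).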

Recently Soundararajan and the speaker found some substantially stronger results about subsets of the primes, arising out of an uncertainty principle for a certain operator. In this talk we will describe some of the new results, try to show their relevance to questions in arithmetic and combinatorics, and discuss the relevant "uncertainty principles".

ANAND PILLAY, University of Illinois at Urbana-Champaign, Department of Mathematics, Urbana, Illinois  61801, USA
Stable theories, examples, and applications

This talk is in honour of Alistair Lachlan. Alistair had a deep influence on classification theory (in model theory) in its various aspects: classifying first order theories, classifying models of first order theories, and describing the category of definable sets in a given structure or model. I plan to describe some of the conceptual apparatus of this theory, and point out how it is meaningful, useful, and suggestive in a couple of specific examples: the category of compact complex spaces and the category of "algebraic D-varieties".

MADHU SUDAN, Radcliffe Institute for Advanced Study, Cambridge, Massachusetts, USA
List decoding of error correcting codes

The task of dealing with errors (or correcting them) lies at the very heart of communication and computation. The mathematical foundations for this task were laid in two concurrent and interdependent works by Shannon and Hamming in the late 1940s. The two theories are strikingly powerful and distinct in their modelling of errors. Shannon's theory models errors as produced by a probabilistic/stochastic process, while Hamming envisions them as being introduced by an adversary. While the two theories share many of the underlying tools, their quantitative results diverge sharply. Shannon's theory shows that a channel that corrupts (arbitrarily) close to 50% of the transmitted bits can still be used for transmission of information. Hamming's theory, in contrast, has often been interpreted to suggest that it can handle at most 25% error on a binary channel.
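One common way to account for the 25% figure (my gloss; the abstract does not spell this out): a code of minimum distance d corrects at most t = floor((d-1)/2) errors uniquely, and binary codes of positive rate must have relative distance d/n below 1/2 (the Plotkin bound), so the fraction of adversarial errors handled this way stays below one quarter:

\[
\frac{t}{n} \;=\; \frac{1}{n}\left\lfloor \frac{d-1}{2} \right\rfloor \;<\; \frac{d}{2n} \;<\; \frac{1}{2}\cdot\frac{1}{2} \;=\; \frac{1}{4}.
\]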

So what can we do if an adversary is given the power to introduce more than 25% errors? Can we protect information against this, or do we just have to give up? The notion of list-decoding addresses precisely this question, and shows that under a relaxed notion of "decoding" (or recovering from errors), the quantitative gaps between the Shannon and Hamming theories can be bridged. In this talk, we will describe this notion and some recent algorithmic developments.
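To make the notion concrete, here is a toy brute-force list decoder (the small code, the radius, and all names are illustrative choices of mine; the algorithms referred to in the talk are far more efficient):

from itertools import product

# Generator matrix of a small [7,3] binary code with minimum distance 4
# (an illustrative choice, not a code from the talk).
G = [[1, 0, 0, 1, 1, 0, 1],
     [0, 1, 0, 1, 0, 1, 1],
     [0, 0, 1, 0, 1, 1, 1]]

def encode(msg):
    # multiply the message row vector by G over GF(2)
    return tuple(sum(m * row[j] for m, row in zip(msg, G)) % 2 for j in range(7))

codewords = [encode(m) for m in product((0, 1), repeat=3)]

def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

def list_decode(received, radius):
    # return every codeword within the given Hamming radius of the received word
    return [c for c in codewords if hamming(c, received) <= radius]

# Minimum distance 4 means unique decoding handles only 1 error.  With 2 errors
# the received word can lie within distance 2 of several codewords; the list
# decoder reports all of them instead of failing.
received = (1, 1, 0, 0, 1, 0, 1)   # encode((1, 0, 0)) with two bits flipped
print(list_decode(received, radius=2))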

 

