This course will use a Scheme programming environment to teach students how to solve a broad range of mathematical problems.

Introduction to analysis. Real numbers, limits, continuity, uniform continuity, sequences and series, compactness, convergence, Riemann integration. An in-depth treatment of the concepts underlying calculus.

An introduction to probability theory. Counting methods, conditional probability and independence. Discrete and continuous distributions, expected value, joint distributions and limit theorems. Prepares students for future work in probability and statistics.

Basics of statistical inference. Sampling distributions, parameter estimation, hypothesis testing, optimal estimates and tests. Maximum likelihood estimates and likelihood ratio tests. Data summary methods, categorical data analysis. Analysis of variance and introduction to linear regression.

An introduction to linear algebra emphasizing matrices and their applications. Gaussian elimination, determinants, vector spaces and linear transformations, orthogonality and projections, eigenvector problems, diagonalizability, Spectral Theorem, quadratic forms, applications. MATLAB is used as a computational tool.

An introduction to abstract algebra. Elementary number theory and congruences. Basic group theory: groups, subgroups, normality, quotient groups, permutation groups. Ring theory: polynomial rings, unique factorization domains, elementary ideal theory. Introduction to field theory.

Information theory traces its beginnings to the work of Claude Shannon in the late 1940s. It is concerned with the amount of information that can be transmitted over a medium that is subject to noise. For example, consider sending data in binary form (0s and 1s) over a line on which some 0s may be changed to 1s and vice versa during transmission. Shannon proved two fundamental theorems, called coding theorems, that were the beginnings of this theory. The more famous is the noisy channel coding theorem, which says that over any such medium there is a limit R to the rate at which information can be reliably transmitted: any rate less than R can be achieved, but transmission at any rate greater than R necessarily incurs errors that cannot be corrected. This work led to a wealth of further research in the area, including coding theory.

In this course, we'll cover Shannon's basic theorems and look at some related coding theorems. This will require some work with random variables and in particular, an analysis of the entropy function. This function is fundamental to defining the capacity of a transmission channel.
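As a small illustration (a standard example, not drawn from the course materials themselves), the binary entropy function determines the capacity of a binary symmetric channel that flips each bit with probability p; a sketch in Python:

```python
import math

def binary_entropy(p):
    """Entropy H(p), in bits, of a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity, in bits per channel use, of a binary symmetric
    channel with crossover probability p: C = 1 - H(p)."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))  # 1.0 -- noiseless channel, one bit per use
print(bsc_capacity(0.5))  # 0.0 -- pure noise, nothing gets through
```

The capacity falls from 1 bit per use at p = 0 to 0 at p = 1/2, which is the limit R that Shannon's noisy channel coding theorem refers to for this channel.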

The main reference will be the course text, Elements of Information Theory by Cover and Thomas. We'll cover the first five chapters, as well as chapter 7, and other topics as time allows. This includes entropy and related concepts (relative entropy, mutual information, etc.), equipartitions and the asymptotic equipartition property, entropy rates of stochastic processes, codes and data compression, and channel capacity. Possible further topics include Kolmogorov complexity, the Gaussian channel, and the maximum entropy principle. There are numerous other topics in the text, and, as with most upper-division courses I have taught recently, I will ask each student to prepare a project, to be presented to the class and written up, in lieu of a final exam.

This course is an introduction to conducting mathematical experiments by computer. Students will learn to program in Mathematica and use web tools such as the Encyclopedia of Integer Sequences to experimentally discover theorems in number theory, combinatorics, dynamical systems, and other areas.

Basics of combinatorics with emphasis on problem solving. Probability, pigeonhole principle, mathematical induction. Counting techniques, generating functions, recurrence relations, Polya's counting formula, Ramsey's theorem.
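Recurrence relations turn many counting problems into short computations. As one illustration (my own example, not part of the syllabus): the number of binary strings of length n with no two consecutive 1s satisfies a(n) = a(n-1) + a(n-2), a shifted Fibonacci sequence.

```python
def no_consecutive_ones(n):
    """Count binary strings of length n with no two adjacent 1s.
    Recurrence: a(n) = a(n-1) + a(n-2), since a valid string ends
    in '0' (append to any valid string of length n-1) or in '01'
    (append to any valid string of length n-2)."""
    a, b = 1, 2  # a(0) = 1 (the empty string), a(1) = 2 ('0' and '1')
    for _ in range(n):
        a, b = b, a + b
    return a

print([no_consecutive_ones(n) for n in range(6)])  # [1, 2, 3, 5, 8, 13]
```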

Introduction to the theory of computation: Formal languages, finite automata and regular languages, deterministic and nondeterministic computation, context-free grammars, languages, and pushdown automata. Turing machines, undecidable problems, recursion theorem, computational complexity, NP-completeness.
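As a minimal illustration of the finite-automata portion (the specific machine is a hypothetical example, not from the course outline), a deterministic finite automaton is just a transition table plus an accepting-state check. The machine below accepts binary strings containing an even number of 1s:

```python
# DFA over the alphabet {'0', '1'} with states 'even' and 'odd',
# tracking the parity of the number of 1s seen so far.
# Start state and sole accepting state: 'even'.
TRANSITIONS = {
    ('even', '0'): 'even', ('even', '1'): 'odd',
    ('odd', '0'): 'odd',   ('odd', '1'): 'even',
}

def accepts(s):
    """Run the DFA on string s; accept iff it halts in state 'even'."""
    state = 'even'
    for ch in s:
        state = TRANSITIONS[(state, ch)]
    return state == 'even'

print(accepts("1010"))  # True  -- two 1s
print(accepts("111"))   # False -- three 1s
```

Regular languages are exactly those recognizable by such finite tables; pushdown automata and Turing machines generalize this picture with a stack and an unbounded tape, respectively.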

A study of important algorithms (including searching and sorting, graph/network algorithms, and algorithms in number theory) and algorithm design techniques (including greedy, recursive, and probabilistic algorithms). Covers the analysis of algorithms (including worst- and average-case analysis) and discussion of complexity classes for decision and enumeration problems (including P, NP, #P, PSPACE).

This introduction to information theory will address fundamental concepts, such as information, entropy, relative entropy, and mutual information. In addition to giving precise definitions of these concepts, the course will include a probabilistic approach based on equipartitions. Many of the applications of information will be discussed, including Shannon's basic theorems on channel capacity and related coding theorems. In addition to channels and channel capacity, the course will discuss applications of information theory to mathematics, statistics, and computer science.

Errors. Curve fitting and function approximation, least squares approximation, orthogonal polynomials, trigonometric polynomial approximation. Direct methods for linear equations. Iterative methods for nonlinear equations and systems of nonlinear equations. Interpolation by polynomials and piecewise polynomials. Numerical integration. Single-step and multi-step methods for initial-value problems for ordinary differential equations, variable step size. Current algorithms and software.

**Course Goals:** to introduce approximation techniques useful in mathematics, science and engineering; to explain how, when and why they can be expected to work; and to provide a foundation for further study of numerical analysis and scientific computing.

**Learning Outcomes:** Students will be able to understand and use: methods to find roots of nonlinear functions, interpolation techniques to fit data, methods for approximating derivatives, methods for approximating integrals, various methods for solving differential equations, methods for solving linear systems and computing properties of matrices.
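As a sketch of the first outcome (finding roots of nonlinear functions), Newton's method is the canonical technique; the code below is an illustration under my own choice of tolerance and starting point, not the course's prescribed algorithm or software.

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton's method: iterate x <- x - f(x)/f'(x) until |f(x)| < tol."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        x -= fx / fprime(x)
    raise RuntimeError("Newton's method did not converge")

# sqrt(2) as the positive root of f(x) = x^2 - 2, starting from x0 = 1
root = newton(lambda x: x * x - 2, lambda x: 2 * x, 1.0)
print(root)  # close to sqrt(2) = 1.41421356...
```

Near a simple root, each iteration roughly doubles the number of correct digits (quadratic convergence), which is why a handful of steps suffices here.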

The subject of number theory is one of the oldest in mathematics. The course will cover some basic material and describe interesting applications. A recurrent theme is that mathematics developed for its own sake often finds applications in unexpected problems. Topics covered include Pythagorean triples, prime numbers, divisibility and the greatest common divisor, linear diophantine equations, congruences, round-robin tournaments and perpetual calendars, multiplicative functions, perfect numbers, primitive roots, pseudo-random numbers, decimal fractions and continued fractions, and quadratic reciprocity.
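For example, the extended Euclidean algorithm computes the greatest common divisor of a and b along with integers x, y satisfying ax + by = gcd(a, b), which is exactly what is needed to solve linear diophantine equations and to invert residues modulo n. A sketch (illustrative, not course material):

```python
def extended_gcd(a, b):
    """Return (g, x, y) with g = gcd(a, b) and a*x + b*y = g."""
    old_r, r = a, b
    old_x, x = 1, 0
    old_y, y = 0, 1
    while r != 0:
        q = old_r // r
        old_r, r = r, old_r - q * r  # ordinary Euclidean step
        old_x, x = x, old_x - q * x  # carry the coefficients along
        old_y, y = y, old_y - q * y
    return old_r, old_x, old_y

g, x, y = extended_gcd(240, 46)
print(g)                  # 2
print(240 * x + 46 * y)   # 2 -- the Bezout identity checks out
```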

Under faculty guidance, students will select a topic in current mathematical research, write an expository article on that topic, and give an oral presentation. This seminar is required of all mathematics majors who are not doing an Honors Project within the department. Completion of 3980 and 3990 fulfills the college writing requirement.

Mathematics Department, 424 Gibson Hall, New Orleans, LA 70118 504-865-5727 math@math.tulane.edu