18.303: Linear Partial Differential Equations: Analysis and Numerics

Spring 2021, Dr. Vili Heinonen, Dept. of Mathematics.

Overview

This is the home page for the 18.303 course at MIT in Spring 2021, where the syllabus, lecture materials, problem sets, and other miscellanea are posted.

Course description

Provides students with the basic analytical and computational tools of linear partial differential equations (PDEs) for practical applications in science and engineering, including heat/diffusion, wave, and Poisson equations. The analysis emphasizes the viewpoint of linear algebra and the analogy with finite matrix problems, and studies operator adjoints and eigenproblems, series solutions, Green's functions, and separation of variables. Numerics focus on finite-difference and finite-element techniques to reduce PDEs to matrix problems, including stability and convergence analysis and implicit/explicit timestepping. Julia (a Matlab-like environment) is introduced and used in the homework for simple examples.

Prerequisite: linear algebra (18.06, 18.700, or equivalent).

Syllabus

Lectures: TR 9:30-11am https://mit.zoom.us/j/92763506665. Office Hours: Wednesday 9-10am https://mit.zoom.us/j/92763506665.

Grading: 50% homework, 15% midterm, 35% final project (due the last day of class). Problem sets are due in class on the due date, and the lowest problem set score will be dropped at the end of the term. A missed midterm requires a letter from Student Support Services or Student Disabilities Services to justify accommodations; legitimate excuses include sports, professional obligations, or illness. In the event of a justified absence, an alternative make-up project will be assigned.

Collaboration policy: Make an effort to solve the problem on your own before discussing with any classmates. When collaborating, write up the solution on your own and acknowledge your collaborators.

Books: Introduction to Partial Differential Equations by Olver.

Final project: There is a final project instead of a final exam. In your project, you should consider a PDE or possibly a numerical method not treated in class, and write a 5–10 page academic-style paper that includes:

  • Review: why is this PDE/method important, what is its history, and what are the important publications and references? (A comprehensive bibliography is expected: not just the sources you happened to consult, but a complete set of sources you would recommend that a reader consult to get a fuller picture.)
  • Analysis: what are the important general analytical properties? For example, conservation laws, algebraic structure, and the nature of solutions (oscillatory, decaying, etcetera). Include an analytical solution of a simple problem.
  • Numerics: what numerical method do you use, and what are its convergence properties (and stability, for timestepping)? Implement the method (e.g. in Julia) and demonstrate results for some test problems. Validate your solution, i.e. show that it converges in some known case (a minimal convergence-check sketch follows this list).
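
For concreteness, here is a rough sketch of the kind of convergence check the Numerics item asks for. The helper names (`fd_solve`, `uexact`) are illustrative and not from the course materials: solve a problem with a known exact solution at several resolutions and estimate the convergence order from the ratio of successive errors.

```julia
# Sketch of a convergence check: solve -u'' = f on [0,1] with zero Dirichlet
# boundaries by second-order finite differences and refine the grid.
using SparseArrays

function fd_solve(f, N)                      # N interior points, h = 1/(N+1)
    h = 1 / (N + 1)
    x = [i * h for i in 1:N]
    A = spdiagm(-1 => fill(-1.0, N - 1), 0 => fill(2.0, N), 1 => fill(-1.0, N - 1)) / h^2
    return x, A \ f.(x)
end

uexact(x) = sin(π * x)                       # known exact solution
f(x) = π^2 * sin(π * x)                      # corresponding right-hand side

errors = Float64[]
for N in (49, 99, 199, 399)                  # h halves at each step
    x, u = fd_solve(f, N)
    push!(errors, maximum(abs.(u .- uexact.(x))))
end
@show errors
@show log2.(errors[1:end-1] ./ errors[2:end])   # ≈ 2 for a second-order method
```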

You must submit a one-page proposal of your intended final-project topic, summarizing what you plan to do. Some suggestions of possible projects will be given before the proposal is due.

Tentative Schedule

  • Why PDEs are interesting.
  • The Fourier series and eigenfunction expansions for the Poisson equation
  • Optional: Julia Tutorial
  • Spectral methods for numerically solving PDEs
  • Finite difference discretizations
  • Properties of Hermitian operators
  • (Semilinear) Heat Equation
  • Basic time stepping methods
  • Method of Lines (MOL) Solutions
  • Lax equivalence, stability, Von Neumann Analysis
  • Higher dimensional PDEs
  • Generalized boundary conditions
  • Separation of Variables
  • Wave Equation
  • Traveling waves and D'Alembert's solution
  • Numerical Dispersion
  • Sturm-Liouville Operators
  • Distributions
  • Green's Functions
  • Weak form and Galerkin expansions
  • Finite Element Methods

Lecture Summary

Vector spaces and linear operators

Lecture 1 | Sine series (Julia)

During the first week we covered basic properties of vector spaces and linear operators, including norms and inner products. We defined linear operators and their adjoints, and went through some examples involving smooth functions on the interval [0,1]. We used these tools and notions to solve the Poisson equation with Dirichlet boundary conditions on this interval. We also talked about bases for vector spaces, introduced the notions of orthogonal and orthonormal bases, and covered the basic properties of the Fourier transform on a finite interval.
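
As a small illustration of the eigenfunction-expansion idea, here is a sketch (not the course's Julia notebook; the helper names are illustrative and QuadGK.jl is assumed for the integrals) of solving -u'' = f on [0,1] with u(0) = u(1) = 0 by expanding f in the sine basis and dividing each coefficient by the eigenvalue (nπ)².

```julia
using QuadGK                                  # numerical quadrature, assumed installed

# Sine-series coefficients b_n = 2 ∫₀¹ f(x) sin(nπx) dx
sine_coefficients(f, N) = [2 * quadgk(x -> f(x) * sin(n * π * x), 0, 1)[1] for n in 1:N]

# -d²/dx² sin(nπx) = (nπ)² sin(nπx), so each coefficient is divided by (nπ)²
function poisson_sine_solution(f, N)
    b = sine_coefficients(f, N)
    return x -> sum(b[n] / (n * π)^2 * sin(n * π * x) for n in 1:N)
end

# Example: f(x) = 1 has the exact solution u(x) = x(1 - x)/2
u = poisson_sine_solution(x -> 1.0, 100)
@show u(0.5)                                  # ≈ 0.125
```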

Finite differences

Lecture 2 | Finite differences (Julia)

We covered the basic idea of discretizing functions and writing down finite-difference approximations of differential operators. We introduced backward, forward, and centered difference methods and used them to write a simple discretization of the Laplacian. We talked about matrix representations of the difference operators and the importance of boundary conditions, and briefly discussed how invertibility of these matrices corresponds to the linear system having a unique solution, using the Poisson equation as an example. We also covered deriving finite-difference operators by polynomial fitting.
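
A minimal sketch of this matrix viewpoint (illustrative code, not the course notebook): the centered second-difference approximation of -d²/dx² on [0,1] with zero Dirichlet boundaries is a tridiagonal matrix, and solving the Poisson equation becomes a linear solve.

```julia
using SparseArrays

# -d²/dx² on N interior points of [0,1] with u(0) = u(1) = 0, spacing h = 1/(N+1)
function laplacian_1d(N)
    h = 1 / (N + 1)
    return spdiagm(-1 => fill(-1.0, N - 1), 0 => fill(2.0, N), 1 => fill(-1.0, N - 1)) / h^2
end

N = 100
A = laplacian_1d(N)
x = [i / (N + 1) for i in 1:N]                # interior grid points
u = A \ (π^2 .* sin.(π .* x))                 # solve -u'' = π² sin(πx), exact solution sin(πx)
@show maximum(abs.(u .- sin.(π .* x)))        # error is O(h²), here ≈ 1e-4
```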

Heat and wave equations

Lecture 3 | Pset 1

We showed that the Laplacian operator is self-adjoint with Dirichlet boundaries. We introduced the notion of positive and negative (semi)definite operators. We talked about the superposition principle and used it to solve the heat equation and the wave equation with Dirichlet boundaries. An important theme of this lecture was the ability to separate partial differential equations in sufficiently symmetric domains.
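
As a sketch of superposition and separation in action (illustrative code, not from the course materials; QuadGK.jl is assumed for the sine coefficients), the heat equation u_t = u_xx on [0,1] with zero Dirichlet boundaries is solved by expanding the initial condition in sine modes, each of which simply decays at rate (nπ)².

```julia
using QuadGK                                  # numerical quadrature, assumed installed

sine_coefficients(u0, N) = [2 * quadgk(x -> u0(x) * sin(n * π * x), 0, 1)[1] for n in 1:N]

# u(x,t) = Σ bₙ exp(-(nπ)² t) sin(nπx), with bₙ the sine coefficients of u(x,0)
function heat_solution(u0, N)
    b = sine_coefficients(u0, N)
    return (x, t) -> sum(b[n] * exp(-(n * π)^2 * t) * sin(n * π * x) for n in 1:N)
end

u = heat_solution(x -> x * (1 - x), 50)       # initial condition u(x,0) = x(1-x)
@show u(0.5, 0.0)                             # ≈ 0.25, the initial midpoint value
@show u(0.5, 0.1)                             # higher modes have already died off
```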

Boundary conditions

Lecture 4

We discussed some general properties of different boundary conditions for partial differential equations. We showed that the general solution is the solution to the homogeneous problem with the desired boundary condition plus the solution to the inhomogeneous problem with zero boundaries. We revisited boundary conditions within the framework of finite-difference approximations and saw how the uniqueness of the solution to a linear PDE, determined by the boundary conditions, is manifested in the finite-difference approximation.
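
The finite-difference side of this can be sketched as follows (illustrative code under assumptions, not the course notebook): only the interior values are unknowns, and the known boundary values u(0) = α, u(1) = β move to the right-hand side of the first and last equations.

```julia
using SparseArrays

# -u'' = f on [0,1] with u(0) = α and u(1) = β, discretized on N interior points
function solve_poisson_dirichlet(f, α, β, N)
    h = 1 / (N + 1)
    x = [i * h for i in 1:N]
    A = spdiagm(-1 => fill(-1.0, N - 1), 0 => fill(2.0, N), 1 => fill(-1.0, N - 1)) / h^2
    rhs = f.(x)
    rhs[1] += α / h^2                         # known boundary values enter the RHS
    rhs[end] += β / h^2
    return x, A \ rhs
end

# -u'' = 0 with u(0) = 1, u(1) = 2 has the exact linear solution u(x) = 1 + x
x, u = solve_poisson_dirichlet(x -> 0.0, 1.0, 2.0, 50)
@show maximum(abs.(u .- (1 .+ x)))            # exact up to roundoff
```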

Problems in higher dimensions

Lecture 5

We talked about how some of the ideas we used earlier extend to problems in higher dimensions. The main method here was separation of variables, a powerful technique for solving PDEs when the problem is sufficiently symmetric. This is especially true for time-dependent problems: time is usually independent of the spatial dimensions, so once the spatial part is solved, obtaining the time evolution is often straightforward. We also discussed calculating Fourier coefficients in greater detail and defined the finite Fourier transform as a linear map from one vector space to another.
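
A small sketch of that "linear map" point (illustrative code, not the course notebook): on N interior grid points the discrete sine transform is just an N×N matrix, its columns are exactly the eigenvectors of the Dirichlet difference Laplacian, and in two dimensions the same structure appears through Kronecker sums, which is separation of variables in matrix form.

```julia
using LinearAlgebra, SparseArrays

N = 8
h = 1 / (N + 1)
S = [sin(j * k * π * h) for j in 1:N, k in 1:N]   # discrete sine transform as an N×N matrix
A = spdiagm(-1 => fill(-1.0, N - 1), 0 => fill(2.0, N), 1 => fill(-1.0, N - 1)) / h^2

# column k of S is an eigenvector of A with eigenvalue (2 - 2cos(kπh))/h²
λ = [(2 - 2 * cos(k * π * h)) / h^2 for k in 1:N]
@show norm(A * S - S * Diagonal(λ))               # ≈ 0 up to roundoff

# separation of variables in 2D: the Laplacian on an N×N grid is a Kronecker sum,
# so its eigenvalues are the pairwise sums λⱼ + λₖ
A2 = kron(sparse(I, N, N), A) + kron(A, sparse(I, N, N))
@show eigvals(Symmetric(Matrix(A2)))[1] ≈ 2 * λ[1]   # smallest 2D eigenvalue is λ₁ + λ₁
```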
