Stochastic Processes
Revision as of 07:56, 4 April 2019
This is the main page of an undergraduate-level course in stochastic processes targeted at engineering students (mainly computer engineering and its interface with mechanical engineering), taught in the first semester of 2019 at the Polytechnic Institute IPRJ/UERJ.
- 1 General Info
- 2 Approximate Content
- 3 Main Resources
- 4 Homework
- 5 Exams
- 6 Evaluation criteria
- 7 Awesome Links
- 8 Keywords
== General Info ==
- Instructor: prof. Ricardo Fabbri, Ph.D. Brown University
- Meeting times: Tuesdays 1:20pm-3:10pm and Thursdays 1:20pm-3:10pm, room 205
- Forum for file exchange and discussion: Google Groups mailing list: firstname.lastname@example.org
- Chat: IRC #labmacambira for random chat
- Undergraduate-level mathematics and probability (will review as needed)
- Desirable: Intermediate programming experience with any numerics scripting language such as Scilab, Python, R or Matlab. Knowing at least one of them will help you learn any new language needed in the course.
The R programming language and data exploration environment will be used for learning, with others used occasionally. Students can also choose to do their homework in Python, Scilab, Matlab or similar languages. The R language has received growing attention, especially in the past couple of years, but it is simple enough that students can adapt the code to their preferred language. Students are expected to learn any of these languages on their own as needed, by doing tutorials and asking questions.
This year's course will focus on a modern approach bridging theory and practice. As engineers and scientists, you should not learn theory here without also considering broader applications. Recent applications in artificial intelligence, machine learning, robotics, autonomous driving, materials science and other topics will be considered. These applications are often too hard to tackle fully at the level of this course, but contact with them will help motivate the abstract theory. We will try to focus on key concepts and on more realistic applications than those of most courses (whose examples date from the 1900s), which will prompt us to elaborate the theory.
== Main Resources ==
Introduction to Stochastic Processes with R, Robert Dobrow, 2016 (5 stars on Amazon)
=== Additional books used in the course ===
Learning stochastic processes will require additional books, including more traditional ones:
- Markov Chains: Gibbs Fields, Monte Carlo Simulation and Queues, Pierre Bremaud
- An Introduction to Stochastic Modeling, Taylor & Karlin
- Pattern Theory: The Stochastic Analysis of Real-World Signals, David Mumford and Agnes Desolneux - the first chapters already cover many types of stochastic processes in text, signal and image AI
- My own machine learning and computational modeling book draft, co-written with prof. Francisco Duarte Moura Neto and focused on diffusion processes on graphs like PageRank. There is a probability chapter which is the basis for this course. We have many copies at IPRJ's library.
=== Other books to look at ===
=== Basic probability and statistics ===
- I recommend you review from the books above; they all include a review of the basics. But you may also want to see:
- Elementary Statistics, Mario Triola (passed down to me by a great scientist and statistician)
- Pattern Theory: From Representation to Inference, Ulf Grenander
== Approximate Content ==
Lectures roughly follow the sequence of our main book, with additional material where appropriate. Necessary background will be reviewed along the way; advanced material will be covered in part.
- Overview & Course Logistics (10May18)
- Intro, overview of main processes and quick review
- Markov Chains
- Branching Processes
- Markov Chain Monte Carlo: MCMC
- Poisson Process
- Queue theory
- Brownian Motion
- Stochastic Calculus
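The first of the core topics above lends itself to quick experiments. As a taste, here is a minimal sketch of a Markov chain simulation in Python (one of the accepted homework languages); the two-state transition matrix is made up for illustration and is not from the book:

```python
import random

# Hypothetical two-state Markov chain (states 0 and 1); P[i][j] is the
# probability of moving from state i to state j (rows sum to 1).
P = [[0.9, 0.1],
     [0.5, 0.5]]

def simulate(steps, start=0, seed=42):
    """Run the chain and return the fraction of time spent in each state."""
    random.seed(seed)
    state, counts = start, [0, 0]
    for _ in range(steps):
        counts[state] += 1
        state = 0 if random.random() < P[state][0] else 1
    return [c / steps for c in counts]

freq = simulate(100_000)
# The empirical frequencies approach the stationary distribution
# pi = (5/6, 1/6), obtained by solving pi P = pi.
print(freq)
```

Comparing the long-run empirical frequencies against the stationary distribution solved by hand is a good habit for all the chain exercises in the course.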
== Homework ==
- All homework can be done in any language. Most assignments are given in R, or in Scilab/Matlab and Python.
- If you do the homework in two different languages, you get double the homework grade (a 100% bonus).
- Late homework will be accepted but penalized at the professor's discretion according to how late it is.
- Ex 1.1 of the main book
- Due date: 9Apr19
- Exercise 1 of Ch. 1 of the Pattern Theory book by Mumford & Desolneux: Simulating Discrete Random Variables (pp. 51-53)
- Type your solutions in LaTeX
- Due date: 11Apr19
- No need to read this book for this exercise. You will review discrete random variables in the context of Markov chains for natural language processing; this is basic for most AI bots nowadays. If you're curious about the applications, you can read the book chapter just for fun.
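The heart of that exercise, simulating a discrete random variable, is typically done with the inverse-CDF (cumulative-sum) method. A minimal sketch in Python, using a made-up distribution rather than the one in the book:

```python
import random

def sample_discrete(probs, u):
    """Inverse-CDF sampling: return the index k whose cumulative-probability
    slot contains the uniform draw u in [0, 1)."""
    cum = 0.0
    for k, p in enumerate(probs):
        cum += p
        if u < cum:
            return k
    return len(probs) - 1  # guard against floating-point round-off

# Made-up distribution over three outcomes (not the one in the book).
probs = [0.2, 0.5, 0.3]
random.seed(0)
draws = [sample_discrete(probs, random.random()) for _ in range(50_000)]
freqs = [draws.count(k) / len(draws) for k in range(3)]
print(freqs)  # close to [0.2, 0.5, 0.3]
```

The same cumulative-sum idea is what R's `sample()` with a `prob` argument does under the hood, so solutions in either language follow the same logic.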
Assignment 2: Exercise list for chapter 1
7 Exercises: 1.3, 1.5, 1.6, 1.7, 1.9, 1.10, 1.19
- Due date: 18Apr19
Assignment 3: Exercise list for chapter 2
11 exercises: 2.1, 2.4, 2.6, 2.8, 2.9, 2.10, 2.12, 2.14, 2.15, 2.18; computer exercise: 2.26
- Due date: 7May19
Assignment 4: Exercise list for chapter 3
Exercises: 3.2, 3.5a-c, 3.10a-d, 3.16i-iv, 3.25a-b, 3.37, 3.58
- Due dates: 3.2-3.16 (4 exercises): one week before P2 (hand in personally at the start of class); 3.25-3.58 (3 exercises): at the start of P2
Assignment 5 (EXTRA): Exercise list for chapter 6
2 Exercises: 6.3, 6.4
- Due date: at the start of P2
== Exams ==
- '''P1 (easy):''' 16May19
- '''P2 (harder):''' 27Jun19
== Evaluation criteria ==
- M_p = (P1 + P2)/2
- M = 0.7*M_p + 0.3*T, where T is the homework grade
- If M >= 5, pass with final grade M
- P_f: final exam, must be taken by whoever gets exam average M_p < 5
- M_f = 0.5*(M + P_f)
- If M_f >= 5, pass with final grade M_f
- Sub: replaces the lowest of P1, P2, P_f (only for those who miss an exam or want to improve their score; whoever hands it in will have the grade substituted accordingly)
- M_sub = final grade with Sub
- If M_sub >= 5, pass with final grade M_sub
- For simplicity, the Sub and the final exam P_f will be considered the same exam; whoever takes it as Sub will have the grade substituted independently of the result.
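As a sanity check of the grading rules, here is a short script; the scores are hypothetical, and it assumes the final exam P_f is triggered when M < 5 (the original wording mixes M and M_p at that step):

```python
def final_grade(P1, P2, T, P_f=None):
    """Course grading rules: M = 0.7*M_p + 0.3*T with M_p = (P1+P2)/2;
    pass with M if M >= 5, otherwise average M with the final exam P_f."""
    M_p = (P1 + P2) / 2
    M = 0.7 * M_p + 0.3 * T
    if M >= 5:
        return M
    if P_f is None:
        raise ValueError("final exam P_f required when M < 5")
    return 0.5 * (M + P_f)

# Hypothetical student: exam scores 6 and 4, homework average 8.
print(final_grade(6, 4, 8))  # 0.7*5 + 0.3*8 = 5.9 -> pass
```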
== Awesome Links ==
- Course on robotic path planning with applications of stochastic processes: https://natanaso.github.io/ece276b/schedule.html
- Paper: Markovian robots: minimal navigation strategies for active particles, Arxiv 2017
- Paper: Stochastic processes in vision: from Langevin to Beltrami https://ieeexplore.ieee.org/document/937531/
- Cool applications: https://math.stackexchange.com/questions/1543211/which-research-groups-use-stochastic-processes-and-or-stochastic-differential-eq
- Paper: Building Blocks for Computer Vision with Stochastic Partial Differential Equations https://link.springer.com/article/10.1007/s11263-008-0145-5
- Paper: Variational Bayesian Multiple Instance Learning with Gaussian Processes, CVPR 2017
- Paper: Correlational Gaussian Processes for Cross-domain Visual Recognition, CVPR 2017
=== Neural Nets and Stochastic Processes ===
- Generative Models for Stochastic Processes Using Convolutional Neural Networks, arxiv, by Brazilian researchers (USP)
- Bayesian SegNet: Model Uncertainty in Deep Convolutional Encoder-Decoder Architectures for Scene Understanding, Cipolla et al., arxiv 2016
== Keywords ==
random fields, stochastic modeling, data science, queue theory, machine learning, Poisson process, Markov chains, Gaussian processes, Bernoulli processes, soft computing, random process, Brownian motion, robot path planning, artificial intelligence, simulation, sampling, pattern formation, signal processing, text processing, image processing, dimensionality reduction, diffusion, Markov Chain Monte Carlo (MCMC), tracking, branching process, stochastic calculus, SDEs