CSCI 4511/6511
Negation: ! (also written \(\neg\), “not”, etc.)
Conjunction: && (also written \(\land\), “and”, etc.)
Falsehood: It is possible to not know things¹
Commonly abbreviated “SAT”
First NP-complete problem
Satisfiable: \((X_0 \land X_1) \lor X_2\) (e.g., take \(X_2\) true)
Unsatisfiable: \(X_0 \land \neg X_0 \land X_1\)
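Satisfiability can be checked by brute force over all truth assignments — exponential in the number of variables, which is why SAT's NP-completeness matters. A minimal sketch (the helper name and formula encoding are my own, not from the slides):

```python
from itertools import product

def is_satisfiable(formula, num_vars):
    """Return True if some assignment of the boolean variables makes formula True."""
    return any(formula(*assignment)
               for assignment in product([False, True], repeat=num_vars))

# (X0 and X1) or X2  -- satisfiable, e.g. with X2 = True
print(is_satisfiable(lambda x0, x1, x2: (x0 and x1) or x2, 3))   # True

# X0 and (not X0) and X1  -- unsatisfiable: X0 and not-X0 conflict
print(is_satisfiable(lambda x0, x1: x0 and (not x0) and x1, 2))  # False
```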
\[P(x) = \lim_{n \to \infty} \frac{n_x}{n}\]
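The limiting-frequency definition above can be illustrated by simulation: the relative frequency \(n_x / n\) approaches \(P(x)\) as \(n\) grows. A sketch with a fair coin (the seed and sample size are my choices):

```python
import random

random.seed(0)
n = 100_000
# Count "heads": events with true probability 0.5
n_heads = sum(random.random() < 0.5 for _ in range(n))
p_hat = n_heads / n
print(p_hat)  # close to 0.5 for large n
```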
…if you thought this course was going to be about LLMs
Number of possible combinations of \(k\) items chosen from \(n\):
\[\binom{n}{k} = \frac{n!}{k!(n-k)!}\]
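The formula above can be evaluated directly, or via the standard library's `math.comb` (the parameter values are illustrative):

```python
from math import comb, factorial

n, k = 12, 5
# Direct formula from the slide: n! / (k! (n-k)!)
by_formula = factorial(n) // (factorial(k) * factorial(n - k))
print(by_formula, comb(n, k))  # both 792
```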
Expected value of a variable: \[E[X] = \sum_{i=0}^n x_i \cdot p(x_i)\]
Of a function of a variable:
\[E[g(x)] = \sum_{i=0}^n g(x_i) \cdot p(x_i) \neq g(E[X])\]
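A quick numeric check that \(E[g(X)] \neq g(E[X])\) in general, using \(g(x) = x^2\) on a fair six-sided die (the die is my example, not from the slides):

```python
# Fair die: outcomes 1..6, each with probability 1/6
xs = [1, 2, 3, 4, 5, 6]
p = 1 / 6

E_X = sum(x * p for x in xs)      # E[X] = 3.5
E_gX = sum(x**2 * p for x in xs)  # E[X^2] = 91/6 ≈ 15.167
g_EX = E_X**2                     # (E[X])^2 = 12.25
print(E_gX, g_EX)                 # not equal
```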
\[\text{Var}(X) = E[(X - E[X])^2]\]
\[\begin{align}\text{Var}(X) & = E[(X-\mu)^2]\\ & = \sum_x (x-\mu)^2 p(x) \\ & = \sum_x (x^2 - 2 x \mu + \mu^2) p(x) \\ & = \sum_x x^2 p(x) - 2 \mu \sum_x x p(x) + \mu^2 \sum_x p(x) \\ & = E[X^2] - 2 \mu \mu + \mu^2 \\ & = E[X^2] - E[X]^2 \end{align}\]
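The identity \(\text{Var}(X) = E[(X-\mu)^2] = E[X^2] - E[X]^2\) is easy to verify numerically on a small pmf (the pmf values are my own illustration):

```python
xs = [0, 1, 2]
ps = [0.2, 0.5, 0.3]

mu = sum(x * p for x, p in zip(xs, ps))                            # E[X] = 1.1
var_def = sum((x - mu) ** 2 * p for x, p in zip(xs, ps))           # E[(X - mu)^2]
var_identity = sum(x**2 * p for x, p in zip(xs, ps)) - mu**2       # E[X^2] - E[X]^2
print(var_def, var_identity)  # both 0.49
```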
[Plots: binomial pmfs with \(n = 12, \theta=0.2\) and \(n = 12, \theta=0.6\)]
scipy.stats
😎
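Assuming the \(n, \theta\) plots above are binomial pmfs, the values can be computed directly from the definition; `scipy.stats.binom` (as referenced on the slide) gives the same numbers. A standard-library-only sketch:

```python
from math import comb

def binom_pmf(k, n, theta):
    """P{X = k} for a Binomial(n, theta) random variable."""
    return comb(n, k) * theta**k * (1 - theta) ** (n - k)

# Parameters from the slide's plots
print(binom_pmf(3, 12, 0.2))  # ≈ 0.2362
# scipy.stats.binom(12, 0.2).pmf(3) would return the same value.
```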
[Plots: pmfs with \(\theta = 0.4\) and \(\theta = 0.2\)]
[Plots: negative binomial pmfs with \(r = 3, \theta = 0.5\) and \(r = 2, \theta = 0.25\)]
\[P\{X=k\} = \frac{\lambda^k e^{-\lambda}}{k!}\]
[Plots: Poisson pmfs with \(\lambda = 5\) and \(\lambda = 2\)]
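The Poisson pmf above translates directly into code (parameter value taken from the slide's plots):

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """P{X = k} = lam^k e^(-lam) / k!  (the formula on the slide)."""
    return lam**k * exp(-lam) / factorial(k)

print(poisson_pmf(2, 2))  # ≈ 0.2707
```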
[Plots: uniform densities \(U(0,1)\) and \(U(0,5)\)]
[Plots: Gaussian densities with \(\mu = 1, \sigma^2 = 1\) and \(\mu = 3, \sigma^2 = 2\)]
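The Gaussian density is also easy to evaluate by hand — a sketch using the standard \(N(\mu, \sigma^2)\) formula, with parameters from the slide's plots:

```python
from math import exp, pi, sqrt

def normal_pdf(x, mu, sigma2):
    """Density of N(mu, sigma^2): exp(-(x-mu)^2 / (2 sigma^2)) / sqrt(2 pi sigma^2)."""
    return exp(-((x - mu) ** 2) / (2 * sigma2)) / sqrt(2 * pi * sigma2)

print(normal_pdf(1, mu=1, sigma2=1))  # peak height 1/sqrt(2*pi) ≈ 0.3989
print(normal_pdf(3, mu=3, sigma2=2))  # flatter peak: larger variance
```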
(Remarkably unsatisfying.)
Conditional probability:
\[P(x | y) = \frac{P(x, y)}{P(y)}\]
Bayes’ rule:
\[P(x | y) = \frac{P(y | x)P(x)}{P(y)} \]
Independence: \[P(x | y) = P(x) \Rightarrow P(x,y) = P(x) P(y)\]
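The three identities above can be exercised on one small joint distribution (the weather table is my own illustration, not from the slides):

```python
# Joint distribution P(x, y) over x in {rain, sun}, y in {wet, dry}
P = {('rain', 'wet'): 0.3, ('rain', 'dry'): 0.1,
     ('sun', 'wet'): 0.1, ('sun', 'dry'): 0.5}

def marginal_x(x):
    return sum(p for (xi, _), p in P.items() if xi == x)

def marginal_y(y):
    return sum(p for (_, yi), p in P.items() if yi == y)

# Conditional probability: P(x | y) = P(x, y) / P(y)
p_rain_given_wet = P[('rain', 'wet')] / marginal_y('wet')

# Bayes' rule recovers the same answer from P(y | x), P(x), P(y)
p_wet_given_rain = P[('rain', 'wet')] / marginal_x('rain')
bayes = p_wet_given_rain * marginal_x('rain') / marginal_y('wet')
print(p_rain_given_wet, bayes)  # both ≈ 0.75

# Independence would require P(x, y) = P(x) P(y); here it fails:
print(P[('rain', 'wet')], marginal_x('rain') * marginal_y('wet'))  # 0.3 vs ≈ 0.16
```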
Notice a theme?
Stuart J. Russell and Peter Norvig. Artificial Intelligence: A Modern Approach. 4th Edition, 2020.
Mykal Kochenderfer, Tim Wheeler, and Kyle Wray. Algorithms for Decision Making. 1st Edition, 2022.
Ross.
Stanford CS231
UC Berkeley CS188