Probability

Empirical Probability, Joint Probability, Conditional Probability, Geometric Probability, Mean and Variance, Binomial Distribution, Poisson Distribution, Hypergeometric Distribution, Exponential Distribution, Gamma Distribution, Bayes' Theorem

Useful Properties

General Formula:

$$ P(E) = \dfrac{\text{Number of favourable outcomes}}{\text{Total number of outcomes}} $$
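As a quick sketch in Python (the helper name and the die example are illustrative), the classical formula is just a ratio of counts; `Fraction` keeps the result exact:

```python
from fractions import Fraction

def classical_probability(favourable, total):
    """P(E) = number of favourable outcomes / total number of outcomes."""
    return Fraction(favourable, total)

# Probability of rolling an even number on a fair six-sided die:
p_even = classical_probability(3, 6)
print(p_even)  # 1/2
```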

General Form of the Addition Theorem of Probability

$$ P(A_{1} \cup A_{2} \cup \dots \cup A_{n}) = \sum_{i=1}^{n} P(A_{i}) - \sum_{i<j} P(A_{i} \cap A_{j}) + \sum_{i<j<k} P(A_{i} \cap A_{j} \cap A_{k}) - \dots + (-1)^{n-1} P(A_{1} \cap A_{2} \cap \dots \cap A_{n}) $$
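The alternating sum above can be checked numerically when events are sets of equally likely outcomes. A minimal sketch (function name and die example are illustrative):

```python
from itertools import combinations
from fractions import Fraction

def union_probability(events, sample_size):
    """P(A1 ∪ ... ∪ An) by inclusion-exclusion over sets of outcomes."""
    total = Fraction(0)
    for r in range(1, len(events) + 1):
        sign = (-1) ** (r - 1)  # +, -, +, ... alternating by subset size
        for subset in combinations(events, r):
            inter = set.intersection(*map(set, subset))
            total += sign * Fraction(len(inter), sample_size)
    return total

# Die roll: A = even numbers {2,4,6}, B = greater than 3 {4,5,6}
A, B = {2, 4, 6}, {4, 5, 6}
print(union_probability([A, B], 6))  # 2/3, since A ∪ B = {2,4,5,6}
```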

Conditional Probability

$$ P(E_{1} \mid E_{2}) = \dfrac{P(E_{1} \cap E_{2})}{P(E_{2})} = \dfrac{n(E_{1} \cap E_{2})}{n(E_{2})} $$

Empirical Probability

$$ P(E) = \dfrac{\text{Number of times the event occurs}}{\text{Total number of times the experiment is performed}} $$ $$ P(E) = \dfrac{f}{n} $$
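The ratio $f/n$ comes straight from observed data. A minimal sketch in Python (the coin-toss data is made up for illustration):

```python
from collections import Counter

def empirical_probability(outcomes, event):
    """P(E) = f / n: observed frequency of the event over total trials."""
    counts = Counter(outcomes)
    return counts[event] / len(outcomes)

tosses = ["H", "T", "H", "H", "T", "H", "T", "H"]
print(empirical_probability(tosses, "H"))  # 0.625 (5 heads out of 8 tosses)
```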


Joint Probability

The probability that events A and B occur together:

$$ P(A \cap B) $$


Independent and Dependent Probability

Dependent Probability

$$ P(A \text{ and } B) = P(A) \times P(B \mid A) $$

Independent Probability

$$ P(A \text{ and } B) = P(A) \times P(B) $$
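The two multiplication rules differ only in whether the second factor is conditioned on the first event. A worked comparison in Python (the deck-of-cards scenario is illustrative):

```python
from fractions import Fraction

# Dependent events: drawing two aces from a 52-card deck WITHOUT replacement.
# P(A and B) = P(A) * P(B | A) — the second draw sees 3 aces in 51 cards.
p_two_aces = Fraction(4, 52) * Fraction(3, 51)
print(p_two_aces)            # 1/221

# Independent events: drawing two aces WITH replacement.
# P(A and B) = P(A) * P(B) — the second draw is unaffected by the first.
p_with_replacement = Fraction(4, 52) * Fraction(4, 52)
print(p_with_replacement)    # 1/169
```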


Binomial Distribution

PMF of Binomial Distribution

Any PMF satisfies:

$$ P_X(x) \geq 0 $$ $$ \sum_{x \in \text{range}(X)} P_X(x) = 1 $$

$$ P(X = r) = \dfrac{n!}{(n-r)! \, r!} \, p^{r} (1-p)^{n-r} $$ $$ \text{where} $$ $$ n = \text{number of trials} $$ $$ r = \text{number of successful events} $$ $$ p = \text{probability of success} $$ $$ 1-p = \text{probability of failure} $$
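The binomial coefficient and powers translate directly to code; `math.comb` computes $n!/((n-r)!\,r!)$. A minimal sketch (the coin example is illustrative):

```python
from math import comb

def binomial_pmf(n, r, p):
    """P(X = r) = C(n, r) * p^r * (1-p)^(n-r)."""
    return comb(n, r) * p**r * (1 - p)**(n - r)

# Probability of exactly 3 heads in 5 fair coin tosses:
print(binomial_pmf(5, 3, 0.5))  # 0.3125
```

Summing the PMF over all $r$ from 0 to $n$ gives 1, which is a handy sanity check.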


PMF of Poisson Distribution

$$ \text{P.M.F} = P(X = k) = \dfrac{\lambda^{k} e^{-\lambda}}{k!} $$ $$ \text{where } e = \text{Euler's number } (e = 2.71828\dots) $$ $$ k = \text{the number of occurrences } (k = 0, 1, 2, \dots) $$ $$ \lambda = \text{rate parameter such that } \lambda > 0 $$
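The formula maps directly onto `math` functions. A minimal sketch (the call-center example is illustrative):

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(X = k) = lambda^k * e^(-lambda) / k!"""
    return lam**k * exp(-lam) / factorial(k)

# Calls arrive at an average rate of 2 per minute;
# probability of exactly 3 calls in one minute:
print(round(poisson_pmf(3, 2.0), 4))  # 0.1804
```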


Hypergeometric Distribution

$$ C(n,r) = \dfrac{n!}{r! (n-r)!} $$

Hypergeometric Distribution PMF

$$ \text{P.M.F} = P(X = k) = \dfrac{\dbinom{K}{k} \dbinom{N-K}{n-k}}{\dbinom{N}{n}} $$ $$ \text{where } N = \text{population size} $$ $$ K = \text{number of successes in the population} $$ $$ n = \text{number of draws} $$ $$ k = \text{number of observed successes} $$
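Each binomial coefficient in the PMF is a `math.comb` call. A minimal sketch (the card-drawing example is illustrative):

```python
from math import comb

def hypergeometric_pmf(k, N, K, n):
    """P(X = k): k successes in n draws, without replacement,
    from a population of N items containing K successes."""
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

# Draw 5 cards from a 52-card deck containing 4 aces;
# probability of getting exactly 1 ace:
print(round(hypergeometric_pmf(1, 52, 4, 5), 4))  # 0.2995
```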


Exponential Distribution

The exponential distribution models the waiting time between events in a Poisson process. Because it is a continuous distribution, it has a probability density function rather than a PMF:

$$ \text{P.D.F} = f(x; \lambda) = \lambda e^{-\lambda x} \quad \text{for } x \geq 0 $$ $$ \text{P.D.F} = f(x; \lambda) = 0 \quad \text{for } x < 0 $$ $$ \text{where } \lambda = \text{rate parameter}, \; x = \text{random variable} $$
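A minimal sketch of the density, together with the corresponding CDF $1 - e^{-\lambda x}$, which is often what one actually needs (the lifetime example and rate value are illustrative):

```python
from math import exp

def exponential_pdf(x, lam):
    """f(x; lambda) = lambda * e^(-lambda * x) for x >= 0, else 0."""
    return lam * exp(-lam * x) if x >= 0 else 0.0

def exponential_cdf(x, lam):
    """P(X <= x) = 1 - e^(-lambda * x) for x >= 0, else 0."""
    return 1 - exp(-lam * x) if x >= 0 else 0.0

# Component lifetimes with failure rate lambda = 0.5 per hour;
# probability of failure within the first 2 hours:
print(round(exponential_cdf(2.0, 0.5), 4))  # 0.6321
```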


Gamma Distribution

Gamma Distribution Function

$$ \text{P.D.F} = f(x; \alpha, \beta) = \dfrac{\beta^{\alpha} x^{\alpha-1} e^{-\beta x}}{\Gamma(\alpha)} $$ $$ \text{where } \alpha = \text{shape parameter} $$ $$ x = \text{random variable} $$ $$ \beta = \text{rate parameter} $$ $$ \Gamma(x) = \text{gamma function} $$
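Python's `math.gamma` provides $\Gamma(\alpha)$ directly. A minimal sketch (parameter values are illustrative); note that with $\alpha = 1$ the density reduces to the exponential density $\beta e^{-\beta x}$:

```python
from math import gamma, exp

def gamma_pdf(x, alpha, beta):
    """f(x; alpha, beta) = beta^alpha * x^(alpha-1) * e^(-beta*x) / Gamma(alpha)."""
    if x <= 0:
        return 0.0
    return (beta**alpha) * x**(alpha - 1) * exp(-beta * x) / gamma(alpha)

# With alpha = 1 this equals the exponential density beta * e^(-beta * x):
print(gamma_pdf(2.0, 1.0, 0.5))  # 0.5 * e^(-1) ≈ 0.1839
```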


Bayes' Theorem

$$ P(A \mid B) = \dfrac{P(A \cap B)}{P(B)} $$ $$ \text{where } P(A \mid B) \text{ is the probability of event } A \text{ occurring given that event } B \text{ has already occurred} $$ $$ P(A \cap B) \text{ is the probability of both event } A \text{ and event } B \text{ occurring} $$ $$ P(B) \text{ is the probability of event } B $$

- Use the MakesMathEasy tool to solve Bayes' Theorem questions. You can find this tool under the Probability section.
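A worked example of the ratio $P(A \cap B)/P(B)$, where $P(B)$ is expanded by total probability (the defect-scanner scenario and its numbers are hypothetical):

```python
from fractions import Fraction

def bayes(p_a_and_b, p_b):
    """P(A | B) = P(A ∩ B) / P(B)."""
    return p_a_and_b / p_b

# Hypothetical scenario: 1% of items are defective (A); a scanner flags (B)
# 99% of defective items but also 5% of the 99% good ones.
p_a_and_b = Fraction(99, 100) * Fraction(1, 100)            # P(A ∩ B)
p_b = p_a_and_b + Fraction(5, 100) * Fraction(99, 100)      # P(B) by total probability
print(bayes(p_a_and_b, p_b))  # 1/6 — a flagged item is defective only 1 time in 6
```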
