5.4, due on October 27

The idea of a probability mass function was kind of difficult to grasp, because I didn't see how adding those terms up would be interesting. But then I realized in the example that the probabilities sum to 1, which is exactly what makes the function a valid distribution, and that if you instead weight each value of the random variable by its probability and add those up, you get a kind of weighted average value for the random variable (the expected value). The most interesting part to me is how summing Bernoulli random variables produces a binomial distribution, like when people first started realizing that the totals of successive dice rolls followed the normal curve.
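To convince myself of both facts, here is a quick sketch in Python (the function name and the choice of n = 10, p = 0.5 are just my own illustration): the binomial PMF built from n Bernoulli(p) trials sums to 1, and weighting each value by its probability gives the average n*p.

```python
from math import comb

def binomial_pmf(k, n, p):
    # P(X = k) when X counts successes in n independent Bernoulli(p) trials
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.5
pmf = [binomial_pmf(k, n, p) for k in range(n + 1)]

total = sum(pmf)                                # probabilities sum to 1
mean = sum(k * pmf[k] for k in range(n + 1))    # weighted average = n * p
print(total, mean)
```

Running this shows the total is 1 and the weighted average is 5, which matches n*p for ten fair coin flips.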

I'm sure these random variables are applicable throughout statistics and any kind of probabilistic modeling. And I can see how being able to apply functions to random variables and calculate the resulting probability mass function could be very useful for seeing how uncertain inputs affect the outputs of a system.
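As a small sketch of that last idea (the helper name and the die example are my own, not from the text): to get the PMF of Y = g(X), you collect the probability mass of every value of X that maps to the same output.

```python
from collections import defaultdict

pmf_X = {x: 1 / 6 for x in range(1, 7)}  # fair six-sided die

def pmf_of_function(pmf, g):
    # PMF of Y = g(X): add up the mass of each x that lands on the same y
    out = defaultdict(float)
    for x, p in pmf.items():
        out[g(x)] += p
    return dict(out)

pmf_Y = pmf_of_function(pmf_X, lambda x: x % 2)  # Y = parity of the roll
print(pmf_Y)
```

Here the three even faces and three odd faces each pool half the probability, so Y is a fair coin, a Bernoulli(1/2) variable built out of the die.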
