6.4, due on November 13

The explanation of what Bayesian statistics is was really well put. It makes sense not just to talk about expected values for the different parameters of a distribution but also about the entire distribution of each parameter. I can see how to do the calculations in the discrete case, but it would be nice to work through an example of how to do it in a continuous case.
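
Here is a quick toy sketch of what I mean by the discrete case (my own made-up Python example, not one from the book): if the unknown coin bias p can only take a few candidate values, the posterior is just the prior times the likelihood, renormalized.

# Toy discrete Bayesian update (my own example): the coin bias p is
# assumed to take only three possible values.
priors = {0.25: 1/3, 0.5: 1/3, 0.75: 1/3}   # uniform prior over the candidates

def likelihood(p, heads, tails):
    # probability of the observed flips for a given value of p
    return p**heads * (1 - p)**tails

heads, tails = 7, 3                          # pretend we saw 7 heads and 3 tails
unnormalized = {p: w * likelihood(p, heads, tails) for p, w in priors.items()}
total = sum(unnormalized.values())
posterior = {p: w / total for p, w in unnormalized.items()}
print(posterior)   # most of the probability shifts toward p = 0.75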

To me, it seems like in the worked example of the continuous case we don't choose the Beta distribution for any reason other than that it can represent lots of different assumptions about what the unknown parameter probably is. This contrasts with my understanding of the distributions from the previous chapters, where each one came with a nice example of something in the real world that was distributed that way. This time, it seems like we're breaking away from that and just using the p.m.f.s and p.d.f.s as we like.
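
Here's a quick sketch of the continuous update, assuming (as I think the book does) that the unknown parameter is a binomial success probability with a Beta prior: the posterior turns out to be another Beta, so the calculation reduces to arithmetic on the two shape parameters. The numbers here are my own, not from the book.

from scipy.stats import beta

# My own sketch, assuming the unknown parameter theta is a binomial
# success probability with a Beta(a, b) prior.
a, b = 2, 2              # a mild prior belief that theta is near 0.5
heads, tails = 7, 3      # hypothetical data

# Conjugacy: prior Beta(a, b) plus the data gives posterior Beta(a + heads, b + tails).
post = beta(a + heads, b + tails)
print(post.mean())           # posterior mean of theta
print(post.interval(0.95))   # a 95% credible interval for theta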
