Excellent video, you just saved my life.
Thank you very much for the explanations of non-informative prior and informative prior. Very helpful for my research.
What the bleep did he just say?
1:25 What does this mean? Prior = Beta(1.0, 1.0)
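For anyone else wondering: Beta(1.0, 1.0) is just the uniform distribution on [0, 1], so this prior says every value of the proportion is equally plausible before seeing any data. A minimal Python sketch (my own illustration, not from the video):

```python
from math import gamma

def beta_pdf(x, a, b):
    """Beta density: x^(a-1) * (1-x)^(b-1) / B(a, b)."""
    norm = gamma(a) * gamma(b) / gamma(a + b)
    return x ** (a - 1) * (1 - x) ** (b - 1) / norm

# Beta(1, 1) is flat: the density equals 1 everywhere on (0, 1)
vals = [beta_pdf(x / 10, 1.0, 1.0) for x in range(1, 10)]
print(vals)  # every entry is 1.0
```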
How do you calculate the odds ratio in Bayesian ordered logistic regression? Please tell me.
This video explained the concept sufficiently well.
Blue print on a blue background. Am I the only one who is having trouble reading this?
Thanks so much! That was a good explanation about sufficiency!
thanks
Which other book can we refer to for this topic?
Thank you for your kind help.
Thanks. Perhaps you could do another video, called part 0, with the building blocks for this part 1. An introduction, that is :)
The posterior is proportional to the likelihood × the prior, not equal to it.
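Right, and the normalizing constant is what turns the product into a proper distribution. A quick grid-approximation sketch (my own illustration, assuming a flat prior and a made-up 6 heads in 10 tosses):

```python
# Grid approximation: posterior is proportional to likelihood * prior,
# so we normalize at the end to get a proper distribution.
thetas = [i / 100 for i in range(1, 100)]     # candidate proportions
prior = [1.0] * len(thetas)                   # flat (uniform) prior
heads, n = 6, 10                              # hypothetical data
lik = [t ** heads * (1 - t) ** (n - heads) for t in thetas]
unnorm = [l * p for l, p in zip(lik, prior)]  # proportional only
Z = sum(unnorm)                               # normalizing constant
post = [u / Z for u in unnorm]                # now sums to 1
print(abs(sum(post) - 1.0) < 1e-9)            # True
```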
Why is the posterior narrower at 5:15?
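In case the creator doesn't reply: with a Beta prior and binomial data, each observation adds to the Beta's counts, and the Beta variance shrinks as a + b grows, so more data means a narrower posterior. A small check with my own numbers (not the video's):

```python
def beta_var(a, b):
    """Variance of a Beta(a, b) distribution."""
    return a * b / ((a + b) ** 2 * (a + b + 1))

prior_var = beta_var(1, 1)            # flat prior, variance 1/12
post_var = beta_var(1 + 50, 1 + 50)   # after 100 tosses with 50 heads
print(prior_var > post_var)           # True: the posterior is narrower
```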
Thank you. That was very clear and helpful.
so fucking fast..
At 1:40, shouldn't the area under the graph be equal to 1? What does the y-axis represent?
This is the best introduction to this that I've found online! Thanks!
I would like to share the details of the following book:
Bayesian Methodology: An overview with the help of R software
https://www.amazon.com/dp/B07QCHTR54 - E-book
https://www.amazon.com/dp/109293989X - Paperback
ISBN-13: 978-1092939898
International Journal of Statistics and Medical Informatics
www.ijsmi.com/book.php
One question I would have on this is: how can you be sure you are not biasing your result by using these informative priors? I believe the most conservative approach is indeed the uniform prior (equivalent to "I don't know anything, so everything is equally possible to me"), but when I start getting "clever" and choosing appropriate priors, I can't run a real hypothesis test, because I have already told the coin to be 50:50 (while someone could potentially have given me a magic coin that is 10:90).
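One way to see the trade-off is to compare posteriors under a flat prior and a strongly informative one; with a conjugate Beta-Binomial update this is a one-liner. A sketch with made-up data (10 heads, 90 tails, i.e. your 10:90 magic coin):

```python
def posterior_mean(a, b, heads, tails):
    """Mean of Beta(a + heads, b + tails), the conjugate posterior
    for a binomial likelihood with a Beta(a, b) prior."""
    return (a + heads) / (a + b + heads + tails)

flat = posterior_mean(1, 1, 10, 90)           # ~0.108, close to the data
informative = posterior_mean(50, 50, 10, 90)  # 0.30, dragged toward 0.5
print(flat, informative)
```

The informative Beta(50, 50) prior pulls the magic coin's posterior mean a long way toward 0.5, which is exactly the bias you are worried about; with enough data, though, the likelihood eventually dominates either prior.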
Maybe the video creator intended to explain Bayesian statistics, but did not.
The concepts start to be explained, then there is a stepwise jump into mentioning prior and posterior probability, with the introduction of on screen equations but no further explanations - it's like it was read out of a technical manual that only 'insiders' know about. This then quickly turns into how to use the software/which buttons to press, which seems applicable to those who already know about Bayes and want to use the software - and not for those who want an introduction.
So I'm sorry to say this video was not useful to introduce Bayesian statistics and I would recommend giving it a miss.
Hi, thanks for the video. What I wonder is: what are "default priors" when it comes to Bayesian inference? As I understand it, priors are specific to each hypothesis or dataset, so how come some packages include these defaults? What do these priors entail?
I have the same version of Stata as yours. However, my Bayesmh window doesn't have the "univariate distribution" option. What could be the reason? Can you give me a hint?
There's no information about what the Y axis in the graph is or refers to. This is unacceptable.
That was an excellent explanation of the interaction between the parameters; thanks a lot for putting in the time and effort to do the animations.
This is an advanced "basic" concept.
Finally I understand this thing. Thank you.
Awesome, thank you! Animations are really helpful.
What is the application tool you were using? Is it publicly available?
I am really struggling with a layered Bayesian theoretical puzzle and am not finding a clear path forward even after watching several videos.
Each helps me a little more and yours, while more technical, was very helpful.
Thanks!
Thanks,
Doug
I understand nothing.
.75x speed
This is awesome. So intuitive and interesting. Why did we ever use null hypothesis testing? With the computational power we have now, this should be the norm.
great vid! so informative
Isn't there an error at 5:18?
Shouldn't the beta distribution's a and b be 86 and 84, NOT 106 and 114? The mean of Beta(86, 84) gives the mean on the screen (0.506),
whereas the mean of Beta(106, 114) is 0.481.
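The arithmetic is easy to check, since the mean of a Beta(a, b) distribution is a / (a + b); whether 86/84 or 106/114 are the right posterior counts depends on the prior and sample size used in the video, so this only verifies the means quoted above:

```python
def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution: a / (a + b)."""
    return a / (a + b)

print(round(beta_mean(86, 84), 3))    # 0.506
print(round(beta_mean(106, 114), 3))  # 0.482
```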
Please, could you send us the video transcript?
Excellent explanation, sir.
Excellent, sir.
Hi,
What does the type of likelihood distribution depend on?
Thanks,
Too many basic errors: a claim like "distribution closer to .5" is not even formally defined.
Wow, I acquired more understanding from this video than from dozens of hours of classes.
I shouldn't be saying this out loud, but I don't know about you, I find this prior distribution and the Ledoit-Wolf shrinkage method for gains in efficiency very difficult to picture, and don't get me started on these affecting eigenvalues instead of eigenvectors... it's a mess in my head right now. I really need to pull myself together.
Please, could you point me to some beginner-friendly material about Bayesian inference?
Thank you. The first video that makes me understand this reasoning in one go.
Woo
great explanation
Would someone please tell me what he is saying at 0:28? Thank you.