
What is theta in statistics


1. Maximum Likelihood Estimation Basics

118 816 views | 10 Aug. 2017

Maximum likelihood is a method of point estimation. This video covers the basic idea of ML.
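For readers who want to see the basic idea in action, here is a minimal sketch (my own, not from the video), assuming a Binomial(n, θ) model: scan candidate values of θ and keep the one that makes the observed data most probable, which lands essentially on the analytic MLE k/n.

```python
# Minimal sketch (not from the video): maximum likelihood for a binomial
# proportion theta, found by scanning the log-likelihood over a grid of
# candidate values. The grid search should land very close to the analytic
# MLE k/n.
import numpy as np
from scipy.stats import binom

k, n = 7, 100                              # hypothetical data: 7 successes in 100 trials
thetas = np.linspace(0.001, 0.999, 999)    # candidate parameter values
log_lik = binom.logpmf(k, n, thetas)       # log-likelihood of each candidate

theta_hat = thetas[np.argmax(log_lik)]
print(f"grid-search MLE: {theta_hat:.3f}, analytic MLE k/n: {k / n:.3f}")
```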

Mohammad ALKADI

Great explanation. Thank you!

Esteban Burguete

Impressive... How do you do that? It is amazing. Thank you, your videos are excellent.

Pasquale Nicolò

Thank u so much!

Joy Chandra Paul

Nice

Edward Raywer

Thanks...

H H

I need the MLE for my thesis in trade! You made it really clear and easy for me to understand! I literally didn't even know what that is and now I'll use it for my estimation. THANK YOU

Arjun

Very helpful, thank you.

Louis Wang

Suggestion: wear black clothes.

Daniel Rehling

You used theta as the parameter here. Does this theta hold the same meaning as lambda for a Poisson distribution?
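For later readers wondering the same thing: θ is just a generic symbol for whatever unknown parameter the model has, so for a Poisson model θ plays the role of λ, and its maximum likelihood estimate is the sample mean. A small illustrative sketch (my own, not from the video):

```python
# Illustrative sketch (not from the video): for a Poisson model the generic
# parameter theta is lambda, and its maximum likelihood estimate equals the
# sample mean of the observed counts.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
y = rng.poisson(lam=3.2, size=500)          # simulated Poisson counts

lambdas = np.linspace(0.1, 10.0, 1000)      # candidate values of lambda
log_lik = poisson.logpmf(y[:, None], lambdas).sum(axis=0)

lambda_hat = lambdas[np.argmax(log_lik)]
print(f"grid-search MLE: {lambda_hat:.3f}, sample mean: {y.mean():.3f}")
```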

pankaj negi

thank you!!

Razikh Shaik

Super helpful tutorial! Thank you so much :)

murtiza ali

Please make a video on "sufficiency or sufficient estimator".

Richard Parker

You are so awesome! I wish I could sit in your classrooms!

bergamo bobson

I finally understand it, thanks.

govind ashrit

Thanks. Nice explanation.

Fifikus Fifikus

Very helpful!

Danny Butvinik

Thank you! A very well-presented and intuitive explanation.

Vincent Liow

Thank you! I needed a quick recap and this helped. The software that you are using is amazing!

Andrea Stevens Karnyoto

You are amazing, prof.

Daniel Jiménez Flores

Thanks for your great job! It's so helpful.

Rohit Kalia

wtf she writes backward

Antonio De la Torre

Do you actually write backwards?!?

Faizan Hussain

Thanks, I'm from India.

Pasindu Tennage

Excellent explanation

sharzil khan

I have tried to understand stats... but it's not my cup of tea. :(


Non-ignorable missing data | Statistics for proteomics

152 views | 19 Jan. 2021

This lecture on non-ignorable missing data was presented by Professor Alexander Franks (http://afranks.com) and is part of the course Statistics for Proteomics (http://slavovlab.net/stats). The full YouTube playlist is available at http://stats.slavovlab.net


Likelihood | Log likelihood | Sufficiency | Multiple parameters

47 277 views | 26 Jul. 2018

See all my videos here: http://www.zstatistics.com/

***************************************************************

0:00 Introduction

2:17 Example 1 (Discrete distribution: develop your intuition!)

7:25 Likelihood

8:52 Likelihood ratio

10:00 Likelihood function

11:05 Log likelihood function

14:41 Sufficient statistics

16:30 Example 2 (Continuous distribution)

20:53 Multiple parameters

26:11 Nuisance parameters

***************************************************************

I would definitely advise watching the video from the beginning here as Example 1 is referred to throughout!

Kamran Esgersoy

One of the less successful videos. I am a fan of Zstatistics, but this video doesn't deserve to be part of the Zstatistics family.

Felipe Toledo

If I'm unsure whether my data points are better described by distribution A or B, can I compare the maximum likelihoods of distributions A and B and pick the higher one, or should likelihoods of different distributions not be compared?

Furkan BAŞKAN

Amazing. Far beyond what you'd expect from a YouTube lecture. Thank you so much.

Yulin Liu

Excellent! Many thanks!

Ern Ying

best!

al m

Great presentation! You make statistics come alive!

Ashwadip Garud

Hi, I am very happy with your ability to explain statistics. May I request that you develop a full series on stats from start to end, including everything, so that not a single point is excluded? It would help a lot of people who want to self-study the statistics literature very logically and clearly.

Benny Tajfar

Hi, I admire your knowledge, but at 28:23, when you show the likelihood formula, your explanation of the product was wrong. The likelihood for a single y is not a product; when we have a Y that consists of y1, y2, ..., yn, the formula becomes the product of the likelihoods of all these ys. Please correct it.
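For reference, the underlying point is that for independent observations y1, ..., yn the likelihood is the product of the individual densities, which is also why the log-likelihood becomes a sum. A tiny sketch of that equivalence, using a hypothetical normal sample (not taken from the video):

```python
# Sketch with a hypothetical sample (not from the video): for independent
# observations the likelihood is the product of the individual densities,
# and the log-likelihood is the sum of the individual log-densities.
import numpy as np
from scipy.stats import norm

y = np.array([4.2, 5.1, 3.8, 6.0, 4.9])    # hypothetical sample
mu, sigma = 5.0, 1.0                        # candidate parameter values

likelihood = np.prod(norm.pdf(y, loc=mu, scale=sigma))
log_likelihood = np.sum(norm.logpdf(y, loc=mu, scale=sigma))

print(likelihood, np.exp(log_likelihood))   # the same number, computed two ways
```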

donkee

For videos like this one, I just wish YouTube allowed me to speed it up to 16x instead of just 2x.

Aesthetic Athlete

Exceptionally clear explanation!

Cindy Oliver

Very helpful. Thank you so much!

Infinnacage Music

This was very useful, thank you.

Momof3 Kings

This is like trying to learn Japanese when you only know English smh

Piyush Gupta

This is the kind of content I would rather pay for.
Presentation Game: 10/10
Clarity and Brevity: 10/10
Content Quality: 20/10
And yes nothing subjective here.
This channel is gonna be on my top recommendation list when it comes to all things DS/Stats. Thanks!!

alxndrdg8

I am watching only because Justin Zeltzer made this video. His regression videos simply impressed me.
But I have to admit, statistics and I do not go well together. I think statistics is full of nonsense! Have you seen in dramas where lovers take a rose and peel off its petals one at a time, saying 'She loves me' and, at the next petal, 'She loves me not', and rely on the last petal as the likelihood of their love? Statisticians do the same thing by playing with equations which no one truly understands!
Statistics/stochastics is an imperfect field. Those equations are merely there to torture learners.
At the end of all that learning, what you have is likelihoods, probabilities and distributions, which do not give exact results.
If nature followed it, you wouldn't exist. It takes only one sperm to meet the egg and create a baby, but a million sperm are in the race to reach the egg. So the probability of a baby being created is 1 in a million. Does this mean you should take the risk of not using a condom?
Please don't do that. In nature, even one entity has the power to do something magic, or tragic.

Joe

brilliant

erolxtreme

What's the meaning of C in the likelihood function?

Gustav Streicher

Nice video. I don't agree with you that the likelihood by itself has no meaning, because that would imply that all pdf values by themselves have no meaning. Sure, the ratio is also informative, but by itself the likelihood tells you the probability density at the given sample for your chosen parameter values.

You also fail to mention that the product formula for the likelihood is a direct result of the samples being independent. The formula thus assumes independent samples. If your samples turn out to be dependent, then you cannot simplify the joint pdf into a product of univariate pdfs, which is what the product formula is.

Jill Valentine

I cannot plot it in WolframAlpha.

Amalie Petersen

Some of the best youtube lectures I have watched so far!

TheSingularity

stat queeeest, it's for muthafukin, gangstaaaaaaz!

Ravindu Abeygunasekara

Best statistics videos on the internet!

Andre Angelo

Gauss bless you

Charles Rauch

At 20:00, why do you only substitute T(y) for Σy and not for Σy²/200?

Vidal Hernandez Jr.

These lecture videos really fill the gaps left by the book my professor is using, Statistical Inference, 2nd Edition by Casella. Thanks!

Fahimul Islam

10:57
Likelihood function: L(θ) = ∏ f_i(y_i; θ)
Should the PDF f have a subscript of i, or is that a mistake?

Tyler Matthew Harris

The part at 14:11 about the "junk material" has me so confused. Isn't C(θ) just a function? Where did log(ⁿC_y) come from? Awesome videos!

Sushmita Saxena

When I first saw your picture while scrolling down my search, I thought, what is this young guy going to teach? He looks my age and all, so you were the last one I opted to watch, but after seeing this video I am truly stunned. You are like a handsome Hulk of statistics. Thank you!

Geordon Worley

This is amazingly helpful. Thank you so much.

Hedayat Razawy

At 19:02 you have (1/(200π))^(5/2); where did this 5/2 come from?

thalassatrinculo

Great video, but it gets confusing at the end.

Ahmed Serag

genius presentation

Alex Shnaidman

So nice to see new videos. You are the best!

Abir Kar

Brilliant! All the other videos were full of jargon. This is the only one with a bottom-up approach. Great job!

Marco Badwal

Fantastic, thanks a lot @zedstatistics.
I would have appreciated a bit more detail on how using the log-likelihood instead of the likelihood affects the likelihood curve and its values (the y- and x-axes).
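One way to see the effect asked about above: the log is a strictly increasing transformation, so it rescales the vertical axis of the likelihood curve but does not move the maximising value of θ on the horizontal axis. A quick hypothetical sketch:

```python
# Quick hypothetical sketch (not from the video): taking the log changes the
# vertical scale of the likelihood curve but leaves the location of its
# maximum unchanged, because log is strictly increasing.
import numpy as np
from scipy.stats import binom

k, n = 7, 100
thetas = np.linspace(0.01, 0.30, 2901)

lik = binom.pmf(k, n, thetas)
log_lik = np.log(lik)

print(thetas[np.argmax(lik)], thetas[np.argmax(log_lik)])   # identical argmax
```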

Jae Oppa

great video!

Soyyy

saved my life thx

Xiaoye Cai

Those videos saved my college degree. I learned much more than in my whole semester.

juliakbrown

Thank you so much for making these videos! I can't tell you how much I appreciate you and others like you who have put so much time and thought into creating material that is accessible and helpful for those of us who are struggling through stats and other math classes.

Vítor Barros

You are the best, bro!!! You're saving a Brazilian student.

aabens

The lecturer at one of my university statistics courses really doesn't know how to teach. You are a savior, thank you so much

Ahmed Taha

Thank you so much

Howard Lo

What's the slide template you're using? It's beautiful.

Titurel

At 20:08, why −μ²? Shouldn't it be +? And why is ν in the equation now? Is ν supposed to be 5?

WhetstoneGuy

Mr. Justin Z: In Example 2 (around 18:28 in the video), how did you get to that formula WITHOUT the Euler e's? Thank you.

Roman Teplov

Thank you for the great videos!
Interestingly, in your example at the beginning of the video (with the prevalence of thalassaemia), in both cases the probability of the next smaller value is quite close to the probability of the actual value (i.e. in the 7% case, the probability of having 6 people out of 100 with thalassaemia is quite close to the probability of having 7 out of 100, and similarly for the 8% case), yet it drops off for higher values (i.e. the probability of having 8 people out of 100 is smaller than the probability of having 6 out of 100 in the 7% case, and the same happens in the 8% case).
Is this just by chance, or is there some specific reason why the values nearest to our expected value have such different probabilities?
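The pattern described above can be checked directly from the binomial pmf, assuming (as in the video's example) a Binomial(100, p) model. The successive ratios P(X = k+1)/P(X = k) = ((n − k)/(k + 1)) · p/(1 − p) shrink as k grows and are not symmetric around the peak, which is why the count just below the expected value sits closer in probability to the peak than the count just above it. A small check:

```python
# Small check (my own, assuming a Binomial(100, p) model as in the video's
# thalassaemia example): print the pmf around the expected count for p = 7%
# and p = 8% to see how quickly the probabilities fall off on each side.
from scipy.stats import binom

for p in (0.07, 0.08):
    print(f"p = {p}:")
    for k in range(5, 11):
        print(f"  P(X = {k}) = {binom.pmf(k, 100, p):.4f}")
```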