Talk:Likelihood function
This article cannot be easily understood by people who do not already know what a likelihood function is
This article can only be understood by someone who already knows a bit of statistics and already knows the difference between probability and likelihood. But then, why would someone like that come and read this article? :D @Bender235: I see that you have been actively participating in developing this article. Can you please clean up this article keeping in mind the following: a) Wikipedia is an encyclopedia. b) Encyclopedia articles are supposed to explain things to people who have no prior knowledge of the topic. --Sahir 05:05, 19 August 2021 (UTC)
- The article opening, and especially the first sentences, are already written with this in mind. It needs to strike a balance between being understandable to a novice and being technically correct. If you have specific suggestions on how to improve the article, feel free to implement them. --bender235 (talk) 13:56, 19 August 2021 (UTC)
- @Bender235: I think one sentence or two that clearly explains the essence of the likelihood function would do the trick. The layman would not usually want to read beyond that. I came to this page looking for such an explanation. Later, I watched a few YouTube videos about this topic and now I have a vague idea about what it is, but not enough to edit the article :) --Sahir 08:33, 20 August 2021 (UTC)
- In my opinion, the opening sentence does that already. The essence of the likelihood function is its purpose, and its purpose is to fit a (probabilistic) model to a given sample. The very first sentence of the article says that, in a few more words and with wikilinks for further details. --bender235 (talk) 13:18, 20 August 2021 (UTC)
- @Sahirshah: does the re-write address your concern? --bender235 (talk) 20:24, 10 October 2021 (UTC)
- @Bender235: Yes. Thank you --Sahir 05:51, 11 October 2021 (UTC)
I looked at the new version and I find it harder to follow than the previous version. For example, what is a probabilistic prediction? A prediction implies that we are trying to assign a value to an unknown phenomenon. However, the likelihood is well defined; it doesn't predict the probability, it simply says what it is given the setting (the joint distribution, data, and parameters). Also, the likelihood doesn't have to be a product of densities. It is also defined for discrete random variables, as well as for dependent random variables (where the product wouldn't be correct). Also, why should the missing-value mechanism be introduced in the opening paragraph? How is that relevant? I suggest adding more references and proceeding with a more straightforward introduction. As for Bayesian vs frequentist: I think THE primary use case is frequentist. The Bayesian approach is interested in the distribution of the parameter, and in such a context the likelihood is not interesting. It is obvious that the formula of the likelihood appears in the definition of the posterior, but more from a mathematical point of view than from a modelling point of view. Tal Galili (talk) 21:38, 10 October 2021 (UTC)
- @Talgalili: Maybe 'probabilistic prediction' is not the perfect terminology, but what it means is that for a chosen model and some specific parameter value, the likelihood function returns the probability of the observed sample, and of course does so for every permissible parameter value. You're correct, though, that it doesn't necessarily have to be densities that form the joint probability of the sample. Any suggestion as to how to rephrase that?
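To make that point concrete, here is a minimal sketch (with an assumed i.i.d. normal model and made-up sample values, not anything taken from the article): the likelihood returns the joint density of the observed sample, evaluated at each candidate parameter value.

```python
import math

def normal_pdf(x, mu, sigma=1.0):
    # Density of a normal distribution with mean mu and standard deviation sigma.
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def likelihood(mu, sample):
    # Joint density of an i.i.d. sample, viewed as a function of mu.
    value = 1.0
    for x in sample:
        value *= normal_pdf(x, mu)
    return value

sample = [4.8, 5.1, 5.3]              # made-up observations
for mu in (4.0, 5.0, 6.0):            # candidate parameter values
    print(mu, likelihood(mu, sample))
# The output is largest near mu = 5.0, the parameter value under which the
# observed sample is most plausible -- which is what a maximum-likelihood fit exploits.
```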
- As for the likelihood in Bayesian statistics, your assessment couldn't be further from the truth. Most Bayesian textbooks spend a good deal of time on how to derive the likelihood (e.g., BDA3). --bender235 (talk) 02:49, 11 October 2021 (UTC)
- Hey bender235,
- Regarding how to rephrase - I won't get to it soon, but I wanted to point out some glaring issues, as I saw them.
- Regarding "likelihood in Bayesian statistics", I want to separate between the computation and the meaning. For computation purposes, of course we'd want the likelihood (since the posterior is proportional to it, up to also multiplying in the prior). From an interpretation point of view, the likelihood itself is a random variable as a function of the parameter (which in the bayesian context is the random variable), and not of the data (this is opposed to the frequentist version in which the likelihood is a random variable since it is a function of the data, but for some fixed value of the paramter). In that context, I imagine the likelihood could have some interesting meaning, but (a) I don't know what it is. (b) I wouldn't guess it is fundamental to bayesian statistics, at least not from what I've seen of it. But given that I am far from any expert in bayesian statistics, this question can easily be addressed with relevant citations (which is true for all issues, in general :) ).
- In any case, I'd like to give you props for trying to improve the intro. I think it still needs work. I may try it at some point in the future, but not today. Cheers. Tal Galili (talk) 04:26, 11 October 2021 (UTC)
- @Talgalili: Ok thanks, looking forward to your suggestion. As an aside, Bayesians do not view parameters as random variables. They view them as fixed quantities whose precise value—in light of finite data—is uncertain (but not random, see [1]). --bender235 (talk) 12:55, 11 October 2021 (UTC)
The introduction is incorrect; a likelihood function is *not* a joint probability, because probabilities MUST lie in [0,1]. It looks like a lot of people have tried to edit it over and over, but have not used the explanation given in Lehmann & Casella, a standard reference. I've seen that others have suggested going all the way down to measure theory; this is not necessary at all. I'm just going to give a definition which is technically correct, and I'll try my best to make it readable for beginners. alex n. — Preceding undated comment added 08:10, 20 August 2023 (UTC)
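A small numeric illustration of the [0,1] point (assuming a continuous model, which is where the issue arises): a density-based likelihood value can exceed 1, so it cannot be a probability in general.

```python
import math

# For a single observation at the mean of a normal model with a small standard
# deviation, the likelihood is the density value at that point, which exceeds 1.
sigma = 0.1
likelihood_value = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
print(likelihood_value)   # about 3.99, so not a probability
```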
Likelihood equations
I find this section a bit confusing:
What is ? I do not see a definition anywhere.
" is well-defined in an open neighborhood about 0 with probability going to one" -- I wonder what this notion of probability might mean.