This is my note for Lecture 2. It discusses how to get the final hypothesis from the hypothesis set.
How do we choose the best hypothesis?
The results over all possible cases are called the ensemble (the population). The results over a given set of data are called a sample.
There is a precondition: all of these results come from the same hypothesis. If the hypothesis is right on the whole ensemble, it is the best one; we also call it the final hypothesis. However, we cannot test all cases, so we turn to samples: instead of the ensemble, we use a sample to judge the hypothesis. Is that always valid? We discuss this below.
There is an inequality called Hoeffding's Inequality:

P[|u - v| > ε] ≤ 2·exp(−2ε²N)

(Here u means the sample frequency, v means the ensemble frequency, and N is the sample size.) So for a big sample, u is probably close to v, which means we can use the sample to judge the hypothesis instead of the ensemble.
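A quick simulation (a sketch with my own parameter choices, not from the lecture) can illustrate this: draw N points from a "bin" whose true frequency is v, measure the sample frequency u, and compare how often |u − v| exceeds ε against the Hoeffding bound 2·exp(−2ε²N).

```python
import math
import random

def sample_frequency(v, n, rng):
    """Draw n points from a bin whose true (ensemble) frequency is v;
    return the observed sample frequency u."""
    return sum(rng.random() < v for _ in range(n)) / n

if __name__ == "__main__":
    rng = random.Random(0)
    v = 0.6     # true ensemble frequency (unknown in practice)
    eps = 0.05  # tolerance
    trials = 500
    for n in (10, 100, 1000, 2000):
        bad = sum(abs(sample_frequency(v, n, rng) - v) > eps
                  for _ in range(trials))
        bound = 2 * math.exp(-2 * eps**2 * n)
        print(f"N={n:5d}  P[|u-v|>eps] ~ {bad/trials:.3f}  bound = {bound:.3f}")
```

As N grows, the empirical probability of a bad sample drops quickly, and it always stays below the (sometimes loose) Hoeffding bound.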
There is another problem: suppose we make a single hypothesis and find that its result on the sample is pretty good. What should we do? Maybe some of you will say, "OK, this is my final hypothesis." That is exactly the problem: we have only one hypothesis and declare it the final one. This process is not learning, because there is no comparison; what we have done is called verification.
Now it is clear how to do this correctly. First, we provide several different hypotheses, which together form the hypothesis set. Second, we use the data to evaluate each hypothesis on the sample; as discussed above, a big sample usually represents the ensemble well. Finally, the learning algorithm, according to the sample, chooses the best one from the hypothesis set.
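A minimal sketch of this "learn by selection" idea (all names and the toy data are illustrative, not from the lecture): evaluate every hypothesis on the sample and pick the one with the lowest sample error.

```python
def sample_error(h, sample):
    """Fraction of sample points (x, y) that hypothesis h misclassifies."""
    return sum(h(x) != y for x, y in sample) / len(sample)

def learn(hypothesis_set, sample):
    """Learning algorithm: return the hypothesis with the smallest sample error."""
    return min(hypothesis_set, key=lambda h: sample_error(h, sample))

# Toy example: labels are the sign of x; hypotheses are threshold rules.
sample = [(-2, -1), (-1, -1), (1, 1), (3, 1)]
hypothesis_set = [lambda x, t=t: 1 if x > t else -1 for t in (-3, 0, 2)]
best = learn(hypothesis_set, sample)
print(sample_error(best, sample))  # the t=0 rule fits this sample perfectly
```

Note that `learn` only compares sample errors; whether the winner is also good on the ensemble is exactly what Hoeffding's Inequality is needed for.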
One point confused me when it first came up. To cover the case where a sample fails to represent the ensemble for some hypothesis, Hoeffding's Inequality is modified to

P[|u - v| > ε] ≤ 2M·exp(−2ε²N)

where M is the number of hypotheses. The extra factor M comes from the union bound: the probability that any one of the M hypotheses gets a misleading sample is at most the sum of the M individual probabilities. This leads to a new idea: if the hypothesis set is too large, the final hypothesis may become inaccurate.
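The effect of M can be seen with a coin-flipping experiment (a sketch under my own parameter choices): flip each of M fair coins 10 times. A single coin rarely lands all heads, but as M grows, the chance that at least one coin does — i.e., that some "hypothesis" looks perfect purely by luck — approaches 1.

```python
import random

def prob_some_coin_all_heads(m, flips=10, trials=500, seed=0):
    """Estimate P[at least one of m fair coins lands all heads in `flips` flips]."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        if any(all(rng.random() < 0.5 for _ in range(flips)) for _ in range(m)):
            hits += 1
    return hits / trials

if __name__ == "__main__":
    for m in (1, 100, 2000):
        p = prob_some_coin_all_heads(m)
        print(f"M={m:5d}  P[some coin shows all heads] ~ {p:.3f}")
```

The exact value is 1 − (1 − 2⁻¹⁰)^M, which grows toward 1 with M; this is why a very large hypothesis set weakens the guarantee on the selected final hypothesis.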
Published: 2024-01-28 08:55:18.