For a PAC learnable hypothesis class, show that its sample complexity $m_{\mathcal{H}}$ is monotonically non-increasing in each of its parameters


Not sure if this is the right place to post this; if it isn't, I'd be grateful if someone could direct me to where best to post it.

I'm independently taking the course Introduction to Machine Learning (i.e., working through it on my own) using the book:

Understanding Machine Learning: From Theory to Algorithms
By Shai Shalev-Shwartz, Shai Ben-David

I've gotten to the chapter about PAC learning and am trying to do the exercises given for it. In particular, I'm stuck on the first exercise (3.1):

Let $\mathcal{H}$ be a hypothesis class for a binary classification task.

Suppose that $\mathcal{H}$ is PAC learnable and its sample complexity is given by $m_{\mathcal{H}}\left(\cdot,\cdot\right)$.

Show that $m_{\mathcal{H}}$ is monotonically non-increasing in each of its parameters. That is, show that given $\delta\in\left(0,1\right)$, and given $0<\epsilon_{1}\leq\epsilon_{2}<1$, we have that $m_{\mathcal{H}}\left(\epsilon_{1},\delta\right)\geq m_{\mathcal{H}}\left(\epsilon_{2},\delta\right)$. Similarly, show that given $\epsilon\in\left(0,1\right)$ and given $0<\delta_{1}\leq\delta_{2}<1$, we have that $m_{\mathcal{H}}\left(\epsilon,\delta_{1}\right)\geq m_{\mathcal{H}}\left(\epsilon,\delta_{2}\right)$.

Any help is appreciated. Thank you!

There is 1 answer below.

Hint: What is the definition of $m_\mathcal{H}$? If $0<\epsilon_1\leq\epsilon_2<1$, how can you use this definition to immediately conclude that $m_\mathcal{H}(\epsilon_1,\delta)\geq m_\mathcal{H}(\epsilon_2,\delta)$? (And similarly for when $\delta$ varies.) You might be overcomplicating this exercise: write out the definition, and the conclusion should be clear.
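To expand slightly on the hint, here is a sketch of the definition-based argument. I'm paraphrasing the book's definition of sample complexity (Definition 3.1 in Shalev-Shwartz and Ben-David), so double-check the exact wording there:

```latex
% Sample complexity as a minimum over a feasible set of sample sizes:
% m_H(eps, delta) is the smallest m such that m i.i.d. examples suffice
% for the PAC guarantee with accuracy eps and confidence 1 - delta.
\[
  m_{\mathcal{H}}(\epsilon,\delta)
  = \min\bigl\{\, m \in \mathbb{N} \;:\;
      m \text{ examples suffice to guarantee, with probability} \geq 1-\delta,
      \text{ error} \leq \epsilon \,\bigr\}.
\]
% If 0 < eps_1 <= eps_2 < 1, then any m whose guarantee achieves
% error <= eps_1 automatically achieves error <= eps_2, so the
% feasible set for (eps_2, delta) contains the one for (eps_1, delta):
\[
  \bigl\{\, m : m \text{ works for } (\epsilon_1,\delta) \,\bigr\}
  \;\subseteq\;
  \bigl\{\, m : m \text{ works for } (\epsilon_2,\delta) \,\bigr\}.
\]
% The minimum over a superset can only be smaller or equal, hence
\[
  m_{\mathcal{H}}(\epsilon_1,\delta) \;\geq\; m_{\mathcal{H}}(\epsilon_2,\delta).
\]
% The delta case is identical: if 0 < delta_1 <= delta_2 < 1, then
% "with probability >= 1 - delta_1" implies "with probability >= 1 - delta_2",
% giving the same set inclusion and m_H(eps, delta_1) >= m_H(eps, delta_2).
```

The whole proof really is just this one set-inclusion observation applied twice.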