I have a dictionary (matrix) D and an input (vector) Y, and I want to solve the following problem:
what is the sparse representation of the input Y with respect to the dictionary D?
This is a central problem in the sparsity / sparse-coding field, and it is usually posed as the optimization

    min_x ||x||_1   subject to   ||Y - D x||_2 <= epsilon

or this (the Lagrangian / LASSO form):

    min_x (1/2) ||Y - D x||_2^2 + lambda ||x||_1

What is the best way to solve this in MATLAB?
Note that the dictionary D is an n×d matrix and the input Y is a vector of length n; both are given. epsilon and lambda are constants.
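To make the setup concrete, here is a minimal sketch in Python/NumPy (the specific sizes and the planted sparse vector are made up for illustration; the shapes match the question): build a random n×d dictionary D, a coefficient vector with a few nonzeros, and set Y = D x. The task is then to recover a sparse x from D and Y alone.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 30, 60                            # dictionary is n x d, as in the question
D = rng.standard_normal((n, d))          # overcomplete dictionary (d > n)
x_true = np.zeros(d)
x_true[[4, 20, 41]] = [2.0, -1.5, 1.0]   # sparse coefficients: only 3 nonzeros
Y = D @ x_true                           # observed input vector of length n
# Goal: given only D and Y, find a sparse x with Y ≈ D x.
```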
You should look into LASSO, basis pursuit denoising (BPDN), and the ISTA/FISTA algorithms; there is an extensive literature of papers and books on these.
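As a concrete reference point, here is a minimal ISTA sketch in Python/NumPy for the Lagrangian (LASSO) form min_x (1/2)||Y - D x||_2^2 + lambda ||x||_1; the data, sizes, and lambda value are made up for illustration, and the same few lines translate directly to MATLAB:

```python
import numpy as np

def ista(D, Y, lam, n_iter=1000):
    """ISTA for min_x 0.5*||Y - D x||_2^2 + lam*||x||_1."""
    L = np.linalg.norm(D, 2) ** 2        # Lipschitz constant of the gradient (sigma_max(D)^2)
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        g = D.T @ (D @ x - Y)            # gradient of the smooth (least-squares) term
        z = x - g / L                    # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-thresholding
    return x

# Illustrative problem: recover a planted 3-sparse vector from Y = D x_true.
rng = np.random.default_rng(0)
n, d = 30, 60
D = rng.standard_normal((n, d))
x_true = np.zeros(d)
x_true[[4, 20, 41]] = [2.0, -1.5, 1.0]
Y = D @ x_true
x_hat = ista(D, Y, lam=0.05)
```

FISTA adds a momentum step on top of the same soft-thresholded gradient update and converges noticeably faster, but the fixed point it seeks is the same.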
There are many implementations on MATLAB Central and in dedicated toolboxes, which you can find with a simple Google search of those terms. Alternatively, if the problem is small, you can try the code here (using CVX or otherwise). There is also the old l1-magic package.
Here is also a list of some packages.