What is more correct? Likelihood of data given parameter or likelihood of parameter given data? In https://en.wikipedia.org/wiki/Likelihood_function for instance, we see both
likelihood of $\hat{\theta}$
but also maximizing the likelihood of the specific observation $x_{j}$
what is best?
Consider Modern Mathematical Statistics with Application by Devore, Berk, and Carlton, Chapter 7 Point Estimation, subsection Maximum Likelihood Estimation.
The authors claim:
It is called the maximum likelihood estimate because for fixed x1, ..., x10, it is the parameter value that maximizes the likelihood (joint pmf) of the observed sample.
So is the likelihood always the likelihood of the observed sample? In other words, is likelihood always the likelihood of the data?

If I understand your question correctly, the definition of a likelihood function is this: given a random variable $X$ observed to take the value $x$, we ask how plausible each candidate value $\theta$ of the unknown parameter is. We call this the likelihood function: $$\mathcal{L}(\theta | x)$$ which is defined slightly differently for discrete and continuous random variables (joint pmf versus joint pdf). To answer directly which phrasing is more correct: the likelihood of $\theta$ (yes), or maximizing the likelihood of the specific observation $x$ (not really).
The likelihood function is defined to be the likelihood of $\theta$ given that $X = x$, so calling it the likelihood of $\theta$ is clearly correct. There is an equivalent function that tells us the probability (or density) of observing $x$ given a parameter $\theta$: the p.d.f. of $x$. But $\mathcal{L}(\theta | x)$ does not tell us that; $f(x | \theta)$ tells us the likelihood of our event given the parameter. It is the same function of two arguments: in one version we fix $x$ and obtain the likelihood function, and in the other we fix $\theta$ and obtain a probability density function.
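To make the "same function, two readings" point concrete, here is a minimal sketch using a hypothetical i.i.d. Bernoulli sample (the sample values and function names are my own illustration, not from the book):

```python
import math

def joint_pmf(x, theta):
    """Joint pmf of an i.i.d. Bernoulli(theta) sample x: prod theta^xi (1-theta)^(1-xi)."""
    return math.prod(theta**xi * (1 - theta)**(1 - xi) for xi in x)

x = [1, 0, 1, 1, 0]  # a fixed, already-observed sample

# Fix x, vary theta: this is the likelihood function L(theta | x).
likelihood = lambda theta: joint_pmf(x, theta)

# Fix theta, vary the sample: this is the pmf f(. | theta);
# summed over all possible samples it totals 1, unlike the likelihood.
pmf_at_half = lambda sample: joint_pmf(sample, 0.5)
```

The two lambdas share one formula; only which argument is held fixed changes, which is exactly the distinction between $\mathcal{L}(\theta | x)$ and $f(x | \theta)$.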
If you would like a reputable source, you may read Modern Mathematical Statistics with Application by Devore, Berk, and Carlton, Chapter 7 Point Estimation, subsection Maximum Likelihood Estimation.
Edit: I cannot find your quote, and page 353 is not what I cite; but directly from the book, on page 547 (the page number, not the pdf number):
Based on your interpretation of the quote: sure. The maximum likelihood estimate, given a fixed sample $x_1, x_2, \ldots, x_n$, can be seen as the value of $\theta$ that maximizes $\mathcal{L}(\theta | x)$. But what you have written in your question at the top:
refers to two different scenarios. When we look at a likelihood function in general, it tells us the likelihood of $\theta$ given our sample $x_1, \ldots, x_n$. If we instead maximize the likelihood function, then yes, it gives us the value of $\theta$ that makes our sample $x_1, \ldots, x_n$ more likely than any other value of $\theta$ would.
Final edit: Yes. These are two different concepts:
A maximum likelihood estimate is the value of $\theta$ that maximizes the likelihood of obtaining our sample $x_1, \ldots, x_n$. It is found via $\arg\max_\theta \mathcal{L}(\theta | x)$, but to be clear, it is not the only result we can extract from the likelihood function.
A likelihood function gives us the likelihood of a certain value of $\theta$ given that we have already observed the sample $x_1, \ldots, x_n$.
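The distinction between the two concepts can be sketched in code: the likelihood function is the whole curve over $\theta$, while the MLE is the single maximizing point. This continues the hypothetical Bernoulli sample from above (an illustration of mine, not the book's example), using a simple grid search in place of calculus:

```python
import math

x = [1, 0, 1, 1, 0]  # the fixed observed sample

def log_likelihood(theta, x):
    """Log of the Bernoulli likelihood; maximizing log L is equivalent to maximizing L."""
    return sum(xi * math.log(theta) + (1 - xi) * math.log(1 - theta) for xi in x)

# The likelihood function: a value for EVERY candidate theta in (0, 1).
grid = [i / 1000 for i in range(1, 1000)]

# The MLE: the single grid point where that curve peaks.
mle = max(grid, key=lambda t: log_likelihood(t, x))
```

For Bernoulli data the closed-form MLE is the sample mean, so the grid search should land on $\sum x_i / n = 0.6$; the rest of the curve still carries information (e.g. for likelihood-ratio comparisons) that the single maximizer does not.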
So, to your original question about which is better: if you are describing a maximum likelihood estimate, then "maximizing the likelihood of the specific observation $x_j$" is the better phrasing.
If you are simply describing what a likelihood function does, then "the likelihood of $\theta$" works best.