We can build statistical models for inference purposes or predictive purposes. Imagine that we fit a simple linear regression model Y = b0 + b1X. If we fit this model for the purpose of statistical inference, what are our primary motivations and what are the types of conclusions that we would like to draw? What statistical tests are used to help us make these conclusions?
Now, if we are building this model for predictive purposes, is statistical inference still important? What is? Are we interested in the results of statistical tests, or are there other metrics of greater importance? How do we evaluate a model for predictive purposes, and what metrics can we use?
The question you ask is too generic to have a single answer. Consider the simple example of stock investment: finding a trading strategy that performs well in backtesting is relatively easy, but making sure it performs robustly in the future is very, very hard. The more refined your model is, the more likely it is to overfit and be sensitive to noise. If you cannot do any kind of inference, then tuning the model becomes very difficult in general, as you have to rely on multiple rounds of cross-validation to control the degree of overfitting in your backtests.
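To make the contrast concrete, here is a minimal pure-Python sketch on simulated data (the data, fold count, and true coefficients are all illustrative assumptions, not anything from the question): the inference view asks whether the slope b1 differs from zero via a t-statistic, while the predictive view estimates out-of-sample error with k-fold cross-validation.

```python
import math
import random

random.seed(0)

# Hypothetical data: y = 2 + 3x + noise (values are illustrative only)
n = 100
x = [i / 10 for i in range(n)]
y = [2 + 3 * xi + random.gauss(0, 1) for xi in x]

def ols(xs, ys):
    """Closed-form simple linear regression: returns (b0, b1)."""
    m = len(xs)
    xbar, ybar = sum(xs) / m, sum(ys) / m
    sxx = sum((xi - xbar) ** 2 for xi in xs)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(xs, ys))
    b1 = sxy / sxx
    return ybar - b1 * xbar, b1

# --- Inference view: is b1 significantly different from 0? ---
b0, b1 = ols(x, y)
resid = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]
s2 = sum(r ** 2 for r in resid) / (n - 2)      # residual variance
xbar = sum(x) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
se_b1 = math.sqrt(s2 / sxx)                    # standard error of the slope
t_stat = b1 / se_b1                            # compare against t(n - 2)

# --- Prediction view: out-of-sample error via k-fold cross-validation ---
def cv_mse(xs, ys, k=5):
    idx = list(range(len(xs)))
    random.shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    errors = []
    for fold in folds:
        test = set(fold)
        xtr = [xs[i] for i in idx if i not in test]
        ytr = [ys[i] for i in idx if i not in test]
        a, b = ols(xtr, ytr)                   # refit on the training fold only
        errors.extend((ys[i] - (a + b * xs[i])) ** 2 for i in test)
    return sum(errors) / len(errors)

print(f"slope = {b1:.3f}, t = {t_stat:.1f}")   # inference: test H0: b1 = 0
print(f"5-fold CV MSE = {cv_mse(x, y):.3f}")   # prediction: expected squared error
```

The key asymmetry: the t-statistic is computed entirely in-sample and speaks to whether the relationship exists, while the cross-validated MSE holds data out and speaks to how well the fitted line will predict new observations.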
The other thing to keep in mind is that "inference" already assumes a probability distribution, while much predictive modeling is not based on modeling probability distributions at all. For example, if you are in love and talking to someone, you want to appear natural while discussing things that mutually interest the other person. Since everyone is different, a statistical model based on Q&A responses from OkCupid would be of little use: it is hard to imagine doing inference in real time to adjust to the other person's behavior, producing feedback that suits most people but not your significant other. Here other methods, like game theory, come into play.