What are some 'nice' real-world applications of Lagrange multipliers that could be used to motivate the concept to students in an introductory optimization course?
I like the pressure-as-a-Lagrange-multiplier and Dido's problem examples, but they may be a bit too advanced. This application is nice but not very real-world.
I'm not sure that this is the "nicest" application, but if the students have been exposed to some statistics, you can show that for a linear model of the form $$ Y = X\beta + \epsilon, $$ Lagrange multipliers can be used to find the BLUE (best linear unbiased estimator) of $\beta$. This isn't a particularly "surprising" fact, but it is still interesting: in statistics courses $\hat{\beta}$ is usually derived by minimizing the sum of squared residuals (or the empirical MSE) or (e.g., for Gaussian errors) by maximizing the likelihood function. Hence, it is nice to see that under appropriate assumptions on $\epsilon$ you get the same result regardless of which method you use (including Lagrange multipliers).
Let us look at the simplest non-trivial example. Assume the model $$ Y_i = \beta x_i + \epsilon_i, \quad i=1,...,n, $$ where $E\epsilon_i = 0$, $E\epsilon_i \epsilon_j = 0$ for $i \neq j$, and $E\epsilon_i^2 = \sigma^2$. If we restrict attention to the class of all linear unbiased estimators, then we must consider estimators of the form $$ \alpha = \sum_{i=1}^n w_i Y_i $$ that satisfy $E(\alpha) = \beta \sum w_i x_i = \beta$, i.e., $\sum w_i x_i = 1$. Thus, your minimization problem (the optimality, "best", criterion) is $$ \arg\min_{w\in \mathbb{R}^n} \mathcal{L}(w,\lambda), \qquad \mathcal{L}(w,\lambda) = \sigma^2\sum_{i=1}^n w_i^2 - \lambda \left(\sum_{i=1}^n w_i x_i - 1\right), $$ where $\sigma^2\sum w_i^2 = \operatorname{Var}(\sum w_i Y_i)$ is the variance we want to minimize. Now, note that $$ \mathcal{L}'_{w_k} = 2\sigma^2 w_k - \lambda x_k = 0 \implies w_k = \frac{\lambda x_k}{2\sigma^2}, \quad \forall k. $$ Using the constraint you get $$ \sum w_i x_i = \lambda\frac{\sum x_i^2}{2\sigma^2} = 1 \implies \lambda = \frac{2\sigma^2}{\sum x_i^2} \implies w_k = \frac{x_k}{\sum x_i^2}. $$ Hence the estimator is $$ \alpha = \frac{\sum x_i Y_i}{\sum x_i^2}, $$ which is exactly the ordinary least-squares estimator for this model. You can check that the Hessian matrix (in $w$) is indeed positive definite, so this stationary point is a minimum. This result is a special case of the Gauss–Markov theorem.
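If it helps in class, the agreement between the multiplier-derived estimator and least squares is easy to check numerically. Here is a minimal sketch (the sample size, true $\beta$, and noise level are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
beta_true = 2.5  # arbitrary "true" coefficient for the simulation

# Simulate the model Y_i = beta * x_i + eps_i with mean-zero,
# uncorrelated, homoscedastic errors (the Gauss-Markov assumptions).
x = rng.uniform(1.0, 3.0, size=n)
eps = rng.normal(0.0, 0.5, size=n)
Y = beta_true * x + eps

# Estimator derived via Lagrange multipliers: w_k = x_k / sum(x_i^2),
# so alpha = sum(x_i * Y_i) / sum(x_i^2).
alpha = np.sum(x * Y) / np.sum(x**2)

# Ordinary least squares for the same no-intercept model.
beta_ols = np.linalg.lstsq(x.reshape(-1, 1), Y, rcond=None)[0][0]

print(alpha, beta_ols)  # the two estimates coincide
```

The two numbers agree to machine precision, since both methods produce the same closed-form weights $w_k = x_k/\sum x_i^2$.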