I need to develop an algorithm for finding the optimal dimensions for setting a set of symbols on a grid (for a typesetting library I'm writing). I need to minimize the number of cells in the grid while still having enough to fit $n$ symbols. The secondary goal is to minimize the distance between the two axis sizes $x$ and $y$, so I can choose between a rectangular grid and a square-like grid.
Formally:
Given a set of symbols $Q$ of cardinality $n$, find $(x,y)\in\mathbb{N}^2$ minimizing $xy - n$ subject to $xy \geq n$, and, secondarily, minimizing $|x-y|$.
I have some intuition that this is an integer programming problem, but I'm not sure exactly how to solve it that way. I can also do some form of gradient descent, but not sure exactly how.
As @Agawa001 said in his comment, the solution is rather simple.
$x = \lceil\sqrt{n}\rceil$ and $y = \lceil n/x \rceil$.

Note that the naive choice $y = \lfloor\sqrt{n}\rfloor$ is not always sufficient: for $n = 7$ it gives $3 \cdot 2 = 6 < 7$ cells. Rounding the second dimension up against the first, $y = \lceil n/x \rceil$, guarantees $xy \geq n$ while keeping $|x - y| \leq 1$.
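A minimal sketch of this computation in Python (the function name `grid_dims` is illustrative, not part of any library):

```python
import math

def grid_dims(n):
    """Return (x, y) with x * y >= n and the sides as close as possible.

    Assumes n >= 1. x is the larger side; y is rounded up against x
    so the grid always has enough cells.
    """
    x = math.ceil(math.sqrt(n))
    y = math.ceil(n / x)
    return x, y
```

For example, `grid_dims(7)` yields a $3 \times 3$ grid and `grid_dims(12)` a $4 \times 3$ grid. If you prefer the exact-area solution over the square-like one (e.g. $5 \times 2$ for $n = 10$ instead of $4 \times 3$), you would instead search the divisors of the smallest $m \geq n$ that factors acceptably, since the two objectives can conflict.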