$(\sum_{i=1}^x \frac{1}{i}) - \ln(x)$
10: 0.62638316097421
100: 0.58220733165153
1000: 0.57771558156821
$(\sum_{i=1}^x \frac{1}{i}) - \ln(x+\frac{1}{2})$
10: 0.57759299680478
100: 0.57721979014049
1000: 0.57721570652656
This is not especially a number-theory phenomenon: you are comparing a sum with an integral, similar to this picture from a different question.
Now, before I paste in the diagram, just note that $f(n)$ is estimated by $\int_{n-1}^n f(t) dt \;,$ or by $\int_{n}^{n + 1} f(t) dt \;,$ but is estimated much better by $$\int_{n - \frac{1}{2}}^{n + \frac{1}{2} } \; f(t) \; dt \;.$$
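To see why the symmetric interval does better (a sketch I am adding, not part of the original argument), expand $f$ about $t = n$: $$\int_{n-\frac{1}{2}}^{n+\frac{1}{2}} f(t)\,dt = \int_{-\frac{1}{2}}^{\frac{1}{2}} \left( f(n) + f'(n)\,s + \tfrac{1}{2} f''(n)\,s^2 + \cdots \right) ds = f(n) + \frac{f''(n)}{24} + \cdots$$ The odd powers of $s$ integrate to zero by symmetry, so the per-term error is governed by $f''$ rather than $f'$. For $f(t) = 1/t$ that is an error of order $1/n^3$ per term instead of $1/n^2$, hence $O(1/n^2)$ in total instead of $O(1/n)$.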
Also note that the advantage of using the half appears only in that first (logarithm) term. After that, both asymptotic expansions have the same shape, with only even powers of the argument. Here we go, $$ H_n = \gamma + \log \left( n + \frac{1}{2} \right) +\frac{1}{24\left( n + \frac{1}{2} \right)^2} - \frac{7}{960\left( n + \frac{1}{2} \right)^4} + O \left(\frac{1}{\left( n + \frac{1}{2} \right)^6} \right) $$ Note $$ \gamma \approx 0.57721566490153 $$
====================================================
#include <cmath>
#include <iomanip>
#include <iostream>
using namespace std;

int main()
{
    double h = 0.0;
    cout.precision(16);
    for (int n = 1; n <= 105; ++n)
    {
        h += 1.0 / n;                  // h = H_n
        double half = n + 0.5;         // n + 1/2
        double below = h - log(half) - 1 / (24 * half * half);        // undershoots gamma
        double above = below + 7 / (960 * half * half * half * half); // overshoots gamma
        cout << setw(3) << n << setw(20) << h << setw(20) << below << setw(20) << above << endl;
    }
}
=======================================================
n H_n below gamma above gamma
1 1 0.5760163733733171 0.5774567025914241
2 1.5 0.5770426014591783 0.577229268125845
3 1.833333333333333 0.5771690042937476 0.577217595158665
4 2.083333333333333 0.5771983233883347 0.5772161052305335
5 2.283333333333333 0.5772078306265884 0.577215799116901
6 2.45 0.5772116298045227 0.5772157146288168
7 2.592857142857143 0.577213381574137 0.577215686100886
8 2.717857142857143 0.577214278092129 0.5772156749464131
9 2.828968253968254 0.5772147748446763 0.5772156700700557
10 2.928968253968254 0.5772150678554185 0.5772156677426398
11 3.019877344877345 0.5772152496467665 0.5772156665501748
12 3.103210678210678 0.5772153672357559 0.5772156659024226
13 3.180133755133755 0.5772154460039576 0.5772156655328736
14 3.251562326562327 0.577215500362116 0.5772156653130572
15 3.318228993228994 0.5772155388494049 0.5772156651775194
16 3.380728993228994 0.5772155667148677 0.5772156650912913
17 3.439552522640758 0.577215587289521 0.577215665034905
18 3.495108078196314 0.5772156027470478 0.5772156649971248
19 3.547739657143682 0.5772156145413271 0.5772156649712566
20 3.597739657143682 0.5772156236663827 0.5772156649531959
21 3.645358704762729 0.5772156308153391 0.5772156649403632
22 3.690813250217275 0.5772156364801514 0.5772156649310989
23 3.73429151108684 0.5772156410156465 0.5772156649243132
24 3.775958177753507 0.5772156446815149 0.5772156649192762
25 3.815958177753507 0.5772156476703771 0.5772156649154918
26 3.854419716215045 0.5772156501268676 0.577215664912616
27 3.891456753252082 0.5772156521608239 0.5772156649104084
28 3.927171038966368 0.5772156538565317 0.5772156649086969
29 3.961653797587057 0.5772156552792992 0.5772156649073582
30 3.994987130920391 0.577215656480186 0.5772156649063024
31 4.02724519543652 0.5772156574994485 0.5772156649054635
32 4.05849519543652 0.5772156583690728 0.5772156649047917
33 4.08879822573955 0.5772156591146631 0.57721566490425
34 4.118209990445433 0.5772156597568547 0.5772156649038104
35 4.146781419016861 0.577215660312387 0.5772156649034516
36 4.174559196794639 0.5772156607949213 0.5772156649031569
37 4.201586223821666 0.5772156612156711 0.577215664902914
38 4.22790201329535 0.5772156615838906 0.577215664902712
39 4.253543038936376 0.5772156619072569 0.5772156649025438
40 4.278543038936376 0.5772156621921698 0.5772156649024033
41 4.302933282838815 0.5772156624439875 0.5772156649022848
42 4.326742806648339 0.5772156626672178 0.5772156649021847
43 4.349998620601827 0.5772156628656684 0.5772156649020997
44 4.3727258933291 0.5772156630425677 0.5772156649020274
45 4.394948115551322 0.5772156632006642 0.5772156649019654
46 4.416687245986104 0.5772156633423058 0.5772156649019121
47 4.437963841730785 0.5772156634695057 0.5772156649018663
48 4.458797175064118 0.5772156635839953 0.5772156649018267
49 4.479205338329423 0.5772156636872687 0.5772156649017924
50 4.499205338329423 0.5772156637806188 0.5772156649017626
51 4.518813181466678 0.5772156638651685 0.5772156649017368
52 4.538043950697447 0.5772156639418947 0.5772156649017143
53 4.556911875225749 0.5772156640116504 0.5772156649016947
54 4.575430393744267 0.5772156640751811 0.5772156649016772
55 4.593612211926086 0.577215664133143 0.5772156649016624
56 4.611469354783229 0.5772156641861109 0.5772156649016494
57 4.629013214432351 0.5772156642345918 0.5772156649016372
58 4.646254593742697 0.5772156642790354 0.5772156649016271
59 4.66320374628507 0.5772156643198376 0.5772156649016178
60 4.679870412951736 0.5772156643573509 0.5772156649016096
61 4.696263855574687 0.5772156643918882 0.5772156649016019
62 4.712392887832752 0.5772156644237292 0.5772156649015958
63 4.728265903705767 0.5772156644531214 0.5772156649015899
64 4.743890903705767 0.5772156644802882 0.5772156649015847
65 4.759275519090383 0.5772156645054282 0.5772156649015799
66 4.774427034241898 0.577215664528721 0.5772156649015762
67 4.789352407376227 0.5772156645503264 0.5772156649015726
68 4.804058289729168 0.5772156645703888 0.5772156649015691
69 4.818551043352357 0.5772156645890391 0.5772156649015661
70 4.832836757638071 0.5772156646063945 0.5772156649015633
71 4.846921264680325 0.5772156646225621 0.5772156649015612
72 4.860810153569214 0.5772156646376377 0.5772156649015593
73 4.8745087837062 0.5772156646517078 0.5772156649015567
74 4.888022297219713 0.5772156646648526 0.5772156649015545
75 4.901355630553047 0.5772156646771447 0.5772156649015531
76 4.914513525289889 0.5772156646886492 0.5772156649015519
77 4.927500538276902 0.5772156646994255 0.5772156649015505
78 4.940321051097415 0.5772156647095285 0.5772156649015492
79 4.952979278945516 0.577215664719008 0.5772156649015481
80 4.965479278945517 0.5772156647279097 0.5772156649015472
81 4.977824957957862 0.5772156647362753 0.5772156649015462
82 4.990020079909081 0.5772156647441425 0.5772156649015449
83 5.002068272680166 0.5772156647515477 0.5772156649015442
84 5.013973034584928 0.5772156647585223 0.5772156649015433
85 5.025737740467281 0.577215664765096 0.5772156649015425
86 5.037365647444025 0.5772156647712965 0.5772156649015419
87 5.048859900317588 0.5772156647771484 0.577215664901541
88 5.060223536681224 0.5772156647826757 0.5772156649015406
89 5.071459491737405 0.5772156647878995 0.5772156649015404
90 5.082570602848516 0.5772156647928391 0.5772156649015399
91 5.093559613837527 0.5772156647975133 0.5772156649015394
92 5.104429179054918 0.5772156648019391 0.5772156649015392
93 5.115181867226961 0.5772156648061318 0.5772156649015389
94 5.125820165099301 0.5772156648101061 0.5772156649015384
95 5.136346480888775 0.5772156648138752 0.5772156649015376
96 5.146763147555442 0.5772156648174528 0.5772156649015376
97 5.157072425905957 0.5772156648208494 0.5772156649015373
98 5.16727650753861 0.5772156648240766 0.5772156649015374
99 5.177377517639621 0.577215664827144 0.5772156649015374
100 5.187377517639621 0.5772156648300606 0.577215664901537
101 5.197278507738631 0.5772156648328358 0.5772156649015368
102 5.207082429307258 0.5772156648354776 0.5772156649015365
103 5.216791167171336 0.5772156648379941 0.5772156649015368
104 5.226406551786721 0.5772156648403919 0.577215664901537
105 5.235930361310531 0.5772156648426773 0.5772156649015369
=========================================================
Let $H_n=\sum_{k=1}^n\frac1 k$. The Euler-Maclaurin summation formula gives $$H_n=\ln n+\gamma+\frac{1}{2n}+O(n^{-2}).$$ But $$\ln(n+1/2)=\ln n+\ln\left(1+\frac1{2n}\right)=\ln n+\frac1{2n}+O(n^{-2}).$$ Therefore $$H_n=\ln\left(n+\frac12\right)+\gamma+O(n^{-2}).$$ The error in the approximation is now $O(1/n^2)$ rather than $O(1/n)$.