Sharp downsampling / downscaling

I am wondering how OpenGL downscales textures.
Here’s an example:

The top is the original size texture with some text.
The second one is the same texture rendered at 0.5 size.
The last one is my CPU downscaled buffer uploaded to the new texture (0.5 of the original texture size).

The one downscaled by OpenGL looks very sharp and clean, while my version looks blurry and soft. I have tried resizing with different algorithms, even in Photoshop with the detail-preserving options, but I never get the quality OpenGL gives me.
How can I get that result when I no longer have the texture at its original size?

It’s much more visible with some background colors:

OpenGL supports either nearest-neighbour sampling (GL_NEAREST) or linear interpolation (GL_LINEAR). The filter is set on a per-texture basis with glTexParameteri. For downsampling, it also supports mipmaps: the minification filters such as GL_LINEAR_MIPMAP_LINEAR sample from pre-filtered, successively halved copies of the texture, which is why downscaled rendering stays sharp. In short, nearest-neighbour tends to produce jagged edges when feature sizes are close to or less than a pixel, while plain linear interpolation looks blurred.

For textures with large areas of solid colour, nearest-neighbour will probably look neater provided that feature sizes remain significantly larger than a pixel. FWIW, the preferred way to scale “decal” textures is to use signed distance maps with a suitable fragment shader.