I’m currently coding a skysphere in outer space, and I have stars on it. These stars are rendered as a vertex array of GL_LINES, so that they leave trails when the viewer moves. However, when the movement is very slow, both ends of a line land on the same pixel and nothing is drawn.

Is there any way to tell OpenGL to render a dot if the line is on one pixel?

Not really obvious how. To calculate the length of a line I’d need to project both endpoints of each of about 1000 lines to 2D first… and if the GPU is doing the transformation anyway, that would mean doing the work twice.

Are your “star lines” saved as two distinct points or as a position and velocity vector?
The latter would be preferable.

Get the post-transform w value of each star position: build a combined modelview–projection matrix in system memory, then for each star take the dot product of the position vector with the bottom (w) row of that combined matrix — in OpenGL’s column-major layout, elements 3, 7, 11 and 15.

Clamp the length of the velocity vector to be at least that post-transform w. This ensures the line has a length of at least one unit in post-projection (clip) space, after which it should be visible. It’s an approximation, though.

The extra work required is one dot product (DOT4) per star and one matrix*matrix mult per frame.

The no-brainer method would be to send the exact same data twice: once as points, once as lines.
Double the stride and halve the count for the point pass to save geometry bandwidth.
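A sketch of that double-submit idea. To keep it self-contained and verifiable, the few GL entry points used are stubbed out to record what would be submitted; in a real program you would include `<GL/gl.h>` and delete the stubs (the function name and vertex layout are assumptions):

```c
#include <stddef.h>

/* Stubs standing in for <GL/gl.h>, recording the submitted parameters. */
typedef unsigned int GLenum;
typedef int GLint;
typedef int GLsizei;
enum { GL_VERTEX_ARRAY = 0x8074, GL_FLOAT = 0x1406,
       GL_POINTS = 0x0000, GL_LINES = 0x0001 };

static GLsizei last_stride[2];
static GLsizei last_count[2];
static int     pass;

static void glEnableClientState(GLenum cap)  { (void)cap; pass = 0; }
static void glDisableClientState(GLenum cap) { (void)cap; }
static void glVertexPointer(GLint size, GLenum type, GLsizei stride,
                            const void *ptr)
{ (void)size; (void)type; (void)ptr; last_stride[pass] = stride; }
static void glDrawArrays(GLenum mode, GLint first, GLsizei count)
{ (void)mode; (void)first; last_count[pass] = count; ++pass; }

/* Draw each star's trail as a line, then re-submit the same array as
 * points.  Doubling the stride and halving the count makes the point
 * pass touch only the first endpoint of each line, so a trail that
 * collapses onto one pixel still shows up as a dot. */
static void draw_star_trails(const float *verts, int star_count)
{
    glEnableClientState(GL_VERTEX_ARRAY);

    /* Pass 1: the trails, two vertices per star. */
    glVertexPointer(3, GL_FLOAT, 3 * sizeof(float), verts);
    glDrawArrays(GL_LINES, 0, star_count * 2);

    /* Pass 2: the dots, every second vertex only. */
    glVertexPointer(3, GL_FLOAT, 6 * sizeof(float), verts);
    glDrawArrays(GL_POINTS, 0, star_count);

    glDisableClientState(GL_VERTEX_ARRAY);
}
```

No extra vertex data is needed: the point pass simply re-reads the same interleaved buffer with a doubled stride.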