Why does the default directional light position (0, 0, 1, 0) point toward the -z axis?

If I understand correctly, when the fourth component of the light position is zero (0), the light is a directional light and the first three components give its direction.
The default value is (0, 0, 1, 0), so shouldn’t the light point along the +z axis? Yet ‘The OpenGL Programming Guide’ book says it points along the -z axis, and so does
the documentation.
Am I missing something here?
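For reference, this is how I understand the two kinds of positions are set in the fixed-function API. It is only a minimal sketch (a GL context and the rest of the setup are assumed), with the (0, 0, 1, 0) value matching the documented default:

    #include <GL/gl.h>

    /* Directional light: w = 0, so the xyz part is read as a direction.
     * (0, 0, 1, 0) is the documented default for GL_POSITION.            */
    static const GLfloat directional[4] = { 0.0f, 0.0f, 1.0f, 0.0f };

    /* Positional (point) light: w = 1, so the xyz part is a point in space. */
    static const GLfloat positional[4]  = { 0.0f, 0.0f, 1.0f, 1.0f };

    void setup_light(void)
    {
        glEnable(GL_LIGHTING);
        glEnable(GL_LIGHT0);
        /* The only difference between the two calls is the w component. */
        glLightfv(GL_LIGHT0, GL_POSITION, directional);
        /* glLightfv(GL_LIGHT0, GL_POSITION, positional); */
    }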

Let’s say you have a point light at the position (0, 0, 1). If you have a vertex at the position (0, 0, 0), simple vector math dictates that the light vector (the vector from the light toward the vertex) will then be (0, 0, -1).
In homogeneous coordinates, the first three components are divided by the fourth component (w). So if w equals 1, the (real) normalized coordinates keep the values as specified; if w equals 2, the first three components end up at half their specified values, and so on.
Directional lights are lights so far away that all their rays are considered parallel. In homogeneous coordinates this is represented by setting the fourth component to 0, which makes the first three components tend toward infinity; you are then representing a light very far away. But to get the light vector you still use the same math: subtract the light ‘position’ from the vertex position.
Therefore, setting the light position to (0, 0, 1, 0) says there is a light far away in the direction (0, 0, 1), but the light vector will then be (0, 0, 0) - (0, 0, 1), that is, (0, 0, -1).
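A rough numeric illustration of those two steps (plain C, written just for this answer; nothing here is OpenGL API):

    #include <stdio.h>

    int main(void)
    {
        /* Homogeneous divide: (x, y, z, w) maps to (x/w, y/w, z/w). */
        float a[4] = { 0.0f, 0.0f, 1.0f, 1.0f };   /* -> (0, 0, 1)   */
        float b[4] = { 0.0f, 0.0f, 1.0f, 2.0f };   /* -> (0, 0, 0.5) */
        printf("w=1: (%g, %g, %g)\n", a[0] / a[3], a[1] / a[3], a[2] / a[3]);
        printf("w=2: (%g, %g, %g)\n", b[0] / b[3], b[1] / b[3], b[2] / b[3]);

        /* Light vector = vertex position - light position.
         * Vertex at the origin, light toward (0, 0, 1): gives (0, 0, -1). */
        float vertex[3] = { 0.0f, 0.0f, 0.0f };
        float light[3]  = { 0.0f, 0.0f, 1.0f };
        printf("light vector: (%g, %g, %g)\n",
               vertex[0] - light[0], vertex[1] - light[1], vertex[2] - light[2]);
        return 0;
    }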

The light position is the vector from the origin to the light.

For a positional light, the vector from the light to the vertex is the vertex position minus the light position. For a directional light, it’s the negation of the light position (the vertex position is ignored, as the light is at infinity).

So the default light position corresponds to the light being at the far end of the positive Z axis. The direction of the light is parallel to the negative Z axis.
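A minimal sketch of that rule (plain C; the function name and layout are only for this example, not an OpenGL API):

    /* Vector from the light to the vertex, per the rule above:
     *  - positional light (w != 0): vertex position minus light position
     *  - directional light (w == 0): the negated light "position"
     *    (the vertex position is ignored, as the light is at infinity)   */
    static void vector_from_light(const float light[4], const float vertex[3],
                                  float out[3])
    {
        for (int i = 0; i < 3; ++i) {
            if (light[3] != 0.0f)
                out[i] = vertex[i] - light[i] / light[3]; /* homogeneous divide */
            else
                out[i] = -light[i];
        }
    }

    /* With the default position (0, 0, 1, 0), out is (0, 0, -1) for every
     * vertex: a direction parallel to the negative Z axis.                */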