I have an image with 4-bit luminance and 4-bit alpha which I want to use as an OpenGL texture.

Now, I can tell glTexImage2D() to use GL_LUMINANCE4_ALPHA4 as the internal format for the texture, but I see no way to feed data in this packed format into GL (I guess I'd need GL_LUMINANCE_ALPHA with a GL_UNSIGNED_BYTE_4_4 type, but no such type exists). Is there any way to hand the data to GL directly, or do I have to convert it to 8-bit luminance and 8-bit alpha first? The latter seems somewhat silly, because GL will just convert it right back if I choose GL_LUMINANCE4_ALPHA4 as the internal format and the card supports it.
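For reference, here is the conversion path I'm trying to avoid: a sketch that expands the packed data into GL_LUMINANCE_ALPHA / GL_UNSIGNED_BYTE before upload. It assumes luminance sits in the high nibble and alpha in the low nibble of each byte (my packing; yours may differ), and the helper name `expand_l4a4` is just something I made up:

```c
#include <stddef.h>
#include <stdint.h>

/* Expand packed 4-bit luminance / 4-bit alpha (one byte per texel;
 * luminance assumed in the high nibble, alpha in the low nibble)
 * into two bytes per texel for GL_LUMINANCE_ALPHA / GL_UNSIGNED_BYTE.
 * Multiplying a nibble by 17 replicates it into both halves of the
 * byte, mapping 0x0..0xF onto the full 0x00..0xFF range. */
void expand_l4a4(const uint8_t *src, uint8_t *dst, size_t texels)
{
    for (size_t i = 0; i < texels; ++i) {
        uint8_t l = (uint8_t)(src[i] >> 4);
        uint8_t a = (uint8_t)(src[i] & 0x0F);
        dst[2 * i + 0] = (uint8_t)(l * 17); /* luminance */
        dst[2 * i + 1] = (uint8_t)(a * 17); /* alpha     */
    }
}

/* The upload would then be (GL context required):
 *
 * glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE4_ALPHA4,
 *              width, height, 0,
 *              GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, dst);
 */
```

With GL_LUMINANCE4_ALPHA4 as the internal format, the driver is then free to quantize those 8-bit values straight back down to 4+4 bits, which is the round trip I'd like to skip.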