can't figure out alpha blitting

Look at the actual data you have.

Bind the FBO. Then,

GLint bits;
glGetIntegerv(GL_ALPHA_BITS, &bits);
printf("The FBO drawable actually has %d alpha bits\n", bits);

If you bind back to FBO zero (the window-system drawable) and repeat the query, you should get zero back if the window drawable really has no alpha bits.

Also double check the internal format of the attachment you’re rendering into. It’s possible for the driver to pick a different precision than the one you requested, but if you asked for alpha when you created the attachment, the driver has to give you alpha.

If you’re rendering into a texture:
GLint internal;
glBindTexture(<texture target>, <your attachment id>);
glGetTexLevelParameteriv(<texture target>, <level you attached to the fbo>, GL_TEXTURE_INTERNAL_FORMAT, &internal);
printf("Actual texture internal format is %04x\n", internal);

Or, if you’re rendering into a renderbuffer:
GLint internal;
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, <your attachment id>);
glGetRenderbufferParameterivEXT(GL_RENDERBUFFER_EXT, GL_RENDERBUFFER_INTERNAL_FORMAT_EXT, &internal);
printf("Actual renderbuffer internal format is %04x\n", internal);

If you created the attachment with the generic, unsized internal format GL_RGBA, this query should come back with a sized format such as GL_RGBA8 — that tells you which precision the driver actually chose.