off topic but important

I am creating an app using OpenGL that needs to be platform independent. Originally it was created on an SGI workstation. My program reads many binary files, and this is my problem: binary files on IRIX are big-endian and on Linux they are little-endian. I have written a converter to switch between the two, and it works. However, on Windows it does not. I was under the impression that Windows used little-endian. I also assumed that big/little endian was a processor thing and was not based on what OS one uses. I thought little-endian was for Intel/AMD processors, so I figured that if little-endian worked on a Linux machine with an Intel processor, it would work on a Windows machine with an Intel processor. Can someone help me out? I hope I was clear.
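For context, the core of such a converter is usually a byte-swap routine plus a run-time endianness check. A minimal sketch (function names are mine, not from the original converter):

```c
#include <stdint.h>

/* Swap the byte order of a 32-bit value (little <-> big endian). */
uint32_t swap32(uint32_t v)
{
    return ((v & 0x000000FFu) << 24) |
           ((v & 0x0000FF00u) << 8)  |
           ((v & 0x00FF0000u) >> 8)  |
           ((v & 0xFF000000u) >> 24);
}

/* Detect the host's endianness at run time by inspecting the
 * first byte of a 16-bit value. */
int host_is_little_endian(void)
{
    uint16_t probe = 1;
    return *(unsigned char *)&probe == 1;
}
```

If this logic works on Linux but not on Windows on the same Intel hardware, the byte order itself is almost certainly not the culprit, since both OSes run the CPU little-endian.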

Well obviously, if it does not work the way you assumed, then either your assumption is reversed (and the data is big-endian, like on IRIX), or something totally different is wrong. Maybe something file-system specific?

I guess this does not really help you…


On the Windows platform, you’ve got CR-LF (carriage return & line feed) at the end of each line. The ASCII code for carriage return is 13 and for line feed is 10.
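There is one way CR-LF can bite a binary reader on Windows: a file opened in text mode gets CR-LF translated to LF, and a stray Ctrl-Z byte (0x1A) is treated as end-of-file. A sketch of the safe way to open binary files (the helper name is mine):

```c
#include <stdio.h>

/* On Windows, fopen with "r" (text mode) translates CR-LF to LF and
 * stops reading at a 0x1A byte, silently corrupting binary data.
 * Passing "rb" disables both; on IRIX/Linux the 'b' is harmlessly
 * ignored, so it is safe everywhere. */
FILE *open_binary(const char *path)
{
    return fopen(path, "rb");
}
```

So if the converter opens its files without the "b" flag, it can work on Linux and break on Windows even though both machines are little-endian.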

I don’t think RunningRabbit’s post will help you, since CR-LF applies to ASCII (text) files, and for those you read only chars, so there is no need to swap.
Maybe you should check whether the Windows compiler makes some data-alignment optimizations that the SGI and Linux compilers do not.

Sorry for the misleading answer in the previous post.
Just search the Microsoft MSDN; the following is a quote:
"Intel-based machines store data in the reverse order of Macintosh (Motorola) machines. The Intel byte order, called 'little-Endian,' is also the reverse of the network standard 'big-Endian' order."
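Since the quote mentions the network-standard big-endian order: the sockets API already provides conversion helpers, so if the IRIX files are big-endian you can decode them portably with ntohl(), which swaps on little-endian hosts and is a no-op on big-endian ones. A sketch (assumes the file stores 32-bit big-endian values):

```c
#include <arpa/inet.h>  /* ntohl(); on Windows this lives in <winsock2.h> */
#include <stdint.h>
#include <string.h>

/* Interpret 4 raw bytes from a big-endian file as a host-order integer. */
uint32_t load_be32(const unsigned char *p)
{
    uint32_t raw;
    memcpy(&raw, p, 4);  /* bytes land in memory in file order */
    return ntohl(raw);   /* swap on little-endian hosts, no-op on big-endian */
}
```

The same code then gives the right answer on IRIX, Linux, and Windows without any #ifdef on the host byte order.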