Is loading .tga images on Unix/Linux (saved in Photoshop on a PC) the same as loading on a PC?

I’ve saved a few tga images in Photoshop on PC.
In my mini game engine, the CTexture class supports loading TGAs and loads them just fine. When I took the code to Linux, the program doesn’t crash, but it doesn’t display the textures either. I’ve based my TGA file loading on the code from NeHe’s site and the “OpenGL Game Programming” book. Why isn’t it working on Unix running Mesa OpenGL?

Thank you,

That’s not an easy question to answer. You’ll have to debug your code to find out where the problem is, or post a copy of the code that loads the TGA here.


On Unix/Linux you may have a different endianness.

But I’d recommend using DevIL (on SourceForge), although it doesn’t load all files, only about 99.95% (there are some sporadic bugs with particular files).

“Endianness” is based on the system architecture, not the OS. If you are using an x86 architecture, you have a little-endian system whether you run Windows or Linux.

In any case, that might be one thing to look at if you are not running Linux on an x86 computer (Pentium, AMD, etc.).

As already stated, there are many reasons why your code might not be working. Without seeing it, there is no way for us to tell you what the problem is.

Hi, first of all thank you all for responding. Indeed, the Unix system runs on Sun workstations (not the same architecture as the PC I saved the TGA files on). My code is below, sorry about the length. If the issue is endianness, I have no idea how to deal with it :frowning: I never came across it until now, so any help would be greatly appreciated.
The whole texture.h is below.
Again, the TGA files were saved in Photoshop on a PC. TGA loading works perfectly on the PC. When I take it to the Sun workstation (Sun workstation and Mesa OpenGL for graphics), the textures do not load. How can I fix this endian issue (if that is it)? I’ve never dealt with it before. Huge thanks for all your help.

#ifndef __TEXTURE_H
#define __TEXTURE_H

#include <stdio.h>
#include <stdlib.h>
#include <math.h>
#include <GL/glut.h>

/*
Implements loading of tga textures. The tga files can be 24 or 32 bit.
This implementation is based on the code from the "OpenGL Game Programming"
book by Kevin Hawkins.
Supports alpha channels in the texture.
Limitations: will not load compressed tga textures.
*/

/*
Standard tga header.
Only part of the tga header is used here.
See the specification of the TGA file format.
*/
typedef struct {
	char imageTypeCode;
	short int imageWidth;
	short int imageHeight;
	char bitCount;
} TGAHEADER;

/* Holds the texture object */
typedef struct {

long int scaledWidth;
long int scaledHeight;
char tgaImageCode;		/* 0 = not TGA image, 2 = color, 3 = greyscale*/
int width;				/* width (in pixels) of the texture */
int height;				/* height (in pixels) of the texture */
int bitDepth;			/* holds the bit depth of the image (16, 24 or 32 bits supported) */
int texID;				/* Opengl texture id for texture object support */
char *data;				/* Actual pixel data */

} Texture;

/**************************** FUNCTIONS ********************************/
/* Clears the texture */
void UnloadTexture(Texture *texture) {

	glDeleteTextures(1, (GLuint *)&texture->texID);
	if (texture->data != NULL) {
		free(texture->data);
		texture->data = NULL;
	}
}

/*
Loads TGA file; since we are only using part of the header,
a lot of bytes will be skipped (defined as 'garbage' here).
*/

char* LoadTGAFile(char *filename, TGAHEADER *tgaHeader){

FILE *filePtr;
char ucharBad;				/* garbage data*/
short int	sintBad;		/* garbage data*/
long	imageSize;			/* size of TGA image*/
int colorMode;				/* 4 for RGBA, 3 for RGB*/
long imageIdx;				/* counter variable*/
char colorSwap;				/* swap variable*/
char *imageData;			/* the TGA data*/

/* open the TGA file */
filePtr = fopen(filename, "rb");
if (!filePtr)
	return NULL;

/* read first two bytes of garbage */
fread(&ucharBad, sizeof(char), 1, filePtr);
fread(&ucharBad, sizeof(char), 1, filePtr);

/* read in the image type */
fread(&tgaHeader->imageTypeCode, sizeof(char), 1, filePtr);

/* for our purposes, the image type should be either a 2 or a 3 */
if ((tgaHeader->imageTypeCode != 2) && (tgaHeader->imageTypeCode != 3)) {
	fclose(filePtr);
	return NULL;
}

/* skip 9 bytes of garbage data (color map spec and image origin) */
fread(&sintBad, sizeof(short int), 1, filePtr);
fread(&sintBad, sizeof(short int), 1, filePtr);
fread(&ucharBad, sizeof(char), 1, filePtr);
fread(&sintBad, sizeof(short int), 1, filePtr);
fread(&sintBad, sizeof(short int), 1, filePtr);

/* read image dimensions */
fread(&tgaHeader->imageWidth, sizeof(short int), 1, filePtr);
fread(&tgaHeader->imageHeight, sizeof(short int), 1, filePtr);

/* read bit depth */
fread(&tgaHeader->bitCount, sizeof(char), 1, filePtr);

/* read garbage */
fread(&ucharBad, sizeof(char), 1, filePtr);

/* color mode -> 3 = BGR, 4 = BGRA */
colorMode = tgaHeader->bitCount / 8;
imageSize = tgaHeader-&gt;imageWidth * tgaHeader-&gt;imageHeight * colorMode;

/* allocate memory for image data */
imageData = (char *)malloc(sizeof(char) * imageSize);

/* read image data */
fread(imageData, sizeof(char), imageSize, filePtr);

/* change BGR to RGB so OpenGL can use the data */
for (imageIdx = 0; imageIdx < imageSize; imageIdx += colorMode) {
	colorSwap = imageData[imageIdx];
	imageData[imageIdx] = imageData[imageIdx + 2];
	imageData[imageIdx + 2] = colorSwap;
}

/* close the file */
fclose(filePtr);

return imageData;
}

/* Loads a tga file into a texture structure */
void LoadTGATexture(Texture *texture, char *filename) {

	TGAHEADER tga;

	texture->data = LoadTGAFile(filename, &tga);
	if (texture->data == NULL)
		return;

	/* store texture information */
	texture->width = tga.imageWidth;
	texture->height = tga.imageHeight;
	texture->scaledHeight = 0;
	texture->scaledWidth = 0;
	texture->tgaImageCode = tga.imageTypeCode;
	texture->bitDepth = tga.bitCount;
}

#endif /* __TEXTURE_H */

I’ve never worked with Suns, but my guess is that they use a big-endian architecture. Try using the following macros on all your ints and shorts after you’ve read them from the file.

Note: I haven’t tested these, but I think they should be close…

#define LITTLE_TO_BIG_32(v) \
	((((v) & 0xFF000000) >> 24) | \
	 (((v) & 0x00FF0000) >> 8)  | \
	 (((v) & 0x0000FF00) << 8)  | \
	 (((v) & 0x000000FF) << 24))

#define LITTLE_TO_BIG_16(v) \
	((((v) & 0xFF00) >> 8) | \
	 (((v) & 0x00FF) << 8))

// Usage:
int i = LITTLE_TO_BIG_32(intValue);
short int s = LITTLE_TO_BIG_16(shortValue);

Thank you, I’ll check that out.

Deiussum, should I only convert the ints when loading the tga texture? Or all the ints in the program?
Also, will it make a difference if the tga were saved on PC or Sun? For instance, if I made the tga images and saved them on Sun (as opposed to what I did now - saved them on PC and tried reading on Sun) would I still have to do the conversion of ints?
Finally, are only ints affected? What about chars, floats, doubles, etc.?

Huge thanks,

You’d only have to convert the ints read in from the TGA. Also, the TGA spec says that these are stored in a little-endian manner, so it shouldn’t matter which system you save the TGA on.

I’d probably actually do something like so:

// You’ll have to determine how to decide if LITTLE_ENDIAN is defined or not
// This is just an example.
#define LITTLEEND_TO_HOST_32(v) v
#define LITTLEEND_TO_HOST_16(v) v

// Example usage:
int i = LITTLEEND_TO_HOST_32(intValue);
short s = LITTLEEND_TO_HOST_16(shortValue);

This would not change the values at all for little-endian systems, but would do the byte swapping for big-endian systems.
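Concretely, the conditional definitions might look something like the sketch below. The `IS_LITTLE_ENDIAN` symbol is an assumption of mine; how you actually detect the host byte order (a compiler macro, a configure-time check, etc.) is up to you:

```c
/* Assume IS_LITTLE_ENDIAN is defined (or not) by your build system. */
#ifdef IS_LITTLE_ENDIAN
#define LITTLEEND_TO_HOST_16(v) (v)
#define LITTLEEND_TO_HOST_32(v) (v)
#else
/* Big-endian host: reverse the byte order of the little-endian value. */
#define LITTLEEND_TO_HOST_16(v) \
	((unsigned short)((((v) & 0xFF00u) >> 8) | (((v) & 0x00FFu) << 8)))
#define LITTLEEND_TO_HOST_32(v) \
	((((v) & 0xFF000000u) >> 24) | (((v) & 0x00FF0000u) >> 8) | \
	 (((v) & 0x0000FF00u) << 8)  | (((v) & 0x000000FFu) << 24))
#endif
```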

The reason for this is that if you have something like so:

short s = 0x0123;

It is stored in memory on little-endian systems as 0x2301. (Note the bytes are swapped.) On big-endian systems it is stored as 0x0123.

For 32-bit integers the two 16-bit words are swapped and the bytes within each word are swapped as well, so you end up with:

int i = 0x01234567;

Little-endian memory: 0x67452301
Big-endian memory: 0x01234567

You don’t have to worry about chars, since they are a single byte. And since the TGA spec doesn’t have any floats or doubles (at least, not that I can recall), you shouldn’t have to worry about them either.

[This message has been edited by Deiussum (edited 12-01-2003).]