nVidia GLSL compilation bug

Hello

While compiling various vertex/fragment shaders on my GeForce 6600, I often get the following compilation error:

Fragment info

(X) : error C0000: syntax error, unexpected $undefined at token "<undefined>"
(X) : error C0501: type name expected at token "<undefined>"

or

Fragment info

(X) : error C0000: syntax error, unexpected '[' at token "["
(X) : error C0501: type name expected at token "["
(X) : error C0000: syntax error, unexpected '(' at token "("
(X) : error C0501: type name expected at token "("

With (X) being the last line of the shader (that is, the line with the closing "}").

If I remember correctly, I had the same trouble with my old GeForce FX too, with other shaders I wrote. It's extremely annoying because this compilation error is not caused by any GLSL syntax error, but seems to be bound to some unknown cause… I suspect a misinterpretation of the "return" (newline) character, or… For any nVidia staff passing by, here's a fragment shader that triggers it:
http://www.divideconcept.net/misc/3subpixlevelf.txt
I prefer giving a link to the original file rather than pasting the code, because I suspect a single byte in it (visible only in a hex view) is causing the trouble.
What leads me to say that is that tweaking the file (removing a blank line, or deleting and re-typing a newline) makes it work most of the time, without changing ANY line of the actual GLSL code.
Before any remark: yes, I know there are unused variables in the fragment shader I gave. That's because it was a longer shader that I reduced to just what keeps the error. Removing a few more lines makes the problem disappear.
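
If anyone wants to hunt for a stray byte themselves, a quick hex dump shows everything a text editor hides. This is only a sketch, and the file name is just the example from the link above:

#include <stdio.h>

/* print every byte of a file in hexadecimal, 16 per row,
   to spot stray characters a text editor would not show */
int main(void) {
    FILE *fp = fopen("3subpixlevelf.txt", "rb"); /* binary mode: no newline translation */
    if (fp == NULL) return 1;

    int c, i = 0;
    while ((c = fgetc(fp)) != EOF) {
        printf("%02x ", c);
        if (++i % 16 == 0) printf("\n");
    }
    printf("\n");
    fclose(fp);
    return 0;
}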

As a final note: I've had this trouble ever since GLSL was implemented in nVidia's drivers, with a few shaders I wrote on different projects, so it happens regardless of the card model or driver version (I currently use 91.31).
And often, just tweaking the file - playing with the layout WITHOUT changing the actual GLSL code - makes it work.

So, there you go…

Thanks in advance, and despite this annoying bug I must say I've never been disappointed with nVidia's products!

A few posts down from yours:
http://www.opengl.org/discussion_boards/ubb/ultimatebb.php?ubb=get_topic;f=11;t=001162

Thanks, sorry I didn't have a look before posting.
I'll make a few tests.

It works! Thanks for pointing to the previous thread, and to brtnrdr for pointing me to the lighthouse3d.com source code, which loads shaders perfectly.

As a reference, here's the source code from lighthouse3d for loading a shader (note how it null-terminates the buffer using the count actually returned by fread):

char *ShaderSource;
ShaderSource = textFileRead(ShaderPath);  /* usage: ShaderPath is the shader file name */


#include <stdio.h>
#include <stdlib.h>

char *textFileRead(char *fn) {

    FILE *fp;
    char *content = NULL;
    int count = 0;

    if (fn != NULL) {
        fp = fopen(fn, "rt");

        if (fp != NULL) {

            /* measure the file size */
            fseek(fp, 0, SEEK_END);
            count = ftell(fp);
            rewind(fp);

            if (count > 0) {
                content = (char *)malloc(sizeof(char) * (count + 1));
                /* in text mode, fread can return fewer bytes than
                   ftell reported, so keep its return value */
                count = fread(content, sizeof(char), count, fp);
                /* null-terminate the source string; a loader that
                   skips this hands the driver trailing garbage, which
                   would match the end-of-file errors above */
                content[count] = '\0';
            }
            fclose(fp);
        }
    }
    return content;
}
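
For completeness, here's roughly how the loaded buffer then gets compiled and how the "Fragment info" log shown above is read back. This is only a sketch using the OpenGL 2.0 entry points (the older ARB-suffixed ones behave the same way); the detail that matters is that passing NULL for the lengths array tells the driver to read each string up to its terminating '\0':

GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
const char *src = ShaderSource;

/* NULL lengths: the driver reads up to '\0',
   which is why content[count] = '\0' above is essential */
glShaderSource(shader, 1, &src, NULL);
glCompileShader(shader);

GLint compiled;
glGetShaderiv(shader, GL_COMPILE_STATUS, &compiled);
if (!compiled) {
    char log[4096];
    glGetShaderInfoLog(shader, sizeof(log), NULL, log);
    printf("%s\n", log);  /* the "Fragment info" messages come from here */
}

free(ShaderSource);  /* textFileRead's buffer was malloc'd */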
