Using textures instead of points in a map (OpenGL ES 2.0)

Hi, I added glBlendFunc() and glEnable(GL_BLEND); visually this did not have an effect. I have tried this in a couple of different spots without luck.

[Edit] Actually it was the image itself. I tried another image that I knew already worked with OpenGL ES 2.0, from a downloadable tutorial: http://www.learnopengles.com/android-lesson-seven-an-introduction-to-vertex-buffer-objects-vbos/
This image rendered; however, it was upside down.

My image was a PNG that I had created in Fireworks; it had two layers in it, Layer 1 a bitmap and Layer 2 a vector path. It was also transparent around the edges.

Upon finding that the tutorial image did work, I altered it a little by wiping out a corner of the bitmap, making it transparent. This causes the corner to render black while the rest shows fine.

Now I can see that adding your glBlendFunc() and glEnable(GL_BLEND) provides the transparency.
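For reference, here is a minimal sketch of the blend state being discussed, assuming the Android GLES20 bindings (the method name is hypothetical); it would be called once during surface setup, before drawing the textured points:

```java
import android.opengl.GLES20;

// Enable standard "over" alpha compositing so the transparent parts of the
// texture actually show the background through.
void enableAlphaBlending() {
    GLES20.glEnable(GLES20.GL_BLEND);
    // Source weighted by its alpha, destination by one minus source alpha.
    GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);
}
```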

Questions

  1. What are the specifications when creating a texture?
  2. Is it my PNG that is not compatible?
  3. Is my texture coordinate array in the fragment shader working properly? I ask because no matter how I manipulate the values, the image always displays upside down. I have tried reversing the t coords, and tried reversing all coords; nothing seems to change.

After looking at the images that were working, I found that creating the texture at 256x256 must have been a factor; my old texture was 54x54. After increasing the size it worked.
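The size dependence fits the core OpenGL ES 2.0 rule that non-power-of-two textures are only guaranteed to work with GL_CLAMP_TO_EDGE wrapping and no mipmaps; 54 is not a power of two, 256 is. A quick check (helper name is mine, not from the thread):

```java
// Core OpenGL ES 2.0 only guarantees full texture support (mipmapping,
// GL_REPEAT/GL_MIRRORED_REPEAT wrapping) for power-of-two dimensions,
// which is why a 54x54 image can fail where 256x256 works.
static boolean isPowerOfTwo(int n) {
    // A power of two has exactly one bit set, so n & (n - 1) clears it to zero.
    return n > 0 && (n & (n - 1)) == 0;
}
```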

So the last thing I'm wondering about is why it is upside down. Reversing the t coord made no change.

final float[] squareNDRRATextureCoordinateData =
{
	0.0f, 0.0f,
	0.0f, 1.0f,
	1.0f, 0.0f,
	0.0f, 1.0f,
	1.0f, 1.0f,
	1.0f, 0.0f
};

The usual reason is a combination of:

[ol]
[li] Textures are typically (but not always) stored with the top-most row first (and texture-loading functions typically return the top-most row first regardless of storage order).
[/li][li] OpenGL interprets the data passed to glTexImage2D as starting at texture coordinate (0,0).
[/li][li] The texture coordinates have been assigned on the basis that (0,0) is the lower-left (consistent with OpenGL screen coordinates).
[/li][/ol]

Because the shader is using the texture coordinates to offset the vertex coordinates from the point’s centre, if you flip the texture coordinates passed to the shader, you’ll also flip the vertex coordinates, so the two cancel each other out.

You’ll need to either reverse the order in which the rows are passed to glTexImage2D, or make the vertex shader flip the Y coordinate (for either the vertex coordinates or the texture coordinates, but not both).
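The first option can be sketched for a tightly packed RGBA byte buffer (the helper name is mine): copy the rows in reverse order before uploading with glTexImage2D.

```java
// Flip a tightly packed RGBA pixel buffer top-to-bottom, so that the first
// row handed to glTexImage2D is the image's bottom row, matching OpenGL's
// convention that (0,0) is the lower-left of the texture.
static byte[] flipRows(byte[] pixels, int width, int height) {
    final int stride = width * 4;   // 4 bytes per RGBA pixel, no row padding
    byte[] flipped = new byte[pixels.length];
    for (int row = 0; row < height; row++) {
        System.arraycopy(pixels, row * stride,
                         flipped, (height - 1 - row) * stride, stride);
    }
    return flipped;
}
```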

GClements, thank you for all your help! My textured points look fantastic; this issue is now resolved.
Curious, do you have much experience with GIS?

Used this to flip the texture coords:

	protected String getPointVertexShader()
	{
		// Define a simple shader program for our points.
		final String vertexShader =
			  "uniform mat4 u_MVPMatrix;      \n"
			+ "uniform vec2 u_pointSize;      \n"
			+ "attribute vec4 a_Position;     \n"
			+ "attribute vec2 a_TexCoordinate;\n"	// Per-vertex texture coordinate information we will pass in.
			+ "varying vec2 v_TexCoordinate;  \n"	// This will be passed into the fragment shader.
			+ "void main()                    \n"
			+ "{                              \n"
			//+ "   v_TexCoordinate = a_TexCoordinate;\n"	// Original pass-through (rendered upside down).
			+ "   v_TexCoordinate = a_TexCoordinate.st * vec2(1.0, -1.0);\n"	// Flip t, then pass the texture coordinate through.
			+ "   gl_Position = u_MVPMatrix * a_Position;\n"	// gl_Position is a special variable used to store the final position.
			+ "   gl_Position += vec4(gl_Position.w * u_pointSize * (a_TexCoordinate - vec2(0.5, 0.5)), 0.0, 0.0);\n"
			+ "}                              \n";
		return vertexShader;
	}

I’m moderately active in developing GRASS.

OK, hopefully you might know what's going on here. I have this map that I'm trying to design on Android using OpenGL ES 2.0. For the most part everything is looking pretty good; I have plotted out lines and points as per their coordinates.

However, when I do a comparison in MapInfo there are noticeable variations in the way the lines are drawn. I have attached two images; the first is the MapInfo image, the second is from my Android (apologies for the pixelation).
Looking at the vertical lines in the MapInfo image, they seem more in line (straighter) than on the Android. I realize that GIS systems warp the maps somehow; I'm just wondering if you know what method they use and whether I can do the same thing with OpenGL?
[ATTACH=CONFIG]514[/ATTACH]

[ATTACH=CONFIG]515[/ATTACH]

[QUOTE=Hank Finley;1254620]However when I do a comparison in MapInfo there are noticeable variations in the way lines are drawn. I have attached two images, first is the MapInfo image the second is from my android, (apologies about the pixelation)
Looking at the vertical lines in the MapInfo image, they seem more inline (straighter) than on the android. I realize that the GIS system warp the maps somehow, just wondering if you know what method they use and if I can do the same thing with OpenGL?[/QUOTE]
From looking at the image, I’d suspect a precision issue. Ensure that any fixed offset is removed before converting from int or double to float (i.e. for the coordinates passed to OpenGL, 0,0 should be the centre of your data set, not the Gulf of Guinea). Try adding “highp” qualifiers to any shader variables.

Cartographic projections (e.g. lat-lon to UTM) won’t have noticeable curvature at the scale of a street or even a town, and they won’t introduce “wobble”. At worst, they may introduce an affine transformation (translation, scale, rotation, shear), but applying an affine transformation to points which lie in a straight line results in transformed points which also lie in a straight line.
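That collinearity claim is easy to check numerically. A small sketch (helper names are mine): apply an arbitrary affine map, including shear and translation, to three collinear points and test collinearity via the cross product of the edge vectors.

```java
// Apply a 2D affine transform: linear part (a b; c d) plus translation (tx, ty).
static double[] affine(double[] p, double a, double b, double c, double d,
                       double tx, double ty) {
    return new double[]{ a * p[0] + b * p[1] + tx,
                         c * p[0] + d * p[1] + ty };
}

// Three points are collinear iff the cross product of the edge vectors
// (q - p) and (r - p) is (numerically) zero.
static boolean collinear(double[] p, double[] q, double[] r) {
    double cross = (q[0] - p[0]) * (r[1] - p[1])
                 - (q[1] - p[1]) * (r[0] - p[0]);
    return Math.abs(cross) < 1e-9;
}
```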

The data is in projection EPSG:4326 (lat-long).
The data is stored as doubles; as soon as I extract it, it is converted to floating point.
After this I assess the min and max bounds of all the data, from which I calculate my offset.
I then create the various VBOs using the floating-point coordinates minus the offset.

The only thing I tried was setting the precision in the shader:
(No visible change)

protected String getLineVertexShader()
{
	final String vertexShader =
		  "precision highp float;       \n"	// Set the default precision to high.
		+ "uniform mat4 u_MVPMatrix;    \n"	// A constant representing the combined model/view/projection matrix.
		+ "attribute vec4 a_Position;   \n"	// Per-vertex position information we will pass in.
		+ "void main()                  \n"	// The entry point for our vertex shader.
		+ "{                            \n"
		+ "   gl_Position = u_MVPMatrix * a_Position;\n"	// Multiply the vertex by the matrix to get the final point in
		+ "   gl_PointSize = 15.0;      \n"	// normalized screen coordinates.
		+ "}                            \n";
	return vertexShader;
}

protected String getLineFragmentShader()
{
	final String fragmentShader =
		  "precision highp float;       \n"	// Set the default precision to high.
		+ "uniform vec4 u_Color;        \n"	// This is the color from the vertex shader interpolated across the triangle per fragment.
		+ "void main()                  \n"	// The entry point for our fragment shader.
		+ "{                            \n"
		+ "   gl_FragColor = u_Color;   \n"	// Pass the color directly through the pipeline.
		+ "}                            \n";
	return fragmentShader;
}

Can you clarify what you mean here?

If the data is converted to single precision (i.e. “float”) before removing the offset, that’s going to hurt the precision. Specifically, coordinates in the region of (152,-27) stored in single precision will have a precision of around 1.5 metres in the X direction and 0.2 metres in the Y.
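The point above can be sketched as follows (the helper name is mine): subtract the dataset centre while still in double precision, and only then narrow to float. Near longitude 152 a float ulp is about 1.5e-5 degrees, roughly the 1.5 metres quoted, so any sub-metre detail still present when the cast happens is already gone.

```java
// Subtract the dataset centre in double precision FIRST, then cast to float.
// Casting a raw EPSG:4326 coordinate like 152.000001 to float before
// subtracting rounds away everything below ~1.5e-5 degrees of longitude.
static float[] toLocalFloats(double[] lonLat, double centreX, double centreY) {
    float[] out = new float[lonLat.length];
    for (int i = 0; i < lonLat.length; i += 2) {
        out[i]     = (float) (lonLat[i]     - centreX);  // local x, small magnitude
        out[i + 1] = (float) (lonLat[i + 1] - centreY);  // local y, small magnitude
    }
    return out;
}
```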

Hi, I finally got around to modifying the way these coordinates are set. It has worked perfectly; thanks for all the help and advice!
Kind regards, Hank.