
Converting GLSL to modern OpenGL 3.2

I'm following a freetype tutorial on wikibooks on a Mac running 10.9 with Xcode 5. I have it running with shader version 120, but I want to use some modern features, so I set the SDL hints to OpenGL 3.2 and converted my shaders to 150. The problem is that in version 150, any use of texture2D prevents the shader from compiling.

Here is the version 120 of the shaders:

    const GLchar* vTextSource =
        "#version 120\n"
        "attribute vec4 coord;"
        "varying vec2 texcoord;"
        "void main(void) {"
        "  gl_Position = vec4(coord.xy, 0, 1);"
        "  texcoord = coord.zw;"
        "}";

    const GLchar* fTextSource =
        "#version 120\n"
        "varying vec2 texcoord;"
        "uniform sampler2D tex;"
        "uniform vec4 color;"
        "void main(void) {"
        "  gl_FragColor = vec4(1, 1, 0, texture2D(tex, texcoord).a) * color;"
        "}";

And this is what I have for version 150. The vertex shader builds, but the fragment shader fails unless I remove every use of texture2D. All the CPU-side code is the same between the two versions.

    const GLchar* vTextSource =
        "#version 150\n"
        "in vec4 coord;"
        "out vec2 texcoord;"
        "void main(void) {"
        "  gl_Position = vec4(coord.xy, 0, 1);"
        "  texcoord = coord.zw;"
        "}";

    const GLchar* fTextSource =
        "#version 150\n"
        "uniform sampler2D tex;"
        "in vec2 texcoord;"
        "uniform vec4 color;"
        "out vec4 outColor;"
        "void main(void) {"
        "  outColor = vec4(1, 1, 1, texture2D(tex, texcoord).a) * color;"
        "}";

Is there something I am missing? Is setting up the texture sampler different in core profile than in compatibility mode?

Edit: I have changed texture2D(...) to texture(...) in the fragment shader. It compiles now but renders nothing. I'm not sure how to go about debugging this. Here is the texture initialization routine:

    void SdlApplication::render_text(const char *text, float x, float y, float sx, float sy) {
        const char *p;
        FT_GlyphSlot g = face->glyph;

        /* Create a texture that will be used to hold one "glyph" */
        GLuint tex;
        glActiveTexture(GL_TEXTURE0);
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glUniform1i(ogl.uniform_tex, 0);

        /* We require 1 byte alignment when uploading texture data */
        glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

        /* Clamping to edges is important to prevent artifacts when scaling */
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

        /* Linear filtering usually looks best for text */
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

        /* Set up the VBO for our vertex data */
        glEnableVertexAttribArray(ogl.attribute_coord);
        glBindBuffer(GL_ARRAY_BUFFER, vboText);
        glVertexAttribPointer(ogl.attribute_coord, 4, GL_FLOAT, GL_FALSE, 0, 0);

        /* Loop through all characters */
        for (p = text; *p; p++) {
            /* Try to load and render the character */
            if (FT_Load_Char(face, *p, FT_LOAD_RENDER))
                continue;

            /* Upload the "bitmap", which contains an 8-bit grayscale image, as an alpha texture */
            glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA, g->bitmap.width, g->bitmap.rows, 0,
                         GL_ALPHA, GL_UNSIGNED_BYTE, g->bitmap.buffer);

            /* Calculate the vertex and texture coordinates */
            float x2 = x + g->bitmap_left * sx;
            float y2 = -y - g->bitmap_top * sy;
            float w = g->bitmap.width * sx;
            float h = g->bitmap.rows * sy;

            point box[4] = {
                {x2,     -y2,     0, 0},
                {x2 + w, -y2,     1, 0},
                {x2,     -y2 - h, 0, 1},
                {x2 + w, -y2 - h, 1, 1},
            };

            /* Draw the character on the screen */
            glBufferData(GL_ARRAY_BUFFER, sizeof box, box, GL_DYNAMIC_DRAW);
            glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

            /* Advance the cursor to the start of the next character */
            x += (g->advance.x >> 6) * sx;
            y += (g->advance.y >> 6) * sy;
        }

        glDisableVertexAttribArray(ogl.attribute_coord);
        glDeleteTextures(1, &tex);
    }

Edit 2: Added a VAO to my vertex setup. I now get squares of solid color where the text should be, so it seems the texture coordinates are wrong again.

I added error checks after each call and found that I get error code 1280 (GL_INVALID_ENUM) right after glewInit, but not before...
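For what it's worth, the 1280 right after glewInit is a known quirk rather than a bug in your own calls: glewInit() internally calls glGetString(GL_EXTENSIONS), which was removed from the core profile, so the context records GL_INVALID_ENUM. A minimal sketch of the usual workaround (assuming GLEW and a 3.2 core context):

```c
/* Sketch, assuming GLEW on a 3.2 core-profile context.  glewInit() calls
 * glGetString(GL_EXTENSIONS), which is invalid in core profile, so the
 * context ends up with a stale GL_INVALID_ENUM (1280). */
glewExperimental = GL_TRUE;     /* let GLEW resolve core-profile entry points */
if (glewInit() != GLEW_OK) {
    /* handle initialization failure */
}
while (glGetError() != GL_NO_ERROR)
    ;                           /* discard the stale 1280 before real checks */
```

After this loop, any error reported by glGetError() comes from your own code, not from GLEW.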

Answer1:

The update of your GLSL code to the newer standard looks fine, except for the problem with texture2D(). As was already pointed out, the texture sampling functions are now overloaded by sampler type, and texture() must be used instead of texture2D().

The remaining problems are mostly with updating the code to use the Core Profile, which deprecates many legacy features. Looking at the posted code, this includes:

  • Using VAOs (Vertex Array Objects) is mandatory for setting up vertex state. Use glGenVertexArrays() and glBindVertexArray() to create and bind a VAO, and make sure it is bound while calling vertex state setup functions like glVertexAttribPointer() and glEnableVertexAttribArray().

  • The GL_ALPHA texture format is no longer supported. For a texture format with a single 8-bit component, use GL_R8 for the internal format and GL_RED for the format:

        glTexImage2D(GL_TEXTURE_2D, 0, GL_R8, g->bitmap.width, g->bitmap.rows, 0,
                     GL_RED, GL_UNSIGNED_BYTE, g->bitmap.buffer);

    This also requires a minor change in the shader code, since the sampled value of the one-component texture is now in the red component:

        outColor = vec4(1, 1, 1, texture(tex, texcoord).r) * color;
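A minimal sketch of the VAO setup described in the first point, reusing the question's own names (vboText, ogl.attribute_coord), which are assumed to exist in your program:

```c
/* One-time setup, run once before render_text() is ever called.
 * vboText and ogl.attribute_coord are the question's names (assumptions). */
GLuint vaoText;
glGenVertexArrays(1, &vaoText);
glBindVertexArray(vaoText);          /* vertex state below is recorded in the VAO */
glBindBuffer(GL_ARRAY_BUFFER, vboText);
glEnableVertexAttribArray(ogl.attribute_coord);
glVertexAttribPointer(ogl.attribute_coord, 4, GL_FLOAT, GL_FALSE, 0, 0);
glBindVertexArray(0);                /* unbind; rebind before drawing */
```

At draw time, a single glBindVertexArray(vaoText) before glDrawArrays() then replaces the per-call glEnableVertexAttribArray()/glVertexAttribPointer() setup inside render_text().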

Answer2:

Instead of using texture2D in your fragment shader, you should be using texture:

    void main(void) {
        outColor = vec4(1, 1, 1, texture(tex, texcoord).a) * color;
    }
