Name
gl.TexImage2D -- specify a two-dimensional texture image
Synopsis
gl.TexImage2D(level, internalformat, w, h, border, format, type, pixels)
Function
Texturing maps a portion of a specified texture image onto each graphical primitive for which texturing is enabled. To enable and disable two-dimensional texturing, call gl.Enable() and gl.Disable() with argument #GL_TEXTURE_2D.

To define texture images, call gl.TexImage2D(). The arguments describe the parameters of the texture image, such as the height, width, width of the border, level-of-detail number (See gl.TexParameter for details.), and number of color components provided. The last three arguments describe how the image is represented in memory; they are identical to the pixel formats used for gl.DrawPixels().

Data is read from pixels as a sequence of signed or unsigned bytes, shorts, or longs, or single-precision floating-point values, depending on type, which can be #GL_UNSIGNED_BYTE, #GL_BYTE, #GL_BITMAP, #GL_UNSIGNED_SHORT, #GL_SHORT, #GL_UNSIGNED_INT, #GL_INT, or #GL_FLOAT. These values are grouped into sets of one, two, three, or four values, depending on format, to form elements. If type is #GL_BITMAP, the data is considered as a string of unsigned bytes (and format must be #GL_COLOR_INDEX). Each data byte is treated as eight 1-bit elements, with bit ordering determined by #GL_UNPACK_LSB_FIRST (See gl.PixelStore for details.).
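
The #GL_BITMAP unpacking rule can be sketched in Python (purely illustrative; `unpack_bitmap_byte` is a hypothetical helper, not part of Hollywood or OpenGL):

```python
# Illustrative sketch: unpack one #GL_BITMAP data byte into eight 1-bit
# color-index elements. When #GL_UNPACK_LSB_FIRST is enabled, bit 0 is
# the first element; otherwise bit 7 comes first.
def unpack_bitmap_byte(byte, lsb_first):
    bits = [(byte >> i) & 1 for i in range(8)]
    return bits if lsb_first else bits[::-1]

assert unpack_bitmap_byte(0b10000001, lsb_first=True)  == [1, 0, 0, 0, 0, 0, 0, 1]
assert unpack_bitmap_byte(0b10000000, lsb_first=False) == [1, 0, 0, 0, 0, 0, 0, 0]
```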

The first element corresponds to the lower left corner of the texture image. Subsequent elements progress left-to-right through the remaining texels in the lowest row of the texture image, and then in successively higher rows of the texture image. The final element corresponds to the upper right corner of the texture image.
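
This ordering is plain row-major order starting at the bottom row. As an illustrative Python sketch (the helper is hypothetical, not part of the Hollywood API), the element index of the texel in column x and row y is:

```python
# Illustrative sketch: element index of texel (x, y) in the pixel data
# passed to gl.TexImage2D(). Row 0 is the *bottom* row of the image, so
# element 0 is the lower left corner and the last element is the upper
# right corner.
def texel_index(x, y, width):
    return y * width + x

w, h = 4, 3
assert texel_index(0, 0, w) == 0                  # lower left: first element
assert texel_index(w - 1, h - 1, w) == w * h - 1  # upper right: last element
```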

format determines the composition of each element in pixels. It can assume one of ten symbolic values:

#GL_COLOR_INDEX
Each element is a single value, a color index. The GL converts it to fixed point (with an unspecified number of zero bits to the right of the binary point), shifted left or right depending on the value and sign of #GL_INDEX_SHIFT, and added to #GL_INDEX_OFFSET (See gl.PixelTransfer for details.). The resulting index is converted to a set of color components using the #GL_PIXEL_MAP_I_TO_R, #GL_PIXEL_MAP_I_TO_G, #GL_PIXEL_MAP_I_TO_B, and #GL_PIXEL_MAP_I_TO_A tables, and clamped to the range [0,1].

#GL_RED
Each element is a single red component. The GL converts it to floating point and assembles it into an RGBA element by attaching 0 for green and blue, and 1 for alpha. Each component is then multiplied by the signed scale factor #GL_c_SCALE, added to the signed bias #GL_c_BIAS, and clamped to the range [0, 1] (See gl.PixelTransfer for details.).

#GL_GREEN
Each element is a single green component. The GL converts it to floating point and assembles it into an RGBA element by attaching 0 for red and blue, and 1 for alpha. Each component is then multiplied by the signed scale factor #GL_c_SCALE, added to the signed bias #GL_c_BIAS, and clamped to the range [0, 1] (See gl.PixelTransfer for details.).

#GL_BLUE
Each element is a single blue component. The GL converts it to floating point and assembles it into an RGBA element by attaching 0 for red and green, and 1 for alpha. Each component is then multiplied by the signed scale factor #GL_c_SCALE, added to the signed bias #GL_c_BIAS, and clamped to the range [0, 1] (See gl.PixelTransfer for details.).

#GL_ALPHA
Each element is a single alpha component. The GL converts it to floating point and assembles it into an RGBA element by attaching 0 for red, green, and blue. Each component is then multiplied by the signed scale factor #GL_c_SCALE, added to the signed bias #GL_c_BIAS, and clamped to the range [0, 1] (See gl.PixelTransfer for details.).

#GL_RGB
Each element is an RGB triple. The GL converts it to floating point and assembles it into an RGBA element by attaching 1 for alpha. Each component is then multiplied by the signed scale factor #GL_c_SCALE, added to the signed bias #GL_c_BIAS, and clamped to the range [0, 1] (See gl.PixelTransfer for details.).

#GL_RGBA
Each element contains all four components. Each component is multiplied by the signed scale factor #GL_c_SCALE, added to the signed bias #GL_c_BIAS, and clamped to the range [0, 1] (See gl.PixelTransfer for details.).

#GL_LUMINANCE
Each element is a single luminance value. The GL converts it to floating point, then assembles it into an RGBA element by replicating the luminance value three times for red, green, and blue and attaching 1 for alpha. Each component is then multiplied by the signed scale factor #GL_c_SCALE, added to the signed bias #GL_c_BIAS, and clamped to the range [0, 1] (See gl.PixelTransfer for details.).

#GL_LUMINANCE_ALPHA
Each element is a luminance/alpha pair. The GL converts it to floating point, then assembles it into an RGBA element by replicating the luminance value three times for red, green, and blue. Each component is then multiplied by the signed scale factor #GL_c_SCALE, added to the signed bias #GL_c_BIAS, and clamped to the range [0, 1] (See gl.PixelTransfer for details.).

#GL_DEPTH_COMPONENT
Each element is a single depth component. It is converted to floating-point, then multiplied by the signed scale factor #GL_DEPTH_SCALE, added to the signed bias #GL_DEPTH_BIAS, and clamped to the range [0, 1] (See gl.PixelTransfer for details.).
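
The table above can be condensed into a short sketch (Python for illustration only; the pixel-transfer scale, bias, and clamping steps are omitted, and these helpers are hypothetical, not part of Hollywood or OpenGL):

```python
# Illustrative sketch: values per element for each format, and the RGBA
# element the GL assembles from them.
COMPONENTS = {
    "GL_COLOR_INDEX": 1, "GL_RED": 1, "GL_GREEN": 1, "GL_BLUE": 1,
    "GL_ALPHA": 1, "GL_RGB": 3, "GL_RGBA": 4,
    "GL_LUMINANCE": 1, "GL_LUMINANCE_ALPHA": 2, "GL_DEPTH_COMPONENT": 1,
}

def assemble_rgba(fmt, values):
    assert len(values) == COMPONENTS[fmt]
    if fmt == "GL_RED":
        return (values[0], 0.0, 0.0, 1.0)
    if fmt == "GL_GREEN":
        return (0.0, values[0], 0.0, 1.0)
    if fmt == "GL_BLUE":
        return (0.0, 0.0, values[0], 1.0)
    if fmt == "GL_ALPHA":
        return (0.0, 0.0, 0.0, values[0])
    if fmt == "GL_RGB":
        return (values[0], values[1], values[2], 1.0)
    if fmt == "GL_RGBA":
        return tuple(values)
    if fmt == "GL_LUMINANCE":                       # luminance replicated to R, G, B
        return (values[0], values[0], values[0], 1.0)
    if fmt == "GL_LUMINANCE_ALPHA":
        return (values[0], values[0], values[0], values[1])
    raise ValueError("no RGBA assembly for " + fmt)

assert assemble_rgba("GL_LUMINANCE", [0.5]) == (0.5, 0.5, 0.5, 1.0)
assert assemble_rgba("GL_RED", [1.0]) == (1.0, 0.0, 0.0, 1.0)
```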

If an application wants to store the texture at a certain resolution or in a certain format, it can request them with internalformat, which specifies the internal format of the texture array. See Internal pixel formats for details. The GL will choose an internal representation that closely approximates the one requested by internalformat, but it may not match exactly. (The representations specified by #GL_LUMINANCE, #GL_LUMINANCE_ALPHA, #GL_RGB, and #GL_RGBA must match exactly. The numeric values 1, 2, 3, and 4 may also be used to specify the above representations.)

A one-component texture image uses only the red component of the RGBA color extracted from pixels. A two-component image uses the R and A values. A three-component image uses the R, G, and B values. A four-component image uses all of the RGBA components.
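
This component selection can be sketched as follows (Python, illustrative only; `stored_components` is a hypothetical helper):

```python
# Illustrative sketch: which components of the RGBA color extracted from
# pixels a 1-, 2-, 3-, or 4-component texture image actually keeps.
def stored_components(rgba, n):
    r, g, b, a = rgba
    return {1: (r,), 2: (r, a), 3: (r, g, b), 4: (r, g, b, a)}[n]

assert stored_components((0.1, 0.2, 0.3, 0.4), 1) == (0.1,)
assert stored_components((0.1, 0.2, 0.3, 0.4), 2) == (0.1, 0.4)
```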

Texturing has no effect in color index mode.

The texture image can be represented by the same data formats as the pixels in a gl.DrawPixels() command, except that #GL_STENCIL_INDEX cannot be used. gl.PixelStore() and gl.PixelTransfer() modes affect texture images in exactly the way they affect gl.DrawPixels().

Please note that this command operates directly with memory pointers. There is also a version which works with tables instead of memory pointers, but it is of course slower. See gl.TexImage for details. See Working with pointers for details on how to use memory pointers with Hollywood.

Please consult an OpenGL reference manual for more information.

Inputs
level
specifies the level-of-detail number; level 0 is the base image level, level n is the nth mipmap reduction image
internalformat
specifies the number of color components in the texture; must be 1, 2, 3, or 4, or a symbolic constant (see above)
w
specifies the width of the texture image; must be 2^n + 2*border for some integer n; all implementations support texture images that are at least 64 texels wide
h
specifies the height of the texture image; must be 2^m + 2*border for some integer m; all implementations support texture images that are at least 64 texels high
border
specifies the width of the border; must be either 0 or 1
format
specifies the format of the pixel data (see above)
type
specifies the data type of the pixel data (see above)
pixels
specifies a pointer to the image data in memory
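
The 2^n + 2*border constraint on w and h can be checked with a short sketch (Python, illustrative; `valid_dimension` is a hypothetical helper, not part of the Hollywood API):

```python
# Illustrative sketch: check that a texture dimension satisfies the
# 2^n + 2*border rule required by gl.TexImage2D().
def valid_dimension(size, border):
    n = size - 2 * border
    return n > 0 and (n & (n - 1)) == 0  # n must be a power of two

assert valid_dimension(64, 0)        # 2^6
assert valid_dimension(66, 1)        # 2^6 + 2*1
assert not valid_dimension(100, 0)   # 100 is not a power of two
```
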
Errors
#GL_INVALID_ENUM is generated if format is not an accepted format constant. Format constants other than #GL_STENCIL_INDEX are accepted.

#GL_INVALID_ENUM is generated if type is not a type constant.

#GL_INVALID_ENUM is generated if type is #GL_BITMAP and format is not #GL_COLOR_INDEX.

#GL_INVALID_VALUE is generated if level is less than 0.

#GL_INVALID_VALUE may be generated if level is greater than log2(max), where max is the returned value of #GL_MAX_TEXTURE_SIZE.
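
In other words, the largest legal mipmap level is the base-2 logarithm of the maximum texture size; as a sketch (Python, hypothetical helper):

```python
import math

# Illustrative sketch: the largest legal mipmap level for a given
# #GL_MAX_TEXTURE_SIZE value; levels above log2(max) may generate
# #GL_INVALID_VALUE.
def max_mipmap_level(max_texture_size):
    return int(math.log2(max_texture_size))

assert max_mipmap_level(1024) == 10   # levels 0..10 are legal
assert max_mipmap_level(4096) == 12
```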

#GL_INVALID_VALUE is generated if internalformat is not 1, 2, 3, 4, or one of the accepted resolution and format symbolic constants.

#GL_INVALID_VALUE is generated if w or h is less than 0 or greater than 2 + #GL_MAX_TEXTURE_SIZE, or if either cannot be represented as 2^k + 2*border for some integer value of k.

#GL_INVALID_VALUE is generated if border is not 0 or 1.

#GL_INVALID_OPERATION is generated if gl.TexImage2D() is executed between the execution of gl.Begin() and the corresponding execution of gl.End().

Associated gets
gl.GetTexImage()

gl.IsEnabled() with argument #GL_TEXTURE_2D

