I want to call CGBitmapContextCreate with texture->data to create a CGContextRef,
and then create a CGImageRef from it with CGBitmapContextCreateImage(context).
However, the image created is not as expected :(
The one created from CGBitmapContextCreateImage:

The actual one (slightly different since I took it with another camera):

Code (texture.bytesPerPixel = 2):
CGContextRef context = CGBitmapContextCreate(texture.data,
                                             512,                          // width
                                             512,                          // height
                                             5,                            // bits per component
                                             512 * texture.bytesPerPixel,  // bytes per row
                                             CGColorSpaceCreateDeviceRGB(),
                                             kCGImageAlphaNoneSkipFirst);
CGImageRef cg_img = CGBitmapContextCreateImage(context);
UIImage* ui_img = [UIImage imageWithCGImage: cg_img];
UIImageWriteToSavedPhotosAlbum(ui_img, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
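For reference, one workaround I am considering is expanding the 16-bit data to 32-bit RGBA before creating the context, since CGBitmapContextCreate has no 565 pixel format. This is only a sketch and assumes the texture is GL_RGB / GL_UNSIGNED_SHORT_5_6_5; if it is actually 5551 the bit shifts would differ:

// Sketch: expand (assumed) RGB565 texture data to RGBA8888, then wrap it.
const size_t w = 512, h = 512;
const uint16_t* src = (const uint16_t*)texture.data;
uint8_t* rgba = malloc(w * h * 4);
for (size_t i = 0; i < w * h; i++) {
    uint16_t p = src[i];
    uint8_t r = (p >> 11) & 0x1F;           // 5 bits red
    uint8_t g = (p >> 5)  & 0x3F;           // 6 bits green
    uint8_t b =  p        & 0x1F;           // 5 bits blue
    rgba[4 * i + 0] = (r << 3) | (r >> 2);  // expand to 8 bits
    rgba[4 * i + 1] = (g << 2) | (g >> 4);
    rgba[4 * i + 2] = (b << 3) | (b >> 2);
    rgba[4 * i + 3] = 0xFF;                 // opaque, ignored by SkipLast
}
CGColorSpaceRef cs = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx32 = CGBitmapContextCreate(rgba, w, h,
                                           8,      // bits per component
                                           w * 4,  // bytes per row
                                           cs,
                                           kCGImageAlphaNoneSkipLast);
CGImageRef img32 = CGBitmapContextCreateImage(ctx32);
CGContextRelease(ctx32);
CGColorSpaceRelease(cs);
free(rgba);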
Complete Code:
http://ihome.ust.hk/~tm_lksac/OpenGLSprite.m
The application normally calls - (void)drawSelfIfNeeded:(BOOL)needed
to update the texture. But I want to take a "screenshot" of the texture and save it as a UIImage for further image processing.
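For completeness, the save callback passed to UIImageWriteToSavedPhotosAlbum is roughly this (the logging is just a placeholder):

- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo
{
    if (error != nil) {
        NSLog(@"Saving texture snapshot failed: %@", error);
    }
}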