I have the following code:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;
public class TextureScript : MonoBehaviour
{
    Texture2D texture;

    // Start is called before the first frame update
    void Start()
    {
        texture = new Texture2D(256, 256, TextureFormat.RGB24, true);
        var rectTransform = transform.GetComponent<RectTransform>();
        rectTransform.sizeDelta = new Vector2(texture.width, texture.height);

        int pixelCount = texture.width * texture.height;
        Queue<Color> queue = new Queue<Color>();
        for (int i = 0; i < pixelCount; i++)
        {
            queue.Enqueue(new Color(255, 0, 0));
        }

        Color[] colorArray = queue.ToArray();
        texture.SetPixelData(colorArray, 0, 0);
        texture.filterMode = FilterMode.Point;
        texture.Apply(updateMipmaps: false);
        GetComponent<RawImage>().material.mainTexture = texture;
    }

    // Update is called once per frame
    void Update()
    {
    }
}
Which produces this:

I am unsure what could be producing it. Previously I was using a byte array for my tests, which worked, but my work now requires a queue, which I then convert into an array (which I have done using `.ToArray()`).
I also tried `RGBA32` rather than `RGB24` with alpha values of 255, 128, 1, and 0; however, this produced a nearly see-through image each time.
Thank you for your help.
Why are you using a `Queue` instead of simply an array? And why `SetPixelData` instead of simply `SetPixels`? I would simply do e.g.
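The code sample that followed here didn't survive; a minimal sketch of what a direct `SetPixels` call could look like (the class name and fill loop are illustrative, not from the original answer):

```csharp
using UnityEngine;

public class SetPixelsExample : MonoBehaviour
{
    void Start()
    {
        var texture = new Texture2D(256, 256, TextureFormat.RGB24, true);

        // Fill a Color[] directly; Color channels are floats in 0..1.
        var colors = new Color[texture.width * texture.height];
        for (int i = 0; i < colors.Length; i++)
        {
            colors[i] = Color.red; // same as new Color(1f, 0f, 0f)
        }

        // SetPixels handles the conversion to the texture's format for you,
        // so a Color[] works even though the texture is RGB24.
        texture.SetPixels(colors);
        texture.Apply(updateMipmaps: false);
    }
}
```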
`SetPixelData` is rather meant for raw byte data in the texture's exact format. What happens in your case is: `Color` has a byte size of 4 floats (= 16 bytes), since it also has an alpha value! Your texture is using `RGB24` and is therefore only expecting 3 bytes (= 24 bits) per pixel. So in case you really want to stick to that (might be faster - or not ^^), you would rather have to do e.g.
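The answer's byte-array example was lost in extraction; a sketch of feeding `SetPixelData` raw bytes that actually match the `RGB24` layout (three bytes per pixel, the fill values are illustrative):

```csharp
using UnityEngine;

public class SetPixelDataExample : MonoBehaviour
{
    void Start()
    {
        var texture = new Texture2D(256, 256, TextureFormat.RGB24, false);

        // RGB24 = exactly 3 bytes per pixel, no alpha channel.
        var data = new byte[texture.width * texture.height * 3];
        for (int i = 0; i < data.Length; i += 3)
        {
            data[i]     = 255; // R
            data[i + 1] = 0;   // G
            data[i + 2] = 0;   // B
        }

        // mipLevel 0 is the full-resolution image.
        texture.SetPixelData(data, 0);
        texture.Apply(updateMipmaps: false);
    }
}
```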
Also regarding `new Color(255, 0, 0)`: note that `Color` takes float arguments from 0 to 1. If you are looking for a byte-based input, rather go for `Color32` and `SetPixels32`.
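A sketch of that byte-based alternative, assuming the same 256×256 setup as the question (the fill loop is illustrative):

```csharp
using UnityEngine;

public class SetPixels32Example : MonoBehaviour
{
    void Start()
    {
        var texture = new Texture2D(256, 256, TextureFormat.RGBA32, false);

        // Color32 takes byte channels 0..255, matching the intent
        // behind the question's new Color(255, 0, 0).
        var colors = new Color32[texture.width * texture.height];
        for (int i = 0; i < colors.Length; i++)
        {
            colors[i] = new Color32(255, 0, 0, 255); // fully opaque red
        }

        texture.SetPixels32(colors);
        texture.Apply(updateMipmaps: false);
    }
}
```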
Further note that a `RawImage` is actually quite expensive - you could stick to an `Image` component and simply create a `Sprite` from your `Texture2D`.
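A sketch of that `Sprite` approach; it assumes the `GameObject` carries an `Image` component instead of the question's `RawImage` (texture fill omitted):

```csharp
using UnityEngine;
using UnityEngine.UI;

public class SpriteFromTexture : MonoBehaviour
{
    void Start()
    {
        var texture = new Texture2D(256, 256, TextureFormat.RGB24, false);
        // ... fill the texture as above, then upload it ...
        texture.Apply();

        // Wrap the whole texture in a Sprite, pivot at the center.
        var sprite = Sprite.Create(
            texture,
            new Rect(0, 0, texture.width, texture.height),
            new Vector2(0.5f, 0.5f));

        GetComponent<Image>().sprite = sprite;
    }
}
```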