@aras-p
Created September 9, 2022 09:59
Unity integer texture formats test
Shader "Unlit/IntegerShader"
{
Properties
{
_MainTex ("Texture", 2D) = "white" {}
}
SubShader
{
Pass
{
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
#pragma require compute
#include "UnityCG.cginc"
struct appdata
{
float4 vertex : POSITION;
float2 uv : TEXCOORD0;
};
struct v2f
{
float2 uv : TEXCOORD0;
float4 vertex : SV_POSITION;
};
v2f vert (appdata v)
{
v2f o;
o.vertex = UnityObjectToClipPos(v.vertex);
o.uv = v.uv;
return o;
}
Texture2D<uint4> _MainTex;
StructuredBuffer<uint2> _Buffer;
half4 frag (v2f i) : SV_Target
{
int2 iuv = i.uv * 2;
#if 1
// read from the texture
uint4 val = _MainTex.Load(int3(iuv.x, iuv.y, 0));
#else
// read from the buffer
uint2 rawVal = _Buffer[iuv.x + iuv.y * 2];
uint4 val;
val.x = rawVal.x & 0xFFFF;
val.y = rawVal.x >> 16;
val.z = rawVal.y & 0xFFFF;
val.w = rawVal.y >> 16;
#endif
half4 col = 0;
col.r = (val.x & 7) / 7.0;
col.g = val.y / 65535.0;
col.b = val.z;
return col;
}
ENDCG
}
}
}
using UnityEngine;
using UnityEngine.Experimental.Rendering;

public class IntegerTexture : MonoBehaviour
{
    ComputeBuffer m_Buffer;

    void Start()
    {
        // What we'd want to do is create a texture in an "integer" format (RGBA, 16-bit integer per channel),
        // and then manually do texture.Load in the shader to get the raw integer values.
        //
        // Similar to e.g. https://forum.unity.com/threads/render-10bit-video-to-r16g16b16a16_unorm-texture.817584/
        //
        // However, in all versions I tried (2021.3.9, 2022.1.15, 2022.2b7, 2023.1a9), using
        // GraphicsFormat.R16G16B16A16_UInt for the texture creation fails with this error:
        //   Texture creation failed. 'R16G16B16A16_UInt' is not supported for Sample usage on this platform
        // It seems to be because of this code:
        //   https://github.com/Unity-Technologies/UnityCsReference/blob/master/Runtime/Export/Graphics/Texture.cs#L644
        // which checks whether the texture can be used to "Sample". But I don't want to "Sample", I merely want
        // to "Load"!
        //
        // The code below also tries to work around this issue by using a compute buffer instead of a
        // texture. That kinda works, but is not efficient on the GPU, and is cumbersome too.

        // Four pixels, four 16-bit channels each.
        var data = new ushort[]
        {
            0,0,0,0,     3,16000,1,1,
            5,30000,0,2, 7,65000,2,3,
        };
        var mat = GetComponent<Renderer>().material;

        // Texture path: fails with the error above in the Unity versions tested.
        var tex = new Texture2D(2, 2, GraphicsFormat.R16G16B16A16_UInt, TextureCreationFlags.None);
        if (tex)
        {
            tex.SetPixelData(data, 0);
            tex.Apply();
            mat.mainTexture = tex;
        }

        // Compute buffer path: 4 elements of 8 bytes (four 16-bit values per pixel).
        m_Buffer = new ComputeBuffer(4, 8);
        m_Buffer.SetData(data);
        mat.SetBuffer("_Buffer", m_Buffer);
    }

    void OnDestroy()
    {
        // Release the buffer explicitly so it does not leak a GPU resource.
        m_Buffer?.Release();
    }
}
@aras-p (Author) commented Feb 17, 2023

@weichx yes, and after a month or two of investigation the bug was closed as "this is not a bug, but a feature request" :( An opinion I don't agree with, but...

@weichx commented Feb 17, 2023

Ouch, well, thanks for the update. Have you explored getting around this with the native plugin approach? That's the next thing I'm going to try, since the platform obviously supports this.

@aras-p (Author) commented Feb 17, 2023

No, I haven't tried that -- this isn't functionality I needed for anything; I was just filing a bug for someone else. Another workaround that comes to mind: create the texture in e.g. R32G32_SFloat format and put the data in there (it's the same 64 bits per pixel as the format you want), make sure to use nearest/point filtering, and in the shader do something like uint2 rg_uint = asuint(tex2D(...).rg), then decode the 16-bit integers out of that with some shifting and masking.
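
A minimal shader-side sketch of that idea, untested, assuming the C# side creates the texture with GraphicsFormat.R32G32_SFloat and sets tex.filterMode = FilterMode.Point; _PackedTex and fragPacked are hypothetical names, and v2f is the struct from the shader above:

// Untested sketch: the two float channels carry raw packed bits, so
// reinterpret them with asuint and split each 32-bit word into two
// 16-bit integers.
sampler2D _PackedTex;

half4 fragPacked (v2f i) : SV_Target
{
    uint2 rg = asuint(tex2D(_PackedTex, i.uv).rg);
    uint4 val = uint4(rg.x & 0xFFFF, rg.x >> 16,
                      rg.y & 0xFFFF, rg.y >> 16);
    // visualize the decoded integers, scaled to the 16-bit range
    return float4(val) / 65535.0;
}

One caveat: some GPUs do not guarantee that NaN or denormal float bit patterns survive the sampler untouched, so binding the texture as Texture2D<float2> and reading it with Load before the asuint may be the more robust variant.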

@weichx commented Feb 17, 2023

That's a good thought as well, I'll give that a try, thanks!
