I think it is just "PushVideoFrame" now not sure though
Yes, just PushVideoFrame
Hey, the code doesn't work in build. Also, because in your code you are not initializing any VideoSurface, I tried initializing it inside JoinSuccess callback. I am able to see local stream in the editor, nothing happens in build. Also, no errors popup. I have enabled all kind of logs.
This code does not use a video surface; it shares what the user sees in the game view over a stream. Have you read the corresponding tutorial? https://medium.com/@jake_agora.io/how-to-broadcast-your-screen-with-unity3d-and-agora-io-10006b8a4aa7
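For anyone comparing against the tutorial: the core loop there (condensed below) reads the game view back into a Texture2D at the end of each frame and pushes the raw bytes through the external video source. This is a sketch, assuming the older agora_gaming_rtc SDK, an engine instance named mRtcEngine, and that SetExternalVideoSource(true, false) was called before joining the channel; exact enum names vary between SDK versions:

```csharp
// Coroutine body: capture the rendered game view and push it as an external frame.
yield return new WaitForEndOfFrame();

var tex = new Texture2D(Screen.width, Screen.height, TextureFormat.RGBA32, false);
tex.ReadPixels(new Rect(0, 0, Screen.width, Screen.height), 0, 0);
tex.Apply();

var frame = new ExternalVideoFrame
{
    type = ExternalVideoFrame.VIDEO_BUFFER_TYPE.VIDEO_BUFFER_RAW_DATA,
    format = ExternalVideoFrame.VIDEO_PIXEL_FORMAT.VIDEO_PIXEL_BGRA, // per the tutorial; RGBA in newer SDKs
    buffer = tex.GetRawTextureData(),
    stride = Screen.width,
    height = Screen.height,
    rotation = 180,
    timestamp = System.DateTime.Now.Ticks / 10000
};
int ret = mRtcEngine.PushVideoFrame(frame);
Object.Destroy(tex);
```

No VideoSurface is needed on the sender side; a VideoSurface only renders an incoming stream on the receiver.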
Hi. I share the screen from a Windows Unity build and see a rotated image on the web demo.
Why is the image rotated? I set the rotation to 0, but the resulting image is still rotated.
externalVideoFrame.rotation = 0;
https://webdemo.agora.io/agora-web-showcase/examples/Agora-Screen-Sharing-Web/?_ga=2.201467473.1125400638.1587082579-1090229122.1587082579
I have the same rotation issue. Did you come up with a solution for it?
same rotation issue!
I'm using this script and the receiver gets the video flipped, plus a color issue. Any solution?
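A couple of knobs to experiment with for the rotation/flip and color problems; these are assumptions to try, not a confirmed fix. Unity reads texture data bottom-up, so the pushed frame often needs rotation = 180 rather than 0, and a red/blue swap on the receiver usually means the declared pixel format (RGBA vs. BGRA) does not match the bytes actually sent:

```csharp
// Hypothetical tweaks on the frame that gets passed to PushVideoFrame.
externalVideoFrame.rotation = 180; // compensate for Unity's bottom-up texture data
// If colors look wrong on the receiver, try swapping the declared format:
externalVideoFrame.format = ExternalVideoFrame.VIDEO_PIXEL_FORMAT.VIDEO_PIXEL_RGBA;
```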
I have followed this tutorial, which led me here. It seems that the tutorial (and the above snippet) was built on an older version of the Agora Unity SDK. Even though things look largely similar, I cannot get this to work after some thorough debugging.
The captured screen frame appears to be processed and pushed as expected:
- the return code of PushVideoFrame (see line 85 of the gist snippet above) is 0 (OK)
- byte size of my sent frame corresponds to the screen I am sharing
On the receiving end, I am not getting any meaningful output:
- user join recognised & VideoSurface is created
- the TextureManager of the VideoSurface (at the given remote user id) ends up in an error state on every frame (IRIS_VIDEO_PROCESS_ERR.ERR_NULL_POINTER) -> see the snippet below, taken from TextureManager.cs
internal void ReFreshTexture()
{
    var ret = _videoStreamManager.GetVideoFrame(ref _cachedVideoFrame, ref isFresh, _sourceType, _uid, _channelId);
    this.Width = _cachedVideoFrame.width;
    this.Height = _cachedVideoFrame.height;
    Debug.Log("width " + this.Width + " height " + this.Height); // outputs [0, 0] for the faulty stream

    if (ret == IRIS_VIDEO_PROCESS_ERR.ERR_BUFFER_EMPTY || ret == IRIS_VIDEO_PROCESS_ERR.ERR_NULL_POINTER)
    {
        _canAttach = false;
        Debug.Log(string.Format("no video frame for user channel: {0} uid: {1}", _channelId, _uid)); // outputs expected channel and uid
        Debug.Log("source type " + _sourceType); // outputs VIDEO_SOURCE_REMOTE
        Debug.Log("isFresh " + isFresh); // outputs false
        Debug.Log("Error " + ret); // outputs ERR_NULL_POINTER
        return;
    }
    else if (ret == IRIS_VIDEO_PROCESS_ERR.ERR_SIZE_NOT_MATCHING)
    {
        // prepare resize -> see original source
    }
    else
    {
        _canAttach = true;
    }

    if (isFresh)
    {
        // apply fresh texture -> see original source
    }
}
The most likely cause I see is the changed signature of SetExternalVideoSource(bool enabled, bool useTexture, EXTERNAL_VIDEO_SOURCE_TYPE sourceType, SenderOptions encodedVideoOption). For the latter two (new) parameters I passed EXTERNAL_VIDEO_SOURCE_TYPE.VIDEO_FRAME and new SenderOptions(). I can see how a default-constructed SenderOptions could create ill-defined frames:
public SenderOptions()
{
    ccMode = TCcMode.CC_ENABLED;
    codecType = VIDEO_CODEC_TYPE.VIDEO_CODEC_GENERIC_H264;
    targetBitrate = 6500;
}
Does anyone here have any pointers in the right direction to solve the issue I am describing? Is there an updated version of the screen-share tutorial?
Figured it out myself... If you look carefully (unlike me), you'll find a CustomCaptureVideo example shipped with the Agora SDK for Unity (under Agora-RTC-Plugin/API-Example/Examples/Advanced/CustomCaptureVideo). If you open CustomCaptureVideo.cs, you'll see an implementation very similar to the gist above. The major difference in my case turned out to be how raw bytes are copied on newer Unity versions:
#if UNITY_2018_1_OR_NEWER
    NativeArray<byte> nativeByteArray = _texture.GetRawTextureData<byte>();
    if (_shareData?.Length != nativeByteArray.Length)
    {
        _shareData = new byte[nativeByteArray.Length];
    }
    nativeByteArray.CopyTo(_shareData);
#else
    _shareData = _texture.GetRawTextureData();
#endif
The external video source is set up similarly to how I had tried:
private void SetExternalVideoSource()
{
    var ret = RtcEngine.SetExternalVideoSource(true, false, EXTERNAL_VIDEO_SOURCE_TYPE.VIDEO_FRAME, new SenderOptions());
    this.Log.UpdateLog("SetExternalVideoSource returns:" + ret);
}
I get this error:
"'IRtcEngine' does not contain a definition for 'PushExternVideoFrame' and no accessible extension method 'PushExternVideoFrame' accepting a first argument of type 'IRtcEngine' could be found"
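That error matches the rename discussed at the top of this thread: newer versions of the SDK expose PushVideoFrame instead of PushExternVideoFrame. A minimal sketch, assuming your engine instance is named mRtcEngine and frame is an already-populated ExternalVideoFrame:

```csharp
// Old SDK:
// int ret = mRtcEngine.PushExternVideoFrame(frame);
// Newer SDK (the method was renamed):
int ret = mRtcEngine.PushVideoFrame(frame);
if (ret != 0) Debug.LogWarning("PushVideoFrame failed: " + ret); // 0 means success
```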