@jakep84
Last active February 28, 2023 15:39
How to screen share with Agora.io and Unity3D
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using agora_gaming_rtc;
using UnityEngine.UI;
using System.Globalization;
using System.Runtime.InteropServices;
using System;

public class ShareScreen : MonoBehaviour
{
    Texture2D mTexture;
    Rect mRect;
    [SerializeField]
    private string appId = "Your_AppID";
    [SerializeField]
    private string channelName = "agora";
    public IRtcEngine mRtcEngine;
    int i = 100;

    void Start()
    {
        Debug.Log("ScreenShare Activated");
        mRtcEngine = IRtcEngine.getEngine(appId);
        // Enable logging
        mRtcEngine.SetLogFilter(LOG_FILTER.DEBUG | LOG_FILTER.INFO | LOG_FILTER.WARNING | LOG_FILTER.ERROR | LOG_FILTER.CRITICAL);
        mRtcEngine.SetParameters("{\"rtc.log_filter\": 65535}");
        // Configure the external video source
        mRtcEngine.SetExternalVideoSource(true, false);
        // Start video mode
        mRtcEngine.EnableVideo();
        // Allow the camera output callback
        mRtcEngine.EnableVideoObserver();
        // Join the channel
        mRtcEngine.JoinChannel(channelName, null, 0);
        // Create a rectangle the width and height of the screen
        mRect = new Rect(0, 0, Screen.width, Screen.height);
        // Create a texture the size of the rectangle you just created
        mTexture = new Texture2D((int)mRect.width, (int)mRect.height, TextureFormat.BGRA32, false);
    }

    void Update()
    {
        // Start the screen-share coroutine (note: this starts a new coroutine every frame)
        StartCoroutine(shareScreen());
    }

    // Screen share
    IEnumerator shareScreen()
    {
        yield return new WaitForEndOfFrame();
        // Read the pixels inside the rectangle
        mTexture.ReadPixels(mRect, 0, 0);
        // Apply the pixels read from the rectangle to the texture
        mTexture.Apply();
        // Get the raw texture data as an array of bytes
        byte[] bytes = mTexture.GetRawTextureData();
        // Compute the byte size of the frame (unused)
        int size = Marshal.SizeOf(bytes[0]) * bytes.Length;
        // Check whether an engine instance has already been created
        IRtcEngine rtc = IRtcEngine.QueryEngine();
        // If the engine is present
        if (rtc != null)
        {
            // Create a new external video frame
            ExternalVideoFrame externalVideoFrame = new ExternalVideoFrame();
            // Set the buffer type of the video frame
            externalVideoFrame.type = ExternalVideoFrame.VIDEO_BUFFER_TYPE.VIDEO_BUFFER_RAW_DATA;
            // Set the video pixel format
            externalVideoFrame.format = ExternalVideoFrame.VIDEO_PIXEL_FORMAT.VIDEO_PIXEL_BGRA;
            // Apply the raw data pulled from the rectangle to the video frame
            externalVideoFrame.buffer = bytes;
            // Set the width of the video frame (in pixels)
            externalVideoFrame.stride = (int)mRect.width;
            // Set the height of the video frame (in pixels)
            externalVideoFrame.height = (int)mRect.height;
            // Crop pixels from the sides of the frame
            externalVideoFrame.cropLeft = 10;
            externalVideoFrame.cropTop = 10;
            externalVideoFrame.cropRight = 10;
            externalVideoFrame.cropBottom = 10;
            // Rotate the video frame (0, 90, 180, or 270)
            externalVideoFrame.rotation = 180;
            // Use the incrementing counter as the frame timestamp
            externalVideoFrame.timestamp = i++;
            // Push the external video frame we just created
            int a = rtc.PushVideoFrame(externalVideoFrame);
            Debug.Log("PushVideoFrame = " + a);
        }
    }
}
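
To try the script, attach it to any active GameObject in the scene. A minimal bootstrap sketch (the GameObject name and the ShareScreenBootstrap class are illustrative, not part of the original gist):

using UnityEngine;

// Hypothetical bootstrap: creates a GameObject and attaches the ShareScreen
// component so its Start()/Update() begin capturing and pushing frames.
public class ShareScreenBootstrap : MonoBehaviour
{
    void Awake()
    {
        var go = new GameObject("ScreenShare");
        go.AddComponent<ShareScreen>();
    }
}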
@gtk2k

gtk2k commented Jan 21, 2020

I get this error:
“‘IRtcEngine’ does not contain a definition for ‘PushExternVideoFrame’ and no accessible extension method ‘PushExternVideoFrame’ accepting a first argument of type ‘IRtcEngine’ could be found”

@tywiggs

tywiggs commented Feb 4, 2020

I think it is just "PushVideoFrame" now, not sure though.

@jakep84
Author

jakep84 commented Mar 27, 2020

Yes, just PushVideoFrame

@harsh-priyadarshi

Hey, the code doesn't work in a build. Also, because your code doesn't initialize any VideoSurface, I tried initializing one inside the JoinSuccess callback. I can see the local stream in the editor, but nothing happens in a build, and no errors pop up. I have enabled all kinds of logs.

@jakep84
Author

jakep84 commented Apr 16, 2020

> Hey, the code doesn't work in build. Also, because in your code you are not initializing any VideoSurface, I tried initializing it inside JoinSuccess callback. I am able to see local stream in the editor, nothing happens in build. Also, no errors popup. I have enabled all kind of logs.

This code does not create a video surface; it shares what the user sees in the game view over a stream. Have you read the corresponding tutorial? https://medium.com/@jake_agora.io/how-to-broadcast-your-screen-with-unity3d-and-agora-io-10006b8a4aa7

@hskim-inventis

[screenshot attachment]
Hi. I share the screen from a Windows Unity build and see a rotated image on the web demo.
Why is the image rotated? I changed the rotation to 0, but the resulting image is still rotated.
externalVideoFrame.rotation = 0;
https://webdemo.agora.io/agora-web-showcase/examples/Agora-Screen-Sharing-Web/?_ga=2.201467473.1125400638.1587082579-1090229122.1587082579

@lukasrandom

I have the same rotation issue. Did you come up with a solution for it?

@ooohlee

ooohlee commented May 22, 2020

same rotation issue!

@shinsim

shinsim commented Jun 9, 2020

I'm using this script, and on the receiver's side the video is flipped and the colors are wrong. Any solution?
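
A plausible cause (an assumption on my part, not confirmed by Agora): Unity's ReadPixels/GetRawTextureData returns pixel rows bottom-to-top (texture origin is bottom-left), while the video pipeline expects top-to-bottom rows, which would explain a vertically flipped stream; a pixel-format mismatch (RGBA vs. BGRA) would explain swapped colors. A minimal sketch that reorders the rows before pushing the frame — FlipVertically is a hypothetical helper, not part of the Agora SDK:

// Hypothetical helper: flip the raw BGRA byte rows vertically so the frame
// is ordered top-to-bottom before being pushed as an external video frame.
static byte[] FlipVertically(byte[] src, int width, int height)
{
    const int bpp = 4; // bytes per pixel for BGRA32
    byte[] dst = new byte[src.Length];
    int rowBytes = width * bpp;
    for (int y = 0; y < height; y++)
    {
        // Copy source row y into destination row (height - 1 - y)
        System.Array.Copy(src, y * rowBytes, dst, (height - 1 - y) * rowBytes, rowBytes);
    }
    return dst;
}

// Usage inside shareScreen(), before assigning the buffer:
// externalVideoFrame.buffer = FlipVertically(bytes, (int)mRect.width, (int)mRect.height);
// externalVideoFrame.rotation = 0; // rotation may no longer be needed once rows are reordered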

@b00dle

b00dle commented Feb 28, 2023

I have followed this tutorial, which led me here. It seems that the tutorial (and the above snippet) was built on an older version of the Agora Unity SDK. Even though things look largely similar, I cannot get this to function, even after some thorough debugging.

The captured screen frame appears to be processed and pushed as expected:

  • return code of PushVideoFrame (see line 85 of gist snippet above) is 0 (indicating -> OK)
  • byte size of my sent frame corresponds to the screen I am sharing

On the receiving end, I am not getting any meaningful output:

  • user join recognised & VideoSurface is created
  • TextureManager of the VideoSurface (at given remote user id) ends up in error state at each frame (IRIS_VIDEO_PROCESS_ERR.ERR_NULL_POINTER) -> see snippet below taken from TextureManager.cs
internal void ReFreshTexture()
{
    var ret = _videoStreamManager.GetVideoFrame(ref _cachedVideoFrame, ref isFresh, _sourceType, _uid, _channelId);
    this.Width = _cachedVideoFrame.width;
    this.Height = _cachedVideoFrame.height;

    Debug.Log("width " + this.Width + " height " + this.Height); // outputs [0, 0] for faulty stream

    if (ret == IRIS_VIDEO_PROCESS_ERR.ERR_BUFFER_EMPTY || ret == IRIS_VIDEO_PROCESS_ERR.ERR_NULL_POINTER)
    {
        _canAttach = false;
        Debug.Log(string.Format("no video frame for user channel: {0} uid: {1}", _channelId, _uid)); // outputs expected channel and uid
        Debug.Log("source type " + _sourceType); // outputs VIDEO_SOURCE_REMOTE
        Debug.Log("isFresh " + isFresh); // outputs false
        Debug.Log("Error " + ret); // outputs ERR_NULL_POINTER
        return;
    }
    else if (ret == IRIS_VIDEO_PROCESS_ERR.ERR_SIZE_NOT_MATCHING)
    {
        // prepare resize -> see original source
    }
    else
    {
        _canAttach = true;
    }

    if (isFresh)
    {
        // apply fresh texture -> see original source
    }
}

The most likely reason I see for this is the changed signature of SetExternalVideoSource(bool enabled, bool useTexture, EXTERNAL_VIDEO_SOURCE_TYPE sourceType, SenderOptions encodedVideoOption);. For the latter two (new) parameters I used EXTERNAL_VIDEO_SOURCE_TYPE.VIDEO_FRAME and new SenderOptions(). I can see how a default-constructed SenderOptions could create ill-defined frames:

public SenderOptions()
{
   ccMode = TCcMode.CC_ENABLED;
   codecType = VIDEO_CODEC_TYPE.VIDEO_CODEC_GENERIC_H264;
   targetBitrate = 6500;
}

Does anyone here have any pointers in the right direction, to solve the issue I am describing? Is there an updated version of the screen share tutorial?

@b00dle

b00dle commented Feb 28, 2023

Figured it out myself... If you look carefully (unlike me), you'll find a CustomCaptureVideo example shipped with the Agora SDK for Unity (located under Agora-RTC-Plugin/API-Example/Examples/Advanced/CustomCaptureVideo). If you open CustomCaptureVideo.cs, you'll see a very similar implementation to the gist above. The major difference in my case turned out to be how raw bytes are copied in newer Unity versions:

#if UNITY_2018_1_OR_NEWER
                NativeArray<byte> nativeByteArray = _texture.GetRawTextureData<byte>();
                if (_shareData?.Length != nativeByteArray.Length)
                {
                    _shareData = new byte[nativeByteArray.Length];
                }
                nativeByteArray.CopyTo(_shareData);
#else
                _shareData = _texture.GetRawTextureData();
#endif

The external video source is instantiated similarly to how I had tried:

private void SetExternalVideoSource()
{
    var ret = RtcEngine.SetExternalVideoSource(true, false, EXTERNAL_VIDEO_SOURCE_TYPE.VIDEO_FRAME, new SenderOptions());
    this.Log.UpdateLog("SetExternalVideoSource returns:" + ret);
}
