
@Namburger
Created August 28, 2020 22:51
disable-hdmi-mendel.dts
/dts-v1/;
/plugin/;

/ {
    compatible = "fsl,imx8mq-hdmi";

    fragment@0 {
        target-path = "/hdmi@32c00000";
        __overlay__ {
            status = "disabled";
        };
    };

    fragment@1 {
        target-path = "/dcss@0x32e00000";
        __overlay__ {
            status = "disabled";
        };
    };
};
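Assuming the standard device-tree compiler workflow (the exact paths and boot mechanism vary by Mendel image, so treat this as a sketch), the overlay source above can be compiled into a binary overlay blob with `dtc`:

```shell
# Compile the overlay source (.dts) into a binary overlay blob (.dtbo).
# -@ keeps symbol information so the overlay fragments can be resolved
# against the base tree when the overlay is applied.
dtc -@ -I dts -O dtb -o disable-hdmi-mendel.dtbo disable-hdmi-mendel.dts
```

How the resulting `.dtbo` gets applied at boot depends on your image (typically via the boot configuration); check your board's documentation.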
@Namburger (Author)

@hwjalapeno the only thing this overlay does is disable HDMI, so you can't expect to be able to use the display; other than that, I don't anticipate any other gotchas. You are running edgetpu_classify_server, which tries to sink the images to a display using glimagesink. That definitely isn't going to work.

> So assuming I will recompile the model with this addition tomorrow

You wouldn't have to recompile the model; just try using the headless pipeline.

@hwjalapeno

Sure, will check it out, @borguleabhijeet , thanks!


@borguleabhijeet

borguleabhijeet commented Dec 7, 2022 via email

@hwjalapeno

Sure, will try it out, thanks @Namburger

@hwjalapeno

Sure, will try out your piece of code, thanks @borguleabhijeet

@hwjalapeno

```python
import socket

import cv2
import gi

gi.require_version('Gst', '1.0')
gi.require_version('GstRtspServer', '1.0')
from gi.repository import Gst, GObject, GstRtspServer

# Discover the local IP address: a UDP "connect" sends no packets,
# it just selects the outgoing interface.
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
s.connect(("8.8.8.8", 80))
IpAddr = s.getsockname()[0]
s.close()


class SensorFactory(GstRtspServer.RTSPMediaFactory):
    def __init__(self, **properties):
        super(SensorFactory, self).__init__(**properties)
        self.cap = cv2.VideoCapture(0)
        # self.cap.set(cv2.CAP_PROP_MODE, cv2.CAP_MODE_YUYV)
        self.cap.set(3, 1280)  # frame width
        self.cap.set(4, 720)   # frame height
        self.number_frames = 0
        self.fps = 30
        self.duration = 1 / self.fps * Gst.SECOND  # duration of a frame in nanoseconds
        self.launch_string = (
            'appsrc name=source is-live=true block=true format=GST_FORMAT_TIME '
            'caps=video/x-raw,format=BGR,width=1280,height=720,framerate={}/1 '
            '! videoconvert ! video/x-raw,format=I420 '
            '! x264enc speed-preset=ultrafast tune=zerolatency '
            '! rtph264pay config-interval=1 name=pay0 pt=96'.format(30))
        print(self.launch_string)

    def on_need_data(self, src, length):
        if self.cap.isOpened():
            ret, frame = self.cap.read()
            if ret:
                data = frame.tobytes()
                buf = Gst.Buffer.new_allocate(None, len(data), None)
                buf.fill(0, data)
                buf.duration = self.duration
                timestamp = self.number_frames * self.duration
                buf.pts = buf.dts = int(timestamp)
                buf.offset = timestamp
                self.number_frames += 1
                retval = src.emit('push-buffer', buf)
                if retval != Gst.FlowReturn.OK:
                    print(retval)

    def do_create_element(self, url):
        print("Parsed")
        return Gst.parse_launch(self.launch_string)

    def do_configure(self, rtsp_media):
        self.number_frames = 0
        appsrc = rtsp_media.get_element().get_child_by_name('source')
        appsrc.connect('need-data', self.on_need_data)
        print("configured")


class GstServer(GstRtspServer.RTSPServer):
    def __init__(self, **properties):
        super(GstServer, self).__init__(**properties)
        self.address = IpAddr  # local IP of the board
        self.set_address(self.address)
        self.set_service('8558')
        self.factory = SensorFactory()
        self.factory.set_shared(True)
        self.get_mount_points().add_factory("/test", self.factory)
        self.attach(None)


GObject.threads_init()
Gst.init(None)
server = GstServer()
print('Listening on: rtsp://{0}:{1}'.format(
    server.get_address(), server.get_bound_port()))
loop = GObject.MainLoop()
loop.run()
```

Sorry, I'm working from my phone. Try this code; it works for me.
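The timestamping in `on_need_data` above is plain arithmetic: each buffer lasts `Gst.SECOND / fps` nanoseconds and the PTS is the frame index times that duration. A minimal sketch of the same logic without GStreamer (assuming `Gst.SECOND == 10**9`, its value in nanoseconds, and using integer division for clarity):

```python
GST_SECOND = 10**9  # nanoseconds per second; same value as Gst.SECOND


def frame_timing(fps, n_frames):
    """Return (pts, duration) pairs in nanoseconds, mirroring on_need_data()."""
    duration = GST_SECOND // fps  # duration of one frame
    return [(i * duration, duration) for i in range(n_frames)]


# At 30 fps every frame lasts 33,333,333 ns and PTS advances by that amount
print(frame_timing(30, 2))  # [(0, 33333333), (33333333, 33333333)]
```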


Hey, I tried out your code on the Google Coral.

Here is the warning log; we are not able to stream it on VLC or on online RTSP servers.

Could you help us out here?

@borguleabhijeet

It works; you need to install the following package first:

sudo apt install gir1.2-gst-rtsp-server-1.0

Next, change the frame width and height from 1280x720 to 640x480 in both the GStreamer pipeline caps and the cap.set() calls, or comment them out.

Finally, append /test after the IP address and port, as in the following example:

rtsp://192.168.0.1:8558/test

(Screenshot of the working example omitted.)
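For clients, the full stream URL is just the server address, the port passed to `set_service()`, and the mount point passed to `add_factory()`. A small illustrative helper (the function name and defaults here are mine, not part of the server code above):

```python
def rtsp_url(host, port=8558, mount="/test"):
    """Build the RTSP stream URL; `mount` must match add_factory()."""
    return "rtsp://{}:{}{}".format(host, port, mount)


print(rtsp_url("192.168.0.1"))  # rtsp://192.168.0.1:8558/test
```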

@hwjalapeno

Sure thanks a lot, will check it out

@hwjalapeno

Hi @Namburger

Found this in the Google Coral SoM Documentation

"HDMI reference clock (27 MHz) (positive/negative). Required for bootup, even if you don't use HDMI."

This is blocking me from accessing the EdgeTPU for any application: with your device tree overlay it fails to detect the EdgeTPU at all. Is this expected, or are you able to use it without this clock? Is there any way to disable this clock so that I can use the EdgeTPU properly? Thanks

@Namburger (Author)

Namburger commented Dec 13, 2022

@hwjalapeno it does not block you from accessing the EdgeTPU in any way. As I already mentioned above, your issue is that the application is trying to sink your result to a monitor, which fails. There are two things we suggested:

  • Have you tried using the headless pipeline, as I mentioned?
  • @borguleabhijeet showed you how to sink the resulting buffer to a remote monitor via RTSP, and it worked for him; did you try that?

@borguleabhijeet

borguleabhijeet commented Dec 13, 2022 via email

