MsgWaitForMultipleObjectsEx, a genuinely core function

I used to write message loop code like this:

	MSG msg;
	// GetMessage takes four arguments (PM_REMOVE is a PeekMessage flag)
	// and blocks until a message arrives.
	while (GetMessage (&msg, NULL, 0, 0) > 0)
	{
		TranslateMessage (&msg);
		DispatchMessage (&msg);
	}

Because GetMessage blocks until a message arrives, I couldn't do anything else in the meantime. That is why I couldn't write a cross-platform UI framework that needed to realize an IOC abstraction.

While reading the GStreamer and GLib source code, I learned from their G_WIN32_MSG_HANDLE processing logic that there is a critical API, MsgWaitForMultipleObjectsEx, which can bundle window messages and all other kinds of waitable events together for polling. There is no need to repetitively call PeekMessage just to check whether a message has arrived, which would be ridiculous.

I suspect the Qt framework knows this trick, while AWTK doesn't.
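
A minimal sketch of such a loop, assuming one extra kernel event handle hEvent created elsewhere (the handle and function names here are hypothetical):

#include <windows.h>

void run_loop(HANDLE hEvent)
{
	for (;;)
	{
		// Wake on the event OR on any queued window message.
		DWORD ret = MsgWaitForMultipleObjectsEx(1, &hEvent, INFINITE,
			QS_ALLINPUT, MWMO_INPUTAVAILABLE);

		if (ret == WAIT_OBJECT_0)
		{
			// hEvent was signaled: do the non-UI work here.
		}
		else if (ret == WAIT_OBJECT_0 + 1)
		{
			// Window messages are pending: drain the queue.
			MSG msg;
			while (PeekMessage(&msg, NULL, 0, 0, PM_REMOVE))
			{
				if (msg.message == WM_QUIT)
					return;
				TranslateMessage(&msg);
				DispatchMessage(&msg);
			}
		}
	}
}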

RTSP memo

Terms (see the RTP header sketch after this list):

CC: CSRC Count
CSRC: Contributing Source
SSRC: Synchronization Source
FIR: Full Intra Request (RFC 5104)
FCI: Feedback Control Information
REMB: Receiver Estimated Max Bitrate
TWCC: Transport-wide Congestion Control
RTX: Retransmission (RFC 4588)
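
To place CC, CSRC, and SSRC, this is the fixed RTP header from RFC 3550, section 5.1, as a C sketch (wire order is big-endian; the struct is for orientation, not for direct casting):

#include <stdint.h>

/* RTP fixed header, RFC 3550 section 5.1 (sketch). */
typedef struct {
	uint8_t  v_p_x_cc;   /* V(2) P(1) X(1) CC(4): CC counts the CSRC entries */
	uint8_t  m_pt;       /* M(1) PT(7): payload type, e.g. 96 = dynamic H264 */
	uint16_t seq;        /* sequence number */
	uint32_t timestamp;  /* sampling instant (90 kHz clock for video) */
	uint32_t ssrc;       /* synchronization source identifier */
	uint32_t csrc[];     /* 0..15 contributing sources, length given by CC */
} rtp_header;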

SDP example

v=0
o=- 17618512456233465749 1 IN IP4 127.0.0.1
s=Session streamed with GStreamer
i=rtsp-server
t=0 0
a=tool:GStreamer
a=type:broadcast
a=control:*
a=range:npt=0-
m=video 0 RTP/AVP 96
c=IN IP4 0.0.0.0
b=AS:3000
a=rtpmap:96 H264/90000
a=framerate:30
a=fmtp:96 packetization-mode=1;sprop-parameter-sets=J2QAFKwrYKD9gIgAAAMACAAAAwHnQgAW4gAC3G173wdocKuA,KO48sA==;profile-level-id=640014;level-asymmetry-allowed=1
a=control:stream=0
a=ts-refclk:local
a=mediaclk:sender
a=ssrc:4060724388 cname:user1571168269@host-d3352119

refer to:
https://www.rfc-editor.org/rfc/rfc3550
https://www.rfc-editor.org/rfc/rfc3551
https://www.rfc-editor.org/rfc/rfc2326
https://www.rfc-editor.org/rfc/rfc4588
https://www.rfc-editor.org/rfc/rfc4585
https://www.rfc-editor.org/rfc/rfc7273
https://blog.csdn.net/weixin_42462202/article/details/98986535
https://github.com/fanxiushu/xdisp_virt
https://www.ngui.cc/el/72647.html

onnx relocation R_X86_64_TPOFF32 against hidden symbol

This link error can show up when pip builds onnx from source and links a static libprotobuf into a shared object; building onnx against the shared protobuf avoids it:

export CMAKE_ARGS="-DONNX_USE_PROTOBUF_SHARED_LIBS=ON"
apt-get install libprotobuf-dev protobuf-compiler
pip install onnx -i https://pypi.tuna.tsinghua.edu.cn/simple

Render video from an OpenCV Mat using Direct3D

mainwindow.h

#include "vren_thread.h"
 
class MainWindow : public QMainWindow
{
	Q_OBJECT
	...
	vren_thread vren_;
};

mainwindow.cpp

#include "mainwindow.h"
#include <opencv2/opencv.hpp>
#include <opencv2/imgproc/types_c.h>
 
void CALLBACK DecCBFun(long nPort, char* pBuf, long nSize, FRAME_INFO* pFrameInfo, long nReserved1, long nReserved2)
{
	long lFrameType = pFrameInfo->nType;
 
	if (lFrameType == T_YV12)
	{
		MainWindow* win = port2MainWindow[nPort];
		if (nullptr == win)
		{
			qDebug() << "lookup main window from " << nPort << " failed.";
			return;
		}
 
		win->realYuvCallback(pBuf, nSize, pFrameInfo->nStamp, pFrameInfo->nWidth, pFrameInfo->nHeight);
	}
}
 
void MainWindow::realYuvCallback(const char* pBuf, int len, int64_t nStamp, int width, int height)
{
	cv::Mat dst(height, width, CV_8UC3);
	cv::Mat src(height + height / 2, width, CV_8UC1, (uchar*)pBuf);
	cv::cvtColor(src, dst, CV_YUV2RGBA_YV12); // CV_YUV2BGR_YV12);
	cv::line(dst, cv::Point(0, 0), cv::Point(100, 100), cv::Scalar(255, 0, 0), 10);
 
	vren_.Render_d3d(dst);
}
 
void CALLBACK fRealDataCallBack(LONG lRealHandle, DWORD dwDataType, BYTE* pBuffer, DWORD dwBufSize, void* pUser)
{
	MainWindow* pThis = (MainWindow*)pUser;
	pThis->realDataCallback(lRealHandle, dwDataType, pBuffer, dwBufSize);
}
 
void MainWindow::realDataCallback(LONG lRealHandle, DWORD dwDataType, BYTE* pBuffer, DWORD dwBufSize)
{
	DWORD dRet = 0;
	BOOL inData = FALSE;
 
	switch (dwDataType)
	{
	case NET_DVR_SYSHEAD:
		if (!PlayM4_GetPort(&port_))
		{
			break;
		}
 
		port2MainWindow[port_] = this;
		playWnd_ = (HWND)ui->widgetVideo->winId();
		vren_.SetParam(playWnd_);
 
		if (!PlayM4_OpenStream(port_, pBuffer, dwBufSize, 1024 * 1024))
		{
			dRet = PlayM4_GetLastError(port_);
			break;
		}
 
		if (!PlayM4_SetDecCallBackEx(port_, DecCBFun, NULL, NULL))
		{
			dRet = PlayM4_GetLastError(port_);
			break;
		}
 
		if (!PlayM4_Play(port_, NULL)) // playWnd_))
		{
			dRet = PlayM4_GetLastError(port_);
			break;
		}
	}
}

vren_thread.h
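
The full vren_thread.h is not reproduced here; below is a minimal sketch of just the interface the code above relies on (SetParam and Render_d3d), with the D3D device, swap chain, and worker-thread details omitted:

#pragma once

#include <Windows.h>
#include <opencv2/core/core.hpp>

// Sketch of the render-thread wrapper used by MainWindow; only the two
// members exercised above are shown.
class vren_thread
{
public:
	void SetParam(HWND hwnd);            // window that Direct3D presents into
	void Render_d3d(const cv::Mat& rgb); // upload the frame and present it
};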


Compile GStreamer's test-launch on Windows

Include paths

D:\work\gstreamer\gstreamer\installed\include\gstreamer-1.0
D:\work\gstreamer\gstreamer\installed\include\glib-2.0
D:\work\gstreamer\gstreamer\installed\lib\glib-2.0\include

Lib paths

D:\work\gstreamer\gstreamer\installed\lib\gstreamer-1.0
D:\work\gstreamer\gstreamer\installed\lib

Libs:

gstrtspserver-1.0.lib
gstgio.lib
gstges.lib
gstmediafoundation.lib
gstwic.lib
gstadaptivedemux2.lib
gstd3d11.lib
gstnetsim.lib
gstreamer-1.0.lib
glib-2.0.lib
gobject-2.0.lib
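
Put on one command line, it might look like the sketch below (test-launch.c is the example from gst-rtsp-server; for that file the four core libs are probably enough, and the plugin .libs above would only matter for static plugin registration):

cl test-launch.c ^
   /I D:\work\gstreamer\gstreamer\installed\include\gstreamer-1.0 ^
   /I D:\work\gstreamer\gstreamer\installed\include\glib-2.0 ^
   /I D:\work\gstreamer\gstreamer\installed\lib\glib-2.0\include ^
   /link /LIBPATH:D:\work\gstreamer\gstreamer\installed\lib ^
   /LIBPATH:D:\work\gstreamer\gstreamer\installed\lib\gstreamer-1.0 ^
   gstrtspserver-1.0.lib gstreamer-1.0.lib glib-2.0.lib gobject-2.0.lib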

macOS 3D presentation in VMware

Although VMware can't provide hardware-accelerated 3D rendering for macOS guests, I happened to see GarageBand running on my old laptop. That indicates macOS Catalina (version 10.15.4) still supports a software-emulated OpenGL middle layer, while up-to-date macOS releases have deviated from this style.

invalid cast from 'GstVideoTestSrc' to 'GstBin'

videotestsrc is just an element, not a bin container, so the whole pipeline description must be wrapped in round brackets for test-launch to parse it into a bin. The stream is then served at the default mount point, rtsp://127.0.0.1:8554/test.

test-launch.exe "( videotestsrc ! qsvh264enc ! rtph264pay name=pay0 pt=96 )"
gst-launch-1.0.exe playbin uri=rtsp://127.0.0.1:8554/test --gst-debug=d3d11window:5

Build GStreamer on Windows

python -m pip install meson
python -m pip install ninja
 
git clone https://gitlab.freedesktop.org/gstreamer/gstreamer.git
cd gstreamer
 
rd /S /Q build
meson --buildtype=release -Dprefix=%CD%/installed -Dlibav=enabled -Dgst-plugins-ugly:x264=enabled -Dgst-plugins-bad:nvcodec=enabled build
 
meson compile -C build
 
meson install -C build

refer to:
https://gstreamer.freedesktop.org/documentation/installing/building-from-source-using-meson.html
http://dljz.nicethemes.cn/news/show-62582.html
https://blog.csdn.net/yuwg_le/article/details/126147636

GStreamer memo

https://gstreamer.freedesktop.org/documentation/tools/gst-launch.html

https://cloud.tencent.com/developer/article/1820526
https://blog.csdn.net/csdnhuaong/article/details/80026433
https://blog.csdn.net/qq_42711516/article/details/123921984
https://blog.csdn.net/han2529386161/article/details/102724856

https://blog.csdn.net/m0_51004308/article/details/121357638
https://blog.csdn.net/Aidam_Bo/article/details/109772430

GStreamer RTSP server: one process serving multiple different video streams
https://blog.51cto.com/u_13161667/3310768

A code example of how GStreamer ingests an RTSP stream (IP camera)
https://blog.51cto.com/u_13161667/3310521

GstMP4Mux caps
DeepStream/GStreamer: implementing pre-buffering with the queue element
https://blog.csdn.net/qq_41632852/article/details/124959601

How tee in GStreamer dynamically adds and removes branches (preview + snapshot + recording)
https://blog.csdn.net/qq_41563600/article/details/121343927

[GStreamer / cv::Mat] Converting OpenCV cv::Mat data into MP4 video
https://blog.csdn.net/weixin_44495869/article/details/121900517

# Play and save as H.264
gst-launch-1.0 -e -v  udpsrc port=10001 !  "application/x-rtp, media=video, clock-rate=90000, encoding-name=H264"  ! rtph264depay ! tee name=t  t. ! queue ! h264parse ! "video/x-h264, stream-format=byte-stream" ! filesink location=./aa.h264 t. !  queue ! avdec_h264 ! glimagesink

# Mixing + scaling
./gst-launch-1.0  intervideosrc !  video/x-raw,framerate=(fraction)10/1,width=3200,height=2400 ! videomixer name=mix sink_1::xpos=0 sink_2::xpos=1000 sink_2::width=100 sink_2::height=100 ! videoconvert ! glimagesink filesrc location=./ttt.mov ! decodebin ! video/x-raw,width=1280,height=720 ! videoscale ! video/x-raw,width=100  ! mix.  videotestsrc ! video/x-raw ! mix.

gst-launch-1.0.exe videotestsrc ! queue ! d3d11videosink

http://t.zoukankan.com/missmzt-p-10918216.html

GStreamer plugin appsrc: solving high latency when playing a live video stream
https://blog.csdn.net/zzs0829/article/details/111562443

https://gstreamer.freedesktop.org/documentation/tutorials/basic/debugging-tools.html

g_setenv("GST_DEBUG_DUMP_DOT_DIR", "D:/work/gstreamer/gstdot", true);
gst_init(NULL, NULL);

GST_DEBUG_BIN_TO_DOT_FILE(GST_BIN(pipeline), GST_DEBUG_GRAPH_SHOW_ALL, "capture1234");

g_setenv("GST_DEBUG_FILE", "D:/work/gstreamer/gstdot/output.log", true);
g_setenv("G_MESSAGES_DEBUG","all",TRUE);
gst_debug_set_default_threshold(GST_LEVEL_TRACE);
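
The generated .dot file can then be rendered with standard Graphviz, e.g.:

dot -Tpng capture1234.dot -o capture1234.png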

http://www.graphviz.org/doc/info/command.html

push
	the upstream element calls gst_pad_push on its own source pad to hand a buffer to the downstream peer
pull
	the downstream element calls gst_pad_pull_range on its own sink pad to request data from the upstream peer
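
A short sketch of which side calls which function (the pad variables are hypothetical):

#include <gst/gst.h>

/* Push mode: the upstream element drives the dataflow. gst_pad_push
   takes ownership of buf and invokes the chain function of the peer
   sink pad downstream. */
static GstFlowReturn
push_downstream(GstPad *srcpad, GstBuffer *buf)
{
	return gst_pad_push(srcpad, buf);
}

/* Pull mode: the downstream element drives the dataflow, fetching a
   byte range from the upstream peer (e.g. a demuxer reading from
   filesrc). */
static GstFlowReturn
pull_upstream(GstPad *sinkpad, guint64 offset, guint size, GstBuffer **buf)
{
	return gst_pad_pull_range(sinkpad, offset, size, buf);
}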

gst_rtspsrc_loop_interleaved
gst_rtspsrc_stream_configure_manager
gst_qt_mux_sink_event_pre_queue

rtp_session_process_sr
https://stackoverflow.com/questions/53169699/gstreamer-calculate-delay-in-received-video-frames-buffers-to-detect-communicat