Getting notified that a socket is ready after WSAEWOULDBLOCK

There are two ways to achieve this:

  • Use the select API.
  • Use WSAEventSelect and WSAWaitForMultipleEvents. This method is recommended because a cancellation event can be polled alongside the socket event, much like GLib's GCancellable. A cancellable design reacts promptly instead of blocking until a sub-module happens to finish on its own.
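
The second approach can be sketched as follows. This is an untested, Windows-only sketch with error handling trimmed; the function name wait_connect_cancellable is our own, not a Winsock API:

```cpp
// Sketch: poll a connecting non-blocking socket together with a
// user-controlled cancel event, in the spirit of GLib's GCancellable.
#include <winsock2.h>

bool wait_connect_cancellable(SOCKET s, WSAEVENT cancel_event, DWORD timeout_ms)
{
	WSAEVENT sock_event = WSACreateEvent();
	WSAEventSelect(s, sock_event, FD_CONNECT | FD_WRITE | FD_CLOSE);

	WSAEVENT handles[2] = { sock_event, cancel_event };
	DWORD r = WSAWaitForMultipleEvents(2, handles, FALSE, timeout_ms, FALSE);

	bool ready = false;
	if (r == WSA_WAIT_EVENT_0) // the socket event fired
	{
		WSANETWORKEVENTS ev;
		WSAEnumNetworkEvents(s, sock_event, &ev);
		ready = (ev.lNetworkEvents & FD_CONNECT) == 0 ||
		        ev.iErrorCode[FD_CONNECT_BIT] == 0;
	}
	// r == WSA_WAIT_EVENT_0 + 1: cancel_event fired; r == WSA_WAIT_TIMEOUT: timed out.

	WSAEventSelect(s, NULL, 0); // detach the event from the socket
	WSACloseEvent(sock_event);
	return ready;
}
```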

refer to:
gstreamer/subprojects/glib/gio/gsocket.c:g_socket_condition_timed_wait
https://blog.csdn.net/qq_30145355/article/details/78379969

Build FFmpeg on Windows

In an MSYS2 x64 bash shell,

pacman -S mingw-w64-x86_64-toolchain
pacman -S base-devel
pacman -S yasm nasm gcc

basedir=/d/work/open

cd ${basedir}/x264
./configure --prefix=${basedir}/x264_install --enable-static --extra-cflags="-O0 -g3" --enable-debug
make
make install

cd ${basedir}/ffmpeg
./configure --prefix=${basedir}/ffmpeg_install --enable-static --disable-shared --enable-gpl --enable-libx264 --extra-cflags=-I${basedir}/x264_install/include --extra-ldflags=-L${basedir}/x264_install/lib
make
make install
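
If configure still fails to detect x264, pointing pkg-config at the x264 install prefix is a common fix. A hedged sketch, assuming the basedir layout above:

```shell
# Let ffmpeg's configure find x264.pc from the local install prefix.
export PKG_CONFIG_PATH=${basedir}/x264_install/lib/pkgconfig:$PKG_CONFIG_PATH
```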

refer to:
http://events.jianshu.io/p/53ecc4dbe7d0
https://www.videolan.org/developers/x264.html
http://ffmpeg.org/download.html
https://www.msys2.org/
https://git-scm.com/download/win

Perspectively transform points

After we get the transformation matrix trans_mat from cv::getPerspectiveTransform, we can transform a single point like this:

        cv::Point2f src = cv::Point2f(123, 456);
        std::vector<cv::Point2f> in_pts, out_pts;
        in_pts.push_back(src);
        cv::perspectiveTransform(in_pts, out_pts, trans_mat);
        cv::Point2f dst = out_pts.front();

refer to:
https://blog.csdn.net/xiaowei_cqu/article/details/26478135

MsgWaitForMultipleObjectsEx, the real core function

I used to write message-loop code like this:

	while (GetMessage (&msg, NULL, 0, 0)) // GetMessage takes no PM_REMOVE flag; that belongs to PeekMessage
	{
		TranslateMessage (&msg);
		DispatchMessage (&msg);
	}

While no message arrived, the thread could do nothing else. That is why I could not build a cross-platform UI framework, which would have needed an IoC (inversion-of-control) abstraction over the event loop.

From the G_WIN32_MSG_HANDLE handling logic in the GStreamer and GLib source code, I learned about a critical API, MsgWaitForMultipleObjectsEx, which bundles window messages and all other kinds of waitable events into a single poll. There is no need to call PeekMessage repeatedly just to check whether a message has arrived.

I suspect the Qt framework knows this trick, while AWTK does not.
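
A combined loop in that style can be sketched like this (untested, Windows-only; the handles array of non-UI events is an assumption for illustration):

```cpp
// Sketch: wait on kernel handles AND the thread's message queue at once.
#include <windows.h>

void run_loop(HANDLE* handles, DWORD count)
{
	for (;;)
	{
		DWORD r = MsgWaitForMultipleObjectsEx(count, handles, INFINITE,
		                                      QS_ALLINPUT, MWMO_INPUTAVAILABLE);
		if (r == WAIT_OBJECT_0 + count) // a window message is pending
		{
			MSG msg;
			while (PeekMessage(&msg, NULL, 0, 0, PM_REMOVE))
			{
				if (msg.message == WM_QUIT)
					return;
				TranslateMessage(&msg);
				DispatchMessage(&msg);
			}
		}
		else if (r >= WAIT_OBJECT_0 && r < WAIT_OBJECT_0 + count)
		{
			// handles[r - WAIT_OBJECT_0] is signaled:
			// dispatch the corresponding non-UI event here.
		}
	}
}
```

Note that PeekMessage is still used, but only after the wait reports pending input, not in a busy poll.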

RTSP memo

Terms:

CC: CSRC Counter
CSRC: Contributing Source
SSRC: Synchronization Source
FIR: Full Intra Request, rfc5104
FCI: Feedback Control Information
REMB: Receiver Estimated Max Bitrate
TWCC: Transport-Wide Congestion Control
RTX: Retransmission

SDP example

v=0
o=- 17618512456233465749 1 IN IP4 127.0.0.1
s=Session streamed with GStreamer
i=rtsp-server
t=0 0
a=tool:GStreamer
a=type:broadcast
a=control:*
a=range:npt=0-
m=video 0 RTP/AVP 96
c=IN IP4 0.0.0.0
b=AS:3000
a=rtpmap:96 H264/90000
a=framerate:30
a=fmtp:96 packetization-mode=1;sprop-parameter-sets=J2QAFKwrYKD9gIgAAAMACAAAAwHnQgAW4gAC3G173wdocKuA,KO48sA==;profile-level-id=640014;level-asymmetry-allowed=1
a=control:stream=0
a=ts-refclk:local
a=mediaclk:sender
a=ssrc:4060724388 cname:user1571168269@host-d3352119

refer to:
https://www.rfc-editor.org/rfc/rfc3550
https://www.rfc-editor.org/rfc/rfc3551
https://www.rfc-editor.org/rfc/rfc2326
https://www.rfc-editor.org/rfc/rfc4588
https://www.rfc-editor.org/rfc/rfc4585
https://www.rfc-editor.org/rfc/rfc7273
https://blog.csdn.net/weixin_42462202/article/details/98986535
https://github.com/fanxiushu/xdisp_virt
https://www.ngui.cc/el/72647.html

onnx relocation R_X86_64_TPOFF32 against hidden symbol

This link error can appear when the onnx Python package is built from source against the static protobuf; building against the shared protobuf avoids it:

export CMAKE_ARGS="-DONNX_USE_PROTOBUF_SHARED_LIBS=ON"
apt-get install libprotobuf-dev protobuf-compiler
pip install onnx -i https://pypi.tuna.tsinghua.edu.cn/simple

Render video from an OpenCV Mat using Direct3D

mainwindow.h

#include "vren_thread.h"
 
class MainWindow : public QMainWindow
{
	Q_OBJECT
	...
	vren_thread vren_;
};

mainwindow.cpp

#include "mainwindow.h"
#include <opencv2/opencv.hpp>
#include <opencv2/imgproc/types_c.h>
 
void CALLBACK DecCBFun(long nPort, char* pBuf, long nSize, FRAME_INFO* pFrameInfo, long nReserved1, long nReserved2)
{
	long lFrameType = pFrameInfo->nType;
 
	if (lFrameType == T_YV12)
	{
		MainWindow* win = port2MainWindow[nPort];
		if (nullptr == win)
		{
			qDebug() << "lookup main window from " << nPort << " failed.";
			return;
		}
 
		win->realYuvCallback(pBuf, nSize, pFrameInfo->nStamp, pFrameInfo->nWidth, pFrameInfo->nHeight);
	}
}
 
void MainWindow::realYuvCallback(const char* pBuf, int len, int64_t nStamp, int width, int height)
{
	// A YV12 frame is height * 3/2 rows of one-byte samples.
	cv::Mat src(height + height / 2, width, CV_8UC1, (uchar*)pBuf);
	cv::Mat dst; // cvtColor allocates the 4-channel output itself
	cv::cvtColor(src, dst, CV_YUV2RGBA_YV12); // or CV_YUV2BGR_YV12 for 3-channel BGR
	cv::line(dst, cv::Point(0, 0), cv::Point(100, 100), cv::Scalar(255, 0, 0), 10);
 
	vren_.Render_d3d(dst);
}
 
void CALLBACK fRealDataCallBack(LONG lRealHandle, DWORD dwDataType, BYTE* pBuffer, DWORD dwBufSize, void* pUser)
{
	MainWindow* pThis = (MainWindow*)pUser;
	pThis->realDataCallback(lRealHandle, dwDataType, pBuffer, dwBufSize);
}
 
void MainWindow::realDataCallback(LONG lRealHandle, DWORD dwDataType, BYTE* pBuffer, DWORD dwBufSize)
{
	DWORD dRet = 0;
	BOOL inData = FALSE;
 
	switch (dwDataType)
	{
	case NET_DVR_SYSHEAD:
		if (!PlayM4_GetPort(&port_))
		{
			break;
		}
 
		port2MainWindow[port_] = this;
		playWnd_ = (HWND)ui->widgetVideo->winId();
		vren_.SetParam(playWnd_);
 
		if (!PlayM4_OpenStream(port_, pBuffer, dwBufSize, 1024 * 1024))
		{
			dRet = PlayM4_GetLastError(port_);
			break;
		}
 
		if (!PlayM4_SetDecCallBackEx(port_, DecCBFun, NULL, NULL))
		{
			dRet = PlayM4_GetLastError(port_);
			break;
		}
 
		if (!PlayM4_Play(port_, NULL)) // playWnd_))
		{
			dRet = PlayM4_GetLastError(port_);
			break;
		}
	}
}

vren_thread.h


Compile GStreamer's test-launch on Windows

Include paths

D:\work\gstreamer\gstreamer\installed\include\gstreamer-1.0
D:\work\gstreamer\gstreamer\installed\include\glib-2.0
D:\work\gstreamer\gstreamer\installed\lib\glib-2.0\include

Lib paths

D:\work\gstreamer\gstreamer\installed\lib\gstreamer-1.0
D:\work\gstreamer\gstreamer\installed\lib

Libs:

gstrtspserver-1.0.lib
gstgio.lib
gstges.lib
gstmediafoundation.lib
gstwic.lib
gstadaptivedemux2.lib
gstd3d11.lib
gstnetsim.lib
gstreamer-1.0.lib
glib-2.0.lib
gobject-2.0.lib

macOS 3D presentation in VMware

Although VMware cannot provide hardware-accelerated 3D rendering for macOS guests, I happened to see GarageBand running on my old laptop. That suggests macOS Catalina (version 10.15.4) still ships a software-rendered OpenGL fallback layer, while up-to-date macOS releases have moved away from it.