I wish to use the FFmpeg libraries from C++ to simultaneously send the same H.264 video as two streams: RTP H.264 to Server A, and RTMP H.264 to Server B. Note: the H.264 encoding should only happen once per frame; the same video content will be sent to both streams. I wish the RTP latency to be as ..
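At the command-line level, the "encode once, send to two outputs" pattern this question describes is what FFmpeg's `tee` muxer does; a minimal sketch (the hostnames, port, and stream key are placeholders, not from the question):

```shell
# Encode the video once with libx264, then fan the single encoded
# stream out to two muxers via the tee muxer:
#   - RTP to a hypothetical Server A on port 5004
#   - RTMP (FLV container) to a hypothetical Server B
ffmpeg -re -i input.mp4 \
  -map 0:v -c:v libx264 -preset veryfast -tune zerolatency \
  -f tee "[f=rtp]rtp://server-a.example:5004|[f=flv]rtmp://server-b.example/live/streamkey"
```

The same fan-out can be reproduced in C++ with libavformat by encoding each `AVFrame` once and calling `av_interleaved_write_frame()` on two separate `AVFormatContext`s (one `rtp`, one `flv`), cloning the packet for the second context.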
I am trying to reduce connection loss between 2 peers. They are on the same local network. Looking through the source code, I found parameters that set the values used to check whether the connection is still writable: ice_transport_internal.h // The min time period for which a candidate pair must wait for response to ..
I am completely new to WebRTC. I have looked at a few open-source WebRTC projects, but most of them were based on JavaScript. I want to understand the architecture of the WebRTC C++ code. I have traced the path from PulseAudio to the audio processing module, and I want to know how this is linked to the peer ..
I am trying to trace the WebRTC C++ source flow. I have found the capture point at PulseAudio and its link to the audio processing module. Since WebRTC provides a peer connection application in the examples, I am using that for debugging. Now I want to find out how this peerconnection application (peer connection client and ..
I’m trying to make a server-side audio streaming application. I decided to use WebRTC: web + native C++. I have completed full initialization and connection between web and native (with data channels), but now I have a problem with audio. After some research I tried to receive audio from the client. In the OnAddStream callback I’m creating an AudioTrackSinkInterface with ..
I’m building a personal project that makes a one-to-many connection with other users. This is what my plan looks like right now: the user initiates a P2P connection with the C++ server; the C++ server initiates a P2P connection with the other clients and forwards the broadcaster's stream. I’m planning to use the WebRTC C++ API available ..
For weeks I have been researching, trying to find an open-source video conferencing platform that allows integration from native clients. I am working on a solution that needs to communicate from a native app via a Windows DLL (at least initially). Ultimately, I want any OS to be able to talk to it natively. A lot ..
I am trying to build my JNI class with a single line that calls WebRTC audio processing: webrtc::AudioProcessing* apm = webrtc::AudioProcessingBuilder().Create(); I am also trying to link libjingle_peerconnection_so.so into my project so that I can use the audio processing implementation from this shared library. My Android.mk looks as follows: LOCAL_PATH := $(call my-dir) ..
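The truncated Android.mk presumably declares the prebuilt library and the JNI module. A minimal sketch of that shape, assuming hypothetical module and source names (only `LOCAL_PATH` and the library name come from the question):

```makefile
LOCAL_PATH := $(call my-dir)

# Declare the WebRTC library as a prebuilt shared library
# (the per-ABI path is an assumption about the project layout).
include $(CLEAR_VARS)
LOCAL_MODULE    := libjingle_peerconnection_so
LOCAL_SRC_FILES := $(TARGET_ARCH_ABI)/libjingle_peerconnection_so.so
include $(PREBUILT_SHARED_LIBRARY)

# The JNI module that calls webrtc::AudioProcessingBuilder().
include $(CLEAR_VARS)
LOCAL_MODULE           := my_jni            # hypothetical name
LOCAL_SRC_FILES        := my_jni.cc         # hypothetical name
LOCAL_SHARED_LIBRARIES := libjingle_peerconnection_so
include $(BUILD_SHARED_LIBRARY)
```

One caveat worth noting: libjingle_peerconnection_so.so is built for the Java bindings and typically exports only JNI entry points, so C++ symbols such as `webrtc::AudioProcessingBuilder` may not be visible to the linker even when the library itself links correctly.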
With respect to the flutter-webrtc demo app (https://github.com/flutter-webrtc/flutter-webrtc-demo), I have integrated the WebRTC code in my app and it works fine on macOS and Android. However, I am facing an issue on Windows: I am not able to communicate between Windows and Mac. macOS to macOS works well. While debugging I found that RTCVideoValue width ..