How to add an EXT-X-DISCONTINUITY tag when the pushed live stream changes in Wowza
How do I add an EXT-X-DISCONTINUITY tag when something about the source stream changes during a live broadcast, such as:
- file format
- number and type of tracks
- encoding parameters
- encoding sequence
- timestamp sequence
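For reference, this is roughly what the desired playlist output looks like: the packager inserts the tag before the first segment produced with the new parameters (a hand-written sketch, not actual Wowza output; segment names are made up):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
media_0.ts
#EXTINF:10.0,
media_1.ts
#EXT-X-DISCONTINUITY
#EXTINF:10.0,
media_2.ts
```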
See also questions close to this topic
Livestream programmatically generated audio/video over RTMP
I'm looking for a way to stream data generated by a program over RTMP without saving it to an intermediate file and then streaming that file with ffmpeg. For example, say I have a program that constantly generates white noise as audio samples (e.g. a function that outputs a random sample, run in an infinite loop) and, if possible, a screen of a solid random color. Is there a way for me to stream this audio over RTMP more or less as it's being generated? I would imagine, if this is possible, it may involve building up buffers of a fixed number of audio samples and then streaming them somehow over RTMP.
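One common approach is to have the generator write raw PCM to stdout and let ffmpeg read the pipe, so nothing touches an intermediate file. A minimal sketch, using /dev/urandom as a stand-in for the white-noise generator; the rtmp:// URL is a placeholder for your own ingest server:

```shell
# /dev/urandom stands in for any program that writes raw 16-bit
# little-endian PCM samples to stdout; ffmpeg reads them from the pipe,
# encodes to AAC, and pushes FLV over RTMP without writing a file.
# 1764000 bytes = 20 s of mono 16-bit audio at 44.1 kHz.
head -c 1764000 /dev/urandom \
  | ffmpeg -f s16le -ar 44100 -ac 1 -i - \
           -c:a aac -b:a 128k \
           -f flv rtmp://example.com/live/streamkey
```

A solid-color video track could be added the same way through a second raw pipe, or generated inside ffmpeg itself with a `-f lavfi -i color=...` input.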
HTML5 Live Streaming NGINX RTMP dash.js
I have set up an nginx server with the rtmp module so that livestreams from my clients, captured with OBS, can be watched in a browser. I am using the dash.js player, but I have huge problems playing the streams in Internet Explorer. IE keeps stalling the video for more than 30 seconds, resuming for 3 seconds, and then stalling again; Firefox has no problems at all. Is there any way to reduce the stalling via nginx.conf or my player settings? Has anyone experienced such performance issues before? Here are my configs:
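Without seeing the actual configs it is hard to say, but stalls of that length often come down to fragment and playlist sizing. In nginx-rtmp-module the relevant knobs look roughly like this (directive names are from the module's documentation; the path and the values are placeholders to tune, not a known fix for this case):

```
rtmp {
    server {
        listen 1935;
        application live {
            live on;
            dash on;
            dash_path /tmp/dash;
            dash_fragment 3s;         # shorter fragments -> lower startup latency
            dash_playlist_length 30s; # keep enough fragments available to the player
        }
    }
}
```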
videojs HLS plugin: how to change the xhr uri
I have been looking everywhere but couldn't find an answer.
I am using video.js plus the videojs-HLS plugin to stream playlist.m3u8.
The timeout function within the plugin makes a request to a URI like (for example):
I need to change this automatically generated URI to (for example):
I tried to hook "videojs.Hls.xhr.beforeRequest" with no effect.
Thank you very much! Please help.
-hls_playlist does not work in ffmpeg's dash section
I am using this build of ffmpeg:
ffmpeg version 3.4.1 Copyright (c) 2000-2017 the FFmpeg developers built with gcc 6.2.1 (Alpine 6.2.1) 20160822
I'm trying to do dash and hls at the same time.
In the documentation from here http://ffmpeg.org/ffmpeg-all.html#dash-2
the key is:
-hls_playlist hls_playlist Generate HLS playlist files as well. The master playlist is generated with the filename master.m3u8. One media playlist file is generated for each stream with filenames media_0.m3u8, media_1.m3u8, etc.
i.e. the mpd and m3u8 should be created at the same time.
ffmpeg -re \
  -analyzeduration 20000000 \
  -i udp://22.214.171.124:1234?overrun_nonfatal=1 \
  -map 0 \
  -c copy \
  -f dash \
  -min_seg_duration 5000000 \
  -window_size 10 \
  -extra_window_size 10 \
  -remove_at_exit 1 \
  -hls_playlist 1 \
  -use_timeline 1 \
  -use_template 1 \
  -vtag avc1 \
  -atag mp4a \
  cam01.mpd
and get an error:
Unrecognized option 'hls_playlist'. Error splitting the argument list: Option not found
Without the -hls_playlist option everything works correctly.
What could be the problem?
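For what it's worth, the dash muxer's hls_playlist option does not exist in the 3.4 series; it was added later (it is present in FFmpeg 4.0 and newer), which would explain "Unrecognized option". With a newer build, a cleaned-up version of the same command should be accepted (same options as in the question, just reformatted; whether it then behaves as desired is untested here):

```shell
# Requires an ffmpeg build whose dash muxer supports -hls_playlist
# (FFmpeg 4.0+). The udp:// input is the one from the question.
ffmpeg -re -analyzeduration 20000000 \
  -i "udp://22.214.171.124:1234?overrun_nonfatal=1" \
  -map 0 -c copy \
  -f dash -min_seg_duration 5000000 \
  -window_size 10 -extra_window_size 10 \
  -remove_at_exit 1 \
  -hls_playlist 1 \
  -use_timeline 1 -use_template 1 \
  -vtag avc1 -atag mp4a \
  cam01.mpd
```

You can check whether your build supports the option without running a stream: `ffmpeg -h muxer=dash` lists it when it is available.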
Create HLS stream from MP4 File
I have a bunch of pre-generated MP4 files. I want to pass them in somewhere and get a simulated HLS stream as my output (an m3u8 file and a bunch of .ts files). Are there any easy ways to do this?
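ffmpeg's hls muxer does exactly this; a minimal sketch, where input.mp4 stands for one of the pre-generated files:

```shell
# Segment an existing MP4 into an HLS playlist plus .ts segments.
# -c copy avoids re-encoding (works when the MP4 is already H.264/AAC);
# -hls_list_size 0 keeps every segment in the playlist (VOD-style).
ffmpeg -i input.mp4 -c copy \
       -f hls -hls_time 6 -hls_list_size 0 \
       output.m3u8
```

For a "simulated live" feed rather than a VOD playlist, the usual additions are `-re` on the input (read at native frame rate) and a sliding window instead of `-hls_list_size 0`.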
Playing Wowza Live Stream on Xamarin.Android Application
I am developing an Android application (and eventually an iOS one) in which users need to be able to watch a live stream. My live stream is currently being streamed to the Wowza Streaming Cloud from my phone's camera using the Wowza GoCoder mobile app, but I cannot figure out how to take the RTSP link given to me, which is of the form rtsp://xxxxx.cloud.wowza.com/app-xxxx, and play it through the Android app. I have tried looking around for resources on how to accomplish this, but have not been successful so far. I tried playing the stream with the VideoView component like this, but I get a "Can't play this video." error:
var videoView = FindViewById<VideoView>(Resource.Id.SampleVideoView);
var uri = Android.Net.Uri.Parse("rtsp://xxxxx.cloud.wowza.com/app-xxxx");
videoView.SetVideoURI(uri);
videoView.Visibility = ViewStates.Visible;
videoView.Start();
It seems like this should be possible according to the Android Supported Media Formats. Links such as the following, rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov, work fine. If anyone could give me some guidance on how to display a live stream using Xamarin.Android (with Wowza or not), it would be greatly appreciated. Just to clarify, I do not need users of the app to livestream video themselves, I just need users to be able to watch my livestream on the application.
If this helps, I am looking to accomplish something similar to the HQ Trivia app, in which all users view a live stream simultaneously.
Video file with multiple audio tracks is not playing in Wowza video on demand
I am using Wowza Streaming Engine (version 4.7) for VOD streaming. My requirement is to play a video file with multiple audio tracks. I set up a plugin from GitHub for multiple audio tracks.
I also set up a SMIL file, but multiple audio is not working. Another problem is that my video.mp4 file with multiple audio tracks plays fine in VLC but produces noise in Wowza.
Please share a solution with me. An example of VOD with multiple audio tracks would be really helpful.
My OS is Windows 10, and I am trying to play the video over Adobe HDS.
Area: Playback Engine Version: 4.7
Live streaming with DJI iOS SDK + Wowza SDK
I want to do live streaming with the DJI iOS SDK and the Wowza GoCoder SDK. Can anyone help me with how it works, what the process would be, and perhaps some sample code?