Update RTSP tutorial #101
In the tutorial we won't explain how to implement the solution from the ground up - instead, we will run the existing code from [Membrane demos](https://github.com/membraneframework/membrane_demo).

To run the RTSP to HLS converter, first clone the demos repo:

```bash
git clone https://github.com/membraneframework/membrane_demo.git
```

Then enter the demo's directory:

```bash
cd membrane_demo/rtsp_to_hls
```

Install the dependencies:

```bash
mix deps.get
```

Make sure you have these libraries installed as well:
- gcc
- libc-dev
- ffmpeg

On Ubuntu:

```bash
apt-get install gcc libc-dev ffmpeg
```

Take a look inside the `lib/application.ex` file. It's responsible for starting the pipeline. We need to pass a few arguments to the pipeline:

```elixir
rtsp_stream_url = "rtsp://localhost:30001"
output_path = "hls_output"
rtp_port = 20000
```

The `output_path` variable defines the storage directory for the HLS files, and `rtp_port` defines the port on which we will expect the RTP stream once the RTSP connection is established.

The `rtsp_stream_url` variable contains the address of the stream which we will be converting. If you want to receive a stream from some accessible RTSP server, you can pass its URL here. In this demo we'll run our own simple server, using port 30001:

```bash
mix run server.exs
```

Now we can start the application:

```bash
mix run --no-halt
```

The pipeline will start playing, and after a couple of seconds the HLS files should appear in the `output_path` directory. In order to play the stream, we first need to serve the files, for example with Python's built-in HTTP server:

```bash
python3 -m http.server 8000
```

Then we can play the stream using [ffmpeg](https://ffmpeg.org/), by pointing to the location of the manifest file:

```bash
ffplay http://YOUR_MACHINE_IP:8000/rtsp_to_hls/hls_output/index.m3u8
```
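If the player shows nothing, it can help to inspect the manifest the sink produced before reaching for ffplay. The sketch below shows how an HLS playlist can be split into tags and media segment URIs; the sample playlist is illustrative, not the demo's actual output:

```python
# Minimal HLS (m3u8) playlist parser: separates tag lines from media segment URIs.
# SAMPLE is a hypothetical playlist; the real one is hls_output/index.m3u8.
SAMPLE = """#EXTM3U
#EXT-X-VERSION:7
#EXT-X-TARGETDURATION:4
#EXTINF:4.0,
video_segment_0.m4s
#EXTINF:4.0,
video_segment_1.m4s
"""

def parse_playlist(text):
    """Return (tags, segments): playlist tag lines and media segment URIs."""
    tags, segments = [], []
    for line in text.strip().splitlines():
        if line.startswith("#"):
            tags.append(line)
        elif line:
            segments.append(line)
    return tags, segments

tags, segments = parse_playlist(SAMPLE)
print(segments)  # → ['video_segment_0.m4s', 'video_segment_1.m4s']
```

A healthy manifest keeps gaining segments as the pipeline runs, so re-reading it after a few seconds should show new entries.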
As explained in the [Architecture chapter](08_RTSP_Architecture.md), the pipeline will consist of an RTSP Source and an HLS Sink Bin. For now we won't connect these elements in any way, since we don't yet have information about what tracks we'll receive from the RTSP server we're connecting to.

##### lib/pipeline.ex
```elixir
@impl true
def handle_init(_context, options) do
  spec = [
    child(:source, %Membrane.RTSP.Source{
      transport: {:udp, options.port, options.port + 5},
      allowed_media_types: [:video, :audio],
      stream_uri: options.stream_url,
      on_connection_closed: :send_eos
    }),
    child(:hls, %Membrane.HTTPAdaptiveStream.SinkBin{
      target_window_duration: Membrane.Time.seconds(120),
      manifest_module: Membrane.HTTPAdaptiveStream.HLS,
      storage: %Membrane.HTTPAdaptiveStream.Storages.FileStorage{
        directory: options.output_path
      }
    })
  ]

  {[spec: spec], %{parent_pid: options.parent_pid}}
end
```

Once we receive the `{:set_up_tracks, tracks}` notification from the source, we know which tracks have been set up during connection establishment and what we should expect. First we filter these tracks so that we keep at most one video track and one audio track. Then we create specs that connect the output pads of the source with the input pads of the sink appropriately - audio to audio and video to video.

##### lib/pipeline.ex
```elixir
@impl true
def handle_child_notification({:set_up_tracks, tracks}, :source, _ctx, state) do
  track_specs =
    Enum.uniq_by(tracks, & &1.type)
    |> Enum.filter(&(&1.type in [:audio, :video]))
    |> Enum.map(fn track ->
      encoding =
        case track do
          %{type: :audio} -> :AAC
          %{type: :video} -> :H264
        end

      get_child(:source)
      |> via_out(Pad.ref(:output, track.control_path))
      |> via_in(:input,
        options: [encoding: encoding, segment_duration: Membrane.Time.seconds(4)]
      )
      |> get_child(:hls)
    end)

  {[spec: track_specs], state}
end
```

By doing this we are prepared to receive the streams when a `PLAY` request is eventually sent by the source and the server starts streaming.
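The track-selection step above is language-agnostic: deduplicate tracks by type, keep only audio and video, and map each type to the encoding the sink expects. For illustration only (not part of the demo's Elixir code), the same idea can be sketched in Python, with an assumed track shape of a dict holding `type` and `control_path`:

```python
# Mirrors the Elixir filtering: at most one track per type, only audio/video
# kept, each tagged with the encoding the HLS sink expects for that type.
def select_tracks(tracks):
    seen = set()
    selected = []
    for track in tracks:
        kind = track["type"]
        if kind in ("audio", "video") and kind not in seen:
            seen.add(kind)
            encoding = "AAC" if kind == "audio" else "H264"
            selected.append({**track, "encoding": encoding})
    return selected

tracks = [
    {"type": "video", "control_path": "trackID=0"},
    {"type": "audio", "control_path": "trackID=1"},
    {"type": "audio", "control_path": "trackID=2"},  # duplicate type, dropped
]
print(select_tracks(tracks))  # two tracks: one H264 video, one AAC audio
```

As in the Elixir version, a second track of an already-seen type is silently dropped, so the sink never receives two streams on the same kind of pad.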