WebKit Bugzilla
Attachment (patch, obsolete): bug-162874-20161016183159.patch (text/plain), 47.98 KB
Created by Enrique Ocaña on 2016-10-16 11:35:31 PDT
Description: Patch
>Subversion Revision: 207345
>diff --git a/Source/WebCore/ChangeLog b/Source/WebCore/ChangeLog
>index a09daa00f8fa060050ca1bea12b41722252c63f0..10d826f921fa54b42c0d8a55979702c4df8faad6 100644
>--- a/Source/WebCore/ChangeLog
>+++ b/Source/WebCore/ChangeLog
>@@ -1,5 +1,68 @@
> 2016-10-03  Enrique Ocaña González  <eocanha@igalia.com>
> 
>+        [GStreamer][MSE][EME] Append Pipeline
>+        https://bugs.webkit.org/show_bug.cgi?id=162874
>+
>+        Reviewed by NOBODY (OOPS!).
>+
>+        This patch is co-authored with Xabier Rodriguez-Calvar <calvaris@igalia.com> (data starve
>+        and last sample detection, debug features) and Philippe Normand <philn@igalia.com> (EME
>+        support).
>+
>+        * platform/graphics/gstreamer/mse/AppendPipeline.cpp: Added.
>+        (WebCore::dumpAppendState):
>+        (WebCore::appendPipelineElementMessageCallback):
>+        (WebCore::appendPipelineApplicationMessageCallback):
>+        (WebCore::AppendPipeline::AppendPipeline):
>+        (WebCore::AppendPipeline::~AppendPipeline):
>+        (WebCore::AppendPipeline::clearPlayerPrivate):
>+        (WebCore::AppendPipeline::handleElementMessage):
>+        (WebCore::AppendPipeline::handleApplicationMessage):
>+        (WebCore::AppendPipeline::handleAppsrcNeedDataReceived):
>+        (WebCore::AppendPipeline::handleAppsrcAtLeastABufferLeft):
>+        (WebCore::AppendPipeline::id):
>+        (WebCore::AppendPipeline::setAppendState):
>+        (WebCore::AppendPipeline::parseDemuxerSrcPadCaps):
>+        (WebCore::AppendPipeline::appsinkCapsChanged):
>+        (WebCore::AppendPipeline::checkEndOfAppend):
>+        (WebCore::AppendPipeline::appsinkNewSample):
>+        (WebCore::AppendPipeline::appsinkEOS):
>+        (WebCore::AppendPipeline::didReceiveInitializationSegment):
>+        (WebCore::AppendPipeline::trackId):
>+        (WebCore::AppendPipeline::resetPipeline):
>+        (WebCore::AppendPipeline::setAppsrcDataLeavingProbe):
>+        (WebCore::AppendPipeline::removeAppsrcDataLeavingProbe):
>+        (WebCore::AppendPipeline::abort):
>+        (WebCore::AppendPipeline::pushNewBuffer):
>+        (WebCore::AppendPipeline::reportAppsrcAtLeastABufferLeft):
>+        (WebCore::AppendPipeline::reportAppsrcNeedDataReceived):
>+        (WebCore::AppendPipeline::handleNewAppsinkSample):
>+        (WebCore::AppendPipeline::connectDemuxerSrcPadToAppsinkFromAnyThread):
>+        (WebCore::AppendPipeline::connectDemuxerSrcPadToAppsink):
>+        (WebCore::AppendPipeline::disconnectDemuxerSrcPadFromAppsinkFromAnyThread):
>+        (WebCore::appendPipelineAppsinkCapsChanged):
>+        (WebCore::appendPipelineAppsrcDataLeaving):
>+        (WebCore::appendPipelinePadProbeDebugInformation):
>+        (WebCore::appendPipelineAppsrcNeedData):
>+        (WebCore::appendPipelineDemuxerPadAdded):
>+        (WebCore::appendPipelineDemuxerPadRemoved):
>+        (WebCore::appendPipelineAppsinkNewSample):
>+        (WebCore::appendPipelineAppsinkEOS):
>+        * platform/graphics/gstreamer/mse/AppendPipeline.h: Added.
>+        (WebCore::AppendPipeline::appendState):
>+        (WebCore::AppendPipeline::mediaSourceClient):
>+        (WebCore::AppendPipeline::sourceBufferPrivate):
>+        (WebCore::AppendPipeline::bus):
>+        (WebCore::AppendPipeline::pipeline):
>+        (WebCore::AppendPipeline::appsrc):
>+        (WebCore::AppendPipeline::appsink):
>+        (WebCore::AppendPipeline::demuxerSrcPadCaps):
>+        (WebCore::AppendPipeline::appsinkCaps):
>+        (WebCore::AppendPipeline::track):
>+        (WebCore::AppendPipeline::streamType):
>+
>+2016-10-03  Enrique Ocaña González  <eocanha@igalia.com>
>+
>         [GStreamer] Drain query support
>         https://bugs.webkit.org/show_bug.cgi?id=162872
> 
>diff --git a/Source/WebCore/platform/graphics/gstreamer/mse/AppendPipeline.cpp b/Source/WebCore/platform/graphics/gstreamer/mse/AppendPipeline.cpp
>new file mode 100644
>index 0000000000000000000000000000000000000000..290e82b947f41c267210eb23d067d5e024ca8a15
>--- /dev/null
>+++ b/Source/WebCore/platform/graphics/gstreamer/mse/AppendPipeline.cpp
>@@ -0,0 +1,1092 @@
>+/*
>+ * Copyright (C) 2016 Metrological Group B.V.
>+ * Copyright (C) 2016 Igalia S.L
>+ *
>+ * This library is free software; you can redistribute it and/or
>+ * modify it under the terms of the GNU Library General Public
>+ * License as published by the Free Software Foundation; either
>+ * version 2 of the License, or (at your option) any later version.
>+ *
>+ * This library is distributed in the hope that it will be useful,
>+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
>+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
>+ * Library General Public License for more details.
>+ *
>+ * You should have received a copy of the GNU Library General Public License
>+ * along with this library; see the file COPYING.LIB. If not, write to
>+ * the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
>+ * Boston, MA 02110-1301, USA.
>+ */
>+
>+#include "config.h"
>+#include "AppendPipeline.h"
>+
>+#if ENABLE(VIDEO) && USE(GSTREAMER) && ENABLE(MEDIA_SOURCE) && ENABLE(VIDEO_TRACK)
>+
>+#include "AudioTrackPrivateGStreamer.h"
>+#include "GRefPtrGStreamer.h"
>+#include "GStreamerMediaDescription.h"
>+#include "GStreamerMediaSample.h"
>+#include "GStreamerUtilities.h"
>+#include "InbandTextTrackPrivateGStreamer.h"
>+#include "MediaDescription.h"
>+#include "SourceBufferPrivateGStreamer.h"
>+#include "VideoTrackPrivateGStreamer.h"
>+
>+#include <gst/app/gstappsink.h>
>+#include <gst/app/gstappsrc.h>
>+#include <gst/gst.h>
>+#include <gst/pbutils/pbutils.h>
>+#include <gst/video/video.h>
>+#include <wtf/Condition.h>
>+
>+namespace WebCore {
>+
>+#if !LOG_DISABLED
>+struct PadProbeInformation {
>+    AppendPipeline* appendPipeline;
>+    const char* description;
>+    gulong probeId;
>+};
>+#endif
>+
>+static const char* dumpAppendState(AppendPipeline::AppendState appendState)
>+{
>+    switch (appendState) {
>+    case AppendPipeline::AppendState::Invalid:
>+        return "Invalid";
>+    case AppendPipeline::AppendState::NotStarted:
>+        return "NotStarted";
>+    case AppendPipeline::AppendState::Ongoing:
>+ return "Ongoing"; >+ case AppendPipeline::AppendState::KeyNegotiation: >+ return "KeyNegotiation"; >+ case AppendPipeline::AppendState::DataStarve: >+ return "DataStarve"; >+ case AppendPipeline::AppendState::Sampling: >+ return "Sampling"; >+ case AppendPipeline::AppendState::LastSample: >+ return "LastSample"; >+ case AppendPipeline::AppendState::Aborting: >+ return "Aborting"; >+ default: >+ return "(unknown)"; >+ } >+} >+ >+static void appendPipelineAppsrcNeedData(GstAppSrc*, guint, AppendPipeline*); >+static void appendPipelineDemuxerPadAdded(GstElement*, GstPad*, AppendPipeline*); >+static void appendPipelineDemuxerPadRemoved(GstElement*, GstPad*, AppendPipeline*); >+static void appendPipelineAppsinkCapsChanged(GObject*, GParamSpec*, AppendPipeline*); >+static GstPadProbeReturn appendPipelineAppsrcDataLeaving(GstPad*, GstPadProbeInfo*, AppendPipeline*); >+#if !LOG_DISABLED >+static GstPadProbeReturn appendPipelinePadProbeDebugInformation(GstPad*, GstPadProbeInfo*, struct PadProbeInformation*); >+#endif >+static GstFlowReturn appendPipelineAppsinkNewSample(GstElement*, AppendPipeline*); >+static void appendPipelineAppsinkEOS(GstElement*, AppendPipeline*); >+ >+static void appendPipelineElementMessageCallback(GstBus*, GstMessage* message, AppendPipeline* appendPipeline) >+{ >+ appendPipeline->handleElementMessage(message); >+} >+ >+static void appendPipelineApplicationMessageCallback(GstBus*, GstMessage* message, AppendPipeline* appendPipeline) >+{ >+ appendPipeline->handleApplicationMessage(message); >+} >+ >+AppendPipeline::AppendPipeline(PassRefPtr<MediaSourceClientGStreamerMSE> mediaSourceClient, PassRefPtr<SourceBufferPrivateGStreamer> sourceBufferPrivate, MediaPlayerPrivateGStreamerMSE* playerPrivate) >+ : m_mediaSourceClient(mediaSourceClient) >+ , m_sourceBufferPrivate(sourceBufferPrivate) >+ , m_playerPrivate(playerPrivate) >+ , m_id(0) >+ , m_appsrcAtLeastABufferLeft(false) >+ , m_appsrcNeedDataReceived(false) >+ , m_appsrcDataLeavingProbeId(0) >+ 
, m_appendState(NotStarted) >+ , m_abortPending(false) >+ , m_streamType(Unknown) >+{ >+ ASSERT(WTF::isMainThread()); >+ >+ GST_TRACE("Creating AppendPipeline (%p)", this); >+ >+ // FIXME: give a name to the pipeline, maybe related with the track it's managing. >+ // The track name is still unknown at this time, though. >+ m_pipeline = adoptGRef(gst_pipeline_new(nullptr)); >+ >+ m_bus = adoptGRef(gst_pipeline_get_bus(GST_PIPELINE(m_pipeline.get()))); >+ gst_bus_add_signal_watch(m_bus.get()); >+ gst_bus_enable_sync_message_emission(m_bus.get()); >+ >+ g_signal_connect(m_bus.get(), "sync-message::element", G_CALLBACK(appendPipelineElementMessageCallback), this); >+ g_signal_connect(m_bus.get(), "message::application", G_CALLBACK(appendPipelineApplicationMessageCallback), this); >+ >+ // We assign the created instances here instead of adoptRef() because gst_bin_add_many() >+ // below will already take the initial reference and we need an additional one for us. >+ m_appsrc = gst_element_factory_make("appsrc", nullptr); >+ m_demux = gst_element_factory_make("qtdemux", nullptr); >+ m_appsink = gst_element_factory_make("appsink", nullptr); >+ >+ g_object_set(G_OBJECT(m_demux.get()), "always-honor-tfdt", TRUE, nullptr); >+ gst_app_sink_set_emit_signals(GST_APP_SINK(m_appsink.get()), TRUE); >+ gst_base_sink_set_sync(GST_BASE_SINK(m_appsink.get()), FALSE); >+ >+ GRefPtr<GstPad> appsinkPad = adoptGRef(gst_element_get_static_pad(m_appsink.get(), "sink")); >+ g_signal_connect(appsinkPad.get(), "notify::caps", G_CALLBACK(appendPipelineAppsinkCapsChanged), this); >+ >+ setAppsrcDataLeavingProbe(); >+ >+#if !LOG_DISABLED >+ GRefPtr<GstPad> demuxerPad = adoptGRef(gst_element_get_static_pad(m_demux.get(), "sink")); >+ m_demuxerDataEnteringPadProbeInformation.appendPipeline = this; >+ m_demuxerDataEnteringPadProbeInformation.description = "demuxer data entering"; >+ m_demuxerDataEnteringPadProbeInformation.probeId = gst_pad_add_probe(demuxerPad.get(), GST_PAD_PROBE_TYPE_BUFFER, 
reinterpret_cast<GstPadProbeCallback>(appendPipelinePadProbeDebugInformation), &m_demuxerDataEnteringPadProbeInformation, nullptr); >+ m_appsinkDataEnteringPadProbeInformation.appendPipeline = this; >+ m_appsinkDataEnteringPadProbeInformation.description = "appsink data entering"; >+ m_appsinkDataEnteringPadProbeInformation.probeId = gst_pad_add_probe(appsinkPad.get(), GST_PAD_PROBE_TYPE_BUFFER, reinterpret_cast<GstPadProbeCallback>(appendPipelinePadProbeDebugInformation), &m_appsinkDataEnteringPadProbeInformation, nullptr); >+#endif >+ >+ // These signals won't be connected outside of the lifetime of "this". >+ g_signal_connect(m_appsrc.get(), "need-data", G_CALLBACK(appendPipelineAppsrcNeedData), this); >+ g_signal_connect(m_demux.get(), "pad-added", G_CALLBACK(appendPipelineDemuxerPadAdded), this); >+ g_signal_connect(m_demux.get(), "pad-removed", G_CALLBACK(appendPipelineDemuxerPadRemoved), this); >+ g_signal_connect(m_appsink.get(), "new-sample", G_CALLBACK(appendPipelineAppsinkNewSample), this); >+ g_signal_connect(m_appsink.get(), "eos", G_CALLBACK(appendPipelineAppsinkEOS), this); >+ >+ // Add_many will take ownership of a reference. That's why we used an assignment before. >+ gst_bin_add_many(GST_BIN(m_pipeline.get()), m_appsrc.get(), m_demux.get(), nullptr); >+ gst_element_link(m_appsrc.get(), m_demux.get()); >+ >+ gst_element_set_state(m_pipeline.get(), GST_STATE_READY); >+}; >+ >+AppendPipeline::~AppendPipeline() >+{ >+ ASSERT(WTF::isMainThread()); >+ >+ m_newSampleLock.lock(); >+ setAppendState(Invalid); >+ m_newSampleCondition.notifyOne(); >+ m_newSampleLock.unlock(); >+ >+ m_padAddRemoveLock.lock(); >+ m_playerPrivate = nullptr; >+ m_padAddRemoveCondition.notifyOne(); >+ m_padAddRemoveLock.unlock(); >+ >+ GST_TRACE("Destroying AppendPipeline (%p)", this); >+ >+ // FIXME: Maybe notify appendComplete here? 
>+ >+ if (m_pipeline) { >+ ASSERT(m_bus); >+ gst_bus_remove_signal_watch(m_bus.get()); >+ gst_element_set_state(m_pipeline.get(), GST_STATE_NULL); >+ m_pipeline = nullptr; >+ } >+ >+ if (m_appsrc) { >+ removeAppsrcDataLeavingProbe(); >+ g_signal_handlers_disconnect_by_data(m_appsrc.get(), this); >+ m_appsrc = nullptr; >+ } >+ >+ if (m_demux) { >+#if !LOG_DISABLED >+ GRefPtr<GstPad> demuxerPad = adoptGRef(gst_element_get_static_pad(m_demux.get(), "sink")); >+ gst_pad_remove_probe(demuxerPad.get(), m_demuxerDataEnteringPadProbeInformation.probeId); >+#endif >+ >+ g_signal_handlers_disconnect_by_data(m_demux.get(), this); >+ m_demux = nullptr; >+ } >+ >+ if (m_appsink) { >+ GRefPtr<GstPad> appsinkPad = adoptGRef(gst_element_get_static_pad(m_appsink.get(), "sink")); >+ g_signal_handlers_disconnect_by_data(appsinkPad.get(), this); >+ g_signal_handlers_disconnect_by_data(m_appsink.get(), this); >+ >+#if !LOG_DISABLED >+ gst_pad_remove_probe(appsinkPad.get(), m_appsinkDataEnteringPadProbeInformation.probeId); >+#endif >+ >+ m_appsink = nullptr; >+ } >+ >+ if (m_appsinkCaps) >+ m_appsinkCaps = nullptr; >+ >+ if (m_demuxerSrcPadCaps) >+ m_demuxerSrcPadCaps = nullptr; >+}; >+ >+void AppendPipeline::clearPlayerPrivate() >+{ >+ ASSERT(WTF::isMainThread()); >+ GST_DEBUG("cleaning private player"); >+ >+ m_newSampleLock.lock(); >+ // Make sure that AppendPipeline won't process more data from now on and >+ // instruct handleNewSample to abort itself from now on as well. >+ setAppendState(Invalid); >+ >+ // Awake any pending handleNewSample operation in the streaming thread. >+ m_newSampleCondition.notifyOne(); >+ m_newSampleLock.unlock(); >+ >+ m_padAddRemoveLock.lock(); >+ m_playerPrivate = nullptr; >+ m_padAddRemoveCondition.notifyOne(); >+ m_padAddRemoveLock.unlock(); >+ >+ // And now that no handleNewSample operations will remain stalled waiting >+ // for the main thread, stop the pipeline. 
>+ if (m_pipeline) >+ gst_element_set_state(m_pipeline.get(), GST_STATE_NULL); >+} >+ >+void AppendPipeline::handleElementMessage(GstMessage* message) >+{ >+ ASSERT(WTF::isMainThread()); >+ >+ const GstStructure* structure = gst_message_get_structure(message); >+ if (gst_structure_has_name(structure, "drm-key-needed")) >+ setAppendState(AppendPipeline::AppendState::KeyNegotiation); >+ >+ // MediaPlayerPrivateGStreamerBase will take care of setting up encryption. >+ if (m_playerPrivate) >+ m_playerPrivate->handleSyncMessage(message); >+} >+ >+void AppendPipeline::handleApplicationMessage(GstMessage* message) >+{ >+ ASSERT(WTF::isMainThread()); >+ >+ const GstStructure* structure = gst_message_get_structure(message); >+ >+ if (gst_structure_has_name(structure, "appsrc-need-data")) { >+ handleAppsrcNeedDataReceived(); >+ return; >+ } >+ >+ if (gst_structure_has_name(structure, "appsrc-buffer-left")) { >+ handleAppsrcAtLeastABufferLeft(); >+ return; >+ } >+ >+ if (gst_structure_has_name(structure, "demuxer-connect-to-appsink")) { >+ GRefPtr<GstPad> demuxerSrcPad; >+ gst_structure_get(structure, "demuxer-src-pad", G_TYPE_OBJECT, &demuxerSrcPad.outPtr(), nullptr); >+ ASSERT(demuxerSrcPad); >+ connectDemuxerSrcPadToAppsink(demuxerSrcPad.get()); >+ return; >+ } >+ >+ if (gst_structure_has_name(structure, "appsink-caps-changed")) { >+ appsinkCapsChanged(); >+ return; >+ } >+ >+ if (gst_structure_has_name(structure, "appsink-new-sample")) { >+ GRefPtr<GstSample> newSample; >+ gst_structure_get(structure, "new-sample", GST_TYPE_SAMPLE, &newSample.outPtr(), nullptr); >+ >+ appsinkNewSample(newSample.get()); >+ return; >+ } >+ >+ if (gst_structure_has_name(structure, "appsink-eos")) { >+ appsinkEOS(); >+ return; >+ } >+ >+ ASSERT_NOT_REACHED(); >+} >+ >+void AppendPipeline::handleAppsrcNeedDataReceived() >+{ >+ if (!m_appsrcAtLeastABufferLeft) { >+ GST_TRACE("discarding until at least a buffer leaves appsrc"); >+ return; >+ } >+ >+ ASSERT(m_appendState == Ongoing || 
m_appendState == Sampling); >+ ASSERT(!m_appsrcNeedDataReceived); >+ >+ GST_TRACE("received need-data from appsrc"); >+ >+ m_appsrcNeedDataReceived = true; >+ checkEndOfAppend(); >+} >+ >+void AppendPipeline::handleAppsrcAtLeastABufferLeft() >+{ >+ m_appsrcAtLeastABufferLeft = true; >+ GST_TRACE("received buffer-left from appsrc"); >+#if LOG_DISABLED >+ removeAppsrcDataLeavingProbe(); >+#endif >+} >+ >+gint AppendPipeline::id() >+{ >+ ASSERT(WTF::isMainThread()); >+ >+ static gint totalAudio = 0; >+ static gint totalVideo = 0; >+ static gint totalText = 0; >+ >+ if (m_id) >+ return m_id; >+ >+ switch (m_streamType) { >+ case Audio: >+ m_id = ++totalAudio; >+ break; >+ case Video: >+ m_id = ++totalVideo; >+ break; >+ case Text: >+ m_id = ++totalText; >+ break; >+ case Unknown: >+ case Invalid: >+ GST_ERROR("Trying to get id for a pipeline of Unknown/Invalid type"); >+ ASSERT_NOT_REACHED(); >+ break; >+ } >+ >+ GST_DEBUG("streamType=%d, id=%d", static_cast<int>(m_streamType), m_id); >+ >+ return m_id; >+} >+ >+void AppendPipeline::setAppendState(AppendState newAppendState) >+{ >+ ASSERT(WTF::isMainThread()); >+ // Valid transitions: >+ // NotStarted-->Ongoing-->DataStarve-->NotStarted >+ // | | `->Aborting-->NotStarted >+ // | `->Sampling-···->Sampling-->LastSample-->NotStarted >+ // | | `->Aborting-->NotStarted >+ // | `->KeyNegotiation-->Ongoing-->[...] 
>+ // `->Aborting-->NotStarted >+ AppendState oldAppendState = m_appendState; >+ AppendState nextAppendState = Invalid; >+ >+ if (oldAppendState != newAppendState) >+ GST_TRACE("%s --> %s", dumpAppendState(oldAppendState), dumpAppendState(newAppendState)); >+ >+ bool ok = false; >+ >+ switch (oldAppendState) { >+ case NotStarted: >+ switch (newAppendState) { >+ case Ongoing: >+ ok = true; >+ gst_element_set_state(m_pipeline.get(), GST_STATE_PLAYING); >+ break; >+ case NotStarted: >+ ok = true; >+ if (m_pendingBuffer) { >+ GST_TRACE("pushing pending buffer %p", m_pendingBuffer.get()); >+ gst_app_src_push_buffer(GST_APP_SRC(appsrc()), m_pendingBuffer.leakRef()); >+ nextAppendState = Ongoing; >+ } >+ break; >+ case Aborting: >+ ok = true; >+ nextAppendState = NotStarted; >+ break; >+ case Invalid: >+ ok = true; >+ break; >+ default: >+ break; >+ } >+ break; >+ case KeyNegotiation: >+ switch (newAppendState) { >+ case Ongoing: >+ case Invalid: >+ ok = true; >+ break; >+ default: >+ break; >+ } >+ break; >+ case Ongoing: >+ switch (newAppendState) { >+ case KeyNegotiation: >+ case Sampling: >+ case Invalid: >+ ok = true; >+ break; >+ case DataStarve: >+ ok = true; >+ GST_DEBUG("received all pending samples"); >+ m_sourceBufferPrivate->didReceiveAllPendingSamples(); >+ if (m_abortPending) >+ nextAppendState = Aborting; >+ else >+ nextAppendState = NotStarted; >+ break; >+ default: >+ break; >+ } >+ break; >+ case DataStarve: >+ switch (newAppendState) { >+ case NotStarted: >+ case Invalid: >+ ok = true; >+ break; >+ case Aborting: >+ ok = true; >+ nextAppendState = NotStarted; >+ break; >+ default: >+ break; >+ } >+ break; >+ case Sampling: >+ switch (newAppendState) { >+ case Sampling: >+ case Invalid: >+ ok = true; >+ break; >+ case LastSample: >+ ok = true; >+ GST_DEBUG("received all pending samples"); >+ m_sourceBufferPrivate->didReceiveAllPendingSamples(); >+ if (m_abortPending) >+ nextAppendState = Aborting; >+ else >+ nextAppendState = NotStarted; >+ break; >+ 
>+        default:
>+            break;
>+        }
>+        break;
>+    case LastSample:
>+        switch (newAppendState) {
>+        case NotStarted:
>+        case Invalid:
>+            ok = true;
>+            break;
>+        case Aborting:
>+            ok = true;
>+            nextAppendState = NotStarted;
>+            break;
>+        default:
>+            break;
>+        }
>+        break;
>+    case Aborting:
>+        switch (newAppendState) {
>+        case NotStarted:
>+            ok = true;
>+            resetPipeline();
>+            m_abortPending = false;
>+            nextAppendState = NotStarted;
>+            break;
>+        case Invalid:
>+            ok = true;
>+            break;
>+        default:
>+            break;
>+        }
>+        break;
>+    case Invalid:
>+        ok = true;
>+        break;
>+    }
>+
>+    if (ok)
>+        m_appendState = newAppendState;
>+    else
>+        GST_ERROR("Invalid append state transition %s --> %s", dumpAppendState(oldAppendState), dumpAppendState(newAppendState));
>+
>+    ASSERT(ok);
>+
>+    if (nextAppendState != Invalid)
>+        setAppendState(nextAppendState);
>+}
>+
>+void AppendPipeline::parseDemuxerSrcPadCaps(GstCaps* demuxerSrcPadCaps)
>+{
>+    ASSERT(WTF::isMainThread());
>+
>+    m_demuxerSrcPadCaps = adoptGRef(demuxerSrcPadCaps);
>+    m_streamType = WebCore::MediaSourceStreamTypeGStreamer::Unknown;
>+
>+    GstStructure* structure = gst_caps_get_structure(m_demuxerSrcPadCaps.get(), 0);
>+    bool sizeConfigured = false;
>+
>+#if GST_CHECK_VERSION(1, 5, 3) && ENABLE(LEGACY_ENCRYPTED_MEDIA)
>+    if (gst_structure_has_name(structure, "application/x-cenc")) {
>+        // Any previous decryptor should have been removed from the pipeline by disconnectFromAppSinkFromStreamingThread().
>+        ASSERT(!m_decryptor);
>+
>+        m_decryptor = adoptGRef(WebCore::createGstDecryptor(gst_structure_get_string(structure, "protection-system")));
>+        if (!m_decryptor) {
>+            GST_ERROR("decryptor not found for caps: %" GST_PTR_FORMAT, m_demuxerSrcPadCaps.get());
>+            return;
>+        }
>+
>+        const gchar* originalMediaType = gst_structure_get_string(structure, "original-media-type");
>+
>+        if (g_str_has_prefix(originalMediaType, "video/")) {
>+            int width = 0;
>+            int height = 0;
>+            float finalHeight = 0;
>+
>+            gst_structure_get_int(structure, "width", &width);
>+            if (gst_structure_get_int(structure, "height", &height)) {
>+                gint ratioNumerator = 1;
>+                gint ratioDenominator = 1;
>+
>+                gst_structure_get_fraction(structure, "pixel-aspect-ratio", &ratioNumerator, &ratioDenominator);
>+                finalHeight = height * ((float) ratioDenominator / (float) ratioNumerator);
>+            }
>+
>+            m_presentationSize = WebCore::FloatSize(width, finalHeight);
>+            m_streamType = WebCore::MediaSourceStreamTypeGStreamer::Video;
>+        } else {
>+            m_presentationSize = WebCore::FloatSize();
>+            if (g_str_has_prefix(originalMediaType, "audio/"))
>+                m_streamType = WebCore::MediaSourceStreamTypeGStreamer::Audio;
>+            else if (g_str_has_prefix(originalMediaType, "text/"))
>+                m_streamType = WebCore::MediaSourceStreamTypeGStreamer::Text;
>+        }
>+        sizeConfigured = true;
>+    }
>+#endif
>+
>+    if (!sizeConfigured) {
>+        const gchar* structureName = gst_structure_get_name(structure);
>+        GstVideoInfo info;
>+
>+        if (g_str_has_prefix(structureName, "video/") && gst_video_info_from_caps(&info, demuxerSrcPadCaps)) {
>+            float width, height;
>+
>+            width = info.width;
>+            height = info.height * ((float) info.par_d / (float) info.par_n);
>+
>+            m_presentationSize = WebCore::FloatSize(width, height);
>+            m_streamType = WebCore::MediaSourceStreamTypeGStreamer::Video;
>+        } else {
>+            m_presentationSize = WebCore::FloatSize();
>+            if (g_str_has_prefix(structureName, "audio/"))
>+                m_streamType = WebCore::MediaSourceStreamTypeGStreamer::Audio;
>+            else if (g_str_has_prefix(structureName, "text/"))
>+                m_streamType = WebCore::MediaSourceStreamTypeGStreamer::Text;
>+        }
>+    }
>+}
>+
>+void AppendPipeline::appsinkCapsChanged()
>+{
>+    ASSERT(WTF::isMainThread());
>+
>+    if (!m_appsink)
>+        return;
>+
>+    GRefPtr<GstPad> pad = adoptGRef(gst_element_get_static_pad(m_appsink.get(), "sink"));
>+    GRefPtr<GstCaps> caps = adoptGRef(gst_pad_get_current_caps(pad.get()));
>+
>+    if (!caps)
>+        return;
>+
>+    // This means that we're right after a new track has appeared. Otherwise, it's a caps change inside the same track.
>+    bool previousCapsWereNull = !m_appsinkCaps;
>+
>+    if (gst_caps_replace(&m_appsinkCaps.outPtr(), caps.get())) {
>+        if (m_playerPrivate && previousCapsWereNull)
>+            m_playerPrivate->trackDetected(this, m_oldTrack, m_track);
>+        didReceiveInitializationSegment();
>+        gst_element_set_state(m_pipeline.get(), GST_STATE_PLAYING);
>+    }
>+}
>+
>+void AppendPipeline::checkEndOfAppend()
>+{
>+    ASSERT(WTF::isMainThread());
>+
>+    if (!m_appsrcNeedDataReceived || (m_appendState != Ongoing && m_appendState != Sampling))
>+        return;
>+
>+    GST_TRACE("end of append data mark was received");
>+
>+    switch (m_appendState) {
>+    case Ongoing:
>+        GST_TRACE("DataStarve");
>+        m_appsrcNeedDataReceived = false;
>+        setAppendState(DataStarve);
>+        break;
>+    case Sampling:
>+        GST_TRACE("LastSample");
>+        m_appsrcNeedDataReceived = false;
>+        setAppendState(LastSample);
>+        break;
>+    default:
>+        ASSERT_NOT_REACHED();
>+        break;
>+    }
>+}
>+
>+void AppendPipeline::appsinkNewSample(GstSample* sample)
>+{
>+    ASSERT(WTF::isMainThread());
>+
>+    m_newSampleLock.lock();
>+
>+    // Ignore samples if we're not expecting them. Refuse processing if we're in Invalid state.
>+    if (m_appendState != Ongoing && m_appendState != Sampling) {
>+        GST_WARNING("Unexpected sample, appendState=%s", dumpAppendState(m_appendState));
>+        // FIXME: Return ERROR and find a more robust way to detect that all the
>+        // data has been processed, so we don't need to resort to these hacks.
>+        // All in all, return OK, even if it's not the proper thing to do. We don't want to break the demuxer.
>+        m_flowReturn = GST_FLOW_OK;
>+        m_newSampleCondition.notifyOne();
>+        m_newSampleLock.unlock();
>+        return;
>+    }
>+
>+    RefPtr<GStreamerMediaSample> mediaSample = WebCore::GStreamerMediaSample::create(sample, m_presentationSize, trackId());
>+
>+    GST_TRACE("append: trackId=%s PTS=%f presentationSize=%.0fx%.0f", mediaSample->trackID().string().utf8().data(), mediaSample->presentationTime().toFloat(), mediaSample->presentationSize().width(), mediaSample->presentationSize().height());
>+
>+    // If we're beyond the duration, ignore this sample and the remaining ones.
>+    MediaTime duration = m_mediaSourceClient->duration();
>+    if (duration.isValid() && !duration.indefiniteTime() && mediaSample->presentationTime() > duration) {
>+        GST_DEBUG("Detected sample (%f) beyond the duration (%f), declaring LastSample", mediaSample->presentationTime().toFloat(), duration.toFloat());
>+        setAppendState(LastSample);
>+        m_flowReturn = GST_FLOW_OK;
>+        m_newSampleCondition.notifyOne();
>+        m_newSampleLock.unlock();
>+        return;
>+    }
>+
>+    // Add a gap sample if a gap is detected before the first sample.
>+    if (mediaSample->decodeTime() == MediaTime::zeroTime()
>+        && mediaSample->presentationTime() > MediaTime::zeroTime()
>+        && mediaSample->presentationTime() <= MediaTime::createWithDouble(0.1)) {
>+        GST_DEBUG("Adding gap offset");
>+        mediaSample->applyPtsOffset(MediaTime::zeroTime());
>+    }
>+
>+    m_sourceBufferPrivate->didReceiveSample(mediaSample);
>+    setAppendState(Sampling);
>+    m_flowReturn = GST_FLOW_OK;
>+    m_newSampleCondition.notifyOne();
>+    m_newSampleLock.unlock();
>+
>+    checkEndOfAppend();
>+}
>+
>+void AppendPipeline::appsinkEOS()
>+{
>+    ASSERT(WTF::isMainThread());
>+
>+    switch (m_appendState) {
>+    case Aborting:
>+        // Ignored. Operation completion will be managed by the Aborting->NotStarted transition.
>+        return;
>+    case Ongoing:
>+        // Finish Ongoing and Sampling states.
>+        setAppendState(DataStarve);
>+        break;
>+    case Sampling:
>+        setAppendState(LastSample);
>+        break;
>+    default:
>+        GST_DEBUG("Unexpected EOS");
>+        break;
>+    }
>+}
>+
>+void AppendPipeline::didReceiveInitializationSegment()
>+{
>+    ASSERT(WTF::isMainThread());
>+
>+    WebCore::SourceBufferPrivateClient::InitializationSegment initializationSegment;
>+
>+    GST_DEBUG("Notifying SourceBuffer for track %s", m_track->id().string().utf8().data());
>+    initializationSegment.duration = m_mediaSourceClient->duration();
>+    switch (m_streamType) {
>+    case Audio: {
>+        WebCore::SourceBufferPrivateClient::InitializationSegment::AudioTrackInformation info;
>+        info.track = static_cast<AudioTrackPrivateGStreamer*>(m_track.get());
>+        info.description = WebCore::GStreamerMediaDescription::create(m_demuxerSrcPadCaps.get());
>+        initializationSegment.audioTracks.append(info);
>+        break;
>+    }
>+    case Video: {
>+        WebCore::SourceBufferPrivateClient::InitializationSegment::VideoTrackInformation info;
>+        info.track = static_cast<VideoTrackPrivateGStreamer*>(m_track.get());
>+        info.description = WebCore::GStreamerMediaDescription::create(m_demuxerSrcPadCaps.get());
>+        initializationSegment.videoTracks.append(info);
>+        break;
>+    }
>+    default:
>+        GST_ERROR("Unsupported or unknown stream type");
>+        ASSERT_NOT_REACHED();
>+        break;
>+    }
>+
>+    m_sourceBufferPrivate->didReceiveInitializationSegment(initializationSegment);
>+}
>+
>+AtomicString AppendPipeline::trackId()
>+{
>+    ASSERT(WTF::isMainThread());
>+
>+    if (!m_track)
>+        return AtomicString();
>+
>+    return m_track->id();
>+}
>+
>+void AppendPipeline::resetPipeline()
>+{
>+    ASSERT(WTF::isMainThread());
>+    GST_DEBUG("resetting pipeline");
>+    m_appsrcAtLeastABufferLeft = false;
>+    setAppsrcDataLeavingProbe();
>+
>+    m_newSampleLock.lock();
>+    m_newSampleCondition.notifyOne();
>+    gst_element_set_state(m_pipeline.get(), GST_STATE_READY);
>+    gst_element_get_state(m_pipeline.get(), nullptr, nullptr, 0);
>+    m_newSampleLock.unlock();
>+
>+#if (!(LOG_DISABLED || GST_DISABLE_GST_DEBUG))
>+    {
>+        // This is here for debugging purposes. It does not make sense to have it as a class member.
>+        static unsigned i = 0;
>+        WTF::String dotFileName = String::format("reset-pipeline-%d", ++i);
>+        gst_debug_bin_to_dot_file(GST_BIN(m_pipeline.get()), GST_DEBUG_GRAPH_SHOW_ALL, dotFileName.utf8().data());
>+    }
>+#endif
>+}
>+
>+void AppendPipeline::setAppsrcDataLeavingProbe()
>+{
>+    if (m_appsrcDataLeavingProbeId)
>+        return;
>+
>+    GST_TRACE("setting appsrc data leaving probe");
>+
>+    GRefPtr<GstPad> appsrcPad = adoptGRef(gst_element_get_static_pad(m_appsrc.get(), "src"));
>+    m_appsrcDataLeavingProbeId = gst_pad_add_probe(appsrcPad.get(), GST_PAD_PROBE_TYPE_BUFFER, reinterpret_cast<GstPadProbeCallback>(appendPipelineAppsrcDataLeaving), this, nullptr);
>+}
>+
>+void AppendPipeline::removeAppsrcDataLeavingProbe()
>+{
>+    if (!m_appsrcDataLeavingProbeId)
>+        return;
>+
>+    GST_TRACE("removing appsrc data leaving probe");
>+
>+    GRefPtr<GstPad> appsrcPad = adoptGRef(gst_element_get_static_pad(m_appsrc.get(), "src"));
>+    gst_pad_remove_probe(appsrcPad.get(), m_appsrcDataLeavingProbeId);
>+    m_appsrcDataLeavingProbeId = 0;
>+}
>+
>+void AppendPipeline::abort()
>+{
>+    ASSERT(WTF::isMainThread());
>+    GST_DEBUG("aborting");
>+
>+    m_pendingBuffer.clear();
>+
>+    // Abort already ongoing.
>+    if (m_abortPending)
>+        return;
>+
>+    m_abortPending = true;
>+    if (m_appendState == NotStarted)
>+        setAppendState(Aborting);
>+    // Else, the automatic state transitions will take care when the ongoing append finishes.
>+} >+ >+GstFlowReturn AppendPipeline::pushNewBuffer(GstBuffer* buffer) >+{ >+ GstFlowReturn result; >+ >+ if (m_abortPending) { >+ m_pendingBuffer = adoptGRef(buffer); >+ result = GST_FLOW_OK; >+ } else { >+ setAppendState(AppendPipeline::Ongoing); >+ GST_TRACE("pushing new buffer %p", buffer); >+ result = gst_app_src_push_buffer(GST_APP_SRC(appsrc()), buffer); >+ } >+ >+ return result; >+} >+ >+void AppendPipeline::reportAppsrcAtLeastABufferLeft() >+{ >+ GST_TRACE("buffer left appsrc, reposting to bus"); >+ GstStructure* structure = gst_structure_new_empty("appsrc-buffer-left"); >+ GstMessage* message = gst_message_new_application(GST_OBJECT(m_appsrc.get()), structure); >+ gst_bus_post(m_bus.get(), message); >+} >+ >+void AppendPipeline::reportAppsrcNeedDataReceived() >+{ >+ GST_TRACE("received need-data signal at appsrc, reposting to bus"); >+ GstStructure* structure = gst_structure_new_empty("appsrc-need-data"); >+ GstMessage* message = gst_message_new_application(GST_OBJECT(m_appsrc.get()), structure); >+ gst_bus_post(m_bus.get(), message); >+} >+ >+GstFlowReturn AppendPipeline::handleNewAppsinkSample(GstElement* appsink) >+{ >+ ASSERT(!WTF::isMainThread()); >+ >+ // Even if we're disabled, it's important to pull the sample out anyway to >+ // avoid deadlocks when changing to GST_STATE_NULL having a non empty appsink. 
>+ GRefPtr<GstSample> sample = adoptGRef(gst_app_sink_pull_sample(GST_APP_SINK(appsink))); >+ LockHolder locker(m_newSampleLock); >+ >+ if (!m_playerPrivate || m_appendState == Invalid) { >+ GST_WARNING("AppendPipeline has been disabled, ignoring this sample"); >+ return GST_FLOW_ERROR; >+ } >+ >+ if (!(!m_playerPrivate || m_appendState == Invalid)) { >+ GstStructure* structure = gst_structure_new("appsink-new-sample", "new-sample", GST_TYPE_SAMPLE, sample.get(), nullptr); >+ GstMessage* message = gst_message_new_application(GST_OBJECT(appsink), structure); >+ gst_bus_post(m_bus.get(), message); >+ GST_TRACE("appsink-new-sample message posted to bus"); >+ >+ m_newSampleCondition.wait(m_newSampleLock); >+ // We've been awaken because the sample was processed or because of >+ // an exceptional condition (entered in Invalid state, destructor, etc.). >+ // We can't reliably delete info here, appendPipelineAppsinkNewSampleMainThread will do it. >+ } >+ >+ return m_flowReturn; >+} >+ >+void AppendPipeline::connectDemuxerSrcPadToAppsinkFromAnyThread(GstPad* demuxerSrcPad) >+{ >+ if (!m_appsink) >+ return; >+ >+ GST_DEBUG("connecting to appsink"); >+ >+ GRefPtr<GstPad> sinkSinkPad = adoptGRef(gst_element_get_static_pad(m_appsink.get(), "sink")); >+ >+ // Only one stream per demuxer is supported. >+ ASSERT(!gst_pad_is_linked(sinkSinkPad.get())); >+ >+ gint64 timeLength = 0; >+ if (gst_element_query_duration(m_demux.get(), GST_FORMAT_TIME, &timeLength) >+ && static_cast<guint64>(timeLength) != GST_CLOCK_TIME_NONE) >+ m_initialDuration = MediaTime(GST_TIME_AS_USECONDS(timeLength), G_USEC_PER_SEC); >+ else >+ m_initialDuration = MediaTime::positiveInfiniteTime(); >+ >+ if (WTF::isMainThread()) >+ connectDemuxerSrcPadToAppsink(demuxerSrcPad); >+ else { >+ // Call connectDemuxerSrcPadToAppsink() in the main thread and wait. 
>+ LockHolder locker(m_padAddRemoveLock); >+ if (!m_playerPrivate) >+ return; >+ >+ GstStructure* structure = gst_structure_new("demuxer-connect-to-appsink", "demuxer-src-pad", G_TYPE_OBJECT, demuxerSrcPad, nullptr); >+ GstMessage* message = gst_message_new_application(GST_OBJECT(m_demux.get()), structure); >+ gst_bus_post(m_bus.get(), message); >+ GST_TRACE("demuxer-connect-to-appsink message posted to bus"); >+ >+ m_padAddRemoveCondition.wait(m_padAddRemoveLock); >+ } >+ >+ // Must be done in the thread we were called from (usually streaming thread). >+ bool isData = false; >+ >+ switch (m_streamType) { >+ case WebCore::MediaSourceStreamTypeGStreamer::Audio: >+ case WebCore::MediaSourceStreamTypeGStreamer::Video: >+ case WebCore::MediaSourceStreamTypeGStreamer::Text: >+ isData = true; >+ break; >+ default: >+ break; >+ } >+ >+ if (isData) { >+ // FIXME: Only add appsink one time. This method can be called several times. >+ GRefPtr<GstObject> parent = adoptGRef(gst_element_get_parent(m_appsink.get())); >+ if (!parent) >+ gst_bin_add(GST_BIN(m_pipeline.get()), m_appsink.get()); >+ >+#if ENABLE(LEGACY_ENCRYPTED_MEDIA) >+ if (m_decryptor) { >+ gst_object_ref(m_decryptor.get()); >+ gst_bin_add(GST_BIN(m_pipeline.get()), m_decryptor.get()); >+ >+ GRefPtr<GstPad> decryptorSinkPad = adoptGRef(gst_element_get_static_pad(m_decryptor.get(), "sink")); >+ gst_pad_link(demuxerSrcPad, decryptorSinkPad.get()); >+ >+ GRefPtr<GstPad> decryptorSrcPad = adoptGRef(gst_element_get_static_pad(m_decryptor.get(), "src")); >+ gst_pad_link(decryptorSrcPad.get(), sinkSinkPad.get()); >+ >+ gst_element_sync_state_with_parent(m_appsink.get()); >+ gst_element_sync_state_with_parent(m_decryptor.get()); >+ } else { >+#endif >+ gst_pad_link(demuxerSrcPad, sinkSinkPad.get()); >+ gst_element_sync_state_with_parent(m_appsink.get()); >+#if ENABLE(LEGACY_ENCRYPTED_MEDIA) >+ } >+#endif >+ gst_element_set_state(m_pipeline.get(), GST_STATE_PAUSED); >+ } >+} >+ >+void 
AppendPipeline::connectDemuxerSrcPadToAppsink(GstPad* demuxerSrcPad) >+{ >+ ASSERT(WTF::isMainThread()); >+ GST_DEBUG("Connecting to appsink"); >+ >+ LockHolder locker(m_padAddRemoveLock); >+ GRefPtr<GstPad> sinkSinkPad = adoptGRef(gst_element_get_static_pad(m_appsink.get(), "sink")); >+ >+ // Only one stream per demuxer is supported. >+ ASSERT(!gst_pad_is_linked(sinkSinkPad.get())); >+ >+ GRefPtr<GstCaps> caps = adoptGRef(gst_pad_get_current_caps(GST_PAD(demuxerSrcPad))); >+ >+ if (!caps || m_appendState == Invalid || !m_playerPrivate) { >+ m_padAddRemoveCondition.notifyOne(); >+ return; >+ } >+ >+#ifndef GST_DISABLE_GST_DEBUG >+ { >+ GUniquePtr<gchar> strcaps(gst_caps_to_string(caps.get())); >+ GST_DEBUG("%s", strcaps.get()); >+ } >+#endif >+ >+ if (m_initialDuration > m_mediaSourceClient->duration()) >+ m_mediaSourceClient->durationChanged(m_initialDuration); >+ >+ m_oldTrack = m_track; >+ >+ parseDemuxerSrcPadCaps(gst_caps_ref(caps.get())); >+ >+ switch (m_streamType) { >+ case WebCore::MediaSourceStreamTypeGStreamer::Audio: >+ if (m_playerPrivate) >+ m_track = WebCore::AudioTrackPrivateGStreamer::create(m_playerPrivate->pipeline(), id(), sinkSinkPad.get()); >+ break; >+ case WebCore::MediaSourceStreamTypeGStreamer::Video: >+ if (m_playerPrivate) >+ m_track = WebCore::VideoTrackPrivateGStreamer::create(m_playerPrivate->pipeline(), id(), sinkSinkPad.get()); >+ break; >+ case WebCore::MediaSourceStreamTypeGStreamer::Text: >+ m_track = WebCore::InbandTextTrackPrivateGStreamer::create(id(), sinkSinkPad.get()); >+ break; >+ default: >+ // No useful data, but notify anyway to complete the append operation. 
>+ GST_DEBUG("Received all pending samples (no data)"); >+ m_sourceBufferPrivate->didReceiveAllPendingSamples(); >+ break; >+ } >+ >+ m_padAddRemoveCondition.notifyOne(); >+} >+ >+void AppendPipeline::disconnectDemuxerSrcPadFromAppsinkFromAnyThread() >+{ >+ GST_DEBUG("Disconnecting appsink"); >+ >+ // Must be done in the thread we were called from (usually streaming thread). >+#if ENABLE(LEGACY_ENCRYPTED_MEDIA) >+ if (m_decryptor) { >+ gst_element_unlink(m_decryptor.get(), m_appsink.get()); >+ gst_element_unlink(m_demux.get(), m_decryptor.get()); >+ gst_element_set_state(m_decryptor.get(), GST_STATE_NULL); >+ gst_bin_remove(GST_BIN(m_pipeline.get()), m_decryptor.get()); >+ } else >+#endif >+ gst_element_unlink(m_demux.get(), m_appsink.get()); >+} >+ >+static void appendPipelineAppsinkCapsChanged(GObject* appsinkPad, GParamSpec*, AppendPipeline* appendPipeline) >+{ >+ GstStructure* structure = gst_structure_new_empty("appsink-caps-changed"); >+ GstMessage* message = gst_message_new_application(GST_OBJECT(appsinkPad), structure); >+ gst_bus_post(appendPipeline->bus(), message); >+ GST_TRACE("appsink-caps-changed message posted to bus"); >+} >+ >+static GstPadProbeReturn appendPipelineAppsrcDataLeaving(GstPad*, GstPadProbeInfo* info, AppendPipeline* appendPipeline) >+{ >+ ASSERT(GST_PAD_PROBE_INFO_TYPE(info) & GST_PAD_PROBE_TYPE_BUFFER); >+ >+ GstBuffer* buffer = GST_PAD_PROBE_INFO_BUFFER(info); >+ gsize bufferSize = gst_buffer_get_size(buffer); >+ >+ GST_TRACE("buffer of size %" G_GSIZE_FORMAT " going thru", bufferSize); >+ >+ appendPipeline->reportAppsrcAtLeastABufferLeft(); >+ >+ return GST_PAD_PROBE_OK; >+} >+ >+#if !LOG_DISABLED >+static GstPadProbeReturn appendPipelinePadProbeDebugInformation(GstPad*, GstPadProbeInfo* info, struct PadProbeInformation* padProbeInformation) >+{ >+ ASSERT(GST_PAD_PROBE_INFO_TYPE(info) & GST_PAD_PROBE_TYPE_BUFFER); >+ GstBuffer* buffer = GST_PAD_PROBE_INFO_BUFFER(info); >+ GST_TRACE("%s: buffer of size %" G_GSIZE_FORMAT " going 
thru", padProbeInformation->description, gst_buffer_get_size(buffer)); >+ return GST_PAD_PROBE_OK; >+} >+#endif >+ >+static void appendPipelineAppsrcNeedData(GstAppSrc*, guint, AppendPipeline* appendPipeline) >+{ >+ appendPipeline->reportAppsrcNeedDataReceived(); >+} >+ >+static void appendPipelineDemuxerPadAdded(GstElement*, GstPad* demuxerSrcPad, AppendPipeline* appendPipeline) >+{ >+ appendPipeline->connectDemuxerSrcPadToAppsinkFromAnyThread(demuxerSrcPad); >+} >+ >+static void appendPipelineDemuxerPadRemoved(GstElement*, GstPad*, AppendPipeline* appendPipeline) >+{ >+ appendPipeline->disconnectDemuxerSrcPadFromAppsinkFromAnyThread(); >+} >+ >+static GstFlowReturn appendPipelineAppsinkNewSample(GstElement* appsink, AppendPipeline* appendPipeline) >+{ >+ return appendPipeline->handleNewAppsinkSample(appsink); >+} >+ >+static void appendPipelineAppsinkEOS(GstElement*, AppendPipeline* appendPipeline) >+{ >+ if (WTF::isMainThread()) >+ appendPipeline->appsinkEOS(); >+ else { >+ GstStructure* structure = gst_structure_new_empty("appsink-eos"); >+ GstMessage* message = gst_message_new_application(GST_OBJECT(appendPipeline->appsink()), structure); >+ gst_bus_post(appendPipeline->bus(), message); >+ GST_TRACE("appsink-eos message posted to bus"); >+ } >+ >+ GST_DEBUG("%s main thread", (WTF::isMainThread()) ? "Is" : "Not"); >+} >+ >+ >+ >+} // namespace WebCore. >+ >+#endif // USE(GSTREAMER) >diff --git a/Source/WebCore/platform/graphics/gstreamer/mse/AppendPipeline.h b/Source/WebCore/platform/graphics/gstreamer/mse/AppendPipeline.h >new file mode 100644 >index 0000000000000000000000000000000000000000..ae4b1f5e1aaf081f0d4d23d0cb25af4c34f4f132 >--- /dev/null >+++ b/Source/WebCore/platform/graphics/gstreamer/mse/AppendPipeline.h >@@ -0,0 +1,148 @@ >+/* >+ * Copyright (C) 2016 Metrological Group B.V. 
>+ * Copyright (C) 2016 Igalia S.L
>+ *
>+ * This library is free software; you can redistribute it and/or
>+ * modify it under the terms of the GNU Library General Public
>+ * License as published by the Free Software Foundation; either
>+ * version 2 of the License, or (at your option) any later version.
>+ *
>+ * This library is distributed in the hope that it will be useful,
>+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
>+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
>+ * Library General Public License for more details.
>+ *
>+ * You should have received a copy of the GNU Library General Public License
>+ * along with this library; see the file COPYING.LIB. If not, write to
>+ * the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
>+ * Boston, MA 02110-1301, USA.
>+ */
>+
>+#ifndef AppendPipeline_h
>+#define AppendPipeline_h
>+
>+#if ENABLE(VIDEO) && USE(GSTREAMER) && ENABLE(MEDIA_SOURCE) && ENABLE(VIDEO_TRACK)
>+
>+#include "GRefPtrGStreamer.h"
>+#include "MediaPlayerPrivateGStreamerMSE.h"
>+#include "MediaSourceClientGStreamerMSE.h"
>+#include "SourceBufferPrivateGStreamer.h"
>+
>+#include <gst/gst.h>
>+#include <wtf/Condition.h>
>+
>+namespace WebCore {
>+
>+class AppendPipeline : public ThreadSafeRefCounted<AppendPipeline> {
>+public:
>+ enum AppendState { Invalid, NotStarted, Ongoing, KeyNegotiation, DataStarve, Sampling, LastSample, Aborting };
>+
>+ AppendPipeline(PassRefPtr<MediaSourceClientGStreamerMSE>, PassRefPtr<SourceBufferPrivateGStreamer>, MediaPlayerPrivateGStreamerMSE*);
>+ virtual ~AppendPipeline();
>+
>+ void handleElementMessage(GstMessage*);
>+ void handleApplicationMessage(GstMessage*);
>+
>+ gint id();
>+ AppendState appendState() { return m_appendState; }
>+ void setAppendState(AppendState);
>+
>+ GstFlowReturn handleNewAppsinkSample(GstElement*);
>+ GstFlowReturn pushNewBuffer(GstBuffer*);
>+
>+ // Takes ownership of caps.
>+ void parseDemuxerSrcPadCaps(GstCaps*);
>+ void appsinkCapsChanged();
>+ void appsinkNewSample(GstSample*);
>+ void appsinkEOS();
>+ void didReceiveInitializationSegment();
>+ AtomicString trackId();
>+ void abort();
>+
>+ void clearPlayerPrivate();
>+ RefPtr<MediaSourceClientGStreamerMSE> mediaSourceClient() { return m_mediaSourceClient; }
>+ RefPtr<SourceBufferPrivateGStreamer> sourceBufferPrivate() { return m_sourceBufferPrivate; }
>+ GstBus* bus() { return m_bus.get(); }
>+ GstElement* pipeline() { return m_pipeline.get(); }
>+ GstElement* appsrc() { return m_appsrc.get(); }
>+ GstElement* appsink() { return m_appsink.get(); }
>+ GstCaps* demuxerSrcPadCaps() { return m_demuxerSrcPadCaps.get(); }
>+ GstCaps* appsinkCaps() { return m_appsinkCaps.get(); }
>+ RefPtr<WebCore::TrackPrivateBase> track() { return m_track; }
>+ WebCore::MediaSourceStreamTypeGStreamer streamType() { return m_streamType; }
>+
>+ void disconnectDemuxerSrcPadFromAppsinkFromAnyThread();
>+ void connectDemuxerSrcPadToAppsinkFromAnyThread(GstPad*);
>+ void connectDemuxerSrcPadToAppsink(GstPad*);
>+
>+ void reportAppsrcAtLeastABufferLeft();
>+ void reportAppsrcNeedDataReceived();
>+
>+private:
>+ void resetPipeline();
>+ void checkEndOfAppend();
>+ void handleAppsrcAtLeastABufferLeft();
>+ void handleAppsrcNeedDataReceived();
>+ void removeAppsrcDataLeavingProbe();
>+ void setAppsrcDataLeavingProbe();
>+
>+private:
>+ RefPtr<MediaSourceClientGStreamerMSE> m_mediaSourceClient;
>+ RefPtr<SourceBufferPrivateGStreamer> m_sourceBufferPrivate;
>+ MediaPlayerPrivateGStreamerMSE* m_playerPrivate;
>+
>+ // (m_streamType, m_id) is unique.
>+ gint m_id;
>+
>+ MediaTime m_initialDuration;
>+
>+ GstFlowReturn m_flowReturn;
>+
>+ GRefPtr<GstElement> m_pipeline;
>+ GRefPtr<GstBus> m_bus;
>+ GRefPtr<GstElement> m_appsrc;
>+ GRefPtr<GstElement> m_demux;
>+#if ENABLE(LEGACY_ENCRYPTED_MEDIA)
>+ GRefPtr<GstElement> m_decryptor;
>+#endif
>+ // The demuxer has one src stream only, so only one appsink is needed and linked to it.
>+ GRefPtr<GstElement> m_appsink;
>+
>+ Lock m_newSampleLock;
>+ Condition m_newSampleCondition;
>+ Lock m_padAddRemoveLock;
>+ Condition m_padAddRemoveCondition;
>+
>+ GRefPtr<GstCaps> m_appsinkCaps;
>+ GRefPtr<GstCaps> m_demuxerSrcPadCaps;
>+ FloatSize m_presentationSize;
>+
>+ bool m_appsrcAtLeastABufferLeft;
>+ bool m_appsrcNeedDataReceived;
>+
>+ gulong m_appsrcDataLeavingProbeId;
>+#if !LOG_DISABLED
>+ struct PadProbeInformation m_demuxerDataEnteringPadProbeInformation;
>+ struct PadProbeInformation m_appsinkDataEnteringPadProbeInformation;
>+#endif
>+
>+ // Keeps track of the state of append processing, to avoid performing actions inappropriate for the current state
>+ // (e.g. processing more samples after the last one has been detected). See setAppendState() for valid
>+ // transitions.
>+ AppendState m_appendState;
>+
>+ // Aborts can only complete once normal sample detection has finished. Until then, the pending abort request is
>+ // recorded in this field.
>+ bool m_abortPending;
>+
>+ WebCore::MediaSourceStreamTypeGStreamer m_streamType;
>+ RefPtr<WebCore::TrackPrivateBase> m_oldTrack;
>+ RefPtr<WebCore::TrackPrivateBase> m_track;
>+
>+ GRefPtr<GstBuffer> m_pendingBuffer;
>+};
>+
>+} // namespace WebCore.
>+
>+#endif // USE(GSTREAMER)
>+#endif