
dash-live-source-simulator's People

Contributors

adithyanilangovan, joywelt, kenthmobitv, niteeshbhat, paulrutland, tobbee, tobbemobitv, zangue

dash-live-source-simulator's Issues

A question on the format of minimumUpdatePeriod

Hi,
Some content provided by the Dash-Industry-Forum shows the minimumUpdatePeriod format below.

The contents are in the following links.
http://vm2.dashif.org/livesim/testpic_2s/Manifest.mpd
http://vm2.dashif.org/livesim-dev/testpic_2s/Manifest.mpd

And if you open the MPD files, you can see that minimumUpdatePeriod is written like below.
minimumUpdatePeriod="P100Y"

As far as I know, it's usually represented with a 'T', as in "PT100Y", where 'T' introduces the time components.
However, this value doesn't include a 'T'.
I'm wondering whether this form is covered by the DASH specification. Can you share the document if you have it?
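For reference, in ISO 8601 durations the 'T' only separates date components (years, months, days) from time components (hours, minutes, seconds), so "P100Y" is well-formed without a 'T'. A minimal, non-authoritative sketch of that grammar in Python:

```python
import re

# Simplified ISO 8601 duration grammar: date designators (Y, M, W, D)
# come BEFORE the 'T'; time designators (H, M, S) come AFTER it.
DUR_RE = re.compile(
    r"^P(?:\d+Y)?(?:\d+M)?(?:\d+W)?(?:\d+D)?"
    r"(?:T(?:\d+H)?(?:\d+M)?(?:[\d.]+S)?)?$"
)

def is_iso8601_duration(s):
    # "P" alone and a trailing "T" with no time components are not valid.
    return bool(DUR_RE.match(s)) and s != "P" and not s.endswith("T")

print(is_iso8601_duration("P100Y"))   # years go before 'T'
print(is_iso8601_duration("PT100S"))  # seconds go after 'T'
```

Under this grammar, "PT100Y" would actually be the invalid form, since years may only appear before the 'T'.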

Live sim redirecting to plain http

When I request https://livesim.dashif.org/livesim/utc_head/testpic_2s/Manifest.mpd, I get redirected to the same URL with plain "http". This results in a mixed content error and a failure in the Shaka Player demo: https://shaka-player-demo.appspot.com/demo/#asset=https://livesim.dashif.org/livesim/utc_head/testpic_2s/Manifest.mpd

Is something misconfigured on the live sim server?

This doesn't happen with https://livesim.dashif.org/dash/vod/testpic_2s/img_subs.mpd for some reason.

Thumbnail test vector -Conformance issue

http://vm2.dashif.org/livesim-dev/testpic_2s/Manifest_thumbs.mpd

This vector fails the conformance test with the following error report:

Line:Col[44:76]:cvc-datatype-valid.1.2.1: 'thumbs' is not a valid value for 'integer'.
Line:Col[44:76]:cvc-attribute.3: The value 'thumbs' of attribute 'id' on element 'AdaptationSet' is not valid with respect to its type, 'unsignedInt'.
Unexpected error: For input string: "thumbs"
MPD validation not successful - DASH is not valid!

It turns out from the DASH schema that AdaptationSet@id should be an unsignedInt, but here it's a string.

Also, the EssentialProperty schemeIdUri for thumbnails should be 'http://dashif.org/guidelines/thumbnail_tile'.

BaseURL does not preserve protocol

BaseURLs inserted in the manifest always have the protocol "http", even when the MPD URL has protocol "https". This needs to change for servers configured to insert BaseURL to work with https.

The usage of BaseURL is controlled by the SET_BASEURL constant in mpdprocessor.py.
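As a sketch of one possible fix (the function name and signature are assumptions, not the actual mpdprocessor.py API), the scheme could be taken from the WSGI environ when building the BaseURL:

```python
def make_base_url(environ, host, path="/livesim/"):
    """Build a BaseURL that preserves the scheme of the incoming request.

    'environ' is a WSGI environ dict; 'wsgi.url_scheme' is the standard
    place to read the request scheme. Host/path handling is simplified.
    """
    scheme = environ.get("wsgi.url_scheme", "http")
    return "%s://%s%s" % (scheme, host, path)

print(make_base_url({"wsgi.url_scheme": "https"}, "livesim.dashif.org"))
print(make_base_url({}, "livesim.dashif.org"))  # falls back to http
```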

Bug? Very high buffering during live session.

Hi,

recently I have been testing live streaming with the dash.js player v1.6.0 using the live source provided by the simulator (http://vm2.dashif.org/livesim/testpic_2s/Manifest.mpd), and I have observed buffering of up to 30 seconds (see picture), which is a bit curious for a live session.
I then tried another live source (http://bitlivedemo-a.akamaihd.net/mpds/stream.php?streamkey=bitcodin), and the buffer level stayed near 0 and never exceeded 2 seconds (the segment length).
So I was wondering if there might be some issue with the simulator?

Add segmentTimeline example

All examples use SegmentTemplate with $Number$ substitution. Since DASH-AVC/264 also supports SegmentTimeline addressing, it would be great to have a reference example that uses $Time$ substitution. It would also be good if the example included:

  1. Segments that vary somewhat in duration, with some use of the 'r' parameter
  2. A DVR window of at least 5 min.

Support for non-zero startNumber

One test case for live is to have no startNumber attribute in the MPD. This corresponds to the default value of 1.

To handle this and other cases, it is suggested to add another URL parameter to start at any startNumber, for example sn_X. If X = 1, there should be no startNumber in the manifest (since 1 is the default).
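A sketch of how such an sn_X option might be parsed (the option name is the one proposed above; the helper function itself is hypothetical):

```python
import re

def parse_start_number(path_part):
    """Parse a proposed sn_X URL option (sketch, not the simulator's code).

    Returns None when X is 1, signalling that the startNumber attribute
    should be omitted from the MPD (1 is the DASH default value)."""
    m = re.match(r"^sn_(\d+)$", path_part)
    if not m:
        raise ValueError("not an sn_X option: %r" % path_part)
    number = int(m.group(1))
    return None if number == 1 else number

print(parse_start_number("sn_1"))    # None -> drop startNumber attribute
print(parse_start_number("sn_100"))  # 100
```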

Inconsistent timestamps across updates in SegmentTimeline

When playing this live stream:
https://livesim.dashif.org/livesim/segtimeline_1/utc_head/testpic_6s/Manifest.mpd

I find that the timestamps in the SegmentTimeline are inconsistent across updates. For example:

<SegmentTimeline>
<S d="288768" t="75776683106304" />
<S d="287744" />

The first segment starts at 75776683106304 and ends at 75776683106304 + 288768 = 75776683395072. After updating the manifest, the timestamp jumps:

<SegmentTimeline>
<S d="287744" t="75776683392000" />

75776683395072 jumped backward to 75776683392000, a difference of 3072 timescale units. The timescale is 48000, so the discrepancy is 3072 / 48000 seconds, or 64 ms.
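The arithmetic in this report can be checked directly:

```python
# Numbers taken from the two manifest snippets above.
t_first, d_first = 75776683106304, 288768
expected_next = t_first + d_first           # 75776683395072
observed_next = 75776683392000              # t in the updated manifest
timescale = 48000
gap_units = expected_next - observed_next
print(gap_units, gap_units / timescale)     # 3072 units = 0.064 s (64 ms)
```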

It seems like this sort of thing shouldn't happen, especially given that the segment template is based on $Time$. When the first timestamp changes, every other segment timestamp changes, too. With every segment start time and URL changing at once, and with no startNumber to base things on, the segments before and after the update have nothing in common with each other.

Here are two complete manifests captured from the live sim from which I took the snippets above:

Is this a bug in the live sim? If not, how should I interpret the change in timestamps if there are no matching segment timestamps from one manifest update to the next?

P.S. Thanks for creating and maintaining this very useful tool!

Multiple moof/mdat pairs and sequence number to be unique

DashLiveSim uses the segment's 1..n filename number as the fragment sequence number field. This is fine if an m4s file contains just one moof/mdat fragment pair. With multiple pairs, as in a low-latency live test, the sequence number should instead be incremented inside the m4s file per fragment pair.

Should this numbering be global, or can it be reset (i.e., step backward a little) after each m4s segment file? This is something to think about and affects the overall implementation; global numbering needs more changes in the source code.

The SequenceNumber value for a fragment that appears later in the timeline MUST be greater than the value for a fragment that appears earlier in the timeline, but SequenceNumber values for consecutive fragments are not required to be consecutive.
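One way to satisfy the monotonicity requirement quoted above without keeping global state is to derive each fragment's sequence number from the segment number. This is only a sketch of the idea, not the simulator's actual code:

```python
def fragment_sequence_numbers(segment_number, fragments_per_segment):
    """Sequence numbers for all moof/mdat pairs in one segment (sketch).

    (segment_number - 1) * fragments_per_segment + index is strictly
    increasing across consecutive segments, so later fragments always
    get larger numbers, as the quoted requirement demands."""
    base = (segment_number - 1) * fragments_per_segment
    return [base + i + 1 for i in range(fragments_per_segment)]

print(fragment_sequence_numbers(1, 4))  # [1, 2, 3, 4]
print(fragment_sequence_numbers(2, 4))  # [5, 6, 7, 8]
```

The numbers need not be consecutive across segments, so a scheme like this remains valid even if some segments carry fewer fragments than `fragments_per_segment`.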

publishTime missing

According to the standard and the conformance software, if the MPD is of type "dynamic", publishTime shall be defined. It's currently missing in the develop and feature-segment-timeline branches (maybe also others).

Check audio code for English language

As reported by a user:

For the following link

    https://livesim.dashif.org/livesim/testpic_2s/Manifest.mpd

The audio code is "eng", although according to RFC 5646 it's only "en".

Timestamp problems video vs audio

Playing http://vm2.dashif.org/livesim/testpic_6s/Manifest.mpd on Firefox highlights an issue with timestamps generated for the audio streams.

Looking at the media segments, the baseMediaDecodeTime values on the audio stream for the same segment number are exactly 1.875 (= 90000/48000) times those of the video. Is the VOD-to-live timestamp conversion for audio using the wrong timescale or something?
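A quick sanity check of the suspected bug, treating the 1.875 ratio as a unit-conversion error (the timescales are from this report; the offset value is arbitrary):

```python
# If a VOD->live offset computed in the video timescale (90000) is applied
# unscaled to the audio track (timescale 48000), the audio
# baseMediaDecodeTime ends up exactly 90000/48000 = 1.875 times too large.
video_timescale, audio_timescale = 90000, 48000
offset_seconds = 100
wrong_audio_units = offset_seconds * video_timescale   # no rescale (the bug)
right_audio_units = offset_seconds * audio_timescale
print(wrong_audio_units / right_audio_units)           # 1.875
```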

A similar issue was noted Dash-Industry-Forum/dash.js#649 (comment)

Chrome doesn't seem to care and does its best to play anyway

Multi-period with timeline playback issue

Hi,

Shaka Player complains about this manifest and does not download any segment: http://vm2.dashif.org/livesim-dev/utc_direct/segtimeline_1/periods_30/testpic_6s/multiaudio.mpd

When it was trying to play the following snapshot of the stream (multiaudio.txt), it reported the following debug logs (shaka.txt).

However, if you try with the following url, it plays fine: http://vm2.dashif.org/livesim-dev/utc_direct/segtimeline_1/testpic_6s/multiaudio.mpd.

As I believe the only difference between the two streams is the multi-periodicity, there may be an issue in the manifest generated with "periods_30".

I could not play multiple periods in DASH player

Hi,

I created an MPD file with two periods, but when the first period is about to finish, the stream starts to freeze and stops playing. Could you please help me? What is wrong with my MPD file? I attached the MPD file and media segments for testing. Thank you in advance.
test_data.zip

Vodanalyzer for multifragment(moof/mdat) segment duration

If the input VoD segments have pre-generated multifragment content (multiple moof/mdat pairs), then mediasegmentfilter does not calculate the correct segment duration. It always returns the last fragment's duration. See this fix to keep the duration increasing across all fragments.

dashlivesim/dashlib/mediasegmentfilter.py:

  def process_trun(self, data):
    ...
    sample_flags_present = flags & 0x400
    sample_comp_time_present = flags & 0x800
    duration = 0 if self.duration is None else self.duration  # make multifrag work
    for _ in range(sample_count):
    ...

See also Issue #68 multifragment sequencenumbering.

Multiplexed V1_1.m4s+A1_1.m4s manifest examples

https://github.com/Dash-Industry-Forum/dash-live-source-simulator/wiki

Multiplexed Content
For eMBMS, better robustness can be achieved by multiplexing audio and video segments. This can be done automatically by the server. It happens if the path to a segment has two underscores in the next to last path component, like .../V1__A1/123.m4s. In this case the two segments V1/123.m4s and V2/123.m4s are fetched and multiplexed. The corresponding thing happens for the init segments. For this to work, the MPD must be manually changed to have a multiplexed representation.

I guess this paragraph has a typo; it should say that the V1/123.m4s and A1/123.m4s segment payloads (video+audio) are merged?

Do you have an example of how the manifest should be edited for the live stream? If one segment file contains both video and audio payloads, how are the AdaptationSet/Representation elements to be written?

normal live: http://livesim.dashif.org/livesim/testpic_2s/Manifest.mpd

Content generation guidelines

Hi,

I'm trying to use the simulator with a VoD dataset from http://www-itec.uni-klu.ac.at/ftp/datasets/DASHDataset2014 to test a live source with multiple representations, without success: the VoD content analyzer always fails at some point.

I have noticed that this is because it requires a certain set of information to be present at the MPD and media segment level, e.g. manifest attributes like the period id and representation contentType, among others. It also seems to parse the media segments to get some information from metadata/boxes.

So my question is: are there any guidelines for content generation to make sure there is enough information for the VoD analyzer to run successfully? Or could you make available the tools and configs you used to generate the data at http://vm2.dashif.org/dash/vod/testpic_2s/?

Thanks

UTCTiming test vector not conforming

The provided UTCTiming vector:
http://vm2.dashif.org/livesim/utc_direct-head/testpic_2s/Manifest.mpd
raises error in the conformance software:
http://dashif.org/conformance.html

The following error is shown:

Line:Col[15:89]:cvc-complex-type.2.4.a: Invalid content was found starting with element 'UTCTiming'. One of '{"urn:mpeg:dash:schema:mpd:2011":ProgramInformation, "urn:mpeg:dash:schema:mpd:2011":BaseURL, "urn:mpeg:dash:schema:mpd:2011":Location, "urn:mpeg:dash:schema:mpd:2011":Period}' is expected.
MPD validation not successful - DASH is not valid!

Upon further investigation of the provided test vector, it was found that the UTCTiming element appears before other elements such as Period and BaseURL.

But the DASH-MPD.xsd schema states that the UTCTiming element should be present after all the other elements. The simple fix is to move the UTCTiming element to the end, after all the other elements.

I am working on a PR for this issue.

[Documentation] Is there any document on how source content is created for https://livesim.dashif.org/

A DASH URL such as https://livesim.dashif.org/livesim-chunked/chunkdur_1/ato_7/testpic4_8s/Manifest300.mpd provides content with a time code burned in, based on the server's clock, so we can see how far behind live we are.
I don't understand how to create this. Does anyone have information on how to replicate such a DASH server quickly?
As for why I raised this as an issue: I was not sure how to ask questions about the project, so I opened one. If this is not the right way to ask, please let me know the right procedure to reach out.

support for inband MPD update event

Can we add an option to insert inband MPD update events with a given frequency? E.g. inbandmpdupdate_30s would insert an inband update event every 30 seconds.

H.265/HEVC support

Hi, I tried to test my video encoded with the H.265/HEVC codec. It was okay to stream as a VOD service, but it failed when I tried it through dash-live-source-simulator.

videoCodec (video/mp4;codecs="hvc1.1.6.L93.90") is not supported.

According to the 'Guidelines for Implementation', DASH supports both H.264/AVC and H.265/HEVC. So my question is: does dash-live-source-simulator support the H.265/HEVC codec for live streaming? If not, how can I simulate a DASH live streaming service for UHD video?

Thanks.

MPD timescale not considered for vod analysis

The VoD analyzer takes the segment duration from the MPD and assumes it is in seconds, without considering the timescale value. This means that, e.g., for a VoD MPD with @duration = 2000 and @timescale = 1000, the analyzer will set a duration of 2000 seconds for a segment, leading to analysis failure due to detection of segment-duration drift that isn't actually there.

Instead, the analyzer should derive the segment duration in seconds using the timescale value, i.e. @duration / @timescale.
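A one-line sketch of the proposed computation (the function name is hypothetical):

```python
def segment_duration_seconds(duration, timescale=1):
    """MPD @duration is expressed in @timescale units (default timescale 1)."""
    return duration / timescale

# The example from this report: @duration=2000, @timescale=1000.
print(segment_duration_seconds(2000, 1000))  # 2.0 s, not 2000 s
```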

Tracks are disabled

Track header flags in the initialization segment indicate that the track is disabled, which should not be the case (disabled tracks should be ignored during presentation). Is there any reason for this?

jean.

SegmentTimeline and wraps

I think I see an inconsistency when using SegmentTimeline when the testpic_2s source wraps around.

Although you can wait and request the manifest around the top of each hour, an immediate way to reproduce this is to simulate a manifest pull right before the wrap-around using the start_ parameter via:

START=$(($(date +%s)-3598)); wget -qO - http://livesim.dashif.org/livesim/start_$START/segtimeline_1/testpic_2s/Manifest.mpd

showing a timeline 296640000-323640000:

<S d="180000" r="150" t="296640000" />

and right after the wrap via

START=$(($(date +%s)-3602)); wget -qO - http://livesim.dashif.org/livesim/start_$START/segtimeline_1/testpic_2s/Manifest.mpd

showing a timeline 297000000-297360000:

<S d="180000" t="297000000" />
<S d="180000" />

I think that in the above case, the first timeline entry needs an @r of roughly 149, because the previous manifest indicates the timeline ending at 323640000.

Scaling presentationTimeOffset by timescale

When using multiple periods, each period has a presentationTimeOffset attribute added to the SegmentTemplate.
This value should be scaled by the SegmentTemplate timescale, if present.
DASH specification says:

presentationTimeOffset : The value of the presentation time offset in seconds is the division of the value of this attribute and the value of the timescale attribute.

If you enable segment timeline output, the problem is clear: audio and video adaptation sets get different timescales but use the same presentationTimeOffset value, e.g. http://vm2.dashif.org/livesim-dev/segtimeline_1/periods_4/testpic_6s/Manifest.mpd

When not using segment timeline the timescale defaults to 1 and the presentationTimeOffset used is the same as the period start time.

I think the mpdprocessor needs to rescale the presentationTimeOffset when it sets the segment template timescale?
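A sketch of the proposed rescaling (the helper name is hypothetical, not the mpdprocessor API):

```python
def presentation_time_offset(period_start_seconds, timescale=1):
    """Scale the period start into @timescale units for
    SegmentTemplate@presentationTimeOffset, per the spec text quoted
    above (pto in seconds = @presentationTimeOffset / @timescale)."""
    return int(round(period_start_seconds * timescale))

# The same period start yields different values per track timescale:
print(presentation_time_offset(3600, 90000))  # video: 324000000
print(presentation_time_offset(3600, 48000))  # audio: 172800000
print(presentation_time_offset(3600))         # timescale 1: 3600
```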

SCTE35 InbandEventStream has no @value attribute

SCTE 67 2014, 12.1.5.4 describes the signalling of inband SCTE35 events in the MPD. It states that "the value attribute should match the PID value of the original SCTE 35 messages".

Whilst @value is technically optional, the owner of the schemeIdUri has defined its meaning, and therefore I think it is required in this case.

emsg.value is non-null ("1001") - this should be the value of @value.

I expect implementations will match schemeIdUri/value pairs defined in the MPD and emsg boxes and would fail to associate the emsg with any event stream and discard the message.

Set cache control headers on time sync URL

The time sync URL (e.g. http://vm2.dashif.org/dash/time.txt) is used for clock sync. Clients send a HEAD request to get the Date header from the response. Unfortunately, the response does not include cache control headers, meaning it may be cached; Firefox will cache the entire HEAD response and not send the request again at all.

The minimum would be to set Cache-Control: no-cache to ensure the HEAD request is at least sent again. Setting Cache-Control: no-cache, no-store would mean the response cannot be cached at all.
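A minimal WSGI sketch of a time-sync endpoint that sets such headers (the body value is a placeholder, and the real server serves a static file rather than an app like this):

```python
def time_app(environ, start_response):
    """Hypothetical WSGI handler for the time-sync URL with cache headers.

    'no-cache' alone forces revalidation so the HEAD is always re-sent;
    adding 'no-store' forbids caching the response entirely."""
    body = b"placeholder-server-time\n"  # real endpoint returns server time
    headers = [("Content-Type", "text/plain"),
               ("Cache-Control", "no-cache, no-store")]
    start_response("200 OK", headers)
    # For HEAD requests, the WSGI server discards the body and sends
    # only the status line and headers, including Date.
    return [body]
```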

Originally filed in shaka-project/shaka-player#606

Atom moov/mvex/mehd is not removed from init.mp4 file

The init mp4 file has a moov/mvex/mehd atom indicating the total duration of the VoD file. This atom is not necessary in a live DASH feed. My limited testing indicates it's not a problem, but the atom should probably still be dropped. Options:

  • The content packager could run a batch process to modify the files before copying them to the dashlivesim/vod folder.
  • Add a vod-analyzer command-line option to modify the init segment while writing the .cfg file.
  • Modify the init segment at runtime before returning it to the HTTP client.

Support for multiple baseURLs

One of the live test vectors in DASH-IF is multiple BaseURLs.

A first test case is just to be able to parse this in the client and choose one or the other.

A second test case is to provide one BaseURL with a less reliable source (some segments missing, etc.) and have the client switch if needed.

HTTPS variable not passed to mod_wsgi/mod_dashlivesim.py

Hi,

My environment:

  • CentOS Linux release 7.5.1804 (Core)
  • httpd 2.4.6
  • mod_wsgi 3.4
  • mod_ssl 2.4.6

I'm seeing issues identical to those of issue #14, where the BaseURL is always http. Having done some tests, the HTTPS environment variable doesn't get passed through to mod_wsgi/mod_dashlivesim.py.

According to Graham Dumpleton on Twitter:

The CGI HTTPS variable is deliberately stripped after setting wsgi.url_scheme, to make applications which used HTTPS in a non-WSGI-conformant way fix their code. So if the application is using the old CGI HTTPS variable, the application should be fixed.

After a very quick search, I can't find any documentation to back this up, but it certainly matches what I am seeing.
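A sketch of the fix that quote suggests: read the WSGI-standard key first, and treat the legacy CGI variable only as a fallback (the helper function is hypothetical):

```python
def request_is_https(environ):
    """True when the request came in over HTTPS.

    Prefer the WSGI-standard 'wsgi.url_scheme' key; fall back to the
    legacy CGI 'HTTPS' variable, which mod_wsgi strips after setting
    wsgi.url_scheme (hence this issue)."""
    if environ.get("wsgi.url_scheme") == "https":
        return True
    return environ.get("HTTPS", "off").lower() in ("on", "1")

print(request_is_https({"wsgi.url_scheme": "https"}))  # True
print(request_is_https({"HTTPS": "on"}))               # True
print(request_is_https({}))                            # False
```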

Too much drift in the duration of the Segments

Hi! I am currently using the DASH simulator (with my own VOD content) for testing purposes; however, I am unable to generate the config file with the VOD analyzer tools.

The failure occurs when the VOD analyzer begins to analyze the video content through the video segmenter and hits the content limitations (as explained here: https://github.com/Dash-Industry-Forum/dash-live-source-simulator/wiki/Content-Limitations).
Does anyone have the same issue? Please let me know and thanks in advance for any advice/solutions!

Add version 1 event message support

Event messages (as used by scte35.py) use the version 0 variant of the emsg box. The dashif client has been updated to support version 1 event messages (well, almost; see Dash-Industry-Forum/dash.js#3196), but it is hard to test without a ready source for these.

The following diff updates emsg.py to handle both version 0 and version 1 event messages.

48c48
< from .structops import uint32_to_str
---
> from .structops import uint32_to_str, uint64_to_str
56c56
<                  emsg_id=0, messagedata=""):
---
>                  emsg_id=0, messagedata="", version=0):
62a63
>         self.version = version
67d67
<         size = 12 + 4*4 + len(self.scheme_id_uri) + 1 + len(self.value) + 1 + len(self.messagedata)
69,78c69,94
<         parts.append(uint32_to_str(size))
<         parts.append("emsg")
<         parts.append("\x00\x00\x00\x00")
<         parts.append(self.scheme_id_uri + "\x00")
<         parts.append(self.value + "\x00")
<         parts.append(uint32_to_str(self.timescale))
<         parts.append(uint32_to_str(self.presentation_time_delta))
<         parts.append(uint32_to_str(self.event_duration))
<         parts.append(uint32_to_str(self.emsg_id))
<         parts.append(self.messagedata)
---
>         if self.version == 0:
>           size = 12 + 4*4 + len(self.scheme_id_uri) + 1 + len(self.value) + 1 + len(self.messagedata)
>           parts.append(uint32_to_str(size))
>           parts.append("emsg")
>           parts.append("\x00")  #version
>           parts.append("\x00\x00\x00") #flags
>           parts.append(self.scheme_id_uri + "\x00")
>           parts.append(self.value + "\x00")
>           parts.append(uint32_to_str(self.timescale))
>           parts.append(uint32_to_str(self.presentation_time_delta))
>           parts.append(uint32_to_str(self.event_duration))
>           parts.append(uint32_to_str(self.emsg_id))
>         else:
>           size = 12 + 3*4 + 8 + len(self.scheme_id_uri) + 1 + len(self.value) + 1 + len(self.messagedata)
>           parts.append(uint32_to_str(size)) # size
>           parts.append("emsg")   #box name
>           parts.append("\x01") #version
>           parts.append("\x00\x00\x00") #flags
>           parts.append(uint32_to_str(self.timescale)) # timescale
>           parts.append(uint64_to_str(self.presentation_time_delta)) #presentation_time
>           parts.append(uint32_to_str(self.event_duration)) #duration
>           parts.append(uint32_to_str(self.emsg_id)) #id
>           parts.append(self.scheme_id_uri + "\x00") #scheme_id_uri
>           parts.append(self.value + "\x00") # value
>         parts.append(self.messagedata) #messagedata
91c107
<                 message_data=""):
---
>                 message_data="",version=0):
93c109
<     emsg = Emsg(scheme_id_uri, value, timescale, presentation_time_delta, event_duration, emsg_id, message_data)
---
>     emsg = Emsg(scheme_id_uri, value, timescale, presentation_time_delta, event_duration, emsg_id, message_data, version)

saio is not updated to reflect a change in the position of senc

If the input segment uses tfdt box version 0, then when baseMediaDecodeTime is modified, tfdt may change to version 1 as a result of baseMediaDecodeTime growing beyond the 32-bit limit (see process_tfdt in mediasegmentfilter.py).
As a result, the size of tfdt increases, and the offsets stored in saio (referencing the sample auxiliary information, whose sizes are in saiz) may no longer be valid, resulting in an inconsistent (broken) file.

See this example input file opened in isoviewer:

We can see that the one offset in saio equals 721. When mediasegmentfilter.py increases the size of tfdt (which in this file is version 0), senc will move and the offset will no longer be valid.

Since dash-live-source-simulator modifies the size of boxes while rewriting them, there may be more such internal references that need fixing.
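A sketch of the kind of offset adjustment needed (the helper is hypothetical; the real box rewriting lives in mediasegmentfilter.py):

```python
def adjust_saio_offsets(offsets, size_delta):
    """Shift saio offsets when a preceding box grows by size_delta bytes.

    When tfdt goes from version 0 to version 1, baseMediaDecodeTime grows
    from 32 to 64 bits, i.e. the box grows by 4 bytes, so every saio
    offset pointing past the tfdt must shift by the same amount."""
    return [off + size_delta for off in offsets]

# The example from this issue: one saio offset of 721, tfdt grows 4 bytes.
print(adjust_saio_offsets([721], 4))  # [725]
```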

No lmsg at end of period

An lmsg "token" is inserted at the end of a presentation, e.g. for a time-limited or periodic service. However, there is no lmsg at the end of each period for multi-period content.

This should be fairly easy to add by having the period information in the URL and checking whether the currently requested segment is the last of a period.
