dash-industry-forum / dash-live-source-simulator
DASH live source simulator providing reference live content.
License: Other
Hi,
Some content provided by Dash-Industry-Forum formats minimumUpdatePeriod as shown below.
The content is available at the following links:
•http://vm2.dashif.org/livesim/testpic_2s/Manifest.mpd
•http://vm2.dashif.org/livesim-dev/testpic_2s/Manifest.mpd
If you open the MPD files, you can see that minimumUpdatePeriod is written like this:
minimumUpdatePeriod="P100Y"
As far as I know, it is usually represented as "PT100Y", including the 'T' designator.
However, this one doesn't include a 'T'.
I'm wondering whether this form is covered by the DASH specification. Can you share the document if you have it?
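For reference, in ISO 8601 durations (which the MPD's xs:duration type follows), the 'T' is the separator that introduces the time-of-day components (hours, minutes, seconds); year, month, and day components come before it, so "P100Y" is the valid spelling and "PT100Y" would not be. A minimal sketch of a validity check (the regex is a simplification of the full xs:duration grammar):

```python
import re

# xs:duration-style pattern: date part (Y/M/D) before 'T', time part (H/M/S) after it.
ISO8601_DURATION = re.compile(
    r"^P(?:\d+Y)?(?:\d+M)?(?:\d+D)?"
    r"(?:T(?:\d+H)?(?:\d+M)?(?:\d+(?:\.\d+)?S)?)?$"
)

def is_valid_duration(value):
    """Rough validity check; a bare 'P' or 'PT' is not a real duration."""
    return bool(ISO8601_DURATION.match(value)) and value not in ("P", "PT")

print(is_valid_duration("P100Y"))   # True: years belong before 'T'
print(is_valid_duration("PT100S"))  # True: seconds belong after 'T'
print(is_valid_duration("PT100Y"))  # False: 'Y' is not allowed after 'T'
```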
When I request https://livesim.dashif.org/livesim/utc_head/testpic_2s/Manifest.mpd, I get redirected to the same URL with plain "http". This results in a mixed content error and a failure in the Shaka Player demo: https://shaka-player-demo.appspot.com/demo/#asset=https://livesim.dashif.org/livesim/utc_head/testpic_2s/Manifest.mpd
Is something misconfigured on the live sim server?
This doesn't happen with https://livesim.dashif.org/dash/vod/testpic_2s/img_subs.mpd for some reason.
The manifest for SegmentTimeline, like:
/livesim/segtimeline_1/testpic_2s/Manifest.mpd
results in a presentationTimeOffset value that is a very long string of zeroes.
http://vm2.dashif.org/livesim-dev/testpic_2s/Manifest_thumbs.mpd.
This vector fails conformance test with error report as
Line:Col[44:76]:cvc-datatype-valid.1.2.1: 'thumbs' is not a valid value for 'integer'.
Line:Col[44:76]:cvc-attribute.3: The value 'thumbs' of attribute 'id' on element 'AdaptationSet' is not valid with respect to its type, 'unsignedInt'.
Unexpected error: For input string: "thumbs"
MPD validation not successful - DASH is not valid!
It turns out from the DASH schema that AdaptationSet @id should be an unsignedInt, but here it is a string.
Also, the EssentialProperty schemeIdUri for thumbnails should be 'http://dashif.org/guidelines/thumbnail_tile'.
BaseURLs inserted in the manifest always have the protocol "http", even when the MPD URL has the protocol "https". This needs to be changed if servers configured to insert a BaseURL are to work with https.
The usage of BaseURL is controlled by the SET_BASEURL constant in mpdprocessor.py.
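A possible fix could derive the scheme from the incoming request rather than hardcoding "http". A minimal WSGI-style sketch (make_base_url is a hypothetical name, not the simulator's actual function):

```python
def make_base_url(environ, host, path="/livesim/"):
    """Build a BaseURL matching the scheme of the incoming WSGI request.

    `environ` is a WSGI environ dict; wsgi.url_scheme is the conformant way
    to detect https (the CGI HTTPS variable may be stripped by mod_wsgi).
    """
    scheme = environ.get("wsgi.url_scheme", "http")
    return "%s://%s%s" % (scheme, host, path)

print(make_base_url({"wsgi.url_scheme": "https"}, "livesim.dashif.org"))
# -> https://livesim.dashif.org/livesim/
```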
Hi,
recently I have been testing live streaming with dash.js player v1.6.0 using the live source provided by the simulator (http://vm2.dashif.org/livesim/testpic_2s/Manifest.mpd) and I have observed buffering of up to 30 seconds (see picture), which is a bit curious for a live session.
I then tried with another live source (http://bitlivedemo-a.akamaihd.net/mpds/stream.php?streamkey=bitcodin) and the buffer level was constantly 0 and never exceeded 2 seconds (segment length).
So, I was wondering if there might be some issue with the simulator?
All examples use SegmentTemplate with a startNumber attribute.
One test case for live is to have no startNumber attribute in the MPD. This corresponds to the default value of 1.
To handle this and other cases, it is suggested to add another URL parameter to start at any startNumber: for example sn_X, where X = 1 means there should be no startNumber in the manifest.
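The proposed sn_X path option could be handled roughly as follows (a sketch; the function name and path handling are illustrative, not the simulator's actual code):

```python
def parse_start_number(path_parts):
    """Return startNumber from an 'sn_X' path component, or None.

    None means 'omit the attribute from the MPD' (the default of 1),
    which is the behaviour proposed for sn_1.
    """
    for part in path_parts:
        if part.startswith("sn_"):
            value = int(part[3:])
            return None if value == 1 else value
    return None  # no sn_X option present

print(parse_start_number(["livesim", "sn_1", "testpic_2s"]))    # None -> no attribute
print(parse_start_number(["livesim", "sn_100", "testpic_2s"]))  # 100
```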
While testing a SegmentTimeline based stream generated by livesim (https://vm2.dashif.org/livesim-dev/segtimeline_1/testpic_2s/Manifest.mpd) I have noticed there is a time misalignment in audio chunks.
More specifically, the start times declared in the MPD, and used to name the segments, are around 2 seconds behind the real PTS values within the segments.
When playing this live stream:
https://livesim.dashif.org/livesim/segtimeline_1/utc_head/testpic_6s/Manifest.mpd
I find that the timestamps in the SegmentTimeline are inconsistent across updates. For example:
<SegmentTimeline>
<S d="288768" t="75776683106304" />
<S d="287744" />
The first segment starts at 75776683106304 and ends at 75776683106304 + 288768 = 75776683395072. After updating the manifest, the timestamp jumps:
<SegmentTimeline>
<S d="287744" t="75776683392000" />
75776683395072 jumped backward to 75776683392000, a difference of 3072 timescale units. The timescale is 48000, so the discrepancy is 3072 / 48000 seconds, or 64 ms.
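The arithmetic above can be checked directly:

```python
timescale = 48000  # audio timescale from the manifest

first_end = 75776683106304 + 288768  # end of first segment, first manifest
next_start = 75776683392000          # start time in the updated manifest

diff = first_end - next_start
print(diff)                     # 3072 timescale units
print(diff / timescale * 1000)  # 64.0 ms backward jump
```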
It seems like this sort of thing shouldn't happen, especially given that the segment template is based on $Time$. When the first timestamp changes, every other segment timestamp changes, too. With every segment start time and URL changing at once, and with no startNumber to base things on, the segments before and after the update have nothing in common with each other.
Here are two complete manifests captured from the live sim from which I took the snippets above:
Is this a bug in the live sim? If not, how should I interpret the change in timestamps if there are no matching segment timestamps from one manifest update to the next?
P.S. Thanks for creating and maintaining this very useful tool!
DashLiveSim uses the 1..n segment filename number as the fragment sequence number field. This is fine if the m4s file has just one moof/mdat fragment pair. With multiple pairs, such as in a low-latency live test, the sequence number should be incremented inside the m4s file per fragment pair.
Should this numbering be global, or can it be reset (i.e. step backward a little) after each m4s segment file? This is something to think about and affects the overall implementation; global numbering needs more changes in the source code.
The SequenceNumber value for a fragment that appears later in the timeline MUST be greater than the value for a fragment that appears earlier in the timeline, but SequenceNumber values for consecutive fragments are not required to be consecutive.
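Under global numbering, one way to satisfy this rule is to derive each fragment's sequence number from the segment number, assuming a fixed number of moof/mdat pairs per segment (a sketch, not the simulator's actual code):

```python
def fragment_sequence_numbers(segment_number, fragments_per_segment):
    """Globally increasing mfhd sequence numbers for one segment.

    Segment numbers start at 1; each segment contributes
    `fragments_per_segment` consecutive values, so numbers never step
    backward across segment boundaries.
    """
    first = (segment_number - 1) * fragments_per_segment + 1
    return list(range(first, first + fragments_per_segment))

print(fragment_sequence_numbers(1, 4))  # [1, 2, 3, 4]
print(fragment_sequence_numbers(2, 4))  # [5, 6, 7, 8]
```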
According to the standard and the conformance software, if the MPD is of type "dynamic", publishTime shall be defined. It's currently missing in the develop and feature-segment-timeline (and maybe other) branches.
Hello,
Thanks for this useful repository.
I'd like to report that in:
two examples point to the same URL.
The URL being:
Regards
As reported by a user:
For the following link
https://livesim.dashif.org/livesim/testpic_2s/Manifest.mpd
The audio language code is "eng", although according to RFC 5646 it should be "en".
Playing http://vm2.dashif.org/livesim/testpic_6s/Manifest.mpd on Firefox highlights an issue with timestamps generated for the audio streams.
Looking at the media segments, the baseMediaDecodeTimes on the audio stream for the same segment number are exactly 1.875 (=== 90000/48000) multiples of the video. Is the VOD -> live timestamp conversion for audio using the wrong timescale for audio or something?
A similar issue was noted Dash-Industry-Forum/dash.js#649 (comment)
Chrome doesn't seem to care and does its best to play anyway.
Hi,
Shaka Player complains about this manifest and does not download any segment: http://vm2.dashif.org/livesim-dev/utc_direct/segtimeline_1/periods_30/testpic_6s/multiaudio.mpd
When it was trying to play the following snapshot of the stream (multiaudio.txt), it reported the following debug logs (shaka.txt).
However, if you try with the following url, it plays fine: http://vm2.dashif.org/livesim-dev/utc_direct/segtimeline_1/testpic_6s/multiaudio.mpd.
As I believe the only difference between the two streams is multi-periodicity, there may be an issue in the manifest generated using "periods_30".
Hi,
I created an MPD file with two periods, but when the first period is about to finish, the stream starts to freeze and stops playing. Could you please help me? What is wrong with my MPD file? I have attached the MPD file and media segments for testing. Thank you in advance.
test_data.zip
If the input VoD segments have pre-generated multiple fragments (multiple moof/mdat pairs), then mediasegmentfilter does not calculate the correct segment duration; it always returns the last fragment's duration. See this fix to keep the duration accumulating across all fragments.
dashlivesim/dashlib/mediasegmentfilter.py:
def process_trun(self, data):
...
sample_flags_present = flags & 0x400
sample_comp_time_present = flags & 0x800
duration = 0 if self.duration is None else self.duration  # make multifrag work: keep accumulating across fragments
for _ in range(sample_count):
...
See also Issue #68 multifragment sequencenumbering.
https://github.com/Dash-Industry-Forum/dash-live-source-simulator/wiki
Multiplexed Content
For eMBMS, better robustness can be achieved by multiplexing audio and video segments. This can be done automatically by the server. It happens if the path to a segment has two underscores in the next to last path component, like .../V1__A1/123.m4s. In this case the two segments V1/123.m4s and V2/123.m4s are fetched and multiplexed. The corresponding thing happens for the init segments. For this to work, the MPD must be manually changed to have a multiplexed representation.
I guess this paragraph has a typo: it should say V1/123.m4s and A1/123.m4s.
So the segment payloads (video + audio) are merged? Do you have an example of how the manifest should be edited for such a live stream? If one segment file contains both video and audio payloads, how should the AdaptationSet/Representation elements be written?
normal live: http://livesim.dashif.org/livesim/testpic_2s/Manifest.mpd
Hi,
I'm trying to use the simulator with the VoD dataset from http://www-itec.uni-klu.ac.at/ftp/datasets/DASHDataset2014 to test a live source with multiple representations, without success: the VoD content analyzer always fails at some point.
I have noticed that this is because it requires certain information to be present at the MPD and media-segment level, e.g. manifest attributes like Period id and Representation contentType, among others. It also seems to parse the media segments to get some information from the metadata/boxes.
So my question is: are there any guidelines for content generation to make sure there is enough information for the VoD analyzer to run successfully? Or could you make available the tools and configs you used to generate the data at http://vm2.dashif.org/dash/vod/testpic_2s/?
Thanks
The provided UTCTiming vector:
http://vm2.dashif.org/livesim/utc_direct-head/testpic_2s/Manifest.mpd
raises an error in the conformance software:
http://dashif.org/conformance.html
The following error is shown:
Line:Col[15:89]:cvc-complex-type.2.4.a: Invalid content was found starting with element 'UTCTiming'. One of '{"urn:mpeg:dash:schema:mpd:2011":ProgramInformation, "urn:mpeg:dash:schema:mpd:2011":BaseURL, "urn:mpeg:dash:schema:mpd:2011":Location, "urn:mpeg:dash:schema:mpd:2011":Period}' is expected.
MPD validation not successful - DASH is not valid!
Upon further investigation of the provided test vector, it was found that the UTCTiming element appears before other elements such as Period and BaseURL.
But the DASH-MPD.xsd schema states that the UTCTiming element should come after all the other elements. The simple fix is to move the UTCTiming element to the end, after all the other elements.
I am working on providing a PR for this issue.
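The reordering could be sketched with ElementTree: detach the UTCTiming children of the MPD element and re-append them so they come last (illustrative code, not the simulator's actual implementation):

```python
import xml.etree.ElementTree as ET

MPD_NS = "urn:mpeg:dash:schema:mpd:2011"

def move_utctiming_last(mpd_xml):
    """Re-append UTCTiming children of the MPD element so they come last."""
    ET.register_namespace("", MPD_NS)
    root = ET.fromstring(mpd_xml)
    for elem in root.findall("{%s}UTCTiming" % MPD_NS):
        root.remove(elem)
        root.append(elem)
    return ET.tostring(root, encoding="unicode")

mpd = ('<MPD xmlns="%s">'
       '<UTCTiming schemeIdUri="urn:mpeg:dash:utc:direct:2014"/>'
       '<Period id="p0"/></MPD>' % MPD_NS)
print(move_utctiming_last(mpd))  # UTCTiming now follows Period
```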
https://testassets.dashif.org/#testvector/details/58a5da727459f8cb201b8a56
This test vector has an EventStream element with a messageData attribute; however, messageData is an attribute of the Event element according to ISO/IEC 23009-1.
In mpdprocessor.py, the create_inline_mpdcallback_elem(BaseURLSegmented) function assigns the messageData attribute to the EventStream element.
The URL throws an error for the cache-busting technique used, e.g. URL: http://livesim.dashif.org/livesim-dev/periods_1/testpic_2s/Manifest.mpd?1560849939915
Response:
Unknown file extension: .mpd?1560849939915
So it is treating the benign query string as part of the file extension.
Related issue: Dash-Industry-Forum/DASH-IF-Conformance#447
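A fix could strip the query string before inspecting the suffix; a minimal sketch (the function name is illustrative, not the simulator's actual code):

```python
import os
from urllib.parse import urlsplit

def file_extension(url):
    """Extension of the path component only, ignoring any query string."""
    path = urlsplit(url).path
    return os.path.splitext(path)[1]

print(file_extension("/livesim-dev/periods_1/testpic_2s/Manifest.mpd?1560849939915"))
# -> .mpd
```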
A DASH URL such as https://livesim.dashif.org/livesim-chunked/chunkdur_1/ato_7/testpic4_8s/Manifest300.mpd provides content with a time code burned in, based on the server's wall clock, so we can see how far behind live we are.
I don't understand how to create this. Does anyone have information on how to replicate such a DASH server quickly?
Regarding why I raised this as an issue: I was not sure how to ask questions about the project, so I raised one. If this is not the right way to ask queries, please let me know the right procedure to reach out.
I have an open issue in dash.js relating to DashLiveSim. I am putting this here for reference.
Dash-Industry-Forum/dash.js#2863
Currently there is no @availabilityTimeOffset or @availabilityTimeComplete in the MPD for low latency (chunked). But I could see a BaseURL with @availabilityTimeOffset and @availabilityTimeComplete in the MPD until a few days ago.
Is this a bug in the running simulator? Please take a look.
Can we add an option to insert inband MPD update events with a given frequency? E.g. inbandmpdupdate_30s would insert an inband update event every 30 seconds.
Hi, I tried to test my video encoded with the H.265/HEVC codec. It was okay to stream as a VoD service, but it failed when I tried it through dash-live-source-simulator:
videoCodec (video/mp4;codecs="hvc1.1.6.L93.90") is not supported.
According to the 'Guidelines for Implementation', DASH supports both H.264/AVC and H.265/HEVC. So my question is: does dash-live-source-simulator support the H.265/HEVC codec for a live streaming service? If not, how can I simulate a DASH live streaming service for UHD video?
Thanks.
The VoD analyzer uses the segment duration from the MPD and assumes it to be in seconds, without considering the timescale value. This means, e.g., that for a VoD MPD with @duration = 2000 and @timescale = 1000, the analyzer will set a duration of 2000 seconds per segment, leading to analysis failure due to detection of drift in segment durations that isn't actually there.
Instead, the analyzer should derive the segment duration in seconds using the timescale value, i.e. @duration / @timescale.
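The intended computation is simply a division by the timescale:

```python
def segment_duration_seconds(duration, timescale=1):
    """@duration is in @timescale units; divide to get seconds."""
    return duration / timescale

print(segment_duration_seconds(2000, 1000))  # 2.0 seconds, not 2000
```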
DLS accepts only int values as segment duration in config files. For segment durations under a second, an error is thrown. It should accept sub-second segment durations as well.
This line defines the segment duration in config files as int:
Track header flags in the initialization segment indicate that the track is disabled, which should not be the case (disabled tracks should be ignored during presentation). Is there any reason for this?
jean.
Only keyframes should be marked as is_sync in the mp4; however, all samples appear to be, at least in http://vm2.dashif.org/livesim/testpic_6s/Manifest.mpd. This causes decoding artefacts on ABR switches and seeks.
(Originally from Dash-Industry-Forum/dash.js#1786)
I think I see an inconsistency when using SegmentTimeline and when the testpic_2s source wraps around.
Although you can wait to request the manifest around the top of each hour, an immediate way to reproduce this seems to be to simulate a manifest pull right before the wrap around using the start_ parameter via:
START=$(($(date +%s)-3598)); wget -qO - http://livesim.dashif.org/livesim/start_$START/segtimeline_1/testpic_2s/Manifest.mpd
showing a timeline 296640000-323640000:
<S d="180000" r="150" t="296640000" />
and right after the wrap via
START=$(($(date +%s)-3602)); wget -qO - http://livesim.dashif.org/livesim/start_$START/segtimeline_1/testpic_2s/Manifest.mpd
showing a timeline 297000000-297360000:
<S d="180000" t="297000000" />
<S d="180000" />
I think in the above case the first timeline entry needs an @r of roughly 149, because the previous manifest indicated the timeline ending at 323640000.
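Assuming the testpic_2s timeline wraps at the hour boundary (timescale 90000, so the wrap point is 3600 × 90000 = 324000000), the expected repeat count can be derived as follows (a sketch of the arithmetic, not the simulator's code):

```python
TIMESCALE = 90000
SEG_DUR = 180000          # 2 s segments in timescale units
HOUR = 3600 * TIMESCALE   # wrap point: 324000000

def repeat_count(start, end, duration):
    """@r for an <S> run covering [start, end) with constant duration.

    @r counts *additional* segments, so it is one less than the total.
    """
    return (end - start) // duration - 1

print(repeat_count(297000000, HOUR, SEG_DUR))  # 149
```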
When using multiple periods, each period has a presentationTimeOffset attribute added to the SegmentTemplate.
This value should be scaled by the SegmentTemplate timescale, if present.
DASH specification says:
presentationTimeOffset : The value of the presentation time offset in seconds is the division of the value of this attribute and the value of the timescale attribute.
If you enable segment timeline output the problem is clear, audio and video adaptation sets get different timescales but use the same presentationTimeOffset value.
eg http://vm2.dashif.org/livesim-dev/segtimeline_1/periods_4/testpic_6s/Manifest.mpd
When not using segment timeline the timescale defaults to 1 and the presentationTimeOffset used is the same as the period start time.
I think the mpdprocessor needs to rescale the presentationTimeOffset when it sets the segment template timescale?
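Per the spec text quoted above, the attribute value divided by @timescale gives seconds, so writing the attribute means multiplying the period start by the timescale. A sketch of the rescaling:

```python
def presentation_time_offset(period_start_seconds, timescale):
    """@presentationTimeOffset is in @timescale units: seconds * @timescale."""
    return int(period_start_seconds * timescale)

# Same period start, different media timescales:
print(presentation_time_offset(120, 90000))  # video: 10800000
print(presentation_time_offset(120, 48000))  # audio: 5760000
```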
SCTE 67 2014, 12.1.5.4 describes the signalling of inband SCTE35 events in the MPD. It states that "the value attribute should match the PID value of the original SCTE 35 messages".
Whilst technically @value is optional, the owner of the schemeIdUri has defined its meaning, and therefore I think it is required in this case.
emsg.value is non-null ("1001"); this should be the value of @value.
I expect implementations will match the schemeIdUri/value pairs defined in the MPD and emsg boxes, and would therefore fail to associate the emsg with any event stream and discard the message.
The time sync URL (e.g. http://vm2.dashif.org/dash/time.txt) is used for determining clock sync. Clients should send a HEAD request to get the Date header from the response. Unfortunately the response does not include cache-control headers, meaning the response may be cached. Firefox will cache the entire HEAD response and not send the request again at all.
The minimum would be to set Cache-Control: no-cache to ensure the HEAD request is at least sent again. Setting Cache-Control: no-cache, no-store would mean the response cannot be cached at all.
Originally filed in shaka-project/shaka-player#606
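A minimal WSGI sketch of serving the time resource with such headers (illustrative only, not the simulator's actual handler; the body value is a placeholder):

```python
def time_app(environ, start_response):
    """Serve the time-sync text with Cache-Control so HEAD is never cached."""
    body = b"2024-01-01T00:00:00Z\n"  # placeholder; real handler returns current time
    headers = [
        ("Content-Type", "text/plain"),
        ("Cache-Control", "no-cache, no-store"),  # keep the Date header fresh
        ("Content-Length", str(len(body))),
    ]
    start_response("200 OK", headers)
    # For HEAD requests the server discards the body; the Date header is what matters.
    return [body]
```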
The current publishTime implementation is too simplistic, see #20 for some discussion.
The init mp4 file has a moov/mvex/mehd atom indicating the total duration of the VoD file. This atom is not necessary in a live DASH feed. My limited testing indicates it's not a problem, but the atom should probably still be dropped.
In the dashlivesim/vod folder, trackID is declared as 2 in the init segment (both representations), but the fragments use trackID 1.
One of the live test vectors in DASH-IF is multiple BaseURLs.
A first test case is just to be able to parse this in the client and choose one or the other.
A second test case is to provide one BaseURL which has a poorer source, with some segments missing etc., and have the client switch if needed.
As reported in Dash-Industry-Forum/dash.js#2345, there is a problem with
http://vm2.dashif.org/livesim/testpic_2s/cea608_and_segs.mpd
When you play this link with dash.js and choose to show the TTML subtitles, the timing is wrong, with a 2 s offset from the correct time.
Looking at the manifest, the startNumber is 1 for subtitles while it is 0 for the other media, which looks like a bug.
http://vm2.dashif.org/ seems to be down at the moment. Is this intentional? Will it be back any time soon?
Hi,
I have been trying to use this kind of URL recently, but the "mup_30" parameter does not seem to be applied properly (I keep getting minimumUpdatePeriod="PT0S" in the manifest):
https://vm2.dashif.org/livesim-dev/utc_direct/mup_30/segtimeline_1/periods_20/testpic_6s/multiaudio.mpd
Am I doing something wrong?
Thanks
The test URL "http://vm2.dashif.org/livesim/testpic_2s/two_regions.mpd" mentioned in https://github.com/Dash-Industry-Forum/dash-live-source-simulator/wiki/Test-URLs is not accessible.
I am getting the following error:
DASH Proxy Error: [Errno 2] No such file or directory: '/var/www/dash-live/content/testpic_2s/two_regions.mpd'
URL=/livesim/testpic_2s/two_regions.mpd
Can you please correct this?
Hi,
My environment:
I'm seeing issues identical to those in issue #14, where the BaseURL is always http. Having done some tests, the HTTPS environment variable doesn't get passed through in mod_wsgi/mod_dashlivesim.py.
According to Graham Dumpleton on Twitter:
The CGI HTTPS variable is deliberately stripped after setting wsgi.url_scheme, to make applications which used HTTPS in a non-WSGI-conformant way fix their code. So if the application is using the old CGI HTTPS variable, the application should be fixed.
After a very quick search, I can't see any documentation to back this up, but it certainly confirms what I am seeing.
Hi! I'm currently using the DASH simulator (with my own VoD content) for testing purposes; however, I am unable to generate the config file with the VoD analyzer tools.
The failure occurs when the VOD analyzer begins to analyze video content through the video segmenter when meeting the content limitation (as explained here: https://github.com/Dash-Industry-Forum/dash-live-source-simulator/wiki/Content-Limitations).
Does anyone have the same issue? Please let me know and thanks in advance for any advice/solutions!
Event messages (as used by scte35.py) use the version 0 variant of the emsg box. The DASH-IF client has been updated to support version 1 event messages (well, almost; see Dash-Industry-Forum/dash.js#3196), but this is hard to test without a ready source for them.
The following diff updates emsg.py to handle both version 0 and version 1 event messages.
48c48
< from .structops import uint32_to_str
---
> from .structops import uint32_to_str, uint64_to_str
56c56
< emsg_id=0, messagedata=""):
---
> emsg_id=0, messagedata="", version=0):
62a63
> self.version = version
67d67
< size = 12 + 4*4 + len(self.scheme_id_uri) + 1 + len(self.value) + 1 + len(self.messagedata)
69,78c69,94
< parts.append(uint32_to_str(size))
< parts.append("emsg")
< parts.append("\x00\x00\x00\x00")
< parts.append(self.scheme_id_uri + "\x00")
< parts.append(self.value + "\x00")
< parts.append(uint32_to_str(self.timescale))
< parts.append(uint32_to_str(self.presentation_time_delta))
< parts.append(uint32_to_str(self.event_duration))
< parts.append(uint32_to_str(self.emsg_id))
< parts.append(self.messagedata)
---
> if self.version == 0:
> size = 12 + 4*4 + len(self.scheme_id_uri) + 1 + len(self.value) + 1 + len(self.messagedata)
> parts.append(uint32_to_str(size))
> parts.append("emsg")
> parts.append("\x00") #version
> parts.append("\x00\x00\x00") #flags
> parts.append(self.scheme_id_uri + "\x00")
> parts.append(self.value + "\x00")
> parts.append(uint32_to_str(self.timescale))
> print "presentation_time_delta = %d"%self.presentation_time_delta
> parts.append(uint32_to_str(self.presentation_time_delta))
> parts.append(uint32_to_str(self.event_duration))
> parts.append(uint32_to_str(self.emsg_id))
> else:
> size = 12 + 3*4 + 8 + len(self.scheme_id_uri) + 1 + len(self.value) + 1 + len(self.messagedata)
> parts.append(uint32_to_str(size)) # size
> parts.append("emsg") #box name
> parts.append("\x01") #version
> parts.append("\x00\x00\x00") #flags
> parts.append(uint32_to_str(self.timescale)) # timescale
> parts.append(uint64_to_str(self.presentation_time_delta)) #presentation_time
> parts.append(uint32_to_str(self.event_duration)) #duration
> parts.append(uint32_to_str(self.emsg_id)) #id
> parts.append(self.scheme_id_uri + "\x00") #scheme_id_uri
> parts.append(self.value + "\x00") # value
> parts.append(self.messagedata) #messagedata
91c107
< message_data=""):
---
> message_data="",version=0):
93c109
< emsg = Emsg(scheme_id_uri, value, timescale, presentation_time_delta, event_duration, emsg_id, message_data)
---
> emsg = Emsg(scheme_id_uri, value, timescale, presentation_time_delta, event_duration, emsg_id, message_data, version)
If the input segment uses a version 0 tfdt box, then when baseMediaDecodeTime is modified, tfdt may change to version 1 as a result of baseMediaDecodeTime growing beyond the limit of 32 bits (see process_tfdt in mediasegmentfilter.py).
As a result, the size of tfdt increases and the offsets stored in saio (referencing the Sample Auxiliary Information described by saiz) may no longer be valid, resulting in an inconsistent (broken) file.
See this example input file opened in isoviewer:
We can see that the single offset in saio equals 721. When mediasegmentfilter.py increases the size of tfdt (which in this file is version 0), senc will move and the offset will no longer be valid.
Since dash-live-source-simulator modifies the size of boxes while rewriting them, there may be more such internal references that need fixing.
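A sketch of the kind of compensation needed when tfdt grows from version 0 to version 1 (the 64-bit baseMediaDecodeTime adds 4 bytes); the function name and offset model are assumptions for illustration, not the simulator's actual code:

```python
TFDT_GROWTH = 4  # version 0 -> 1: baseMediaDecodeTime grows from 32 to 64 bits

def adjust_saio_offsets(offsets, tfdt_end_offset):
    """Shift saio offsets that point past the enlarged tfdt box.

    `offsets` are absolute file offsets from the saio box; any offset
    located at or after the end of the original tfdt box must move by
    the growth amount, since everything after tfdt shifts down.
    """
    return [o + TFDT_GROWTH if o >= tfdt_end_offset else o for o in offsets]

print(adjust_saio_offsets([721], tfdt_end_offset=200))  # [725]
```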
An lmsg "token" is inserted at the end of a presentation, e.g. for a time-limited or periodic service. However, there is no lmsg at the end of each period for multi-period content.
This should be fairly easy to add, by having the period information in the URL and checking whether the currently requested segment is the last one of a period.