
Comments (16)

openspeedtest avatar openspeedtest commented on July 28, 2024 1

The browser limit for parallel XHR connections is six.
You mean latency measurement during the download test and the upload test, right?
Because of the limited number of parallel connections we cannot make new requests, but we can measure the load events of each request. I am not sure whether that will be enough, because on mobile devices or slower connections the browser will keep the additional requests pending.

I need to test this; only then can I say anything about it.
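A minimal sketch of the load-event idea above (my own illustration, not the OpenSpeedTest code): timestamp each XHR's lifecycle so that queueing delay, i.e. the browser holding a request past its per-host connection limit, shows up as the gap between `send()` and the first `progress` event.

```javascript
// Sketch: per-request timing via XHR lifecycle events.
function timedRequest(url) {
  return new Promise((resolve) => {
    const xhr = new XMLHttpRequest();
    const t = { queued: performance.now(), firstByte: null, done: null };
    xhr.addEventListener("progress", () => {
      // First progress event ~= first byte received.
      if (t.firstByte === null) t.firstByte = performance.now();
    });
    xhr.addEventListener("loadend", () => {
      t.done = performance.now();
      resolve(t);
    });
    xhr.open("GET", url);
    xhr.send();
  });
}

// Pure helper: how long the request sat queued before any data arrived.
function queueDelay(t) {
  return t.firstByte === null ? null : t.firstByte - t.queued;
}
```

A large `queueDelay` on the 7th+ request of an HTTP/1.1 test would indicate the browser queued it rather than opening another connection.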

from speed-test.

westurner avatar westurner commented on July 28, 2024 1

So you are asking: "Can we implement 8 parallel requests to the server" and also measure latency at the same time?

Yes. That is what dslreports_8dn.conf does, for example.

https://github.com/tohojo/flent/blob/master/flent/tests/dslreports_8dn.conf

Parallel connection limits only apply to a single host, if I remember correctly. But I have never tested multiple connections to more than one host.

What could affect how many of the HTTP resources referenced by HTML/JS/WASM are downloaded in parallel, and at what time cost?

  • System load noise

  • Same-origin policy AND/OR .*Cookie.* origin policy

  • HTTP "pipelining" / request multiplexing

    • HTTP/1.1
    • HTTP/2
    • HTTP/3 (UDP)
  • Max active connections

  • TCP per-IP host connection limits

  • UDP per-IP host connection limits

  • How fetch/XHR JS/WASM tasks map to actual tasks in separate processes, threads, or coroutines (with messages over channels) that share the same networking stack

    • [links above re browser threads, processes, ipc, actors beneath the W3C DOM APIs]

  • /? domain sharding CDN
    https://www.google.com/search?q=domain+sharding+CDN

Browsers limit the number of active connections for each domain. To enable concurrent downloads of assets exceeding that limit, domain sharding splits content across multiple subdomains. When multiple domains are used to serve multiple assets, browsers are able to download more resources simultaneously, resulting in a faster page load time and improved user experience.

The problem with domain sharding, in terms of performance, is the cost of extra DNS lookups for each domain and the overhead of establishing each TCP connection.

The initial response from an HTTP request is generally an HTML file listing other resources such as JavaScript, CSS, images and other media files that need to be downloaded. As browsers limit the number of active connections per domain, serving all the required resources from a single domain could be slow as assets need to be downloaded sequentially. With domain sharding, the required downloads are served from more than one domain, enabling the browser to simultaneously download needed resources. Multiple domains, however, is an anti-pattern, as DNS lookups slow initial load times.

HTTP/2 supports many concurrent requests over a single connection (bounded by the server's advertised stream limit), making domain sharding obsolete when HTTP/2 is enabled.
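For pre-HTTP/2 servers, the sharding described above can be sketched as follows (my own illustration; `assetsN.example.com` is a placeholder hostname scheme, not anything from this project):

```javascript
// Sketch: spread asset paths across N shard subdomains so an HTTP/1.1
// browser can open up to six connections *per* shard.
function shardUrl(path, shards) {
  // Stable hash: the same path always maps to the same shard,
  // which keeps browser and CDN caches warm.
  let h = 0;
  for (const c of path) h = (h * 31 + c.charCodeAt(0)) >>> 0;
  return `https://assets${h % shards}.example.com${path}`;
}
```

The stable mapping matters: if a path hopped between shards across page loads, each shard would re-download (and re-cache) the same asset, negating the benefit.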

Which servers have HTTP/2 enabled by default?
Which servers have HTTP/3 enabled by default, with less TCP handshake and TCP load-balancing latency?

If you really need to do this, set up two servers and try to send six parallel XHR requests to each server at the same time. If 12 connections work simultaneously, then you can use 8 for download and the others for latency.
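One way to verify whether the 12 transfers truly overlapped, assuming you record each transfer's start and end timestamps (e.g. `performance.now()` around each XHR): a sweep-line over the intervals gives the peak number of simultaneous connections.

```javascript
// Sketch: peak concurrency from recorded transfer intervals.
function maxOverlap(intervals) {
  const events = [];
  for (const { start, end } of intervals) {
    events.push([start, 1], [end, -1]);
  }
  // Ends sort before starts at equal timestamps, so back-to-back
  // (touching) transfers do not count as overlapping.
  events.sort((a, b) => a[0] - b[0] || a[1] - b[1]);
  let cur = 0, peak = 0;
  for (const [, delta] of events) {
    cur += delta;
    peak = Math.max(peak, cur);
  }
  return peak;
}
```

If `maxOverlap` of the 12 recorded intervals comes back well below 12, the browser serialized some of the requests despite the two hosts.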

Instead of only providing documentation links, or along with them, write what you are trying to say in a few sentences.

Reference material discovered otw to answering my own async question:
https://en.wikipedia.org/wiki/Web_performance#Optimization_techniques

  • Domain sharding: if the static-file web server or the client doesn't support HTTP/2, domain sharding is advisable
    • How are the same-origin policy, CSP, and CORS relevant to remote JS, CSS, and WASM resources from other domains?

from speed-test.

westurner avatar westurner commented on July 28, 2024 1

FWIU the max_connections_per_domain setting varies amongst browsers:

  • "Max parallel HTTP connections in a browser?"

https://stackoverflow.com/questions/985431/max-parallel-http-connections-in-a-browser/14768266#14768266

No hurry here. Cool tool and thanks as well

from speed-test.

westurner avatar westurner commented on July 28, 2024

+someday +maybenp

Thought the RRUL thing might be worth sharing

from speed-test.

openspeedtest avatar openspeedtest commented on July 28, 2024

see #17
You can run a stress test in one browser window:
http://xxx.xxx.xxx.xxx:3000?s=y
and run a ping test in another window:
http://xxx.xxx.xxx.xxx:3000?t=p&p=300
and you can see the added latency.

Or you can use two browsers for this.
Testing latency while a stress test is running will show the added latency.

You can run the speed test and the latency test indefinitely.

t=y runs a speed test for a full year
t=p&p=100000 sends 100,000 (1 lakh) pings

from speed-test.

westurner avatar westurner commented on July 28, 2024

jupyterlab/jupyterlab#1639 (comment) :

  • "4 Ways to Communicate Across Browser Tabs in Realtime"

  • JupyterLite implements the pyolite kernel in a Web Worker that doesn't block the main thread. AFAIU, web workers are possible without the overhead of multiple browser tabs.

from speed-test.

openspeedtest avatar openspeedtest commented on July 28, 2024

Web workers also have the same limitations.
XHR is asynchronous and does not block the code, so moving it to a web worker makes no sense.

Web workers are for CPU-intensive tasks.

I am no expert. When I tested web workers in my app, I saw no improvement.

from speed-test.

westurner avatar westurner commented on July 28, 2024

Service Workers map onto all the processes in the browser task manager:

From https://web.dev/workers-overview/ :

  • A page can spawn multiple web workers, but a single service worker controls all the active tabs under the scope it was registered with.
  • The lifespan of the web worker is tightly coupled to the tab it belongs to, while the service worker's lifecycle is independent of it. For that reason, closing the tab where a web worker is running will terminate it, while a service worker can continue running in the background, even when the site doesn't have any active tabs open.

HTTP Same-origin; HTTP-pipelining in HTTP/1.1, HTTP/2, and HTTP/3; and parallel download tests:

From https://developer.mozilla.org/en-US/docs/Web/API/Service_Worker_API/Using_Service_Workers :

Service workers are restricted to running across HTTPS for security reasons.
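A minimal registration sketch illustrating that restriction (my own example; `/sw.js` is a placeholder worker script, not a file from this project):

```javascript
// Sketch: service worker registration only succeeds in a secure context
// (HTTPS, or localhost). Elsewhere navigator.serviceWorker is absent,
// so we bail out rather than throw.
function registerWorker() {
  if (typeof navigator === "undefined" || !("serviceWorker" in navigator)) {
    return null; // not a secure browser context
  }
  return navigator.serviceWorker.register("/sw.js", { scope: "/" });
}
```

So a speed test served over plain HTTP to a LAN IP, as in the URLs above, could not rely on a service worker at all.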

from speed-test.

openspeedtest avatar openspeedtest commented on July 28, 2024

Looks like a bot... You failed the Turing test.

from speed-test.

westurner avatar westurner commented on July 28, 2024

What looks like a bot?

To download multiple files over HTTP in parallel (in order to implement RRUL / dslreports_8dn.conf), HTTP pipelining and the same-origin policy will be relevant: does the browser download the 8 files serially or in parallel?

from speed-test.

westurner avatar westurner commented on July 28, 2024

Can we (also) download in ~actual parallel with the Service Worker API? Can we see separate parallel fetch subprocesses (that can run on separate cores) in the browser Task Manager?

Is WASM (as in JupyterLite, which could interactively display speedtest reports) of any advantage for parallel download and/or upload speed tests that report "bufferbloat", which indicates that low-latency applications will lag?

from speed-test.

openspeedtest avatar openspeedtest commented on July 28, 2024

So you are asking: "Can we implement 8 parallel requests to the server" and also measure latency at the same time?

Parallel connection limits only apply to a single host, if I remember correctly. But I have never tested multiple connections to more than one host.

If you really need to do this, set up two servers and try to send six parallel XHR requests to each server at the same time. If 12 connections work simultaneously, then you can use 8 for download and the others for latency.

Instead of only providing documentation links, or along with them, write what you are trying to say in a few sentences.

from speed-test.

openspeedtest avatar openspeedtest commented on July 28, 2024

Yes, we need to test this on an HTTP/2 server. I am busy with some other work now; I will come back to this later.
If you know JavaScript, deploy the OpenSpeedTest Docker image, enable HTTP/2, and log the download event data to the console.
After that, run the test by adding xhr=12 or 20.

from speed-test.

openspeedtest avatar openspeedtest commented on July 28, 2024

I have not tested this; I was thinking along these lines.
When we start the test, as soon as the first DATA request receives data we can send a PING request, then the next DATA request, and before each subsequent DATA request we can send another PING request. That will measure latency under load. With HTTP/1.1 this works within the six-request limit; with HTTP/2 and above the requests go in parallel. I am currently working on some other stuff and will come back to this soon. Thanks for the valuable insights.
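A sketch of that interleaving (my own illustration; `/ping` and `/data` are placeholder endpoints, with `/ping` assumed to return a tiny uncacheable response):

```javascript
// Sketch: before each DATA request, fire a small PING and record its
// round-trip time as "latency under load".
async function pingThenData(samples) {
  const t0 = performance.now();
  await fetch("/ping", { cache: "no-store" }); // tiny round trip
  samples.push(performance.now() - t0);        // latency under load
  return fetch("/data");                       // the real payload
}

// Pure helper: summarize the collected latency samples.
function median(samples) {
  if (samples.length === 0) return null;
  const s = [...samples].sort((a, b) => a - b);
  const mid = Math.floor(s.length / 2);
  return s.length % 2 ? s[mid] : (s[mid - 1] + s[mid]) / 2;
}
```

Comparing the median of these loaded samples against an idle-latency baseline is essentially the bufferbloat measurement discussed above.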

from speed-test.

openspeedtest avatar openspeedtest commented on July 28, 2024

I will test this soon.
Chat-GPT
Question
Can a single HTTP/2 TCP connection be a bottleneck?
Answer
It is possible that a single HTTP/2 connection using a single TCP connection could become a bottleneck in certain situations. This is because HTTP/2 multiplexing allows multiple requests to be sent and received over a single TCP connection at the same time, but the total bandwidth available to all of these requests is still limited by the capacity of the underlying TCP connection.

If the total combined bandwidth required by all of the multiplexed requests exceeds the capacity of the TCP connection, the connection could become a bottleneck and limit the overall performance. This could occur, for example, if the client is trying to download a large number of very large files concurrently over a single HTTP/2 connection.

To avoid this bottleneck, it may be necessary to use multiple HTTP/2 connections or to switch to a different protocol that is better suited for transferring large amounts of data, such as BitTorrent or FTP.

from speed-test.

openspeedtest avatar openspeedtest commented on July 28, 2024

@westurner
HTTP 1.1
Screenshot 2023-01-26 at 6 51 05 AM
HTTP 2 & 3
Screenshot 2023-01-26 at 6 50 53 AM
We can send more than 6 XMLHttpRequests to the server using HTTP/2 and HTTP/3.
HTTP/1.1 will use 6 parallel connections to the server.
HTTP/2 and HTTP/3 will use a single connection to the server.
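This can also be confirmed from script rather than screenshots: the Resource Timing API exposes the protocol the browser actually negotiated for each fetched resource (a sketch assuming it runs in a browser after the resource has loaded):

```javascript
// Sketch: read the negotiated ALPN protocol for a fetched resource.
// Returns null when no timing entry exists for the URL.
function negotiatedProtocol(url) {
  const entries = performance.getEntriesByName(url);
  const last = entries[entries.length - 1];
  return last ? last.nextHopProtocol : null; // "http/1.1", "h2", or "h3"
}
```

Logging this for each of the 12 or 20 XHR URLs would show directly whether they all shared one h2/h3 connection or fell back to six http/1.1 connections.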

from speed-test.
