Time Penalties for Congestion
Comments (7)
Improving vehicle speed estimates using street network centrality (2016), with pdf here. They simply estimate centrality and then use a linear regression against estimates of vehicular speeds, but don't publish any of the coefficients. They nevertheless show one interesting, and potentially relevant, result in Fig 7, relating speed estimates to both network closeness and betweenness centrality. On secondary or tertiary roads, speeds decrease with increases in both centrality and closeness, but on residential or lower-category roads, speed increases with increasing centrality. That could, however, reflect some kind of rural effect towards the network periphery, and need not suggest any general pattern.
The results of that figure alone might be sufficient to inform a general heuristic, although that would also need to be scaled by overall city size. The figure shows speeds along 50 km/h segments decreasing to under 20 km/h along the most central edges, although closeness appears to have an even greater influence than centrality alone (where closeness is measured as the inverse of the sum of distances to all other nodes, and edge closeness is just averaged from the measures at each pair of terminal nodes).
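The closeness measures described in parentheses above can be sketched directly. This is an illustrative example only, not code from m4ra: node closeness as the inverse of the sum of shortest-path distances to all other nodes, with edge closeness averaged from an edge's two terminal nodes. The toy graph and its weights are invented.

```python
# Sketch of the closeness definitions quoted above; toy data only.
import heapq

def dijkstra(adj, src):
    """Shortest-path distances from src over a weighted adjacency dict."""
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

def node_closeness(adj):
    """Closeness = inverse of the sum of distances to all other nodes."""
    return {u: 1.0 / sum(d for v, d in dijkstra(adj, u).items() if v != u)
            for u in adj}

def edge_closeness(adj):
    """Edge closeness = mean of the closeness at its two terminal nodes."""
    cl = node_closeness(adj)
    return {(u, v): 0.5 * (cl[u] + cl[v])
            for u in adj for v in adj[u] if u < v}

# A toy 4-node network:
adj = {
    "a": {"b": 1.0, "c": 2.0},
    "b": {"a": 1.0, "c": 1.0, "d": 3.0},
    "c": {"a": 2.0, "b": 1.0, "d": 1.0},
    "d": {"b": 3.0, "c": 1.0},
}
print(edge_closeness(adj))
```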
from m4ra.
One likely viable approach: get data from https://movement.uber.com/ and use that to calibrate scales between centrality and their estimates. That should give a rough idea of the most appropriate scaling of centrality values. Then use that to extract the best linear coefficient for as many cities as possible, and relate that coefficient to city size to provide a final heuristic for estimating general travel times from centrality.
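The two-step calibration proposed above can be sketched as follows. All data here are invented stand-ins: per-city edge centrality and Uber Movement speed estimates for step one, then regression of the per-city coefficients against (log) city size for step two.

```python
# Hypothetical sketch of the proposed calibration; all numbers are made up.
import math

def ols_slope(x, y):
    """Closed-form least-squares slope of y against x (with intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

# Step 1: per-city linear coefficient of observed speed against centrality.
cities = {
    "city_a": {"size": 1e5, "centrality": [0.1, 0.4, 0.7], "speed": [48, 35, 22]},
    "city_b": {"size": 1e6, "centrality": [0.2, 0.5, 0.9], "speed": [45, 28, 12]},
}
coefs = {name: ols_slope(d["centrality"], d["speed"])
         for name, d in cities.items()}

# Step 2: relate those coefficients to city size (log scale) to give a
# single heuristic usable for cities without observed data.
log_sizes = [math.log10(d["size"]) for d in cities.values()]
heuristic_slope = ols_slope(log_sizes, list(coefs.values()))
print(coefs, heuristic_slope)
```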
from m4ra.
I did not test this, but here is one idea I had to get better time estimates when using dodgr, coupled with a mobility model of a given region: "load" the network edges iteratively with dodgr_flows_aggregate, and adjust the speeds on the edges according to some capacity-speed function.
This approach is used in "Predicting commuter flows in spatial networks using a radiation model based on temporal ranges" (https://www.nature.com/articles/ncomms6347), with a rather simple function: speed is set to zero when the flow is above some capacity threshold.
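The iterative loading idea can be sketched like this, with the simple capacity function from the cited paper: speed drops to (near-)zero once flow exceeds capacity, kept slightly above zero here so travel times stay finite. The flow-aggregation step is a hypothetical stand-in for dodgr_flows_aggregate, and all numbers are invented.

```python
# Minimal sketch of iterative network loading with a threshold
# capacity-speed function; toy data, not the dodgr implementation.

def capacity_speed(flow, capacity, freeflow_speed, jam_speed=1e-3):
    """Free-flow speed below capacity, (near-)zero above it."""
    return freeflow_speed if flow <= capacity else jam_speed

def load_network(edges, aggregate_flows, n_iter=5):
    """Iteratively aggregate flows and update edge speeds from capacity."""
    for _ in range(n_iter):
        flows = aggregate_flows(edges)  # routing on current speeds
        for e in edges:
            e["speed"] = capacity_speed(flows[e["id"]], e["capacity"],
                                        e["freeflow"])
    return edges

# Toy network of two parallel edges; the fake aggregator sends all demand
# down whichever edge is currently faster.
edges = [
    {"id": 0, "capacity": 100, "freeflow": 50.0, "speed": 50.0},
    {"id": 1, "capacity": 500, "freeflow": 30.0, "speed": 30.0},
]

def fake_aggregate(edges, demand=200):
    fastest = max(edges, key=lambda e: e["speed"])["id"]
    return {e["id"]: (demand if e["id"] == fastest else 0) for e in edges}

loaded = load_network(edges, fake_aggregate)
print([e["speed"] for e in loaded])
```

Note that all-or-nothing assignment like this can oscillate between iterations; smoother capacity functions (or averaging flows across iterations) would damp that.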
from m4ra.
Thanks for the link @FlxPo. Interestingly, their best estimate of travel times gives a correlation coefficient of 0.639, or 0.752 with the switch-off capacity limitation you describe. I set up a separate repo to perform calibration experiments against the Uber Movement data. Straight-up m4ra::m4ra_times_single_mode() values give a correlation with observed (Uber Movement) values of 0.720, which is pretty similar. Any modification to account for network centrality only reduces that correlation.
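The comparison metric behind those figures is just the Pearson correlation between modelled and observed travel times. A minimal sketch, with invented stand-ins for the m4ra estimates and Uber Movement observations:

```python
# Pearson correlation between modelled and observed times; fake data.
import math

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

modelled = [120, 300, 450, 610, 900]   # seconds, fake m4ra times
observed = [100, 260, 400, 580, 820]   # seconds, fake observed times
print(round(pearson(modelled, observed), 3))
```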
This leads to the conclusion that perhaps the only empirically-justifiable modification is an overall rescaling of m4ra travel time estimates to the scales represented in empirical data. Empirical travel times are, however, generally much faster than the m4ra estimates, by a factor of 2-3. That is likely because m4ra times rely on the weighting profiles taken from routino, which have maximum speeds well below likely actual speeds: 25 km/h on residential, 40 on tertiary, 55 on secondary, and 65 on primary roads. Actual maximum speeds are obviously well above those.
The algorithm needs to be modified to account for the actual maximum speeds given in OSM data, using those to override the values in the weighting profile. That may in turn change relationships with observed values, such that effects of centrality become different / observable ...?
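The proposed override can be sketched as follows: take each edge's speed from the OSM "maxspeed" tag when present, falling back to the profile default for the highway type otherwise. The profile values are the routino-derived ones quoted above; the tag parser here is a simplification covering only the common "NN" and "NN mph" forms, and all function names are hypothetical.

```python
# Sketch: OSM maxspeed tag overrides the weighting-profile default.

PROFILE_SPEEDS = {  # km/h, as quoted above
    "residential": 25, "tertiary": 40, "secondary": 55, "primary": 65,
}

def parse_maxspeed(tag):
    """Parse an OSM maxspeed tag to km/h; None if absent or unparseable."""
    if not tag:
        return None
    tag = tag.strip().lower()
    try:
        if tag.endswith("mph"):
            return float(tag[:-3]) * 1.609344
        return float(tag)
    except ValueError:
        return None  # e.g. "none", "signals"

def edge_speed(highway, maxspeed_tag):
    """OSM maxspeed overrides the profile default when present."""
    tagged = parse_maxspeed(maxspeed_tag)
    return tagged if tagged is not None else PROFILE_SPEEDS.get(highway)

print(edge_speed("residential", None))   # profile default: 25
print(edge_speed("residential", "50"))   # overridden: 50.0
print(edge_speed("primary", "40 mph"))   # converted to km/h
```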
from m4ra.
Ah, the sample data I've been working with for Brussels, Belgium cover the centre of the city only, and exclude the enormous motorway ring around the city centre. So the m4ra times can't reflect routing along the motorways. I'll repeat the calibration in another city...
from m4ra.
The calibration procedure for vehicular travel times is documented in a separate repository: https://github.com/UrbanAnalyst/ttcalib
from m4ra.
The procedure for this is now stable and sufficiently documented in the ttcalib repo linked above. It ultimately indicates that centrality makes very little difference at all, so is not even worth considering, but that traffic lights are the single most important factor. Instead of the default values of traffic lights = 8 seconds and turn penalty = 7.5 seconds, the calibration exercise suggests an optimal traffic-light penalty of 16 seconds, and a turn penalty of only 1 second.
The most important thing here for inter-city comparisons is to have standardised values. These particular values can easily be refined through re-application to other areas at any later stage, but are sufficiently empirically justifiable to proceed on for now.
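Applying those calibrated penalties to a route time works out as base edge-traversal time plus 16 s per traffic light and 1 s per turn. A minimal sketch with an invented route representation and made-up numbers:

```python
# Sketch of route time with the calibrated penalties quoted above.

TRAFFIC_LIGHT_PENALTY = 16.0  # seconds, from the ttcalib calibration
TURN_PENALTY = 1.0            # seconds

def route_time(edges, n_lights, n_turns):
    """edges: list of (length_m, speed_kmh); penalties added on top."""
    base = sum(length / (speed / 3.6) for length, speed in edges)
    return base + n_lights * TRAFFIC_LIGHT_PENALTY + n_turns * TURN_PENALTY

# 2 km at 25 km/h plus 1 km at 40 km/h, with 3 lights and 5 turns:
t = route_time([(2000, 25.0), (1000, 40.0)], n_lights=3, n_turns=5)
print(round(t, 1))  # 288 s + 90 s base, plus 48 s lights + 5 s turns = 431.0
```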
from m4ra.