Comments (30)
from vroom.
For the sake of completeness: you might want to check out vroom-express, an expressjs-based wrapper to use VROOM with HTTP POST requests.
I saw the express project. If I remember correctly, it basically shells out to invoke the binary.
The PR sashakh made is great for C++ projects. I didn't try creating V8 bindings using SWIG or something based on his work, but I'm sure it's possible with some effort.
My point in commenting above is that I generally think it makes the project more useful.
I can see how useful this could be for integration in other pieces of software. At the moment, a loader could build a problem from a string (or file name) and solving would return a string (or a rapidjson object).
I think a proper library would require to expose more than just this. For example a problem object that could be directly populated, and also an object to represent a solution and use it directly after solving.
The idea is to use it as a native library in an Android application (@fijemax)
This sounds cool. But how do you plan to re-use the solution output, just parse it again in the client application?
The idea with a library is to share objects, not to exchange and parse files.
to share objects
Precisely what I mean: we can't do this right now. At the moment, there is no "solution" object that could be shared, only code reaching out all over the place to write the expected json to output.
I think this should be considered in the light of #44 (I just filed that issue to write down things I had in mind before).
This is probably getting nearer with the refactor for #44, as involved objects (locations, jobs, vehicles, routes, input problem, solution etc.) now have a proper data representation.
Hi,
I'm using vroom as a library, with this simple makefile patch:
diff --git a/src/makefile b/src/makefile
index c729fcf..0ff069f 100644
--- a/src/makefile
+++ b/src/makefile
@@ -10,6 +10,7 @@ LDLIBS = -lboost_system -lboost_regex -lboost_log -lboost_log_setup -lpthread -l
 # Using all cpp files in current directory.
 MAIN = ../bin/vroom
+LIB = ../lib/libvroom.a
 SRC = $(wildcard *.cpp)\
 	$(wildcard ./algorithms/*.cpp)\
 	$(wildcard ./routing/*.cpp)\
@@ -34,11 +35,14 @@ endif
 OBJ = $(SRC:.cpp=.o)
 # Main target.
-all : $(MAIN)
+all : $(MAIN) $(LIB)
 $(MAIN) : $(OBJ) main.o
 	$(CXX) $(CXXFLAGS) -o $@ $^ $(LDLIBS)
+$(LIB) : $(OBJ)
+	$(AR) cr $@ $^
+
 # Building .o files.
 %.o : %.cpp %.h
 	$(CXX) $(CXXFLAGS) -c $< -o $@
And my code then looks like:
input input_data(NULL, false); // geometry = false

// Single vehicle starting at p[0].
input_data.add_vehicle(0,
                       optional_coords_t({p[0].pos->lon, p[0].pos->lat}),
                       boost::none);

// One job per remaining point.
for (unsigned i = 1; i < size; i++) {
  input_data.add_job(i, optional_coords_t({p[i].pos->lon, p[i].pos->lat}));
}

input_data._max_cost_per_line.assign(size, 0);
input_data._max_cost_per_column.assign(size, 0);
input_data._matrix = get_matrix(osrm, input_data._locations,
                                input_data._max_cost_per_line,
                                input_data._max_cost_per_column);

solution sol = input_data.solve(1); // nb_threads = 1
if (sol.code != 0) {
  err("sol.code %d: %s\n", sol.code, sol.error.c_str());
  return;
}

for (const auto& r : sol.routes) {
  unsigned i = 0;
  for (const auto& s : r.steps) {
    // const location_t& l = s.location;
    // dbg("%lu: (%f,%f)\n", s.job, l.lon.get(), l.lat.get());
    n[i] = s.job;
    i++;
  }
  n[0] = 0;
}
However #69 will create certain problems with continuous runs.
@sashakh thanks for sharing, glad to know this works fine for you. The add_job and add_vehicle functions were added with that exact usage in mind.

Ideally you should not have to worry about internal members like _max_cost_per_[line|column] and calling get_matrix yourself. You can avoid that part by passing your osrm object as first argument in the input ctor (instead of NULL). In the current code-base, this object is constructed in vroom/src/utils/input_parser.cpp (lines 31 to 47 in d75bf82), depending on whether we use libosrm or osrm-routed.

Then calling solve on an input object will automatically take care of setting up the matrix via set_matrix.
@sashakh could you submit your makefile patch as a pull request when you find time? I'd be happy to merge it in. The support for library use is still quite experimental, but it does seem relevant for several use-cases.
@sashakh creating the bin and lib directories from makefile rules is probably the more convenient option. Thanks!
FYI: the next step toward this landed with #84, introducing an input::set_matrix(matrix<cost_t>&&) function that is used throughout the codebase (the _matrix member in input is now private). See the updated libvroom.cpp example for usage from a C++ context.
I flagged this as an experimental feature in the v1.2.0 release. It should be fully working. Yet I think we should keep that ticket open for reference, as the C++ API might still evolve. Also there is no proper documentation besides the examples in libvroom_examples/libvroom.cpp.
I second this request. I think it'd be good to have JavaScript and/or Python wrappers so that the vroom solver can be invoked from other high-level languages.
@dbhoot thanks for your interest. This ticket was primarily about compiling a C++ library, which is now effective, and I've had a couple reports from people using this successfully in their own C++ code. The reason I did not close here is that I still consider it as experimental (read "I don't want to be bound to the C++ API, and any internal change will not be considered as a breaking change"). I try to keep the example file up to date though.
I don't have a need myself for bindings to other languages, but I definitely see how that might be useful to others. If you're interested in setting something up, then feel free to open a dedicated ticket here to discuss the best way to proceed.
For the sake of completeness: you might want to check out vroom-express, an expressjs-based wrapper to use VROOM with HTTP POST requests.
I struggled a little bit trying to get libvroom to run with Rust, gave up and shelled out to the binary instead.
I did the same thing in Kotlin, as it was difficult and there wasn't enough time to build JNI bindings for it.
That said, it would be mega useful if it were easier to use than the above methods, as spawning a process is expensive on the JVM.
@AwesomeIbex not sure exactly what you're trying to achieve here as the purpose of libvroom is to call the native functions from C++.
If you're interested in using it from your C++ code, then you'll find an example on how to link the library and how to setup and solve a problem.
If you're interested in bindings for other languages, then please check my previous comment above that still applies to date.
Hi,
I'm trying to use the vroom lib from Java. I thought about using JavaCPP (https://github.com/bytedeco/javacpp) and, if possible, creating a module in JavaCPP Presets (https://github.com/bytedeco/javacpp-presets). I am not a C++ expert. I would need some help creating the cppbuild.sh file (https://github.com/bytedeco/javacpp-presets/wiki/Create-New-Presets#the-cppbuildsh-file).
Thanks in advance.
I'm not familiar at all with JavaCPP (and not that much with Java either), so I can't really comment on the whole process. You'll probably get better generic advice from the JavaCPP maintainers.

On the specific question of the cppbuild.sh script: the example you're pointing to seems to come down to building with make. So if you don't need all the fancy multi-platform stuff, you should be able to simply paste the usual build instructions for vroom into your script, including pulling the repo and installing dependencies.

I'd be interested in any feedback if you end up using the project from Java.
Thank you for your reply. We are already using vroom from Java with the ProcessBuilder class (https://docs.oracle.com/javase/9/docs/api/java/lang/ProcessBuilder.html), same as what vroom-express does.

This is not scalable for our new use cases: between 20,000 and 40,000 requests, in real time, for optimizing routes in the morning between 8 a.m. and 10 a.m. We first want to reduce the process startup overhead.

In addition, we want to deploy vroom on our agents' Android smartphones with a cost matrix that corresponds to the geographic coverage area of our sites.

I will share the results of our work as soon as it is done.
I flagged this as an experimental feature in the v1.2.0 release. It should be fully working. Yet I think we should keep that ticket open for reference as the C++ API might still evolve. Also there is no proper documentation besides the examples in libvroom_examples/libvroom.cpp.
I think the library part of this project is a fact and already a number of people depend on it (myself included). I would therefore propose to close this issue and handle any outstanding topics as new issues.
Any thoughts?
I think the library part of this project is a fact
Yes, you're right of course. Again, my only concern here is that I don't want to be tied to the C++ API and have to use work-arounds to make it evolve in a non-breaking way.
Maybe we could advertise the C++ API but add some warnings stating that the non-breaking semantic versioning approach only applies to the json API?
Maybe we could advertise the C++ API but add some warnings stating that the non-breaking semantic versioning approach only applies to the json API?
Works for me.
👍