neurallayer / roboquant

Roboquant is a fast, flexible, user-friendly and completely free algorithmic trading platform

Home Page: https://www.roboquant.org

License: Apache License 2.0

Kotlin 28.83% CSS 0.05% JavaScript 71.12%
algo-trading automated-trading backtesting cryptocurrency high-performance jupyter-notebook kotlin machine-learning trading trading-bot trading-strategies

roboquant's People

Contributors

eapfel, idanakav, jbaron, mlake, mprey, omniv0x


roboquant's Issues

Add additional run phases to support more use cases

Support more complex sequences of run phases. For example, the following 4 phases:

warmup -> main -> validation -> closing

This would be a change to the run method. The new API could look something like this:

run(feed: Feed, timeframe: Timeframe, phases: List<Instant>, name: String)

And usage could look something like this:

val timeframe = Timeframe.past(5.years)
val phases = timeframe.offsets(30.days, 2.years, 3.years, 10.days)  
run(feed, timeframe, phases)

Feature Chart

Make it easier to visualise input features (like prices) and indicators (like RSI).

Right now only prices or metrics can be plotted, and metrics are only available after a run, so they don't provide insight into input features.

com.google.gson.JsonIOException: Failed making field 'java.lang.Throwable#detailMessage' accessible;

I encounter this exception often with JDK17 when I try to get live data or get a lot of historic data.
Exception in thread "DefaultDispatcher-worker-2" com.google.gson.JsonIOException: Failed making field 'java.lang.Throwable#detailMessage' accessible; either change its visibility or write a custom TypeAdapter for its declaring type
at com.google.gson.internal.reflect.ReflectionHelper.makeAccessible(ReflectionHelper.java:22)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory.getBoundFields(ReflectiveTypeAdapterFactory.java:158)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory.create(ReflectiveTypeAdapterFactory.java:101)
at com.google.gson.Gson.getAdapter(Gson.java:501)
at com.google.gson.Gson.fromJson(Gson.java:990)
at com.google.gson.Gson.fromJson(Gson.java:956)
at com.google.gson.Gson.fromJson(Gson.java:905)
at com.google.gson.Gson.fromJson(Gson.java:876)
at com.oanda.v20.Context.execute(Context.java:303)
at com.oanda.v20.pricing.PricingContext.get(PricingContext.java:179)
at org.roboquant.oanda.OANDALiveFeed$subscribeOrderBook$3.invokeSuspend(OANDALiveFeed.kt:139)
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)
at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:570)
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:750)
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:677)
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:664)
Suppressed: kotlinx.coroutines.DiagnosticCoroutineContextException: [StandaloneCoroutine{Cancelling}@4d705c18, Dispatchers.Default]
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make field private java.lang.String java.lang.Throwable.detailMessage accessible: module java.base does not "opens java.lang" to unnamed module @51a9ad5e
at java.base/java.lang.reflect.AccessibleObject.throwInaccessibleObjectException(AccessibleObject.java:387)
at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:363)
at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:311)
at java.base/java.lang.reflect.Field.checkCanSetAccessible(Field.java:180)
at java.base/java.lang.reflect.Field.setAccessible(Field.java:174)
at com.google.gson.internal.reflect.ReflectionHelper.makeAccessible(ReflectionHelper.java:19)
... 16 more

Improve IBKR async

Right now, timeouts are sometimes used to deal with IBKR's async behaviour. It would be better to replace these with proper wait behaviour.

Improve test coverage

Bring test coverage above 90% for the key components (running unit tests with the FULL_COVERAGE flag).

Metric Chart based on live feed shows erratic results

When using the time scale, the metric charts sometimes show a wrong graph (one x value has multiple y values). When using the step scale all is fine. This can be seen, for example, when using the Alpaca live feed provider during trading hours.

There is a mistake either in the captured metric data or, more likely, the chart itself cannot deal with very small differences in time (currently passed as a string).

Better support classic notebook environments

Right now classic and lab notebooks render differently. Classic notebooks use iframes for embedding HTML output, while lab notebooks directly insert the HTML snippet. This makes it important that the environment is detected/set correctly, otherwise the output is not rendered correctly.

Classic notebooks can also directly insert an HTML snippet, but somehow the echarts library doesn't load correctly when it is added to the main HTML page. It only works in an iframe in a classic notebook.

Possible reasons:

  1. requirejs that is loaded in a classic notebook stops echarts from loading correctly
  2. the load script that is generated by kotlin kernel contains a bug

Using DataFrames

Look into using DataFrames to produce nicer output of data in Jupyter Notebooks.

Optimize Chart toolbox options

The default chart toolbox options don't always make sense for all chart types. They should be customised per chart type.

Grouping of actions in a single event in live feeds

When subscribing to assets in live feeds, it is sometimes desirable to group actions together if they arrive shortly after each other. Otherwise every action would result in its own event.

This could be implemented in the abstract base class LiveFeed by supporting a queue where actions are buffered until a timeout is triggered, after which all of them are sent in a single Event.
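
A minimal, self-contained sketch of the queue-plus-timeout idea; the Action, Event and ActionBuffer types below are simplified stand-ins for illustration, not the actual LiveFeed API:

import java.time.Instant
import kotlinx.coroutines.*

// Simplified stand-ins, only for illustrating the buffering idea
data class Action(val payload: String)
data class Event(val actions: List<Action>, val time: Instant)

class ActionBuffer(
    private val timeoutMillis: Long = 250,
    private val publish: (Event) -> Unit,
) {
    private val buffer = mutableListOf<Action>()
    private var flushJob: Job? = null
    private val scope = CoroutineScope(Dispatchers.Default)

    @Synchronized
    fun add(action: Action) {
        buffer.add(action)
        // The first action of a new batch starts the timeout; everything that
        // arrives before it fires ends up in the same Event
        if (flushJob?.isActive != true) {
            flushJob = scope.launch {
                delay(timeoutMillis)
                flush()
            }
        }
    }

    @Synchronized
    private fun flush() {
        if (buffer.isEmpty()) return
        publish(Event(buffer.toList(), Instant.now()))
        buffer.clear()
    }
}

The real implementation would likely live inside LiveFeed itself, with publish wired to whatever currently delivers events to subscribers.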

Rounding problem on position sizes

Hi!
We've been working on porting our (already in production) crypto Backtrader strategy to Roboquant very easily, and we find the design decisions you made very sound. We did find a rounding issue on position sizes when operating with crypto coins, the same issue we had to fix on Backtrader.

Whenever we have more than one concurrent bracket order, when one of the limit orders (tp or sl) is executed and the order quantity is subtracted from the position size, the position can end up having a very small amount left due to rounding errors using doubles. We managed to solve the issue by rewriting part of the Position class (mostly the plus method), maintaining a read-only double size so the change is encapsulated in the Position. We can fork your repo and create a PR if that's ok, or we can chat about that if you want.
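
As a side note, a minimal illustration of the rounding behaviour described above (plain Kotlin, not roboquant code); keeping the internal size exact, for example as a BigDecimal behind a read-only Double, avoids the residue:

import java.math.BigDecimal

fun main() {
    // With doubles, fills that should net out to zero leave a tiny residue behind
    val remaining = 0.1 + 0.2 - 0.3
    println(remaining)   // 5.551115123125783E-17, so the position never becomes exactly flat

    // Tracking the size as BigDecimal keeps it exact
    val exact = BigDecimal("0.1") + BigDecimal("0.2") - BigDecimal("0.3")
    println(exact)       // 0.0
}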

Thanks for creating Roboquant!

Regards,
Martin

slf4j simple logger conflicts with other slf4j implementations

The inclusion of the slf4j simple logger prevents the use of other slf4j implementations. The inclusion of simplelogger.properties in the resources prevents custom log message formatting. The right way to use slf4j in a library is to only include the slf4j API, so users can choose any slf4j implementation themselves.

One solution could be to move the slf4j simple logger dependency and the properties file to another module, say roboquant-logger. That way advanced developers could simply skip that module when using roboquant with other libraries that use slf4j, while novice users would still get logging out of the box.
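
Until such a split exists, a possible workaround sketch using the Gradle Kotlin DSL; the artifact coordinates and versions below are assumptions, and the bundled simplelogger.properties would still ship inside the roboquant jar:

// build.gradle.kts (sketch, coordinates and versions are assumptions)
dependencies {
    implementation("org.roboquant:roboquant:1.0.0") {
        // Drop the bundled simple logger so another slf4j binding can be picked up
        exclude(group = "org.slf4j", module = "slf4j-simple")
    }
    // Bring your own slf4j implementation, for example logback
    implementation("ch.qos.logback:logback-classic:1.4.14")
}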

Issue with PriceQuote Parser

Hi!
I'm having some unexpected results where priceAction.getPrice(type="BID") returns a different column from the test file.

The CSV looks like this:

timestamp, bid, ask, bidsize, asksize
1556805604162,14.55,14.66,18,300
1556805604162,14.55,14.66,18,400
1556805604162,14.55,14.66,18,600
1556805604162,14.55,14.66,18,700
1556805604162,14.55,14.66,18,800

My actual data looks a little different, so I defaulted to customizing the parsers from the beginning. I then thought I was making a mistake there and simplified the data to the above.

My CSVConfig looks like this:

val csvConfig = CSVConfig(hasHeader = true,
    priceParser = PriceQuoteParser(autodetect = false, bid = 1, ask = 2, bidVolume = 3, askVolume = 4),
    timeParser = AutoDetectTimeParser(0)
)

I also tried the autodetection (usually I don't use headers either, so I added them according to the autodetect init).
I am using this test function:

override fun generate(event: Event): List<Signal> {
    val signals = mutableListOf<Signal>()
    for ((asset, priceAction) in event.prices) {
        println("Ask: ${priceAction.getPrice(type="ASK")}")
        println("Bid: ${priceAction.getPrice(type="BID")}")
    }
    return signals
}

To get this output:

Ask: 14.66
Bid: 900.0
Ask: 14.66
Bid: 900.0
Ask: 14.66
Bid: 900.0
Ask: 14.66
Bid: 800.0

So basically getPrice(type = "BID") returns my asksize column.

Modelling Shorts

Hi,
We've stumbled on a limitation that we haven't been able to circumvent while backtesting shorts. We are currently using BracketOrders for our trades, but the problem happens with every order type. When the sim broker updates the account and makes a withdrawal, it actually increases the account's cash amount when we are shorting (which is consistent with the sell operation), but it is not something that happens on a real exchange. Do you have a suggestion regarding shorting on the sim broker? Our current idea would be to create a new sim broker handling this. We could also try to support this in the current sim broker.

Regards,
Martin

Maven Archetype

Have a Maven Archetype to quickly start developing your own roboquant-based trading application.

Charts rendering too small when Jupyter output cell is hidden

When a page or output cell is not visible in a Jupyter Notebook when rendering a chart, the chart is rendered with a default width of 100px, which is too small. The only way to fix this currently is to double-click on the chart to resize it back to 100%.

Ideally the resizing should be done automatically when the visibility or size of an output cell changes.

Example of too small rendering:

[Screenshot 2022-04-30 at 13:00:45]

predefined set of CSVConfig

For common and often-used CSV file formats, have a pre-defined CSVConfig that makes it convenient to parse those files without having to configure the CSVConfig manually.
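
A sketch of what such a predefined config could look like, reusing only the constructor shapes shown in the PriceQuote parser issue earlier on this page; the import path and the quote-file column layout are assumptions:

// Sketch only: import path assumed, column layout is just an example
import org.roboquant.feeds.csv.*

object PredefinedConfigs {

    // Quote files laid out as: timestamp, bid, ask, bidsize, asksize
    fun quoteCSV() = CSVConfig(
        hasHeader = true,
        priceParser = PriceQuoteParser(autodetect = false, bid = 1, ask = 2, bidVolume = 3, askVolume = 4),
        timeParser = AutoDetectTimeParser(0)
    )
}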

Distributed back-tests and tuning

Right now, roboquant can be run on multi-core machines. However, for larger data sets and potential hyper-parameter searches this might not be sufficient.

So research is required into how to support testing and tuning on multiple machines.

Improve code documentation

In several places in the code base, documentation is missing or incomplete.
To see what is missing in a certain module, you can run:

mvn dokka:dokka -P release -pl roboquant

Before the 1.0 release, at least all key Classes and Interfaces in the core roboquant module should be well documented.

Improve crypto broker functionality

Right now the live-trading crypto brokers are not fully implemented. Help would be welcome, since I have limited exposure to crypto trading using Binance or other pure crypto brokers.

Return metrics as a map

Currently Logger.getMetric returns a list. However, in order to promote generic processing, it could return a Map with the key being a unique run. That way you are forced to deal with multi-run metrics.

Alternatively, we could have a generic toMap() function that does the same.
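
A minimal sketch of the alternative toMap() idea, using a hypothetical MetricEntry record rather than the real logger entry type:

import java.time.Instant

// Hypothetical stand-in for whatever the metrics logger records per observation
data class MetricEntry(val run: String, val time: Instant, val value: Double)

// Key = unique run, value = that run's metric observations
fun List<MetricEntry>.toMap(): Map<String, List<MetricEntry>> = groupBy { it.run }

fun main() {
    val entries = listOf(
        MetricEntry("run-1", Instant.now(), 1.0),
        MetricEntry("run-2", Instant.now(), 2.0),
    )
    println(entries.toMap().keys)   // [run-1, run-2]
}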

Finalize API v1.0

Get the API for 1.0 finalised. Main things to do:

  1. use standard logging (SLF4J)
  2. give classes consistent naming, for example implementations of Strategy should be named SomeKindStrategy
  3. better time duration API
  4. remove code/classes that are not much used so far (could be added again later)

Advanced policies

@jbaron the advanced policies section of the docs is helpful but not too clear on how to tie everything together.

In the example you gave:

class MyPolicy: Policy {
    private var rebalanceDate = Instant.MIN
    private val holdingPeriod = 20.days
    /**
     * Based on some logic determine the target portfolio
     */
    fun getTargetPortfolio() : List<Position> {
        TODO("your logic goes here")
    }
    override fun act(signals: List<Signal>, account: Account, event: Event): List<Order> {
        if (event.time < rebalanceDate) return emptyList()
        rebalanceDate = event.time + holdingPeriod
        val targetPortfolio = getTargetPortfolio()
        // Get the difference of target portfolio and the current portfolio
        val diff = account.positions.diff(targetPortfolio)
        // Transform the difference into MarketOrders
        return diff.map { MarketOrder(it.key, it.value) }
    }
    override fun reset() { rebalanceDate = Instant.MIN }
}
val roboquant = Roboquant(
    NoSignalStrategy(), // will always return an empty list of signals
    policy = MyPolicy()
)

Does getTargetPortfolio trigger the act function?

The use case I'm thinking of is to use NoSignalStrategy and generate the orders directly in the Policy when I receive some data from outside my system. What's the flow for getting that set up?

Thanks for the great work!

EsperTech support

Investigate adding EsperTech event processing (for example in strategies) to enable flexible moving window based event processing and logic. See also espertech.com for some of their product details.

Order tag is missing in new branch

Hi, we are currently upgrading to the new version and found out that the tag property was removed from . We were using it to map the client order id from Binance and we generally use it when syncing different order types. Is this being replaced with something different? We are reading in our fork, and would happily create a PR for this, but weren't sure about the rationale behind this.

Variable Required BarAdjustment for Alpaca Historical Bars

In the wrapper function for fetching historical stock bars using Alpaca, the variable used for historical price data adjustment is hardcoded to BarAdjustment.ALL.

We need to allow this variable to be set by the caller of the wrapper function, to match historical data with various adjustment settings.
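
A sketch of the suggested change, with simplified stand-ins; the enum values mirror Alpaca's raw/split/dividend/all adjustment options, but the function below is not the actual roboquant wrapper:

// Simplified stand-ins for illustration only
enum class BarAdjustment { RAW, SPLIT, DIVIDEND, ALL }

fun retrieveStockBars(
    symbol: String,
    barAdjustment: BarAdjustment = BarAdjustment.ALL,   // previously hardcoded
) {
    println("fetching $symbol bars with adjustment=$barAdjustment")
    // ... delegate to the underlying Alpaca client here ...
}

fun main() {
    retrieveStockBars("AAPL")                        // unchanged default behaviour
    retrieveStockBars("AAPL", BarAdjustment.SPLIT)   // caller-controlled adjustment
}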

Hyper parameter search

Most trading strategies are non-differentiable, but they have a lot of parameters that need to be tuned. Simple hyper-parameter searches take too much time, so research is required into viable alternatives.

Consider opening Order classes

Hi,

It'd be useful if classes from the order hierarchy, in particular BracketOrder, were made open. We've been forced to duplicate a large part of the code so we could model a specific kind of order.

TIA,
Martin Paoletta

Better support for advanced strategies

Add more building blocks to make it easier to add advanced strategies like those based on machine-learning algorithms. Right now this requires writing too much plumbing code.

Some required features:

  1. Make it easy to capture historic data (multi-dimensional)
  2. Add feature engineering capabilities
  3. Add bindings to common libraries, like XGBoost

binance enum issue

I'm unable to use BinanceLiveFeed or BinanceHistoricFeed using v1.3.0.

Throws the following:

com.fasterxml.jackson.databind.exc.InvalidFormatException: Cannot deserialize value of type com.binance.api.client.domain.general.FilterType from String "NOTIONAL": not one of the values accepted for Enum class: [MAX_POSITION, MIN_NOTIONAL, PRICE_FILTER, LOT_SIZE, MARKET_LOT_SIZE, TRAILING_DELTA, MAX_ALGO_ORDERS, MAX_NUM_ORDERS, EXCHANGE_MAX_ALGO_ORDERS, MAX_NUM_ALGO_ORDERS, ICEBERG_PARTS, PERCENT_PRICE_BY_SIDE, EXCHANGE_MAX_NUM_ORDERS, MAX_NUM_ICEBERG_ORDERS, PERCENT_PRICE] at [Source: (okhttp3.ResponseBody$BomAwareReader); line: 1, column: 1550] (through reference chain: com.binance.api.client.domain.general.ExchangeInfo["symbols"]->java.util.ArrayList[0]->com.binance.api.client.domain.general.SymbolInfo["filters"]->java.util.ArrayList[6]->com.binance.api.client.domain.general.SymbolFilter["filterType"])
com.binance.api.client.exception.BinanceApiException: com.fasterxml.jackson.databind.exc.InvalidFormatException: Cannot deserialize value of type com.binance.api.client.domain.general.FilterType from String "NOTIONAL": not one of the values accepted for Enum class: [MAX_POSITION, MIN_NOTIONAL, PRICE_FILTER, LOT_SIZE, MARKET_LOT_SIZE, TRAILING_DELTA, MAX_ALGO_ORDERS, MAX_NUM_ORDERS, EXCHANGE_MAX_ALGO_ORDERS, MAX_NUM_ALGO_ORDERS, ICEBERG_PARTS, PERCENT_PRICE_BY_SIDE, EXCHANGE_MAX_NUM_ORDERS, MAX_NUM_ICEBERG_ORDERS, PERCENT_PRICE]
at [Source: (okhttp3.ResponseBody$BomAwareReader); line: 1, column: 1550] (through reference chain: com.binance.api.client.domain.general.ExchangeInfo["symbols"]->java.util.ArrayList[0]->com.binance.api.client.domain.general.SymbolInfo["filters"]->java.util.ArrayList[6]->com.binance.api.client.domain.general.SymbolFilter["filterType"])
at com.binance.api.client.impl.BinanceApiServiceGenerator.executeSync(BinanceApiServiceGenerator.java:96)
at com.binance.api.client.impl.BinanceApiRestClientImpl.getExchangeInfo(BinanceApiRestClientImpl.java:44)
at org.roboquant.binance.Binance.retrieveAssets(Binance.kt:54)
at org.roboquant.binance.BinanceHistoricFeed.<init>(BinanceHistoricFeed.kt:46)
at org.roboquant.binance.BinanceHistoricFeed.<init>(BinanceHistoricFeed.kt:34)
at Line_8_jupyter.<init>(Line_8.jupyter.kts:1)
at java.base/jdk.internal.reflect.DirectConstructorHandleAccessor.newInstance(DirectConstructorHandleAccessor.java:67)
at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:483)
at kotlin.script.experimental.jvm.BasicJvmScriptEvaluator.evalWithConfigAndOtherScriptsResults(BasicJvmScriptEvaluator.kt:105)
at kotlin.script.experimental.jvm.BasicJvmScriptEvaluator.invoke$suspendImpl(BasicJvmScriptEvaluator.kt:47)
at kotlin.script.experimental.jvm.BasicJvmScriptEvaluator.invoke(BasicJvmScriptEvaluator.kt)
at kotlin.script.experimental.jvm.BasicJvmReplEvaluator.eval(BasicJvmReplEvaluator.kt:49)
at org.jetbrains.kotlinx.jupyter.repl.impl.InternalEvaluatorImpl$eval$resultWithDiagnostics$1.invokeSuspend(InternalEvaluatorImpl.kt:103)
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)
at kotlinx.coroutines.EventLoopImplBase.processNextEvent(EventLoop.common.kt:284)
at kotlinx.coroutines.BlockingCoroutine.joinBlocking(Builders.kt:85)
at kotlinx.coroutines.BuildersKt__BuildersKt.runBlocking(Builders.kt:59)
at kotlinx.coroutines.BuildersKt.runBlocking(Unknown Source)
at kotlinx.coroutines.BuildersKt__BuildersKt.runBlocking$default(Builders.kt:38)
at kotlinx.coroutines.BuildersKt.runBlocking$default(Unknown Source)
at org.jetbrains.kotlinx.jupyter.repl.impl.InternalEvaluatorImpl.eval(InternalEvaluatorImpl.kt:103)
at org.jetbrains.kotlinx.jupyter.repl.impl.CellExecutorImpl$execute$1$result$1.invoke(CellExecutorImpl.kt:75)
at org.jetbrains.kotlinx.jupyter.repl.impl.CellExecutorImpl$execute$1$result$1.invoke(CellExecutorImpl.kt:73)
at org.jetbrains.kotlinx.jupyter.ReplForJupyterImpl.withHost(repl.kt:665)
at org.jetbrains.kotlinx.jupyter.repl.impl.CellExecutorImpl.execute(CellExecutorImpl.kt:73)
at org.jetbrains.kotlinx.jupyter.repl.CellExecutor$DefaultImpls.execute$default(CellExecutor.kt:15)
at org.jetbrains.kotlinx.jupyter.ReplForJupyterImpl$evalEx$1.invoke(repl.kt:478)
at org.jetbrains.kotlinx.jupyter.ReplForJupyterImpl$evalEx$1.invoke(repl.kt:469)
at org.jetbrains.kotlinx.jupyter.ReplForJupyterImpl.withEvalContext(repl.kt:432)
at org.jetbrains.kotlinx.jupyter.ReplForJupyterImpl.evalEx(repl.kt:469)
at org.jetbrains.kotlinx.jupyter.messaging.ProtocolKt$shellMessagesHandler$2$res$1.invoke(protocol.kt:318)
at org.jetbrains.kotlinx.jupyter.messaging.ProtocolKt$shellMessagesHandler$2$res$1.invoke(protocol.kt:312)
at org.jetbrains.kotlinx.jupyter.JupyterExecutorImpl$runExecution$execThread$1.invoke(execution.kt:37)
at org.jetbrains.kotlinx.jupyter.JupyterExecutorImpl$runExecution$execThread$1.invoke(execution.kt:32)
at kotlin.concurrent.ThreadsKt$thread$thread$1.run(Thread.kt:30)
Caused by: com.fasterxml.jackson.databind.exc.InvalidFormatException: Cannot deserialize value of type com.binance.api.client.domain.general.FilterType from String "NOTIONAL": not one of the values accepted for Enum class: [MAX_POSITION, MIN_NOTIONAL, PRICE_FILTER, LOT_SIZE, MARKET_LOT_SIZE, TRAILING_DELTA, MAX_ALGO_ORDERS, MAX_NUM_ORDERS, EXCHANGE_MAX_ALGO_ORDERS, MAX_NUM_ALGO_ORDERS, ICEBERG_PARTS, PERCENT_PRICE_BY_SIDE, EXCHANGE_MAX_NUM_ORDERS, MAX_NUM_ICEBERG_ORDERS, PERCENT_PRICE]
at [Source: (okhttp3.ResponseBody$BomAwareReader); line: 1, column: 1550] (through reference chain: com.binance.api.client.domain.general.ExchangeInfo["symbols"]->java.util.ArrayList[0]->com.binance.api.client.domain.general.SymbolInfo["filters"]->java.util.ArrayList[6]->com.binance.api.client.domain.general.SymbolFilter["filterType"])
at com.fasterxml.jackson.databind.exc.InvalidFormatException.from(InvalidFormatException.java:67)
at com.fasterxml.jackson.databind.DeserializationContext.weirdStringException(DeserializationContext.java:1996)
at com.fasterxml.jackson.databind.DeserializationContext.handleWeirdStringValue(DeserializationContext.java:1224)
at com.fasterxml.jackson.databind.deser.std.EnumDeserializer._deserializeAltString(EnumDeserializer.java:356)
at com.fasterxml.jackson.databind.deser.std.EnumDeserializer._fromString(EnumDeserializer.java:230)
at com.fasterxml.jackson.databind.deser.std.EnumDeserializer.deserialize(EnumDeserializer.java:198)
at com.fasterxml.jackson.databind.deser.impl.MethodProperty.deserializeAndSet(MethodProperty.java:129)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:314)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:177)
at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer._deserializeFromArray(CollectionDeserializer.java:359)
at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:244)
at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:28)
at com.fasterxml.jackson.databind.deser.impl.MethodProperty.deserializeAndSet(MethodProperty.java:129)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:314)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:177)
at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer._deserializeFromArray(CollectionDeserializer.java:359)
at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:244)
at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:28)
at com.fasterxml.jackson.databind.deser.impl.MethodProperty.deserializeAndSet(MethodProperty.java:129)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:314)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:177)
at com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:323)
at com.fasterxml.jackson.databind.ObjectReader._bindAndClose(ObjectReader.java:2105)
at com.fasterxml.jackson.databind.ObjectReader.readValue(ObjectReader.java:1513)
at retrofit2.converter.jackson.JacksonResponseBodyConverter.convert(JacksonResponseBodyConverter.java:33)
at retrofit2.converter.jackson.JacksonResponseBodyConverter.convert(JacksonResponseBodyConverter.java:23)
at retrofit2.OkHttpCall.parseResponse(OkHttpCall.java:243)
at retrofit2.OkHttpCall.execute(OkHttpCall.java:204)
at com.binance.api.client.impl.BinanceApiServiceGenerator.executeSync(BinanceApiServiceGenerator.java:88)
... 36 more

Optimise resource usage

In order to run roboquant better/cheaper in virtualised environments, CPU and memory usage could be further reduced. This is especially true, for example, for running notebooks on public sites like MyBinder that have limited capacity.

Goals

  • Reduce startup times
  • Be able to run midsize back-tests with limited memory (for example a heap smaller than 512 MB)

Possible areas of interest

  1. Check if switching to OpenJ9 JDK could be an alternative (for example for the Docker image used by MyBinder.org)
  2. See if additional Hotspot JVM configuration helps to reduce memory & start-up time (potentially at the cost of slightly lower throughput)
  3. See if the Docker image can be optimised so less has to be downloaded when it is initially started
  4. Are there still optimisations in the roboquant code that could help? Possible things to look at: release resources earlier, optimise access patterns to large collections, inline more methods, keep price history as floats (instead of doubles).

Create project quick start

Make it easier for people to start their own trading application using roboquant. Especially for people new to Kotlin and the JVM ecosystem, this might make it easier to get up to speed.

One possible solution is to create and publish a Maven Archetype, but it could also be something similar using Gradle.

Warmup before live trading

Make it easier for your strategy and policy to warm up (i.e. store the necessary historic data) before starting live trading. A few things should be taken care of:

  1. Play the historic data that is used for the warm-up
  2. Ensure that during play-back of the historic data, generated orders don't get processed and metrics don't get logged
  3. After the historic feed has finished, swap over to playing the live feed
  4. Ensure that during the start of the live feed, the components don't get reset and lose their state

order execution using more sophisticated price matching

Hi, currently order execution uses the open price. This is something that is done in the execution method on the order, and while a refactor of the sim execution part would allow changing it somehow, it still needs the broker to send the whole candlestick in order to do something like backtrader does:

https://www.backtrader.com/docu/order-creation-execution/order-creation-execution/

It probably was a backtesting strategy choice, but it'd be great to allow for different strategies regarding this. This probably means having a proper OHLC data class instead of a map. Do you have any thoughts regarding this?

Best regards and thanks again for this wonderful project,
Martin

Remove less used classes

Ensure the API stays stable by removing, before release 1.0, classes/interfaces that are not being used much and are not completely proven.

Exceptions in ZonedPeriod conversions

Existing conversion methods in ZonedPeriod assume the underlying object holds specific units, which is not always true.

Example:
5.minutes.toMinutes() //Method threw 'java.time.temporal.UnsupportedTemporalTypeException' exception.

In that case the Duration object actually holds seconds.
Instead, it would be better to use the existing conversion methods of the Java API itself.
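
A small illustration of why a unit-specific lookup on a Duration fails while the dedicated java.time conversion methods do not (whether ZonedPeriod currently uses get() internally is an assumption):

import java.time.Duration
import java.time.temporal.ChronoUnit

fun main() {
    val d = Duration.ofMinutes(5)

    // Duration.get() only supports SECONDS and NANOS, so asking for MINUTES throws
    runCatching { d.get(ChronoUnit.MINUTES) }
        .onFailure { println(it) }   // java.time.temporal.UnsupportedTemporalTypeException: Unsupported unit: Minutes

    // The dedicated conversion methods work regardless of the internal units
    println(d.toMinutes())           // 5
    println(d.toSeconds())           // 300
}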

Improve visualisation when not using a notebook

When running strategies it is useful to have certain visualisations available, also when running them as a straightforward JVM application and not inside a Jupyter Notebook.

Some thoughts:

  1. Reuse the existing echarts library to generate a static web-page.
    Right now charts are part of the roboquant-jupyter library; we have to see how best to use them in a standalone application. Perhaps a configurable MetricsLogger or Metric could generate static HTML pages.

  2. Use TensorBoard to visualise metrics from a run.
    This could be very flexible and also gives insights during a run. But it is more work, since I couldn't find a higher-level binding from Java to the required ProtoBuf format.

More advanced examples

Include some more advanced examples that, for example, show the use of tick data and more complex strategies and policies.

Because of the excellent performance of roboquant, it is a viable solution for people who deal with higher-frequency data. However, currently not many examples show how to develop such solutions.

[Feature request] Support for more data formats

Currently, AFAIK, roboquant only supports .csv local historical data.

However, this is not great in terms of storage efficiency, and there are many alternatives (e.g. pickle, feather, parquet...) with various combinations of compression (e.g. zip, gzip...).

Would it be possible to add native support for some of these other formats?

(Personally I use zip-compressed pickle as the best compromise between size and read/write times)

df.to_pickle('foo.pkl', compression='zip')

pd.read_pickle('foo.pkl', compression='zip')
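
As a small aside, gzip-compressed CSV can already be read with JDK classes only, which might be a low-effort first step before adding dedicated parquet/feather readers (sketch, not roboquant code):

import java.io.BufferedReader
import java.io.InputStreamReader
import java.nio.file.Files
import java.nio.file.Path
import java.util.zip.GZIPInputStream

// Read all lines from a gzip-compressed CSV file
fun readGzipCsv(path: Path): List<String> =
    GZIPInputStream(Files.newInputStream(path)).use { gz ->
        BufferedReader(InputStreamReader(gz)).readLines()
    }

fun main() {
    readGzipCsv(Path.of("prices.csv.gz")).take(5).forEach(::println)
}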

Remove OANDA support

OANDA moved in Europe to a new entity called OANDA TMS and there is no support for the REST API anymore (errors are already showing up during integration tests).

So for now I will remove it from roboquant, since otherwise it would be included untested. If this changes in the future, I might bring it back.
