
sadighian / crypto-rl

805 stars, 54 watchers, 231 forks, 165.23 MB

Deep Reinforcement Learning toolkit: record and replay cryptocurrency limit order book data & train a DDQN agent

Python 100.00%
bitfinex multiprocessing mongodb orderbook tick-data order-book limit-order-book hft multithreading recorder

crypto-rl's People

Contributors

amexn-me, dependabot[bot], sadighian


crypto-rl's Issues

KeyError: 'render_modes' while render_modes = env_creator.metadata["render_modes"]

(rlvenv) (base) quentinxu@LAPTOP-7JKQ25SJ:/mnt/d/crypto-rl$ python3 experiment.py
Using TensorFlow backend.
/mnt/d/crypto-rl/rlvenv/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:516: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint8 = np.dtype([("qint8", np.int8, 1)])
/mnt/d/crypto-rl/rlvenv/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:517: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_quint8 = np.dtype([("quint8", np.uint8, 1)])
/mnt/d/crypto-rl/rlvenv/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:518: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint16 = np.dtype([("qint16", np.int16, 1)])
/mnt/d/crypto-rl/rlvenv/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:519: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_quint16 = np.dtype([("quint16", np.uint16, 1)])
/mnt/d/crypto-rl/rlvenv/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:520: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint32 = np.dtype([("qint32", np.int32, 1)])
/mnt/d/crypto-rl/rlvenv/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:525: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
np_resource = np.dtype([("resource", np.ubyte, 1)])
/mnt/d/crypto-rl/rlvenv/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:541: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint8 = np.dtype([("qint8", np.int8, 1)])
/mnt/d/crypto-rl/rlvenv/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:542: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_quint8 = np.dtype([("quint8", np.uint8, 1)])
/mnt/d/crypto-rl/rlvenv/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:543: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint16 = np.dtype([("qint16", np.int16, 1)])
/mnt/d/crypto-rl/rlvenv/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:544: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_quint16 = np.dtype([("quint16", np.uint16, 1)])
/mnt/d/crypto-rl/rlvenv/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:545: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint32 = np.dtype([("qint32", np.int32, 1)])
/mnt/d/crypto-rl/rlvenv/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:550: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
np_resource = np.dtype([("resource", np.ubyte, 1)])
[2022-07-21 09:01:00,312] Experiment creating agent with kwargs: {'window_size': 100, 'max_position': 5, 'fitting_file': 'demo_LTC-USD_20190926.csv.xz', 'testing_file': 'demo_LTC-USD_20190926.csv.xz', 'symbol': 'LTC-USD', 'id': 'trend-following-v0', 'number_of_training_steps': 100000.0, 'gamma': 0.99, 'seed': 1, 'action_repeats': 5, 'load_weights': False, 'visualize': False, 'training': True, 'reward_type': 'default', 'nn_type': 'cnn', 'dueling_network': True, 'double_dqn': True}
Traceback (most recent call last):
File "experiment.py", line 112, in
main(kwargs=args)
File "experiment.py", line 106, in main
agent = Agent(**kwargs)
File "/mnt/d/crypto-rl/agent/dqn.py", line 46, in init
self.env = gym.make(**kwargs)
File "/mnt/d/crypto-rl/rlvenv/lib/python3.7/site-packages/gym/envs/registration.py", line 625, in make
render_modes = env_creator.metadata["render_modes"]
KeyError: 'render_modes'
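
For what it's worth, this error comes from newer gym releases expecting a "render_modes" key in the environment class's metadata, while older environments declared "render.modes". A minimal, hedged workaround sketch (not the repo's code) is to patch the base environment class before calling gym.make(); the import path follows the tracebacks shown elsewhere on this page, and pinning gym to the version in requirements.txt is the other option.

# Hedged sketch: add the "render_modes" metadata key that newer gym looks up.
# Assumes the trading envs inherit their metadata from BaseEnvironment.
import gym
import gym_trading  # registers the trading environments
from gym_trading.envs.base_environment import BaseEnvironment

BaseEnvironment.metadata = {**getattr(BaseEnvironment, "metadata", {}),
                            "render_modes": ["human"]}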

Question about exporting training data

In your paper you write:

We export the recorded data to compressed CSV files, segregated
by trading date using UTC timezone; each file is approximately 160Mb in size before compression.

  • Let's say I start to record data on day 1 at 18:00:00 UTC. Since this is the first tick, the recorder requests a new order book snapshot from the exchange and creates a tick of type load_book.

  • The recording runs 110 hours, until day 6 at 08:00:00, without any issue.

Now I want to create the features for day 4 (00:00:00 - 23:59:59). If the Simulator queries for the data, _query_arctic will search for the first load_book point (https://github.com/sadighian/crypto-rl/blob/arctic-streaming-ticks-full/data_recorder/database/database.py#L90), but it won't find anything, so it returns zero ticks.

I saw that the Simulator's extract_features method splits the data by day (https://github.com/sadighian/crypto-rl/blob/arctic-streaming-ticks-full/data_recorder/database/simulator.py#L303), so I could extract from the first day through the 5th and then have the 4th separated out, but if I try to read that much data from the DB, the process gets killed because of OOM (10 GB).

How did you extract the 30 days of training data? In one batch? What is the recommended amount of memory for that? To be honest, I have not looked through the whole code yet; maybe I am missing something.
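
A rough sketch of what day-by-day querying could look like, reading from man-group arctic's TickStore with an explicit date range rather than going through the repo's Simulator; the library name is an assumption, and the host follows the "Connecting to mongo: localhost" log lines on this page.

# Hedged sketch: pull one UTC day at a time from the Arctic tick store to keep
# memory bounded. Library name is an assumption, not the repo's configuration.
import pandas as pd
from arctic import Arctic
from arctic.date import DateRange

store = Arctic('localhost')
library = store['coinbase_tick_store']          # assumed library name

def read_utc_day(symbol: str, day: str) -> pd.DataFrame:
    start = pd.Timestamp(day, tz='UTC')
    end = start + pd.Timedelta(days=1)
    return library.read(symbol, date_range=DateRange(start, end))

day4 = read_utc_day('LTC-USD', '2019-09-29')    # example date

The remaining caveat is the one described above: rebuilding the book for day 4 still requires replaying from the most recent load_book snapshot, which may fall outside that day's window.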

Out of memory and "load_book" not being inserted

Hi, once again, thank you very much for the repo.

I have been having an issue where I am unable to export a day of BTC-USD: the process gets killed because I run out of memory, even with 32 GB. I have commented on #20, but I am now opening a ticket as I have more issues.

I have been trying to modify the code so that I can query and export less than a day's worth of data at a time (to get around the memory problem), and it has been somewhat successful. However, I have an issue with the _query_arctic function. If I understand the project correctly, a "load_book" record should be inserted into the database after every snapshot (e.g. every second with the default settings); however, when I try to query less than a day's worth at a time, no "load_book" is found, and thus I get an exception that my query contains no data (because the start index does not exist, since cursor.loc does not find a load_book).

So I have been querying the data using commands from Arctic directly, and in my data it seems that "load_book" is only inserted once, when I start recorder.py, and then never again.

Am I misunderstanding how the project works? Or is there an issue here?

I will be grateful for any help you may be able to provide.
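
One hedged way around the missing in-window snapshot is to anchor the slice at the most recent load_book at or before the requested window instead of requiring one inside it. The sketch below assumes the query result is a DataFrame indexed by timestamp with a 'type' column, as described above; it is not the repo's code.

# Hedged sketch: start the replay at the last 'load_book' tick at or before
# the requested window. DataFrame layout ('type' column, timestamp index) is
# an assumption.
import pandas as pd

def slice_from_last_snapshot(ticks: pd.DataFrame,
                             window_start: pd.Timestamp,
                             window_end: pd.Timestamp) -> pd.DataFrame:
    mask = ((ticks['type'] == 'load_book') & (ticks.index <= window_start)).to_numpy()
    snapshots = ticks.index[mask]
    if len(snapshots) == 0:
        raise ValueError('no load_book snapshot found at or before the window start')
    return ticks.loc[snapshots[-1]:window_end]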

Endless warming up

This has been occurring frequently in recent weeks. Reconnections happen all the time, but in this case it seems that the recovery has failed. After restarting the recorder script, everything goes back to normal.

Have you had a similar issue?

[2020-07-08 15:23:07,363] BTC-USD >> 30,559,489 <>     6,807 |     6,807 x 9,314.59  ||  9,314.60 x     3,154 |     3,294 <> 43,411,349
[2020-07-08 15:23:12,364] BTC-USD >> 30,562,824 <>       522 |       522 x 9,315.61  ||  9,316.64 x     2,329 |     2,329 <> 43,432,405
[2020-07-08 15:23:17,365] BTC-USD >> 30,562,824 <>       522 |       522 x 9,315.61  ||  9,315.62 x    39,431 |    41,427 <> 43,433,603
[2020-07-08 15:23:22,365] BTC-USD >> 30,568,042 <>     3,364 |     3,364 x 9,315.01  ||  9,315.02 x    57,563 |    57,563 <> 43,434,091
[2020-07-08 15:23:25,178] 
Bitfinex - tBTCUSD: 20051 Stop/Restart WebSocket Server (please reconnect)
[2020-07-08 15:23:25,181] tBTCUSD's order book cleared.
[2020-07-08 15:23:25,182] 
[BITFINEX - tBTCUSD] ...going to try and reload the order book

[2020-07-08 15:23:25,344] bitfinex: subscription exception WebSocket connection is closed: code = 1005 (no status code [internal]), no reason
[2020-07-08 15:23:25,345] bitfinex: Retrying to connect... attempted #5
Exception in thread tBTCUSD:
Traceback (most recent call last):
  File "/root/.pyenv/versions/3.7.0/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/root/crypto-rl/data_recorder/bitfinex_connector/bitfinex_client.py", line 68, in run
    self.exchange.upper())
websockets.exceptions.ConnectionClosed: WebSocket connection is closed: code = 10001 (unknown), reason = BITFINEX: no explanation

[2020-07-08 15:23:25,668] Requesting Book: {"event": "subscribe", "channel": "book", "prec": "R0", "freq": "F0", "symbol": "tBTCUSD", "len": "100"}
[2020-07-08 15:23:25,668] BOOK BITFINEX: tBTCUSD subscription request sent.
[2020-07-08 15:23:25,668] Requesting Trades: {"event": "subscribe", "channel": "trades", "symbol": "tBTCUSD"}
[2020-07-08 15:23:25,669] TRADES BITFINEX: tBTCUSD subscription request sent.
[2020-07-08 15:23:27,366] Coinbase - BTC-USD is warming up
[2020-07-08 15:23:32,367] Coinbase - BTC-USD is warming up
[2020-07-08 15:23:37,367] Coinbase - BTC-USD is warming up
[2020-07-08 15:23:42,368] Coinbase - BTC-USD is warming up
[2020-07-08 15:23:47,368] Coinbase - BTC-USD is warming up
[2020-07-08 15:23:52,369] Coinbase - BTC-USD is warming up
[2020-07-08 15:23:57,369] Coinbase - BTC-USD is warming up
[2020-07-08 15:24:02,369] Coinbase - BTC-USD is warming up
[2020-07-08 15:24:07,370] Coinbase - BTC-USD is warming up
[2020-07-08 15:24:12,371] Coinbase - BTC-USD is warming up

arctic TICK_STORE doesn't exist in arcticdb

I have tried to manually search through the documentation to find how it was migrated, but to no avail. I am not a computer science expert in any way, shape, or form, so I will phrase my question in a very simple manner: since arctic has migrated to arcticdb, how have you reconfigured this code so that it works as it should?
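
For readers hitting the same gap: ArcticDB does not ship a TICK_STORE library type, but a plain versioned library can hold per-symbol tick DataFrames. The sketch below is only a hedged illustration of that substitution; the URI and library name are assumptions, not the repo's configuration.

# Hedged sketch: store per-symbol ticks in a plain ArcticDB library.
from arcticdb import Arctic
import pandas as pd

ac = Arctic('lmdb:///tmp/crypto_rl_ticks')   # assumed local LMDB-backed store
lib = ac.get_library('coinbase_tick_store', create_if_missing=True)

ticks = pd.DataFrame({'price': [9314.59], 'size': [0.5]},
                     index=[pd.Timestamp('2020-07-08 15:23:07')])
lib.write('BTC-USD', ticks)                  # or lib.append(...) for incremental writes
df = lib.read('BTC-USD').data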

Define a time limit on recorder

I want to record the LOB using recorder.py, but when I run it for 24 hours, a connection error happens at some point. I would like to record just a short period to try things out on some data, to see whether Mongo works for me, etc. I don't understand how the run_until_complete function works or how to make it record for a specific (shorter) period.
Thanks in advance!
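
A hedged sketch of one way to bound the session: wrap the gathered tasks in asyncio.wait_for() inside recorder.py's run(), so the event loop stops after a fixed duration. The loop and tasks variables follow the run() shown in the tracebacks elsewhere on this page; the timeout value is only an example.

# Hedged sketch: inside recorder.py's run(), replace the unbounded
# loop.run_until_complete(tasks) with a timed variant.
import asyncio

RECORD_SECONDS = 15 * 60  # example: stop after 15 minutes

try:
    loop.run_until_complete(asyncio.wait_for(tasks, timeout=RECORD_SECONDS))
except asyncio.TimeoutError:
    print('Recording window elapsed; shutting down.')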

normalized_data shape

In data_pipeline, after concatenating:

            print(f'imbalance_data after apply_ema: {imbalance_data.shape}') # (86400, 21)
            print(f'normalized_data: {normalized_data.shape}') # (86400, 124)
            normalized_data = pd.concat((normalized_data, imbalance_data), axis=1) 
            print(f'normalized_data + imbalance_data: {normalized_data.shape}') # (172800, 145)

Shouldn't normalized_data.shape be (86400, 145) at the end? It seems that the indices of normalized_data are timestamps (2020-06-19 23:58:38.964857+00:00, 2020-06-19 23:58:39.964857+00:00, 2020-06-19 23:58:40.964857+00:00), but imbalance_data has ordinary integer indices (85841, 85842, 85843), so concatenating the two data frames adds rows full of NaNs, as in this example:

import pandas as pd

d1 = {'col1': [1, 2], 'col2': [3, 4]}
d2 = {'col3': [1, 2], 'col4': [3, 4]}
indices = ['A', 'B']
df1 = pd.DataFrame(data=d1, index=indices)
df2 = pd.DataFrame(data=d2)

df3 = pd.concat((df1, df2), axis=1)
print(df3)
   col1  col2  col3  col4
A   1.0   3.0   NaN   NaN
B   2.0   4.0   NaN   NaN
0   NaN   NaN   1.0   3.0
1   NaN   NaN   2.0   4.0

This could work:

normalized_data = pd.concat((normalized_data, imbalance_data.set_index(normalized_data.index)), axis=1)

Earlier I had problems with NaNs, but I simply ignored them with

observation[np.isnan(observation)] = 0

in _get_step_observation

And one more question:
You add the mean to the LOB imbalances, but there is no mention of it in your paper. Is it necessary?

Shouldn't there be a2c.py and ppo.py?

Thanks for your repo. I read your paper, but the repo doesn't have a ppo.py, even though the .gitignore file includes:

agent/dqn_weights/*
agent/ppo_weights/*
agent/acer_weights/*
agent/a2c_weights/*

5cfbdf1
So I want to replicate your paper's tests, but I don't know how to create ppo.py, acer.py, and a2c.py.
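
This is not the paper's code, but one hedged route to a PPO/A2C comparison is to wrap the same gym_trading environment with an off-the-shelf library such as stable-baselines3. The environment id and kwargs below are copied from the experiment log shown earlier on this page; everything else (library choice, save path) is an assumption, and whether a given stable-baselines3 release accepts the repo's pinned gym version would need checking.

# Hedged sketch: train PPO on the repo's gym environment with an external library.
import gym
import gym_trading  # registers the trading environments
from stable_baselines3 import PPO

env = gym.make(id='trend-following-v0',
               symbol='LTC-USD',
               fitting_file='demo_LTC-USD_20190926.csv.xz',
               testing_file='demo_LTC-USD_20190926.csv.xz',
               max_position=5,
               window_size=100,
               reward_type='default',
               training=True)

model = PPO('MlpPolicy', env, gamma=0.99, seed=1)
model.learn(total_timesteps=100_000)
model.save('agent/ppo_weights/trend_following_ppo')  # assumed output path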

LOB training from historical LOB

Hi - thanks for the awesome repo. I am trying to understand how I can use your algorithm to train a DQN agent on historical LOB data I have for one asset (time, Bid Price, Ask Price, Bid Quantity, Ask Quantity, Order side), with quantity being 0 when no order has been filled but prices have fluctuated. When reviewing the data under recording (demo_LTC-USD_20190926.csv.xz), there are multiple columns that I assume are specific to cryptocurrencies, whereas I only have basic info. Is there a way/example to use historical LOB data for HFT on anything other than crypto, or is the algorithm bespoke to the additional columns that exist for cryptocurrencies?

Python version required

I've been trying to install on 3.7, 3.8, and 3.9. It seems there are always at least a few dependencies that fail to satisfy your requirements. What Python version did you use?
BTW, I'm using Ubuntu 20.04.

metaclass conflict: the metaclass of a derived class must be a (non-strict) subclass of the metaclasses of all its bases

While I was trying experiment.py, I encountered this error:

(rlvenv) (base) quentinxu@LAPTOP-7JKQ25SJ:/mnt/d/crypto-rl$ python3 experiment.py
Using TensorFlow backend.
/mnt/d/crypto-rl/rlvenv/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:516: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint8 = np.dtype([("qint8", np.int8, 1)])
/mnt/d/crypto-rl/rlvenv/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:517: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_quint8 = np.dtype([("quint8", np.uint8, 1)])
/mnt/d/crypto-rl/rlvenv/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:518: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint16 = np.dtype([("qint16", np.int16, 1)])
/mnt/d/crypto-rl/rlvenv/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:519: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_quint16 = np.dtype([("quint16", np.uint16, 1)])
/mnt/d/crypto-rl/rlvenv/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:520: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint32 = np.dtype([("qint32", np.int32, 1)])
/mnt/d/crypto-rl/rlvenv/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:525: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
np_resource = np.dtype([("resource", np.ubyte, 1)])
/mnt/d/crypto-rl/rlvenv/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:541: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint8 = np.dtype([("qint8", np.int8, 1)])
/mnt/d/crypto-rl/rlvenv/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:542: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_quint8 = np.dtype([("quint8", np.uint8, 1)])
/mnt/d/crypto-rl/rlvenv/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:543: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint16 = np.dtype([("qint16", np.int16, 1)])
/mnt/d/crypto-rl/rlvenv/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:544: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_quint16 = np.dtype([("quint16", np.uint16, 1)])
/mnt/d/crypto-rl/rlvenv/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:545: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint32 = np.dtype([("qint32", np.int32, 1)])
/mnt/d/crypto-rl/rlvenv/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:550: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
np_resource = np.dtype([("resource", np.ubyte, 1)])
Traceback (most recent call last):
File "experiment.py", line 3, in
from agent.dqn import Agent
File "/mnt/d/crypto-rl/agent/init.py", line 1, in
from agent.dqn import DQNAgent
File "/mnt/d/crypto-rl/agent/dqn.py", line 10, in
import gym_trading
File "/mnt/d/crypto-rl/gym_trading/init.py", line 3, in
from gym_trading.envs.market_maker import MarketMaker
File "/mnt/d/crypto-rl/gym_trading/envs/init.py", line 1, in
from gym_trading.envs.market_maker import MarketMaker
File "/mnt/d/crypto-rl/gym_trading/envs/market_maker.py", line 6, in
from gym_trading.envs.base_environment import BaseEnvironment
File "/mnt/d/crypto-rl/gym_trading/envs/base_environment.py", line 23, in
class BaseEnvironment(Env, ABC):
TypeError: metaclass conflict: the metaclass of a derived class must be a (non-strict) subclass of the metaclasses of all its bases
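
For anyone else hitting this: the simplest route is usually to install the exact gym version pinned in requirements.txt. If mixing versions is unavoidable, a hedged sketch of the generic fix is to give the derived class a metaclass that combines gym's Env metaclass with ABCMeta; the class body is elided, and whether this is appropriate here depends on the installed gym.

# Hedged sketch of the generic metaclass-conflict fix; not the repo's code.
from abc import ABC, ABCMeta
from gym import Env


class EnvABCMeta(ABCMeta, type(Env)):
    """Metaclass combining ABCMeta with whatever metaclass gym.Env uses."""


class BaseEnvironment(Env, ABC, metaclass=EnvABCMeta):
    ...  # original class body elided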

recorder get stuck

Hello, sadighian.
I started the recorder 20 hours ago. It recorded correct data at the beginning, but 5 hours ago it got stuck and keeps printing the same data. You can check the log below.

[2021-12-28 10:28:53,018] LTC-USD >> 0 <> 0 | 1,113 x 158.61 || 158.66 x 746 | 0 <> 0
[2021-12-28 10:28:53,019] ETH-USD >> 0 <> 0 | 124 x 4,067.91 || 4,068.17 x 81 | 0 <> 0
[2021-12-28 10:28:54,018] BTC-USD >> 0 <> 0 | 6,502 x 51,081.14 || 51,081.15 x 1 | 0 <> 0
[2021-12-28 10:28:54,019] LTC-USD >> 0 <> 0 | 1,113 x 158.61 || 158.66 x 746 | 0 <> 0
[2021-12-28 10:28:54,019] ETH-USD >> 0 <> 0 | 124 x 4,067.91 || 4,068.17 x 81 | 0 <> 0
[2021-12-28 10:28:55,018] BTC-USD >> 0 <> 0 | 6,502 x 51,081.14 || 51,081.15 x 1 | 0 <> 0
[2021-12-28 10:28:55,019] LTC-USD >> 0 <> 0 | 1,113 x 158.61 || 158.66 x 746 | 0 <> 0
[2021-12-28 10:28:55,019] ETH-USD >> 0 <> 0 | 124 x 4,067.91 || 4,068.17 x 81 | 0 <> 0
[2021-12-28 10:28:56,019] BTC-USD >> 0 <> 0 | 6,502 x 51,081.14 || 51,081.15 x 1 | 0 <> 0
[2021-12-28 10:28:56,019] LTC-USD >> 0 <> 0 | 1,113 x 158.61 || 158.66 x 746 | 0 <> 0
[2021-12-28 10:28:56,019] ETH-USD >> 0 <> 0 | 124 x 4,067.91 || 4,068.17 x 81 | 0 <> 0
[2021-12-28 10:28:57,019] BTC-USD >> 0 <> 0 | 6,502 x 51,081.14 || 51,081.15 x 1 | 0 <> 0
[2021-12-28 10:28:57,019] LTC-USD >> 0 <> 0 | 1,113 x 158.61 || 158.66 x 746 | 0 <> 0
[2021-12-28 10:28:57,019] ETH-USD >> 0 <> 0 | 124 x 4,067.91 || 4,068.17 x 81 | 0 <> 0
[2021-12-28 10:28:58,019] BTC-USD >> 0 <> 0 | 6,502 x 51,081.14 || 51,081.15 x 1 | 0 <> 0
[2021-12-28 10:28:58,019] LTC-USD >> 0 <> 0 | 1,113 x 158.61 || 158.66 x 746 | 0 <> 0
[2021-12-28 10:28:58,020] ETH-USD >> 0 <> 0 | 124 x 4,067.91 || 4,068.17 x 81 | 0 <> 0
[2021-12-28 10:28:59,019] BTC-USD >> 0 <> 0 | 6,502 x 51,081.14 || 51,081.15 x 1 | 0 <> 0
[2021-12-28 10:28:59,020] LTC-USD >> 0 <> 0 | 1,113 x 158.61 || 158.66 x 746 | 0 <> 0
[2021-12-28 10:28:59,020] ETH-USD >> 0 <> 0 | 124 x 4,067.91 || 4,068.17 x 81 | 0 <> 0
[2021-12-28 10:29:00,019] BTC-USD >> 0 <> 0 | 6,502 x 51,081.14 || 51,081.15 x 1 | 0 <> 0
[2021-12-28 10:29:00,020] LTC-USD >> 0 <> 0 | 1,113 x 158.61 || 158.66 x 746 | 0 <> 0
[2021-12-28 10:29:00,020] ETH-USD >> 0 <> 0 | 124 x 4,067.91 || 4,068.17 x 81 | 0 <> 0
[2021-12-28 10:29:01,020] BTC-USD >> 0 <> 0 | 6,502 x 51,081.14 || 51,081.15 x 1 | 0 <> 0
[2021-12-28 10:29:01,020] LTC-USD >> 0 <> 0 | 1,113 x 158.61 || 158.66 x 746 | 0 <> 0
[2021-12-28 10:29:01,020] ETH-USD >> 0 <> 0 | 124 x 4,067.91 || 4,068.17 x 81 | 0 <> 0
[2021-12-28 10:29:02,020] BTC-USD >> 0 <> 0 | 6,502 x 51,081.14 || 51,081.15 x 1 | 0 <> 0
[2021-12-28 10:29:02,020] LTC-USD >> 0 <> 0 | 1,113 x 158.61 || 158.66 x 746 | 0 <> 0
[2021-12-28 10:29:02,020] ETH-USD >> 0 <> 0 | 124 x 4,067.91 || 4,068.17 x 81 | 0 <> 0
[2021-12-28 10:29:03,020] BTC-USD >> 0 <> 0 | 6,502 x 51,081.14 || 51,081.15 x 1 | 0 <> 0
[2021-12-28 10:29:03,020] LTC-USD >> 0 <> 0 | 1,113 x 158.61 || 158.66 x 746 | 0 <> 0
[2021-12-28 10:29:03,021] ETH-USD >> 0 <> 0 | 124 x 4,067.91 || 4,068.17 x 81 | 0 <> 0
[2021-12-28 10:29:04,020] BTC-USD >> 0 <> 0 | 6,502 x 51,081.14 || 51,081.15 x 1 | 0 <> 0
[2021-12-28 10:29:04,021] LTC-USD >> 0 <> 0 | 1,113 x 158.61 || 158.66 x 746 | 0 <> 0
[2021-12-28 10:29:04,021] ETH-USD >> 0 <> 0 | 124 x 4,067.91 || 4,068.17 x 81 | 0 <> 0
[2021-12-28 10:29:05,020] BTC-USD >> 0 <> 0 | 6,502 x 51,081.14 || 51,081.15 x 1 | 0 <> 0
[2021-12-28 10:29:05,021] LTC-USD >> 0 <> 0 | 1,113 x 158.61 || 158.66 x 746 | 0 <> 0
[2021-12-28 10:29:05,021] ETH-USD >> 0 <> 0 | 124 x 4,067.91 || 4,068.17 x 81 | 0 <> 0
[2021-12-28 10:29:06,021] BTC-USD >> 0 <> 0 | 6,502 x 51,081.14 || 51,081.15 x 1 | 0 <> 0
[2021-12-28 10:29:06,021] LTC-USD >> 0 <> 0 | 1,113 x 158.61 || 158.66 x 746 | 0 <> 0
[2021-12-28 10:29:06,021] ETH-USD >> 0 <> 0 | 124 x 4,067.91 || 4,068.17 x 81 | 0 <> 0
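
A hedged sketch of an external supervisor for this symptom: restart recorder.py when the logged BTC-USD snapshot stops changing. This is not part of the repo; the stall threshold is an example value, and the check assumes the recorder keeps printing lines, as in the log above.

# Hedged sketch: restart the recorder when its BTC-USD log lines stop changing.
import subprocess
import time

STALE_LIMIT = 120  # seconds of identical BTC-USD snapshots before restarting

while True:
    proc = subprocess.Popen(['python', 'recorder.py'],
                            stdout=subprocess.PIPE, text=True)
    last_snapshot, last_change = None, time.time()
    for line in proc.stdout:
        print(line, end='')
        if 'BTC-USD >>' in line:
            snapshot = line.split(']', 1)[-1]  # drop the timestamp prefix
            if snapshot != last_snapshot:
                last_snapshot, last_change = snapshot, time.time()
            elif time.time() - last_change > STALE_LIMIT:
                proc.kill()
                break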

Active and Inactive Limit Order Book

Active means the limit order book would change when the RL agent submits a new order. In comparison, an inactive limit order book means the RL agent is trained purely on the historical limit order book. I find that this repo is the inactive kind. Is that right?

Error while exporting to csv

I tried to export some data from the db to csv.

Maybe related man-group/arctic#615

I do not think that it should be a problem, but there was a gap while I recorded that day. I had to stop it for a couple of hours because I was travelling.

python data_recorder/tests/test_simulator.py
[2020-05-04 16:37:49,281] init_db_connection for None...
[2020-05-04 16:37:49,281] Connecting to mongo: localhost (localhost)
[2020-05-04 16:37:49,290]
Getting ['BTC-USD'] data from Arctic Tick Store...
///.pyenv/versions/crypto-rl/lib/python3.7/site-packages/arctic/tickstore/tickstore.py:418: FutureWarning: Conversion of the second argument of issubdtype from int to np.signedinteger is deprecated. In future, it will be treated as np.int64 == np.dtype(int).type.
if np.issubdtype(dtype, int):
[2020-05-04 16:47:11,561] Got data in 562.269496 secs, creating DataFrame...
[2020-05-04 16:47:19,641] 2600000 rows in 570.34618 secs: 4558 ticks/sec
[2020-05-04 16:47:22,474] Completed querying 2590711 ['BTC-USD'] records in 573 seconds
[2020-05-04 16:47:22,498] Completed get_tick_history() in 573 seconds
[2020-05-04 16:47:22,498] querying BTC-USD
[2020-05-04 16:47:22,523]
Database: [coinbase is recording BTC-USD]

[2020-05-04 16:47:22,523] init_db_connection for BTC-USD...
[2020-05-04 16:47:22,575] Connecting to mongo: localhost (localhost)
[2020-05-04 16:47:23,048] Starting get_orderbook_snapshot_history() loop with 2590711 ticks for ['BTC-USD']
[2020-05-04 16:47:23,063] ...completed 0 loops in 0 seconds
[2020-05-04 16:47:23,063] BTC-USD's order book cleared.
[2020-05-04 16:47:26,206] Book finished loading at None
[2020-05-04 16:47:26,208] BTC-USD first tick: 2020-04-25 16:11:00.023615+00:00
///.pyenv/versions/crypto-rl/lib/python3.7/site-packages/arctic/tickstore/tickstore.py:757: FutureWarning: A future version of pandas will default to skipna=True. To silence this warning, pass skipna=True|False explicitly.
v = TickStore._ensure_supported_dtypes(v)
Traceback (most recent call last):
File "data_recorder/tests/test_simulator.py", line 79, in
test_get_orderbook_snapshot_history()
File "data_recorder/tests/test_simulator.py", line 38, in test_get_orderbook_snapshot_history
orderbook_snapshot_history = sim.get_orderbook_snapshot_history(query=query)
File "///crypto-rl/data_recorder/database/simulator.py", line 253, in get_orderbook_snapshot_history
order_book.new_tick(msg=tick)
File "///crypto-rl/data_recorder/coinbase_connector/coinbase_orderbook.py", line 141, in new_tick
self.db.new_tick(msg)
File "///crypto-rl/data_recorder/database/database.py", line 62, in new_tick
self.collection.write(self.sym, self.data)
File "///.pyenv/versions/crypto-rl/lib/python3.7/site-packages/arctic/tickstore/tickstore.py", line 597, in write
buckets = self._to_buckets(data, symbol, initial_image)
File "///.pyenv/versions/crypto-rl/lib/python3.7/site-packages/arctic/tickstore/tickstore.py", line 623, in _to_buckets
bucket, initial_image = TickStore._to_bucket(x[i:i + self._chunk_size], symbol, initial_image)
File "///.pyenv/versions/crypto-rl/lib/python3.7/site-packages/arctic/tickstore/tickstore.py", line 757, in _to_bucket
v = TickStore._ensure_supported_dtypes(v)
File "///.pyenv/versions/crypto-rl/lib/python3.7/site-packages/arctic/tickstore/tickstore.py", line 662, in _ensure_supported_dtypes
raise UnhandledDtypeException("Casting object column to string failed")
arctic.exceptions.UnhandledDtypeException: Casting object column to string failed
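
A hedged workaround sketch: the UnhandledDtypeException is arctic's TickStore refusing an object-dtype column that it cannot cast to string, so coercing columns to numeric or plain string dtypes before the write may help. The column handling below is illustrative only, not the repo's schema.

# Hedged sketch: coerce object-dtype columns to float or str before handing
# the DataFrame to TickStore.write(); not part of the repo.
import pandas as pd

def coerce_for_tickstore(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    for col in out.select_dtypes(include='object').columns:
        numeric = pd.to_numeric(out[col], errors='coerce')
        # keep genuinely numeric columns as float64, otherwise fall back to str
        out[col] = numeric if numeric.notna().all() else out[col].astype(str)
    return out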

websocket ConnectionRefusedError

I am getting a ConnectionRefusedError on my MBP with an M1 chip. I also tried running the recorder in Docker but still get the same error message. Any idea where this error is coming from? Many thanks!

(cpt) changli@Changs-MBP crypto-rl % python recorder.py 
/opt/anaconda3/envs/cpt/lib/python3.8/site-packages/arctic/store/_pandas_ndarray_store.py:8: FutureWarning: The Panel class is removed from pandas. Accessing it from the top-level namespace will also be removed in the next version
  from pandas import Panel
[2022-03-21 01:16:57,221] Starting recorder with basket = [('BTC-USD', 'tBTCUSD'), ('ETH-USD', 'tETHUSD')]
[2022-03-21 01:16:57,231] Process started up for BTC-USD
/opt/anaconda3/envs/cpt/lib/python3.8/site-packages/arctic/store/_pandas_ndarray_store.py:8: FutureWarning: The Panel class is removed from pandas. Accessing it from the top-level namespace will also be removed in the next version
  from pandas import Panel
[2022-03-21 01:16:57,677] COINBASE client instantiated.
[2022-03-21 01:16:57,677] init_db_connection for BTC-USD...
[2022-03-21 01:16:57,677] Connecting to mongo: localhost (localhost)
[2022-03-21 01:16:57,721] BITFINEX client instantiated.
[2022-03-21 01:16:57,721] init_db_connection for tBTCUSD...
[2022-03-21 01:16:57,721] Connecting to mongo: localhost (localhost)
[2022-03-21 01:16:57,738] run() initiated on : BTC-USD
[2022-03-21 01:16:57,738] run() initiated on : tBTCUSD
[2022-03-21 01:16:57,739] Recorder: Gathered 2 tasks
[2022-03-21 01:16:57,758] Recorder: Finally done for BTC-USD and tBTCUSD.
Process Recorder-1:
Traceback (most recent call last):
  File "/opt/anaconda3/envs/cpt/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
    self.run()
  File "/Users/changli/sg21CodeLab/Quant/Crypto/crypto-rl/recorder.py", line 59, in run
    loop.run_until_complete(tasks)
  File "/opt/anaconda3/envs/cpt/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/Users/changli/sg21CodeLab/Quant/Crypto/crypto-rl/data_recorder/connector_components/client.py", line 49, in subscribe
    self.ws = await websockets.connect(self.ws_endpoint)
  File "/opt/anaconda3/envs/cpt/lib/python3.8/site-packages/websockets/legacy/client.py", line 650, in __await_impl_timeout__
    return await asyncio.wait_for(self.__await_impl__(), self.open_timeout)
  File "/opt/anaconda3/envs/cpt/lib/python3.8/asyncio/tasks.py", line 494, in wait_for
    return fut.result()
  File "/opt/anaconda3/envs/cpt/lib/python3.8/site-packages/websockets/legacy/client.py", line 654, in __await_impl__
    transport, protocol = await self._create_connection()
  File "/opt/anaconda3/envs/cpt/lib/python3.8/asyncio/base_events.py", line 1025, in create_connection
    raise exceptions[0]
  File "/opt/anaconda3/envs/cpt/lib/python3.8/asyncio/base_events.py", line 1010, in create_connection
    sock = await self._connect_sock(
  File "/opt/anaconda3/envs/cpt/lib/python3.8/asyncio/base_events.py", line 924, in _connect_sock
    await self.sock_connect(sock, address)
  File "/opt/anaconda3/envs/cpt/lib/python3.8/asyncio/selector_events.py", line 496, in sock_connect
    return await fut
  File "/opt/anaconda3/envs/cpt/lib/python3.8/asyncio/selector_events.py", line 528, in _sock_connect_cb
    raise OSError(err, f'Connect call failed {address}')
ConnectionRefusedError: [Errno 61] Connect call failed ('0.0.0.0', 443)
[2022-03-21 01:17:02,745] Both Coinbase and Bitfinex are still warming up...
[2022-03-21 01:17:03,749] Both Coinbase and Bitfinex are still warming up...
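
The "Connect call failed ('0.0.0.0', 443)" part suggests the exchange hostname resolved to 0.0.0.0 (e.g. a hosts-file or DNS issue) rather than a problem in the repo itself. A hedged, standalone connectivity check is sketched below; the URLs are the public Coinbase Pro and Bitfinex feeds and may differ from the endpoints configured in the repo.

# Hedged sketch: check DNS resolution and websocket connectivity outside the recorder.
import asyncio
import socket
import websockets

ENDPOINTS = ['wss://ws-feed.pro.coinbase.com', 'wss://api-pub.bitfinex.com/ws/2']

async def check(url: str) -> None:
    host = url.split('/')[2]
    print(f'{host} resolves to {socket.gethostbyname(host)}')
    async with websockets.connect(url):
        print(f'connected to {url}')

for url in ENDPOINTS:
    asyncio.run(check(url))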
