Timescale NFT Starter Kit

The Timescale NFT Starter Kit is a step-by-step guide to get up and running with collecting, storing, analyzing and visualizing NFT data from OpenSea, using PostgreSQL and TimescaleDB.

The NFT Starter Kit will give you a foundation for analyzing NFT trends so that you can bring some data to your purchasing decisions, or just learn about the NFT space from a data-driven perspective. It also serves as a solid foundation for your more complex NFT analysis projects in the future.

We recommend following along with the NFT Starter Kit tutorial (https://docs.timescale.com/timescaledb/latest/tutorials/analyze-nft-data/) to get familiar with the contents of this repository.

For more information about the NFT Starter Kit, see the announcement blog post.

Project components

We provide multiple standalone components to support each stage of your data exploration journey:

  • Design database schema

  • Get data

  • Build dashboards

  • Analyze data

Get started

Whichever component you are most interested in, first clone the repository:

git clone https://github.com/timescale/nft-starter-kit.git
cd nft-starter-kit

Setting up the pre-built Superset dashboards

This part of the project is fully Dockerized. TimescaleDB and the Superset dashboards are built automatically using docker-compose. After completing the steps below, you will have local TimescaleDB and Superset instances running in containers, preloaded with 500K+ NFT transactions from OpenSea.

(Screenshot: the pre-built Superset dashboard)

The Docker services use ports 8088 (for Superset) and 6543 (for TimescaleDB), so make sure no other services are using those ports before starting the installation process.
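
If you want to check the ports up front, one option is lsof (a quick sketch; this assumes lsof is available on your machine, and an empty result means the port is free):

    lsof -i :8088   # anything already listening on the Superset port?
    lsof -i :6543   # anything already listening on the TimescaleDB port?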

Prerequisites

  • Docker

  • Docker Compose

    Verify that both are installed:

    docker --version && docker-compose --version

Instructions

  1. Run docker-compose up --build in the /pre-built-dashboards folder:

    cd pre-built-dashboards
    docker-compose up --build

    Wait until the process is done (it could take a couple of minutes); you should see:

    timescaledb_1      | PostgreSQL init process complete; ready for start up.
  2. Go to http://0.0.0.0:8088/ in your browser and log in with these credentials:

    user: admin
    password: admin
  3. Open the Databases page inside Superset (http://0.0.0.0:8088/databaseview/list/). You will see exactly one item there called NFT Starter Kit.

  4. Go check out your NFT dashboards!

    Collections dashboard: http://0.0.0.0:8088/superset/dashboard/1

    Assets dashboard: http://0.0.0.0:8088/superset/dashboard/2
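
When you're done exploring, you can shut the stack down from the pre-built-dashboards folder. A minimal sketch; the optional -v flag also deletes the database volume, so use it only if you want to wipe the preloaded data:

    docker-compose down       # stop and remove the containers
    docker-compose down -v    # same, but also remove the data volume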

Running the data ingestion script

If you'd like to ingest data into your database (either a local TimescaleDB instance or one in Timescale Cloud) straight from the OpenSea API, follow these steps to configure and run the ingestion script:

Prerequisites

  • A running TimescaleDB instance (local or in Timescale Cloud)

  • Python 3 and virtualenv

  • An OpenSea API key (requested through OpenSea's docs)

Instructions

  1. Go to the root folder of the project:
    cd nft-starter-kit
  2. Create a new Python virtual environment and install the requirements:
    virtualenv env && source env/bin/activate
    pip install -r requirements.txt
  3. Replace the parameters in the config.py file:
    DB_NAME="tsdb"
    HOST="YOUR_HOST_URL"
    USER="tsdbadmin"
    PASS="YOUR_PASSWORD_HERE"
    PORT="PORT_NUMBER"
    OPENSEA_START_DATE="2021-10-01T00:00:00" # example start date (UTC)
    OPENSEA_END_DATE="2021-10-06T23:59:59" # example end date (UTC)
    OPENSEA_APIKEY="YOUR_OPENSEA_APIKEY" # need to request from OpenSea's docs
  4. Run the Python script:
    python opensea_ingest.py
    This will start ingesting data in batches, ~300 rows at a time:
    Start ingesting data between 2021-10-01 00:00:00+00:00 and 2021-10-06 23:59:59+00:00
    ---
    Fetching transactions from OpenSea...
    Data loaded into temp table!
    Data ingested!
    Data has been backfilled until this time: 2021-10-06 23:51:31.140126+00:00
    ---
    You can stop the ingestion process at any time (Ctrl+C); otherwise, the script will run until all the transactions from the given time period have been ingested.
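
To double-check what actually landed in the database, you can run a quick sanity query from your shell. This is just a sketch reusing the same placeholder host, port, and password as in config.py above:

    psql "postgres://tsdbadmin:YOUR_PASSWORD_HERE@YOUR_HOST_URL:PORT_NUMBER/tsdb?sslmode=require" \
      -c "SELECT count(*), min(time), max(time) FROM nft_sales;"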

Ingest the sample data

If you don't want to wait for a decent amount of data to be ingested, you can use our sample dataset instead, which contains 500K+ sale transactions from OpenSea (this is the same sample used for the Superset dashboards).

Prerequisites

  • A TimescaleDB database with the NFT Starter Kit schema created

  • The psql command-line client

Instructions

  1. Go to the folder with the sample CSV files (or you can also download them from here):
    cd pre-built-dashboards/database/data
  2. Connect to your database with psql:
    psql -x "postgres://host:port/tsdb?sslmode=require"
    If you're using Timescale Cloud, the instructions under How to Connect provide a customized command you can run to connect directly to your database.
  3. Import the CSV files in this order (it can take a few minutes in total):
    \copy accounts FROM 001_accounts.csv CSV HEADER;
    \copy collections FROM 002_collections.csv CSV HEADER;
    \copy assets FROM 003_assets.csv CSV HEADER;
    \copy nft_sales FROM 004_nft_sales.csv CSV HEADER;
  4. Try running some queries on your database:
    SELECT count(*), MIN(time) AS min_date, MAX(time) AS max_date FROM nft_sales;
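
Once the import finishes, TimescaleDB's time_bucket function is a natural next step for exploring the data. A sketch you could try in the same psql session, assuming only the nft_sales table and its time column used in the query above:

    -- daily number of NFT sales in the sample dataset
    SELECT time_bucket('1 day', time) AS day,
           count(*) AS sales
    FROM nft_sales
    GROUP BY day
    ORDER BY day;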

Contributors

avthars, contentana, dependabot[bot], zseta


nft-starter-kit's Issues

Error loading chart datasources. Filters may not work correctly.

Hi.

I'm new and just starting to set up the nft-starter-kit.
After following all the steps of the tutorial, I headed to the dashboard, but my data isn't loading.
I'm getting the following error: "Error loading chart datasources. Filters may not work correctly." I haven't set any filters, so I don't think that's the problem.

Thanks,

mmm

(Screenshot attached, taken 2021-11-21, showing the error.)

Request blocked

Followed the instructions and got the DB connected. When running opensea_ingest.py, I'm seeing

Fetching transactions from OpenSea...
None
Request blocked... retrying...

output in the console... I realized now that I don't have an OpenSea API key. Is this why?

Error loading some charts/data

Hello and thank you for the amazing work you have put together.

I'm currently having issues with some charts not showing, and I'm seeing some errors in the terminal.

nft-starter-kit_1 | The field timeseries_limit is deprecated, please use series_limit instead.
nft-starter-kit_1 | 2022-01-13 16:09:12,392:WARNING:superset.common.query_object:The field timeseries_limit is deprecated, please use series_limit instead.
nft-starter-kit_1 | 2022-01-13 16:09:13,335:INFO:werkzeug:172.20.0.1 - - [13/Jan/2022 16:09:13] "POST /superset/log/?explode=events&dashboard_id=2 HTTP/1.1" 200 -
nft-starter-kit_1 | Query SELECT bucket AS __timestamp,
nft-starter-kit_1 | asset_id AS asset_id,
nft-starter-kit_1 | asset_name AS asset_name,
nft-starter-kit_1 | max(volume_eth) AS "MAX(volume_eth)"
nft-starter-kit_1 | FROM public.superset_assets_daily
nft-starter-kit_1 | GROUP BY asset_id,
nft-starter-kit_1 | asset_name,
nft-starter-kit_1 | bucket
nft-starter-kit_1 | ORDER BY "MAX(volume_eth)" DESC
nft-starter-kit_1 | LIMIT 50000 on schema public failed
nft-starter-kit_1 | Traceback (most recent call last):
nft-starter-kit_1 | File "/app/superset/connectors/sqla/models.py", line 1601, in query
nft-starter-kit_1 | df = self.database.get_df(sql, self.schema, mutator=assign_column_label)
nft-starter-kit_1 | File "/app/superset/models/core.py", line 431, in get_df
nft-starter-kit_1 | df = result_set.to_pandas_df()
nft-starter-kit_1 | File "/app/superset/result_set.py", line 202, in to_pandas_df
nft-starter-kit_1 | return self.convert_table_to_df(self.table)
nft-starter-kit_1 | File "/app/superset/result_set.py", line 177, in convert_table_to_df
nft-starter-kit_1 | return table.to_pandas(integer_object_nulls=True)
nft-starter-kit_1 | File "pyarrow/array.pxi", line 757, in pyarrow.lib._PandasConvertible.to_pandas
nft-starter-kit_1 | File "pyarrow/table.pxi", line 1748, in pyarrow.lib.Table._to_pandas
nft-starter-kit_1 | File "/usr/local/lib/python3.8/site-packages/pyarrow/pandas_compat.py", line 789, in table_to_blockmanager
nft-starter-kit_1 | blocks = _table_to_blocks(options, table, categories, ext_columns_dtypes)
nft-starter-kit_1 | File "/usr/local/lib/python3.8/site-packages/pyarrow/pandas_compat.py", line 1130, in _table_to_blocks
nft-starter-kit_1 | return [_reconstruct_block(item, columns, extension_columns)
nft-starter-kit_1 | File "/usr/local/lib/python3.8/site-packages/pyarrow/pandas_compat.py", line 1130, in
nft-starter-kit_1 | return [_reconstruct_block(item, columns, extension_columns)
nft-starter-kit_1 | File "/usr/local/lib/python3.8/site-packages/pyarrow/pandas_compat.py", line 733, in _reconstruct_block
nft-starter-kit_1 | dtype = make_datetimetz(item['timezone'])
nft-starter-kit_1 | File "/usr/local/lib/python3.8/site-packages/pyarrow/pandas_compat.py", line 758, in make_datetimetz
nft-starter-kit_1 | tz = pa.lib.string_to_tzinfo(tz)
nft-starter-kit_1 | File "pyarrow/types.pxi", line 1927, in pyarrow.lib.string_to_tzinfo
nft-starter-kit_1 | File "pyarrow/error.pxi", line 143, in pyarrow.lib.pyarrow_internal_check_status
nft-starter-kit_1 | File "/usr/local/lib/python3.8/site-packages/pytz/init.py", line 188, in timezone
nft-starter-kit_1 | raise UnknownTimeZoneError(zone)
nft-starter-kit_1 | pytz.exceptions.UnknownTimeZoneError: '+00'
nft-starter-kit_1 | 2022-01-13 16:09:14,530:INFO:werkzeug:172.20.0.1 - - [13/Jan/2022 16:09:14] "POST /api/v1/chart/data?form_data=%7B%22slice_id%22%3A10%7D&dashboard_id=2&force=true HTTP/1.1" 400 -

(Screenshots attached, taken 2022-01-13, showing the charts that fail to load.)

Data ingestion script gives KeyError

Hi there!

I have done all the setup for this project (installed PostgreSQL and TimescaleDB, connected to a Timescale Cloud instance, and set up config.py). However, upon execution I got a KeyError:

(Screenshot attached showing the KeyError traceback.)

Do you know what went wrong and how I could fix this?

Thanks!
