This repo adds the ability to run Feast on Spark.
To install the plugin and create an example feature repository:

```sh
pip install feast
git clone https://github.com/qooba/feast-pyspark.git
cd feast-pyspark
pip install -e .
feast init feature_repo
cd feature_repo
```
To configure the offline store, edit `feature_store.yaml`:
```yaml
project: feature_repo
registry: data/registry.db
provider: local
online_store:
    ...
offline_store:
    type: feast_pyspark.SparkOfflineStore
    spark_conf:
        spark.master: "local[*]"
        spark.ui.enabled: "false"
        spark.eventLog.enabled: "false"
        spark.sql.session.timeZone: "UTC"
```
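Each entry under `spark_conf` is a flat mapping of Spark setting names to values, which the offline store forwards to the Spark session. As a quick illustration of how that mapping parses (a sketch using PyYAML, with keys copied from the example above):

```python
import yaml

# Parse a trimmed-down version of the offline_store section shown above.
config = yaml.safe_load("""
offline_store:
    type: feast_pyspark.SparkOfflineStore
    spark_conf:
        spark.master: "local[*]"
        spark.sql.session.timeZone: "UTC"
""")

# spark_conf is an ordinary dict of Spark option name -> value.
spark_conf = config["offline_store"]["spark_conf"]
print(spark_conf["spark.master"])
```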
Example `features.py`:
```python
from google.protobuf.duration_pb2 import Duration

from feast import Entity, Feature, FeatureView, ValueType
from feast_pyspark import DeltaDataSource

my_stats = DeltaDataSource(
    path="dataset/all",
    event_timestamp_column="datetime",
    # range_join=10,
)

my_entity = Entity(
    name="entity_id",
    value_type=ValueType.INT64,
    description="entity id",
)

mystats_view = FeatureView(
    name="my_statistics",
    entities=["entity_id"],
    ttl=Duration(seconds=3600 * 24 * 20),
    features=[
        Feature(name="f0", dtype=ValueType.FLOAT),
        Feature(name="f1", dtype=ValueType.FLOAT),
        Feature(name="f2", dtype=ValueType.FLOAT),
        Feature(name="f3", dtype=ValueType.FLOAT),
        Feature(name="f4", dtype=ValueType.FLOAT),
        Feature(name="f5", dtype=ValueType.FLOAT),
        Feature(name="f6", dtype=ValueType.FLOAT),
        Feature(name="f7", dtype=ValueType.FLOAT),
        Feature(name="f8", dtype=ValueType.FLOAT),
        Feature(name="f9", dtype=ValueType.FLOAT),
        Feature(name="y", dtype=ValueType.FLOAT),
    ],
    online=True,
    input=my_stats,
    tags={},
)
```
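At retrieval time, Feast's `get_historical_features` performs a point-in-time join of an entity dataframe against the feature data, which this offline store executes in Spark. Purely as an illustration of that join (a minimal pandas sketch, with column names taken from the example above; this is not feast_pyspark's actual implementation):

```python
import pandas as pd

# Feature rows, keyed by entity and event time (like the Delta table above).
features = pd.DataFrame({
    "entity_id": [1, 1, 2],
    "datetime": pd.to_datetime(["2024-01-01", "2024-01-03", "2024-01-02"]),
    "f0": [0.1, 0.2, 0.9],
})

# Entity dataframe: for each entity, the timestamp to look up features at.
entity_df = pd.DataFrame({
    "entity_id": [1, 2],
    "event_timestamp": pd.to_datetime(["2024-01-04", "2024-01-04"]),
})

# Point-in-time join: for each entity row, take the latest feature values
# with datetime <= event_timestamp (merge_asof requires sorted inputs).
joined = pd.merge_asof(
    entity_df.sort_values("event_timestamp"),
    features.sort_values("datetime"),
    left_on="event_timestamp",
    right_on="datetime",
    by="entity_id",
)
print(joined[["entity_id", "f0"]])
```

A real retrieval would also filter out feature rows older than the view's `ttl`; the sketch skips that step.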
Parts of this project are based on:
- feast-spark-offline-store - Spark configuration and session
- feast-postgres - parts of the Makefiles and GitHub workflows