The main goal is a working notebook with the five sections below. No markdown cells or final documentation are required.
Implementation Breakdown
This feature will be decomposed into the following notebook sections:
- Setup and Data Loading - Dependencies, Snowflake connection, load data via Arrow into getml
- Annotations - Set roles (join_key, time_stamp, target, numerical, categorical)
- Data Model - Define StarSchema with store as entity, join peripheral tables
- Training - FastProp feature learning, fit pipeline
- Feature Export - Transform to Arrow, write to Snowflake, register External FeatureView
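The five sections above could be sketched end-to-end roughly as follows. This is a minimal sketch, not the final notebook: connection parameters, column names (STORE_ID, WEEK, TARGET), the peripheral table, and the output table name are all assumptions; the exact getml and Snowpark calls should be checked against the installed versions.

```python
# Assumes getml, snowflake-snowpark-python, and pyarrow are installed,
# and that a Snowflake account and a running getml Engine are reachable.
import getml
from snowflake.snowpark import Session

getml.engine.launch()
getml.engine.set_project("snowflake_feature_store")

# 1. Setup and data loading: pull the population table via Arrow.
# Connection parameters are placeholders.
session = Session.builder.configs({
    "account": "...", "user": "...", "password": "...",
    "database": "DEMO_DB", "schema": "PUBLIC", "warehouse": "DEMO_WH",
}).create()
cur = session.connection.cursor()
cur.execute("SELECT * FROM WEEKLY_SALES_BY_STORE_WITH_TARGET")
population = getml.DataFrame.from_arrow(cur.fetch_arrow_all(), name="population")

# 2. Annotations: assign roles (column names are assumptions).
population.set_role("STORE_ID", getml.data.roles.join_key)
population.set_role("WEEK", getml.data.roles.time_stamp)
population.set_role("TARGET", getml.data.roles.target)

# 3. Data model: StarSchema with store as the entity; "sales" stands in
# for a peripheral table loaded and annotated the same way as above.
star = getml.data.StarSchema(train=population, alias="population")
star.join(sales, on="STORE_ID", time_stamps="WEEK")

# 4. Training: FastProp feature learning, then fit the pipeline.
fast_prop = getml.feature_learning.FastProp(
    loss_function=getml.feature_learning.loss_functions.SquareLoss,
)
pipe = getml.Pipeline(data_model=star.data_model, feature_learners=[fast_prop])
pipe.fit(star.train)

# 5. Feature export: transform to Arrow and write back to Snowflake.
features = pipe.transform(star.train, df_name="features")
session.write_pandas(
    features.to_arrow().to_pandas(),
    table_name="STORE_FEATURES",
    auto_create_table=True,
)
```

The Arrow round-trip (fetch_arrow_all on ingest, to_arrow on export) avoids row-by-row serialization in both directions, which is the reason the plan calls out Arrow explicitly.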
Technical Context
Input: Prepared population table (WEEKLY_SALES_BY_STORE_WITH_TARGET) from data infrastructure (#42)
Output: External FeatureView registered in Snowflake Feature Store
Key APIs:
- Snowflake: snowflake.snowpark.Session, snowflake.ml.feature_store
- getml: DataFrame.from_arrow(), pipe.transform(), .to_arrow()
- External FV: FeatureView(refresh_freq=None)
File: getml-demo/integration/snowflake/notebooks/snowflake_feature_store.ipynb
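Registering the exported table as an External FeatureView might look like the sketch below. The feature store name, database, warehouse, entity, join key, and table names are assumptions carried over from the plan; verify the calls against the installed snowflake-ml-python version. It assumes an existing Snowpark `session` and the STORE_FEATURES table written by the pipeline.

```python
from snowflake.ml.feature_store import (
    CreationMode, Entity, FeatureStore, FeatureView,
)

# Names are placeholders for this plan.
fs = FeatureStore(
    session=session,
    database="DEMO_DB",
    name="SALES_FEATURE_STORE",
    default_warehouse="DEMO_WH",
    creation_mode=CreationMode.CREATE_IF_NOT_EXIST,
)

store = Entity(name="STORE", join_keys=["STORE_ID"])
fs.register_entity(store)

# refresh_freq=None marks this as an External FeatureView: Snowflake does
# not manage refreshes; the getml pipeline owns writes to STORE_FEATURES.
fv = FeatureView(
    name="STORE_SALES_FEATURES",
    entities=[store],
    feature_df=session.table("STORE_FEATURES"),
    timestamp_col="WEEK",
    refresh_freq=None,
)
fs.register_feature_view(feature_view=fv, version="V1")
```

Keeping refresh_freq=None is the key design choice here: it lets the feature store serve the features for discovery and point-in-time joins while refresh scheduling stays with the getml side.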