Getting Started with Prophet Model using Snowflake ML
Overview
This guide shows how to create and log a Prophet forecasting model using Snowflake ML.
Snowflake also lets you log models beyond the built-in types such as Prophet, as long as they are serializable and extend the CustomModel class from snowflake.ml.model. For more information, see the Snowflake documentation on custom models.
What is Prophet?
The Prophet model is a time series forecasting tool developed by Facebook, designed to handle seasonality, holidays, and trend changes in data. It’s especially useful for business time series (like sales or traffic) and is robust to missing data and outliers.
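Prophet works on a DataFrame with two columns: `ds` (the datestamp) and `y` (the value to forecast). The sketch below builds a small synthetic daily series in that shape; the column names are what Prophet requires, while the dates and values are illustrative. The commented lines show the usual fit/predict flow, which requires the `prophet` package:

```python
import numpy as np
import pandas as pd

# Prophet expects a two-column frame: 'ds' (datestamp) and 'y' (numeric value).
# Build a synthetic daily series with a trend and weekly seasonality.
rng = np.random.default_rng(42)
dates = pd.date_range("2023-01-01", periods=365, freq="D")
trend = np.linspace(100.0, 150.0, len(dates))
weekly = 10.0 * np.sin(2 * np.pi * dates.dayofweek / 7)
noise = rng.normal(0.0, 2.0, len(dates))
df = pd.DataFrame({"ds": dates, "y": trend + weekly + noise})

# Fitting and forecasting is then a short fit/predict sequence:
# from prophet import Prophet
# m = Prophet()
# m.fit(df)
# future = m.make_future_dataframe(periods=30)
# forecast = m.predict(future)  # includes yhat, yhat_lower, yhat_upper columns
```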
Prerequisites
- Access to a Snowflake account with the ACCOUNTADMIN role
What You’ll Learn
- How to build a time series forecasting model using Facebook Prophet
- How to wrap the model using Snowflake’s CustomModel class
- How to log and register the model in Snowflake ML
- How to run predictions using the logged model directly in Snowflake
What You’ll Build
- A Prophet model for time series forecasting, developed in a Snowflake Notebook
- Model training and inference running directly in a Snowflake warehouse
- The model logged and registered in the Snowflake ML registry for future use
Setup
Step 1. In Snowsight, create a SQL Worksheet, open setup.sql, and execute all statements in order from top to bottom.
Step 2. Download SNF_PROPHET_FORECAST_MODEL.ipynb and PROPHET_PREDICTION.ipynb.
Step 3. In Snowsight, switch to the FR_SCIENTIST role and import the notebook files downloaded in step 2. Use ML_MODELS as the database, DS as the schema, and ML_FS_WH as the warehouse.
Run Notebook
- Set up the environment and import libraries
  After importing the necessary libraries and setting up the environment, create synthetic data.
- Train the model on synthetic data
  Train the Prophet model using the synthetic dataset.
- Pickle the trained model and upload it to the stage
  Serialize the trained model using pickle and upload it to a Snowflake stage.
```python
# Upload the model to the stage
session.file.put("ProphetModel.pkl", f"@{stage_nm}/", auto_compress=False)
```
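The upload step assumes the model was first serialized to a local file. A minimal sketch of that round trip, using a placeholder dictionary in place of the fitted Prophet model (the notebook pickles the actual trained model instance):

```python
import pickle

# Placeholder standing in for the fitted Prophet model (illustration only).
model = {"demo": "stand-in for a fitted Prophet model"}

# Serialize the model to the local file that will be uploaded to the stage.
with open("ProphetModel.pkl", "wb") as f:
    pickle.dump(model, f)

# The Snowpark upload then reads this file, as in the snippet above:
# session.file.put("ProphetModel.pkl", f"@{stage_nm}/", auto_compress=False)

# Sanity check: the file round-trips back to an equal object.
with open("ProphetModel.pkl", "rb") as f:
    restored = pickle.load(f)
```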
- Create the custom model
```python
# Initialize ModelContext with keyword arguments.
# The 'config' artifact points to the local pickle file created in the previous step.
mc = custom_model.ModelContext(
    artifacts={"config": "ProphetModel.pkl"}
)

# Define a custom model class that uses the context.
class MyProphetModel(custom_model.CustomModel):
    def __init__(self, context: custom_model.ModelContext) -> None:
        super().__init__(context)
        # Use the 'config' artifact path to load the pickled Prophet model.
        with open(self.context["config"], "rb") as f:
            self.model = pickle.load(f)

    @custom_model.inference_api
    def predict(self, X: pd.DataFrame) -> pd.DataFrame:
        X_copy = X.copy()
        X_copy["ds"] = pd.to_datetime(X_copy["ds"])  # ensure the correct datetime dtype
        forecast = self.model.predict(X_copy)
        return forecast[["ds", "yhat", "yhat_lower", "yhat_upper"]]

# Instantiate the custom model with the context so it can be logged.
forecast_model = MyProphetModel(context=mc)
```
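Before logging, the predict logic can be smoke-tested locally without Snowflake. The `StubModel` below is a hypothetical stand-in for the unpickled Prophet instance; it only mimics the output columns Prophet's `predict` would return:

```python
import pandas as pd

# Hypothetical stand-in for the fitted Prophet model: returns the same
# columns Prophet's predict() would, with constant values (illustration only).
class StubModel:
    def predict(self, df: pd.DataFrame) -> pd.DataFrame:
        out = df.copy()
        out["yhat"] = 1.0
        out["yhat_lower"] = 0.5
        out["yhat_upper"] = 1.5
        return out

model = StubModel()
X = pd.DataFrame({"ds": ["2024-01-01", "2024-01-02"]})
X["ds"] = pd.to_datetime(X["ds"])  # mirror the coercion done inside predict()
res = model.predict(X)[["ds", "yhat", "yhat_lower", "yhat_upper"]]
```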
- Model registry
```python
custom_mv = reg.log_model(
    forecast_model,
    model_name="Prophet_forcast_model",
    version_name="v1",
    conda_dependencies=["prophet"],
    sample_input_data=df_1,
    options={"relax_version": False},
    comment="My Prophet forecast experiment using the CustomModel API",
)
```
- Inference: use the PROPHET_PREDICTION notebook
```python
reg = Registry(session, database_name=db, schema_name=schema)
model_name = "PROPHET_FORCAST_MODEL"
version = "V1"
mv = reg.get_model(model_name).version(version)
predicted_sales = mv.run(forecast_dates)
```
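`mv.run` expects input shaped like the sample data supplied at logging time: a single `ds` column of future dates. A short sketch of building that frame (the dates and horizon are illustrative):

```python
import pandas as pd

# Future dates to forecast: one 'ds' column, matching the logged model signature.
forecast_dates = pd.DataFrame(
    {"ds": pd.date_range(start="2024-07-01", periods=30, freq="D")}
)
# predicted_sales = mv.run(forecast_dates)  # returns ds, yhat, yhat_lower, yhat_upper
```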
Conclusion And Resources
Congratulations! You’ve successfully built a Prophet forecasting model in a Snowflake Notebook and logged the trained model to the Snowflake ML registry, making it ready for inference and future use.
What You Learned
- How to build a time series forecasting model using Facebook Prophet
- How to develop and run the model directly in a Snowflake Notebook
- How to wrap the model using Snowflake’s CustomModel class
- How to log and register the model in the Snowflake ML registry
- How to run predictions on new data using the registered model within Snowflake
Related Resources
This content is provided as is, and is not maintained on an ongoing basis. It may be out of date with current Snowflake instances