Pendo Transformation dbt Package (Docs)

πŸ“£ What does this dbt package do?

  • Produces modeled tables that leverage Pendo data from Fivetran's connector in the format described by this ERD and builds off the output of our Pendo source package.

  • Enables you to understand how users are experiencing and adopting your product. It achieves this by:

    • Calculating usage of features, pages, guides, and the overall product at the account and individual visitor level
    • Enhancing event stream tables with visitor and product information, referring pages, and features to track the customer journey through the application
    • Creating daily activity timelines for features, pages, and guides that reflect their adoption rates, discoverability, and usage promotion efficacy
    • Directly tying visitors and features together to determine activation rates, power-usage, and churn risk

The following table provides a detailed list of all models materialized within this package by default.

TIP: See more details about these models in the package's dbt docs site.

Model Description
pendo__account Each record represents a unique account in Pendo, enriched with metrics regarding associated visitors and their feature, page, and overall product activity (total and daily averages). Also includes their aggregated NPS ratings and the frequency and longevity of their product usage.
pendo__feature Each record represents a unique tagged feature in Pendo, enriched with information about the page it lives on, the application and product area it is a part of, and the internal users who created and/or updated it last. Also includes metrics regarding the visitors' time spent using the feature and click-interactions with individual visitors and accounts.
pendo__page Each record represents a unique tagged page in Pendo, enriched with information about its URL rules, the application and product area it is a part of, the internal users who created and/or updated it last, and the features that are currently live on it. Also includes metrics regarding the visitors' time spent on the page and pageview-interactions with individual visitors and accounts.
pendo__visitor Each record represents a unique visitor in Pendo, enriched with metrics regarding associated accounts and the visitor's latest NPS rating, as well as the frequency, longevity, and average length of their daily product usage.
pendo__guide Each record represents a unique guide presented to visitors via Pendo. Includes metrics about the number of visitors and accounts performing various activities upon guides, such as completing or dismissing them.
pendo__account_daily_metrics A daily historical timeline of the overall product, feature, and page activity associated with each account, along with the number of associated users performing each kind of interaction.
pendo__feature_daily_metrics A daily historical timeline, beginning at the creation of each feature, of the accounts and visitors clicking on each feature, the average daily time spent using the feature, and the percent share of total daily feature activity pertaining to this particular feature.
pendo__page_daily_metrics A daily historical timeline, beginning at the creation of each page, of the accounts and visitors loading each page, the average daily time spent on the page, and the percent share of total daily pageview activity pertaining to this particular page.
pendo__visitor_daily_metrics A daily historical timeline of the overall product, feature, and page activity tracked for an individual visitor. Includes the daily number of different pages and features interacted with.
pendo__guide_daily_metrics A daily historical timeline of the accounts and individual visitors interacting with guides via different types of actions.
pendo__feature_event The event stream of clicks on tagged features in Pendo. Enriched with any visitor and/or account passthrough columns, the previous feature and page that the visitor interacted with, the application and platform the event occurred on, and information on the feature and its product area.
pendo__page_event The event stream of views of tagged pages in Pendo. Enriched with any visitor and/or account passthrough columns, the previous page that the visitor interacted with, the application and platform the event occurred on, and information on the page and its product area.
pendo__guide_event The event stream of different kinds of interactions visitors have with guides. Enriched with any visitor and/or account passthrough columns, as well as the application and platform that the event occurred on.
pendo__visitor_feature Each record represents a unique combination of visitors and features, aimed at making "power-users" of particular features easy to find. Includes metrics reflecting the longevity and frequency of feature usage.
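
As a quick illustration of how these models might be used, the sketch below pulls candidate feature "power users" out of pendo__visitor_feature. The column names (visitor_id, feature_id, count_all_time_events) and the schema reference are assumptions for illustration only; confirm the exact columns in the package's dbt docs site.

-- Hypothetical usage sketch: rank visitor-feature pairs by lifetime activity.
-- Column and schema names below are assumptions; check the dbt docs site.
select
    visitor_id,
    feature_id,
    count_all_time_events
from your_target_schema_pendo.pendo__visitor_feature
order by count_all_time_events desc
limit 25;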

🎯 How do I use the dbt package?

Step 1: Prerequisites

To use this dbt package, you must have the following:

  • At least one Fivetran Pendo connector syncing data into your destination.
  • A BigQuery, Snowflake, Redshift, PostgreSQL, or Databricks destination.

Databricks Dispatch Configuration

If you are using a Databricks destination with this package, you will need to add the following (or a variation of the following) dispatch configuration within your dbt_project.yml. This is required in order for the package to accurately search for macros within the dbt-labs/spark_utils and then the dbt-labs/dbt_utils packages, respectively.

dispatch:
  - macro_namespace: dbt_utils
    search_order: ['spark_utils', 'dbt_utils']

Step 2: Install the package

Include the following pendo package version in your packages.yml file.

TIP: Check dbt Hub for the latest installation instructions or read the dbt docs for more information on installing packages.

# packages.yml
packages:
  - package: fivetran/pendo
    version: [">=0.5.0", "<0.6.0"] # we recommend using ranges to capture non-breaking changes automatically

Do NOT include the pendo_source package in this file. The transformation package itself has a dependency on it and will install the source package as well.

Step 3: Define database and schema variables

By default, this package runs using your destination and the pendo schema. If this is not where your Pendo data is (for example, if your Pendo schema is named pendo_fivetran), add the following configuration to your root dbt_project.yml file:

vars:
  pendo_database: your_database_name
  pendo_schema: your_schema_name 

(Optional) Step 4: Additional configurations

Expand for configurations

Passthrough Columns

This package includes all of the source columns that are defined in the macros folder. Because the transformation models only bring in the standard columns for the EVENT, FEATURE_EVENT, PAGE_EVENT, ACCOUNT_HISTORY, and VISITOR_HISTORY tables, we recommend using the passthrough variables below to include any custom columns.

These variables allow passthrough columns to be aliased (alias) and cast (transform_sql) if desired, although neither is required. To configure datatype casting, provide a SQL snippet within the transform_sql key, omitting the as field_name portion of the casting statement; the column is renamed using the alias attribute, and your custom passthrough columns will be cast accordingly.

Use the following format for declaring the respective passthrough variables:

vars:
  pendo__feature_event_pass_through_columns: # will be passed to pendo__feature_event and stg_pendo__feature_event
    - name:           "custom_crazy_field_name"
      alias:          "normal_field_name"
  pendo__page_event_pass_through_columns: # will be passed to pendo__page_event and stg_pendo__page_event
    - name:           "property_field_id"
      alias:          "new_name_for_this_field_id"
      transform_sql:  "cast(new_name_for_this_field as int64)"
    - name:           "this_other_field"
      transform_sql:  "cast(this_other_field as string)"
  pendo__account_history_pass_through_columns: # will be passed to pendo__account, pendo__feature_event, and pendo__page_event
    - name:           "well_named_field_1"
  pendo__visitor_history_pass_through_columns: # will be passed to pendo__visitor, pendo__feature_event, and pendo__page_event 
    - name:           "well_named_field_2"
  pendo__event_pass_through_columns: # will be passed to stg_pendo__event (in source package only)
    - name:           "well_named_field_3"

Changing the Build Schema

By default, this package builds the Pendo final models within a schema titled (<target_schema> + _pendo), intermediate models within a schema titled (<target_schema> + _int_pendo), and staging models within a schema titled (<target_schema> + _stg_pendo) in your target database. If this is not where you would like your modeled Pendo data to be written, add the following configuration to your dbt_project.yml file:

...
models:
  pendo:
    +schema: my_new_schema_name # leave blank for just the target_schema
    intermediate:
      +schema: my_new_schema_name # leave blank for just the target_schema
  pendo_source:
    +schema: my_new_schema_name # leave blank for just the target_schema

NOTE: If your profile does not have permissions to create schemas in your destination, you can set each +schema to blank. The package will then write all tables to your pre-existing target schema.
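
For example, a minimal sketch that writes everything to the pre-existing target schema by leaving each +schema blank:

models:
  pendo:
    +schema: # leave blank for just the target_schema
    intermediate:
      +schema:
  pendo_source:
    +schema: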

Change the source table references

If an individual source table has a different name than the package expects, add the table name as it appears in your destination to the respective variable:

IMPORTANT: See this project's dbt_project.yml variable declarations to see the expected names.

vars:
  pendo_source:
    pendo_<default_source_table_name>_identifier: your_table_name 

🚨 Snowflake Users 🚨

You may need to provide the case-sensitive spelling of your source tables that are also Snowflake reserved words.

In this package, this would apply to the GROUP source. If you are receiving errors for this source, include the following in your dbt_project.yml file:

vars:
  pendo_group_identifier: '"Group"' # as an example, must include this quoting pattern and adjust for your exact casing

NOTE: If you have sources defined in one of your project's yml files (for example, a yml file with a top-level sources key like the one below), the prior code will not work.

Instead, you will need to add the following where your group source table is defined in your yml:

sources:
  tables:
    - name: group 
      # Add the below
      identifier: GROUP # Or what your group table is named, being mindful of casing
      quoting:
        identifier: true

(Optional) Step 5: Orchestrate your models with Fivetran Transformations for dbt Coreβ„’

Expand for details

Fivetran offers the ability for you to orchestrate your dbt project through Fivetran Transformations for dbt Coreβ„’. Learn how to set up your project for orchestration through Fivetran in our Transformations for dbt Core setup guides.

πŸ” Does this package have dependencies?

This dbt package is dependent on the following dbt packages. Please be aware that these dependencies are installed by default within this package. For more information on the following packages, refer to the dbt hub site.

IMPORTANT: If you have any of these dependent packages in your own packages.yml file, we highly recommend that you remove them from your root packages.yml to avoid package version conflicts.

packages:
    - package: fivetran/fivetran_utils
      version: [">=0.4.0", "<0.5.0"]

    - package: dbt-labs/dbt_utils
      version: [">=1.0.0", "<2.0.0"]

    - package: fivetran/pendo_source
      version: [">=0.5.0", "<0.6.0"]

    - package: dbt-labs/spark_utils
      version: [">=0.3.0", "<0.4.0"]

πŸ™Œ How is this package maintained and can I contribute?

Package Maintenance

The Fivetran team maintaining this package only maintains the latest version of the package. We highly recommend you stay consistent with the latest version of the package and refer to the CHANGELOG and release notes for more information on changes across versions.

Contributions

A small team of analytics engineers at Fivetran develops these dbt packages. However, the packages are made better by community contributions!

We highly encourage and welcome contributions to this package. Check out this dbt Discourse article on the best workflow for contributing to a package!

πŸͺ Are there any resources available?

  • If you have questions or want to reach out for help, please refer to the GitHub Issue section to find the right avenue of support for you.
  • If you would like to provide feedback to the dbt package team at Fivetran or would like to request a new dbt package, fill out our Feedback Form.
  • Have questions or want to just say hi? Book a time during our office hours on Calendly or email us at [email protected].

dbt_pendo's People

Contributors

fivetran-avinash, fivetran-catfritz, fivetran-chloe, fivetran-jamie, fivetran-joemarkiewicz, fivetran-reneeli, fivetran-sheringuyen

dbt_pendo's Issues

Divide by zero error

Hi there,

We are using the pendo package for metrics aggregation but running into an issue with a divide by zero in the intermediate model below:

round(100.0 * sum_pageviews / total_pageviews, 3) as percent_of_daily_pageviews,

There are a few divisions written as 'numerator / denominator', which is causing an issue in our dbt models running on Snowflake (we're currently ramping up on Pendo, so volume is low; this might not be noticed in busier environments).

[2022-08-24, 19:53:14 UTC] {pod_manager.py:226} INFO - 19:53:14  Database Error in model int_pendo__page_daily_metrics (models/intermediate/daily_metrics/int_pendo__page_daily_metrics.sql)
[2022-08-24, 19:53:14 UTC] {pod_manager.py:226} INFO - 19:53:14    100051 (22012): Division by zero
[2022-08-24, 19:53:14 UTC] {pod_manager.py:226} INFO - 19:53:14    compiled SQL at target/run/pendo/models/intermediate/daily_metrics/int_pendo__page_daily_metrics.sql

I'd be happy to raise a PR but am not 100% sure of the style or cross-database compatibility requirements. On Snowflake this could be a DIV0, but I'm guessing we'd need to do something like this for maximum compatibility:

 round(100.0 * sum_pageviews / NULLIF(total_pageviews,0), 3) as percent_of_daily_pageviews, 
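
For what it's worth, dbt_utils also appears to ship a safe_divide macro that compiles to this same nullif pattern; a sketch, assuming a dbt_utils version that includes it:

 round(100.0 * {{ dbt_utils.safe_divide('sum_pageviews', 'total_pageviews') }}, 3) as percent_of_daily_pageviews,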

snowflake table naming issue with default config

Hi there, I noticed that out of the box one table is not compatible with Snowflake. To resolve this, a config can easily be added so Snowflake does not choke on the table called 'group':

pendo_source:
  group: '"VAULT_DB"."PENDO"."GROUP"'

I'm wondering if there is appetite to fix this so that the default works out of the box? Happy to implement this with some guidance. Maybe it's just a simple addition to the README, but there might be a better way this can be addressed?

Remove time zone before datediff

Describe the bug

dbt run throws this error:

17:23:24  Database Error in model int_pendo__page_info (models\intermediate\int_pendo__page_info.sql)
17:23:24    function pg_catalog.date_diff("unknown", timestamp with time zone, timestamp without time zone) does not exist
17:23:24    HINT:  No function matches the given name and argument types. You may need to add explicit type casts.
17:23:24    compiled SQL at target\run\pendo\models\intermediate\int_pendo__page_info.sql

The <target_schema>_int_pendo.int_pendo__page_info table/view doesn't get built.

Here's the compiled SQL at target\run\pendo\models\intermediate\int_pendo__page_info.sql (where the target schema is public):

  create view "dev"."public_int_pendo"."int_pendo__page_info__dbt_tmp" as (
    with page as (

    select *
    from "dev"."public_int_pendo"."int_pendo__latest_page"
),

application as (

    select *
    from "dev"."public_int_pendo"."int_pendo__latest_application"
),

pendo_user as (

    select *
    from "dev"."public_stg_pendo"."stg_pendo__user"
),

product_area as (

    select *
    from "dev"."public_stg_pendo"."stg_pendo__group"
),

page_rule as (

    select *
    from "dev"."public_int_pendo"."int_pendo__latest_page_rule"
),

agg_page_rule as (

    select 
        page_id,
        -- should we use a different/more apparent delimiter?
        
    listagg(rule, ', ')

 as rules 
        
    from page_rule
    group by 1
),

feature as (

    select *
    from "dev"."public_int_pendo"."int_pendo__latest_feature"
),

active_features as (

    select
        page_id,
        count(feature_id) as count_active_features

    from feature

    -- give a buffer of a month
    where 

    datediff(
        day,
        valid_through,
        
    getdate()

        )

 <= 30

    group by 1
),

page_join as (

    select 
        page.*,
        agg_page_rule.rules,
        product_area.group_name as product_area_name,
        application.display_name as app_display_name,
        application.platform as app_platform,
        creator.first_name || ' ' || creator.last_name as created_by_user_full_name,
        creator.username as created_by_user_username,
        updater.first_name || ' ' || updater.last_name as last_updated_by_user_full_name,
        updater.username as last_updated_by_user_username,
        coalesce(active_features.count_active_features, 0) as count_active_features

    from page 
    left join application 
        on page.app_id = application.application_id
    left join pendo_user as creator
        on page.created_by_user_id = creator.user_id 
    left join pendo_user as updater
        on page.last_updated_by_user_id = updater.user_id
    left join product_area
        on page.group_id = product_area.group_id 
    left join agg_page_rule
        on page.page_id = agg_page_rule.page_id
    left join active_features
        on page.page_id = active_features.page_id

)

select *
from page_join
  ) ;

The relevant bit (generated by this dbt-utils macro):

    datediff(
        day,
        valid_through,
        
    getdate()

        )

GETDATE() returns a timestamp without a time zone, while valid_through apparently has a time zone; if it's already in UTC, then it should simply be cast to TIMESTAMP, as DATEDIFF() doesn't accept TIMESTAMPTZ.
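
A minimal sketch of one possible fix, assuming valid_through is already in UTC, would be to strip the time zone before the comparison:

    datediff(
        day,
        cast(valid_through as timestamp),
        getdate()
    )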

Steps to reproduce

  1. Connect Pendo to Redshift via Fivetran.
  2. Finish initial sync. Ensure these tables are populated: pendo.application_history, pendo.feature_history, pendo.page_history, pendo.page_rule_history, pendo.group, pendo.user.
  3. In the local dbt project, install dbt_pendo_source and dbt_pendo.
  4. (In the virtual env) dbt run

Expected behavior

This table/view and its downstream tables/views should exist:

SELECT * FROM <target_schema>_int_pendo.int_pendo__page_info

Project variables configuration

name: 'my_dwh'
version: '1.0.1'
config-version: 2

profile: 'dwh_admin'
model-paths: ["models"]
analysis-paths: ["analyses"]
test-paths: ["tests"]
seed-paths: ["seeds"]
macro-paths: ["macros"]
snapshot-paths: ["snapshots"]

target-path: "target"
clean-targets:
  - "target"
  - "dbt_packages"

Package Version

packages:
  - package: fivetran/pendo
    version: 0.2.0
  - package: fivetran/pendo_source
    version: 0.2.0

Warehouse

  • BigQuery
  • Redshift
  • Snowflake
  • Postgres
  • Databricks
  • Other (provide details below)

Additional context

This bug was reported to the dbt-utils package repo, but they decided not to do any type casting there. It should probably be done here instead.

Screenshots

Please indicate the level of urgency

Not urgent for me, but this affects anyone depending on tables built from int_pendo__page_info, which aren't created due to this error. That includes pendo__page_daily_metrics, pendo__page, and pendo__page_event.

Are you interested in contributing to this package?

  • Yes, I can do this and open a PR for your review.
  • Possibly, but I'm not quite sure how to do this. I'd be happy to do a live coding session with someone to get this fixed.
  • No, I'd prefer if someone else fixed this. I don't have the time and/or don't know what the root cause of the problem is.

[Feature] Update README

Is there an existing feature request for this?

  • I have searched the existing issues

Describe the Feature

The README needs to be updated to the current format.

Describe alternatives you've considered

No response

Are you interested in contributing this feature?

  • Yes.
  • Yes, but I will need assistance and will schedule time during your office hours for guidance.
  • No.

Anything else?

No response

[Feature] Add "union multiple connections" feature to dbt_pendo package

Is there an existing feature request for this?

  • I have searched the existing issues

Describe the Feature

If you have multiple Pendo connectors in Fivetran and would like to use this package on all of them simultaneously, it would be useful to have the functionality to do so. The dbt_shopify package has this feature: https://github.com/fivetran/dbt_shopify?tab=readme-ov-file#union-multiple-connectors

The package will union all of the data together and pass the unioned table into the transformations. You will be able to see which source it came from in the source_relation column of each model.
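
Following the convention of other Fivetran packages such as dbt_shopify, configuration might look like the sketch below. The pendo_union_schemas and pendo_union_databases variable names are hypothetical here, since this issue is requesting the feature rather than describing an existing one:

vars:
  pendo_union_schemas: ['pendo_usa', 'pendo_emea'] # hypothetical variable; schema names are examples
  # or, if the connections land in separate databases:
  pendo_union_databases: ['pendo_db_one', 'pendo_db_two'] # hypothetical variable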

Describe alternatives you've considered

  • Considered not unioning them and using a single connector. This leaves major gaps in our data from Pendo and isn't a viable solution.
  • Considered continuing to use our bespoke transformations for Pendo data and unioning them manually. This is extra work, and we'd prefer to upstream this dependency by using dbt_pendo, but this is what we have chosen to do for now.

Are you interested in contributing this feature?

  • Yes.
  • Yes, but I will need assistance and will schedule time during your office hours for guidance.
  • No.

Anything else?

No response

BUG - group field names don't exist

Are you a current Fivetran customer?
yes

Describe the bug
package doesn't run with latest pendo landing schema

Steps to reproduce
Run package on latest pendo schema

Expected behavior
works

Project variables configuration

dbt run --profiles-dir user/.dbt --profile sustain-nonprod-snowflake --var '{"pendo_database": "vault_db", "pendo_schema": "pendo", "group": "\"VAULT_DB\".\"PENDO\".\"GROUP\"", "pendo_database":"vault_db", "pendo_schema":"pendo"}'

Package Version
same as in master

Warehouse

  • BigQuery
  • Redshift
  • [x] Snowflake
  • Postgres
  • Databricks
  • Other (provide details below)

Additional context

Screenshots

Please indicate the level of urgency
I have a fix which I'm happy to contribute

Are you interested in contributing to this package?

  • [x] Yes, I can do this and open a PR for your review.
  • Possibly, but I'm not quite sure how to do this. I'd be happy to do a live coding session with someone to get this fixed.
  • No, I'd prefer if someone else fixed this. I don't have the time and/or don't know what the root cause of the problem is.

[Feature] Ensure dbt compile proper execution

Is there an existing feature request for this?

  • I have searched the existing issues

Describe the Feature

When starting on a brand-new schema, dbt compile fails, most likely due to the calendar spine model and this function, at least according to Buildkite errors.

Describe alternatives you've considered

It's been suggested in other issues that changing the reference to a source might fix the issue.

Are you interested in contributing this feature?

  • Yes.
  • Yes, but I will need assistance and will schedule time during your office hours for guidance.
  • No.

Anything else?

No response
