Snowflake Integration Guide for PointFive

This guide explains how PointFive integrates with Snowflake to analyze consumption metadata and uncover cost-saving opportunities. It also outlines the secure, read-only script used to configure the integration in your Snowflake environment.

To enable Snowflake cost optimization, PointFive securely connects to your Snowflake environment and analyzes consumption metadata to identify cost-saving opportunities.

The integration is completed through a guided setup wizard in the PointFive UI.

The script used during the integration must be executed by a user with the ACCOUNTADMIN role in your Snowflake account.

Note: The first integration you run within an organization must be executed in an account that has the ORGADMIN role enabled. All subsequent integrations within the same organization do not require ORGADMIN.

Step 1: Initial Configuration

During the first step of the wizard, you will:

  • Allow access to Query History – This enables PointFive to analyze query metadata and query text in order to surface optimization opportunities.

  • Configure Network Access Policy (if required) – If your Snowflake environment restricts external connections, you may need to allow access from PointFive to enable the integration.

These selections determine the configuration that will be included in the integration script.

Step 2: Generate and Run the Integration Script

After completing the initial configuration, PointFive generates a Snowflake SQL script tailored to your environment and selected settings.

This script:

  • Creates the required user and role

  • Assigns the necessary read-only permissions

  • Configures warehouse, storage, and network access (if applicable)

The following sections of this document explain the purpose and function of each component within the generated script.

Environment Setup (Role & Database)

The script begins by creating a dedicated role and database within your Snowflake environment. This ensures that PointFive’s activities are isolated and do not interfere with your production databases.

  • Custom Role (POINTFIVE_INTEGRATION_ROLE_...): A specific role created to adhere to the "principle of least privilege." This role is granted only the permissions necessary to view usage metadata.

  • RSA_PUBLIC_KEY: The public key used for Key-Pair authentication, ensuring PointFive connects securely without a password.

  • Dedicated Database & Schema: The script creates a database and schema (POINTFIVE_INTEGRATION_DB_...) to house the views that PointFive will query. PointFive will only have access to data within this DB.
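The environment setup above can be sketched roughly as follows. This is an illustrative sketch only, not the generated script: the `_XXXX` suffixes, schema name, and ordering are placeholders standing in for the account-specific values the wizard produces.

```sql
-- Sketch: isolated role and database for the integration (names are placeholders)
CREATE ROLE IF NOT EXISTS POINTFIVE_INTEGRATION_ROLE_XXXX;

CREATE DATABASE IF NOT EXISTS POINTFIVE_INTEGRATION_DB_XXXX;
CREATE SCHEMA IF NOT EXISTS POINTFIVE_INTEGRATION_DB_XXXX.METADATA;

-- Least privilege: the role can only see objects inside the dedicated database
GRANT USAGE ON DATABASE POINTFIVE_INTEGRATION_DB_XXXX
    TO ROLE POINTFIVE_INTEGRATION_ROLE_XXXX;
GRANT USAGE ON SCHEMA POINTFIVE_INTEGRATION_DB_XXXX.METADATA
    TO ROLE POINTFIVE_INTEGRATION_ROLE_XXXX;
```

Because the role is granted access only within this database, dropping the database and role later cleanly removes the integration's footprint.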

Usage Metadata Views

This is the core of the integration. The script creates specialized Views of your Snowflake system tables.

  • Account Usage Views: Maps metadata from SNOWFLAKE.ACCOUNT_USAGE (e.g., QUERY_HISTORY, WAREHOUSE_METERING_HISTORY).

  • Time Column Optimization: Most views include a TIME_COLUMN (casting timestamps to dates). This allows PointFive to perform efficient, time-series analysis of your costs.
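A view of this shape might look like the sketch below, assuming `WAREHOUSE_METERING_HISTORY` as the source and `METADATA` as the schema name (both placeholders; the generated script defines its own names and column lists).

```sql
-- Sketch: a metadata view with a date-granularity TIME_COLUMN
CREATE OR REPLACE VIEW POINTFIVE_INTEGRATION_DB_XXXX.METADATA.WAREHOUSE_METERING_HISTORY AS
SELECT
    *,
    CAST(START_TIME AS DATE) AS TIME_COLUMN  -- enables efficient day-level time-series analysis
FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY;

-- Read-only access for the integration role
GRANT SELECT ON VIEW POINTFIVE_INTEGRATION_DB_XXXX.METADATA.WAREHOUSE_METERING_HISTORY
    TO ROLE POINTFIVE_INTEGRATION_ROLE_XXXX;
```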

Storage Integration (S3 Access)

To perform deep analysis, PointFive periodically exports metadata to a secure S3 bucket.

  • Secure Bridge: This creates a storage integration that allows Snowflake to communicate directly with PointFive’s AWS environment.

  • IAM Role Assumption: The integration uses an AWS IAM role ARN to establish a trust relationship. No static AWS credentials or passwords are used.

  • Restricted Path: Access is restricted strictly to the path defined in STORAGE_LOCATION_WITH_ACCOUNT.
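A storage integration of this kind can be sketched as below. The ARN and bucket path are placeholders (the real values, including the path behind STORAGE_LOCATION_WITH_ACCOUNT, come from the wizard), and the actual generated statement may differ.

```sql
-- Sketch: trust-based bridge to PointFive's S3 bucket (all values are placeholders)
CREATE STORAGE INTEGRATION IF NOT EXISTS POINTFIVE_STORAGE_INTEGRATION_XXXX
    TYPE = EXTERNAL_STAGE
    STORAGE_PROVIDER = 'S3'
    ENABLED = TRUE
    -- IAM role assumption: no static AWS credentials are stored in Snowflake
    STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::111111111111:role/pointfive-export-role'
    -- Access is restricted strictly to this path
    STORAGE_ALLOWED_LOCATIONS = ('s3://pointfive-export-bucket/<account-specific-path>/');
```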

Service User & Authentication

PointFive connects to your Snowflake account using a Service User.

  • Key-Pair Authentication: The user is configured with an RSA Public Key. This is significantly more secure than a standard password, as it requires a private key held only by PointFive to establish a session.

  • Dedicated Warehouse: A dedicated X-SMALL warehouse is created. This ensures that PointFive’s metadata export queries never slow down other workloads.

    • Auto-Suspend: Set to 60 seconds to minimize cost.

    • Auto-Resume: Enabled so the warehouse only runs when PointFive is actively generating an insight report.

  • Controlled Costs: While the exact cost depends on your Snowflake usage patterns, we estimate roughly $120 per year per $1M of Snowflake spend.
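The service user and warehouse setup can be sketched as follows. Names, the key value, and the exact property list are placeholders; the generated script embeds PointFive's real public key.

```sql
-- Sketch: service user with key-pair auth only (no password is ever set)
CREATE USER IF NOT EXISTS POINTFIVE_INTEGRATION_USER_XXXX
    RSA_PUBLIC_KEY = 'MIIBIjANBgkq...'          -- public half of PointFive's key pair (truncated placeholder)
    DEFAULT_ROLE = POINTFIVE_INTEGRATION_ROLE_XXXX
    TYPE = SERVICE;                              -- marks this as a non-human service account

-- Sketch: dedicated, cost-controlled warehouse
CREATE WAREHOUSE IF NOT EXISTS POINTFIVE_WH_XXXX
    WAREHOUSE_SIZE = 'XSMALL'
    AUTO_SUSPEND = 60          -- suspend after 60 seconds of inactivity
    AUTO_RESUME = TRUE         -- wake only when PointFive runs a query
    INITIALLY_SUSPENDED = TRUE;

GRANT ROLE POINTFIVE_INTEGRATION_ROLE_XXXX TO USER POINTFIVE_INTEGRATION_USER_XXXX;
GRANT USAGE ON WAREHOUSE POINTFIVE_WH_XXXX TO ROLE POINTFIVE_INTEGRATION_ROLE_XXXX;
```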

Query Operator Stats Procedure (Performance Analysis)

To provide deep query performance insights, PointFive needs access to Snowflake's GET_QUERY_OPERATOR_STATS table function, which returns detailed execution statistics for individual query operators (joins, filters, aggregations, etc.).

Why a Stored Procedure?

Calling GET_QUERY_OPERATOR_STATS directly requires the MANAGE WAREHOUSES account-level privilege. Instead of granting this powerful privilege to the PointFive integration role, the script uses a secure stored procedure pattern to scope access tightly:

  • Separate Service Role: A dedicated service role (POINTFIVE_QUERY_STATS_SERVICE_ROLE_...) is created and granted MANAGE WAREHOUSES. This role is not granted to PointFive's integration role.

  • EXECUTE AS OWNER: The stored procedure runs with the privileges of its owner (the service role), not the caller (PointFive). This prevents PointFive from ever directly holding the MANAGE WAREHOUSES privilege.

  • Single-Query Scope: The procedure accepts a single query_id and returns operator stats only for that query, preventing scanning or enumeration of queries.

  • PII Protection: The procedure automatically excludes the OPERATOR_ATTRIBUTES column, which may contain fragments of SQL text, unless Query Text analysis is explicitly opted-in.

The script then grants only USAGE on the procedure to the PointFive integration role, ensuring the integration can call it but inherits none of the elevated privileges.
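The pattern described above can be sketched like this. It is a simplified illustration under stated assumptions: the `_XXXX` suffixes, procedure name, and selected column list are placeholders, and the generated script's body and PII handling logic may differ.

```sql
-- Sketch: elevated privilege lives on a separate service role, never on the integration role
CREATE ROLE IF NOT EXISTS POINTFIVE_QUERY_STATS_SERVICE_ROLE_XXXX;
GRANT MANAGE WAREHOUSES ON ACCOUNT TO ROLE POINTFIVE_QUERY_STATS_SERVICE_ROLE_XXXX;

CREATE OR REPLACE PROCEDURE POINTFIVE_INTEGRATION_DB_XXXX.METADATA.QUERY_OPERATOR_STATS(query_id VARCHAR)
RETURNS TABLE ()
LANGUAGE SQL
EXECUTE AS OWNER            -- runs with the owner's privileges, not the caller's
AS
$$
DECLARE
    res RESULTSET;
BEGIN
    -- Single-query scope; OPERATOR_ATTRIBUTES (may contain SQL fragments) is excluded
    res := (SELECT OPERATOR_ID, PARENT_OPERATORS, OPERATOR_TYPE,
                   OPERATOR_STATISTICS, EXECUTION_TIME_BREAKDOWN
            FROM TABLE(GET_QUERY_OPERATOR_STATS(:query_id)));
    RETURN TABLE(res);
END;
$$;

-- Ownership moves to the service role so EXECUTE AS OWNER resolves to it;
-- the integration role gets only the right to call the procedure.
GRANT OWNERSHIP ON PROCEDURE POINTFIVE_INTEGRATION_DB_XXXX.METADATA.QUERY_OPERATOR_STATS(VARCHAR)
    TO ROLE POINTFIVE_QUERY_STATS_SERVICE_ROLE_XXXX;
GRANT USAGE ON PROCEDURE POINTFIVE_INTEGRATION_DB_XXXX.METADATA.QUERY_OPERATOR_STATS(VARCHAR)
    TO ROLE POINTFIVE_INTEGRATION_ROLE_XXXX;
```

The key design choice is indirection: the caller never holds MANAGE WAREHOUSES, yet a single, tightly scoped operation that requires it remains available.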

Security & Network Policy

To further harden the integration, the script applies a strict Network Policy.

  • IP Whitelisting: The service user is restricted to connecting only from PointFive’s verified static IP addresses (44.210.32.226, etc.). Even with the correct RSA key, access will be denied if the request comes from an unauthorized network.
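The network restriction can be sketched as below. The policy name is a placeholder, and the real script lists all of PointFive's published static IPs, not just the one shown.

```sql
-- Sketch: restrict the service user to PointFive's verified egress IPs
CREATE NETWORK POLICY IF NOT EXISTS POINTFIVE_NETWORK_POLICY_XXXX
    ALLOWED_IP_LIST = ('44.210.32.226');  -- plus PointFive's other static IPs

-- Attach the policy to the service user only, not account-wide
ALTER USER POINTFIVE_INTEGRATION_USER_XXXX
    SET NETWORK_POLICY = 'POINTFIVE_NETWORK_POLICY_XXXX';
```

Applying the policy at the user level means your other Snowflake users are unaffected.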

Execution & Output

The script concludes by gathering technical identifiers generated by Snowflake (such as the STORAGE_AWS_EXTERNAL_ID).
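As one illustration, the external ID for the S3 trust relationship is available from the integration's metadata (integration name is a placeholder):

```sql
-- Sketch: the STORAGE_AWS_EXTERNAL_ID property appears in this command's output
DESC STORAGE INTEGRATION POINTFIVE_STORAGE_INTEGRATION_XXXX;
```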

Next Step: Once the script finishes, copy the JSON object returned in the Results tab of your Snowflake worksheet and paste it into the installation wizard. This information allows us to finalize the secure handshake between our two systems.

How to Execute the script in Snowflake

  1. Open Snowflake: Use the Snowsight UI or a worksheet.

  2. Role: Ensure you are using the ACCOUNTADMIN role (required to create integrations and network policies).

  3. Paste & Run: Start the integration wizard in the PointFive UI (see below). Copy the generated integration script, paste the full script into your worksheet, and run it as a single block.

Copy Result: The script will return a JSON object (as shown in the final RETURN statement). Paste the result into the wizard to finish the integration process.
