

Best practices for securing Snowflake


Understanding best practices for securing Snowflake and having a concrete implementation plan is a critical Day Zero deliverable. Snowflake is a secure, cloud-based data warehouse: there are no hardware or software components to select, install, configure, or maintain, and Snowflake handles ongoing maintenance and management. However, each organization is responsible for configuring and maintaining its own security controls. There are three security layers you should use to help secure your data: network security, identity and access management (IAM), and data encryption. Following best practices for Snowflake security can help protect your data from unauthorized access and theft.

Network Security Controls

Network security controls are a set of measures that you can take to protect your network from unauthorized access and theft. These controls can help protect your data, systems, and resources.

Secure communication

Snowflake provides a number of drivers and connectors that you can use to connect to your data, and all of them must conform to the same security controls. Data at rest is encrypted with AES-256. HTTP Strict Transport Security (HSTS) is enforced for all client communications, ensuring that customer data is transmitted over HTTPS and that insecure certificates are refused. Connections are encrypted in transit using TLS, a secure protocol that protects data as it moves between devices; Snowflake requires TLS 1.2 or higher. Authentication is required for all connections. Customer-configured network policies let you limit client communication to specific IP addresses using IP allowlisting.

Network policies

A network policy is a set of rules that govern how IP addresses may connect to your Snowflake account. Use network policies to allow “known” client locations (IP ranges) while blocking others. Check if you need to create different network policies for service account users who are connecting from a client application, SCIM, or Snowflake OAuth integrations.
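As a sketch, an account-level allowlist policy might look like the following. The policy name and IP ranges are hypothetical placeholders; a CIDR entry in BLOCKED_IP_LIST takes precedence over an overlapping ALLOWED_IP_LIST entry.

```sql
-- Allow only the corporate VPN range; block one known-bad host inside it.
CREATE NETWORK POLICY corp_vpn_only
  ALLOWED_IP_LIST = ('192.168.1.0/24')
  BLOCKED_IP_LIST = ('192.168.1.99');

-- Activate the policy for the whole account (requires SECURITYADMIN or higher).
ALTER ACCOUNT SET NETWORK_POLICY = corp_vpn_only;
```

Network policies can also be attached to individual users with `ALTER USER ... SET NETWORK_POLICY`, which is how you would scope a separate policy to a service account.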

If you use both on-premises and cloud provider network resources, using private DNS is a best practice. You can then deploy an on-premises DNS forwarding rule for the Snowflake account. After you have established private connectivity, you can restrict access to the public endpoint by creating an account-level network policy that allows only your network's private IP range to connect to Snowflake.

Identity and Access Management

IAM is a way to control who can do what in Snowflake and is a key component in Snowflake security. You can use it to manage users and roles, and to control who has access to data including regulated sensitive data like PII/PHI.


To provision and externally manage users and roles in Snowflake, use System for Cross-domain Identity Management (SCIM) when supported by your Identity Provider. Identity Providers may be further customized to synchronize users and roles with your Active Directory people and groups. Currently, Snowflake supports SCIM 2.0 to integrate Snowflake with Okta and Microsoft Azure AD. You can configure a different provider as a Custom provider.
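Setting up SCIM provisioning from Okta, for example, involves creating a provisioner role and a SCIM security integration along these lines. The role and integration names below are illustrative, not prescribed.

```sql
-- Role the identity provider will assume when provisioning users and roles.
USE ROLE ACCOUNTADMIN;
CREATE ROLE IF NOT EXISTS okta_provisioner;
GRANT CREATE USER ON ACCOUNT TO ROLE okta_provisioner;
GRANT CREATE ROLE ON ACCOUNT TO ROLE okta_provisioner;
GRANT ROLE okta_provisioner TO ROLE ACCOUNTADMIN;

-- SCIM 2.0 integration that lets Okta push users and groups into Snowflake.
CREATE SECURITY INTEGRATION okta_scim
  TYPE = SCIM
  SCIM_CLIENT = 'OKTA'
  RUN_AS_ROLE = 'OKTA_PROVISIONER';
```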

Snowflake security best practices recommend using federated single sign-on (SSO) for most users. Enable multi-factor authentication (MFA) using your current corporate MFA or Snowflake’s built-in Duo solution for additional security. Service accounts and users with the ACCOUNTADMIN system role may use passwords; when passwords are used, pair them with a secrets management or privileged access management solution. For example, the HashiCorp Vault database secrets engine can automate the creation of unique, ephemeral Snowflake database user credentials. OAuth is the preferred method for connecting external applications to Snowflake. Okta, external browser authentication, key pair authentication, and passwords (as a last resort) are also supported.
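For a service account, key pair authentication is typically configured by registering the public key on the user, as sketched below. The user name is hypothetical, and the key value is truncated; the full value would be the body of a PEM-encoded RSA public key with its header and footer lines removed.

```sql
-- Attach an RSA public key to a service account so its client can
-- authenticate with the matching private key instead of a password.
ALTER USER etl_service SET RSA_PUBLIC_KEY = 'MIIBIjANBgkqh...';

-- Verify the key fingerprint that Snowflake computed.
DESC USER etl_service;
```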

Access controls

Consider Snowflake security and business practices when developing your object access model. Create a group of access roles with different permissions on objects, and assign them to functional roles as necessary. There is no technical difference between an object access role and a functional role; the distinction is in how they are logically used to combine sets of permissions and assign them to groups of users. Grant permissions on database objects or account objects to access roles. Grant access roles to functional roles to create a role hierarchy.
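The access-role/functional-role pattern can be sketched as follows. The database, schema, role, and user names are hypothetical; the grants themselves are standard Snowflake DDL.

```sql
-- Access role: holds object privileges (here, read-only on one schema).
CREATE ROLE sales_db_read;
GRANT USAGE ON DATABASE sales TO ROLE sales_db_read;
GRANT USAGE ON SCHEMA sales.public TO ROLE sales_db_read;
GRANT SELECT ON ALL TABLES IN SCHEMA sales.public TO ROLE sales_db_read;

-- Functional role: maps to a job function and inherits the access role.
CREATE ROLE sales_analyst;
GRANT ROLE sales_db_read TO ROLE sales_analyst;

-- Keep the hierarchy rooted under SYSADMIN, then assign to users.
GRANT ROLE sales_analyst TO ROLE SYSADMIN;
GRANT ROLE sales_analyst TO USER jdoe;
```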

Consider implementing column-level access controls to restrict access to sensitive information present in certain columns, such as PII, PHI, or financial data. Dynamic Data Masking is a built-in feature that dynamically masks column data depending on who is requesting it. External Tokenization can detokenize data at query time for authorized users by utilizing partner technologies. Row access policies restrict access to certain rows in a table to only certain users.
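A minimal sketch of both controls, assuming hypothetical table, column, role, and mapping-table names:

```sql
-- Column-level: mask email addresses for everyone except a PII-cleared role.
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
    ELSE '*** MASKED ***'
  END;

ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask;

-- Row-level: each role sees only the regions listed for it in a mapping table.
CREATE ROW ACCESS POLICY region_rows AS (region STRING) RETURNS BOOLEAN ->
  'SALES_ADMIN' = CURRENT_ROLE()
  OR EXISTS (
       SELECT 1 FROM security.region_map m
       WHERE m.role_name = CURRENT_ROLE()
         AND m.region    = region
     );

ALTER TABLE orders ADD ROW ACCESS POLICY region_rows ON (region);
```

Both policy types are evaluated at query time, so no copies of the data are made and the same table serves differently privileged users.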

Data Encryption

Snowflake is designed to minimize risk by encrypting data at rest and in motion. End-to-end encryption (E2EE) is a form of communication in which no one but end users can read the data. All data files are encrypted throughout each stage of a data movement pipeline.

Snowflake encrypts all data using a key hierarchy (with cloud HSM-backed root of trust), which provides enhanced security by encrypting individual pieces of information with a distinct key. Snowflake also allows customers to use a customer-managed key (CMK) in the encryption process, via a feature called Tri-Secret Secure.
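Tri-Secret Secure itself is enabled in coordination with Snowflake after you create a key in your cloud provider's key management service, so there is no single DDL statement for it. A related, self-service control is periodic rekeying of data at rest, sketched here:

```sql
-- Rotate (rekey) data-at-rest encryption keys on Snowflake's annual cycle.
-- Requires ACCOUNTADMIN and an Enterprise (or higher) edition account.
ALTER ACCOUNT SET PERIODIC_DATA_REKEYING = TRUE;
```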


Snowflake security involves a deep understanding of the three security layers: network security, IAM, and data encryption. By following best practices for securing Snowflake, you can help protect your data from unauthorized access and theft.

If you’re ready to move to the next level of your data-driven enterprise journey, contact Data Solutions.


David Callaghan, Solutions Architect

As a solutions architect with Perficient, I bring twenty years of development experience and I'm currently hands-on with Hadoop/Spark, blockchain and cloud, coding in Java, Scala and Go. I'm certified in and work extensively with Hadoop, Cassandra, Spark, AWS, MongoDB and Pentaho. Most recently, I've been bringing integrated blockchain (particularly Hyperledger and Ethereum) and big data solutions to the cloud with an emphasis on integrating modern data products such as HBase, Cassandra and Neo4J as the off-blockchain repository.
