Build completely private APIs in Snowflake
Glenn Gillen
VP of Product, GTM
If you've stored data in Snowflake, chances are you've also got an enterprise application that needs to access it, and a common way to do that is through an HTTP API. This is your organization's private data, though, so exposing it through an API that is reachable from the public internet is not acceptable.
In this guide I'm going to walk you through how to build, deploy, and host a private custom API powered by Snowflake.
The API will not have an endpoint exposed to the Internet. Your application will, instead, access this API over private endpoints that are only available within your enterprise's VPC and other private environments.
The example will build a reporting endpoint (in Python) to return data from the TPC-H dataset already included in your Snowflake account.
Prerequisites
- A Snowflake account in an AWS commercial region.
- Privileges necessary to create a user, database, warehouse, compute pool, repository, network rule, external access integration, and service in Snowflake.
- Privileges necessary to access the tables in the SNOWFLAKE_SAMPLE_DATA.TPCH_SF10 database and schema
- Access to run SQL in the Snowflake console or SnowSQL
- An Ockam account to securely expose your private API
- Docker
- Git
- Basic knowledge of Snowflake, Docker, Git, SQL, and Python
Setup Snowflake
Python API code
The code in this guide comes from a lab that Brad Culberson and the team at Snowflake deliver to customers. You won't need to make any changes to the code, but you will need to clone the code locally so that we can build the container images with Docker.
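Cloning looks like the following; the repository URL here is a placeholder, not the real location, so substitute the one from the lab materials.

```shell
# Placeholder URL and directory name; use the repository from the lab.
git clone "<repository-url>" snowflake-api-lab
cd snowflake-api-lab
```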
The API has implementations using both the Snowflake Connector (src/connector.py) and Snowpark (src/snowpark.py) to query the data. The two implementations are only there to serve as examples of how each approach would be used; you only need one! Choose whichever approach you prefer for your own APIs.

I will only discuss the Snowflake Connector approach in this post, as its SQL syntax will be more recognisable to those who haven't used Snowflake before.
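To give a feel for the Connector approach, here is a minimal sketch of a reporting query against the TPC-H sample data. The report (top customers by revenue), the environment variable names, and the function names are my own illustrative assumptions, not the lab's exact code.

```python
# Sketch of a report endpoint's data layer using snowflake-connector-python.
# The query, env var names, and helpers are illustrative assumptions.
import os


def top_customers_query(limit: int = 10) -> str:
    """Build the SQL for a top-customers-by-revenue report on TPC-H data."""
    # Validate limit before interpolating it, so the query stays safe.
    if not isinstance(limit, int) or limit <= 0:
        raise ValueError("limit must be a positive integer")
    return (
        "SELECT c.c_name AS customer, "
        "SUM(o.o_totalprice) AS total_revenue "
        "FROM snowflake_sample_data.tpch_sf10.customer c "
        "JOIN snowflake_sample_data.tpch_sf10.orders o "
        "ON o.o_custkey = c.c_custkey "
        "GROUP BY c.c_name "
        f"ORDER BY total_revenue DESC LIMIT {limit}"
    )


def fetch_report(limit: int = 10):
    """Run the report; requires Snowflake credentials in the environment."""
    import snowflake.connector  # pip install snowflake-connector-python

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse=os.environ.get("SNOWFLAKE_WAREHOUSE"),
    )
    try:
        # DictCursor returns rows as dicts, which serialize cleanly to JSON.
        cur = conn.cursor(snowflake.connector.DictCursor)
        cur.execute(top_customers_query(limit))
        return cur.fetchall()
    finally:
        conn.close()
```

An HTTP framework of your choice can then wrap fetch_report() and return the rows as JSON.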
Build & publish app container
We're ready to build the API into a Docker container and make the resulting image available in an image repository in Snowflake.
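The build-and-push flow looks roughly like this. The registry hostname format is Snowflake's, but the database, schema, repository, and tag names (api, public, api_repo, api:latest) are assumptions; substitute your own.

```shell
# Assumed names; substitute your account's registry, database, schema,
# repository, and image tag.
REGISTRY="<orgname>-<acctname>.registry.snowflakecomputing.com"
IMAGE="$REGISTRY/api/public/api_repo/api:latest"

# Log in to the Snowflake image registry with your Snowflake credentials.
docker login "$REGISTRY" -u "<your_snowflake_user>"

# Snowpark Container Services runs linux/amd64 images, so build for that
# platform explicitly (important on Apple Silicon machines).
docker build --platform linux/amd64 -t "$IMAGE" .
docker push "$IMAGE"
```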
Run in Snowflake
Switch back to your Snowflake console or SnowSQL and we will configure Snowflake to run our container.
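As a sketch of the objects involved, the SQL below creates an image repository, a compute pool, and a service that runs a published container. All names (api, api_repo, api_pool, api_svc), the instance family, and the spec details are illustrative assumptions rather than the lab's exact script.

```sql
-- Illustrative names and sizes; adjust to your environment.
CREATE DATABASE IF NOT EXISTS api;
CREATE IMAGE REPOSITORY IF NOT EXISTS api.public.api_repo;

CREATE COMPUTE POOL IF NOT EXISTS api_pool
  MIN_NODES = 1
  MAX_NODES = 1
  INSTANCE_FAMILY = CPU_X64_XS;

-- Run the container image published to the repository above.
CREATE SERVICE api.public.api_svc
  IN COMPUTE POOL api_pool
  FROM SPECIFICATION $$
spec:
  containers:
  - name: api
    image: /api/public/api_repo/api:latest
  endpoints:
  - name: http
    port: 8000
$$;
```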
Setup Ockam
It's now time to set up Ockam to allow you to securely connect your private systems. We're going to run the following commands in a terminal on your local workstation.
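The shape of the setup is sketched below; flag names vary between Ockam CLI versions, and the node and ticket names here are my own assumptions, so treat this as an outline rather than an exact script.

```shell
# Authenticate the Ockam CLI with your Ockam account.
ockam enroll

# Create one-time enrollment tickets: one for the node running alongside
# the API in Snowflake, one for the client side inside your VPC.
ockam project ticket --usage-count 1 > snowflake.ticket
ockam project ticket --usage-count 1 > client.ticket

# On a machine inside your VPC, create a node and a TCP inlet so that
# connections to localhost:8080 are relayed privately to the API.
ockam node create client --enrollment-ticket client.ticket
ockam tcp-inlet create --at /node/client --from 127.0.0.1:8080
```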
Clean up
Once you're done with this demo you may want to remove everything we've set up.
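On the Snowflake side, cleanup is a handful of DROP statements. The object names below are hypothetical; substitute whatever you created while following along, and note that a service must be dropped before its compute pool.

```sql
-- Hypothetical names; substitute the objects you actually created.
DROP SERVICE IF EXISTS api.public.api_svc;
DROP COMPUTE POOL IF EXISTS api_pool;
DROP DATABASE IF EXISTS api;
DROP WAREHOUSE IF EXISTS api_wh;
```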
Next steps
In this demo we've built an example API that returns data from Snowflake as JSON, and shown how to restrict connectivity to that API to specific clients within your organization.

Any data within Snowflake can be securely shared with any other system that can retrieve data via a JSON REST API. You could further extend this example by adding authentication to the Python app and implementing Role-Based Access Control (RBAC) to lock down which data an authenticated client can retrieve.
If you'd like to explore some other capabilities of Ockam I'd recommend:
- Real-time pipelines from Snowflake to Kafka
- Adding security as a feature in your SaaS product
- Zero-trust data streaming with Redpanda Connect