| authors | date | version | snow_cli_version | tags |
| --- | --- | --- | --- | --- |
| Kamesh Sampath | 2024-06-10 | v1 | 2.4.1 | |
Snowflake CLI is the next-gen command-line utility for interacting with Snowflake.

> [!NOTE]
>
> - All commands have a `--help` option.
> - All commands allow the output format to be `table` (default) or `json`.
```shell
pip install -U snowflake-cli-labs
```

or

```shell
pipx install snowflake-cli-labs
```
Check the available commands and options:

```shell
snow --help
```

General information about the CLI and Python versions, default configuration path, etc.:

```shell
snow --info
```
> [!TIP]
> The connection configuration `config.toml` is by default stored under `$HOME/.snowflake`.
> If you wish to change it, set the environment variable `$SNOWFLAKE_HOME`[^1] to the directory
> where you want to store `config.toml`.
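For example, to keep the configuration in a custom directory (the path below is just an illustration):

```shell
# hypothetical config directory; any writable path works
export SNOWFLAKE_HOME="$HOME/.config/snowflake"
```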
Add a connection named `cheatsheets` by following the prompts:

```shell
snow connection add
```
Add a connection using command options:

```shell
snow connection add --connection-name cheatsheets \
  --account <your-account-identifier> \
  --user <your-user> \
  --password <your-password>
```
> [!NOTE]
> Currently you need to follow the prompts for the defaults or pass the other parameters explicitly.
List all configured connections:

```shell
snow connection list
```

Set `cheatsheets` as the default connection:

```shell
snow connection set-default cheatsheets
```

Test the `cheatsheets` connection:

```shell
snow connection test -c cheatsheets
```
> [!TIP]
> If you don't specify `-c`, the test runs against the default connection set in the config.
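For example, with `cheatsheets` set as the default above:

```shell
# no -c flag: tests the default connection
snow connection test
```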
Supported LLMs[^2]:

- Large
  - reka-core
  - llama3-70b
  - mistral-large
- Medium
  - snowflake-arctic (default)
  - reka-flash
  - mixtral-8x7b
  - llama2-70b-chat
- Small
  - llama3-8b
  - mistral-7b
  - gemma-7b
Generate a response for a given prompt:

```shell
snow cortex complete "Tell me about Snowflake"
```

With a specific supported LLM:

```shell
snow cortex complete "Tell me about Snowflake" --model=mistral-7b
```

With history:

```shell
snow cortex complete --file samples/datacloud.json
```
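A plausible shape for `samples/datacloud.json`, assuming it mirrors the conversation-array format of the Cortex `COMPLETE` function (hypothetical content; the actual file lives in the cheatsheet repo):

```json
[
  { "role": "user", "content": "What is the Data Cloud?" },
  { "role": "assistant", "content": "The Data Cloud is Snowflake's global data network..." },
  { "role": "user", "content": "How do I join it?" }
]
```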
Get an answer to a question from a text:

```shell
snow cortex extract-answer 'what does snowpark do ?' 'Snowpark provides a set of libraries and runtimes in Snowflake to securely deploy and process non-SQL code, including Python, Java and Scala.'
```

Get answers to questions from a text file:

```shell
snow cortex extract-answer 'What does Snowflake eliminate?' --file samples/answers.txt
snow cortex extract-answer 'What non-SQL code Snowpark process?' --file samples/answers.txt
```
Sentiment scores range from `-1` (most negative) to `1` (most positive):

| Sentiment Score | Sentiment |
| --- | --- |
| 1 | Positive |
| -1 | Negative |

A positive sentiment (score: `0.64`) from a text:

```shell
snow cortex sentiment 'Snowflake is a awesome company to work.'
```

A negative sentiment (approx. score `-0.4`) from a text file:

```shell
snow cortex sentiment --file samples/sentiment.txt
```
Summarize a text:

```shell
snow cortex summarize 'SnowCLI is next gen command line utility to interact with Snowflake. It supports manipulating lot of Snowflake objects from command line.'
```

Summarize text from a file:

```shell
snow cortex summarize --file samples/asl_v2.txt
```
Currently supported languages:

- English (`en`)
- French (`fr`)
- German (`de`)
- Polish (`pl`)
- Japanese (`ja`)
- Korean (`ko`)
- Italian (`it`)
- Portuguese (`pt`)
- Spanish (`es`)
- Swedish (`sv`)
- Russian (`ru`)
Translate a text from English to French:

```shell
snow cortex translate --from en --to fr 'snowflake is an awesome company to work for.'
```

Translate a text in a file from English to Spanish:

```shell
snow cortex translate --from en --to es --file samples/translate.txt
```
Simple one-line query:

```shell
snow sql -q 'CREATE DATABASE FOO'
```

Load DDL/DML commands from a file:

```shell
snow sql --filename my_objects.sql
```
Using standard input (`stdin`):

```shell
cat <<EOF | snow sql --stdin
CREATE OR REPLACE DATABASE FOO;
USE DATABASE FOO;
CREATE OR REPLACE SCHEMA CLI;
USE SCHEMA CLI;
CREATE OR ALTER TABLE employees(
  id int,
  first_name string,
  last_name string,
  dept int
);
EOF
```
Use the following command to see the list of supported objects:

```shell
snow object list --help
```

List all available warehouses for the current role:

```shell
snow object list warehouse
```

List all available databases for the current role:

```shell
snow object list database
```

List all databases in `json` format:

```shell
snow object list database --format json
```
> [!TIP]
> With `json` output you can extract values using tools like `jq`, e.g. to get only the names of the databases:
>
> ```shell
> snow object list database --format json | jq '.[].name'
> ```
List databases with `snow` in their name:

```shell
snow object list database --like '%snow%'
```
List all schemas:

```shell
snow object list schema
```

Filter schemas by the database named `foo`:

```shell
snow object list schema --in database foo
```
List all tables:

```shell
snow object list table
```

List tables in the schema `cli` of database `foo`:

```shell
snow object list table --database foo --in schema cli
```
Describe the table `employees` in schema `cli` of database `foo`:

```shell
snow object describe table employees --database foo --schema cli
```
Drop the table `employees` in schema `cli` of database `foo`:

```shell
snow object drop table employees --database foo --schema cli
```
Create a Streamlit application to deploy to Snowflake:

```shell
snow streamlit init streamlit_app
```

Create a warehouse that the Streamlit application will use:

```shell
snow sql -q 'CREATE WAREHOUSE my_streamlit_warehouse'
```

Create a database that the Streamlit application will use:

```shell
snow sql -q 'CREATE DATABASE my_streamlit_app'
```

Deploy the application:

> [!IMPORTANT]
> Ensure you are in the Streamlit application folder before running the command.

```shell
snow streamlit deploy --database=my_streamlit_app
```
List all available Streamlit applications:

```shell
snow streamlit list
```

Get details about the Streamlit application `streamlit_app` in schema `public` of database `my_streamlit_app`:

```shell
snow streamlit describe streamlit_app --schema=public --database=my_streamlit_app
```
> [!NOTE]
> When describing a Streamlit application, either provide the schema as a parameter or use the fully qualified name.
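For example, using the fully qualified name instead of the `--schema` parameter:

```shell
snow streamlit describe my_streamlit_app.public.streamlit_app
```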
Get the Streamlit application URL, i.e. the URL used to access the hosted application:

```shell
snow streamlit get-url streamlit_app --database=my_streamlit_app
```

Drop the Streamlit application named `streamlit_app` in schema `public` of database `my_streamlit_app`:

```shell
snow streamlit drop streamlit_app --schema=public --database=my_streamlit_app
```
SnowCLI allows managing internal stages.

Create a stage named `cli_stage` in schema `cli` of database `foo`:

```shell
snow stage create cli_stage --schema=cli --database=foo
```
Get details of the stage:

```shell
snow stage describe cli_stage --schema=cli --database=foo
```

List all available stages:

```shell
snow stage list
```

List stages in the database named `foo`:

```shell
snow stage list --in database foo
```

List stages whose names start with `cli` in database `foo`:

```shell
snow stage list --like 'cli%' --in database foo
```
Download employees.csv:

```shell
curl -sSL -o employees.csv https://raw.githubusercontent.com/Snowflake-Labs/sf-cheatsheets/main/samples/employees.csv
```

Copy `employees.csv` to the path `/data` of stage `cli_stage`:

```shell
snow stage copy employees.csv '@cli_stage/data' --schema=cli --database=foo
```
List all files in stage `cli_stage` in schema `cli` of database `foo`:

```shell
snow stage list-files cli_stage --schema=cli --database=foo
```

List files by pattern:

```shell
snow stage list-files cli_stage --pattern='.*[.]csv' --schema=cli --database=foo
```
Download load_employees.sql:

```shell
curl -sSL -o load_employees.sql https://raw.githubusercontent.com/Snowflake-Labs/sf-cheatsheets/main/samples/load_employees.sql
```

Copy `load_employees.sql` to stage `cli_stage` at path `/sql`:

```shell
snow stage copy load_employees.sql '@cli_stage/sql' --schema=cli --database=foo
```

Execute the SQL[^3] from the stage:

```shell
snow stage execute '@cli_stage/sql/load_employees.sql' --schema=cli --database=foo
```
> [!NOTE]
> `execute` accepts a glob pattern, allowing you to specify which files to execute. `@stage/*` and `@stage/*.sql` both execute only SQL files.
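For example, to execute every SQL file under the `/sql` path of the stage:

```shell
snow stage execute '@cli_stage/sql/*' --schema=cli --database=foo
```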
Query all employees to make sure the load worked:

```shell
snow sql --schema=cli --database=foo -q 'SELECT * FROM EMPLOYEES'
```
Download variables.sql:

```shell
curl -sSL -o variables.sql https://raw.githubusercontent.com/Snowflake-Labs/sf-cheatsheets/main/samples/variables.sql
```

Copy `variables.sql` to the stage:

```shell
snow stage copy variables.sql '@cli_stage/sql' --schema=cli --database=foo
```
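For reference, a minimal sketch of what `samples/variables.sql` might contain (hypothetical content; the actual file lives in the cheatsheet repo). It creates the `EMPLOYEE_DEPT_VIEW` mentioned below, with `{{.dept}}` substituted at execution time:

```sql
-- hypothetical sketch of samples/variables.sql
CREATE OR REPLACE VIEW EMPLOYEE_DEPT_VIEW AS
SELECT id, first_name, last_name
  FROM employees
 WHERE dept = {{.dept}};
```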
Execute files from the stage, supplying values for template variables (`{{.dept}}` in variables.sql):

```shell
snow stage execute '@cli_stage/sql/variables.sql' --variable="dept=1" --schema=cli --database=foo
```
Executing `variables.sql` will have created a view named `EMPLOYEE_DEPT_VIEW`; list the view to see the variables replaced:

```shell
snow object list view --like 'emp%' --database=foo --schema=cli
```
> [!NOTE]
> SnowCLI supports templating using `{{...}}` and `&{...}`:
>
> - `{{...}}` is the preferred templating for server-side processing, e.g. Jinja templating
> - `&{...}` is the preferred templating for client-side processing
> - All client-side context variables can be accessed using `&{ctx.env.<var>}`, e.g. `&{ctx.env.USER}` returns the current OS user
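A quick way to see client-side substitution in action (a sketch; the literal is rendered locally before the query is sent to Snowflake):

```shell
snow sql -q "SELECT '&{ctx.env.USER}' AS os_user"
```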
Remove all files from stage `cli_stage` under the path `/data`:

```shell
snow stage remove cli_stage 'data/' --schema=cli --database=foo
```
Create a Snowflake Native App `my_first_app` in the current working directory:

```shell
snow app init my_first_app
```

Create a Snowflake Native App named `my-first-app` in the directory `my_first_app`:

```shell
snow app init --name 'my-first-app' my_first_app
```
> [!NOTE]
> Since the name becomes part of the application URL, it's recommended to use URL-safe names.
Create a Snowflake Native App with the Streamlit Python template:

```shell
snow app init my_first_app --template streamlit-python
```
> [!NOTE]
> You can also create your own Snowflake Native App template and use `--template-repo` instead, to scaffold your Native App from your template.
From the application directory, i.e. after `cd my_first_app`, run the app:

```shell
snow app run
```
> [!IMPORTANT]
> The version name should be a valid SQL identifier, i.e. no dots, no dashes, and starting with a character; version labels usually use `v`.
Create a development version named `dev`:

```shell
snow app version create
```

Create a development version named `v1_0`:

```shell
snow app version create v1_0
```
List available versions:

```shell
snow app version list
```

Drop the version `v1_0`:

```shell
snow app version drop v1_0
```
Deploy a particular version of an application:

```shell
snow app run --version=v1_0
```

Deploy a particular version and patch:

```shell
snow app run --version=v1_0 --patch=1
```
> [!NOTE]
> Version patches are automatically incremented when creating a version with the same name.
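For example, re-running the create command with an existing version name adds a new patch to `v1_0`:

```shell
# creates the next patch of v1_0 if v1_0 already exists
snow app version create v1_0
```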
Open the application in a browser:

```shell
snow app open
```

Synchronize the local application file changes with the stage without creating/updating the running application:

```shell
snow app deploy
```
Tear down the application:

```shell
snow app teardown
```

If the application has a version associated with it, first drop the version:

```shell
snow app version drop
```

And then drop the application:

```shell
snow app teardown
```

Drop the application and its associated database objects:

```shell
snow app teardown --cascade
```
> [!IMPORTANT]
>
> - SPCS is available only in certain AWS regions and is not available for trial accounts.
> - All Snowpark Containers run on a defined compute pool.
List of available instance families[^4]:
- CPU_X64_XS
- CPU_X64_S
- CPU_X64_M
- CPU_X64_L
- HIGHMEM_X64_S
- HIGHMEM_X64_M
- HIGHMEM_X64_L
- GPU_NV_S
- GPU_NV_M
- GPU_NV_L
Create a compute pool named `my_xs_compute_pool` with family `CPU_X64_XS`:

```shell
snow spcs compute-pool create my_xs_compute_pool \
  --family CPU_X64_XS
```
Create with "if not exists":

```shell
snow spcs compute-pool create my_xs_compute_pool \
  --family CPU_X64_XS --if-not-exists
```
Create initially suspended (default: not suspended initially):

```shell
snow spcs compute-pool create my_xs_compute_pool \
  --family CPU_X64_XS --init-suspend
```
Create with auto-suspend (default: `3600` secs) set to 2 mins (`120` secs):

```shell
snow spcs compute-pool create my_xs_compute_pool \
  --family CPU_X64_XS --auto-suspend-secs=120
```
Create with minimum nodes (scale down) as `1` (default) and maximum nodes (scale up) as `3`:

```shell
snow spcs compute-pool create my_xs_compute_pool \
  --family CPU_X64_XS --min-nodes=1 --max-nodes=3
```
Create with auto-resume on service/job request:

```shell
snow spcs compute-pool create my_xs_compute_pool \
  --family CPU_X64_XS --auto-resume
```
Create with auto-resume disabled:

> [!NOTE]
> With auto-resume disabled, the compute pool must be started manually.

```shell
snow spcs compute-pool create my_xs_compute_pool \
  --family CPU_X64_XS --no-auto-resume
```
List all available compute pools for the current role:

```shell
snow spcs compute-pool list
```

List compute pools like `my_xs%`:

```shell
snow spcs compute-pool list --like 'my_xs%'
```
Get details about a compute pool:

```shell
snow spcs compute-pool describe my_xs_compute_pool
```

Check the current status of a compute pool:

```shell
snow spcs compute-pool status my_xs_compute_pool
```

Suspend a compute pool:

```shell
snow spcs compute-pool suspend my_xs_compute_pool
```

Resume a compute pool:

```shell
snow spcs compute-pool resume my_xs_compute_pool
```
You can `set`/`unset` the following properties on a compute pool after it's created:

| Option | Description |
| --- | --- |
| `--min-nodes` | Minimum node(s) |
| `--max-nodes` | Maximum node(s) |
| `--auto-resume` | Enable auto-resume |
| `--no-auto-resume` | Disable auto-resume |
| `--auto-suspend-secs` | Auto-suspend in seconds |
| `--comment` | Comment |
Add a comment to the compute pool:

```shell
snow spcs compute-pool set --comment 'my small compute pool' my_xs_compute_pool
```

Remove the comment from the compute pool:

```shell
snow spcs compute-pool unset --comment my_xs_compute_pool
```

Delete all services running on a compute pool:

```shell
snow spcs compute-pool stop-all my_xs_compute_pool
```

Drop the compute pool:

```shell
snow spcs compute-pool drop my_xs_compute_pool
```
> [!IMPORTANT]
> This requires Docker on the local system.

Log in to the image registry:

```shell
snow spcs image-registry login
```

Get the current user's token to access the image registry:

```shell
snow spcs image-registry token
```

Get the image registry URL:

```shell
snow spcs image-registry url
```
> [!IMPORTANT]
>
> - A database and schema are required to create the image repository.
> - Services can't be created using `ACCOUNTADMIN`; a custom role is required.

The SQL script defines the role, grants, and warehouse. As `ACCOUNTADMIN`, run the script to set up the required Snowflake resources:

- A role named `cheatsheets_spcs_demo_role` to create Snowpark Container Services
- A database named `CHEATSHEETS_DB` to which the services will be attached
- A schema named `DATA_SCHEMA` on DB `CHEATSHEETS_DB` to hold the image repository
- A warehouse `cheatsheets_spcs_wh_s` which will be used to run queries from services
Set your Snowflake account user name:

```shell
export SNOWFLAKE_USER=<your snowflake user>
```

Run `spcs_setup.sql` to create the aforementioned Snowflake objects:

```shell
curl https://raw.githubusercontent.com/Snowflake-Labs/sf-cheatsheets/main/samples/spcs_setup.sql |
  snow sql --stdin
```
Create an image repository named `my_image_repository`:

```shell
snow spcs image-repository create my_image_repository \
  --database='CHEATSHEETS_DB' \
  --schema='DATA_SCHEMA' \
  --role='cheatsheets_spcs_demo_role'
```
Create with "if not exists":

```shell
snow spcs image-repository create my_image_repository \
  --database='CHEATSHEETS_DB' \
  --schema='DATA_SCHEMA' \
  --role='cheatsheets_spcs_demo_role' \
  --if-not-exists
```
Replace the image repository `my_image_repository`:

```shell
snow spcs image-repository create my_image_repository \
  --database='CHEATSHEETS_DB' \
  --schema='DATA_SCHEMA' \
  --role='cheatsheets_spcs_demo_role' \
  --replace
```
List all image repositories in the database and schema:

```shell
snow spcs image-repository list \
  --database='CHEATSHEETS_DB' \
  --schema='DATA_SCHEMA' \
  --role='cheatsheets_spcs_demo_role'
```
Get the URL of the image repository `my_image_repository`:

```shell
snow spcs image-repository url my_image_repository \
  --database='CHEATSHEETS_DB' \
  --schema='DATA_SCHEMA' \
  --role='cheatsheets_spcs_demo_role'
```
Let us push a sample image to the repository:

```shell
IMAGE_REPOSITORY=$(snow spcs image-repository url my_image_repository \
  --database='CHEATSHEETS_DB' \
  --schema='DATA_SCHEMA' \
  --role='cheatsheets_spcs_demo_role')
docker pull --platform=linux/amd64 nginx
docker tag nginx "$IMAGE_REPOSITORY/nginx"
docker push "$IMAGE_REPOSITORY/nginx"
```
List all images in repository `my_image_repository`:

```shell
snow spcs image-repository list-images my_image_repository \
  --database='CHEATSHEETS_DB' \
  --schema='DATA_SCHEMA' \
  --role='cheatsheets_spcs_demo_role'
```
List all tags for the image `nginx` in repository `my_image_repository`:

> [!IMPORTANT]
> The `--image-name` should be the fully qualified name. Use `list-images` to get the fully qualified image name.

```shell
snow spcs image-repository list-tags my_image_repository \
  --image-name=/CHEATSHEETS_DB/DATA_SCHEMA/my_image_repository/nginx \
  --database='CHEATSHEETS_DB' \
  --schema='DATA_SCHEMA' \
  --role='cheatsheets_spcs_demo_role'
```
Drop the image repository `my_image_repository`:

```shell
snow spcs image-repository drop my_image_repository \
  --database='CHEATSHEETS_DB' \
  --schema='DATA_SCHEMA' \
  --role='cheatsheets_spcs_demo_role'
```
Create an SPCS service specification[^5] file.

> [!TIP]
> Tools like `jq` can help extract data from the command output, e.g. to get the image name:
>
> ```shell
> export IMAGE=$(snow spcs image-repository list-images my_image_repository \
>   --database='CHEATSHEETS_DB' \
>   --schema='DATA_SCHEMA' --format json | jq -r '.[0].image')
> ```

```shell
cat <<EOF | tee work/service-spec.yaml
spec:
  containers:
  - name: nginx
    image: $IMAGE
    readinessProbe:
      port: 80
      path: /
  endpoints:
  - name: nginx
    port: 80
    public: true
EOF
```
Create a service named `nginx` using compute pool `my_xs_compute_pool` and specification `work/service-spec.yaml`:

```shell
snow spcs service create nginx \
  --compute-pool=my_xs_compute_pool \
  --spec-path=work/service-spec.yaml \
  --database='CHEATSHEETS_DB' \
  --schema='DATA_SCHEMA' \
  --role='cheatsheets_spcs_demo_role'
```
Create a service with "if not exists":

```shell
snow spcs service create nginx \
  --compute-pool=my_xs_compute_pool \
  --spec-path=work/service-spec.yaml \
  --if-not-exists \
  --database='CHEATSHEETS_DB' \
  --schema='DATA_SCHEMA' \
  --role='cheatsheets_spcs_demo_role'
```
Create with minimum instances `1` (default) and maximum instances `3`:

```shell
snow spcs service create nginx \
  --compute-pool=my_xs_compute_pool \
  --spec-path=work/service-spec.yaml \
  --min-instances=1 \
  --max-instances=3 \
  --database='CHEATSHEETS_DB' \
  --schema='DATA_SCHEMA' \
  --role='cheatsheets_spcs_demo_role'
```
Create a service that uses a specific warehouse `cheatsheets_spcs_wh_s` for all its queries:

```shell
snow spcs service create nginx \
  --compute-pool=my_xs_compute_pool \
  --spec-path=work/service-spec.yaml \
  --query-warehouse='cheatsheets_spcs_wh_s' \
  --database='CHEATSHEETS_DB' \
  --schema='DATA_SCHEMA' \
  --role='cheatsheets_spcs_demo_role'
```
Check the service status:

> [!NOTE]
> It will take a few minutes for the service to reach `READY` status.

```shell
snow spcs service status nginx \
  --database='CHEATSHEETS_DB' \
  --schema='DATA_SCHEMA' \
  --role='cheatsheets_spcs_demo_role'
```
Get more details about the service:

```shell
snow spcs service describe nginx \
  --database='CHEATSHEETS_DB' \
  --schema='DATA_SCHEMA' \
  --role='cheatsheets_spcs_demo_role'
```
Check the logs of the service for the container named `nginx` with instance `0`:

> [!NOTE]
> Find the `instanceId` and `containerName` using the `describe` command.

```shell
snow spcs service logs nginx \
  --container-name=nginx \
  --instance-id=0 \
  --database='CHEATSHEETS_DB' \
  --schema='DATA_SCHEMA' \
  --role='cheatsheets_spcs_demo_role'
```
List all available services:

```shell
snow spcs service list \
  --database='CHEATSHEETS_DB' \
  --schema='DATA_SCHEMA' \
  --role='cheatsheets_spcs_demo_role'
```

Query services in a database:

```shell
snow spcs service list --in database CHEATSHEETS_DB \
  --role='cheatsheets_spcs_demo_role'
```

Query services in a database with names like `ng%`:

```shell
snow spcs service list --in database CHEATSHEETS_DB --like 'ng%' \
  --role='cheatsheets_spcs_demo_role'
```
List the service endpoints for the service `nginx`:

```shell
snow spcs service list-endpoints nginx \
  --database='CHEATSHEETS_DB' \
  --schema='DATA_SCHEMA' \
  --role='cheatsheets_spcs_demo_role'
```
> [!NOTE]
> Opening the `ingress_url` in the browser will take you to the NGINX home page after authentication.
Suspend the service:

```shell
snow spcs service suspend nginx \
  --database='CHEATSHEETS_DB' \
  --schema='DATA_SCHEMA' \
  --role='cheatsheets_spcs_demo_role'
```

Resume the service:

```shell
snow spcs service resume nginx \
  --database='CHEATSHEETS_DB' \
  --schema='DATA_SCHEMA' \
  --role='cheatsheets_spcs_demo_role'
```
> [!NOTE]
> Resuming the service will take a few minutes; use the `status` command to check on it.
You can `set`/`unset` the following properties on a service even after it's created:

| Option | Description |
| --- | --- |
| `--min-instances` | Minimum number of service instance(s), typically used while scaling down |
| `--max-instances` | Maximum number of service instance(s), typically used while scaling up |
| `--auto-resume` | Enable auto-resume |
| `--no-auto-resume` | Disable auto-resume |
| `--query-warehouse` | The warehouse to use for queries from the service |
| `--comment` | Comment for the service |
Add a comment to the service:

```shell
snow spcs service set --comment 'the nginx service' nginx \
  --database='CHEATSHEETS_DB' \
  --schema='DATA_SCHEMA' \
  --role='cheatsheets_spcs_demo_role'
```
Use `service describe` to check the updated property:
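```shell
# same describe command as above; the comment appears in the output
snow spcs service describe nginx \
  --database='CHEATSHEETS_DB' \
  --schema='DATA_SCHEMA' \
  --role='cheatsheets_spcs_demo_role'
```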
Remove the comment from the service:

```shell
snow spcs service unset --comment nginx \
  --database='CHEATSHEETS_DB' \
  --schema='DATA_SCHEMA' \
  --role='cheatsheets_spcs_demo_role'
```
Upgrade the service `nginx` with a new specification, e.g. a tag upgrade or probe updates:

```shell
snow spcs service upgrade nginx \
  --spec-path=work/service-spec_V2.yaml \
  --database='CHEATSHEETS_DB' \
  --schema='DATA_SCHEMA' \
  --role='cheatsheets_spcs_demo_role'
```
Drop the service named `nginx`:

```shell
snow spcs service drop nginx \
  --database='CHEATSHEETS_DB' \
  --schema='DATA_SCHEMA' \
  --role='cheatsheets_spcs_demo_role'
```
> [!NOTE]
> SPCS has compute associated with it; run the cleanup script to remove the Snowflake resources created as part of this cheatsheet.

```shell
curl https://raw.githubusercontent.com/Snowflake-Labs/sf-cheatsheets/main/samples/spcs_cleanup.sql |
  snow sql --stdin
```
- Snowflake Developers::Quickstart
- Snowflake Developers::Getting Started With Snowflake CLI
- Intro to Snowpark Container Services
- Build a Data App and run it on Snowpark Container Services
- Build Rag Based Equipment Maintenance App Using Snowflake Cortex
- Build a Retrieval Augmented Generation (RAG) based LLM assistant using Streamlit and Snowflake Cortex
- Snowflake CLI
- Execute Immediate Jinja Templating
- Snowpark Native App Framework
- Snowpark Container Services
- Snowflake Cortex LLM Functions
Footnotes

[^1]: <https://docs.snowflake.com/developer-guide/snowflake-cli-v2/connecting/specify-credentials#how-to-use-environment-variables-for-snowflake-credentials>
[^2]: <https://docs.snowflake.com/en/user-guide/snowflake-cortex/llm-functions#choosing-a-model>
[^3]: <https://docs.snowflake.com/en/sql-reference/sql/execute-immediate>
[^4]: <https://docs.snowflake.com/en/sql-reference/sql/create-compute-pool>
[^5]: <https://docs.snowflake.com/en/developer-guide/snowpark-container-services/specification-reference>