SnowPro Core Certification: Passing the Exam and Next Steps (7/8) [Series Finale]
✅ Exam registration and platform
✅ Practice exams
✅ Hands-on practice
✅ Issues you may encounter
✅ The certificate
✅ Next steps
- SnowPro® Advanced Data Engineer
- SnowPro® Advanced Data Analyst
- SnowPro® Advanced Administrator
- SnowPro® Advanced Architect
- SnowPro® Advanced Data Scientist
The Snowflake SQL API provides operations that we can use to:
Submit SQL statements for execution.
Check the status of the execution of a statement.
Cancel the execution of a statement.
Fetch query results concurrently.
Snowflake has the following Cloud Partner Categories:
Data Integration
Business Intelligence (BI)
Machine Learning & Data Science
Security Governance & Observability
SQL Development & Management, and
Native Programmatic Interfaces.
The benefits of The Data Cloud are Access, Governance, and Action (AGA).
Access means that organizations can easily discover data and share it internally or with third parties without regard to geographical location.
Governance is about setting policies and rules and protecting the data in a way that can unlock new value and collaboration while maintaining the highest levels of security and compliance.
Action means you can empower every part of your business with data to build better products, make faster decisions, create new revenue streams, and realize the value of your greatest untapped asset: your data.
STANDARD & BUSINESS CRITICAL FEATURE
Database and share replication are available to all accounts.
Replication of other account objects & failover/failback require Business Critical Edition (or higher). To inquire about upgrading, please contact Snowflake Support.
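As a sketch of how database replication is configured (the database name `sales_db` and the account identifiers are made up for illustration):

```sql
-- On the source account: allow this database to be replicated to another account
ALTER DATABASE sales_db ENABLE REPLICATION TO ACCOUNTS myorg.account2;

-- On the target account: create a local replica of the primary database
CREATE DATABASE sales_db AS REPLICA OF myorg.account1.sales_db;

-- Refresh the replica to pull the latest changes from the primary
ALTER DATABASE sales_db REFRESH;
```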
The SHOW PARAMETERS command determines whether a network policy is set on the account or for a specific user.
At the account level: SHOW PARAMETERS LIKE 'network_policy' IN ACCOUNT;
At the user level: SHOW PARAMETERS LIKE 'network_policy' IN USER <username>;
Example: SHOW PARAMETERS LIKE 'network_policy' IN USER john;
Tri-Secret Secure refers to the combination of a Snowflake-managed key and a customer-managed key, which results in the creation of a composite master key to protect your data. Tri-Secret Secure requires the Business Critical edition as a minimum and can be activated by contacting Snowflake support.
https://docs.snowflake.com/en/user-guide/security-encryption-manage
By default, Snowflake manages encryption keys automatically, requiring no customer intervention. Snowflake-managed keys are
--> rotated regularly (at 30-day intervals), and
--> re-encrypted by an annual rekeying process using new keys.
The data encryption and key management processes are entirely transparent to users. Snowflake uses
--> AES 256-bit encryption to encrypt data at rest.
https://docs.snowflake.com/en/user-guide/security-encryption-manage
Snowflake encrypts all data in transit using Transport Layer Security (TLS) 1.2. This applies to all Snowflake connections, including those made through the Snowflake Web interface, JDBC, ODBC, and the Python connector.
All Snowflake-managed keys are automatically rotated by Snowflake when they are more than 30 days old. Active keys are retired, and new keys are created. When Snowflake determines the retired key is no longer needed, the key is automatically destroyed. When active, a key is used to encrypt data and is available for usage by the customer. When retired, the key is used solely to decrypt data and is only available for accessing the data.
https://docs.snowflake.com/en/user-guide/security-encryption-end-to-end
MFA is enabled by default for all Snowflake accounts and is available in all Snowflake editions.
All Snowflake client tools, including the web interface, SnowSQL, and the various connectors and drivers, support MFA.
Snowpipe is a Snowflake-managed serverless service. A Snowflake user cannot log in to it; therefore, it does not require MFA. https://docs.snowflake.com/en/user-guide/security-mfa
Multi-factor authentication adds additional protection to the login process in Snowflake. Snowflake also provides key pair authentication as a more secure alternative to traditional username/password authentication. Additionally, Snowflake offers federated authentication, enabling single sign-on (SSO): users authenticate through an external, SAML 2.0-compliant identity provider (IdP).
Snowflake strongly recommends that all users with the ACCOUNTADMIN role be required to use MFA.
Okta and Microsoft ADFS provide native Snowflake support for federated authentication and SSO.
After a specified period of time (defined by the IdP), a user’s session in the IdP automatically times out, but this does not affect their Snowflake sessions. Any Snowflake sessions that are active at the time remain open and do not require re-authentication. However, to initiate any new Snowflake sessions, the user must log into the IdP again.
The Snowflake SQL API supports OAuth and key pair authentication.
Snowflake supports masking policies that are applied to columns and enforced at the column level to provide column-level security. Column-level security is achieved through dynamic data masking or external tokenization.
https://docs.snowflake.com/en/user-guide/security-column
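A minimal dynamic data masking sketch (the table, column, policy, and role names here are invented for illustration):

```sql
-- Reveal email addresses only to the ANALYST role; mask them for everyone else
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('ANALYST') THEN val
    ELSE '*** MASKED ***'
  END;

-- Enforce the policy at the column level
ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask;
```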
👉IP Addresses
If you provide both allowed and blocked IP addresses, Snowflake applies the blocked list first; an address that appears in both lists is therefore blocked, which could include your own. Additionally, to block all IP addresses except a select list, you only need to add addresses to ALLOWED_IP_LIST; Snowflake automatically blocks every IP address not included in the allowed list.
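A sketch of an allow-list-only network policy (the policy name, IP range, and user name are hypothetical; activating a policy requires the appropriate privileges, e.g. SECURITYADMIN):

```sql
-- Allow only the corporate range; Snowflake blocks every other IP automatically
CREATE NETWORK POLICY corp_only ALLOWED_IP_LIST = ('192.168.1.0/24');

-- Activate it at the account level
ALTER ACCOUNT SET NETWORK_POLICY = corp_only;

-- Or attach it to a single user instead
ALTER USER john SET NETWORK_POLICY = corp_only;
```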
- c1 ✅ 09/Oct
- c2 ✅ 10/Oct
- c3 ✅ 16/Oct
Notes:
- Kimball: two-layer data warehouse
- Inmon: three-layer data warehouse model
- Data sources: ERP, CRM, Files etc
- TDQM, DWQ
💪💪😄😄
- c4 ✅ 19/Oct
- c5 ✅ 22/Oct
Entity definitions (Hub, Link, Sat)
SAL (same-as link), various links, sats
- c6 ✅ 24/Oct
- c7 ✅ 26/Oct
Slowly Changing Dimensions (SCD)
Star schemas
Snowflake design (indirect dimensions)
- c8 ✅ 02/Nov
- c9 ✅ 05/Nov
MDM
- c10 ✅ 27/Nov ✌
Metrics and Error Marts
- c11 ✅ 12/Nov
Data Extraction (stage loading: historical/batch)
- c12 ✅ 16/Nov
DV loading
- c13 ✅ 28/Nov
Data Quality
- c14 ✅ 01/Dec 😀😀😀
- c15 ✅ 17/Oct
Business users: Accessing Information Mart to build a multidimensional database
Snowflake supports transforming data while loading it into a table using the COPY INTO <table> command, dramatically simplifying your ETL pipeline for basic transformations. This feature helps you avoid the use of temporary tables to store pre-transformed data when reordering columns during a data load. This feature applies to both bulk loading and Snowpipe.
The COPY command supports:
Column reordering, column omission, and casts using a SELECT statement. There is no requirement for your data files to have the same number and ordering of columns as your target table.
The ENFORCE_LENGTH | TRUNCATECOLUMNS option, which can truncate text strings that exceed the target column length.
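For example (the stage, table, and column names below are made up), columns can be reordered and cast during the load, with no temporary table in between:

```sql
-- Load only columns 2 and 1 from the staged CSV, in that order,
-- casting the first file column to a number
COPY INTO mytable (name, id)
FROM (SELECT t.$2, t.$1::NUMBER FROM @mystage t)
FILE_FORMAT = (TYPE = 'CSV');
```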
The following file format types are supported for COPY transformations:
CSV
JSON
Avro
ORC
Parquet
XML
To parse a staged data file, it is necessary to describe its file format:
The default format is character-delimited UTF-8 text. The default field delimiter is the comma character (,). The default record delimiter is the newline character. If the source data is in another format, specify the file format type and options.
When querying staged data files, the ERROR_ON_COLUMN_COUNT_MISMATCH option is ignored. There is no requirement for your data files to have the same number and ordering of columns as your target table.
To transform JSON data during a load operation, you must structure the data files in NDJSON (“Newline delimited JSON”) standard format; otherwise, you might encounter the following error:
Error parsing JSON: more than one document in the input
Specify the format type and options that match your data files.
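As an illustration (stage path and table name are hypothetical): when a file contains a single outer JSON array rather than NDJSON, one common fix is the STRIP_OUTER_ARRAY option, which loads each array element as its own row:

```sql
-- Remove the outer array so each element becomes a separate row,
-- avoiding the "more than one document in the input" parse error
COPY INTO raw_json
FROM @mystage/data.json
FILE_FORMAT = (TYPE = 'JSON' STRIP_OUTER_ARRAY = TRUE);
```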
To explicitly specify file format options, set them in one of the following ways:
- When querying staged data files using a SELECT statement.
- When loading columns from staged data files using a COPY INTO <table> statement.
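Hedged sketches of both approaches (the stage, file, named file format, and table names are invented; the named format in the first query is assumed to have been created beforehand with CREATE FILE FORMAT):

```sql
-- 1) Reference format options when querying staged files with SELECT
SELECT t.$1, t.$2
FROM @mystage/data.csv (FILE_FORMAT => 'my_csv_format') t;

-- 2) Specify format options inline in a COPY INTO <table> statement
COPY INTO mytable
FROM @mystage/data.csv
FILE_FORMAT = (TYPE = 'CSV' FIELD_DELIMITER = '|' SKIP_HEADER = 1);
```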
ENFORCE_LENGTH:
If TRUE, the COPY statement produces an error if a loaded string exceeds the target column length.
If FALSE, strings are automatically truncated to the target column length.
TRUNCATECOLUMNS:
If TRUE, strings are automatically truncated to the target column length.
If FALSE, the COPY statement produces an error if a loaded string exceeds the target column length.
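For instance (stage and table names are hypothetical), to truncate over-length strings instead of failing the load:

```sql
-- Silently truncate strings that exceed the target column length
COPY INTO mytable
FROM @mystage/data.csv
FILE_FORMAT = (TYPE = 'CSV')
TRUNCATECOLUMNS = TRUE;
```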
SnowPro Badges and Certificates Online Verification https://achieve.snowflake.com/profile/richardhou888/wallet