
Snowflake ARA-R01 SnowPro Advanced: Architect Recertification Exam Practice Test

Total 162 questions

SnowPro Advanced: Architect Recertification Exam Questions and Answers

Question 1

Which system functions does Snowflake provide to monitor clustering information within a table? (Choose two.)

Options:

A.

SYSTEM$CLUSTERING_INFORMATION

B.

SYSTEM$CLUSTERING_USAGE

C.

SYSTEM$CLUSTERING_DEPTH

D.

SYSTEM$CLUSTERING_KEYS

E.

SYSTEM$CLUSTERING_PERCENT
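
For reference, SYSTEM$CLUSTERING_INFORMATION and SYSTEM$CLUSTERING_DEPTH are the two clustering monitoring functions Snowflake documents; a minimal sketch, where the table name and column list are placeholders:

SELECT SYSTEM$CLUSTERING_INFORMATION('my_table', '(col1, col2)');
SELECT SYSTEM$CLUSTERING_DEPTH('my_table', '(col1, col2)');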

Question 2

Which SQL alter command will MAXIMIZE memory and compute resources for a Snowpark stored procedure when executed on the snowpark_opt_wh warehouse?

[The ALTER statements for answer choices A through D appear as images in the original exam and are not reproduced here.]

Options:

A.

Option A

B.

Option B

C.

Option C

D.

Option D

Question 3

Consider the following scenario where a masking policy is applied on the CREDITCARDNO column of the CREDITCARDINFO table. The masking policy definition is as follows:

[Image: masking policy definition, not reproduced from the original exam.]

Sample data for the CREDITCARDINFO table is as follows:

NAME       EXPIRYDATE   CREDITCARDNO
JOHN DOE   2022-07-23   4321 5678 9012 1234

If the Snowflake system roles have not been granted any additional roles, what will be the result?

Options:

A.

The sysadmin can see the CREDITCARDNO column data in clear text.

B.

The owner of the table will see the CREDITCARDNO column data in clear text.

C.

Anyone with the PI_ANALYTICS role will see the last 4 characters of the CREDITCARDNO column data in clear text.

D.

Anyone with the PI_ANALYTICS role will see the CREDITCARDNO column as '*** MASKED ***'.
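
Since the policy image is not reproduced, here is a hedged sketch of what a masking policy of this general shape typically looks like; the policy name, role name, and masking format are assumptions, not the exam's actual definition:

-- hypothetical policy: partial reveal for one role, full mask otherwise
create or replace masking policy mask_ccno as (val string) returns string ->
  case
    when current_role() in ('PI_ANALYTICS') then '**** **** **** ' || right(val, 4)
    else '*** MASKED ***'
  end;

alter table creditcardinfo modify column creditcardno set masking policy mask_ccno;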

Question 4

A table for IoT devices that measures water usage is created. The table quickly becomes large and contains more than 2 billion rows.

[Image: table definition, not reproduced from the original exam.]

The general query patterns for the table are:

1. DeviceId, IoT_timestamp, and CustomerId are frequently used in the filter predicates of SELECT statements.

2. The columns City and DeviceManufacturer are often retrieved.

3. There is often a count on UniqueId.

Which field(s) should be used for the clustering key?

Options:

A.

IoT_timestamp

B.

City and DeviceManufacturer

C.

DeviceId and CustomerId

D.

UniqueId
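
For context, a clustering key is attached (or changed) with ALTER TABLE ... CLUSTER BY; the table and column names below are placeholders illustrating the syntax, not the exam's answer:

alter table iot_water_usage cluster by (col_a, col_b);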

Question 5

A company is designing its serving layer for data that is in cloud storage. Multiple terabytes of the data will be used for reporting. Some data does not have a clear use case but could be useful for experimental analysis. This experimentation data changes frequently and is sometimes wiped out and replaced completely in a few days.

The company wants to centralize access control, provide a single point of connection for the end-users, and maintain data governance.

What solution meets these requirements while MINIMIZING costs, administrative effort, and development overhead?

Options:

A.

Import the data used for reporting into a Snowflake schema with native tables. Then create external tables pointing to the cloud storage folders used for the experimentation data. Then create two different roles with grants to the different datasets to match the different user personas, and grant these roles to the corresponding users.

B.

Import all the data in cloud storage to be used for reporting into a Snowflake schema with native tables. Then create a role that has access to this schema and manage access to the data through that role.

C.

Import all the data in cloud storage to be used for reporting into a Snowflake schema with native tables. Then create two different roles with grants to the different datasets to match the different user personas, and grant these roles to the corresponding users.

D.

Import the data used for reporting into a Snowflake schema with native tables. Then create views that have SELECT commands pointing to the cloud storage files for the experimentation data. Then create two different roles to match the different user personas, and grant these roles to the corresponding users.

Question 6

Data is being imported and stored as JSON in a VARIANT column. Query performance was fine, but most recently, poor query performance has been reported.

What could be causing this?

Options:

A.

There were JSON nulls in the recent data imports.

B.

The order of the keys in the JSON was changed.

C.

The recent data imports contained fewer fields than usual.

D.

There were variations in string lengths for the JSON values in the recent data imports.

Question 7

A group of Data Analysts have been granted the ANALYST_ROLE role. They need a Snowflake database where they can create and modify tables, views, and other objects to load with their own data. The Analysts should not have the ability to give other Snowflake users outside of their role access to this data.

How should these requirements be met?

Options:

A.

Grant ANALYST_ROLE OWNERSHIP on the database, but make sure that ANALYST_ROLE does not have the MANAGE GRANTS privilege on the account.

B.

Grant SYSADMIN ownership of the database, but grant the CREATE SCHEMA privilege on the database to the ANALYST_ROLE.

C.

Make every schema in the database a managed access schema, owned by SYSADMIN, and grant create privileges on each schema to the ANALYST_ROLE for each type of object that needs to be created.

D.

Grant ANALYST_ROLE ownership on the database, but grant the OWNERSHIP ON FUTURE [object type]s IN DATABASE privilege to SYSADMIN.
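
As background for the managed-access option, a minimal sketch; the database, schema, and role names are illustrative:

use role sysadmin;
create schema analytics_db.workarea with managed access;
-- in a managed access schema, only the schema owner (or a role with MANAGE GRANTS) can grant privileges on contained objects
grant create table, create view on schema analytics_db.workarea to role analyst_role;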

Question 8

Assuming all Snowflake accounts are using an Enterprise edition or higher, in which development and testing scenarios would copying of data be required, and zero-copy cloning not be suitable? (Select TWO).

Options:

A.

Developers create their own datasets to work against transformed versions of the live data.

B.

Production and development run in different databases in the same account, and Developers need to see production-like data but with specific columns masked.

C.

Data is in a production Snowflake account that needs to be provided to Developers in a separate development/testing Snowflake account in the same cloud region.

D.

Developers create their own copies of a standard test database previously created for them in the development account, for their initial development and unit testing.

E.

The release process requires pre-production testing of changes with data of production scale and complexity. For security reasons, pre-production also runs in the production account.

Question 9

An Architect needs to grant a group of ORDER_ADMIN users the ability to clean old data in an ORDERS table (deleting all records older than 5 years), without granting any privileges on the table. The group’s manager (ORDER_MANAGER) has full DELETE privileges on the table.

How can the ORDER_ADMIN role be enabled to perform this data cleanup, without needing the DELETE privilege held by the ORDER_MANAGER role?

Options:

A.

Create a stored procedure that runs with caller’s rights, including the appropriate "> 5 years" business logic, and grant USAGE on this procedure to ORDER_ADMIN. The ORDER_MANAGER role owns the procedure.

B.

Create a stored procedure that can be run using both caller’s and owner’s rights (allowing the user to specify which rights are used during execution), and grant USAGE on this procedure to ORDER_ADMIN. The ORDER_MANAGER role owns the procedure.

C.

Create a stored procedure that runs with owner’s rights, including the appropriate "> 5 years" business logic, and grant USAGE on this procedure to ORDER_ADMIN. The ORDER_MANAGER role owns the procedure.

D.

This scenario would actually not be possible in Snowflake – any user performing a DELETE on a table requires the DELETE privilege to be granted to the role they are using.
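
To illustrate the owner's-rights pattern referenced above, a minimal sketch; the procedure name, body, and role names are assumptions:

-- owned by ORDER_MANAGER, which holds DELETE on the ORDERS table
create or replace procedure purge_old_orders()
  returns string
  language sql
  execute as owner
as
$$
begin
  delete from orders where order_date < dateadd(year, -5, current_date());
  return 'purge complete';
end;
$$;

grant usage on procedure purge_old_orders() to role order_admin;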

Question 10

A Snowflake Architect is designing an application and tenancy strategy for an organization where strong legal isolation rules as well as multi-tenancy are requirements.

Which approach will meet these requirements if Role-Based Access Control (RBAC) is a viable option for isolating tenants?

Options:

A.

Create accounts for each tenant in the Snowflake organization.

B.

Create an object for each tenant strategy if row level security is viable for isolating tenants.

C.

Create an object for each tenant strategy if row level security is not viable for isolating tenants.

D.

Create a multi-tenant table strategy if row level security is not viable for isolating tenants.

Question 11

The IT Security team has identified that there is an ongoing credential stuffing attack on many of their organization’s systems.

What is the BEST way to find recent and ongoing login attempts to Snowflake?

Options:

A.

Call the LOGIN_HISTORY Information Schema table function.

B.

Query the LOGIN_HISTORY view in the ACCOUNT_USAGE schema in the SNOWFLAKE database.

C.

View the History tab in the Snowflake UI and set up a filter for SQL text that contains the text "LOGIN".

D.

View the Users section in the Account tab in the Snowflake UI and review the last login column.
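
For reference, the ACCOUNT_USAGE view can be queried like this; the 24-hour window is an arbitrary example:

select event_timestamp, user_name, client_ip, is_success, error_message
from snowflake.account_usage.login_history
where event_timestamp > dateadd(hour, -24, current_timestamp())
order by event_timestamp desc;
-- note: ACCOUNT_USAGE views are subject to some ingestion latency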

Question 12

An Architect needs to meet a company requirement to ingest files from the company's AWS storage accounts into the company's Snowflake Google Cloud Platform (GCP) account. How can the ingestion of these files into the company's Snowflake account be initiated? (Select TWO).

Options:

A.

Configure the client application to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 storage.

B.

Configure the client application to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 Glacier storage.

C.

Create an AWS Lambda function to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 storage.

D.

Configure AWS Simple Notification Service (SNS) to notify Snowpipe when new files have arrived in Amazon S3 storage.

E.

Configure the client application to issue a COPY INTO command to Snowflake when new files have arrived in Amazon S3 Glacier storage.

Question 13

    A company has a Snowflake account named ACCOUNTA in AWS us-east-1 region. The company stores its marketing data in a Snowflake database named MARKET_DB. One of the company’s business partners has an account named PARTNERB in Azure East US 2 region. For marketing purposes the company has agreed to share the database MARKET_DB with the partner account.

    Which of the following steps MUST be performed for the account PARTNERB to consume data from the MARKET_DB database?

    Options:

    A.

    Create a new account (called AZABC123) in Azure East US 2 region. From account ACCOUNTA create a share of database MARKET_DB, create a new database out of this share locally in AWS us-east-1 region, and replicate this new database to AZABC123 account. Then set up data sharing to the PARTNERB account.

    B.

    From account ACCOUNTA create a share of database MARKET_DB, and create a new database out of this share locally in AWS us-east-1 region. Then make this database the provider and share it with the PARTNERB account.

    C.

    Create a new account (called AZABC123) in Azure East US 2 region. From account ACCOUNTA replicate the database MARKET_DB to AZABC123 and from this account set up the data sharing to the PARTNERB account.

    D.

    Create a share of database MARKET_DB, and create a new database out of this share locally in AWS us-east-1 region. Then replicate this database to the partner’s account PARTNERB.

    Question 14

A user, analyst_user, has been granted the ANALYST_ROLE, and is deploying a SnowSQL script to run as a background service to extract data from Snowflake.

What steps should be taken to allow access only from the required IP addresses? (Select TWO).

    Options:

    A.

ALTER ROLE ANALYST_ROLE SET NETWORK_POLICY='ANALYST_POLICY';

    B.

ALTER USER ANALYST_USER SET NETWORK_POLICY='ANALYST_POLICY';

    C.

ALTER USER ANALYST_USER SET NETWORK_POLICY='10.1.1.20';

    D.

    USE ROLE SECURITYADMIN;

    CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY ALLOWED_IP_LIST = ('10.1.1.20');

    E.

    USE ROLE USERADMIN;

    CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY

    ALLOWED_IP_LIST = ('10.1.1.20');

    Question 15

    A Snowflake Architect is designing a multiple-account design strategy.

    This strategy will be MOST cost-effective with which scenarios? (Select TWO).

    Options:

    A.

    The company wants to clone a production database that resides on AWS to a development database that resides on Azure.

    B.

    The company needs to share data between two databases, where one must support Payment Card Industry Data Security Standard (PCI DSS) compliance but the other one does not.

    C.

    The company needs to support different role-based access control features for the development, test, and production environments.

    D.

    The company security policy mandates the use of different Active Directory instances for the development, test, and production environments.

    E.

    The company must use a specific network policy for certain users to allow and block given IP addresses.

    Question 16

When loading data into a table that captures the load time in a column with a default value of either CURRENT_TIME() or CURRENT_TIMESTAMP(), what will occur?

    Options:

    A.

    All rows loaded using a specific COPY statement will have varying timestamps based on when the rows were inserted.

    B.

    Any rows loaded using a specific COPY statement will have varying timestamps based on when the rows were read from the source.

    C.

    Any rows loaded using a specific COPY statement will have varying timestamps based on when the rows were created in the source.

    D.

    All rows loaded using a specific COPY statement will have the same timestamp value.
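
For context, a load-time capture column of the kind described is typically declared as below; the table and column names are placeholders. Per Snowflake's documented behavior, the default expression is evaluated for the load statement rather than per row:

create or replace table raw_events (
  event_id number,
  payload variant,
  load_ts timestamp_ltz default current_timestamp()  -- captures load time
);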

    Question 17

    A table contains five columns and it has millions of records. The cardinality distribution of the columns is shown below:

[Image: cardinality distribution of columns C1–C5, not reproduced from the original exam.]

Columns C4 and C5 are mostly used by SELECT queries in the GROUP BY and ORDER BY clauses, whereas columns C1, C2, and C3 are heavily used in filter and join conditions of SELECT queries.

    The Architect must design a clustering key for this table to improve the query performance.

    Based on Snowflake recommendations, how should the clustering key columns be ordered while defining the multi-column clustering key?

    Options:

    A.

    C5, C4, C2

    B.

    C3, C4, C5

    C.

    C1, C3, C2

    D.

    C2, C1, C3

    Question 18

    A company's Architect needs to find an efficient way to get data from an external partner, who is also a Snowflake user. The current solution is based on daily JSON extracts that are placed on an FTP server and uploaded to Snowflake manually. The files are changed several times each month, and the ingestion process needs to be adapted to accommodate these changes.

    What would be the MOST efficient solution?

    Options:

    A.

    Ask the partner to create a share and add the company's account.

    B.

    Ask the partner to use the data lake export feature and place the data into cloud storage where Snowflake can natively ingest it (schema-on-read).

    C.

    Keep the current structure but request that the partner stop changing files, instead only appending new files.

    D.

    Ask the partner to set up a Snowflake reader account and use that account to get the data for ingestion.

    Question 19

    An Architect needs to allow a user to create a database from an inbound share.

    To meet this requirement, the user’s role must have which privileges? (Choose two.)

    Options:

    A.

    IMPORT SHARE;

    B.

    IMPORT PRIVILEGES;

    C.

    CREATE DATABASE;

    D.

    CREATE SHARE;

    E.

    IMPORT DATABASE;
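
For context, consuming an inbound share looks like this; the share, database, and role names are placeholders, and the executing role needs the privileges the question asks about:

create database market_data from share provider_acct.market_share;
grant imported privileges on database market_data to role analyst_role;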

    Question 20

    A company’s client application supports multiple authentication methods, and is using Okta.

    What is the best practice recommendation for the order of priority when applications authenticate to Snowflake?

    Options:

    A.

    1) OAuth (either Snowflake OAuth or External OAuth)

    2) External browser

    3) Okta native authentication

    4) Key Pair Authentication, mostly used for service account users

    5) Password

    B.

    1) External browser, SSO

    2) Key Pair Authentication, mostly used for development environment users

    3) Okta native authentication

4) OAuth (either Snowflake OAuth or External OAuth)

    5) Password

    C.

    1) Okta native authentication

    2) Key Pair Authentication, mostly used for production environment users

    3) Password

    4) OAuth (either Snowflake OAuth or External OAuth)

    5) External browser, SSO

    D.

    1) Password

    2) Key Pair Authentication, mostly used for production environment users

    3) Okta native authentication

    4) OAuth (either Snowflake OAuth or External OAuth)

    5) External browser, SSO

    Question 21

    A retail company has 2000+ stores spread across the country. Store Managers report that they are having trouble running key reports related to inventory management, sales targets, payroll, and staffing during business hours. The Managers report that performance is poor and time-outs occur frequently.

    Currently all reports share the same Snowflake virtual warehouse.

    How should this situation be addressed? (Select TWO).

    Options:

    A.

    Use a Business Intelligence tool for in-memory computation to improve performance.

    B.

    Configure a dedicated virtual warehouse for the Store Manager team.

    C.

    Configure the virtual warehouse to be multi-clustered.

    D.

Configure the virtual warehouse to be size 4X-Large.

    E.

    Advise the Store Manager team to defer report execution to off-business hours.

    Question 22

An Architect has designed a data pipeline that is receiving small CSV files from multiple sources. All of the files are landing in one location. Specific files are filtered for loading into Snowflake tables using the COPY command. The loading performance is poor.

What changes can be made to improve the data loading performance?

    Options:

    A.

    Increase the size of the virtual warehouse.

    B.

    Create a multi-cluster warehouse and merge smaller files to create bigger files.

    C.

    Create a specific storage landing bucket to avoid file scanning.

    D.

    Change the file format from CSV to JSON.

    Question 23

    A Snowflake Architect created a new data share and would like to verify that only specific records in secure views are visible within the data share by the consumers.

    What is the recommended way to validate data accessibility by the consumers?

    Options:

    A.

    Create reader accounts as shown below and impersonate the consumers by logging in with their credentials.

create managed account reader_acct1 admin_name = user1, admin_password = 'Sdfed43da!44', type = reader;

    B.

    Create a row access policy as shown below and assign it to the data share.

create or replace row access policy rap_acct as (acct_id varchar) returns boolean -> case when 'acct1_role' = current_role() then true else false end;

    C.

Set the session parameter called SIMULATED_DATA_SHARING_CONSUMER as shown below in order to impersonate the consumer accounts.

alter session set simulated_data_sharing_consumer = 'Consumer Acct1';

    D.

    Alter the share settings as shown below, in order to impersonate a specific consumer account.

alter share sales_share set accounts = 'Consumer1' share_restrictions = true;

    Question 24

When loading data from a stage using COPY INTO, which options can be specified for the ON_ERROR clause?

    Options:

    A.

    CONTINUE

    B.

    SKIP_FILE

    C.

    ABORT_STATEMENT

    D.

    FAIL
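
For reference, ON_ERROR is specified on the COPY statement itself; the table, stage, and file format details below are placeholders:

copy into orders
from @orders_stage/daily/
file_format = (type = 'CSV' skip_header = 1)
on_error = 'SKIP_FILE';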

    Question 25

    An Architect is designing a file ingestion recovery solution. The project will use an internal named stage for file storage. Currently, in the case of an ingestion failure, the Operations team must manually download the failed file and check for errors.

    Which downloading method should the Architect recommend that requires the LEAST amount of operational overhead?

    Options:

    A.

    Use the Snowflake Connector for Python, connect to remote storage and download the file.

    B.

    Use the get command in SnowSQL to retrieve the file.

    C.

    Use the get command in Snowsight to retrieve the file.

    D.

    Use the Snowflake API endpoint and download the file.
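
For context, retrieving a staged file with SnowSQL's GET command looks like this; the stage path and local directory are placeholders:

get @failed_files_stage/2024/orders_bad.csv file:///tmp/recovered/;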

    Question 26

A Snowflake Architect is working with Data Modelers and Table Designers to draft an ELT framework specifically for data loading using Snowpipe. The Table Designers will add a timestamp column that inserts the current timestamp as the default value as records are loaded into a table. The intent is to capture the time when each record gets loaded into the table; however, when tested, the timestamps are earlier than the LAST_LOAD_TIME column values returned by the COPY_HISTORY function or the COPY_HISTORY view (Account Usage).

Why is this occurring?

    Options:

    A.

    The timestamps are different because there are parameter setup mismatches. The parameters need to be realigned

    B.

The Snowflake timezone parameter is different from the cloud provider's parameters, causing the mismatch.

    C.

The Table Designer team has not used the LOCALTIMESTAMP or SYSTIMESTAMP functions in the Snowflake COPY statement.

    D.

CURRENT_TIME is evaluated when the load operation is compiled in cloud services, rather than when the record is inserted into the table.

    Question 27

    How can an Architect enable optimal clustering to enhance performance for different access paths on a given table?

    Options:

    A.

    Create multiple clustering keys for a table.

    B.

    Create multiple materialized views with different cluster keys.

    C.

    Create super projections that will automatically create clustering.

    D.

    Create a clustering key that contains all columns used in the access paths.
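
As background for the materialized-view option, a sketch; all names are illustrative, and materialized views with clustering keys require Enterprise edition or higher:

-- base table clustered for one access path
alter table orders cluster by (order_date);

-- materialized view clustered for a different access path
create materialized view orders_by_customer
  cluster by (customer_id)
as
  select order_id, customer_id, order_date, amount
  from orders;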

    Question 28

    A company is using a Snowflake account in Azure. The account has SAML SSO set up using ADFS as a SCIM identity provider. To validate Private Link connectivity, an Architect performed the following steps:

    * Confirmed Private Link URLs are working by logging in with a username/password account

    * Verified DNS resolution by running nslookups against Private Link URLs

    * Validated connectivity using SnowCD

    * Disabled public access using a network policy set to use the company’s IP address range

    However, the following error message is received when using SSO to log into the company account:

    IP XX.XXX.XX.XX is not allowed to access snowflake. Contact your local security administrator.

    What steps should the Architect take to resolve this error and ensure that the account is accessed using only Private Link? (Choose two.)

    Options:

    A.

    Alter the Azure security integration to use the Private Link URLs.

    B.

    Add the IP address in the error message to the allowed list in the network policy.

    C.

    Generate a new SCIM access token using system$generate_scim_access_token and save it to Azure AD.

    D.

    Update the configuration of the Azure AD SSO to use the Private Link URLs.

    E.

    Open a case with Snowflake Support to authorize the Private Link URLs’ access to the account.

    Question 29

    An Architect is integrating an application that needs to read and write data to Snowflake without installing any additional software on the application server.

    How can this requirement be met?

    Options:

    A.

    Use SnowSQL.

    B.

    Use the Snowpipe REST API.

    C.

    Use the Snowflake SQL REST API.

    D.

    Use the Snowflake ODBC driver.

    Question 30

    Which columns can be included in an external table schema? (Select THREE).

    Options:

    A.

    VALUE

    B.

METADATA$ROW_ID

    C.

METADATA$ISUPDATE

    D.

METADATA$FILENAME

    E.

METADATA$FILE_ROW_NUMBER

    F.

METADATA$EXTERNAL_TABLE_PARTITION
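
For reference, the VALUE column and the metadata pseudocolumns can be queried directly on an external table; the table name and the customer_id path are placeholders:

select metadata$filename,
       metadata$file_row_number,
       value:customer_id::number as customer_id
from ext_orders
limit 10;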

    Question 31

    Which feature provides the capability to define an alternate cluster key for a table with an existing cluster key?

    Options:

    A.

    External table

    B.

    Materialized view

    C.

    Search optimization

    D.

    Result cache

    Question 32

    An Architect needs to design a solution for building environments for development, test, and pre-production, all located in a single Snowflake account. The environments should be based on production data.

    Which solution would be MOST cost-effective and performant?

    Options:

    A.

    Use zero-copy cloning into transient tables.

    B.

    Use zero-copy cloning into permanent tables.

    C.

    Use CREATE TABLE ... AS SELECT (CTAS) statements.

    D.

    Use a Snowflake task to trigger a stored procedure to copy data.
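
For context, a zero-copy clone into a transient database looks like this; the database names are placeholders:

-- transient objects skip Fail-safe, reducing storage cost
create transient database dev_db clone prod_db;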

    Question 33

An Architect is designing a solution that will be used to process changed records in an ORDERS table. Newly-inserted orders must be loaded into the F_ORDERS fact table, which will aggregate all the orders by multiple dimensions (time, region, channel, etc.). Existing orders can be updated by the sales department within 30 days after the order creation. In case of an order update, the solution must perform two actions:

1. Update the order in the F_ORDERS fact table.

2. Load the changed order data into the special table ORDER_REPAIRS.

This table is used by the Accounting department once a month. If an order has been changed, the Accounting team needs to know the latest details and perform the necessary actions based on the data in the ORDER_REPAIRS table.

    What data processing logic design will be the MOST performant?

    Options:

    A.

Use one stream and one task.

    B.

Use one stream and two tasks.

    C.

Use two streams and one task.

    D.

Use two streams and two tasks.
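
As background, a change-capture pipeline of this shape wires a stream to a task; all object names and the MERGE logic below are illustrative, not the exam's prescribed answer:

create or replace stream orders_stream on table orders;

create or replace task load_f_orders
  warehouse = etl_wh
  schedule = '5 minute'
when
  system$stream_has_data('ORDERS_STREAM')
as
  merge into f_orders f
  using orders_stream s
    on f.order_id = s.order_id
  when matched then update set f.amount = s.amount
  when not matched then insert (order_id, amount)
    values (s.order_id, s.amount);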

    Question 34

An Architect is troubleshooting a query with poor performance using the QUERY_HISTORY function. The Architect observes that the COMPILATION_TIME is greater than the EXECUTION_TIME.

    What is the reason for this?

    Options:

    A.

    The query is processing a very large dataset.

    B.

    The query has overly complex logic.

    C.

    The query is queued for execution.

    D.

    The query is reading from remote storage.

    Question 35

    A company needs to have the following features available in its Snowflake account:

    1. Support for Multi-Factor Authentication (MFA)

    2. A minimum of 2 months of Time Travel availability

    3. Database replication in between different regions

    4. Native support for JDBC and ODBC

    5. Customer-managed encryption keys using Tri-Secret Secure

    6. Support for Payment Card Industry Data Security Standards (PCI DSS)

    In order to provide all the listed services, what is the MINIMUM Snowflake edition that should be selected during account creation?

    Options:

    A.

    Standard

    B.

    Enterprise

    C.

    Business Critical

    D.

    Virtual Private Snowflake (VPS)

    Question 36

    A healthcare company is deploying a Snowflake account that may include Personal Health Information (PHI). The company must ensure compliance with all relevant privacy standards.

    Which best practice recommendations will meet data protection and compliance requirements? (Choose three.)

    Options:

    A.

    Use, at minimum, the Business Critical edition of Snowflake.

    B.

    Create Dynamic Data Masking policies and apply them to columns that contain PHI.

    C.

    Use the Internal Tokenization feature to obfuscate sensitive data.

    D.

    Use the External Tokenization feature to obfuscate sensitive data.

    E.

    Rewrite SQL queries to eliminate projections of PHI data based on current_role().

    F.

    Avoid sharing data with partner organizations.

    Question 37

When loading data into a table that captures the load time in a column with a default value of either CURRENT_TIME() or CURRENT_TIMESTAMP(), what will occur?

    Options:

    A.

    All rows loaded using a specific COPY statement will have varying timestamps based on when the rows were inserted.

    B.

    Any rows loaded using a specific COPY statement will have varying timestamps based on when the rows were read from the source.

    C.

    Any rows loaded using a specific COPY statement will have varying timestamps based on when the rows were created in the source.

    D.

    All rows loaded using a specific COPY statement will have the same timestamp value.

    Question 38

    A company is designing high availability and disaster recovery plans and needs to maximize redundancy and minimize recovery time objectives for their critical application processes. Cost is not a concern as long as the solution is the best available. The plan so far consists of the following steps:

    1. Deployment of Snowflake accounts on two different cloud providers.

    2. Selection of cloud provider regions that are geographically far apart.

    3. The Snowflake deployment will replicate the databases and account data between both cloud provider accounts.

    4. Implementation of Snowflake client redirect.

    What is the MOST cost-effective way to provide the HIGHEST uptime and LEAST application disruption if there is a service event?

    Options:

    A.

    Connect the applications using the - URL. Use the Business Critical Snowflake edition.

    B.

    Connect the applications using the - URL. Use the Virtual Private Snowflake (VPS) edition.

    C.

    Connect the applications using the - URL. Use the Enterprise Snowflake edition.

    D.

    Connect the applications using the - URL. Use the Business Critical Snowflake edition.

    Question 39

    In a managed access schema, what are characteristics of the roles that can manage object privileges? (Select TWO).

    Options:

    A.

    Users with the SYSADMIN role can grant object privileges in a managed access schema.

    B.

    Users with the SECURITYADMIN role or higher, can grant object privileges in a managed access schema.

    C.

    Users who are database owners can grant object privileges in a managed access schema.

    D.

    Users who are schema owners can grant object privileges in a managed access schema.

    E.

    Users who are object owners can grant object privileges in a managed access schema.

    Question 40

    How is the change of local time due to daylight savings time handled in Snowflake tasks? (Choose two.)

    Options:

    A.

    A task scheduled in a UTC-based schedule will have no issues with the time changes.

    B.

    Task schedules can be designed to follow specified or local time zones to accommodate the time changes.

    C.

    A task will move to a suspended state during the daylight savings time change.

    D.

    A frequent task execution schedule like minutes may not cause a problem, but will affect the task history.

    E.

    A task schedule will follow only the specified time and will fail to handle lost or duplicated hours.
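
For reference, a task schedule can pin a specific time zone using CRON syntax; the task name, warehouse, time, zone, and called procedure are illustrative:

create or replace task nightly_rollup
  warehouse = etl_wh
  schedule = 'USING CRON 0 2 * * * America/New_York'
as
  call refresh_rollups();  -- hypothetical procedure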

    Question 41

    There are two databases in an account, named fin_db and hr_db which contain payroll and employee data, respectively. Accountants and Analysts in the company require different permissions on the objects in these databases to perform their jobs. Accountants need read-write access to fin_db but only require read-only access to hr_db because the database is maintained by human resources personnel.

    An Architect needs to create a read-only role for certain employees working in the human resources department.

    Which permission sets must be granted to this role?

    Options:

    A.

    USAGE on database hr_db, USAGE on all schemas in database hr_db, SELECT on all tables in database hr_db

    B.

    USAGE on database hr_db, SELECT on all schemas in database hr_db, SELECT on all tables in database hr_db

    C.

    MODIFY on database hr_db, USAGE on all schemas in database hr_db, USAGE on all tables in database hr_db

    D.

    USAGE on database hr_db, USAGE on all schemas in database hr_db, REFERENCES on all tables in database hr_db
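
For context, a read-only grant set of the kind described above maps to statements like these; the role name is a placeholder:

create role hr_read_only;
grant usage on database hr_db to role hr_read_only;
grant usage on all schemas in database hr_db to role hr_read_only;
grant select on all tables in database hr_db to role hr_read_only;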

    Question 42

    How can the Snowflake context functions be used to help determine whether a user is authorized to see data that has column-level security enforced? (Select TWO).

    Options:

    A.

    Set masking policy conditions using current_role targeting the role in use for the current session.

    B.

    Set masking policy conditions using is_role_in_session targeting the role in use for the current account.

    C.

    Set masking policy conditions using invoker_role targeting the executing role in a SQL statement.

    D.

    Determine if there are ownership privileges on the masking policy that would allow the use of any function.

    E.

    Assign the accountadmin role to the user who is executing the object.

    Question 43

    A media company needs a data pipeline that will ingest customer review data into a Snowflake table, and apply some transformations. The company also needs to use Amazon Comprehend to do sentiment analysis and make the de-identified final data set available publicly for advertising companies who use different cloud providers in different regions.

The data pipeline needs to run continuously and efficiently as new records arrive in the object storage, leveraging event notifications. Also, the operational complexity, maintenance of the infrastructure (including platform upgrades and security), and the development effort should be minimal.

    Which design will meet these requirements?

    Options:

    A.

    Ingest the data using COPY INTO and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

    B.

    Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Create an external function to do model inference with Amazon Comprehend and write the final records to a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

    C.

    Ingest the data into Snowflake using Amazon EMR and PySpark using the Snowflake Spark connector. Apply transformations using another Spark job. Develop a python program to do model inference by leveraging the Amazon Comprehend text analysis API. Then write the results to a Snowflake table and create a listing in the Snowflake Marketplace to make the data available to other companies.

    D.

    Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

    Question 44

    A company needs to share its product catalog data with one of its partners. The product catalog data is stored in two database tables: product_category, and product_details. Both tables can be joined by the product_id column. Data access should be governed, and only the partner should have access to the records.

    The partner is not a Snowflake customer. The partner uses Amazon S3 for cloud storage.

    Which design will be the MOST cost-effective and secure, while using the required Snowflake features?

    Options:

    A.

    Use Secure Data Sharing with an S3 bucket as a destination.

    B.

    Publish product_category and product_details data sets on the Snowflake Marketplace.

    C.

    Create a database user for the partner and give them access to the required data sets.

    D.

    Create a reader account for the partner and share the data sets as secure views.

    Question 45

    Which technique will efficiently ingest and consume semi-structured data for Snowflake data lake workloads?

    Options:

    A.

    IDEF1X

    B.

    Schema-on-write

    C.

    Schema-on-read

    D.

    Information schema

    Question 46

A data platform team creates two multi-cluster virtual warehouses with the AUTO_SUSPEND value set to NULL on one, and '0' on the other. What would be the execution behavior of these virtual warehouses?

    Options:

    A.

    Setting a '0' or NULL value means the warehouses will never suspend.

    B.

    Setting a '0' or NULL value means the warehouses will suspend immediately.

    C.

    Setting a '0' or NULL value means the warehouses will suspend after the default of 600 seconds.

    D.

    Setting a '0' value means the warehouses will suspend immediately, and NULL means the warehouses will never suspend.
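
For reference, AUTO_SUSPEND is set per warehouse; the warehouse names are placeholders:

-- per Snowflake docs, both 0 and NULL disable auto-suspension
alter warehouse wh_one set auto_suspend = null;
alter warehouse wh_two set auto_suspend = 0;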

    Question 47

    Company A has recently acquired company B. The Snowflake deployment for company B is located in the Azure West Europe region.

    As part of the integration process, an Architect has been asked to consolidate company B's sales data into company A's Snowflake account which is located in the AWS us-east-1 region.

    How can this requirement be met?

    Options:

    A.

    Replicate the sales data from company B's Snowflake account into company A's Snowflake account using cross-region data replication within Snowflake. Configure a direct share from company B's account to company A's account.

    B.

    Export the sales data from company B's Snowflake account as CSV files, and transfer the files to company A's Snowflake account. Import the data using Snowflake's data loading capabilities.

    C.

    Migrate company B's Snowflake deployment to the same region as company A's Snowflake deployment, ensuring data locality. Then perform a direct database-to-database merge of the sales data.

    D.

    Build a custom data pipeline using Azure Data Factory or a similar tool to extract the sales data from company B's Snowflake account. Transform the data, then load it into company A's Snowflake account.

    Question 48

    How do Snowflake databases that are created from shares differ from standard databases that are not created from shares? (Choose three.)

    Options:

    A.

    Shared databases are read-only.

    B.

    Shared databases must be refreshed in order for new data to be visible.

    C.

    Shared databases cannot be cloned.

    D.

    Shared databases are not supported by Time Travel.

    E.

    Shared databases will have the PUBLIC or INFORMATION_SCHEMA schemas without explicitly granting these schemas to the share.

    F.

    Shared databases can also be created as transient databases.
