AWS BDS-C00 Certified Big Data Specialty Practice Test Set 6

Which of the following can be used in Amazon Redshift to prioritize selected short-running queries ahead of longer-running queries?


Options are :

  • Short query acceleration (Correct)
  • Long query acceleration
  • AWS query acceleration
  • Transfer Acceleration

Answer : Short query acceleration

Certification : Get AWS Certified Solutions Architect in 1 Day (2018 Update) Set 17

You have a requirement to send records from multiple logs stored on multiple application servers hosted on Linux EC2 instances. There is a requirement to transform the records and store them in S3, but also to ensure that the original records are maintained. Which of the following steps can be part of an implementation process that satisfies these requirements? (Choose 2)


Options are :

  • Configure AWS CloudWatch agents to copy transformed data to S3
  • Use Kinesis Firehose to stream transformed data to S3 (Correct)
  • Configure AWS CloudWatch agents to copy the original data onto a backup S3 bucket
  • Use Kinesis Firehose to stream original data to a backup S3 bucket (Correct)

Answer : Use Kinesis Firehose to stream transformed data to S3; Use Kinesis Firehose to stream original data to a backup S3 bucket

You work for a startup that tracks commercial deliveries using IoT-enabled devices with GPS. Coordinates are transmitted from each device once every 6 seconds. You need to process these coordinates in real time from multiple sources. Which tool should you use to ingest the data?


Options are :

  • AWS Data Pipeline
  • Amazon SQS
  • Amazon EMR
  • Amazon Kinesis (Correct)

Answer : Amazon Kinesis

Your company has a set of web servers hosted on EC2 instances. There is a requirement to push the logs from these web servers onto a suitable storage device for subsequent analysis. Which of the following steps can be part of an implementation process that satisfies these requirements? (Choose 2)


Options are :

  • Install and configure the Kinesis agents on the web servers (Correct)
  • Ensure that Kinesis Firehose is set up to take the data and send it across to EMR for further processing
  • Install and configure the CloudWatch agent on the web servers
  • Ensure that Kinesis Firehose is set up to take the data and send it across to Redshift for further processing (Correct)

Answer : Install and configure the Kinesis agents on the web servers; Ensure that Kinesis Firehose is set up to take the data and send it across to Redshift for further processing
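The first step above can be sketched as the Kinesis agent's JSON configuration; a minimal example with hypothetical log path and delivery stream name (the real config file typically lives at /etc/aws-kinesis/agent.json). The agent tails matching log files and ships each line as a record to the Firehose delivery stream.

```python
import json

# Sketch of a Kinesis agent configuration: the file pattern and delivery
# stream name below are hypothetical placeholders for this scenario.
agent_config = {
    "flows": [
        {
            "filePattern": "/var/log/webserver/access.log*",
            "deliveryStream": "weblog-to-redshift",
        }
    ]
}
print(json.dumps(agent_config, indent=2))
```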

AWS SAP-C00 Certified Solution Architect Professional Exam Set 7

Which of the following services can be used for auditing S3 buckets? Choose 2 answers from the options given below.


Options are :

  • CloudWatch
  • CloudTrail (Correct)
  • AWS Config (Correct)
  • EMR

Answer : CloudTrail; AWS Config

When planning the instance types for EMR nodes, which of the following generally does not need a high configuration in terms of both compute and memory?


Options are :

  • Both Core and Master Nodes
  • Task Nodes
  • Core Nodes
  • Master Node (Correct)

Answer : Master Node

Which of the following are open-source tools that can be used on top of Hadoop for SQL-like queries? Choose 2 answers from the options given below.


Options are :

  • HBase
  • Ganglia
  • Hive (Correct)
  • Impala (Correct)

Answer : Hive Impala

Certification : Get AWS Certified Solutions Architect in 1 Day (2018 Update) Set 6

You have a collection of sensors that write data to a Kinesis stream, using the default stream settings. Every third day you send the data from the stream to S3. When you analyze the data in S3, you can see that only the 3rd day's data is present in the stream. What is the underlying cause for this?


Options are :

  • Data records are only accessible for a default of 24 hours from the time they are added to a stream (Correct)
  • The right access permissions are not present in the Kinesis streams
  • The sensors are probably failing to send the data on time.
  • The versioning enabled on the buckets is causing only the newest version to be available

Answer : Data records are only accessible for a default of 24 hours from the time they are added to a stream
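A minimal sketch of why only the most recent day's records survive, assuming the default 24-hour retention period and hypothetical record timestamps:

```python
from datetime import datetime, timedelta, timezone

# Kinesis streams keep records for 24 hours by default; anything older
# has expired by the time the consumer reads the stream.
RETENTION = timedelta(hours=24)

def readable_records(records, now):
    """Return the (timestamp, payload) records still within retention."""
    return [(ts, data) for ts, data in records if now - ts <= RETENTION]

now = datetime(2018, 6, 4, 12, 0, tzinfo=timezone.utc)
records = [
    (now - timedelta(days=2), "day-1 coords"),           # expired
    (now - timedelta(days=1, hours=1), "day-2 coords"),  # expired
    (now - timedelta(hours=3), "day-3 coords"),          # still readable
]
print(readable_records(records, now))  # only the day-3 record survives
```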

Which of the following is a tool that can be used for transferring data between Amazon S3, Hadoop, HDFS, and RDBMS databases?


Options are :

  • HUE
  • Sqoop (Correct)
  • Ganglia
  • Hive

Answer : Sqoop

What is the default concurrency level of the number of queries that can run per queue in Redshift?


Options are :

  • 20
  • 10
  • 5 (Correct)
  • 2

Answer : 5

Certification : Get AWS Certified Solutions Architect in 1 Day (2018 Update) Set 11

Which of the following is not a data node type available for transferring data using AWS Data Pipeline?


Options are :

  • S3DataNode
  • DynamoDBDataNode
  • GlacierDataNode (Correct)
  • RedshiftDataNode

Answer : GlacierDataNode

AWS SAP-C00 Certified Solution Architect Professional Exam Set 9

Which of the following commands in Redshift is efficient for loading large amounts of data?


Options are :

  • INSERT
  • LOAD
  • COPY (Correct)
  • UPDATE

Answer : COPY
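To illustrate, a sketch that builds a Redshift COPY statement (the table, bucket, and IAM role names are hypothetical); COPY loads files from S3 in parallel across the cluster's slices, which is why it outperforms row-by-row INSERTs for bulk loads.

```python
# Build a COPY statement for loading S3 files into a Redshift table.
# All identifiers below are hypothetical placeholders.
def build_copy_statement(table, s3_prefix, iam_role, fmt="CSV"):
    return (
        f"COPY {table} "
        f"FROM '{s3_prefix}' "
        f"IAM_ROLE '{iam_role}' "
        f"FORMAT AS {fmt};"
    )

sql = build_copy_statement(
    "orders",
    "s3://example-bucket/orders/",
    "arn:aws:iam::123456789012:role/RedshiftCopyRole",
)
print(sql)
```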

Your company currently has an order processing system in AWS. There are EC2 instances in place to pick up the orders from the application and EC2 instances in an Auto Scaling group to process the orders. Which of the following additional components can ideally be used to ensure that the EC2 processing instances are correctly scaled based on demand?


Options are :

  • Use CloudWatch metrics to understand the load capacity on the processing servers and then scale the capacity accordingly.
  • Use CloudWatch metrics to understand the load capacity on the processing servers. Ensure SNS is used to scale up the servers based on notifications.
  • Use SQS queues to decouple the architecture. Scale the processing servers based on the queue length. (Correct)
  • Use SQS queues to decouple the architecture. Scale the processing servers based on notifications sent from the SQS queues.

Answer : Use SQS queues to decouple the architecture. Scale the processing servers based on the queue length.
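The queue-length scaling decision can be sketched as a backlog-per-instance calculation; the target of roughly 100 messages per instance and the min/max bounds below are illustrative assumptions, not AWS defaults.

```python
import math

# Pick an instance count so each processing instance handles roughly
# msgs_per_instance messages -- the idea behind scaling on queue depth.
def desired_instances(queue_length, msgs_per_instance, min_n=1, max_n=10):
    n = math.ceil(queue_length / msgs_per_instance) if queue_length else min_n
    return max(min_n, min(max_n, n))

print(desired_instances(0, 100))     # 1  (scale in to the floor)
print(desired_instances(450, 100))   # 5
print(desired_instances(5000, 100))  # 10 (capped at the ceiling)
```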

You currently have data in DynamoDB tables. You have a requirement to perform complex data analysis queries on the data stored in the DynamoDB tables. How can this be achieved?


Options are :

  • Copy the data to Amazon Redshift and then perform the complex queries (Correct)
  • Query the DynamoDB tables directly, since they support complex queries
  • Copy the data to AWS EMR and then perform the complex queries
  • Copy the data to AWS QuickSight and then perform the complex queries

Answer : Copy the data to Amazon Redshift and then perform the complex queries

Certification : Get AWS Certified Solutions Architect in 1 Day (2018 Update) Set 10

There is a requirement to collect, process, and analyze video streams in real time. Which of the below services can be used for this requirement?


Options are :

  • Amazon Redshift
  • Amazon EMR
  • Amazon DynamoDB
  • Amazon Kinesis (Correct)

Answer : Amazon Kinesis

You are working with DynamoDB Streams. You need to ensure that the stream records contain the entire item in the DynamoDB table as it appeared before it was modified. Which of the following stream record views would you choose?


Options are :

  • KEYS_ONLY
  • NEW_AND_OLD_IMAGES
  • OLD_IMAGE (Correct)
  • NEW_IMAGE

Answer : OLD_IMAGE
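A toy projection of how the stream view type shapes a record; the item and key shapes mimic DynamoDB's attribute-value format, and the data is hypothetical.

```python
# The stream view type controls which image(s) a stream record carries:
# OLD_IMAGE keeps the item as it looked before modification.
def project_record(old_item, new_item, view_type, keys):
    record = {"Keys": keys}
    if view_type in ("OLD_IMAGE", "NEW_AND_OLD_IMAGES"):
        record["OldImage"] = old_item
    if view_type in ("NEW_IMAGE", "NEW_AND_OLD_IMAGES"):
        record["NewImage"] = new_item
    return record

before = {"id": {"S": "42"}, "status": {"S": "pending"}}
after = {"id": {"S": "42"}, "status": {"S": "shipped"}}
rec = project_record(before, after, "OLD_IMAGE", keys={"id": {"S": "42"}})
print(rec["OldImage"]["status"]["S"])  # pending
print("NewImage" in rec)               # False
```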

In an AWS EMR cluster, which of the following nodes is responsible for running the YARN service?


Options are :

  • Core Node
  • Primary Node
  • Master Node (Correct)
  • Task Node

Answer : Master Node

Certification : Get AWS Certified Solutions Architect in 1 Day (2018 Update) Set 13

A third-party auditor is being brought in to review security processes and configurations for all of a company's AWS accounts. Currently, the company does not use any on-premises identity provider. Instead, they rely on IAM accounts in each of their AWS accounts. The auditor needs read-only access to all AWS resources for each AWS account. Given the requirements, what is the best security method for architecting access for the security auditor?


Options are :

  • Configure an on-premises AD server and enable SAML and identity federation for single sign-on to each AWS account.
  • Create an IAM role with read-only permissions to all AWS services in each AWS account. Create one auditor IAM account and add a permissions policy that allows the auditor to assume the ARN role for each AWS account that has an assigned role. (Correct)
  • Create an IAM user for each AWS account with read-only permission policies for the auditor, and disable the accounts when the audit is complete.
  • Create a custom identity broker application that allows the auditor to use existing Amazon credentials to log into the AWS environments.

Answer : Create an IAM role with read-only permissions to all AWS services in each AWS account. Create one auditor IAM account and add a permissions policy that allows the auditor to assume the ARN role for each AWS account that has an assigned role.

Which of the following is not a best practice when it comes to designing queries in Redshift?


Options are :

  • Use cross-joins wherever possible, because this can speed up queries. (Correct)
  • Use subqueries in cases where one table in the query is used only for predicate conditions
  • Use a CASE expression to perform complex aggregations
  • Avoid using SELECT *, because this can slow down queries

Answer : Use cross-joins wherever possible, because this can speed up queries.
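As an illustration of the CASE-expression practice listed above, a sketch holding a query that folds several conditional aggregates into a single pass over the table; the table and column names are hypothetical.

```python
# One scan of the table computes both conditional totals, instead of
# running two separate filtered aggregate queries.
query = """
SELECT
  SUM(CASE WHEN status = 'shipped' THEN amount ELSE 0 END) AS shipped_total,
  SUM(CASE WHEN status = 'pending' THEN amount ELSE 0 END) AS pending_total
FROM orders;
"""
print("CASE WHEN" in query)  # True
```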

You are trying to connect to the master node for your EMR cluster. Which of the following must be checked to ensure that the connection is successful?


Options are :

  • This is not possible because you are not allowed to connect to the master node in an EMR cluster
  • Check the Outbound rules for the Security Group for the master node
  • Check the Inbound rules for the Security Group for the master node (Correct)
  • Ensure that the master node is launched in a private subnet

Answer : Check the Inbound rules for the Security Group for the master node

Certification : Get AWS Certified Solutions Architect in 1 Day (2018 Update) Set 16

Which of the following is the term given to data in machine learning where you already know the target answers?


Options are :

  • Primary data
  • Completed data
  • Labeled data (Correct)
  • Factual data

Answer : Labeled data

There is a requirement for EC2 instances in your private subnet to access DynamoDB tables. How can this be achieved?


Options are :

  • Convert the private subnet to a public subnet, since this is the only way for the access to be achieved
  • Use VPC endpoint (Correct)
  • There is no way for instances in a private subnet to access DynamoDB tables
  • Attach a virtual private gateway to the VPC

Answer : Use VPC endpoint. A VPC endpoint for DynamoDB enables Amazon EC2 instances in your VPC to use their private IP addresses to access DynamoDB with no exposure to the public internet.
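A sketch of the parameters one might pass to EC2's CreateVpcEndpoint call for a DynamoDB gateway endpoint; all IDs below are hypothetical and no API call is made here.

```python
# Gateway endpoints for DynamoDB attach to route tables, so instances in
# the associated private subnets reach DynamoDB over private IPs -- no
# internet gateway or NAT required. All IDs are hypothetical.
params = {
    "VpcId": "vpc-0123456789abcdef0",
    "ServiceName": "com.amazonaws.us-east-1.dynamodb",
    "RouteTableIds": ["rtb-0123456789abcdef0"],
}
print(params["ServiceName"])
```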

Which of the following can be used to analyze storage access patterns for your buckets stored in S3? Choose 2 answers from the options given below.


Options are :

  • Use CloudWatch logs
  • Use AWS Config
  • Use the S3 analytics feature (Correct)
  • Use Amazon QuickSight (Correct)

Answer : Use the S3 analytics feature; Use Amazon QuickSight

Certification : Get AWS Certified Solutions Architect in 1 Day (2018 Update) Set 17

Which of the following is a business analytics service provided by AWS?


Options are :

  • MicroStrategy
  • Business Objects
  • Power BI
  • QuickSight (Correct)

Answer : QuickSight

You are planning on using AWS Data Pipeline to transfer data from DynamoDB to S3. The DynamoDB table gets populated by an application, which generates entries based on orders made for particular products. How can you ensure that the Data Pipeline is triggered only when data is actually written to the DynamoDB table by the application?


Options are :

  • Configure DynamoDB Streams to notify AWS Data Pipeline once data is available
  • Use the preconditions available in AWS Data Pipeline (Correct)
  • Configure an SQS queue, with the application sending messages to the queue once the table data is available and Data Pipeline reading the data subsequently
  • Configure the application to send an SNS notification to AWS Data Pipeline once data is available

Answer : Use the preconditions available in AWS Data Pipeline
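A sketch of how a pipeline definition might wire a precondition to an activity, assuming Data Pipeline's DynamoDBDataExists precondition type; the object ids and table name are hypothetical.

```python
# The activity only runs once the precondition reports that the table
# actually contains data. Ids and names are illustrative placeholders.
precondition = {
    "id": "TableHasData",
    "type": "DynamoDBDataExists",
    "tableName": "orders",
}
activity = {
    "id": "ExportToS3",
    "type": "EmrActivity",
    "precondition": {"ref": "TableHasData"},
}
print(activity["precondition"]["ref"] == precondition["id"])  # True
```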

You currently have a Redshift cluster defined in AWS. The data is currently unencrypted. You have now decided that the cluster needs to have encrypted data. How can you achieve this? Choose 2 answers from the options given below. Each answer forms part of the solution.


Options are :

  • Unload the data from the existing source cluster (Correct)
  • Make a backup copy of the data from the existing source cluster and encrypt it with SSE
  • Enable the encryption attribute of the cluster
  • Reload the data in a new, target cluster with the chosen encryption setting (Correct)

Answer : Unload the data from the existing source cluster; Reload the data in a new, target cluster with the chosen encryption setting

Mock : AWS Certified Security Specialty

Your company maintains an e-commerce site in AWS. They want to use Amazon Machine Learning to predict how many units of a particular product will be sold. Which machine learning model would you use for this purpose?


Options are :

  • Binary classification
  • Regression (Correct)
  • Multiclass classification
  • Simple classification

Answer : Regression
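A toy regression fit, to show why regression (not classification) fits this problem: the target, units sold, is a continuous number. This uses the closed-form ordinary-least-squares solution for a single feature; the spend/units data is fabricated purely for illustration.

```python
# Fit y = slope * x + intercept by ordinary least squares (one feature).
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

spend = [1.0, 2.0, 3.0, 4.0]
units = [10.0, 20.0, 30.0, 40.0]  # perfectly linear toy data
slope, intercept = fit_line(spend, units)
print(round(slope * 5.0 + intercept))  # predicts 50 units at spend=5
```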

You are planning to use the AWS EMR service to create instances which make use of the Hadoop software. Apart from the Hadoop software, you also need some custom software to be installed on these systems. Which is the best way to get the custom software installed on instances that are launched as part of the cluster?


Options are :

  • Ensure that the EMR Cluster Instance configuration has the location in S3 for the custom software
  • Ensure that the EMR Cluster configuration has the location In S3 for the custom software
  • Use the Bootstrap actions to install the custom software (Correct)
  • Use CloudWatch agents to install the custom software

Answer : Use the Bootstrap actions to install the custom software
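A sketch of the BootstrapActions structure passed when launching an EMR cluster (for example via the RunJobFlow API); the script location and arguments are hypothetical. Bootstrap actions run on every node before Hadoop starts, which is what makes them the right hook for custom installs.

```python
# Structure of a bootstrap action pointing at an install script in S3.
# The bucket, script, and arguments are illustrative placeholders.
bootstrap_actions = [
    {
        "Name": "InstallCustomSoftware",
        "ScriptBootstrapAction": {
            "Path": "s3://example-bucket/bootstrap/install.sh",
            "Args": ["--version", "1.2"],
        },
    }
]
print(bootstrap_actions[0]["ScriptBootstrapAction"]["Path"])
```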

Which of the following needs to be done to create supervised machine learning models in the AWS ML service?


Options are :

  • Creation of training data (Correct)
  • Using external blueprints in AWS Machine Learning
  • Creation of test models in the Machine Learning Designer
  • Usage of existing blueprints in AWS Machine Learning

Answer : Creation of training data

AWS SOA-C00 Certified SysOps Administrator Associate Exam Set 5

You are currently using a Redshift table which stores data for an order processing system. This table has witnessed a large number of update/delete operations over a period of time. Recently you have noticed that the performance of the table has degraded. What can be done to improve the performance of the Redshift table?


Options are :

  • Add another table and start adding data to that table
  • Perform the cleanup operations for the table
  • Perform the VACUUM operation for the table (Correct)
  • Increase the Read and Write capacity units for the table

Answer : Perform the VACUUM operation for the table
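A small sketch of the maintenance statements involved (the table name is hypothetical): VACUUM re-sorts rows and reclaims the space left behind by deletes and updates, and ANALYZE refreshes the planner statistics afterward.

```python
# Build the pair of maintenance statements to run after heavy
# update/delete traffic on a Redshift table.
def maintenance_sql(table):
    return [f"VACUUM FULL {table};", f"ANALYZE {table};"]

print(maintenance_sql("orders"))
```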
