Amazon AWS-Certified-Solutions-Architect-Professional Exam

Total Questions: 272 | Last Updated: March 27, 2017
  • Updated AWS-Certified-Solutions-Architect-Professional Dumps
  • Based on Real AWS-Certified-Solutions-Architect-Professional Exam Scenarios
  • Free AWS-Certified-Solutions-Architect-Professional PDF Demo Available
  • Check out our AWS-Certified-Solutions-Architect-Professional Dumps in a new PDF format
  • Instant AWS-Certified-Solutions-Architect-Professional Download
  • Guaranteed AWS-Certified-Solutions-Architect-Professional Success on the First Attempt
PDF Version vs. Software Version

  • 100% Guarantee on Products: High success rate, supported by our 99.3% pass-rate history and a money-back guarantee should you fail your exam. (PDF Version: Yes | Software Version: Yes)
  • Updated Regularly: Get hold of updated exam materials every time, with free updates at no extra charge until your actual exam. (PDF Version: Yes | Software Version: Yes)
  • AWS-Certified-Solutions-Architect-Professional PDF Questions & Answers: Available in the universal Adobe PDF format; portable and printable anywhere, anytime. (PDF Version: Yes | Software Version: Yes)
  • Quality and Value: Exact exam questions with correct answers, verified by experts with years of experience in the IT field. (PDF Version: Yes | Software Version: Yes)
  • Customizable Testing Engine: Simulates a real-world exam environment to prepare you for AWS-Certified-Solutions-Architect-Professional success. (Software Version: Yes)
  • Unlimited Practice AWS-Certified-Solutions-Architect-Professional Exam Re-takes: Practice until you get it right; options to highlight missed questions let you analyse your mistakes and prepare for ultimate AWS-Certified-Solutions-Architect-Professional success. (Software Version: Yes)
  • Special Promotion: More than 30% discount on the Royal Pack. (Software Version: Yes)

Top Simulation AWS-Certified-Solutions-Architect-Professional torrent Tips!

Because all that matters here is passing the Amazon AWS-Certified-Solutions-Architect-Professional exam. Because all that you need is a high score on the AWS-Certified-Solutions-Architect-Professional exam. The only thing you need to do is download the Testking AWS-Certified-Solutions-Architect-Professional exam study guides now. We will not let you down, thanks to our money-back guarantee.

2017 Mar AWS-Certified-Solutions-Architect-Professional free braindumps

Q41. You have an application running on an EC2 instance which will allow users to download files from a private S3 bucket using a pre-signed URL. Before generating the URL, the application should verify the existence of the file in S3. How should the application use AWS credentials to access the S3 bucket securely? 

A. Use the AWS account access keys; the application retrieves the credentials from the source code of the application. 

B. Create an IAM role for EC2 that allows list access to objects in the S3 bucket; launch the instance with the role, and retrieve the role's credentials from the EC2 instance metadata. 

C. Create an IAM user for the application with permissions that allow list access to the S3 bucket; the application retrieves the IAM user credentials from a temporary directory with permissions that allow read access only to the application user. 

D. Create an IAM user for the application with permissions that allow list access to the S3 bucket; launch the instance as the IAM user, and retrieve the IAM user's credentials from the EC2 instance user data. 

Answer: B
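
As a hedged illustration of the approach in answer B, the sketch below (Python with boto3, using a hypothetical bucket and key) relies on the instance's IAM role, whose temporary credentials boto3 picks up from the instance metadata service, so no keys appear in code. It verifies the object exists with a HEAD request before generating a pre-signed URL.

```python
import boto3
from botocore.exceptions import ClientError

# boto3 picks up the temporary credentials of the instance's IAM role from
# the EC2 instance metadata service automatically; no keys appear in code.
s3 = boto3.client("s3")

BUCKET = "example-private-bucket"    # hypothetical bucket name
KEY = "downloads/report.pdf"         # hypothetical object key


def presign_if_exists(bucket, key, expires=300):
    """Return a pre-signed GET URL if the object exists, otherwise None."""
    try:
        s3.head_object(Bucket=bucket, Key=key)   # existence check
    except ClientError as err:
        if err.response["Error"]["Code"] in ("404", "NoSuchKey"):
            return None
        raise                                    # e.g. 403 if the role lacks permissions
    return s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": bucket, "Key": key},
        ExpiresIn=expires,
    )


if __name__ == "__main__":
    print(presign_if_exists(BUCKET, KEY))
```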


Q42. You have deployed a three-tier web application in a VPC with a CIDR block of 10.0.0.0/28. You initially deploy two web servers, two application servers, two database servers, and one NAT instance, for a total of seven EC2 instances. The web, application, and database servers are deployed across two Availability Zones (AZs). You also deploy an ELB in front of the two web servers, and use Route 53 for DNS. Web traffic gradually increases in the first few days following the deployment, so you attempt to double the number of instances in each tier of the application to handle the new load. Unfortunately some of these new instances fail to launch. Which of the following could be the root cause? Choose 2 answers 

A. AWS reserves the first and the last private IP address in each subnet's CIDR block, so you do not have enough addresses left to launch all of the new EC2 instances 

B. The Internet Gateway (IGW) of your VPC has scaled-up, adding more instances to handle the traffic spike, reducing the number of available private IP addresses for new instance launches 

C. The ELB has scaled-up, adding more instances to handle the traffic spike, reducing the number of available private IP addresses for new instance launches 

D. AWS reserves one IP address in each subnet's CIDR block for Route53 so you do not have enough addresses left to launch all of the new EC2 instances 

E. AWS reserves the first four and the last IP address in each subnet's CIDR block so you do not have enough addresses left to launch all of the new EC2 instances 

Answer: C, E 
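
A quick worked check of the address arithmetic behind answers C and E, as a minimal Python sketch using the standard ipaddress module. For simplicity it treats the whole /28 VPC block as a single subnet; in reality the block is split into smaller subnets across the two AZs, and five addresses are reserved in each subnet, which only makes the shortage worse. The ELB node count is an assumption for illustration.

```python
import ipaddress

vpc = ipaddress.ip_network("10.0.0.0/28")
total = vpc.num_addresses                 # 16 addresses in a /28
usable = total - 5                        # AWS reserves the first four and the last IP per subnet

initial_instances = 7                     # 2 web + 2 app + 2 db + 1 NAT
elb_nodes = 2                             # assumption: at least one ELB node per AZ, more as it scales
needed_for_doubling = 6                   # doubling the web, app and db tiers adds 6 instances

remaining = usable - initial_instances - elb_nodes
print(f"total={total}, usable={usable}, remaining={remaining}, needed={needed_for_doubling}")
# total=16, usable=11, remaining=2, needed=6 -> some of the new instances fail to launch
```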


Q43. You are the new IT architect in a company that operates a mobile sleep tracking application. When activated at night, the mobile app is sending collected data points of 1 kilobyte every 5 minutes to your backend. The backend takes care of authenticating the user and writing the data points into an Amazon DynamoDB table. Every morning, you scan the table to extract and aggregate last night's data on a per user basis, and store the results in Amazon S3. Users are notified via Amazon SNS mobile push notifications that new data is available, which is parsed and visualized by the mobile app. Currently you have around 100k users who are mostly based out of North America. You have been tasked to optimize the architecture of the backend system to lower cost. What would you recommend? Choose 2 answers 

A. Have the mobile app access Amazon DynamoDB directly instead of JSON files stored on Amazon S3. 

B. Write data directly into an Amazon Redshift cluster replacing both Amazon DynamoDB and Amazon S3. 

C. Introduce an Amazon SQS queue to buffer writes to the Amazon DynamoDB table and reduce provisioned write throughput. 

D. Introduce Amazon Elasticache to cache reads from the Amazon DynamoDB table and reduce provisioned read throughput. 

E. Create a new Amazon DynamoDB table each day and drop the one for the previous day after its  data is on Amazon S3. 

Answer: A, D 
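
To illustrate the caching idea in option D, here is a minimal cache-aside sketch in Python, assuming a hypothetical ElastiCache Redis endpoint, a hypothetical SleepData table, and a UserId/Night key schema: repeated reads are served from Redis when possible, so the DynamoDB table can be provisioned with less read throughput.

```python
import json
import boto3
import redis  # redis-py client pointed at the ElastiCache Redis endpoint

# Hypothetical names and endpoints, for illustration only.
cache = redis.Redis(host="my-cache.abc123.use1.cache.amazonaws.com", port=6379)
table = boto3.resource("dynamodb").Table("SleepData")

CACHE_TTL_SECONDS = 3600


def get_user_datapoints(user_id, night):
    """Cache-aside read: try ElastiCache first, fall back to DynamoDB on a miss."""
    cache_key = f"sleep:{user_id}:{night}"
    cached = cache.get(cache_key)
    if cached is not None:
        return json.loads(cached)                     # cache hit: no DynamoDB read consumed

    item = table.get_item(Key={"UserId": user_id, "Night": night}).get("Item", {})
    cache.setex(cache_key, CACHE_TTL_SECONDS, json.dumps(item, default=str))
    return item
```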


Q44. Your company policies require encryption of sensitive data at rest. You are considering the possible options for protecting data while storing it at rest on an EBS data volume, attached to an EC2 instance. Which of these options would allow you to encrypt your data at rest? Choose 3 answers 

A. Implement third party volume encryption tools 

B. Implement SSL/TLS for all services running on the server 

C. Encrypt data inside your applications before storing it on EBS 

D. Encrypt data using native data encryption drivers at the file system level 

E. Do nothing as EBS volumes are encrypted by default 

Answer: A, C, D 
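
As a sketch of option C, encrypting inside the application before anything is written to the EBS-backed filesystem, here is a minimal example using the third-party cryptography library's Fernet recipe; the file path is hypothetical, and key management (for example via KMS or a secrets store) is deliberately out of scope.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# In practice the key would come from a key-management service, never from
# disk alongside the data; generating one inline keeps the sketch self-contained.
key = Fernet.generate_key()
fernet = Fernet(key)

plaintext = b"sensitive record"                  # data to protect at rest

# Encrypt in the application, then write only ciphertext to the EBS volume.
ciphertext = fernet.encrypt(plaintext)
with open("/data/record.enc", "wb") as fh:       # /data = hypothetical EBS mount point
    fh.write(ciphertext)

# Reading it back requires the key, so the bytes at rest on EBS are opaque.
with open("/data/record.enc", "rb") as fh:
    recovered = fernet.decrypt(fh.read())
assert recovered == plaintext
```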


Up to the minute AWS-Certified-Solutions-Architect-Professional free questions:

Q45. You require the ability to analyze a large amount of data which is stored on Amazon S3 using Amazon Elastic MapReduce. You are using the cc2.8xlarge instance type, whose CPUs are mostly idle during processing. Which of the following would be the most cost-efficient way to reduce the runtime of the job? 

A. Create fewer, larger files in Amazon S3. 

B. Use smaller instances that have higher aggregate I/O performance. 

C. Create more, smaller files on Amazon S3. 

D. Add additional cc2.8xlarge instances by introducing a task group. 

Answer: B


Q46. An international company has deployed a multi-tier web application that relies on DynamoDB in a single region. For regulatory reasons they need disaster recovery capability in a separate region with a Recovery Time Objective of 2 hours and a Recovery Point Objective of 24 hours. They should synchronize their data on a regular basis and be able to provision the web application rapidly using CloudFormation. The objective is to minimize changes to the existing web application, control the throughput of DynamoDB used for the synchronization of data, and synchronize only the modified elements. Which design would you choose to meet these requirements? 

A. Use AWS Data Pipeline to schedule a DynamoDB cross region copy once a day, create a "LastUpdated" attribute in your DynamoDB table that would represent the timestamp of the last update and use it as a filter 

B. Use AWS Data Pipeline to schedule an export of the DynamoDB table to S3 in the current region once a day, then schedule another task immediately after it that will import data from S3 to DynamoDB in the other region 

C. Use EMR and write a custom script to retrieve data from DynamoDB in the current region using a SCAN operation and push it to DynamoDB in the second region 

D. Also send each write to an SQS queue in the second region, and use an Auto Scaling group behind the SQS queue to replay the writes in the second region 

Answer: A
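
Option A hinges on being able to pull only items modified since the last copy. A minimal sketch of that filter (Python/boto3, assuming a hypothetical table named Orders and an ISO-8601 "LastUpdated" attribute maintained on every write) is shown below; a Data Pipeline activity or custom job could apply the same condition.

```python
import boto3
from boto3.dynamodb.conditions import Attr

# Hypothetical table and attribute names, for illustration only.
table = boto3.resource("dynamodb", region_name="us-east-1").Table("Orders")


def modified_since(cutoff_iso):
    """Yield only items whose LastUpdated attribute is newer than the cutoff."""
    kwargs = {"FilterExpression": Attr("LastUpdated").gt(cutoff_iso)}
    while True:
        page = table.scan(**kwargs)
        yield from page.get("Items", [])
        if "LastEvaluatedKey" not in page:            # finished paging through the table
            break
        kwargs["ExclusiveStartKey"] = page["LastEvaluatedKey"]


# Example: items changed since the previous daily sync.
for item in modified_since("2017-03-26T00:00:00Z"):
    pass  # hand the item to the cross-region copy step
```

Note that a FilterExpression is evaluated after items are read, so the scan still consumes read capacity for the whole table; the filter limits what gets copied across regions, while throughput is controlled by how aggressively the job pages.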


Q47. Your company plans to host a large donation website on Amazon Web Services (AWS). You anticipate a large and undetermined amount of traffic that will create many database writes. To be certain that you do not drop any writes to a database hosted on AWS, which service should you use? 

A. Amazon Simple Queue Service (SQS) for capturing the writes and draining the queue to write to the database. 

B. Amazon DynamoDB with provisioned write throughput up to the anticipated peak write throughput. 

C. Amazon ElastiCache to store the writes until the writes are committed to the database. 

D. Amazon RDS with provisioned IOPS up to the anticipated peak write throughput. 

Answer: A
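
A minimal sketch of answer A (Python/boto3, with a hypothetical queue name and a placeholder database-write function): the web tier enqueues every donation write, and a worker drains the queue at a rate the database can sustain, so traffic bursts never drop writes.

```python
import json
import boto3

sqs = boto3.resource("sqs")
queue = sqs.get_queue_by_name(QueueName="donation-writes")  # hypothetical queue name


def enqueue_donation(donation):
    """Web tier: capture the write durably instead of hitting the database directly."""
    queue.send_message(MessageBody=json.dumps(donation))


def drain_queue(write_to_db):
    """Worker: pull messages and commit them at a rate the database can handle."""
    while True:
        for msg in queue.receive_messages(MaxNumberOfMessages=10, WaitTimeSeconds=20):
            write_to_db(json.loads(msg.body))   # placeholder for the actual DB insert
            msg.delete()                        # only delete after a successful commit
```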


Q48. A corporate web application is deployed within an Amazon Virtual Private Cloud (VPC), and is connected to the corporate data center via an IPsec VPN. The application must authenticate against the on-premises LDAP server. After authentication, each logged-in user can only access an Amazon Simple Storage Service (S3) keyspace specific to that user. Which two approaches can satisfy these objectives? Choose 2 answers 

A. The application authenticates against IAM Security Token Service using the LDAP credentials. The application uses those temporary AWS security credentials to access the appropriate S3 bucket. 

B. Develop an identity broker that authenticates against LDAP, and then calls IAM Security Token Service to get IAM federated user credentials. The application calls the Identity broker to get IAM federated user credentials with access to the appropriate S3 bucket. 

C. The application authenticates against LDAP, and retrieves the name of an IAM role associated with the user. The application then calls the IAM Security Token Service to assume that IAM role. The application can use the temporary credentials to access the appropriate S3 bucket. 

D. The application authenticates against LDAP. The application then calls the AWS Identity and Access Management (IAM) Security Service to log in to IAM using the LDAP credentials. The application can use the IAM temporary credentials to access the appropriate S3 bucket. 

E. Develop an identity broker that authenticates against IAM Security Token Service to assume an IAM role in order to get temporary AWS security credentials. The application calls the identity broker to get AWS temporary security credentials with access to the appropriate S3 bucket. 

Answer: B, C 
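
As a hedged sketch of the identity-broker flow in answer C (answer B follows the same idea using GetFederationToken instead of AssumeRole), the Python snippet below uses hypothetical role, bucket, and account identifiers: after a successful LDAP bind, the broker requests temporary STS credentials scoped to the user's own S3 prefix, and the application uses only those.

```python
import json
import boto3

sts = boto3.client("sts")


def credentials_for(username):
    """Called by the identity broker *after* a successful LDAP bind."""
    # Session policy limiting the credentials to this user's keyspace (hypothetical bucket/prefix).
    scoped_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": f"arn:aws:s3:::corp-user-data/{username}/*",
        }],
    }
    resp = sts.assume_role(
        RoleArn="arn:aws:iam::123456789012:role/AppS3Access",  # hypothetical role ARN
        RoleSessionName=username,
        Policy=json.dumps(scoped_policy),        # narrows the role's permissions for this session
        DurationSeconds=3600,
    )
    return resp["Credentials"]                   # AccessKeyId, SecretAccessKey, SessionToken
```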

