Friday, March 5, 2021

5V0-91.20 VMware Carbon Black Portfolio Skills Exam

 

EXAM NUMBER : 5V0-91.20
PRODUCT : Carbon Black
EXAM LANGUAGE : English
Associate Certifications : VMware Carbon Black EndPoint Protection 2021

EXAM OVERVIEW
The VMware Carbon Black Portfolio Skills exam badge validates a candidate's knowledge of how to use the capabilities of the products according to the organization's security posture and organizational policies.

Exam Info
Duration : 150 minutes
Number of Questions : 60
Passing Score : 300 (scaled)
Format : Single and Multiple Choice, Proctored

Exam Details
The VMware Carbon Black Portfolio Skills exam (5V0-91.20), which leads to the VMware Carbon Black EndPoint Protection 2021 badge, is a 60-item exam with a passing score of 300 using a scaled method. Exam time is 150 minutes.

Exam Delivery
This is a proctored exam delivered through Pearson VUE. For more information, visit the Pearson VUE website.

Certification Information
For details and a complete list of requirements and recommendations for attainment, please reference the VMware Education Services – Certification website.

Minimally Qualified Candidate
The minimally qualified candidate (MQC) has experience with VMware Carbon Black and is able to administer, operationalize, manage, and configure the product to meet their organization's goals. The MQC also leverages the tool's capabilities to achieve the organization's security goals.
The MQC should have all the knowledge contained in the associated prerequisite courses and exam sections listed below.

Exam Sections
If a section is missing from the list below, it is because the exam has no testable objectives for that section. The objective numbering may be referenced in your score report at the end of your testing event for further preparation, should a retake of the exam be necessary.

Section 1 – Introduction – There are no testable objectives for this section

Section 2 – VMware Products and Solutions
Objective 2.1: Given a scenario about App Control user accounts with privileges, identify how they should be assigned.
Objective 2.2: Identify the characteristics of enforcement levels in App Control.
Objective 2.3: Given an App Control use case, identify the enforcement level that should be used.
Objective 2.4: Given an App Control use case, identify computers that meet the specified state or condition.
Objective 2.5: Given a scenario about managing an endpoint, identify how to accomplish this with App Control.
Objective 2.6: Given an App Control use case, identify the required rule type that should be used.
Objective 2.7: Given a scenario where alerting is needed, identify the criterion that should be configured in App Control.
Objective 2.8: Given an event in App Control, identify the components, the event type, or the meaning of the event.

Section 3 – VMware Carbon Black EDR

Objective 3.1: Identify the EDR components and dataflows.
Objective 3.2: Given a scenario, identify how to manage and configure EDR Sensor groups.
Objective 3.3: Given a scenario including a search in EDR, identify what is being searched for.
Objective 3.4: Given a scenario including a graphic in EDR, analyze the data given.
Objective 3.5: Identify the characteristics of binary search and banning binaries in EDR.
Objective 3.6: Identify characteristics that impact search performance in EDR.
Objective 3.7: Identify how and when to use and configure feeds in EDR.
Objective 3.8: Identify how to create and review watchlists in EDR.
Objective 3.9: Given a scenario about an alert, identify the proper response mechanism in EDR.

Section 4 – VMware Carbon Black Cloud Endpoint Standard

Objective 4.1: Identify the communication process and requirements for sensor-to-server communications in Endpoint Standard.
Objective 4.2: Given a scenario including a search in Cloud Endpoint and its results, analyze the results.
Objective 4.3: Identify characteristics of policy-centered components and sensor options in Cloud Endpoint.
Objective 4.4: Identify the characteristics of permissions and blocking and isolation rules for Cloud Endpoint.
Objective 4.5: Identify the impact of reputation on rules in Cloud Endpoint.
Objective 4.6: Identify the structure of an alert in Cloud Endpoint.
Objective 4.7: Given a scenario about an alert including the investigation and triage pages, identify the components of the alert in Cloud Endpoint.
Objective 4.8: Given a scenario about an alert, identify how to respond using a Cloud Endpoint response option.

Section 5 – VMware Carbon Black Cloud Enterprise EDR

Objective 5.1: Given a scenario including a watchlist, identify the components of the watchlist in Cloud Enterprise EDR.
Objective 5.2: Identify the structure of an alert in Cloud Enterprise EDR.
Objective 5.3: Given a scenario about an alert including the process and binary analysis pages, identify the components of the alert in Cloud Enterprise EDR.
Objective 5.4: Given a scenario about an environment, and an example and a goal, identify the query that should be created to accomplish the goal.
Objective 5.5: Given a scenario about an alert, identify how to respond using a Cloud Enterprise EDR response option.

Section 6 – VMware Carbon Black Cloud Audit and Remediation

Objective 6.1: Identify how to perform basic queries with osquery in Cloud Audit and Remediation.
Objective 6.2: Given a query, identify the framework or structure of the query in Cloud Audit and Remediation.
Objective 6.3: Given a query from the UI, identify its function and interpret the results for Cloud Audit and Remediation.
Objective 6.4: Given a scenario about Cloud Audit and Remediation, identify the components in osquery statements.
Objective 6.5: Identify how to exclude data from results using WHERE statements in Cloud Audit and Remediation.
Objective 6.6: Given a scenario, identify the type of query that should be used for Cloud Audit and Remediation.
Objective 6.7: Given a scenario including the requirement for a specific result, identify how to use Advanced SQL components to achieve the results.
Objective 6.8: Identify Cloud Audit and Remediation Live Response capabilities, limitations, and features.
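Objectives 6.1 through 6.7 all revolve around osquery's SQL dialect, which runs standard SQL against virtual tables (osquery is built on SQLite). As a rough illustration of the WHERE-clause exclusion that objective 6.5 describes, the sketch below uses Python's built-in sqlite3 module with a stand-in `processes` table; the rows are invented, and the real osquery schema may differ.

```python
import sqlite3

# In-memory stand-in for osquery's virtual "processes" table.
# Row data is invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE processes (name TEXT, path TEXT, pid INTEGER)")
conn.executemany(
    "INSERT INTO processes VALUES (?, ?, ?)",
    [("svchost.exe", "C:\\Windows\\System32\\svchost.exe", 100),
     ("chrome.exe", "C:\\Program Files\\Google\\chrome.exe", 200),
     ("unknown.exe", "C:\\Temp\\unknown.exe", 300)],
)

# Objective 6.5's pattern: exclude expected data with a WHERE ... NOT LIKE
# filter, so only processes running outside C:\Windows remain.
rows = conn.execute(
    "SELECT name, path FROM processes "
    "WHERE path NOT LIKE 'C:\\Windows\\%'"
).fetchall()
print([name for name, _ in rows])  # ['chrome.exe', 'unknown.exe']
```

The same SELECT/WHERE shape carries over directly to queries typed into the Cloud Audit and Remediation console.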

Recommended Courses

VMware Carbon Black Portfolio: Configure and Manage

Actualkey VMware 5V0-91.20 Exam pdf, Certkingdom VMware 5V0-91.20 PDF

MCTS Training, MCITP Training

Best VMware 5V0-91.20 Certification, VMware 5V0-91.20 Training at certkingdom.com

Tuesday, December 29, 2020

CDCP-001 Certified Data Centre Professional (CDCP) Exam

 

Exam Code: CDCP-001
GAQM provides an international, vendor-neutral credential (CDCP) with a global standard for measuring competency in the core elements of a data center.

The Certified Data Centre Professional (CDCP)™ certification indicates foundational knowledge of critical physical infrastructure in the data center. Certified Data Centre Professionals must demonstrate base-level proficiency in the elements of cooling, fire safety and protection, racks, cabling, management, and physical security. Candidates appearing for the CDCP exam must have sufficient knowledge of data center design and cabling strategies. The CDCP certification can significantly increase productivity and proficiency because certified professionals have the knowledge to successfully overcome obstacles faced in data center design, build, and operations.

E-Course Duration: 10 to 15 Hours

e-Competence Framework (e-CF)

This certificate is mapped against the e-Competence Framework. To learn more about the e-Competence Framework (e-CF), visit ECF.

The exam comprises 40 multiple-choice questions, of which the candidate needs to score 65% (26 out of 40 correct) to pass.
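That arithmetic can be double-checked with integer ceiling division; a trivial sketch (an illustration only, not exam content):

```python
total_questions = 40
passing_percent = 65

# Integer ceiling division: the smallest whole number of correct
# answers whose share of the total reaches 65%.
passing_count = (passing_percent * total_questions + 99) // 100

print(passing_count)  # 26, matching the stated 26-out-of-40 threshold
```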

Exams are online and proctored; with a webcam and a reliable internet connection, exams can be taken anywhere and anytime.

The total duration of the exam is 1 hour (60 Minutes).

No external sources of information may be accessed during the exam, which is held via ProctorU. Further details of the permitted materials are provided:

Identification Proof
If a candidate does not pass the exam on the second (2nd) attempt, the candidate must wait at least fourteen (14) calendar days from the date of that attempt before retaking the exam a third (3rd) or any subsequent time.
The exam can be taken any number of times.

The Certified Data Centre Professional (CDCP)™ Certificate is valid for life.
CDCP™ is a trademark of GAQM.

Note: The Certified Data Centre Professional (CDCP)™ Certification requires a mandatory E-Course completion requirement.

Course Outline
Module 1 – Fundamentals of Availability

Introduction
Measuring Business Value
Five 9’s of Availability
Limitations of 99.999%
Factors affecting Availability
A/C Power Conditions
Cooling Issues
Equipment Failures
Natural and Artificial Disasters
Human Errors
Cost of Downtime
Calculating Cost of Downtime
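The "Five 9's" topics above come down to converting an availability percentage into permitted downtime. A minimal sketch of that conversion (my illustration, not course material; it ignores leap years and scheduled maintenance):

```python
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes in a non-leap year

def downtime_minutes_per_year(availability_pct: float) -> float:
    """Minutes of permitted downtime per year at a given availability."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

for pct in (99.9, 99.99, 99.999):
    print(f"{pct}% -> {downtime_minutes_per_year(pct):.2f} min/year")
```

Five nines (99.999%) works out to roughly 5.26 minutes of downtime per year, which is why the outline pairs it with a "Limitations" topic: a single A/C power event or cooling failure can consume the entire annual budget.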

Module 2 – Examining Fire protection methods in the Data Center

Introduction
National Fire Protection Association
Prevention
System Objectives of Data Center Fire Protection System
Fire Triangle
Classes of Fire
Stages of Combustion
Fire Detection Devices
Smoke Detectors
ISTD
Fire Extinguishers
Methods of Fire Suppression
Water Sprinkler System
Water Mist Suppression System

Module 3 – Fundamentals of Cabling strategies for Data Centers

Introduction
Cabling
Overview of Cables
Cabling Installation
Cable Layout Architectures
Cable Management
Managing Cables
Cable Maintenance Practices

Module 4 – Fundamentals of Cooling I

Introduction
Evolution
Data Center Cooling
Physics of Cooling
Heat Transfer Methods
Airflow in IT Spaces
Heat Generation
Gas Law
Evaporation
Compression
Condensation
Expansion
Evaporator


Module 5 – Fundamentals of Cooling II: Humidity in the Data Center

Introduction
Cooling Related Devices
Humidity and Static Electricity
Nature of Humidity
Humidity Control in Data Center
Relative Humidity Control
Dew Point Control
Humidification System
Converted Office Space
OSP’s
Short Cycling

Target Audience
System integrators involved in data centre IT operations activities, serving their own data centre or those owned by their customers
Commercial customers who have to maintain their own data centre
Personnel working in commercial companies who are responsible for data centre IT operations
IT, facilities, or data centre operations professionals

QUESTION 1
Which one of the following is an Objective of Data Center Fire Protection?

A. Information
B. Representation
C. Depression
D. Suppression

Correct Answer: D

QUESTION 2
Which Class of Fires involves energized electrical equipment?

A. Class A
B. Class B
C. Class C
D. Class K

Correct Answer: C

QUESTION 3
Which source is used in fiber cable to transmit data?

A. Signals
B. Electric
C. Light
D. Pulse

Correct Answer: C

QUESTION 4
Which one of the following is an AC Power Quality Anomaly?

A. Signal Distortion
B. Waveform Distortion
C. Backup Condition
D. Attenuation

Correct Answer: B

QUESTION 5
Which Class of Fire involves combustible metals or combustible metal alloys such as magnesium, sodium and potassium?

A. Class A
B. Class B
C. Class C
D. Class D

Correct Answer: D


Saturday, December 26, 2020

C2140-823 Rational Quality Manager V3 Exam

 

QUESTION 1
What are three acceptable parameters for the IBM Rational Quality Manager out-of-the-box report:
Execution Status using TER count? (Choose three.)

A. test plan
B. test milestone
C. defect logged
D. test case
E. build

Answer: A,B,D
 

QUESTION 2
IBM Rational Quality Manager out-of-box reports are grouped under which three themes? (Choose three.)

A. defects
B. test case
C. cost
D. section manager
E. lab manager

Answer: A,B,E

QUESTION 3
What are the possible states of a test plan in its state transition model?

A. draft, ready for review, reviewed, closed
B. draft, under review, approved, retired
C. created, under review, reviewed, retired
D. created, ready for review, approved, closed

Answer: B

QUESTION 4
RRDI supports which application server?

A. Tomcat
B. WAS (32-bit)
C. WAS (64-bit)
D. WebLogic

Answer: B


Saturday, December 12, 2020

PL-400 Microsoft Power Platform Developer Exam

 

Skills measured
Create a technical design (10-15%)
Configure Common Data Service (15-20%)
Create and configure Power Apps (15-20%)
Configure business process automation (5-10%)
Extend the user experience (10-15%)
Extend the platform (15-20%)
Develop integrations (5-10%)

Audience Profile
Candidates for this exam design, develop, secure, and troubleshoot Power Platform solutions. Candidates implement components of a solution that include application enhancements, custom user experience, system integrations, data conversions, custom process automation, and custom visualizations. Candidates must have strong applied knowledge of Power Platform services, including in-depth understanding of capabilities, boundaries, and constraints. Candidates should have a basic understanding of DevOps practices for Power Platform. Candidates should have development experience that includes Power Platform services, JavaScript, JSON, TypeScript, C#, HTML, .NET, Microsoft Azure, Microsoft 365, RESTful Web Services, ASP.NET, and Power BI.

Skills Measured NOTE:
The bullets that appear below each of the skills measured are intended to illustrate how we are assessing that skill. This list is not definitive or exhaustive.

NOTE: In most cases, exams do NOT cover preview features, and some features will only be added to an exam when they are GA (General Availability).

Create a technical design (10-15%)
Validate requirements and design technical architecture
 design and validate the technical architecture for a solution
 design authentication and authorization strategy
 determine whether you can meet requirements with out-of-the-box functionality
 determine when to use Logic Apps versus Power Automate flows
 determine when to use serverless computing, plug-ins, or Power Automate
 determine when to build a virtual entity data source provider and when to use connectors

Design solution components
 design a data model
 design Power Apps reusable components
 design custom connectors
 design server-side components

Describe Power Platform extensibility points
 describe Power Virtual Agents extensibility points including Bot Framework skills and Power Automate flows
 describe Power BI extensibility points including Power BI APIs, custom visuals, and embedding Power BI apps in websites and other applications
 describe Power Apps portal extensibility points including CRUD APIs and custom styling

Configure Common Data Service (15-20%)

Configure security to support development

 troubleshoot operational security issues
 create or update security roles and field-level security profiles
 configure business units and teams

Implement entities and fields

 configure entity and entity options
 configure fields
 configure relationships and types of behaviors

Implement application lifecycle management (ALM)
 create solutions and manage solution components
 import and export solutions
 manage solution dependencies
 create a package for deployment
 automate deployments
 implement source control for projects including solutions and code assets

Create and configure Power Apps (15-20%)

Create model-driven apps
 configure a model-driven app
 configure forms
 configure views
 configure visualizations

Create canvas apps

 create and configure a canvas app
 implement complex formulas to manage control events and properties
 analyze app usage by using App Insights
 build reusable component libraries

Manage and troubleshoot apps
 troubleshoot app issues by using Monitor and other browser-based debugging tools
 interpret results from App Checker and Solution Checker
 identify and resolve connector and API errors
 optimize app performance including pre-loading data and query delegation

Configure business process automation (5-10%)

Configure Power Automate

 create and configure a flow
 configure steps to use Common Data Service connector actions and triggers
 implement complex expressions in flow steps
 implement error handling
 troubleshoot flows by analyzing JSON responses from connectors

Implement processes
 create and configure business process flows
 create and configure business rules
 create, manage, and interact with business process flows by using server-side and client-side code
 troubleshoot processes

Extend the user experience (10-15%)

Apply business logic using client scripting

 create JavaScript or TypeScript code that targets the XRM API
 register an event handler
 create client-side scripts that target the Common Data Service Web API

Create a Power Apps Component Framework (PCF) component

 describe the PCF component lifecycle
 initialize a new PCF component
 configure a PCF component manifest
 implement the component interfaces
 package, deploy, and consume the component
 configure and use PCF Device, Utility, and WebAPI features
 test and debug PCF components by using the local test harness

Create a command button function
 create the command function
 design command button rules and actions
 edit the command bar by using the Ribbon Workbench
 manage dependencies between JavaScript libraries

Extend the platform (15-20%)

Create a plug-in
 describe the plug-in execution pipeline
 design and develop a plug-in
 debug and troubleshoot a plug-in
 implement business logic by using pre and post images
 perform operations on data by using the Organization service API
 optimize plug-in performance
 register custom assemblies by using the Plug-in Registration Tool
 develop a plug-in that targets a custom action message

Create custom connectors

 create a definition for the API
 configure API security
 use policy templates to modify connector behavior at runtime
 expose Azure Functions as custom connectors
 create custom connectors for public APIs by using Postman

Use platform APIs
 interact with data and processes by using the Common Data Service Web API or the Organization Service
 implement API limit retry policies
 optimize for performance, concurrency, transactions, and batching
 query the Discovery service to discover the URL and other information for an organization
 perform entity metadata operations with the Web API
 perform authentication by using OAuth

Process workloads

 process long-running operations by using Azure Functions
 configure scheduled and event-driven function triggers in Azure Functions
 authenticate to the Power Platform by using managed identities

Develop Integrations (5-10%)

Publish and consume events
 publish an event by using the API
 publish an event by using the Plug-in Registration Tool
 register service endpoints including webhooks, Azure Service Bus, and Azure Event Hub
 implement a Common Data Service listener for an Azure solution
 create an Azure Function that interacts with Power Platform

Implement data synchronization
 configure entity change tracking
 read entity change records by using platform APIs
 create and use alternate keys

QUESTION 1
You need to improve warehouse counting efficiency.
What should you create?

A. a flow that updates the warehouse counts as the worker performs the count
B. a model-driven app that allows the user to key in inventory counts
C. A Power BI dashboard that shows the inventory counting variances
D. a canvas app that scans barcodes to allow a warehouse worker to select inventory counts

Correct Answer: D

QUESTION 2
You need to replace the bicycle inspection forms.
Which two solutions should you use? Each answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. a flow that maps inspection data to Dynamics 365 Field Service
B. a logic app that guides the technician through the inspection
C. a canvas app that guides the technician through the inspection
D. a model-driven app based on customer service entities

Correct Answer: AD

QUESTION 3
You are building a custom application in Azure to process resumes for the HR department.
The app must monitor submissions of resumes.
You need to parse the resumes and save contact and skills information into the Common Data Service.
Which mechanism should you use?

A. Power Automate
B. Common Data Service plug-in
C. Web API
D. Custom workflow activity

Correct Answer: A

QUESTION 4
You need to add the script for the registration form event handling.
Which code segment should you use?

A. formContext.data.entity.addOnSave(myFunction)
B. formContext.data.addOnLoad(myFunction)
C. formContext.data.removeOnLoad(myFunction)
D. addOnPreProcessStatusChange
E. formContext.data.isValid()

Correct Answer: B


Thursday, December 10, 2020

DES-1221 Specialist - Implementation Engineer, PowerStore Solutions Exam

 

Certification Overview
This certification benefits any professional implementing and administering PowerStore storage arrays in open systems environments. The certification focuses on configuration, administration, migration, upgrades and basic troubleshooting.

Certification Requirements

To complete the requirements for this certification you must:
1. Achieve one of the following Associate level certifications*
• Associate - Information Storage and Management Version 2.0
• Associate - Information Storage and Management Version 3.0
• Associate - Information Storage and Management Version 4.0

2. Pass the following Specialist exam on or after May 06, 2020:
• DES-1221 Specialist – Implementation Engineer, PowerStore Solutions Exam
Note: These details reflect certification requirements as of May 06, 2020.

The Proven Professional Program periodically updates certification requirements.
*Please check the Proven Professional CertTracker website regularly for the latest information and for other options to meet the Associate level requirement.

Overview
This exam is a qualifying exam for the Specialist – Implementation Engineer, PowerStore Solutions (DCS-IE) track.
This exam focuses on implementation and administration of PowerStore storage arrays in open systems environments. The exam covers configuration, administration, migration, upgrades and basic troubleshooting.
Dell Technologies provides free practice tests to assess your knowledge in preparation for the exam. Practice tests allow you to become familiar with the topics and question types you will find on the proctored exam. Your results on a practice test offer one indication of how prepared you are for the proctored exam and can highlight topics on which you need to study and train further. A passing score on the practice test does not guarantee a passing score on the certification exam.

Products
Products likely to be referred to on this exam include but are not limited to:
• Dell EMC PowerStore 1000T
• Dell EMC PowerStore 1000X
• Dell EMC PowerStore 3000T
• Dell EMC PowerStore 3000X
• Dell EMC PowerStore 5000T
• Dell EMC PowerStore 5000X
• Dell EMC PowerStore 7000T
• Dell EMC PowerStore 7000X
• Dell EMC PowerStore 9000T
• Dell EMC PowerStore 9000X

Exam Topics
Topics likely to be covered on this exam include:

PowerStore Concepts and Features (5%)
• Describe the PowerStore system and use cases
• Identify the PowerStore system configurations and models
• Provide an overview of PowerStore architecture and hardware components

PowerStore Cabling (6%)
• Demonstrate front-end cabling for PowerStore models
• Demonstrate back-end cabling for PowerStore
• Describe Ethernet cabling for PowerStore

PowerStore Implementation (18%)
• Perform PowerStore installation planning
• Rack and stack PowerStore systems
• Configure Ethernet switching for PowerStore
• Discover and perform initial system configuration of PowerStore
• Describe and perform the licensing process for PowerStore

PowerStore Configuration (22%)
• Provision and configure block, file, and VMware storage
• Configure host and client access to PowerStore block, file, and VMware storage
• Describe and configure local and remote data protection on PowerStore storage resources
• Describe and configure intercluster data migration

PowerStore Administration (22%)

• Create and administer management users on PowerStore
• Perform administrative operations for PowerStore block, file, and VMware storage
• Perform PowerStore local and remote data protection operations
• Perform PowerStore intercluster data migration operations

PowerStore Migration (15%)
• Describe the PowerStore Migration feature and process
• Identify the PowerStore Migration requirements and capabilities
• Configure the migration requirements and capabilities
• Configure the migration feature to import data from supported sources
• Perform migration operations

PowerStore Software and Hardware upgrades (7%)
• Describe and perform PowerStore software upgrade
• Add an appliance into a cluster
• Add an expansion shelf to a PowerStore appliance
• Add drives to a PowerStore appliance

PowerStore Basic Troubleshooting (5%)

• View system alerts, events, and jobs
• Gather support materials
• Identify PowerStore system fault LEDs
The percentages after each topic above reflect the approximate distribution of the total question set across the exam.
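As a quick check, the topic weights listed in the outline above account for the full exam:

```python
# Topic weights (percent) copied from the DES-1221 outline above.
weights = {
    "PowerStore Concepts and Features": 5,
    "PowerStore Cabling": 6,
    "PowerStore Implementation": 18,
    "PowerStore Configuration": 22,
    "PowerStore Administration": 22,
    "PowerStore Migration": 15,
    "PowerStore Software and Hardware upgrades": 7,
    "PowerStore Basic Troubleshooting": 5,
}
print(sum(weights.values()))  # 100
```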

QUESTION 1
When planning rack layout, why should base enclosures be grouped together?

A. Power distribution requirements
B. Rack weight balance
C. Ease of cable management
D. Hot aisle requirement

Correct Answer: A

QUESTION 2
What does the remote replication Failover operation do?

A. Fully synchronizes the source and destination data states and reverses the replication direction
B. Fully synchronizes the source and destination data states and stops the current replication session
C. Promotes the destination system to production with its replica data consistent to the last successful RPO synchronization state
D. Promotes the destination system to production and resets the RPO synchronization cycle in the protection policy

Correct Answer: D

QUESTION 3
What describes the Import External Storage feature?

A. An external plugin feature for hosts to tune application workload IO on PowerStore NVMe based storage
B. A native network configuration feature of PowerStore that configures external Ethernet switching for PowerStore installation
C. A feature external to PowerStore that orchestrates vMotion movement of virtual machines onto PowerStore X systems
D. A native data migration feature of PowerStore that imports external block storage to PowerStore

Correct Answer: D

QUESTION 4
What is the recommended Windows multi-pathing policy for block volumes on a PowerStore array?

A. Least QueueDepth
B. Round-Robin
C. FailOver
D. Least Weighted Paths

Correct Answer: B


Tuesday, December 8, 2020

C2020-013 IBM SPSS Modeler DataMining for Business Partners v2

 

Certkingdom's preparation material includes the most excellent features, prepared by dedicated experts who have come together to offer an integrated solution. We provide the most excellent and simple method to pass your certification exams on the first attempt, "GUARANTEED".

Whether you want to improve your skills, your expertise, or your career growth, Certkingdom's training and certification resources help you achieve your goals. Our exam files feature hands-on tasks and real-world scenarios; in just a matter of days, you'll be more productive and embracing new technology standards. Our online resources and events enable you to focus on learning just what you want on your timeframe. You get access to every exam file, and we continuously update our study materials; these exam updates are supplied free of charge to our valued customers. Get the best C2020-013 exam training; as you study from our exam files: "Best Materials, Great Results".

Make the Best Choice: Choose Certkingdom
Make yourself more valuable in today's competitive computer industry. We provide the most excellent and simple method to pass your IBM Business Analytics C2020-013 exam on the first attempt, "GUARANTEED".

QUESTION 1
How many phases are in the CRISP-DM Process Methodology?

A. Four
B. Five
C. Six
D. Seven

Answer: C

QUESTION 2
True or false: the CRISP-DM Process Methodology is a linear process.

A. True
B. False

Answer: B

QUESTION 3
Which node is used to read data from a comma delimited text file?

A. Var. File
B. Data Collection
C. Fixed File
D. Statistics File

Answer: A

QUESTION 4
Which node can be used to impute (estimate) missing values?

A. Data Audit node
B. Balance node
C. Filler node
D. Reclassify node

Answer: A


Monday, December 7, 2020

CCA-505 Cloudera Certified Administrator for Apache Hadoop (CCAH) CDH5 Upgrade Exam

 

A Cloudera Certified Administrator for Apache Hadoop (CCAH) certification proves that you have demonstrated your technical knowledge, skills, and ability to configure, deploy, maintain, and secure an Apache Hadoop cluster.

Cloudera Certified Administrator for Apache Hadoop (CCA-500)
Number of Questions: 60 questions
Time Limit: 90 minutes
Passing Score: 70%
Language: English, Japanese
Price: NOT AVAILABLE
Exam Sections and Blueprint

1. HDFS (17%)
Describe the function of HDFS daemons
Describe the normal operation of an Apache Hadoop cluster, both in data storage and in data processing
Identify current features of computing systems that motivate a system like Apache Hadoop
Classify major goals of HDFS Design
Given a scenario, identify appropriate use case for HDFS Federation
Identify components and daemon of an HDFS HA-Quorum cluster
Analyze the role of HDFS security (Kerberos)
Determine the best data serialization choice for a given scenario
Describe file read and write paths
Identify the commands to manipulate files in the Hadoop File System Shell

2. YARN (17%)
Understand how to deploy core ecosystem components, including Spark, Impala, and Hive
Understand how to deploy MapReduce v2 (MRv2 / YARN), including all YARN daemons
Understand basic design strategy for YARN and Hadoop
Determine how YARN handles resource allocations
Identify the workflow of job running on YARN
Determine which files you must change and how in order to migrate a cluster from MapReduce version 1 (MRv1) to MapReduce version 2 (MRv2) running on YARN

3. Hadoop Cluster Planning (16%)
Principal points to consider in choosing the hardware and operating systems to host an Apache Hadoop cluster
Analyze the choices in selecting an OS
Understand kernel tuning and disk swapping
Given a scenario and workload pattern, identify a hardware configuration appropriate to the scenario
Given a scenario, determine the ecosystem components your cluster needs to run in order to fulfill the SLA
Cluster sizing: given a scenario and frequency of execution, identify the specifics for the workload, including CPU, memory, storage, disk I/O
Disk Sizing and Configuration, including JBOD versus RAID, SANs, virtualization, and disk sizing requirements in a cluster
Network Topologies: understand network usage in Hadoop (for both HDFS and MapReduce) and propose or identify key network design components for a given scenario

4. Hadoop Cluster Installation and Administration (25%)
Given a scenario, identify how the cluster will handle disk and machine failures
Analyze a logging configuration and logging configuration file format
Understand the basics of Hadoop metrics and cluster health monitoring
Identify the function and purpose of available tools for cluster monitoring
Be able to install all the ecosystem components in CDH 5, including (but not limited to): Impala, Flume, Oozie, Hue, Cloudera Manager, Sqoop, Hive, and Pig
Identify the function and purpose of available tools for managing the Apache Hadoop file system
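For the logging-configuration objective, Hadoop daemons log through log4j, configured in log4j.properties. A hedged fragment of the common rolling-file setup (property names follow standard log4j 1.x conventions; the size and count values are illustrative):

```properties
# log4j.properties sketch: route daemon logs to a rolling file appender.
hadoop.root.logger=INFO,RFA
log4j.appender.RFA=org.apache.log4j.RollingFileAppender
log4j.appender.RFA.File=${hadoop.log.dir}/${hadoop.log.file}
log4j.appender.RFA.MaxFileSize=256MB
log4j.appender.RFA.MaxBackupIndex=20
log4j.appender.RFA.layout=org.apache.log4j.PatternLayout
log4j.appender.RFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
```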

5. Resource Management (10%)
Understand the overall design goals of each of Hadoop schedulers
Given a scenario, determine how the FIFO Scheduler allocates cluster resources
Given a scenario, determine how the Fair Scheduler allocates cluster resources under YARN
Given a scenario, determine how the Capacity Scheduler allocates cluster resources
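The Fair Scheduler objective is easiest to reason about as max-min fairness: capacity is split evenly across queues, and any share a queue cannot use is redistributed to queues that still have demand. A minimal sketch of that idea (not the actual YARN implementation, which also handles weights, minimum shares, and preemption):

```python
# Minimal max-min fair allocation sketch, the idea behind the
# Fair Scheduler. Capacity and demands are in abstract resource units.
def fair_shares(capacity, demands):
    shares = [0.0] * len(demands)
    active = [i for i, d in enumerate(demands) if d > 0]
    remaining = float(capacity)
    while active and remaining > 1e-9:
        slice_ = remaining / len(active)   # even split among hungry queues
        for i in list(active):
            grant = min(slice_, demands[i] - shares[i])
            shares[i] += grant
            remaining -= grant
        # Drop queues whose demand is satisfied; their leftover share
        # gets redistributed on the next pass.
        active = [i for i in active if demands[i] - shares[i] > 1e-9]
    return shares

# 12 units of capacity, queue demands 3, 5, 10: queue 0 is satisfied
# with 3, and the rest is split so queues 1 and 2 each get 4.5.
print(fair_shares(12, [3, 5, 10]))  # [3.0, 4.5, 4.5]
```

Contrast this with FIFO (jobs served strictly in arrival order) and the Capacity Scheduler (fixed per-queue capacities with optional elasticity into idle capacity).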

6. Monitoring and Logging (15%)
Understand the functions and features of Hadoop’s metric collection abilities
Analyze the NameNode and JobTracker Web UIs
Understand how to monitor cluster daemons
Identify and monitor CPU usage on master nodes
Describe how to monitor swap and memory allocation on all nodes
Identify how to view and manage Hadoop’s log files
Interpret a log file
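For the log-interpretation objective, Hadoop daemon logs follow log4j's familiar "timestamp level class: message" layout. A small sketch of parsing such a line with a regular expression (the sample line and class name are illustrative, not taken from a real cluster):

```python
import re

# Parse a typical Hadoop daemon log line in log4j's
# "%d{ISO8601} %p %c: %m" layout.
LOG_LINE = re.compile(
    r"^(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}) "
    r"(?P<level>TRACE|DEBUG|INFO|WARN|ERROR|FATAL) "
    r"(?P<source>\S+): (?P<message>.*)$"
)

def parse_log_line(line):
    """Return a dict of timestamp/level/source/message, or None."""
    m = LOG_LINE.match(line)
    return m.groupdict() if m else None

sample = ("2021-03-05 10:15:30,123 WARN "
          "org.apache.hadoop.hdfs.server.datanode.DataNode: "
          "Slow BlockReceiver write packet to mirror")
record = parse_log_line(sample)
print(record["level"], record["source"].rsplit(".", 1)[-1])
# WARN DataNode
```

Being able to pick out the level and the emitting class quickly is most of what "interpret a log file" means in practice: WARN/ERROR lines from DataNode or NameNode classes are the ones worth chasing first.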

Disclaimer: These exam preparation pages are intended to provide information about the objectives covered by each exam, related resources, and recommended reading and courses. The material contained within these pages is not intended to guarantee a passing score on any exam. Cloudera recommends that a candidate thoroughly understand the objectives for each exam and utilize the resources and training courses recommended on these pages to gain a thorough understanding of the domain of knowledge related to the role the exam evaluates.

QUESTION 1
You have installed a cluster running HDFS and MapReduce version 2 (MRv2) on YARN. You have
no dfs.hosts entries in your hdfs-site.xml configuration file. You configure a new worker node by
setting fs.default.name in its configuration files to point to the NameNode on your cluster, and you
start the DataNode daemon on that worker node.
What do you have to do on the cluster to allow the worker node to join, and start storing HDFS blocks?

A. Nothing; the worker node will automatically join the cluster when the DataNode daemon is started.
B. Without creating a dfs.hosts file or making any entries, run the command hadoop dfsadmin -refreshHadoop on the NameNode
C. Create a dfs.hosts file on the NameNode, add the worker node’s name to it, then issue the command hadoop dfsadmin -refreshNodes on the NameNode
D. Restart the NameNode

Answer: A

QUESTION 2
Assuming a cluster running HDFS, MapReduce version 2 (MRv2) on YARN with all settings at
their default, what do you need to do when adding a new slave node to a cluster?

A. Nothing, other than ensuring that DNS (or /etc/hosts files on all machines) contains an entry for the new node.
B. Restart the NameNode and ResourceManager daemons and resubmit any running jobs
C. Increase the value of dfs.number.of.needs in hdfs-site.xml
D. Add a new entry to /etc/nodes on the NameNode host.
E. Restart the NameNode daemon.

Answer: A

QUESTION 3
You have a 20-node Hadoop cluster, with 18 slave nodes and 2 master nodes running HDFS High
Availability (HA). You want to minimize the chance of data loss in your cluster. What should you do?

A. Add another master node to increase the number of nodes running the JournalNode which increases the number of machines available to HA to create a quorum
B. Configure the cluster’s disk drives with an appropriate fault tolerant RAID level
C. Run the ResourceManager on a different master from the NameNode in order to load-share HDFS metadata processing
D. Run a Secondary NameNode on a different master from the NameNode in order to provide automatic recovery from a NameNode failure
E. Set an HDFS replication factor that provides data redundancy, protecting against failure

Answer: E

QUESTION 4
You decide to create a cluster which runs HDFS in High Availability mode with automatic failover, using Quorum-based Storage. What is the purpose of ZooKeeper in such a configuration?

A. It manages the Edits file, which is a log of changes to the HDFS filesystem.
B. It monitors an NFS mount point and reports if the mount point disappears
C. It both keeps track of which NameNode is Active at any given time, and manages the Edits file, which is a log of changes to the HDFS filesystem
D. It only keeps track of which NameNode is Active at any given time
E. Clients connect to ZooKeeper to determine which NameNode is Active

Answer: D
