Tuesday, December 29, 2020

CDCP-001 Certified Data Centre Professional (CDCP) Exam

 

Exam Code: CDCP-001
GAQM provides an international, vendor-neutral credential (CDCP) with a global standard for measuring competency in the core elements of a data center.

The Certified Data Centre Professional (CDCP)™ exam certification indicates foundational knowledge of critical physical infrastructure in the data center. Certified Data Centre Professionals must demonstrate base-level proficiency in the elements of cooling, fire safety and protection, racks, cabling, management, and physical security. Candidates appearing for the CDCP exam must have sufficient knowledge of data center design and cabling strategies. The CDCP certification can significantly increase productivity and proficiency because certified professionals have the knowledge to successfully overcome obstacles faced in data center design, build, and operations.

E-Course Duration: 10 to 15 Hours

e-Competence Framework (e-CF)

This certificate is mapped against the e-Competence Framework. To learn more about the e-Competence Framework (e-CF), visit ECF.

The exam comprises 40 multiple-choice questions, of which the candidate needs to answer 65% (26 out of 40) correctly to pass.
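That pass mark can be sanity-checked with one line of arithmetic; a trivial sketch:

```python
# Minimum correct answers needed on a 40-question exam with a 65% cutoff.
import math

total_questions = 40
pass_fraction = 0.65

required_correct = math.ceil(total_questions * pass_fraction)
print(required_correct)  # 26
```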

Exams are online and proctored. Using a webcam and a reliable internet connection, exams can be taken anywhere and anytime.

The total duration of the exam is 1 hour (60 Minutes).

No external sources of information may be accessed during the exam, which is held via ProctorU. The material permitted is:

Identification Proof
If a candidate does not pass the exam on the second (2nd) attempt, the candidate must wait at least fourteen (14) calendar days from the date of that attempt before retaking the exam for the third (3rd) or any subsequent time.
The exam can be taken any number of times.

The Certified Data Centre Professional (CDCP)™ Certificate is valid for life.
CDCP™ is a trademark of GAQM.

Note: The Certified Data Centre Professional (CDCP)™ certification requires mandatory completion of the E-Course.

Course Outline
Module 1 – Fundamentals of Availability

Introduction
Measuring Business Value
Five 9’s of Availability
Limitations of 99.999%
Factors affecting Availability
A/C Power Conditions
Cooling Issues
Equipment Failures
Natural and Artificial Disasters
Human Errors
Cost of Downtime
Calculating Cost of Downtime
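The "Five 9's" figure in the outline above translates into a concrete downtime budget, and the cost of downtime follows directly from it. A minimal sketch; the per-hour outage cost is a hypothetical figure, not from the course:

```python
# Downtime budget implied by an availability level, and the resulting
# annual cost of downtime for an assumed per-hour outage cost.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def downtime_minutes_per_year(availability: float) -> float:
    """Minutes of allowed downtime per year at the given availability."""
    return MINUTES_PER_YEAR * (1 - availability)

five_nines = downtime_minutes_per_year(0.99999)
print(round(five_nines, 2))  # ~5.26 minutes per year

# Cost of downtime = downtime hours x cost per hour (hypothetical rate).
cost_per_hour = 100_000  # example figure only
annual_cost = (five_nines / 60) * cost_per_hour
print(round(annual_cost))
```

This also shows the limitation of 99.999%: even five nines still permits several minutes of outage per year.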

Module 2 – Examining Fire protection methods in the Data Center

Introduction
National Fire Protection Association
Prevention
System Objectives of Data Center Fire Protection System
Fire Triangle
Classes of Fire
Stages of Combustion
Fire Detection Devices
Smoke Detectors
ISTD
Fire Extinguishers
Methods of Fire Suppression
Water Sprinkler System
Water Mist Suppression System

Module 3 – Fundamentals of Cabling strategies for Data Centers

Introduction
Cabling
Overview of Cables
Cabling Installation
Cable Layout Architectures
Cable Management
Managing Cables
Cable Maintenance Practices

Module 4 – Fundamentals of Cooling I

Introduction
Evolution
Data Center Cooling
Physics of Cooling
Heat Transfer Methods
Airflow in IT Spaces
Heat Generation
Gas Law
Evaporation
Compression
Condensation
Expansion
Evaporator


Module 5 – Fundamentals of Cooling II: Humidity in the Data Center

Introduction
Cooling Related Devices
Humidity and Static Electricity
Nature of Humidity
Humidity Control in Data Center
Relative Humidity Control
Dew Point Control
Humidification System
Converted Office Space
OSPs
Short Cycling

Target Audience
System integrators involved in data centre IT operations, serving their own data centre or those owned by their customers
Commercial customers who have to maintain their own data centre
Personnel in commercial companies who are responsible for data centre IT operations
IT, facilities, or data centre operations professionals

QUESTION 1
Which one of the following is an Objective of Data Center Fire Protection?

A. Information
B. Representation
C. Depression
D. Suppression

Correct Answer: D

QUESTION 2
Which Class of Fires involves energized electrical equipment?

A. Class A
B. Class B
C. Class C
D. Class K

Correct Answer: C

QUESTION 3
Which source is used in fiber cable to transmit data?

A. Signals
B. Electric
C. Light
D. Pulse

Correct Answer: C

QUESTION 4
Which one of the following is an AC Power Quality Anomaly?

A. Signal Distortion
B. Waveform Distortion
C. Backup Condition
D. Attenuation

Correct Answer: B

QUESTION 5
Which Class of Fire involves combustible metals or combustible metal alloys such as magnesium, sodium and potassium?

A. Class A
B. Class B
C. Class C
D. Class D

Correct Answer: D

Actualkey GAQM CDCP-001 Exam pdf, Certkingdom GAQM CDCP-001 PDF

MCTS Training, MCITP Training

Best GAQM CDCP-001 Certification, GAQM CDCP-001 Training at certkingdom.com

Saturday, December 26, 2020

C2140-823 Rational Quality Manager V3 Exam

 

QUESTION 1
What are three acceptable parameters for the IBM Rational Quality Manager out-of-the-box report:
Execution Status using TER count? (Choose three.)

A. test plan
B. test milestone
C. defect logged
D. test case
E. build

Answer: A,B,D
 

QUESTION 2
IBM Rational Quality Manager out-of-box reports are grouped under which three themes? (Choose three.)

A. defects
B. test case
C. cost
D. section manager
E. lab manager

Answer: A,B,E

QUESTION 3
What are the possible states of a test plan in its state transition model?

A. draft, ready for review, reviewed, closed
B. draft, under review, approved, retired
C. created, under review, reviewed , retired
D. created, ready for review, approved, closed

Answer: B

QUESTION 4
RRDI supports which application server?

A. Tomcat
B. WAS (32-bit)
C. WAS (64-bit)
D. WebLogic

Answer: B

Actualkey IBM C2140-823 Exam pdf, Certkingdom IBM C2140-823 PDF


Best IBM C2140-823 Certification, IBM C2140-823 Training at certkingdom.com

Saturday, December 12, 2020

PL-400 Microsoft Power Platform Developer Exam

 

Skills measured
Create a technical design (10-15%)
Configure Common Data Service (15-20%)
Create and configure Power Apps (15-20%)
Configure business process automation (5-10%)
Extend the user experience (10-15%)
Extend the platform (15-20%)
Develop integrations (5-10%)

Audience Profile
Candidates for this exam design, develop, secure, and troubleshoot Power Platform solutions. Candidates implement components of a solution that include application enhancements, custom user experience, system integrations, data conversions, custom process automation, and custom visualizations. Candidates must have strong applied knowledge of Power Platform services, including in-depth understanding of capabilities, boundaries, and constraints. Candidates should have a basic understanding of DevOps practices for Power Platform. Candidates should have development experience that includes Power Platform services, JavaScript, JSON, TypeScript, C#, HTML, .NET, Microsoft Azure, Microsoft 365, RESTful Web Services, ASP.NET, and Power BI.

Skills Measured NOTE:
The bullets that appear below each of the skills measured are intended to illustrate how we are assessing that skill. This list is not definitive or exhaustive.

NOTE: In most cases, exams do NOT cover preview features, and some features will only be added to an exam when they are GA (General Availability).

Create a technical design (10-15%)
Validate requirements and design technical architecture
 design and validate the technical architecture for a solution
 design authentication and authorization strategy
 determine whether you can meet requirements with out-of-the-box functionality
 determine when to use Logic Apps versus Power Automate flows
 determine when to use serverless computing, plug-ins, or Power Automate
 determine when to build a virtual entity data source provider and when to use connectors

Design solution components
 design a data model
 design Power Apps reusable components
 design custom connectors
 design server-side components

Describe Power Platform extensibility points
 describe Power Virtual Agents extensibility points including Bot Framework skills and Power Automate flows
 describe Power BI extensibility points including Power BI APIs, custom visuals, and embedding Power BI apps in websites and other applications
 describe Power Apps portal extensibility points including CRUD APIs and custom styling

Configure Common Data Service (15-20%)

Configure security to support development

 troubleshoot operational security issues
 create or update security roles and field-level security profiles
 configure business units and teams

Implement entities and fields

 configure entity and entity options
 configure fields
 configure relationships and types of behaviors

Implement application lifecycle management (ALM)
 create solutions and manage solution components
 import and export solutions
 manage solution dependencies
 create a package for deployment
 automate deployments
 implement source control for projects including solutions and code assets

Create and configure Power Apps (15-20%)

Create model-driven apps
 configure a model-driven app
 configure forms
 configure views
 configure visualizations

Create canvas apps

 create and configure a canvas app
 implement complex formulas to manage control events and properties
 analyze app usage by using App Insights
 build reusable component libraries

Manage and troubleshoot apps
 troubleshoot app issues by using Monitor and other browser-based debugging tools
 interpret results from App Checker and Solution Checker
 identify and resolve connector and API errors
 optimize app performance including pre-loading data and query delegation

Configure business process automation (5-10%)

Configure Power Automate

 create and configure a flow
 configure steps to use Common Data Service connector actions and triggers
 implement complex expressions in flow steps
 implement error handling
 troubleshoot flows by analyzing JSON responses from connectors

Implement processes
 create and configure business process flows
 create and configure business rules
 create, manage, and interact with business process flows by using server-side and client-side code
 troubleshoot processes

Extend the user experience (10-15%)

Apply business logic using client scripting

 create JavaScript or Typescript code that targets the XRM API
 register an event handler
 create client-side scripts that target the Common Data Service Web API

Create a Power Apps Component Framework (PCF) component

 describe the PCF component lifecycle
 initialize a new PCF component
 configure a PCF component manifest
 implement the component interfaces
 package, deploy, and consume the component
 configure and use PCF Device, Utility, and WebAPI features
 test and debug PCF components by using the local test harness

Create a command button function
 create the command function
 design command button rules and actions
 edit the command bar by using the Ribbon Workbench
 manage dependencies between JavaScript libraries

Extend the platform (15-20%)

Create a plug-in
 describe the plug-in execution pipeline
 design and develop a plug-in
 debug and troubleshoot a plug-in
 implement business logic by using pre and post images
 perform operations on data by using the Organization service API
 optimize plug-in performance
 register custom assemblies by using the Plug-in Registration Tool
 develop a plug-in that targets a custom action message

Create custom connectors

 create a definition for the API
 configure API security
 use policy templates to modify connector behavior at runtime
 expose Azure Functions as custom connectors
 create custom connectors for public APIs by using Postman

Use platform APIs
 interact with data and processes by using the Common Data Service Web API or the Organization Service
 implement API limit retry policies
 optimize for performance, concurrency, transactions, and batching
 query the Discovery service to discover the URL and other information for an organization
 perform entity metadata operations with the Web API
 perform authentication by using OAuth
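The "API limit retry policies" bullet above generally means retrying throttled calls with exponential backoff. A generic sketch of that pattern, not tied to any specific Power Platform SDK; `RateLimitError` and the `request` callable are stand-ins for a real client's throttling error and API call:

```python
import time

class RateLimitError(Exception):
    """Stand-in for a service's 429 / throttling error."""

def call_with_retry(request, max_retries=5, base_delay=1.0):
    """Retry a throttled call with exponential backoff between attempts."""
    for attempt in range(max_retries):
        try:
            return request()
        except RateLimitError:
            if attempt == max_retries - 1:
                raise  # give up after the final attempt
            time.sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
```

A call that is throttled twice, for instance, succeeds on the third attempt after roughly `base_delay + 2 * base_delay` of waiting; real services often return a Retry-After hint that should take precedence over the computed delay.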

Process workloads

 process long-running operations by using Azure Functions
 configure scheduled and event-driven function triggers in Azure Functions
 authenticate to the Power Platform by using managed identities

Develop Integrations (5-10%)

Publish and consume events
 publish an event by using the API
 publish an event by using the Plug-in Registration Tool
 register service endpoints including webhooks, Azure Service Bus, and Azure Event Hub
 implement a Common Data Service listener for an Azure solution
 create an Azure Function that interacts with Power Platform

Implement data synchronization
 configure entity change tracking
 read entity change records by using platform APIs
 create and use alternate keys

QUESTION 1
You need to improve warehouse counting efficiency.
What should you create?

A. a flow that updates the warehouse counts as the worker performs the count
B. a model-driven app that allows the user to key in inventory counts
C. A Power BI dashboard that shows the inventory counting variances
D. a canvas app that scans barcodes to allow a warehouse worker to select inventory counts

Correct Answer: D

QUESTION 2
You need to replace the bicycle inspection forms.
Which two solutions should you use? Each answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. a flow that maps inspection data to Dynamics 365 Field Service
B. a logic app that guides the technician through the inspection
C. a canvas app that guides the technician through the inspection
D. a model-driven app based on customer service entities

Correct Answer: AD

QUESTION 3
You are building a custom application in Azure to process resumes for the HR department.
The app must monitor submissions of resumes.
You need to parse the resumes and save contact and skills information into the Common Data Service.
Which mechanism should you use?

A. Power Automate
B. Common Data Service plug-in
C. Web API
D. Custom workflow activity

Correct Answer: A

QUESTION 4
You need to add the script for the registration form event handling.
Which code segment should you use?

A. formContext.data.entity.addOnSave(myFunction)
B. formContext.data.addOnLoad(myFunction)
C. formContext.data.removeOnLoad(myFunction)
D. addOnPreProcessStatusChange
E. formContext.data.isValid()

Correct Answer: B

Actualkey Microsoft PL-400 Exam pdf, Certkingdom Microsoft PL-400 PDF


Best Microsoft PL-400 Certification, Microsoft PL-400 Training at certkingdom.com

Thursday, December 10, 2020

DES-1221 Specialist - Implementation Engineer, PowerStore Solutions Exam

 

Certification Overview
This certification benefits any professional implementing and administering PowerStore storage arrays in open systems environments. The certification focuses on configuration, administration, migration, upgrades and basic troubleshooting.

Certification Requirements

To complete the requirements for this certification you must:
1. Achieve one of the following Associate level certifications*
• Associate - Information Storage and Management Version 2.0
• Associate - Information Storage and Management Version 3.0
• Associate - Information Storage and Management Version 4.0

2. Pass the following Specialist exam on or after May 06, 2020:
• DES-1221 Specialist – Implementation Engineer, PowerStore Solutions Exam
Note: These details reflect certification requirements as of May 06, 2020.

The Proven Professional Program periodically updates certification requirements.
*Please check the Proven Professional CertTracker website regularly for the latest information and for other options to meet the Associate level requirement.

Overview
This exam is a qualifying exam for the Specialist – Implementation Engineer, PowerStore Solutions (DCS-IE) track.
This exam focuses on implementation and administration of PowerStore storage arrays in open systems environments. The exam covers configuration, administration, migration, upgrades and basic troubleshooting.
Dell Technologies provides free practice tests to assess your knowledge in preparation for the exam. Practice tests allow you to become familiar with the topics and question types you will find on the proctored exam. Your results on a practice test offer one indication of how prepared you are for the proctored exam and can highlight topics on which you need to study and train further. A passing score on the practice test does not guarantee a passing score on the certification exam.

Products
Products likely to be referred to on this exam include but are not limited to:
Dell EMC PowerStore 1000T
Dell EMC PowerStore 1000X
Dell EMC PowerStore 3000T
Dell EMC PowerStore 3000X
Dell EMC PowerStore 5000T
Dell EMC PowerStore 5000X
Dell EMC PowerStore 7000T
Dell EMC PowerStore 7000X
Dell EMC PowerStore 9000T
Dell EMC PowerStore 9000X

Exam Topics
Topics likely to be covered on this exam include:

PowerStore Concepts and Features (5%)
• Describe the PowerStore system and use cases
• Identify the PowerStore system configurations and models
• Provide an overview of PowerStore architecture and hardware components

PowerStore Cabling (6%)
• Demonstrate front-end cabling for PowerStore models
• Demonstrate back-end cabling for PowerStore
• Describe Ethernet cabling for PowerStore

PowerStore Implementation (18%)
• Perform PowerStore installation planning
• Rack and stack PowerStore systems
• Configure Ethernet switching for PowerStore
• Discover and perform initial system configuration of PowerStore
• Describe and perform licensing process for PowerStore

PowerStore Configuration (22%)
• Provision and configure block, file, and VMware storage
• Configure host and client access to PowerStore block, file, and VMware storage
• Describe and configure local and remote data protection on PowerStore storage resources
• Describe and configure intercluster data migration

PowerStore Administration (22%)

• Create and administer management users on PowerStore
• Perform administrative operations for PowerStore block, file, and VMware storage
• Perform PowerStore local and remote data protection operations
• Perform PowerStore intercluster data migration operations

PowerStore Migration (15%)
• Describe the PowerStore Migration feature and process
• Identify the PowerStore Migration requirements and capabilities
• Configure the migration requirements and capabilities
• Configure the migration feature to import data from supported sources
• Perform migration operations

PowerStore Software and Hardware upgrades (7%)
• Describe and perform PowerStore software upgrade
• Add an appliance into a cluster
• Add an expansion shelf to a PowerStore appliance
• Add drives to a PowerStore appliance

PowerStore Basic Troubleshooting (5%)

• View system alerts, events, and jobs
• Gather support materials
• Identify PowerStore system fault LEDs

The percentages after each topic above reflect the approximate distribution of the total question set across the exam.

QUESTION 1
When planning rack layout, why should base enclosures be grouped together?

A. Power distribution requirements
B. Rack weight balance
C. Ease of cable management
D. Hot aisle requirement

Correct Answer: A

QUESTION 2
What does the remote replication Failover operation do?

A. Fully synchronizes the source and destination data states and reverses the replication direction
B. Fully synchronizes the source and destination data states and stops the current replication session
C. Promotes the destination system to production with its replica data consistent to the last successful RPO synchronization state
D. Promotes the destination system to production and resets the RPO synchronization cycle in the protection policy

Correct Answer: D

QUESTION 3
What describes the Import External Storage feature?

A. An external plugin feature for hosts to tune application workload IO on PowerStore NVMe based storage
B. A native network configuration feature of PowerStore that configures external Ethernet switching for PowerStore installation
C. A feature external to PowerStore that orchestrates vMotion movement of virtual machines onto PowerStore X systems
D. A native data migration feature of PowerStore that imports external block storage to PowerStore

Correct Answer: D

QUESTION 4
What is the recommended Windows multi-pathing policy for block volumes on a PowerStore array?

A. Least QueueDepth
B. Round-Robin
C. FailOver
D. Least Weighted Paths

Correct Answer: B

Actualkey Dell EMC DES-1221 Exam pdf, Certkingdom Dell EMC DES-1221 PDF


Best Dell EMC DES-1221 Certification, Dell EMC DES-1221 Training at certkingdom.com

Tuesday, December 8, 2020

C2020-013 IBM SPSS Modeler DataMining for Business Partners v2

 

Certkingdom's preparation material includes the most excellent features, prepared by the same dedicated experts who have come together to offer an integrated solution. We provide the most excellent and simple method to pass your certification exams on the first attempt "GUARANTEED"

Whether you want to improve your skills, expertise, or career growth, Certkingdom's training and certification resources help you achieve your goals. Our exam files feature hands-on tasks and real-world scenarios; in just a matter of days, you'll be more productive and embracing new technology standards. Our online resources and events enable you to focus on learning just what you want on your timeframe. You get access to all exam files, and we continuously update our study materials; these exam updates are supplied free of charge to our valued customers. Get the best C2020-013 exam training; as you study from our exam files: "Best Materials Great Results"

Make the Best Choice - Choose Certkingdom
Make yourself more valuable in today's competitive computer industry. We provide the most excellent and simple method to pass your IBM Business Analytics C2020-013 exam on the first attempt "GUARANTEED".

QUESTION 1
How many phases are in the CRISP-DM Process Methodology?

A. Four
B. Five
C. Six
D. Seven

Answer: C

QUESTION 2
True or false: the CRISP-DM Process Methodology is a linear process.

A. True
B. False

Answer: B
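For reference, the six CRISP-DM phases behind Question 1 can be listed outright; the cycle is iterative rather than strictly linear, which is also why the answer to Question 2 is False:

```python
# The six phases of the CRISP-DM process model, in their usual order.
CRISP_DM_PHASES = [
    "Business Understanding",
    "Data Understanding",
    "Data Preparation",
    "Modeling",
    "Evaluation",
    "Deployment",
]
print(len(CRISP_DM_PHASES))  # 6
```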

QUESTION 3
Which node is used to read data from a comma delimited text file?

A. Var. File
B. Data Collection
C. Fixed File
D. Statistics File

Answer: A

QUESTION 4
Which node can be used to impute (estimate) missing values?

A. Data Audit node
B. Balance node
C. Filler node
D. Reclassify node

Answer: A

Actualkey IBM C2020-013 Exam pdf, Certkingdom IBM C2020-013 PDF


Best IBM C2020-013 Certification, IBM C2020-013 Training at certkingdom.com

Monday, December 7, 2020

CCA-505 Cloudera Certified Administrator for Apache Hadoop (CCAH) CDH5 Upgrade Exam

 

A Cloudera Certified Administrator for Apache Hadoop (CCAH) certification proves that you have demonstrated your technical knowledge, skills, and ability to configure, deploy, maintain, and secure an Apache Hadoop cluster.

Cloudera Certified Administrator for Apache Hadoop (CCA-500)
Number of Questions: 60 questions
Time Limit: 90 minutes
Passing Score: 70%
Language: English, Japanese
Price: NOT AVAILABLE
Exam Sections and Blueprint

1. HDFS (17%)
Describe the function of HDFS daemons
Describe the normal operation of an Apache Hadoop cluster, both in data storage and in data processing
Identify current features of computing systems that motivate a system like Apache Hadoop
Classify major goals of HDFS Design
Given a scenario, identify appropriate use case for HDFS Federation
Identify components and daemon of an HDFS HA-Quorum cluster
Analyze the role of HDFS security (Kerberos)
Determine the best data serialization choice for a given scenario
Describe file read and write paths
Identify the commands to manipulate files in the Hadoop File System Shell

2. YARN (17%)
Understand how to deploy core ecosystem components, including Spark, Impala, and Hive
Understand how to deploy MapReduce v2 (MRv2 / YARN), including all YARN daemons
Understand basic design strategy for YARN and Hadoop
Determine how YARN handles resource allocations
Identify the workflow of job running on YARN
Determine which files you must change and how in order to migrate a cluster from MapReduce version 1 (MRv1) to MapReduce version 2 (MRv2) running on YARN

3. Hadoop Cluster Planning (16%)
Principal points to consider in choosing the hardware and operating systems to host an Apache Hadoop cluster
Analyze the choices in selecting an OS
Understand kernel tuning and disk swapping
Given a scenario and workload pattern, identify a hardware configuration appropriate to the scenario
Given a scenario, determine the ecosystem components your cluster needs to run in order to fulfill the SLA
Cluster sizing: given a scenario and frequency of execution, identify the specifics for the workload, including CPU, memory, storage, disk I/O
Disk Sizing and Configuration, including JBOD versus RAID, SANs, virtualization, and disk sizing requirements in a cluster
Network Topologies: understand network usage in Hadoop (for both HDFS and MapReduce) and propose or identify key network design components for a given scenario

4. Hadoop Cluster Installation and Administration (25%)
Given a scenario, identify how the cluster will handle disk and machine failures
Analyze a logging configuration and logging configuration file format
Understand the basics of Hadoop metrics and cluster health monitoring
Identify the function and purpose of available tools for cluster monitoring
Be able to install all the ecosystem components in CDH 5, including (but not limited to): Impala, Flume, Oozie, Hue, Cloudera Manager, Sqoop, Hive, and Pig
Identify the function and purpose of available tools for managing the Apache Hadoop file system

5. Resource Management (10%)
Understand the overall design goals of each of Hadoop schedulers
Given a scenario, determine how the FIFO Scheduler allocates cluster resources
Given a scenario, determine how the Fair Scheduler allocates cluster resources under YARN
Given a scenario, determine how the Capacity Scheduler allocates cluster resources
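As a rough intuition for how these schedulers differ, the max-min fairness idea behind the Fair Scheduler can be sketched in a few lines. This is a simplified model for study purposes, not Hadoop's actual implementation:

```python
def max_min_fair(capacity: float, demands: list[float]) -> list[float]:
    """Max-min fair allocation: fully satisfy the smallest demands first,
    then split leftover capacity evenly among the remaining claimants."""
    allocations = [0.0] * len(demands)
    remaining = sorted(range(len(demands)), key=lambda i: demands[i])
    cap = capacity
    while remaining:
        share = cap / len(remaining)
        i = remaining[0]
        if demands[i] <= share:
            # Small demand fully satisfied; its leftover is redistributed.
            allocations[i] = demands[i]
            cap -= demands[i]
            remaining.pop(0)
        else:
            # Everyone left gets an equal share of what is left.
            for j in remaining:
                allocations[j] = share
            break
    return allocations
```

With a capacity of 10 and demands of [2, 8, 8], the small demand is fully satisfied and the remaining capacity splits evenly: [2, 4, 4]. By contrast, a FIFO scheduler would serve the first job's full demand before considering the next, and the Capacity Scheduler partitions resources into configured queue capacities first.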

6. Monitoring and Logging (15%)

Understand the functions and features of Hadoop’s metric collection abilities
Analyze the NameNode and JobTracker Web UIs
Understand how to monitor cluster daemons
Identify and monitor CPU usage on master nodes
Describe how to monitor swap and memory allocation on all nodes
Identify how to view and manage Hadoop’s log files
Interpret a log file

Disclaimer: These exam preparation pages are intended to provide information about the objectives covered by each exam, related resources, and recommended reading and courses. The material contained within these pages is not intended to guarantee a passing score on any exam. Cloudera recommends that a candidate thoroughly understand the objectives for each exam and utilize the resources and training courses recommended on these pages to gain a thorough understanding of the domain of knowledge related to the role the exam evaluates.

QUESTION 1
You have installed a cluster running HDFS and MapReduce version 2 (MRv2) on YARN. You have no dfs.hosts entries in your hdfs-site.xml configuration file. You configure a new worker node by setting fs.default.name in its configuration files to point to the NameNode on your cluster, and you start the DataNode daemon on that worker node.
What do you have to do on the cluster to allow the worker node to join, and start storing HDFS blocks?

A. Nothing; the worker node will automatically join the cluster when the DataNode daemon is started.
B. Without creating a dfs.hosts file or making any entries, run the command hadoop dfsadmin –refreshHadoop on the NameNode
C. Create a dfs.hosts file on the NameNode, add the worker node’s name to it, then issue the command hadoop dfsadmin –refreshNodes on the NameNode
D. Restart the NameNode

Answer: D

QUESTION 2
Assuming a cluster running HDFS, MapReduce version 2 (MRv2) on YARN with all settings at
their default, what do you need to do when adding a new slave node to a cluster?

A. Nothing, other than ensuring that DNS (or /etc/hosts files on all machines) contains an entry for the new node.
B. Restart the NameNode and ResourceManager daemons and resubmit any running jobs
C. Increase the value of dfs.number.of.needs in hdfs-site.xml
D. Add a new entry to /etc/nodes on the NameNode host.
E. Restart the NameNode daemon.

Answer: B

QUESTION 3
You have a 20 node Hadoop cluster, with 18 slave nodes and 2 master nodes running HDFS High Availability (HA). You want to minimize the chance of data loss in your cluster. What should you do?

A. Add another master node to increase the number of nodes running the JournalNode which increases the number of machines available to HA to create a quorum
B. Configure the cluster’s disk drives with an appropriate fault tolerant RAID level
C. Run the ResourceManager on a different master from the NameNode in order to load-share HDFS metadata processing
D. Run a Secondary NameNode on a different master from the NameNode in order to provide automatic recovery from a NameNode failure
E. Set an HDFS replication factor that provides data redundancy, protecting against failure

Answer: C

QUESTION 4
You decide to create a cluster which runs HDFS in High Availability mode with automatic failover, using Quorum-based Storage. What is the purpose of ZooKeeper in such a configuration?

A. It manages the Edits file, which is a log of changes to the HDFS filesystem.
B. It monitors an NFS mount point and reports if the mount point disappears
C. It both keeps track of which NameNode is Active at any given time, and manages the Edits file, which is a log of changes to the HDFS filesystem
D. It only keeps track of which NameNode is Active at any given time
E. Clients connect to ZooKeeper to determine which NameNode is Active

Answer: D

Actualkey Cloudera CCAH CCA-505 Exam pdf, Certkingdom Cloudera CCAH CCA-505 PDF


Best Cloudera CCAH CCA-505 Certification, Cloudera CCAH CCA-505 Training at certkingdom.com


 

CCA-505 Cloudera Certified Administrator for Apache Hadoop (CCAH) CDH5 Upgrade Exam

 

A Cloudera Certified Administrator for Apache Hadoop (CCAH) certification proves that you have demonstrated your technical knowledge, skills, and ability to configure, deploy, maintain, and secure an Apache Hadoop cluster.

Cloudera Certified Administrator for Apache Hadoop (CCA-500)
Number of Questions: 60 questions
Time Limit: 90 minutes
Passing Score: 70%
Language: English, Japanese
Price: NOT AVAILABLE
Exam Sections and Blueprint

1. HDFS (17%)
Describe the function of HDFS daemons
Describe the normal operation of an Apache Hadoop cluster, both in data storage and in data processing
Identify current features of computing systems that motivate a system like Apache Hadoop
Classify major goals of HDFS Design
Given a scenario, identify appropriate use case for HDFS Federation
Identify components and daemon of an HDFS HA-Quorum cluster
Analyze the role of HDFS security (Kerberos)
Determine the best data serialization choice for a given scenario
Describe file read and write paths
Identify the commands to manipulate files in the Hadoop File System Shell

2. YARN (17%)
Understand how to deploy core ecosystem components, including Spark, Impala, and Hive
Understand how to deploy MapReduce v2 (MRv2 / YARN), including all YARN daemons
Understand basic design strategy for YARN and Hadoop
Determine how YARN handles resource allocations
Identify the workflow of a job running on YARN
Determine which files you must change and how in order to migrate a cluster from MapReduce version 1 (MRv1) to MapReduce version 2 (MRv2) running on YARN
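
For the MRv1-to-MRv2 migration objective, the minimal property changes typically land in mapred-site.xml and yarn-site.xml. This is a sketch of the two core properties, not a complete migration checklist:

```xml
<!-- mapred-site.xml: run MapReduce jobs on YARN instead of the MRv1 JobTracker -->
<property>
  <name>mapreduce.framework.name</name>
  <value>yarn</value>
</property>

<!-- yarn-site.xml: enable the shuffle service that replaces the MRv1 TaskTracker shuffle -->
<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle</value>
</property>
```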

3. Hadoop Cluster Planning (16%)
Principal points to consider in choosing the hardware and operating systems to host an Apache Hadoop cluster
Analyze the choices in selecting an OS
Understand kernel tuning and disk swapping
Given a scenario and workload pattern, identify a hardware configuration appropriate to the scenario
Given a scenario, determine the ecosystem components your cluster needs to run in order to fulfill the SLA
Cluster sizing: given a scenario and frequency of execution, identify the specifics for the workload, including CPU, memory, storage, disk I/O
Disk Sizing and Configuration, including JBOD versus RAID, SANs, virtualization, and disk sizing requirements in a cluster
Network Topologies: understand network usage in Hadoop (for both HDFS and MapReduce) and propose or identify key network design components for a given scenario
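
The cluster-sizing arithmetic above can be sketched for one workload. All figures here (1 TB/day ingest, 90-day retention, 3x replication, 25% temporary-space headroom, 48 TB usable per node) are illustrative assumptions, not exam values:

```shell
# Back-of-the-envelope DataNode count for an HDFS storage budget.
DAILY_TB=1        # raw ingest per day (assumed)
DAYS=90           # retention period (assumed)
REPL=3            # HDFS replication factor (default)
NODE_TB=48        # usable disk per DataNode (assumed)

RAW=$((DAILY_TB * DAYS * REPL))                                    # 270 TB of replicated data
TOTAL=$(awk "BEGIN{print $RAW * 1.25}")                            # +25% for MapReduce temp space
NODES=$(awk "BEGIN{print int(($TOTAL + $NODE_TB - 1)/$NODE_TB)}")  # round up to whole nodes
echo "$NODES DataNodes needed"
```

Doubling the daily ingest or retention roughly doubles the node count, which is why the blueprint ties sizing to workload frequency.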

4. Hadoop Cluster Installation and Administration (25%)
Given a scenario, identify how the cluster will handle disk and machine failures
Analyze a logging configuration and logging configuration file format
Understand the basics of Hadoop metrics and cluster health monitoring
Identify the function and purpose of available tools for cluster monitoring
Be able to install all the ecosystem components in CDH 5, including (but not limited to): Impala, Flume, Oozie, Hue, Cloudera Manager, Sqoop, Hive, and Pig
Identify the function and purpose of available tools for managing the Apache Hadoop file system

5. Resource Management (10%)
Understand the overall design goals of each of the Hadoop schedulers
Given a scenario, determine how the FIFO Scheduler allocates cluster resources
Given a scenario, determine how the Fair Scheduler allocates cluster resources under YARN
Given a scenario, determine how the Capacity Scheduler allocates cluster resources

6. Monitoring and Logging (15%)

Understand the functions and features of Hadoop’s metric collection abilities
Analyze the NameNode and JobTracker Web UIs
Understand how to monitor cluster daemons
Identify and monitor CPU usage on master nodes
Describe how to monitor swap and memory allocation on all nodes
Identify how to view and manage Hadoop’s log files
Interpret a log file
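
For the log-management objectives above, these are the usual entry points. A sketch only: the log directory varies by distribution, the application ID is illustrative, and the yarn command assumes log aggregation is enabled.

```shell
# Default daemon log location on a CDH-style install
ls /var/log/hadoop-hdfs/

# Tail a NameNode log to watch for errors as they happen
tail -f /var/log/hadoop-hdfs/hadoop-hdfs-namenode-*.log

# Retrieve aggregated container logs for a finished YARN application
yarn logs -applicationId application_1400000000000_0001
```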

Disclaimer: These exam preparation pages are intended to provide information about the objectives covered by each exam, related resources, and recommended reading and courses. The material contained within these pages is not intended to guarantee a passing score on any exam. Cloudera recommends that a candidate thoroughly understand the objectives for each exam and utilize the resources and training courses recommended on these pages to gain a thorough understanding of the domain of knowledge related to the role the exam evaluates.

QUESTION 1
You have installed a cluster running HDFS and MapReduce version 2 (MRv2) on YARN. You have
no dfs.hosts entry(ies) in your hdfs-site.xml configuration file. You configure a new worker node by
setting fs.default.name in its configuration files to point to the NameNode on your cluster, and you
start the DataNode daemon on that worker node.
What do you have to do on the cluster to allow the worker node to join, and start storing HDFS blocks?

A. Nothing; the worker node will automatically join the cluster when the DataNode daemon is started.
B. Without creating a dfs.hosts file or making any entries, run the command hadoop dfsadmin -refreshHadoop on the NameNode
C. Create a dfs.hosts file on the NameNode, add the worker node’s name to it, then issue the command hadoop dfsadmin -refreshNodes on the NameNode
D. Restart the NameNode

Answer: A
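
For contrast with this scenario, when dfs.hosts is configured the admission workflow does require explicit steps. A sketch; the hostname and file path are illustrative, and the commands assume a running cluster:

```shell
# On the NameNode: add the new worker to the include file named by dfs.hosts
echo "worker05.example.com" >> /etc/hadoop/conf/dfs.hosts

# Tell the NameNode to re-read the include/exclude files without a restart
hdfs dfsadmin -refreshNodes
```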

QUESTION 2
Assuming a cluster running HDFS, MapReduce version 2 (MRv2) on YARN with all settings at
their default, what do you need to do when adding a new slave node to a cluster?

A. Nothing, other than ensuring that DNS (or /etc/hosts files on all machines) contains an entry for the new node.
B. Restart the NameNode and ResourceManager daemons and resubmit any running jobs
C. Increase the value of dfs.number.of.needs in hdfs-site.xml
D. Add a new entry to /etc/nodes on the NameNode host.
E. Restart the NameNode daemon.

Answer: A

QUESTION 3
You have a 20 node Hadoop cluster, with 18 slave nodes and 2 master nodes running HDFS High
Availability (HA). You want to minimize the chance of data loss in your cluster. What should you do?

A. Add another master node to increase the number of nodes running the JournalNode which increases the number of machines available to HA to create a quorum
B. Configure the cluster’s disk drives with an appropriate fault tolerant RAID level
C. Run the ResourceManager on a different master from the NameNode in order to load-share HDFS metadata processing
D. Run a Secondary NameNode on a different master from the NameNode in order to provide automatic recovery from a NameNode failure
E. Set an HDFS replication factor that provides data redundancy, protecting against failure

Answer: E
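
The replication-factor option can be exercised from the shell. A sketch; the path and factor are illustrative, and a running cluster is assumed:

```shell
# Set the replication factor to 3 for everything under /data, recursively
hadoop fs -setrep -R 3 /data

# Verify: the second column of the listing shows each file's replication factor
hadoop fs -ls /data
```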

QUESTION 4
You decide to create a cluster which runs HDFS in High Availability mode with automatic failover, using Quorum-based Storage. What is the purpose of ZooKeeper in such a configuration?

A. It manages the Edits file, which is a log of changes to the HDFS filesystem.
B. It monitors an NFS mount point and reports if the mount point disappears
C. It both keeps track of which NameNode is Active at any given time, and manages the Edits file, which is a log of changes to the HDFS filesystem
D. It only keeps track of which NameNode is Active at any given time
E. Clients connect to ZooKeeper to determine which NameNode is Active

Answer: D
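
The Active/Standby tracking that ZooKeeper enables can be observed with the HA admin tool. A sketch; the service IDs nn1 and nn2 come from the cluster's dfs.ha.namenodes.* setting and are illustrative:

```shell
# Ask each NameNode which HA state it currently holds
hdfs haadmin -getServiceState nn1
hdfs haadmin -getServiceState nn2

# With automatic failover, the ZKFC processes use ZooKeeper to elect the
# active NameNode; a manual failover is still possible:
hdfs haadmin -failover nn1 nn2
```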

Actualkey Cloudera CCAH CCA-505 Exam pdf, Certkingdom Cloudera CCAH CCA-505 PDF

MCTS Training, MCITP Training

Best Cloudera CCAH CCA-505 Certification, Cloudera CCAH CCA-505 Training at certkingdom.com


 

Thursday, October 22, 2020

HPE6-A66 Aruba Certified Design Associate Exam

 

Exam ID HPE6-A66
Exam type Proctored
Exam duration 1 hour 30 minutes
Exam length 60 questions
Passing score 67%
Delivery languages Latin American Spanish, Japanese, English
Supporting resources These recommended resources help you prepare for the exam:

Aruba Design Fundamentals, Rev. 19.41

Additional study materials

Aruba Certified Design Associate Study Guide

Register for this Exam
You need an HPE Learner ID and a Pearson VUE login and password.

No reference material is allowed at the testing site. This exam may contain beta test items for experimental purposes.

During the exam, you can make comments about the exam items. We welcome these comments as part of our continuous improvement process.

Exam description
This exam validates that you have a fundamental knowledge of Aruba network design and know the Aruba product lines well enough to help design the network with the assistance of a senior designer. Candidates should know how to read a customer request and extract the information needed for a wired or wireless network, and should know how to use VRF and IRIS.

Ideal candidate for this exam
Candidates are IT Associates with minimal Aruba Networking knowledge. It is suggested all candidates take the course for this exam.

Exam contents
This exam has 60 questions. Here are types of questions to expect:

Multiple choice (multiple responses), scenario based
Multiple choice (single response), scenario based
Multiple choice (multiple responses)
Multiple choice (single response)

Advice to help you take this exam
Complete the training and review all course materials and documents before you take the exam.
Use HPE Press study guides and additional reference materials: study guides, practice tests, and HPE books.
Exam items are based on expected knowledge acquired from job experience, an expected level of industry standard knowledge, or other prerequisites (events, supplemental materials, etc.).
Successful completion of the course or study materials alone does not ensure you will pass the exam.

Objectives
This exam validates that you can:

Percentage of Exam Sections/Objectives

15% Gather and analyze data, and document customer requirements
Given an outline of a customer's needs for a simple campus environment, determine the information required to create a solution

25% Evaluate the requirements, and select the appropriate Aruba solution for the design
Given a scenario, evaluate the customer requirements for a simple campus environment, identify gaps through a gap analysis, and select components based on the analysis results.
Given a scenario, translate the business needs of a simple campus environment into technical customer requirements.

23% Plan and design an Aruba solution per customer requirements

Given a scenario, select the appropriate products based on the customer's technical requirements for a simple campus environment
Given the customer requirements for a single-site campus environment, design the high-level Aruba solution
Given a customer scenario, explain how a specific technology or solution would meet the customer's requirements

25% Produce a detailed design specification document.
Given a customer scenario for a simple campus environment, choose the appropriate components that should be included in the BOM.
Given the customer requirements for a simple site environment, determine the component details and document the high-level design.
Given a customer scenario of a simple site environment, design and document the logical and physical network solutions.
Given the customer scenario and service level agreements, document the licensing and maintenance requirements.

12% Recommend the solution to the customer.
Given the customer's requirements, explain and justify the recommended solution.

QUESTION 1
A customer has phones used as wireless Voice over IP (VoIP) devices.
Which is one implication for the design?

A. Plan policies for the phone role on MCs to give the phones a high QoS priority.
B. Ensure a -75 dBm signal in both the 2.4GHz band and the 5GHz band across the entire site.
C. Ensure that APs connect on Smart Rate ports to support the high bandwidth demands of the phones.
D. Apply a bandwidth contract to the phone VLAN to limit broadcast and multicast traffic.

Correct Answer: A

QUESTION 2
What is one reason to deploy an Aruba 8320 switch when compared to an Aruba 5400R switch?

A. to support cloud-based management and guest services through Aruba Central integration
B. to obtain a great number of options for types of ports, including PoE and non-PoE
C. to enhance network monitoring and analytics
D. to support Zero Touch Provisioning (ZTP)

Correct Answer: C

QUESTION 3
A hospital needs a wireless solution which will provide guest access for patients and visitors, as well as for
medical staff. In addition to laptops and tablets, staff have wireless voice communicator devices. Some medical equipment also connects wirelessly.
How can the network architect help to ensure that patient and visitor internet use does not interfere with more vital hospital applications?

A. Deploy IntroSpect to monitor patient and visitor traffic.
B. Plan a bandwidth contract for the guest role in the MC firewall.
C. Deploy dedicated Air Monitors (AMs) at about one-fourth the density of APs.
D. Ensure that the guest SSID has a password associated with it.

Correct Answer: B

QUESTION 4
An architect has recommended the deployment of RAPs at user home offices to provide access to the corporate LAN.
How should the architect plan the SSID for the RAPs?

A. Same SSID and security settings as the corporate SSID
B. any name for the SSID with MAC-Authentication
C. any name for the SSID, which would be open; VIA is used for security
D. same name used for the corporate SSID, but always with WPA2-Personal security

Correct Answer: A

Actualkey HPE HPE6-A66 exam pdf, Certkingdom HPE HPE6-A66 PDF

MCTS Training, MCITP Training

Best HPE HPE6-A66 Certification, HPE HPE6-A66 Training at certkingdom.com

NS0-161 NetApp Certified Data Administrator-ONTAP Exam

 

NetApp Certified Data Administrator, ONTAP
You have proven skills in performing in-depth support, administrative functions, and performance management for NetApp® data storage controllers running the ONTAP® operating system in NFS and Windows® (CIFS) multiprotocol environments. You understand how to implement high-availability controller configurations, and have detailed knowledge of technologies used to manage and protect mission-critical data.

NCDA logos and certificates will be granted to those individuals who successfully pass the NetApp Certified Data Administrator, ONTAP (NS0-161) exam.
Register now for your exam

Prepare for your exam

NS0-161 NetApp Certified Data Administrator, ONTAP
Candidates for NCDA (NetApp Certified Data Administrator) certification should have at least six to 12 months of field experience implementing and administering NetApp® data storage solutions in multiprotocol environments. In addition, candidates taking the NetApp Certified Data Administrator, ONTAP exam should know how to implement HA controller configurations, SyncMirror® software for rapid data recovery, or ONTAP® solutions with either single- or multi-node configurations.

Take your exam
The NetApp Certified Data Administrator, ONTAP (NS0-161) exam includes 60 test questions, with an allotted time of 1-1/2 hours to complete. Candidates for whom English is not their first language will be granted a 30-minute extension to the allotted examination completion time.

Your results will be available in CertCenter two (2) to five (5) business days after you complete your exam.

The NCDA ONTAP (NS0-161) exam includes the following topics:

Storage Platforms
Describe knowledge of physical storage systems.
Describe software-defined on-premises or cloud storage systems.
Describe how to upgrade or scale ONTAP clusters.

Core ONTAP
Describe ONTAP system management.
Describe high availability concepts.
Describe how to manage Storage Virtual Machines (SVM).

Logical Storage
Describe how to use logical storage features.
Describe NetApp storage efficiency features.
Describe NetApp ONTAP Data Fabric solutions.

Networking
Describe how to use network components.
Demonstrate knowledge of how to troubleshoot network components.

SAN Solutions and Connectivity
Describe how to use SAN solutions.
Demonstrate knowledge of how to troubleshoot SAN solutions.

NAS Solutions
Describe how to use NAS solutions.
Demonstrate knowledge of how to troubleshoot NAS solutions.

Data Protection
Describe how to use ONTAP data protection solutions.
Describe how to use SnapMirror.
Identify MetroCluster concepts.

Security
Describe protocol security.
Describe security hardening.
Describe in-flight or at-rest encryption.
Identify SnapLock concepts.

Performance
Demonstrate knowledge of how to administer ONTAP performance.
Demonstrate knowledge of how to troubleshoot storage system performance.

QUESTION 1
What is the minimum number of disks required to create a RAID-DP data aggregate?

A. 4
B. 6
C. 3
D. 5

Correct Answer: D

QUESTION 2
In an aggregate with only thick-provisioned volumes, you need to split a FlexClone volume from its parent volume. There are other volumes in the aggregate.
In this scenario, how much space must be available within the containing aggregate?

A. You need enough space for twice the size of the parent volume.
B. The split FlexClone volume must be created in a new aggregate.
C. You need enough space for half the size of the parent volume.
D. You need to double the space in the existing aggregate.

Correct Answer: A

QUESTION 3
You have a 4-node cluster with 2x AFF A300 and 2x FAS8200 controllers. One of the AFF A300 controllers
has a 50 TB FlexVol volume that needs an immediate read-write copy for standalone testing on one of the FAS8200 controllers.
Which two commands combined accomplish this task? (Choose two.)

A. volume rehost
B. volume clone create
C. volume move
D. volume clone split

Correct Answer: B,D
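
The clone-and-split sequence from this question looks roughly like the following in the ONTAP CLI. A sketch only: the SVM, volume, and clone names are illustrative.

```shell
# Create a space-efficient, writable clone of the source volume
volume clone create -vserver svm1 -flexclone vol1_test -parent-volume vol1

# Split the clone from its parent so it becomes an independent volume
volume clone split start -vserver svm1 -flexclone vol1_test

# The independent copy could then be relocated (e.g. with volume move)
```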

Actualkey NetApp NS0-161 exam pdf, Certkingdom NetApp NS0-161 PDF

MCTS Training, MCITP Training

Best NetApp NS0-161 Certification, NetApp NS0-161 Training at certkingdom.com

Friday, October 2, 2020

CIPM Certified Information Privacy Manager Exam

 

Operationalizing Privacy – Turning Policies into Programs
Make data privacy regulations work for your organization by understanding how to implement them in day-to-day operations. Learn to create a company vision, structure a data protection team, develop and implement system frameworks, communicate to stakeholders, measure performance and more.

How to create a company vision
How to structure the privacy team
How to develop and implement a privacy program framework
How to communicate to stakeholders
How to measure performance
The privacy program operational life cycle

The CIPM body of knowledge outlines all the concepts and topics that you need to know to become certified. The exam blueprint gives you an idea of how many questions from each topic area you can expect on the exam. These documents, as well as additional certification resources and helpful links, can be found here.

We strongly encourage all potential test takers to read our 2020 Certification Candidate Handbook before testing for details on our testing policies and procedures.

How to Prepare
Earning respected credentials requires a rigorous certification process, which includes passing demanding exams. IAPP exams have a reputation for being difficult to pass on the first try. We strongly recommend careful preparation, even for degreed professionals who have passed other certification tests.

Preparation makes all the difference. In general, we recommend that you train and study for a minimum of 30 hours.

We want you to succeed. Please take advantage of IAPP resources to get through exams with as little anxiety as possible.

Tips for effective studying
Consider enrolling in an IAPP training class. With multiple in-person, live online and online options, IAPP trainings are created from the same topic outline from which exam questions are drawn. PLEASE NOTE that completing a training course does not guarantee passing an exam. Additional preparation is essential, so:

Self-assess—Each IAPP exam comes with two tools for determining how ready you are:

The body of knowledge is an outline of the information covered in the exam. Use it to identify topics you are and are not familiar with.
The exam blueprint tells you how many questions to expect on each topic. Use it to map out a study strategy—allowing more time for topics with many questions, for example.

Use your textbook properly—Textbooks are included with purchase of in-person, live online and online training, and are also sold separately through the IAPP store. Start by reading the table of contents. Note which topics are new to you. That will give you a feel for how much study and review time you need. When you start reading:

Highlight important points in each chapter
Copy out key passages; it will help you remember them
Review each chapter to make sure you’ve captured the key points before moving on

Create flash cards—As you read your textbook, articles, web pages, etc., copy new terms onto notecards. Write their definitions on the other side. Quiz yourself. Use the IAPP’s glossary of privacy terms to look up unfamiliar terms and make flash cards of them, also.

Form a study group—Discussing the material with your coworkers and colleagues is a great way to remember material and understand it more deeply.

Learn in context—It’s easier and more interesting to learn a subject you’re going to use in real life. IAPP publications show how privacy affects our lives and businesses:
Privacy Perspectives
The Privacy Advisor
Privacy Tracker
Privacy Tech
DPO Confessional

Get familiar with privacy news and issues by subscribing to the Daily Dashboard and the IAPP’s curated regional news digests:
Daily Dashboard
Canada Dashboard Digest
Asia-Pacific Dashboard Digest
Latin America Dashboard Digest
Europe Data Protection Digest
U.S. Privacy Digest

Explore our information-packed Resource Center, participate in educational web conferences and listen to The Privacy Advisor Podcast.

Also, compare what’s going on in privacy today with your job. What privacy issues could affect your work and career?
Use questions to find answers—Utilize sample questions to help you review what you’ve studied and identify weak areas. Re-read notes and chapters on those subjects. Ask your study partners questions. Search for articles that approach the subject from different directions.

QUESTION 1
In addition to regulatory requirements and business practices, what important factors must a global privacy strategy consider?

A. Monetary exchange
B. Geographic features
C. Political history
D. Cultural norms

Correct Answer: D

QUESTION 2
How are individual program needs and specific organizational goals identified in privacy framework development?

A. By employing metrics to align privacy protection with objectives
B. Through conversations with the privacy team
C. By employing an industry-standard needs analysis
D. Through creation of the business case

Correct Answer: A

QUESTION 3
In privacy protection, what is a “covered entity”?

A. Personal data collected by a privacy organization
B. An organization subject to the privacy provisions of HIPAA
C. A privacy office or team fully responsible for protecting personal information
D. Hidden gaps in privacy protection that may go unnoticed without expert analysis

Correct Answer: B

QUESTION 4
Which of the following is an example of Privacy by Design (PbD)?

A. A company hires a professional to structure a privacy program that anticipates the increasing demands of new laws.
B. The human resources group develops a training program from employees to become certified in privacy policy.
C. A labor union insists that the details of employers’ data protection methods be documented in a new contract.
D. The information technology group uses privacy considerations to inform the development of new networking software.

Correct Answer: D

QUESTION 5
What is the key factor that lays the foundation for all other elements of a privacy program?

A. The applicable privacy regulations
B. The structure of a privacy team
C. A privacy mission statement
D. A responsible internal stakeholder

Correct Answer: A

Actualkey IAPP CIPM exam pdf, Certkingdom IAPP CIPM PDF

MCTS Training, MCITP Training

Best IAPP CIPM Certification, IAPP CIPM Training at certkingdom.com

Thursday, July 2, 2020

C1000-087 IBM Cloud Pak for Applications Solution Architect V4.1 Exam

Number of questions: 60
Number of questions to pass: 43
Time allowed: 90 mins
Status: Live

An IBM Certified Solution Architect – IBM Cloud Pak for Applications is a person who can design, plan and create an architecture with IBM Cloud Pak for Applications. They can do this with limited assistance from support, documentation or relevant subject matter experts.

This exam consists of seven sections described below. For more detail, please see the study guide on the Exam Preparation tab.

Section 1: Design and architect a Cloud Native solution 13%
Understand the main elements of a cloud native solution
Design a Microservices architecture
Explain Containers and Container Orchestration
Understand the Cloud Native reference architecture

Section 2: OpenShift Container Platform Architecture 17%
Understand the OpenShift Container Platform Architecture
Understand HA, DR, backup, and storage

Section 3: Cloud Pak for Applications Overview 20%
Understand the Cloud Pak for Applications value proposition
Explain IBM Cloud Pak for Applications components
Understand entitlements and subscriptions
Articulate Cloud Pak integration scenarios

Section 4: Architecting for new applications 12%
Design serverless applications
Build Mobile application
Understand the capabilities of runtimes

Section 5: Architecting new applications with Accelerators for Teams 17%
Describe the business value and outcomes from cloud-native governance and Accelerators for Teams
Understand the reference architecture for Accelerator for Teams via Operators that manage the lifecycles and governance of the components
Demonstrate knowledge of how to approach building and customizing application stacks
Demonstrate knowledge of the available developer tools
Understand how to customize the integrated DevOps Toolchain

Section 6: Modernize applications 12%
Understand the Application Modernization Journey
Understand the use case of running existing applications
Understand the application modernization tools

Section 7: Architect Continuous Integration/Continuous Deployment (CI/CD) and the DevOps lifecycle 9%
Extending deployment automation, governance, pipeline & release orchestration with IBM Cloud DevOps
Understand the value of integrating IBM Cloud DevOps with Openshift Pipelines and other Cloud providers


Actualkey IBM C1000-087 exam pdf, Certkingdom IBM C1000-087 PDF
MCTS Training, MCITP Training
Best IBM C1000-087 Certification, IBM C1000-087 Training at certkingdom.com

Monday, June 29, 2020

C1000-080 IBM Business Automation Workflow v19 Application Development using Integration Designer Exam

Number of questions: 67
Number of questions to pass: 48
Time allowed: 90 mins
Status: Live

This intermediate level certification is intended for Application Developers responsible for the development of integration services for business process applications. This certification focuses on application development with IBM Integration Designer V19.0 for deployment on IBM Business Automation Workflow V19.0.

This exam does not include IBM Process Designer or Process and Case Modeling.

This exam consists of eight sections described below. For more detail, please see the study guide on the Exam Preparation tab.

Installation and Configuration 4%
Install and update the IBM Integration Designer (IID)
Install and configure a Unit Test Environment (UTE)

Service Component Architecture (SCA) programming model and solution design 22%
Design and use Service Component Architecture (SCA)
Design and use business objects
Demonstrate an understanding of Service Component Architecture (SCA)
Effectively organize a solution into modules, mediation modules, and libraries taking into consideration component reuse, and application maintainability
Determine the appropriate use of macroflows (long-running processes), microflows (short-running processes), and mediations
Effectively use quality of service (QoS) qualifiers
Demonstrate understanding of and apply performance considerations for business integration solutions, including long-running processes
Configure dynamic invocation using Gateway patterns
Monitor Business Processing using Dynamic Event Framework (DEF) and audit logging

BPEL Development 15%
Design and implement Business Process Execution Language (BPEL) processes using the business process editor
Use correlation sets in the BPEL process
Demonstrate understanding of transaction behavior
Implement custom logic using the visual snippet editor and Java code
Implement error handling and compensation within a business process
Demonstrate an understanding of working with Human tasks
Create new versions for the BPEL process

Mediation Development 15%
Describe the Service Message Object (SMO)
Implement fault handling in mediation modules
Develop mediation flows
Use mediation primitives and subflows in mediation flows
Transform data using maps (XSLT and Business Object)
Use dynamic service routing through a Dynamic Endpoint Selection
Design a parallel flow (fan in/fan out)

Workflow Center Repository 11%
Working with the Workflow Center Perspective
Import Process Application and Toolkits
Manage artifacts in the repository (associating, disassociating and merging)
Implement advanced integration service (emulating)
Understand design considerations when working with Workflow Center Repository integration modules

Connectivity and Integration 13%
Use and configure technology adapters, including the Java Database Connectivity (JDBC), FTP, email, Enterprise Content Management (ECM) and flat file adapters
Configure import and export bindings (for example, JMS, MQ, Web Services, HTTP, and SCA)
Demonstrate an understanding of different SCA invocation styles between synchronous, asynchronous using one-way operations, asynchronous with callback, and asynchronous with deferred response

Packaging and Deployment 9%
Generate unmanaged integration module deployment packages
Apply security to SCA application
Understand the use of shared library
Use Software Configuration Management (SCM) system with Integration Designer

Testing and Troubleshooting 11%
Test business integration solutions using component tests
Configure and use the integration test client tool to test components
Use Business Process Choreographer (BPC) Explorer for testing and troubleshooting long-running processes tasks
Use appropriate server logs and cross component trace (XCT) for problem determination
Use the integration debugger to debug business integration components
Demonstrate an understanding of Failed Event Manager (FEM) and recovering of events

Actualkey IBM C1000-080 exam pdf, Certkingdom IBM C1000-080 PDF
MCTS Training, MCITP Training
Best IBM C1000-080 Certification, IBM C1000-080 Training at certkingdom.com

 

Friday, May 1, 2020

MCIA - Level 1 MuleSoft Certified Integration Architect - Level 1 Exam

Description
A MuleSoft Certified Integration Architect should be able to drive and be responsible for an organization’s Anypoint Platform implementation and the technical quality, governance (ensuring compliance), and operationalization of the integration solutions. The MCIA - Level 1 exam validates that an architect has the required knowledge and skills to work with technical and non-technical stakeholders to translate functional and non-functional requirements into integration interfaces and implementations. S/he should be able to:

Create the high-level design of integration solutions and guide implementation teams on the choice of Mule components and patterns to use in the detailed design and implementation.
Select the deployment approach and configuration of Anypoint Platform with any of the available deployment options (MuleSoft-hosted or customer-hosted control plane and runtime plane).
Design Mule applications for any of the available deployment options of the Anypoint Platform runtime plane.
Apply standard development methods covering the full development lifecycle (project preparation, analysis, design, development, testing, deployment, and support) to ensure solution quality.
Advise technical teams on performance, scalability, reliability, monitoring and other operational concerns of integration solutions on Anypoint Platform.
Design reusable assets, components, standards, frameworks, and processes to support and facilitate API and integration projects.

Note: Approximately 10% of the exam is specific to Mule 4.

A downloadable data sheet for the exam can be found here.

Format: Multiple-choice, closed book, proctored online or in a testing center
Length: 58 questions
Duration: 120 minutes (2 hours)
Pass score: 70%
Language: English

The exam can be taken a maximum of 5 times, with a 24 hour wait between each attempt.

The exam can be purchased with one of the following. Each includes a coupon for one free retake.

1.5 Flexible Training Credits (FTC)

A voucher obtained by attending the instructor-led Anypoint Platform Architecture: Integration Solutions course

Additional retakes (attempts 3 to 5) are $250 or 1 FTC and do not come with a free retake.

The certification expires two years from the date of passing.

The best preparation for the exam is to take the instructor-led Anypoint Platform Architecture: Integration Solutions course. Candidates should be familiar with all of the content in the course and be able to apply the concepts.

The following resources are available to assist in a candidate’s preparation:

Instructor-led training: Anypoint Platform Architecture: Integration Solutions

Recommended as the most effective and efficient method of preparation
5-day class
Private and public classes available
Onsite and online classes available
Includes a certification voucher for this exam
Practice quiz
20+ multiple-choice questions
Comparable difficulty to the proctored exam

Topics
The exam validates that the candidate can perform the following tasks.

Note: ARC:INT is the acronym for the Anypoint Platform Architecture: Integration Solutions course.

Configuring and Provisioning Anypoint Platform
Configure business groups, roles, and permissions within an Anypoint Platform organization
Select Anypoint Platform identity management vs client management for the correct purpose
Identify common and distinguishing features and usage scenarios for CloudHub VPCs and public worker cloud
Suggest the number of Mule runtimes for a Mule application given performance targets and HA requirements
Define a performant and HA deployment architecture for Mule applications in on-prem deployments
Select monitoring options for all available Anypoint Platform deployment options

ARC:INT Module 5
ARC:INT Module 7
ARC:INT Module 9
ARC:INT Module 10
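The runtime-sizing topic above is essentially arithmetic: divide the throughput target by what one runtime can sustain, then add headroom so the target still holds if a runtime is lost. A minimal sketch, assuming a per-runtime throughput figure you would measure yourself (not a MuleSoft constant):

```typescript
// Sketch: back-of-the-envelope sizing of Mule runtimes from a throughput
// target plus an HA margin. perRuntimeTps is a measured input, not a
// platform constant; haSpare = 1 is an illustrative N+1 policy.
function suggestRuntimes(
  targetTps: number,
  perRuntimeTps: number,
  haSpare = 1, // keep enough capacity when one runtime is lost
): number {
  const forLoad = Math.ceil(targetTps / perRuntimeTps);
  return forLoad + haSpare;
}
```

For example, a 500 TPS target on runtimes measured at 150 TPS each needs 4 runtimes for load, plus one spare for HA.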

Selecting Integration Styles

Given a description of an integration problem, identify the most appropriate integration style
When designing an integration solution, select the most appropriate interface/data technology and interface definition language for all integration interfaces
Design parts of an integration solution using general message-based integration or event-driven architecture (EDA) using message brokers or streaming technologies
Recognize scenarios where message correlation is necessary

ARC:INT Module 2
ARC:INT Module 3
ARC:INT Module 4
ARC:INT Module 5
ARC:INT Module 7
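The message-correlation topic above usually comes down to one rule: reuse a caller-supplied correlation ID when present, otherwise mint one at the edge so every downstream hop can be tied back to the same business transaction. A hedged sketch of that rule (the header name and UUID generator are illustrative assumptions, not Mule APIs):

```typescript
// Sketch: propagating a correlation ID across message-based integrations.
import { randomUUID } from "crypto";

type Headers = Record<string, string>;

const CORRELATION_HEADER = "x-correlation-id"; // hypothetical header name

// Reuse the caller's correlation ID when present; otherwise generate one.
function withCorrelationId(incoming: Headers): Headers {
  const id = incoming[CORRELATION_HEADER] ?? randomUUID();
  return { ...incoming, [CORRELATION_HEADER]: id };
}
```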

Designing and Documenting Enterprise Integration Architecture
For a given organization and their preferences and constraints, select the most appropriate Anypoint Platform deployment option
Design parts of an integration solution using any SOA-based integration approach
Identify the information that should be included in any integration solution architecture document
Simplify a large-scale enterprise-wide integration architecture so that it can be effectively communicated to semi-technical stakeholders
Identify the persistence mechanism and durability used for watermarks in different Mule runtime deployment options
Identify integration scenarios for which the use of batch would be beneficial
Design for short or long retries using reconnection strategies
Identify common and distinguishing features and usage scenarios for CloudHub DLBs and public CloudHub LBs

ARC:INT Module 1
ARC:INT Module 3
ARC:INT Module 7
ARC:INT Module 8
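The "short or long retries using reconnection strategies" topic can be illustrated with a generic exponential-backoff wrapper. This is a sketch of the pattern only; Mule configures reconnection declaratively on connectors, and the attempt count and delays here are made-up defaults:

```typescript
// Sketch: a short-retry reconnection strategy with exponential backoff.
async function withRetries<T>(
  op: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 100,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await op();
    } catch (err) {
      lastError = err;
      if (attempt === maxAttempts - 1) break; // out of attempts
      // Back off exponentially between attempts: 100ms, 200ms, 400ms, ...
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
    }
  }
  throw lastError;
}
```

A "long retry" is the same loop with larger delays and more attempts; the design decision is which failures are worth waiting out versus failing fast.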

Architecting Resilient and Performant Integration Solutions
Recognize requirements that are best addressed using transactions (single-resource and XA)
Define transaction considerations where needed in a solution design including the requirement for an external transaction coordinator
Specify the connectors that can participate in the different types of transactions
Recognize the purpose of various fault-tolerance strategies for remote calls
Design parts of an integration solution using general batch-oriented integration or ETL to/from files or databases
Determine if horizontal scaling will help a Mule application meet its performance targets

ARC:INT Module 5
ARC:INT Module 7
ARC:INT Module 8
ARC:INT Module 11
ARC:INT Module 12
ARC:INT Module 13
ARC:INT Module 14
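The single-resource transaction semantics named above (all writes commit together on success, none apply on failure) can be sketched against an in-memory store. This illustrates the semantics only; it is not a Mule or XA implementation:

```typescript
// Sketch: commit-or-rollback semantics over a single in-memory resource.
class KeyValueStore {
  private data = new Map<string, string>();

  get(key: string): string | undefined {
    return this.data.get(key);
  }

  // Stage all writes on a snapshot, then either commit the snapshot or
  // discard it if the work function throws.
  transact(work: (stage: Map<string, string>) => void): void {
    const stage = new Map(this.data); // snapshot to mutate
    work(stage);                      // may throw
    this.data = stage;                // commit only on success
  }
}
```

XA extends the same commit-or-rollback contract across multiple resources, which is why it needs an external transaction coordinator.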

Handling Events and Messages

Identify scenarios in which to use different storage mechanisms, including persistent and non-persistent ObjectStore, in-memory ObjectStore, cluster-replicated in-memory ObjectStore, hashtables, and disk-persisted ObjectStore
Select suitable storage mechanisms for IDs (correlation IDs, message IDs, transaction IDs) in Mule applications deployed to CloudHub or on-prem
Use Mule 4 constructs to make effective use of Enterprise Integration Patterns
Use streaming to handle large payloads within Mule applications
Predict the runtime behavior of messages queued internally for processing for load balancing or to achieve reliability
Predict the runtime load balancing behavior of messages sent to the public URL of a Mule application deployed to multiple CloudHub workers

ARC:INT Module 8
ARC:INT Module 9
ARC:INT Module 12
ARC:INT Module 13
ARC:INT Module 14
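"Use streaming to handle large payloads" boils down to processing data in bounded chunks instead of buffering the whole payload in memory. A sketch of the idea using an async generator (the chunk size is an arbitrary illustration, not a Mule setting):

```typescript
// Sketch: re-chunking a streamed payload into fixed-size pieces so
// downstream processing never holds more than one chunk in memory.
async function* chunked(
  source: AsyncIterable<string> | Iterable<string>,
  chunkSize = 4,
): AsyncGenerator<string> {
  let buffer = "";
  for await (const piece of source) {
    buffer += piece;
    while (buffer.length >= chunkSize) {
      yield buffer.slice(0, chunkSize);
      buffer = buffer.slice(chunkSize);
    }
  }
  if (buffer) yield buffer; // flush the remainder
}
```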

Designing Applications with Anypoint Connectors
For a given Mule 4 connector (Premium, Select, and MuleSoft Certified), identify its purpose, the network protocol it uses, and whether it supports incoming or outgoing types of connections
Specify the requirements that would require the use of domain-level connectors
Specify when a Mule application would require persistence and select an appropriate persistence solution
Identify possible failures when a component (such as an API client) invokes a remote component (such as an API implementation)

ARC:INT Module 2
ARC:INT Module 3
ARC:INT Module 4
ARC:INT Module 8
ARC:INT Module 10
ARC:INT Module 12

Designing Networks for Anypoint Connectors
For a given connector, recognize whether it will typically connect to/from an external system across organizational boundaries
Use transport protocols and connectors correctly and coherently when and where applicable
Match protocols with networking constraints and API layers
When incoming and outgoing HTTPS connections with mutual authentication are used, identify what certificates are needed in what stores in different environments

ARC:INT Module 3
ARC:INT Module 7
ARC:INT Module 8
ARC:INT Module 15
ARC:INT Module 16

Handling Integration Implementation Lifecycles
Identify the Anypoint Platform components where various types of API-related assets and artifacts are maintained or published
Recognize the advantages and disadvantages of storing and managing properties in properties files in Mule applications
For a given API or integration, identify the steps that need to be taken in order for testing to occur

ARC:INT Module 6
ARC:INT Module 7
ARC:INT Module 10

Implementing DevOps
Specify the purpose of various MuleSoft products in the area of DevOps/CI/CD
Identify differences, advantages, and disadvantages of DevOps based on deployable Mule applications versus container images
Formulate an effective source code management strategy including branching and merging
Specify testing strategies that use both mocking and invoking of external dependencies

ARC:INT Module 6
ARC:INT Module 7
ARC:INT Module 10

Operating and Monitoring Integration Solutions
Specify the type of metrics for API invocations and API implementations that can be monitored with Anypoint Platform
Identify metrics and operations exposed by default via JMX
Identify differences in monitoring and alerting between customer-hosted and MuleSoft-hosted Anypoint Platform
Identify ways of transmitting IDs between components in remote interactions and capture this in the interface design of the remote interaction

QUESTION 1
A global organization operates datacenters in many countries. There are private network links between these
datacenters because all business data (but NOT metadata) must be exchanged over these private network
connections.
The organization does not currently use AWS in any way.
The strategic decision has just been made to rigorously minimize IT operations effort and investment going forward.
What combination of deployment options of the Anypoint Platform control plane and runtime plane(s) best
serves this organization at the start of this strategic journey?

A. MuleSoft-hosted Anypoint Platform control plane; CloudHub Shared Worker Cloud in multiple AWS regions
B. MuleSoft-hosted Anypoint Platform control plane; Customer-hosted runtime plane in multiple AWS regions
C. MuleSoft-hosted Anypoint Platform control plane; Customer-hosted runtime plane in each datacenter
D. Anypoint Platform - Private Cloud Edition; Customer-hosted runtime plane in each datacenter

Correct Answer: B

QUESTION 2
Anypoint Exchange is required to maintain the source code of some of the assets committed to it, such as
Connectors, Templates, and API specifications.
What is the best way to use an organization's source-code management (SCM) system in this context?

A. Organizations need to point Anypoint Exchange to their SCM system so Anypoint Exchange can pull source code when requested by developers and provide it to Anypoint Studio
B. Organizations need to use Anypoint Exchange as the main SCM system to centralize versioning and avoid code duplication
C. Organizations can continue to use an SCM system of their choice for branching and merging, as long as they follow the branching and merging strategy enforced by Anypoint Exchange
D. Organizations should continue to use an SCM system of their choice, in addition to keeping source code for these asset types in Anypoint Exchange, thereby enabling parallel development, branching, and merging

Correct Answer: D

QUESTION 3
An organization is designing an integration solution to replicate financial transaction data from a legacy system into a data warehouse (DWH).
The DWH must contain a daily snapshot of financial transactions, to be delivered as a CSV file. Daily
transaction volume exceeds tens of millions of records, with significant spikes in volume during popular shopping periods.
What is the most appropriate integration style for an integration solution that meets the organization's current requirements?

A. API-led connectivity
B. Batch-triggered ETL
C. Event-driven architecture
D. Microservice architecture

Correct Answer: B
Actualkey Mulesoft MCIA - Level 1 exam pdf, Certkingdom Mulesoft MCIA - Level 1 PDF
MCTS Training, MCITP Training
Best Mulesoft MCIA - Level 1 Certification, Mulesoft MCIA - Level 1 Training at certkingdom.com

Tuesday, April 28, 2020

Exam MB-400 Microsoft Power Apps + Dynamics 365 Developer

The content of this exam will be updated on May 22, 2020. Please download the skills measured document below to see what will be changing.

Candidates for this exam are Developers who work with Microsoft Power Apps model-driven apps in Dynamics 365 to design, develop, secure, and extend a Dynamics 365 implementation. Candidates implement components of a solution that include application enhancements, custom user experience, system integrations, data conversions, custom process automation, and custom visualizations.

Candidates must have strong applied knowledge of Power Apps model-driven apps in Dynamics 365, including in-depth understanding of customization, configuration, integration, and extensibility, as well as boundaries and constraints. Candidates should have a basic understanding of DevOps practices for Power Apps model-driven apps in Dynamics 365. Candidates must expose, store, and report on data.

Candidates should have development experience that includes JavaScript, TypeScript, C#, HTML, .NET, Microsoft Azure, Office 365, RESTful Web Services, ASP.NET, and Power BI.

Skills measured
Create a technical design (10-15%)
Configure Common Data Service (CDS) (15-20%)
Create and configure Power Apps (10-15%)
Configure business process automation (10-15%)
Extend the user experience (15-20%)
Extend the platform (15-20%)
Develop integrations (10-15%)


NOTE: The bullets that appear below each of the skills measured are intended to illustrate how we are assessing that skill. This list is not definitive or exhaustive.

NOTE: In most cases, exams do NOT cover preview features, and some features will only be added to an exam when they are GA (General Availability).

Create a Technical Design (10-15%)
Validate requirements and design technical architecture
 design and validate technical architecture
 design authentication and authorization strategy
 determine whether requirements can be met with out-of-the-box functionality
 determine when to use Logic Apps versus Microsoft Flow
 determine when to use serverless computing vs. plug-ins
 determine when to build a virtual entity data source provider vs. when to use connectors
Create a data model
 design a data model

Configure Common Data Service (CDS) (15-20%)
Configure security to support development
 troubleshoot operational security issues
 create or update security roles and field-level security profiles
Implement entities and fields
 configure entities
 configure fields
 configure relationships
Create and maintain solutions
 configure solutions
 import and export solutions
 manage solution dependencies

Create and Configure Power Apps (10-15%)
Create model-driven apps
 configure a model-driven app
 configure forms
 configure views
 configure visualizations
Create Canvas Apps
 configure a Canvas App
 develop complex expressions

Configure business process automation (10-15%)
Configure Microsoft Flow
 configure a Flow
 configure actions to use CDS connectors
 develop complex expressions
Implement processes
 create and configure business process flows
 create and configure business rules

Extend the user experience (15-20%)
Apply business logic using client scripting
 configure supporting components
 create JavaScript or Typescript code
 register an event handler
 use the Web API from client scripting
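"Use the Web API from client scripting" typically means issuing OData requests against the organization service. A sketch of composing such a retrieve URL (the entity set, record ID, and API version below are illustrative assumptions, not values from this exam guide):

```typescript
// Sketch: building a Web API (OData) retrieve URL with a $select clause,
// as client script might issue. Base path and version are assumptions.
function buildRetrieveUrl(
  baseUrl: string,
  entitySet: string,
  id: string,
  select: string[],
): string {
  const query = select.length
    ? `?$select=${select.map(encodeURIComponent).join(",")}`
    : "";
  return `${baseUrl}/api/data/v9.1/${entitySet}(${id})${query}`;
}
```

Restricting columns with `$select` keeps the response small, which matters most in form scripts that run on every load.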

Create a Power Apps Component Framework (PCF) component
 initialize a new PCF component
 configure a PCF component manifest
 implement the component interfaces
 package, deploy, and consume the component
 use Web API device capabilities and other component framework services

Create a command button function
 create the command function
 design command button triggers, rules, and actions
 edit the command bar using the Ribbon Workbench
 modify the form JavaScript library dependencies

Extend the platform (15-20%)


Create a plug-in
 debug and troubleshoot a plug-in
 develop a plug-in
 use the Organization Service
 optimize plug-ins for performance
 register custom assemblies by using the Plug-in Registration Tool
 create custom actions

Configure custom connectors for Power Apps and Flow
 create a definition for the API
 configure API security
 use policy templates

Use platform APIs
 interact with data and processes using the Web API
 optimize for performance, concurrency, transactions, and batching
 perform discovery using the Web API
 perform entity metadata operations with the Web API
 use OAuth with the platform APIs
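"Use OAuth with the platform APIs" usually starts with a client-credentials token request to the identity provider. A sketch of building the form-encoded request body (the client ID, secret, and scope values are placeholders, not real credentials or endpoints):

```typescript
// Sketch: form body for an OAuth 2.0 client-credentials token request.
// All parameter values here are hypothetical placeholders.
function clientCredentialsBody(
  clientId: string,
  clientSecret: string,
  scope: string,
): string {
  return new URLSearchParams({
    grant_type: "client_credentials",
    client_id: clientId,
    client_secret: clientSecret,
    scope,
  }).toString();
}
```

The returned string is what would be POSTed with a `Content-Type: application/x-www-form-urlencoded` header; the response's access token then goes in the `Authorization: Bearer` header of Web API calls.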

Develop Integrations (10-15%)
Publish and consume events
 publish an event by using the API
 publish an event by using the Plug-in Registration Tool
 register a webhook
 create an Azure event listener application
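"Register a webhook" implies the listener should authenticate incoming calls before trusting them; one common approach is an HMAC signature over the payload. A sketch under that assumption (the signing scheme and secret handling are illustrative, not a specific Dynamics 365 contract):

```typescript
// Sketch: HMAC-SHA256 signing and constant-time verification of a
// webhook payload. Header/secret conventions are assumptions.
import { createHmac, timingSafeEqual } from "crypto";

function sign(payload: string, secret: string): string {
  return createHmac("sha256", secret).update(payload).digest("hex");
}

function verifyWebhook(payload: string, signature: string, secret: string): boolean {
  const expected = Buffer.from(sign(payload, secret), "hex");
  const given = Buffer.from(signature, "hex");
  // timingSafeEqual throws on length mismatch, so check length first.
  return given.length === expected.length && timingSafeEqual(given, expected);
}
```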

Implement data synchronization
 configure and use entity change tracking
 configure the data export service to integrate with Azure SQL Database
 create and use alternate keys

The exam guide below shows the changes that will be implemented on May 22, 2020.

Skills Measured

NOTE: The bullets that appear below each of the skills measured are intended to illustrate how we are assessing that skill. This list is not definitive or exhaustive.

NOTE: In most cases, exams do NOT cover preview features, and some features will only be added to an exam when they are GA (General Availability).

Create a Technical Design (10-15%)
Validate requirements and design technical architecture
 design and validate technical architecture
 design authentication and authorization strategy
 determine whether requirements can be met with out-of-the-box functionality
 determine when to use Logic Apps versus Power Automate flows (previously Microsoft Flow)
 determine when to use serverless computing vs. plug-ins
 determine when to build a virtual entity data source provider vs. when to use connectors
Create a data model
 design a data model

Configure Common Data Service (CDS) (15-20%)
Configure security to support development
 troubleshoot operational security issues
 create or update security roles and field-level security profiles
Implement entities and fields
 configure entities
 configure fields
 configure relationships
Create and maintain solutions
 configure solutions
 import and export solutions
 manage solution dependencies

Create and Configure Power Apps (10-15%)
Create model-driven apps
 configure a model-driven app
 configure forms
 configure views
 configure visualizations
Create Canvas Apps
 configure a Canvas App
 develop complex expressions

Configure business process automation (10-15%)
Configure Power Automate (previously Microsoft Flow)
 configure a Flow
 configure actions to use Common Data Service connectors
 develop complex expressions
Implement processes
 create and configure business process flows
 create and configure business rules

Extend the user experience (15-20%)
Apply business logic using client scripting

 configure supporting components
 create JavaScript or Typescript code
 register an event handler
 use the Web API from client scripting

Create a Power Apps Component Framework (PCF) component
 initialize a new PCF component
 configure a PCF component manifest
 implement the component interfaces
 package, deploy, and consume the component
 use Web API device capabilities and other component framework services

Create a command button function
 create the command function
 design command button triggers, rules, and actions
 edit the command bar using the Ribbon Workbench
 modify the form JavaScript library dependencies

Extend the platform (15-20%)
Create a plug-in
 debug and troubleshoot a plug-in
 develop a plug-in
 use the global Discovery Service endpoint (previously: use the Organization Service)
 optimize plug-ins for performance
 register custom assemblies by using the Plug-in Registration Tool
 create custom actions

Configure custom connectors for Power Apps and Flow
 create a definition for the API
 configure API security
 use policy templates

Use platform APIs
 interact with data and processes using the Web API
 optimize for performance, concurrency, transactions, and batching
 perform discovery using the Web API
 perform entity metadata operations with the Web API
 use OAuth with the platform APIs

Develop Integrations (10-15%)
Publish and consume events
 publish an event by using the API
 publish an event by using the Plug-in Registration Tool
 register a webhook
 create an Azure event listener application

Implement data synchronization
 configure and use entity change tracking
 configure the data export service to integrate with Azure SQL Database
 create and use alternate keys

QUESTION 1
You need to replace the bicycle inspection forms.
Which two solutions should you use? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. a canvas app that guides the technician through the inspection
B. a logic app that guides the technician through the inspection
C. a flow that maps inspection data to Dynamics 365 for Field Service
D. a model-driven app based on customer service entities

Correct Answer: C,D


QUESTION 2
Note: This question is part of a series of questions that present the same scenario. Each question in
the series contains a unique solution that might meet the stated goals. Some question sets might have
more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these
questions will not appear in the review screen.

A Common Data Service (CDS) environment has two custom entities named Building code and Work item.
Building code has a code date custom field and Work item has an elapsed time custom field. Construction
workers use a consolidated custom form with data from both entities to fill in their daily work items.
A JavaScript library is used with these custom entities and fields to apply complex logic.
You need to ensure that the JavaScript library continues to function as originally designed if other developers
expand the environment.
Solution: In form properties of the consolidated form, add the JavaScript library in the events tab and add the
two custom fields to the dependent fields section of the non-event dependencies tab.
Does the solution meet the goal?

A. Yes
B. No

Correct Answer: B

QUESTION 3
Note: This question is part of a series of questions that present the same scenario. Each question in
the series contains a unique solution that might meet the stated goals. Some question sets might have
more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these
questions will not appear in the review screen.

A Common Data Service (CDS) environment has two custom entities named Building code and Work item.
Building code has a code date custom field and Work item has an elapsed time custom field. Construction
workers use a consolidated custom form with data from both entities to fill in their daily work items.
A JavaScript library is used with these custom entities and fields to apply complex logic.
You need to ensure that the JavaScript library continues to function as originally designed if other developers
expand the environment.
Solution: In the JavaScript library, add Building code with Code date and Work item with Elapsed time in the dependencies tab.
Does the solution meet the goal?

A. Yes
B. No

Correct Answer: A
Actualkey Microsoft Certified Power Apps MB-400 exam pdf, Certkingdom MB-400 PDF
MCTS Training, MCITP Training
Best Microsoft Certified Power Apps MB-400 Certification, Microsoft Certified Power Apps MB-400 Training at certkingdom.com