Friday, 8 December 2017

C9030-644 IBM z Systems Technical Support V7

Number of questions: 65
Number of questions to pass: 42
Time allowed: 120 mins
Status: Live

This exam consists of 5 sections described below.

Apply Information / Installation Planning / Migration Considerations 15%
Identify areas of risk to discuss with customer, relevant business partner(s), and IBM team:
Sysplex, I/O options
End of life / limited life
Drawers for cards (z13 and InfiniBand cards)
TCO (workload retention)
Create a mutually developed implementation plan with the customer, including post-install.
Describe and implement consolidation methodology.
Ensure implementation plan is executed per requirements (including all necessary vendors, business partners, and IBM team groups).
Ensure that customer expectations have been met.

Business Resiliency 14%
Identify the elements of high availability which enable a z Systems environment to remain up and running without unscheduled outages (e.g., elements unique to single-system environments, multiple-system environments in a single location, or multi-system, multi-location environments).
Identify the elements of continuous availability which enable a z Systems environment to remain up and running without any outages (planned or unplanned).
Identify the elements of a disaster recovery solution which affect the ability of the business to continue to run.
Explain recovery time objective (RTO) and recovery point objective (RPO), and identify the technologies that support these objectives.
Identify business or external elements which make having a resilient business critical, such as governmental and industry regulations or standards (finance, transit, etc), audit points, competition, and revenue impact.
Identify z Systems business resilience options and their capabilities, and alternative offerings, including (but not limited to) IBM BCRS and GDPS.
Given specific customer criteria and requirements, propose the appropriate business resilience solution, product, or process.
Conduct a business impact analysis to identify a solution which eliminates identified single points of failure (networking redundancy, application failover, infrastructure redundancy, HW or SW product redundancy).

Evaluate Customer Environment and Plans 15%
Evaluate and document current customer environment (equipment, software, staff usage, satisfaction, need for change and growth).
Identify opportunities (business resilience, workload consolidation, cloud, analytics, mobile, etc.).
Solve customer business problems using tools, methods, and processes, including benchmarks.
Solve customer business problems using capacity planning tools (zPCR, zCP3000, zBNA, zSoftCap, zTPM, zSCON).
Discuss how customers can use Techline to resolve computing issues.
Solve customer business problems using tools, methods, and processes, including methodology for best fit (newer workloads, operating environments, "fit for purpose," etc.).
Determine which tools are used to compare different platforms: Sizing, TCO analysis, etc. (RACE, Eagle).

z Systems Features and Architecture 38%
Describe z Systems models (z13, z13s).
Describe LinuxONE models (Emperor and Rockhopper).
Describe currently marketed z Systems, operating systems (z/OS, z/VM, and Linux), and related system software, middleware, and compilers.
Describe z Systems virtualization (PR/SM, z/VM, DPM, KVM).
Describe z Systems specialty engines (IFL, zIIP, ICF) use and benefits.
Describe z Systems permanent and temporary capacity offerings (CoD, Capacity Provisioning, CBU).
Describe z Systems security offerings (RACF, PKI, Crypto, TKE).
Describe z Systems storage compatibility options for z/OS, z/VM and Linux on z Systems.
Describe z Systems connectivity options (I/O such as FICON, OSA, RoCE, zEDC and coupling links such as InfiniBand, Integrated Coupling Adapter).
Describe z Systems performance improvements (HiperDispatch, zHPF, Out of Order Execution, Flash Express, Large Memory).
Describe z Systems architectural enhancements (SIMD, SMT, new instructions, Chip/Cache structure, PCIe, IFP, IBM zAware, Secure Service Container).
Describe z Systems modernization of legacy applications (new architecture and deployment).
Describe z Systems systems management (HMC, SE, zOSMF, SMF, RMF, IBM WAVE, etc.).
Describe software pricing options under z/OS, z/VM, and Linux on z Systems.

z Systems Solutions 17%
Security: Identify those things in a z Systems environment that protect networks, data, and applications.
Security: Identify common methods to reduce risk exposure in encryption and cryptography.
Security: Discuss gaps in the customer security environment in data security (encryption, data-at-rest, data-in-motion, permissions/access, etc.).
Security: Identify tools, resources and products which monitor, track and secure data, applications, and systems, and differentiate when they are used. (zSecure, crypto coprocessors, IBM Multi-Factor Authentication for z/OS (MFA), etc.).
Security: Given a customer situation, select the security solution (including processes, products, tools and services) which most closely matches the customer requirements.
Cloud: Identify the characteristics of a cloud environment (elastic, broad network access, pooled/shared resources, measured usage/resources).
Cloud: Position cloud solutions on z Systems that differentiate them from cloud solutions on other platforms (elements that are unique to z Systems, such as cryptography, RAS, Capacity on Demand).
Cloud: Position cloud virtualization solutions on z Systems that differentiate them from cloud solutions on other platforms.
Cloud: Describe hybrid cloud characteristics.
Analytics: Describe the unique values of data intensive workloads on z Systems including reliability, availability, security and scalability.
Blockchain: Describe the capabilities and benefits of Blockchain.

This exam has an Assessment Exam option: A9030-644 Assessment: IBM z Systems Technical Support V7

Assessment exams are web-based exams that provide you, at a lower cost, the ability to check your skills before taking the certification exam.
This assessment exam is available in: English, Japanese

Passing the exam does not award you a certification; it is only used to help you assess whether you are ready to take the certification exam.

You can register for it at Pearson VUE and it will provide you a score report, showing you how you did in each section.

All IBM certification tests presume a certain amount of "on-the-job" experience which is not present in any classroom or Web presentation. The recommended courses and links will help you gain the skill and product knowledge represented in the test objectives. They do not teach the answers to the test questions, and are not intended to do so. This information may not cover all subject areas in the certification test, or may contain more recent information than is present in the certification test. Taking these or any classes will not guarantee that you will achieve certification.

IBM z Systems Technical Support V7 certification preparation education can be found at the sites below.

IBMers: IBM Systems Academy

Business Partners or Clients: IBM PartnerWorld University

QUESTION 1
A customer wants to concurrently add a central processor (CP) to a running z/OS LPAR, but has no reserved processors defined.
What must they do to add this additional processor to the LPAR?

A. Dynamically update and activate the new profile.
B. Invoke Change Running System on HMC/SE (Support Element) to add the CP to the running LPAR. Configure the CP online in z/OS.
C. Configure a CP online in z/OS.
D. Take down the LPAR and deactivate the current profile. Update the profile with the number of processors needed. Activate the new profile and IPL the LPAR.

Answer: D

Explanation:
With reserved processors defined in the image profile, you can concurrently add CP, zIIP, IFL, and ICF processors to an LPAR; without them, the profile must be updated and the LPAR deactivated, reactivated, and re-IPLed.
References: IBM z13 and IBM z13s Technical Introduction (March 2016), page 90


QUESTION 2
Which functionality provides a high speed networking link for sharing data using server-to-server communications, at low latency, and with lower CPU overhead than traditional TCP/IP communications?

A. SMC-R
B. zEDC
C. OSA Express5S
D. High performance FICON

Answer: A

Explanation:
With SMC-R, System z network capability takes a new leap, strengthening performance for sharing data and reducing data transmission network overhead. The new System z RDMA over Converged Ethernet (RoCE) feature, 10GbE RoCE Express, enables industry-standard RDMA capability on the System z platform. As an RDMA-capable network interface card (RNIC), it can transfer data using direct memory-to-memory communication, reducing the CPU overhead of the networking software stack. SMC-R provides application-transparent exploitation of this RoCE feature, reducing the network overhead and latency of data transfers and effectively offering the benefits of optimized network performance across processors. This breakthrough also lowers the CPU cost associated with moving large amounts of data.


QUESTION 3
What is hybrid cloud?

A. It is a compute cloud that mixes compute nodes with different CPU architectures.
B. It is an IaaS cloud that provides instances with different operating systems.
C. It is an architecture that combines public and private clouds.
D. It is an IaaS cloud that provides ephemeral storage only.

Answer: C

Explanation:
A hybrid cloud is an integrated cloud service utilizing both private and public clouds to perform distinct functions within the same organization.
References: https://www.interoute.com/what-hybrid-cloud


QUESTION 4
Which IBM z Systems technology is the foundation of the IBM Bluemix-based Blockchain service High Security Business Network (HSBN)?

A. Hyperledger Fabric
B. Docker
C. KVM
D. Secure Service Container

Answer: A

Explanation:
Blockchain is based on the Linux Foundation's open industry-standard Hyperledger project.

References: https://www.digitalmarketplace.service.gov.uk/g-cloud/services/862468845018056



Saturday, 14 October 2017

C5050-408 IBM Worklight Foundation V6.2 Mobile Application Development

Number of questions: 74
Number of questions to pass: 46

Time allowed: 150 mins

The test consists of eight sections containing a total of approximately 74 multiple-choice questions. The percentages after each section title reflect the approximate distribution of the total question set across the sections.

Section 1 - Development Environment Set-up 7%
Install and configure Worklight Studio.
Install Worklight-CLI (Command Line Interface).
Install and configure 3rd party JavaScript libraries (e.g., jQuery Mobile, Sencha, Dojo).
Install and configure an optional device specific SDK (e.g., Android SDK, Apple Xcode, Windows Visual Studio, Blackberry development tools).
Install and configure IBM Mobile Test Workbench for Worklight.

Section 2 - Development: Architecture 5%
Describe Worklight Foundation components and architecture.
Identify the anatomy of a Worklight Project and Worklight application.
Differentiate between native vs hybrid application development approaches.
Describe the default startup process in iOS-based and Android-based hybrid applications.

Section 3 - Development: Client Side 42%
Create Worklight projects and applications with Worklight Studio and Command-Line.
Add a Worklight environment with Worklight Studio and Command-Line.
Build and deploy applications with Worklight Studio and Command-Line.
Use UI patterns.
Use common user interface (UI) controls.
Build a user interface using Rich Page Editor.
Use Worklight native APIs.
Use Geo-location APIs.
Use Apache Cordova API.
Use JSON store client side (e.g., syncing, encrypting, storing).
Provide offline access.
Implement Push and SMS Notification mechanisms.
Implement a custom startup process for hybrid applications such as iOS-based and Android-based.
Optimize code for a specific environment (e.g., skins).
Configure minification and concatenation to optimize Worklight applications for Mobile Web.
Invoke adapter procedure on hybrid and native applications.
Use Unstructured Supplementary Service Data (USSD) communication.
Customize iOS and Android applications (i.e. adding custom code).
Change the splash screen with JavaScript APIs, in iOS or Android-based hybrid applications.
Enable the Simple Data Sharing feature.
Add a Worklight web view into an existing native application.
Configure and customize direct update.
Globalize an application.

Section 4 - Development: Server Side 11%
Develop Worklight adapters via Worklight Studio and Worklight-CLI.
Filter adapters result data using XSLT.
Invoke Java code from adapters.
Use Worklight server side APIs.
Remove adapters and applications from Worklight Console.
Distinguish between the different types of adapters.

Section 5 - Security 15%
Implement authentication mechanisms (e.g., form-based, adapter based, custom, certificate based authentication).
Use application security mechanisms: realms and security tests.
Use the Worklight Console to control the application authenticity.
Configure application security mechanisms using the application-descriptor.xml and the authenticationConfig.xml file.
Use Direct update as a Security Realm.
Implement device provisioning.
Implement device single sign-on (SSO).

Section 6 - Deployment 7%
Configure and build an application for deployment on an external server.
Configure the Worklight Server settings in Worklight Studio.
Configure and deploy adapters using Worklight Studio and Worklight-CLI.
Demonstrate the capabilities of updating an application using Direct Update.

Section 7 - Quality Assurance 9%
Preview an application using the Mobile Browser Simulator.
Preview an application on the device emulator.
Preview an application on a physical device.
Use the Worklight Foundation V6.2 debugging tools (logs, traces, etc.).
Capture and receive uploaded client-side logs.
Create and run a mobile test using Mobile Test Workbench for Worklight (MTWW).

Section 8 - Analytics and Reports 4%
Enable raw and analytic reports in an application.
Use JSON Store Analytics.
Use Analytics dashboard.

PartnerWorld Code: 15010903
Replaces PW Code: 15010902

Status: Live
This intermediate level certification is intended for application developers who have hands-on experience using Worklight Foundation V6.2 to develop mobile hybrid and native applications.

A mobile application developer who achieves this certification can use Worklight Foundation V6.2 to develop client-side applications, develop server-side integration and security components, as well as test and deploy Worklight Foundation V6.2 projects. Overall, a mobile application developer can develop and implement mobile solutions.

The mobile application developer is generally self-sufficient and is able to perform most of the tasks involved in the role with limited assistance from peers and vendor support services. The mobile application developer efficiently uses product documentation.

To attain the IBM Certified Mobile Application Developer - Worklight Foundation V6.2 certification, candidates must pass 1 test. To prepare for the test, it is recommended to refer to the job role description and recommended prerequisite skills, and click the link to the test below to refer to the test objectives (skills measured on the test) and the Test preparation tab.
Recommended Prerequisite Skills

Knowledge and foundational skills one needs to possess before acquiring the skills measured on the certification test. These foundational skills are NOT measured on the test. For skills measured on the test, see Test Objectives.

Basic knowledge of:
Java programming
Web Services and REST
Database connectivity

Working knowledge of:
Eclipse based development tools
Command Line Interface (CLI)
HTML and CSS
JavaScript programming and JavaScript Framework (such as jQuery, Dojo and Sencha)
Designing applications for mobile devices
Developing hybrid and native applications for both Android and iOS

Requirements
This certification requires 1 exam

Exam Required:
Click on the link below to see exam details, exam objectives, suggested training and sample tests.

C5050-408 - IBM Worklight Foundation V6.2 Mobile Application Development

Each test:
contains questions requiring single and multiple answers. For multiple-answer questions, you need to choose all required options to get the answer correct. You will be advised how many options make up the correct answer.
is designed to provide diagnostic feedback on the Examination Score Report, correlating back to the test objectives, informing the test taker how he or she did on each section of the test. As a result, to maintain the integrity of each test, questions and answers are not distributed.

QUESTION 1
An application developer has determined that Worklight does not provide an authenticator that meets the complex needs of the application being developed. It is decided that the developer must implement a custom authenticator.
Which interface must the application developer implement?

A. com.worklight.core.auth.api.CustomAuthenticator
B. com.worklight.core.auth.api.AuthenticationService
C. com.worklight.server.auth.api.CustomAuthenticator
D. com.worklight.server.auth.api.WorkLightAuthenticator

Answer: D

Explanation:
Your custom authenticator class must implement the com.worklight.server.auth.api.WorkLightAuthenticator interface.
References:
https://www.ibm.com/support/knowledgecenter/SSZH4A_6.2.0/com.ibm.worklight.dev.doc/devref/t_custom_authenticator.html


QUESTION 2
An application developer is using Worklight skins to support multiple form factors in an Android-based hybrid application that will run on phones and tablet devices. The developer built skins that modify the color and size of the text in the application based on the device it is running on. To switch between them, the developer needs to modify a file in the Worklight project.
What is the name of the file that the application developer needs to modify to set the skins to apply at runtime?

A. main.js
B. skinList.json
C. skinLoader.js
D. initOptions.js

Answer: C

Explanation:
To set which skin to apply at run time, implement the function getSkinName() in the file skinLoader.js.
References:
https://www.ibm.com/support/knowledgecenter/SSHS8R_6.3.0/com.ibm.worklight.dev.doc/devref/c_developing_application_skins.html


QUESTION 3
An application developer has implemented the following security test to protect a mobile application.
<mobileSecurityTest name="mobileTest">
<testUser realm="myMobileLoginForm"/>
<testDeviceID provisioningType="none"/>
</mobileSecurityTest>
The corporate security team has recently learned about cross-site request forgery (XSRF) attacks against the company's website. The corporate security team wants to prevent further attacks and has asked the developer to protect the mobile application against XSRF attacks.
What step must the application developer take to protect against XSRF attacks?

A. Nothing. By default, a mobileSecurityTest includes protection against XSRF attacks.
B. Define a new webSecurityTest and add the element <test realm="wl_antiXSRFRealm"/>
C. Add the element <test realm="wl_antiXSRFRealm"/> to the mobileSecurityTest definition.
D. Change the implementation to a custom security test and add the element <testXSRF realm="wl_antiXSRFRealm"/>

Answer: A

Explanation:
The mobileSecurityTest contains:
* The following realms, enabled by default: wl_anonymousUserRealm, wl_antiXSRFRealm, wl_remoteDisableRealm and wl_deviceNoProvisioningRealm.
* The user and device realms that you must specify.
References: https://www.ibm.com/support/knowledgecenter/SSZH4A_6.1.0/com.ibm.worklight.dev.doc/devref/r_security_tests.html


QUESTION 4
An application developer is developing a native iOS application. The application developer needs to call a web service to retrieve application data. To do so, the application developer will call an existing Worklight adapter that retrieves this data.
Which class should the application developer use to specify the adapter procedure to invoke?

A. MyInvokeListener
B. WLAdapterDelegate
C. WLAdapterInvocationData
D. WLProcedureInvocationData

Answer: D

Explanation:
The WLProcedureInvocationData class contains all necessary data to call a procedure, including:
* The name of the adapter and procedure to call.
* The parameters that the procedure requires.
References: https://www.ibm.com/support/knowledgecenter/SSZH4A_6.2.0/com.ibm.worklight.apiref.doc/html/refjavaworklight-android-native/html/com/worklight/wlclient/api/WLProcedureInvocationData.html

Friday, 1 September 2017

70-774 Perform Cloud Data Science with Azure Machine Learning

Published: February 14, 2017
Languages: English
Audiences: Data scientists
Technology: Azure Machine Learning, Bot Framework, Cognitive Services
Credit toward certification: MCSE

Skills measured
This exam measures your ability to accomplish the technical tasks listed below. View video tutorials about the variety of question types on Microsoft exams.

Please note that the questions may test on, but will not be limited to, the topics described in the bulleted text.

Do you have feedback about the relevance of the skills measured on this exam? Please send Microsoft your comments. All feedback will be reviewed and incorporated as appropriate while still maintaining the validity and reliability of the certification process. Note that Microsoft will not respond directly to your feedback. We appreciate your input in ensuring the quality of the Microsoft Certification program.

If you have concerns about specific questions on this exam, please submit an exam challenge.

If you have other questions or feedback about Microsoft Certification exams or about the certification program, registration, or promotions, please contact your Regional Service Center.

Prepare Data for Analysis in Azure Machine Learning and Export from Azure Machine Learning
Import and export data to and from Azure Machine Learning
Import and export data to and from Azure Blob storage, import and export data to and from Azure SQL Database, import and export data via Hive Queries, import data from a website, import data from on-premises SQL
Explore and summarize data
Create univariate summaries, create multivariate summaries, visualize univariate distributions, use existing Microsoft R or Python notebooks for custom summaries and custom visualizations, use zip archives to import external packages for R or Python
Cleanse data for Azure Machine Learning
Apply filters to limit a dataset to the desired rows, identify and address missing data, identify and address outliers, remove columns and rows of datasets
Perform feature engineering
Merge multiple datasets by rows or columns into a single dataset by columns, merge multiple datasets by rows or columns into a single dataset by rows, add columns that are combinations of other columns, manually select and construct features for model estimation, automatically select and construct features for model estimation, reduce dimensions of data through principal component analysis (PCA), manage variable metadata, select standardized variables based on planned analysis
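One of the tasks above, dimensionality reduction through principal component analysis (PCA), can be prototyped outside the Azure ML designer. A minimal NumPy sketch of the idea (an illustration with made-up data, not the Azure ML PCA module itself):

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project X onto its first n_components principal components."""
    # Center each feature at zero mean.
    Xc = X - X.mean(axis=0)
    # SVD of the centered data; rows of Vt are the principal directions.
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

# Example: 2-D data that varies mostly along one diagonal direction.
X = np.array([[2.0, 2.1], [4.0, 3.9], [6.0, 6.2], [8.0, 7.8]])
Z = pca_reduce(X, 1)   # 4 samples, each reduced to 1 component
```

Because the data is centered before projection, the reduced scores always sum to (numerically) zero, which is a quick sanity check after any PCA step.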

Develop Machine Learning Models
Select an appropriate algorithm or method
Select an appropriate algorithm for predicting continuous label data, select an appropriate algorithm for supervised versus unsupervised scenarios, identify when to select R versus Python notebooks, identify an appropriate algorithm for grouping unlabeled data, identify an appropriate algorithm for classifying label data, select an appropriate ensemble
Initialize and train appropriate models
Tune hyperparameters manually; tune hyperparameters automatically; split data into training and testing datasets, including using routines for cross-validation; build an ensemble using the stacking method
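The split-and-tune workflow above can be sketched in plain Python; the dataset and the scored "hyperparameter" below are invented for illustration only:

```python
import random

def train_test_split(rows, test_frac=0.25, seed=42):
    """Shuffle rows reproducibly, then split into train and test lists."""
    rng = random.Random(seed)
    shuffled = rows[:]
    rng.shuffle(shuffled)
    cut = int(round(len(shuffled) * (1 - test_frac)))
    return shuffled[:cut], shuffled[cut:]

def grid_search(train, candidates, score_fn):
    """Manual hyperparameter tuning: score each candidate, keep the best."""
    return max(candidates, key=lambda c: score_fn(train, c))

rows = list(range(100))                # stand-in for a dataset
train, test = train_test_split(rows)   # 75 / 25 split
# Toy objective: pretend the best "regularization" value is 0.1.
best = grid_search(train, [0.01, 0.1, 1.0],
                   score_fn=lambda data, c: -abs(c - 0.1))
```

The fixed seed is what makes the split reproducible between runs, the same role the random-seed parameter plays in Azure ML's Split Data module.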
Validate models
Score and evaluate models, select appropriate evaluation metrics for clustering, select appropriate evaluation metrics for classification, select appropriate evaluation metrics for regression, use evaluation metrics to choose between Machine Learning models, compare ensemble metrics against base models
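The metrics named above reduce to small formulas; a pure-Python sketch of a few of them (illustrative helpers, not Azure ML's Evaluate Model module):

```python
def rmse(y_true, y_pred):
    """Root mean squared error, a standard regression metric."""
    return (sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)) ** 0.5

def precision_recall(y_true, y_pred):
    """Precision and recall for binary labels (1 = positive class)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# One true positive, one false positive, one false negative.
p, r = precision_recall([1, 1, 0, 0], [1, 0, 1, 0])
```

Choosing between models then amounts to comparing these numbers on the held-out test set, never on the training data.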

Operationalize and Manage Azure Machine Learning Services
Deploy models using Azure Machine Learning
Publish a model developed inside Azure Machine Learning, publish an externally developed scoring function using an Azure Machine Learning package, use web service parameters, create and publish a recommendation model, create and publish a language understanding model
Manage Azure Machine Learning projects and workspaces
Create projects and experiments, add assets to a project, create new workspaces, invite users to a workspace, switch between different workspaces, create a Jupyter notebook that references an intermediate dataset
Consume Azure Machine Learning models
Connect to a published Machine Learning web service, consume a published Machine Learning model programmatically using a batch execution service, consume a published Machine Learning model programmatically using a request response service, interact with a published Machine Learning model using Microsoft Excel, publish models to the marketplace
Consume exemplar Cognitive Services APIs
Consume Vision APIs to process images, consume Language APIs to process text, consume Knowledge APIs to create recommendations

Use Other Services for Machine Learning
Build and use neural networks with the Microsoft Cognitive Toolkit
Use N-series VMs for GPU acceleration, build and train a three-layer feed forward neural network, determine when to implement a neural network
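A three-layer feed-forward network is just two weight matrices with a nonlinearity between them. A NumPy forward-pass sketch of that structure (shapes and weights are made up; this is not Cognitive Toolkit code):

```python
import numpy as np

def relu(x):
    """Elementwise rectified linear unit."""
    return np.maximum(0.0, x)

def forward(x, W1, b1, W2, b2):
    """Input layer -> hidden layer (ReLU) -> output layer."""
    h = relu(x @ W1 + b1)   # hidden activations
    return h @ W2 + b2      # raw output scores (no softmax here)

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)   # 4 inputs -> 8 hidden units
W2, b2 = rng.normal(size=(8, 3)), np.zeros(3)   # 8 hidden -> 3 outputs
out = forward(rng.normal(size=(5, 4)), W1, b1, W2, b2)  # batch of 5 samples
```

Training would add a loss and gradient updates on top of this forward pass; GPU-backed N-series VMs matter precisely because those matrix multiplications dominate the cost.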
Streamline development by using existing resources
Clone template experiments from Cortana Intelligence Gallery, use Cortana Intelligence Quick Start to deploy resources, use a data science VM for streamlined development
Perform data sciences at scale by using HDInsights
Deploy the appropriate type of HDI cluster, perform exploratory data analysis by using Spark SQL, build and use Machine Learning models with Spark on HDI, build and use Machine Learning models using MapReduce, build and use Machine Learning models using Microsoft R Server
Perform database analytics by using SQL Server R Services on Azure
Deploy a SQL Server 2016 Azure VM, configure SQL Server to allow execution of R scripts, execute R scripts inside T-SQL statements
QUESTION 1
You are building an Azure Machine Learning solution for an online retailer.
When a customer selects a product, you need to recommend products that the customer might like to purchase at the same time. The recommendation should be based on what other customers who purchased the same product also bought.
Which model should you use?

A. Collaborative Filtering
B. Boosted Decision Tree Regression Model
C. Two-Class boosted decision tree
D. K-Means Clustering

Answer: A
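Collaborative filtering recommends items from co-purchase behavior rather than item attributes. A toy item-to-item co-occurrence sketch with hypothetical baskets (Azure ML's recommender is far more sophisticated, but the intuition is the same):

```python
from collections import Counter
from itertools import combinations

# Hypothetical purchase baskets: each set is one customer's order.
baskets = [
    {"laptop", "mouse", "bag"},
    {"laptop", "mouse"},
    {"laptop", "dock"},
    {"mouse", "pad"},
]

# Count how often each ordered pair of products shares a basket.
co_counts = Counter()
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def recommend(product, k=2):
    """Products most often bought together with `product`."""
    scores = {b: n for (a, b), n in co_counts.items() if a == product}
    return [p for p, _ in sorted(scores.items(), key=lambda kv: -kv[1])[:k]]

top = recommend("laptop")
```

Here "mouse" ranks first for "laptop" because two customers bought both, which is exactly the "customers who bought this also bought" behavior the question describes.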


QUESTION 2
You are analyzing taxi trips in New York City. You use Azure Data Factory to create data pipelines and to orchestrate data movement.
You plan to develop a predictive model for 170 million rows (37 GB) of raw data in Apache Hive by using Microsoft R Server to identify which factors contribute to passenger tipping behavior.
All of the platforms that are used for the analysis are the same. Each worker node has eight processor cores and 28 GB of memory.
Which type of Azure HDInsight cluster should you use to produce results as quickly as possible?

A. Hadoop
B. HBase
C. Interactive Hive
D. Spark

Answer: A


QUESTION 3
Note: This question is part of a series of questions that present the same scenario.
Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
Start of repeated Scenario:
A travel agency named Margie's Travel sells airline tickets to customers in the United States.
Margie's Travel wants you to provide insights and predictions on flight delays. The agency is considering implementing a system that will communicate with its customers, as the flight departure nears, about possible delays due to weather conditions.
The flight data contains the following attributes:
* DepartureDate: The departure date, aggregated at a per-hour granularity.
* Carrier: The code assigned by the IATA and commonly used to identify a carrier.
* OriginAirportID: An identification number assigned by the USDOT to identify a unique airport (the flight's origin).
* DestAirportID: An identification number assigned by the USDOT to identify a unique airport (the flight's destination).
* DepDel: The departure delay in minutes.
* DepDel30: A Boolean value indicating whether the departure was delayed by 30 minutes or more (a value of 1 indicates that the departure was delayed by 30 minutes or more).
The weather data contains the following attributes: AirportID, ReadingDate (YYYY/MM/DD HH), SKYConditionVisibility, WeatherType, Windspeed, StationPressure, PressureChange and HourlyPrecip.
End of repeated Scenario:
You plan to predict flight delays that are 30 minutes or more.
You need to build a training model that accurately fits the data. The solution must minimize overfitting and minimize data leakage.
Which attribute should you remove?

A. OriginAirportID
B. DepDel
C. DepDel30
D. Carrier
E. DestAirportID

Answer: B

Explanation:
DepDel (the departure delay in minutes) directly determines the DepDel30 label, so keeping it in the training data would leak the target into the features.

Tuesday, 8 August 2017

70-773 Analyzing Big Data with Microsoft R



Published: January 3, 2017
Languages: English
Audiences: Data scientists
Technology: Microsoft R Server, SQL R Services
Credit toward certification: MCP, MCSE

Skills measured
This exam measures your ability to accomplish the technical tasks listed below. View video tutorials about the variety of question types on Microsoft exams.

Please note that the questions may test on, but will not be limited to, the topics described in the bulleted text.


Read and explore big data
Read data with R Server
Read supported data file formats, such as text files, SAS, and SPSS; convert data to XDF format; identify trade-offs between XDF and flat text files; read data through Open Database Connectivity (ODBC) data sources; read in files from other file systems; use an internal data frame as a data source; process data from sources that cannot be read natively by R Server
Summarize data
Compute crosstabs and univariate statistics, choose when to use rxCrossTabs versus rxCube, integrate with open source technologies by using packages such as dplyrXdf, use group by functionality, create complex formulas to perform multiple tasks in one pass through the data, extract quantiles by using rxQuantile
Visualize data
Visualize in-memory data with base plotting functions and ggplot2; create custom visualizations with rxSummary and rxCube; visualize data with rxHistogram and rxLinePlot, including faceted plots

Process big data
Process data with rxDataStep
Subset rows of data, modify and create columns by using the Transforms argument, choose when to use on-the-fly transformations versus in-data transform trade-offs, handle missing values through filtering or replacement, generate a data frame or an XDF file, process dates (POSIXct, POSIXlt)
Perform complex transforms that use transform functions
Define a transform function; reshape data by using a transform function; use open source packages, such as lubridate; pass in values by using transformVars and transformEnvir; use internal .rx variables and functions for tasks, including cross-chunk communication
Manage data sets
Sort data in various orders, such as ascending and descending; use rxSort deduplication to remove duplicate values; merge data sources using rxMerge(); merge options and types; identify when alternatives to rxSort and rxMerge should be used
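A short sketch of the sorting and merging objectives (hypothetical files and keys; RevoScaleR required):

```r
library(RevoScaleR)

# Sort descending and deduplicate on the sort key in one call.
rxSort(inData = "flights.xdf", outFile = "sorted.xdf",
       sortByVars = "ArrDelay", decreasing = TRUE,
       removeDupKeys = TRUE, overwrite = TRUE)

# Inner merge of two XDF sources on a shared key column.
rxMerge(inData1 = "flights.xdf", inData2 = "planes.xdf",
        outFile = "joined.xdf", matchVars = "TailNum",
        type = "inner", overwrite = TRUE)
```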
Process text using RML packages
Create features using RML functions, such as featurizeText(); create indicator variables and arrays using RML functions, such as categorical() and categoricalHash(); perform feature selection using RML functions

Build predictive models with ScaleR
Estimate linear models
Use rxLinMod, rxGlm, and rxLogit to estimate linear models; set the family for a generalized linear model by using functions such as rxTweedie; process data on the fly by using the appropriate arguments and functions, such as the F function and Transforms argument; weight observations through frequency or probability weights; choose between different types of automatic variable selections, such as greedy searches, repeated scoring, and byproduct of training; identify the impact of missing values during automatic variable selection
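An estimation sketch touching several of these points (data sources and columns are hypothetical; RevoScaleR required):

```r
library(RevoScaleR)

# Linear model; F() converts a numeric variable to a factor on the fly.
lmFit <- rxLinMod(ArrDelay ~ F(DayOfWeek) + Distance, data = airXdf)

# Logistic model with probability weights taken from a column.
logFit <- rxLogit(Late ~ F(DayOfWeek) + Distance, data = airXdf,
                  pweights = "obsWeight")

# Generalized linear model with a Tweedie family.
glmFit <- rxGlm(claimCost ~ age, data = claimsXdf,
                family = rxTweedie(var.power = 1.5))

summary(lmFit)   # coefficient estimates, standard errors, p-values
```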
Build and use partitioning models
Use rxDTree, rxDForest, and rxBTrees to build partitioning models; adjust the weighting of false positives and misses by using loss; select parameters that affect bias and variance, such as pruning, learning rate, and tree depth; use as.rpart to interact with open source ecosystems
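A partitioning-model sketch (the classification response is assumed to be a two-level factor; all names hypothetical):

```r
library(RevoScaleR)

# Forest: nTree and maxDepth are typical bias/variance knobs.
rf <- rxDForest(Late ~ DayOfWeek + Distance, data = airXdf,
                nTree = 50, maxDepth = 5)

# Single tree with unequal misclassification costs via a loss matrix,
# then handed to the open source rpart ecosystem for plotting.
tree <- rxDTree(Late ~ DayOfWeek + Distance, data = airXdf,
                maxDepth = 5,
                parms = list(loss = matrix(c(0, 1, 4, 0), nrow = 2)))
plot(as.rpart(tree))
```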
Generate predictions and residuals
Use rxPredict to generate predictions; perform parallel scoring using rxExec; generate different types of predictions, such as link and response scores for GLM, response, prob, and vote for rxDForest; generate different types of residuals, such as Usual, Pearson, and DBM
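A scoring sketch (hypothetical data source; RevoScaleR required):

```r
library(RevoScaleR)

# Fit, then write predictions plus residuals next to the model
# variables in a new XDF file.
lmFit <- rxLinMod(ArrDelay ~ Distance, data = airXdf)
rxPredict(lmFit, data = airXdf, outData = "scored.xdf",
          computeResiduals = TRUE, writeModelVars = TRUE,
          overwrite = TRUE)

# For a classification forest, the 'type' argument selects the output
# flavor, e.g. type = "prob" or type = "vote".
```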
Evaluate models and tuning parameters
Summarize estimated models; run arbitrary code out of process, such as parallel parameter tuning by using rxExec; evaluate tree models by using RevoTreeView and rxVarImpPlot; calculate model evaluation metrics by using built-in functions; calculate model evaluation metrics and visualizations by using custom code, such as mean absolute percentage error and precision recall curves
Create additional models using RML packages
Build and use a One-Class Support Vector Machine, build and use linear and logistic regressions that use L1 and L2 regularization, build and use a decision tree by using FastTree, use FastTree as a recommender with ranking loss (NDCG), build and use a simple three-layer feed-forward neural network

Use R Server in different environments
Use different compute contexts to run R Server effectively
Change the compute context (rxHadoopMR, rxSpark, rxLocalseq, and rxLocalParallel); identify which compute context to use for different tasks; use different data source objects, depending on the context (RxOdbcData and RxTextData); identify and use appropriate data sources for different data sources and compute contexts (HDFS and SQL Server); debug processes across different compute contexts; identify use cases for RevoPemaR
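A compute-context sketch: the analysis call stays the same and only the context (with a matching data source) changes. Cluster and path details are hypothetical:

```r
library(RevoScaleR)

# Multi-core local execution.
rxSetComputeContext(RxLocalParallel())
rxSummary(~ ArrDelay, data = RxXdfData("flights.xdf"))

# Push the same summary into a Spark cluster; the data source must
# match the context (HDFS here).
rxSetComputeContext(RxSpark(persistentRun = TRUE))
rxSummary(~ ArrDelay,
          data = RxXdfData("/share/flights",
                           fileSystem = RxHdfsFileSystem()))

rxSetComputeContext(RxLocalSeq())   # back to sequential local
```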
Optimize tasks by using local compute contexts
Identify and execute tasks that can be run only in the local compute context, identify tasks that are more efficient to run in the local compute context, choose between rxLocalseq and rxLocalParallel, profile across different compute contexts
Perform in-database analytics by using SQL Server
Choose when to perform in-database versus out-of-database computations, identify limitations of in-database computations, use in-database versus out-of-database compute contexts appropriately, use stored procedures for data processing steps, serialize objects and write back to binary fields in a table, write tables, configure R to optimize SQL Server (chunksize, numtasks, and computecontext), effectively communicate performance properties to SQL administrators and architects (SQL Server Profiler)
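An in-database sketch for SQL Server (the connection string, table name, and numTasks value are hypothetical; the computation runs inside SQL Server instead of pulling rows to the client):

```r
library(RevoScaleR)

conStr <- "Driver=SQL Server;Server=myServer;Database=sales;Trusted_Connection=True"

# numTasks controls the degree of parallelism inside SQL Server.
rxSetComputeContext(RxInSqlServer(connectionString = conStr,
                                  numTasks = 4))

tbl <- RxSqlServerData(table = "dbo.Orders",
                       connectionString = conStr)
rxSummary(~ amount, data = tbl)
```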
Implement analysis workflows in the Hadoop ecosystem and Spark
Use appropriate R Server functions in Spark; integrate with Hive, Pig, and Hadoop MapReduce; integrate with the Spark ecosystem of tools, such as SparklyR and SparkR; profile and tune across different compute contexts; use doRSR for parallelizing code that was written using open source foreach
Deploy predictive models to SQL Server and Azure Machine Learning
Deploy predictive models to SQL Server as a stored procedure, deploy an arbitrary function to Azure Machine Learning by using the AzureML R package, identify when to use DeployR

Question No : 1

Note: This question is part of a series of questions that use the same or similar answer choices. An answer choice may be correct for more than one question in the series. Each question is independent of the other questions in this series. Information and details provided in a question apply only to that question.

You need to evaluate the significance of the coefficients produced by a model that was estimated already.

Which function should you use?

A. rxPredict
B. rxLogit
C. Summary
D. rxLinMod
E. rxTweedie
F. stepAic
G. rxTransform
H. rxDataStep

Answer: C

Explanation: https://docs.microsoft.com/en-us/r-server/r/how-to-revoscaler-linear-model

Question No : 2
You need to build a model that estimates the probability of an outcome. You must be able to regularize between L1 and L2. Which classification method should you use?

A. Two-Class Neural Network
B. Two-Class Support Vector Machine
C. Two-Class Decision Forest
D. Two-Class Logistic Regression

Answer: D

Saturday, 29 July 2017

C2090-645 IBM Cognos 10 BI Multidimensional Author

Test information:
Number of questions: 57
Time allowed in minutes: 60
Required passing score: 75%
Languages: English, Japanese

Related certifications:
IBM Certified Designer - Cognos 10 BI Multidimensional Reports
IBM Certified Solution Expert - Cognos BI

The Cognos 10 BI Multidimensional Author Exam covers key concepts, technologies, and functionality of the Cognos products. In preparation for an exam, we recommend a combination of training and hands-on experience, and a detailed review of product documentation.

Dimensional Data Components (28%)
Distinguish between relational, DMR, and dimensional data sources
Identify dimensional data items and expressions
Define multidimensional data structure components
Describe the importance of report context
Identify the default measure of a report
Describe default members and their purpose
Describe what a MUN is and identify the impact of using the wrong MUN
Describe what a set is
Describe what a tuple is
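For orientation, the last few constructs above can be sketched in Cognos dimensional expression syntax (the member names are hypothetical):

```
set( [2016], [2017] )                     // a set: members from one hierarchy
( [2017], [Revenue], [Northern Europe] )  // a tuple: one member per hierarchy
children( [All Campaigns] )               // a set derived from a parent member
```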

Focus Reports (14%)
Distinguish between dimensional and relational filtering styles
Identify techniques to focus data using the dimensional style
Interpret data that is focused based on members
Interpret data that is filtered based on measure values
Describe the purpose of a slicer

Drilling in Reports (14%)
Describe default drill-down behavior
Describe default drill-up behavior
Describe cases for advanced drilling configuration
Appraise reports generated with calculations that are preserved during drilling
Describe how member sets work

Drill-through Access (8%)
Identify supported drill-through data item combinations
Set up drill-through access
Describe a conformed dimension

Calculations and Dimensional Functions (36%)
Describe the use of arithmetic operations in queries
Analyze the use of dimensional functions in queries
Examine coercion
Apply prompts to focus reports
Compose complex expressions that combine and reuse existing expressions

QUESTION 1
To display all individual campaigns in a crosstab report, a report author could use the expression set([TrailChef Campaign],[EverGlow Campaign],[Course Pro Campaign]). Instead, the report author decides to use the parent member of the campaigns in the set expression "children([All Campaigns])". Which statement is true about the method that was used?

A. In the future, when a campaign is deleted or new ones are added, the report author must modify the expression.
B. In the future, when a campaign is deleted or new ones are added, the unmodified expression will be valid.
C. The report author should not have used the method chosen, as the first method is best
in this situation.
D. To be accurate, the report author should avoid using a set expression.

Answer: B


QUESTION 2
Which of the following statements is correct about the order function?

A. The currentMeasure function must be used with the order function as the sort by criterion.
B. It arranges members of all sets in the report by ascending or descending values.
C. Optional parameters allow the author to order the members of a hierarchy without regard of their level.
D. It arranges members of a set alphabetically by ascending or descending captions.

Answer: C

QUESTION 3
A report author is working with an OLAP data source. The report author creates a query that uses a caption function on a member and applies a string function. What is a possible consequence of this action?

A. Using these dimensional methods will not work with an OLAP data source.
B. The mapped string values will not pass through to the target report.
C. There is nothing wrong with this approach.
D. Mixing dimensional styles and relational styles in a single query can create unexpected results.

Answer: D

QUESTION 4
When must a report author use the caption function?

A. As the first parameter of the roleValue function.
B. To return the display name for the specified business key.
C. To see the string display name for the specified element.
D. To pass the returned value to a drill-through target report, which expects a matching string as a parameter value.

Answer: D

QUESTION 5
Instead of prompting the user to select any countries in Europe, the report author wants to constrain the user to select one or more countries from the Northern Europe region. What kind of prompt should be used and how can this be achieved?

A. This is not possible because a prompt must always be populated with all members of a level.
B. Create a multi-select value prompt. Populate it using an expression on the [Northern Europe] member to retrieve its children on the country level.
C. Generate a prompt by creating an expression with a parameter on the crosstab edge: children([Northern Europe]->?Country?
D. Create a tree prompt, and populate it using an expression on the [Northern Europe]
member to retrieve its children at the country level.

Answer: B

Friday, 7 July 2017

Exam 70-713 Software Asset Management (SAM) - Core

Published: April 11, 2017
Languages: English, Spanish, Chinese (Simplified), French, German, Italian, Japanese, Portuguese (Brazil), Russian
Audiences: IT Professionals
Technology: Microsoft SAM Optimization Model, ISO/IEC 19770-1 standards, IAITAM best practices, ITIL SAM-related standards
Credit toward certification: MCP

Skills measured
This exam measures your ability to accomplish the technical tasks listed below. The percentages indicate the relative weight of each major topic area on the exam. The higher the percentage, the more questions you are likely to see on that content area on the exam. View video tutorials about the variety of question types on Microsoft exams.

Please note that the questions may test on, but will not be limited to, the topics described in the bulleted text.


Assess SAM Programs by using the SAM Optimization Model (SOM) (15-20%)
Define the scope of a SAM program assessment
Identify infrastructure groups and locations, identify estimated quantity of hardware and on-premises and cloud software assets, identify functional descriptions for each group and key points of contact
Assess SAM processes, policies, resources, and tools throughout an organization
Assess infrastructure groups for the existence of documented SAM procedures, roles, and responsibilities
Assign maturity levels
Assign maturity levels according to the 10 components of SOM; assign one of four maturity levels to each component
Perform a gap analysis between maturity levels
Perform a gap analysis between the current maturity level and the desired maturity level, review assigned maturity levels

Manage Software Licenses (15-20%)
Collect and manage complete hardware and on-premises and cloud software inventories
Review an organization’s hardware and on-premises and cloud software inventory collection processes and data to ensure completeness
Validate inventory accuracy
Normalize on-premises and cloud software inventories, reconcile on-premises and cloud software inventories against other data sources, verify the accuracy of specified license metrics such as user counts based on HR employee records
Collect, validate, and manage license entitlement records
Gather, store, normalize, and validate license entitlement records and term documents, provide reports as needed
Perform a periodic reconciliation of on-premises and cloud software inventories, license entitlements and optimization opportunities
Reconcile on-premises and cloud software inventory data against on-premises and cloud software license entitlements data, determine and report license compliance status

Coordinate Data Collection Technologies (15-20%)
Manage data collection and ensure completeness
Identify machine type, agent installation requirements, collection schedules, and discrepancies between inventories, define data schemas, identify data storage locations, normalize collected data
Coordinate data collection between operations groups
Facilitate data transfer and synchronization between various IT groups; validate data between various IT groups; ensure that overall SAM processes are being followed; collect data for security, virtualization, and third-party resources including Linux
Manage data interfaces between disparate data sources
Identify the process of matching common fields, cross-referencing, and consolidation and integration of data from multiple sources
Manage reporting
Gather requirements for general user and executive reports, generate and maintain periodic reports, maintain the infrastructure necessary for ad hoc reporting requests, generate reports for software publishers

Design and Manage a SAM Program (25-30%)
Secure executive sponsorship and ongoing cadence
Identify stakeholders, create proposal materials, obtain explicit executive authorization for software asset infrastructure, policies, and overall corporate governance, establish a regular engagement with the senior management team
Secure funding
Estimate operational costs in both consulting hours and employee time, create a project plan and budget, obtain funding from each infrastructure group for each task associated with managing a SAM program
Design a SAM program
Identify resources and objectives for a SAM program, align resources with customer requirements and schedules, coordinate acquisition strategies, coordinate optimization methods that include Software as a Service (SaaS), Platform as a Service (PaaS), and application virtualization, ensure strong security and compliance requirements
Implement a SAM program
Create a SAM stakeholder group to oversee the project, assess and benchmark the current state of SAM according to industry standards such as ISO/IEC 19770-1 and IAITAM, implement technologies to support asset discovery, enhance purchasing processes to include storage and retrieval of license entitlement information, perform initial license and cloud services reconciliation, create policies, processes, and procedures to support SAM efforts, secure support from all associated departments, educate employees
Maintain a SAM program
Monitor adherence to the policies, processes, and procedures of an organization’s SAM life cycle, standardize SAM processes across all domains and organizational units, perform periodic license and cloud services reconciliation, provide ongoing SAM awareness training
Create and manage a SAM program improvement plan
Incorporate SAM analytical data into strategic IT and business unit planning, create detailed metrics and reports to measure SAM adoption, implementation, and maturity, report Return on Investment (ROI), cost avoidance, and end-user satisfaction to all stakeholders, evangelize SAM maturity benefits

Manage the Software Asset Life Cycle (15-20%)

Manage the acquisition process
Identify and manage approved and unapproved purchasing processes, identify suppliers, manage software approval and receiving processes, manage license updates, optimization, and subscriptions, ensure proper consumption of online services
Manage the deployment process and consumption
Validate the availability of on-premises and cloud software licenses, select the correct media, specify the software identification characteristics, track deployment and consumption of on-premises and cloud software
Optimize assets
Manage on-premises and cloud software and hardware centralization throughout the software asset life cycle according to ISO/IEC 19770-1 and IAITAM, generate awareness of available agreement benefits
Manage the retirement process
Identify hardware for retirement, retire software, harvest software licenses, ensure decommissioning or destruction of storage media, stop subscriptions and downloads, preserve compliance, perform notifications, retain documentation
QUESTION 1
Note: This question is part of a series of questions that use the same or similar answer choices. An answer choice may be correct for more than one question in the series. Each question is independent of the other questions in this series. Information and details provided in a question apply only to that question.
You represent a SAM partner.
A client must consolidate their IT departments into a single business unit. The client was previously assessed in the Microsoft SAM Optimization Model (SOM) key competencies and assigned a SAM maturity level.
You need to ensure that unused Microsoft Office 365 licenses are reclaimed.
What should you do?

A. Purchase software from approved vendors.
B. Publish software deployment reports to stakeholders.
C. Use information provided by a software publisher.
D. Use software metadata generated by the client.
E. Deploy only approved software.
F. Formulate a retirement process.
G. Create an inventory of deployed assets.
H. Maintain updated records of deployed assets.

Answer: C


QUESTION 2
This question requires that you evaluate the underlined text to determine if it is correct.
An organization that achieves the Dynamic Microsoft SAM Optimization Model (SOM) level has SAM data available but typically does not use it for decision making.
Review the underlined text. If it makes the statement correct, select “No change is needed.” If the statement is incorrect, select the answer choice that makes the statement correct.

A. No change is needed.
B. Basic
C. Standardized
D. Rationalized

Answer: C


QUESTION 3
An organization is implementing a SAM program. The organization is focused on achieving Tier 1 of the ISO 19770-1 specification.
You need to verify that contracts are reported on and inventoried.
What should you include in the report?

A. the last reported hardware inventory date
B. any approved exceptions to the contracts
C. contracts that do not match billing statements
D. the last reported software inventory date

Answer: D

QUESTION 4
Note: This question is part of a series of questions that use the same or similar answer choices. An answer choice may be correct for more than one question in the series. Each question is independent of the other questions in this series. Information and details provided in a question apply only to that question.
You represent a SAM partner.
A client must consolidate their IT departments into a single business unit. The client was previously assessed in the Microsoft SAM Optimization Model (SOM) key competencies and assigned a SAM maturity level.
The client reclaims unused software licenses from all departments.
You need to ensure that the client progresses to the next SOM level in the Retirement
Process key competency.
What should you do?

A. Purchase software only from approved vendors.
B. Publish software deployment reports to stakeholders.
C. Use information provided by a software publisher.
D. Use software metadata generated by the client.
E. Deploy only approved software.
F. Formulate a retirement process.
G. Create an inventory of deployed assets.
H. Maintain updated records of deployed assets.

Answer: A

Thursday, 6 July 2017

RC0-903 CompTIA A+ Recertification Exam

Exam Codes RC0-903
Launch Date July 7, 2016
Eligibility Candidates MUST have:

An active A+ CE certification earned by passing exams from the 800-series or earlier.

Received an email from CompTIA containing a Private Access Code (PAC).

Exam Description The CompTIA A+ Recertification Exam covers these domains:
1.0 Mobile Devices (6% of total)
2.0 Windows Operating Systems (20% of total)
3.0 Other Operating Systems and Technologies (11% of total)
4.0 Troubleshooting (Hardware/Software) (59% of total)
5.0 Operational Procedures (4% of total)

Recertification Exam Objectives Download
Number of Questions 50
Type of Questions Multiple choice questions (single and multiple response)
Length of Test 75 Minutes
Passing Score 700 (on a scale of 900)
Delivery Non-proctored Pearson IBT
CEU Impact

Only candidates with an active A+ CE certification will receive CEU credit.
Passing the exam will automatically renew your existing A+ CE. Please allow 1-3 days for your record to be updated.

Introduction The CompTIA A+ Recertification Exam is one way for CompTIA A+ certified professionals to keep their A+ certification active. A CompTIA A+ certification earned on or after January 1, 2011 is valid for three years from the date the certification was earned. The certification must be renewed within three years in order for the individual to remain certified. To remain certified, individuals may:
 Re-take (and pass) both of the current certification exams (220-901 and 220-902)
 Participate in continuing education activities
 Take (and pass) the A+ re-certification exam (RC0-903)

The CompTIA A+ Recertification Exam RC0-903 bridges the competencies measured by the A+ 800 series (220-801 and 220-802) and the 900 series (220-901 and 220-902). The exam (RC0-903) blueprint includes the objectives new to the 900 series and also assesses the highest weighted competencies that appear on both sets of exams (i.e., the knowledge and skills rated by SMEs as most relevant for on-the-job performance).

NOTE: Availability of RC0-903 is LIMITED TO THOSE who have kept their A+ certification active and have not taken and passed the current 900 series exams.

CompTIA A+ is accredited by ANSI to show compliance with the ISO 17024 Standard and, as such, undergoes regular reviews and updates to the exam objectives. The following CompTIA A+ Recertification RC0-903 exam objectives result from subject matter expert workshops and industry-wide survey results regarding the skills and knowledge required of an entry-level IT technical support professional.

This examination blueprint includes domain weighting, test objectives, and example content. Example topics and concepts are included to clarify the test objectives and should not be construed as a comprehensive listing of all the content of this examination.
Candidates are encouraged to use this document to guide their studies. The table below lists the domains measured by this examination and the extent to which they are represented. The CompTIA A+
RC0-903 exam is based on these objectives.

QUESTION 1 – (Topic 1)
A user reports that the cursor jumps to random screen locations when typing on a laptop computer. Which of the following devices is MOST likely causing this?
A. The touchpad
B. The mouse wheel
C. The multimedia keys
D. The digitizer
Answer: A

QUESTION 2 – (Topic 1)
Which of the following would need to be enabled on a mobile phone to share its Internet connection with multiple devices simultaneously?
A. NFC
B. Bluetooth
C. Hotspot
D. Tethering
Answer: C

QUESTION 3 – (Topic 1)
A customer asks a technician for a device that has the capability to easily connect a laptop to an external monitor, keyboard, mouse, and charge the battery. Which of the following devices should the technician recommend to the customer?
A. Lightning
B. KVM switch
C. USB 3.0
D. Docking station
Answer: D


Wednesday, 21 June 2017

C2090-552 IBM InfoSphere Optim for Distributed Systems Fundamentals

Test information:
Number of questions: 65
Time allowed in minutes: 90
Required passing score: 75%
Languages: English, Japanese

This certification exam certifies that the successful candidate has important knowledge, skills, and abilities necessary to plan, install, configure, troubleshoot, administer and maintain an IBM InfoSphere Optim Data Growth, Optim Decommissioning, Test Data Management, or Test Data Management with Data Privacy option on a Distributed System.

Section 1 - Installation and Configuration (25%)
Optim Architecture
Operating Systems/Hardware support
Database Support
Establishing Environments
Optim Directories (location, numbers, owners)
Confirming installation pre-reqs
Physical installations
Security
Initializing
Enabling
Configuring
Archive / Extract Maintenance
File locations/sub-systems
File naming / sizing

Section 2 - Optim Core Functionality (35%)
Data Growth
Archive
Delete
Restore
Decommissioning
Test Data Management
Extract / Insert
Edit
Compare
Data Privacy Functions (when to use which ones)
Convert
Privacy Functions
Access Definitions
Relationships

Section 3 - Advanced Topics (34%)
Enterprise Integration
Security
Storage
Automation (i.e. Command Line)
Import/Export OPTIM DDL
Archive Files (i.e., location, collections, moving, registering, creating, managing index files)
Optim Connect
Archive Collections
Archive Indexes
Archive File Maintenance
Complex Data Model Traversals
Extended Data Sources
InfoSphere Federation Server
Optim Connect
Non-Enforced Referential Integrity
Column Maps and Exits

Section 4 - Performance and Troubleshooting (6%)
Log Files, Trace Files and Statistical Reports
Relationships (i.e., delete strategy, OPTIM relationships, Forced Key/Scan Lookups, Scans, Threads)

Relationship Index Analysis

IBM Certified Specialist - InfoSphere Optim for Distributed Systems v9.1

Job Role Description / Target Audience
This certification exam certifies that the successful candidate has important knowledge, skills, and abilities necessary to plan, install, configure, troubleshoot, administer and maintain an IBM InfoSphere Optim Solutions v9.1.

This includes:
* product installation and configuration
* Optim Core Functionality
* LUA
* Optim Search
* Masking on Demand
* Service on Demand
* New Database features and functionality
* performance and tuning

The person performing these functions is generally self-sufficient and is able to perform most of the tasks involved in the role with limited assistance from peers, product documentation and vendor support services.

Recommended Prerequisite Skills

Hands-on experience with installing, configuring and using InfoSphere Optim on Distributed Systems

IBM Certified Specialist - InfoSphere Optim for Distributed Systems Fundamentals

Job Role Description / Target Audience
This intermediate level certification is intended for application developers, database administrators, and technical personnel who perform the installation, configuration and day-to-day tasks associated with ensuring the smooth and efficient operation of InfoSphere Optim environment.

This includes:
* product installation and configuration
* configuration of security
* archive and extract maintenance
* archiving and Optim Connect
* decommissioning
* creation and maintenance of test data
* privatizing data
* accessing extended data sources using Optim Connect and InfoSphere Federation Server
* implementing a test data management environment
* performance and tuning

The person performing these functions is generally self-sufficient and is able to perform most of the tasks involved in the role with limited assistance from peers, product documentation and vendor support services.

Recommended Prerequisite Skills
Familiarity with Optim Manuals
Hands-on experience with installing, configuring, and using Optim on Distributed Systems



QUESTION 1
You can use the silent installer in a UNIX environment to install the Optim Server. The silent
installer is NOT available for which two of the following platforms? (Choose two.)

A. HP-UX 11i v2
B. Red Hat Linux 3
C. IBM AIX 5
D. Solaris 8
E. SUSE 10

Answer: B,D

Explanation:


QUESTION 2
When installing Optim, what is the minimum disk space needed for the database?

A. 50gb
B. 50mb
C. As required
D. 500mb

Answer: C

Explanation:


QUESTION 3
You are planning an installation of IBM Optim. Which two configurations are supported for the
Optim Server and Optim Directory? (Choose two.)

A. DB2 database running on Ubuntu Linux
B. DB2 or Oracle databases running on AIX
C. Oracle or Microsoft SQL Server databases running on AIX
D. DB2 or Informix databases running on Sun Solaris
E. DB2 or Microsoft Access databases running on Windows XP Professional

Answer: B,D

Explanation:


QUESTION 4
Which two statements are true about Optim security roles? (Choose two.)

A. Functional Privilege classes such as Create New Actions and Create New Definitions can be
controlled using Roles.
B. Privileges such as Archive Request and Compare Request can be controlled using Roles
C. By default, the Access Control Domain allows access to all Optim Actions and privileges. Only
denial of privileges can be assigned using roles.
D. Default Roles may not be modified to allow or deny Functional Privilege Classes and New
Action Privileges.
E. Edit the FAD (File Access Definition) to control which roles have access to all of the files.

Answer: A,B

Explanation:

Friday, 16 June 2017

C2090-461 IBM InfoSphere Optim for Distributed Systems v9.1

Test information:
Number of questions: 34
Time allowed in minutes: 60
Required passing score: 61%
Languages: English

This certification exam certifies that the successful candidate has important knowledge, skills, and abilities necessary to plan, install, configure, troubleshoot, administer and maintain an IBM InfoSphere Optim Solutions v9.1.

This includes:
* product installation and configuration
* Optim Core Functionality
* LUA
* Optim Search
* Masking on Demand
* Service on Demand
* New Database features and functionality
* performance and tuning

The person performing these functions is generally self-sufficient and is able to perform most of the tasks involved in the role with limited assistance from peers, product documentation and vendor support services.

Section 1 - Installation and Configuration (29%)
Optim Architecture
Operating Systems / Hardware Support
Database Support
New or additional components (Optim Manager, Optim Designer, self-service and Optim Search, UDFs)
Masking on Demand
Establishing Environments
Optim Directories (location, numbers, owners)
Confirming installation pre-reqs
Physical installations (locations, what to separate, manager, where things exist)
Archive / Extract Maintenance
File location / sub-systems (BigData enhancements)
File naming / sizing

Section 2 - Optim Core Functionality (35%)
Archiving
File and Directory Maintenance
Search
Test Data Management
Using the Designer interface
Creating access definitions, table maps, column maps, services
Creating Optim relationships, aliases, data sources (discovery)
Using Optim Manager
Data Privacy Providers (when to use which ones)

Section 3 - Advanced Topics (29%)
LUA
Optim Search
Masking on Demand
Service on Demand
New Database Features/functionality (Netezza, loaders, teradata)

Section 4 - Performance and Troubleshooting (6%)
Log and Trace Files

IBM Certified Specialist - InfoSphere Optim for Distributed Systems v9.1

Job Role Description / Target Audience
This certification exam certifies that the successful candidate has the knowledge, skills, and abilities necessary to plan, install, configure, troubleshoot, administer and maintain IBM InfoSphere Optim Solutions v9.1.

This includes:
* product installation and configuration
* Optim Core Functionality
* LUA
* Optim Search
* Masking on Demand
* Service on Demand
* New Database features and functionality
* performance and tuning

The person performing these functions is generally self-sufficient and is able to perform most of the tasks involved in the role with limited assistance from peers, product documentation and vendor support services.

Recommended Prerequisite Skills
Hands-on experience with installing, configuring and using InfoSphere Optim on Distributed Systems

QUESTION 1
Considering best practices and Optim Access Definitions, which statement is FALSE?

A. The Optim Classic interface would be used to define Option1 and Option2 for my archive Access Definition.
B. All tables included in an Access Definition that was created in Optim Designer must be in the same DBAlias.
C. One can copy an Access Definition from one folder to another in Optim Designer and the Wizard will require you to rename it.
D. A reference table can easily be changed to a related table in Optim Designer; however, one should then verify the entire process by viewing Show Steps.

Answer: D

Explanation:


QUESTION 2
In order to consistently mask a credit card number, which option should be used?

A. TRANS CCN
B. TRANS CCN ('=n CREDITCARD')
C. SUBSTR(CREDITCARD, 1, 12) || SEQ(1111,1)
D. There is no reason to mask the credit card number as long as you mask the name.

Answer: A

Explanation:
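TRANS CCN is Optim's built-in function for consistently masking credit card numbers. As a generic illustration of what "consistent" masking means (the same input always produces the same masked value), here is a minimal, hypothetical Python sketch; it is not Optim's actual algorithm:

```python
import hashlib

def mask_ccn(ccn: str, secret: str = "demo-key") -> str:
    """Deterministically mask a card number: identical inputs always yield
    the identical masked value, preserving length and using digits only.
    Illustrative sketch only -- NOT the algorithm Optim uses."""
    digest = hashlib.sha256((secret + ccn).encode()).hexdigest()
    # Map each hex character to a decimal digit, then truncate to input length.
    return "".join(str(int(c, 16) % 10) for c in digest)[: len(ccn)]

masked = mask_ccn("4111111111111111")
print(masked == mask_ccn("4111111111111111"))  # True: masking is consistent
print(len(masked) == 16)                       # True: length is preserved
```

Consistency is what allows a masked value to remain usable as a join key across tables, which is why simply replacing the number with random digits would break referential integrity in test data.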

Friday, 9 June 2017

1Z0-468 Oracle Cloud Application Foundation Essentials

Duration: 120 minutes
Number of Questions: 76
Passing Score: 62%
Format: Multiple Choice

Cloud Application Foundation (CAF) Fundamentals
Describe Cloud Application Foundation concepts
Identify components of WebLogic Suite
Identify differences between WebLogic Server (WLS) SE, EE & Suite
Describe the problem domain of Coherence with WLS
Describe Oracle Cloud Computing business drivers
Describe Virtual Assembly Builder Studio features
Describe supported and custom Appliances for Virtual Assemblies
Describe ActiveCache (WLS and Coherence)

Coherence Development Fundamentals
Describe Use Cases for Coherence
Deploy the correct Cache Topologies
Understand how Coherence Clustering works
Describe how partitioning works in Coherence
Describe client types and usage
Describe POF and other models of Object Serialization
Understand how to configure different cache topologies and services
Understand basic Coherence key-based APIs
Integrate with a datasource

Advanced Coherence Development Topics
Create indexes to optimize filters
Describe a service
Describe when you would use Elastic Data
Perform a map-reduce operation
Explain advanced capabilities like eventing and processing to customer
Configure and use a Near Cache or Continuous Query Cache
Configure operational features like Quorum and Service Guardian
Perform concurrent operations against a cache
Integrate with TopLink Grid
Pre-load a cache
Write Coherence queries

Deploying and Debugging a Coherence Application
Configure a set of proxies
Architect a Coherence client tier
Apply best practices for performance tuning
Plan Capacity for a Coherence deployment
Debug network issues in Coherence deployments
Tune and size Coherence JVMs
Describe how to run a Coherence application
Debug a basic Coherence problem
Collect information for advanced Coherence troubleshooting
Explain how Oracle Enterprise Manager integrates with Coherence
Apply the Production Checklist

Monitoring and Managing WebLogic Server with Oracle Enterprise Manager
Describe the architecture and components within Oracle Enterprise Manager Cloud Control
Utilize the three primary functional areas addressed by Oracle Enterprise Manager Cloud Control with regard to WLS management
Configure Enterprise Manager with WLS
Explain how to position WebLogic Management Pack EE
Configure Oracle Enterprise Manager to provide full stack visibility and application performance management for WLS
Explain how Java Virtual Machine (JVM) diagnostics provides full cross-tier diagnostics in production environments and why it is important
Configure Oracle Enterprise Manager Cloud Control to provide WLS patch automation
Utilize Oracle Enterprise Manager for lifecycle management (i.e. provisioning and cloning of WLS domains and Java applications)
Utilize the user experience management features provided by Oracle Enterprise Manager Real User Experience Insight
Describe the primary critical use cases for Oracle Enterprise Manager Business Transaction Management for WLS
Explain the key challenges that Oracle Enterprise Manager addresses with its configuration management features

Java VM
Explain the basics of Java VM
Explain the differences between HotSpot VM and Jrockit VM
Troubleshoot common performance problems
Describe different garbage collection schemes
Describe different VM tuning options
Utilize performance monitoring and profiling using JVM command line tools
Utilize advanced real-time performance monitoring, profiling and troubleshooting using Java Mission Control
Utilize Back-in-time analysis and troubleshooting using Java Flight Recorder
Design for Java application performance

Virtual Assembly Builder
Explain the business and IT challenges Virtual Assembly Builder helps with
Explain the advantages of assemblies over standalone VM templates
Create appliances and multi-tier assemblies
Customize an assembly at deployment time
Setup Virtual Assembly Builder Deployer and deploy a multi-tier assembly

Web Tier
Explain the basics of HTTP Server and Traffic Director
Design and configure Reverse Proxy with HTTP Server and Traffic Director
Secure HTTP Server and Traffic Director Environments
Perform basic troubleshooting of HTTP Server and Traffic Director


QUESTION 1
Which two mechanisms are explicitly monitored to determine death of a cluster member? (Choose two.)

A. garbage collection
B. stuck threads
C. heartbeats
D. TCP socket connections

Answer: C,D

Explanation:
Death detection works by creating a ring of TCP connections between all cluster members. TCP communication is sent on the same port that is used for cluster UDP communication. Each cluster member issues a unicast heartbeat, and the most senior cluster member issues the cluster heartbeat, which is a broadcast message. Each cluster member uses the TCP connection to detect the death of another node within the heartbeat interval. Death detection is enabled by default and is configured within the <tcp-ring-listener> element.
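The heartbeat side of this mechanism can be illustrated with a toy sketch (hypothetical Python, not Coherence code): record each member's last heartbeat time and flag any member whose most recent heartbeat is older than the detection timeout.

```python
class ClusterMonitor:
    """Toy heartbeat tracker illustrating interval-based death detection."""

    def __init__(self, timeout: float):
        self.timeout = timeout          # seconds without a heartbeat before a member is declared dead
        self.last_seen: dict[str, float] = {}

    def heartbeat(self, member: str, now: float) -> None:
        # Record the time of the member's most recent heartbeat.
        self.last_seen[member] = now

    def dead_members(self, now: float) -> list[str]:
        # A member is considered dead if its last heartbeat is older than the timeout.
        return [m for m, t in self.last_seen.items() if now - t > self.timeout]

mon = ClusterMonitor(timeout=3.0)
mon.heartbeat("node-1", now=0.0)
mon.heartbeat("node-2", now=0.0)
mon.heartbeat("node-1", now=2.5)
print(mon.dead_members(now=4.0))  # ['node-2'] -- it missed its window
```

In real Coherence the TCP ring supplements this: a broken TCP connection lets a neighbor detect death faster than waiting out the heartbeat interval.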


QUESTION 2
Which is a valid reason for using Coherence Elastic Data?

A. You want to cache more than the total amount of RAM on your systems.
B. You want to extend your cluster across a WAN.
C. Data must be persisted to disk to ensure fault tolerance.
D. You want to store data on very large heaps.

Answer: A

Explanation:
http://www.oracle.com/technetwork/tutorials/tutorials-1694910.pdf


QUESTION 3
Which configuration file must a user edit to configure Oracle HTTP Server?

A. httpd.conf
B. obj.conf
C. workers.properties
D. magnus.conf
E. oraclehttp.conf

Answer: A


QUESTION 4
You are doing capacity planning for a Coherence application with one distributed cache (dcache) and one replicated cache (rcache). You have one index on dcache, and dcache has backup_count=1. Your requirement is that you must be able to tolerate one machine failure with no loss of data. You have three machines, each with 4 JVMs.
What three factors do you take into consideration in case of machine failure? (Choose three.)

A. Each cache server will be responsible for more primary and backup data from dcache.
B. The size of index data in each cache server is likely to grow.
C. The size of rcache on each node will grow.
D. Updates to rcache will result in more network traffic per machine.
E. Updates to dcache will result in more network traffic per machine.

Answer: A,B,E
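Factor A can be checked with back-of-the-envelope arithmetic: with backup_count=1, the cluster stores roughly twice the primary data, spread evenly across surviving machines. The 3000 MB total below is an assumed example figure, not from the question.

```python
def per_machine_mb(total_primary_mb: float, backup_count: int, machines: int) -> float:
    """Rough model: primary plus backup copies spread evenly across machines."""
    return total_primary_mb * (1 + backup_count) / machines

before = per_machine_mb(3000, backup_count=1, machines=3)
after = per_machine_mb(3000, backup_count=1, machines=2)  # one machine lost
print(before, after)  # 2000.0 3000.0
```

Each surviving machine's share grows by 50% after the failure, which also drives the extra dcache rebalancing traffic in answer E; rcache (a replicated cache) already holds a full copy on every node, so its per-node size does not grow.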

Friday, 28 April 2017

C2070-984 IBM Case Manager V5.2 Solution Designer

Test information:
Number of questions: 61
Time allowed in minutes: 120
Required passing score: 57%
Languages: English

The test contains six sections, totaling 61 multiple-choice questions. The percentages after each section title reflect the approximate distribution of the total question set across the sections.

Section 1 - Planning and Architecture (16%)
Demonstrate understanding of IBM Case Manager Architecture
Identify solutions best addressed by Case Manager
Identify minimum installation requirements
Identify key differences between development and production environments
Identify integrated products and add-on software capabilities including licensing

Section 2 - Designing a Case Solution (44%)
Demonstrate an understanding of the Case Manager object model
Demonstrate an understanding of Case Builder and its Artifacts
Demonstrate an understanding of Case Task and Process Fragments
Demonstrate an understanding of the Page Designer
Demonstrate an understanding of IBM Content Navigator integration
Demonstrate an understanding of the security model
Demonstrate the use of IBM Case Manager bundled components
Demonstrate knowledge of using Case Operations Component

Section 3 - Deploying and Testing a Case Solution (8%)
Demonstrate an understanding of the deployment structure and its limitations
Demonstrate knowledge of project areas
Demonstrate knowledge of using the Case Manager Administration Client

Section 4 - Extending a Case Management Solution (16%)
Demonstrate knowledge of integrating custom widgets, business rules and reporting
Identify IBM capabilities to extend Case Manager capabilities
Identify integration scenarios between IBM BPM and Case Manager
Demonstrate an understanding of building a custom application using REST service and JavaScript APIs

Section 5 - Solution Migration (8%)
Understanding of the uses for Case Manager Configuration Manager, Case Manager Administration Client and FileNet Deployment Manager
Demonstrate an understanding of requirements and steps to deploy to production
Demonstrate an understanding of configuring security in a production environment

Section 6 - Business Metrics and Analytics (8%)
Demonstrate knowledge of using Case Analyzer
Demonstrate knowledge of using Case Monitor and Cognos RTM
Demonstrate knowledge of using Content Analytics with Enterprise Search

IBM Certified Solution Designer - Case Manager V5.2

Job Role Description / Target Audience
This certification test certifies that the successful candidate has the knowledge to create the theoretical and/or detailed technical design of an application, solution, and infrastructure using IBM Case Manager v5.2 (ICM). Provide expertise to evaluate and choose between alternatives; assist with balancing costs with capabilities and priorities. Identify products / technologies / processes to be included, and determine points of required integration and customization. Identify sizing and capacity issues, as well as conflicts with existing processes or environments.

Recommended Prerequisite Skills
Before preparing for this certification, the following knowledge and skills are recommended and assumed:

Knowledge of IBM Case Manager
Knowledge of P8 Administration
Knowledge of P8 Case Foundation
Knowledge of P8 Content Manager
Knowledge of supported Application Servers
Knowledge of Content Navigator
Knowledge of Cognos RealTime Monitoring
Knowledge of IBM Content Analytics


Monday, 17 April 2017

C2070-585 IBM Datacap Taskmaster Capture V8.1 Implementation

Test information:
Number of questions: 63
Time allowed in minutes: 90
Required passing score: 60%
Languages: English


The test contains five sections totalling 63 multiple-choice questions. The percentages after each section title reflect the approximate distribution of the total question set across the sections.

Section 1 - Planning and Installation (11%)
Understand the deployment prerequisites and minimum installation requirements.
Identify Taskmaster components to be used.
Demonstrate knowledge of deployment validation procedures.

Section 2 - Configuration and Security (13%)
Demonstrate knowledge of client/server communication requirements.
Knowledge of the different authentication modes.
Demonstrate knowledge of Taskmaster Administrator.

Section 3 - Deployment and Testing (16%)
Understand how to test using Datacap Studio.
Understand how to set up scanning with Taskmaster.
Understand the steps needed to migrate an application.
Understand how to deploy and test Taskmaster components.

Section 4 - Customization (i.e. Studio) (46%)
Knowledge of the relationship between components in Studio (e.g. hierarchy, ruleset, library and profiles).
Demonstrate knowledge of the Taskmaster Workflow.
Understand Datacap Studio actions.
Demonstrate knowledge of various key Taskmaster variables (page statuses, ICP - IPS/DPS/DOF, SELECT, Lookup, Smart Parameters).
Understand Page Identification Techniques.
Understand index capturing techniques and recognition.
Understand how to build interfaces using DotEdit/DotScan.
Demonstrate knowledge of Export techniques.

Section 5 - Operations Management, Troubleshooting, and Administration (14%)
Understand how to enable / review logs to troubleshoot Taskmaster.
Understand how to leverage NENU.
Understand how to use Job Monitor/Station Monitor to monitor system.
Understand how to work with the RV2 reporting tool and create custom reports.

IBM Certified Deployment Professional - Datacap Taskmaster Capture V8.1

Job Role Description / Target Audience
This intermediate level certification test certifies that the successful candidate has the knowledge to create the theoretical and/or detailed technical design of an application, solution, and infrastructure using IBM Datacap Taskmaster Capture V8.1. Provide expertise to evaluate and choose between alternatives; assist with balancing costs with capabilities and priorities. Identify products / technologies / processes to be included, and determine points of required integration and customization. Identify sizing and capacity issues, as well as conflicts with existing processes or environments.

To attain the IBM Certified Deployment Professional - Datacap Taskmaster Capture V8.1 certification, candidates must pass 1 test.

Recommended Prerequisite Skills
Before preparing for this certification, basic understanding of the following is recommended and assumed:

Working knowledge of databases
Working knowledge of backend servers (API)
Working knowledge of LDAP and basic security principles
Working knowledge of IIS
Working knowledge of scanners including ISIS and TWAIN drivers
Working knowledge of IBM Datacap Taskmaster Capture client/server architecture
Working knowledge of Windows Services

In preparing for this certification, the following IBM course(s) are recommended to further improve your skills:
F1680 - IBM Datacap Taskmaster 8.1: Configuration and Implementation
F1687 - IBM Datacap Taskmaster 8.1: Configuration and Implementation (ILO)
F1760 - IBM Datacap Taskmaster Capture: Implementation and Configuration
F1767 - IBM Datacap Taskmaster Capture: Implementation and Configuration (ILO)
F1769 - IBM Datacap Taskmaster Capture: Implementation and Configuration (SPVC)
F1870 - IBM Datacap Taskmaster 8.1: Administration
F1877 - IBM Datacap Taskmaster 8.1: Administration (ILO)
F1879 - IBM Datacap Taskmaster 8.1: Administration (SPVC)
 

QUESTION: 1
What tool would a Deployment Professional use to configure Rulerunner Enterprise (aka Quattro) background tasks?

A. Datacap Studio
B. Rulerunner Manager
C. Taskmaster Server Manager
D. Taskmaster Application Manager

Answer: B


QUESTION: 2
What is the Taskmaster Server?

A. a server that client applications use to create tasks.
B. one of many Datacap applications that access the admin and engine databases.
C. a Windows service which controls authentication, database access, and batch processing.
D. a Windows service which controls authentication, database access, and the creation of tasks.

Answer: C


QUESTION: 3
Which operating system does Datacap support?

A. AIX
B. iOS
C. Linux
D. Microsoft Windows

Answer: D


QUESTION: 4
Which steps are needed to configure a Datacap application to use Oracle or MSSQL?

A. During installation, populate DB server name and account information, test sample application.
B. Run DB definition scripts, populate the databases, configure an application to use the database, verify DB connection.
C. Install using default settings. Once installed, use Taskmaster Server Manager to point Datacap to DB server. Use Application Wizard to copy applications to DB server.
D. Ensure the database server is installed. Enter the database server name and credentials when prompted by the installation wizard. Test sample application when install completes.

Answer: B


QUESTION: 5
Which statement about the Taskmaster Web is true?

A. It never runs batch verification rules.
B. It has the sole responsibility to run batch verification rules.
C. It supports browser-based Taskmaster clients including the ability to perform ISIS scanning.
D. It supports browser-based Taskmaster clients, and does not require any additional downloads.

Answer: D