This page is a one-stop resource for everything you need to prepare for the SAS Certified Data Quality Steward for SAS 9 (A00-262) certification exam. The SAS A00-262 exam summary, syllabus topics, and sample questions provide the foundation for your actual SAS Data Quality Using DataFlux Data Management Studio exam preparation; we have designed these resources to help you get ready to take your dream exam.
The SAS Certified Data Quality Steward for SAS 9 credential is globally recognized for validating SAS Data Quality Steward knowledge. With the SAS Data Quality Using DataFlux Data Management Studio certification, you stand out in a crowd and prove that you have the SAS Data Quality Steward knowledge to make a difference within your organization. The SAS Certified Data Quality Steward for SAS 9 certification (A00-262) exam tests the candidate's knowledge in the following areas.
SAS A00-262 Exam Summary:
| Exam Name | SAS Certified Data Quality Steward for SAS 9 |
|---|---|
| Exam Code | A00-262 |
| Exam Duration | 110 minutes |
| Exam Questions | 75 multiple-choice questions |
| Passing Score | 68% |
| Exam Price | $180 (USD) |
| Training | Using DataFlux® Data Management Studio; Understanding the SAS® Quality Knowledge Base; DataFlux® Data Management Studio: Creating a New Data Type in the Quality Knowledge Base |
| Books | DataFlux Data Management Studio Documentation; DataFlux Data Management Server Documentation |
| Exam Registration | Pearson VUE |
| Sample Questions | SAS Data Quality Steward Certification Sample Questions |
| Practice Exam | SAS Data Quality Steward Certification Practice Exam |
SAS A00-262 Exam Topics:
| Objective | Details |
|---|---|
| **Navigating the DataFlux Data Management Studio Interface** | |
| Navigate within the Data Management Studio interface | Register a new Quality Knowledge Base (QKB); create and connect to a repository; define a data connection; specify Data Management Studio options; access the QKB; create a name-value macro pair; access the Business Rules Manager; access the appropriate monitoring report; attach and detach primary tabs |
| **Exploring and Profiling Data** | |
| Create and explore a data profile | Create and explore a data profile; interpret the results |
| Design data standardization schemes | Build a scheme from profile results; build a scheme manually; update existing schemes; import and export a scheme |
| **Data Jobs** | |
| Create data jobs | Rename output fields; add nodes and preview nodes; run a data job; view a log and settings; work with data job settings and data job displays; follow best practices (such as inserting notes and establishing naming conventions); work with branching; join tables; apply the Field Layout node to control field order; work with the Data Validation node; work with data inputs |
| Apply a Standardization definition and scheme | Use a definition; use a scheme; determine the differences between a definition and a scheme; explain what happens when you use both a definition and a scheme; review and interpret standardization results; explain the steps involved in the standardization process |
| Apply Parsing definitions | Distinguish between different data types and their tokens; review and interpret parsing results; explain the steps involved in the parsing process; use a parsing definition; interpret parse result codes |
| Apply Casing definitions | Describe casing methods (upper, lower, proper); explain different techniques for accomplishing casing; use a casing definition |
| Compare and contrast the Identification Analysis and Right Fielding nodes | Review results; explain the technique used for identification (process of the definition) |
| Apply the Gender Analysis node to determine gender | Use a gender definition; interpret results; explain different techniques for conducting gender analysis |
| Create an entity resolution job | Use a Clustering node in a data job and explain its use; apply survivorship (surviving record identification); discuss and apply the Cluster Diff node |
| Use data job references within a data job | Use the External Data Provider node; use the Data Job Reference node; define a target node; explain why you would want to use a data job reference (best practice); use a real-time data service |
| Understand how to use an Extraction definition | Interpret the results; explain the process of the definition |
| Explain the process of the definition of pattern analysis | |
| **Business Rules Monitoring** | |
| Define and create business rules | Use the Business Rules Manager; create a new business rule; distinguish between different types of business rules; apply business rules; use the Expression Builder |
| Create new tasks | Understand events; apply tasks; review a data monitoring job log |
| **Data Management Server** | |
| Interact with the Data Management Server | Import and export jobs (special case: profiles); test a service; review run history and job status; identify the required configuration components (QKB, data, reference sources, and repository); understand security and the access control list; create and use a WSDL |
| **Expression Engine Language (EEL)** | |
| Explain the basic structure of EEL (components and syntax) | Identify basic structural components of the code; use EEL |
| **Process Jobs** | |
| Work with and create process jobs | Add nodes and explain what they do; interpret the log; parameterize process jobs; identify run options; use different functionality in process jobs; apply if/then logic; use embedded data jobs and data job references |
| **Macro Variables and Advanced Properties and Settings** | |
| Work with and use macro variables in data profiles, data jobs, and data monitoring | Define macro variables; use macro variables; determine scoping and precedence (the order in which macros are read) |
| Determine uses for advanced properties | Work with multi-locale support; apply the setting for maximum output rows |
| **Quality Knowledge Base (QKB)** | |
| Describe the organization, structure, and basic navigation of the QKB | Identify and describe locale levels (global, language, country); navigate the QKB (tab structure, copying definitions, etc.); identify data types and tokens |
| Articulate when to use the various components of the QKB | Regular expressions; schemes; phonetics library; vocabularies; grammars; chop tables |
| Define the processing steps and components used in the different definition types | Identify and describe the different definition types; explain the interaction between different definition types (with one another, parse within match, etc.) |
SAS created this credential to assess a candidate's knowledge and understanding of the areas above via the certification exam. The SAS Data Quality Steward (A00-262) certification carries high value in the market because of the SAS brand attached to it. Candidates are strongly advised to study thoroughly and get plenty of practice in order to clear the SAS Certified Data Quality Steward for SAS 9 exam without any hiccups.