The Path to Insights: Data Models and Pipelines - Coursera Weekly Challenge 3 Answers

Test your knowledge: Optimize pipelines and ETL processes

1. What is the business intelligence process that involves checking data for defects in order to prevent system failures?

  • Query planning
  • Business intelligence monitoring
  • Quality testing
  • Data governance

2. Fill in the blank: Completeness is a quality testing step that involves confirming that the data contains all desired ____ or components.

  • Measures
  • Columns
  • Fields
  • Context
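Completeness testing like this can be automated. Below is a minimal sketch (the field names and records are invented for illustration) that confirms each record contains a non-empty value for every required field before it moves further down the pipeline:

```python
# Hypothetical completeness check: every record must contain a non-empty
# value for each required field before being loaded downstream.
REQUIRED_FIELDS = {"order_id", "customer_id", "order_date", "amount"}

def is_complete(record: dict) -> bool:
    """Return True if the record has a non-empty value for every required field."""
    return all(record.get(field) not in (None, "") for field in REQUIRED_FIELDS)

records = [
    {"order_id": 1, "customer_id": "C-10", "order_date": "2024-01-05", "amount": 99.5},
    {"order_id": 2, "customer_id": "C-11", "order_date": None, "amount": 12.0},
]
complete = [r for r in records if is_complete(r)]
print(len(complete))  # 1 -- the second record is missing order_date
```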

3. A business intelligence professional is considering the integrity of their data throughout its life cycle. Which of the following goals do they aim to achieve?

  • Data is trustworthy
  • Data is consistent
  • Data is accurate and complete
  • Data is encrypted

Activity: Evaluate a schema using a validation checklist

4. Did you complete this activity?

  • Yes
  • No

5. The Shipments table is missing a relationship to another table. Which table should it connect to?

  • Sales Fact
  • Order Details
  • Product
  • Order Items

6. Which of the following is a convention used in this schema?

  • Including the order_sid dimension in every table
  • Abbreviating system id as “sid”
  • Abbreviating customer as “cust”
  • Alphabetizing each dimension name

7. You find an error while trying to connect the Product table to the Order Items table. Which problem(s) would prevent the schema from validating? Select all that apply.

  • The Product table has fewer columns than the Order Items table
  • The product_id name does not match product_sid
  • The data type of the product ids in the Product table is an integer, but it’s a string in the Order Items table
  • There are product ids in the Order Items table that don’t exist in the Product table
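The problems this question describes can all be caught programmatically. Here is a minimal sketch of the three checks: the key names must match, their data types must match, and every id in the child table must exist in the parent table (referential integrity). The table layouts and values are invented for illustration:

```python
# Hypothetical schema-validation sketch: check key name, key data type,
# and referential integrity between a parent and child table.
product = {"columns": {"product_sid": int, "name": str},
           "rows": [{"product_sid": 1, "name": "Lamp"}]}
order_items = {"columns": {"product_id": str, "qty": int},
               "rows": [{"product_id": "2", "qty": 3}]}

def validate(parent, parent_key, child, child_key):
    errors = []
    if parent_key != child_key:
        errors.append("key name mismatch")
    if parent["columns"][parent_key] is not child["columns"][child_key]:
        errors.append("key data type mismatch")
    parent_ids = {row[parent_key] for row in parent["rows"]}
    orphans = [row[child_key] for row in child["rows"]
               if row[child_key] not in parent_ids]
    if orphans:
        errors.append(f"orphaned ids: {orphans}")
    return errors

# All three defects from the question surface at once in this toy data.
print(validate(product, "product_sid", order_items, "product_id"))
```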

8. The Customer table should be linked to which of the following tables? Select all that apply.

  • Billing
  • Order Details
  • Order Items
  • Sales Fact

Shuffle Q/A 1

Test your knowledge: Data schema validation

9. A team of business intelligence professionals builds schema validation into their workflows. In this situation, what goal do they want to achieve?

  • Consolidate data from multiple source systems
  • Prevent two or more components from using a single resource in a conflicting way
  • Consider the needs of stakeholders in the design of the data schema
  • Ensure the source system data schema matches the target system data schema

10. Why is it important to ensure primary and foreign keys continue to function after data has been moved from one database system to another?

  • To preserve the existing table relationships
  • To evaluate database performance
  • To provide more detail and context about the data
  • To read and execute coded instructions
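After a migration, preserving table relationships means every foreign key in a child table must still resolve to a primary key in its parent table. A minimal sketch of that post-migration check (table names and values are invented for illustration):

```python
# Hypothetical foreign-key check after moving data between systems:
# flag any child row whose foreign key no longer matches a parent primary key.
orders = [  # parent table: order_id is the primary key
    {"order_id": 100},
    {"order_id": 101},
]
shipments = [  # child table: order_id is a foreign key into orders
    {"shipment_id": 1, "order_id": 100},
    {"shipment_id": 2, "order_id": 999},  # relationship broken in transit
]

primary_keys = {row["order_id"] for row in orders}
broken = [row["shipment_id"] for row in shipments
          if row["order_id"] not in primary_keys]
print(broken)  # [2]
```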
