GC-DI-CDF

Data Integration with Cloud Data Fusion Training

This 2-day course introduces learners to Google Cloud’s data integration capabilities using Cloud Data Fusion. We discuss the challenges of data integration and the need for a data integration platform (middleware), then show how Cloud Data Fusion helps integrate data from a variety of sources and formats to generate insights. We look at Cloud Data Fusion’s main components and how they work, how to design batch and real-time streaming pipelines visually, how to track metadata and data lineage, and how to deploy data pipelines to various execution engines.
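
As a brief preview of how deployed pipelines are run programmatically, the sketch below starts an already-deployed batch pipeline through a Data Fusion instance's CDAP REST API. It is illustrative only and not part of the course materials; the instance endpoint and pipeline name are hypothetical placeholders, and the real API endpoint comes from the gcloud beta data-fusion instances describe command.

    import google.auth
    from google.auth.transport.requests import AuthorizedSession

    # Application Default Credentials; the caller needs access to the Data Fusion instance.
    credentials, _ = google.auth.default(
        scopes=["https://www.googleapis.com/auth/cloud-platform"]
    )
    session = AuthorizedSession(credentials)

    # Hypothetical values: look up the real endpoint with
    # "gcloud beta data-fusion instances describe" and use your deployed pipeline's name.
    api_endpoint = "https://example-instance-example-project-dot-usw1.datafusion.googleusercontent.com/api"
    pipeline_name = "my_batch_pipeline"

    # A deployed batch pipeline is a CDAP application whose workflow is DataPipelineWorkflow;
    # POSTing to .../start runs it on the instance's configured execution environment.
    response = session.post(
        f"{api_endpoint}/v3/namespaces/default/apps/{pipeline_name}"
        "/workflows/DataPipelineWorkflow/start"
    )
    response.raise_for_status()
    print("Pipeline start requested, HTTP status:", response.status_code)
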
Course Details

Duration

2 days

Prerequisites

Completed "Big Data and Machine Learning Fundamentals"

Target Audience

  • Data Engineers
  • Data Analysts

Skills Gained

  • Identify the need for data integration
  • Understand the capabilities Cloud Data Fusion provides as a data integration platform
  • Identify use cases for possible implementation with Cloud Data Fusion
  • List the core components of Cloud Data Fusion
  • Design and execute batch and real-time data processing pipelines
  • Work with Wrangler to build data transformations
  • Use connectors to integrate data from various sources and formats
  • Configure the execution environment; monitor and troubleshoot pipeline execution
  • Understand the relationship between metadata and data lineage

Course Outline
  • Introduction to data integration and Cloud Data Fusion
    • Data integration: what, why, challenges
    • Data integration tools used in industry
    • User personas
    • Introduction to Cloud Data Fusion
    • Data integration critical capabilities
    • Cloud Data Fusion UI components
  • Building pipelines
    • Cloud Data Fusion architecture
    • Core concepts
    • Data pipelines and directed acyclic graphs (DAGs)
    • Pipeline Lifecycle
    • Designing pipelines in Pipeline Studio
  • Designing complex pipelines
    • Branching, Merging and Joining
    • Actions and Notifications
    • Error handling and Macros
    • Pipeline Configurations, Scheduling, Import and Export
  • Pipeline execution environment
    • Schedules and triggers
    • Execution environment: Compute profile and provisioners
    • Monitoring pipelines
  • Building Transformations and Preparing Data with Wrangler
    • Wrangler
    • Directives
    • User-defined directives
  • Connectors and streaming pipelines
    • Data integration architecture
    • Available connectors
    • The Cloud Data Loss Prevention (DLP) API
    • Streaming pipeline reference architecture
    • Building and executing a streaming pipeline
  • Metadata and data lineage
    • Metadata
    • Data lineage
  • Conclusion
Upcoming Course Dates
Price: USD $1,800
Online Virtual Class
Date: Jul 18 - 19, 2024
Time: 9 AM - 5 PM ET
Partner Registration

The course you are registering for is delivered by our sister company, ExitCertified. All logistics related to course delivery will be managed by the ExitCertified team. If you have a dedicated Web Age representative, please feel free to reach out to them with any questions or concerns you may have.
