Cloud migrations present a unique set of problems that engineers may not have faced before. Moving data between data centers has always been difficult, and the cloud represents a new iteration of that challenge. The possibilities offered by cloud-first software represent an opportunity to innovate, but they also bring a new set of tools that require mastery.
This session will present an AWS-centric view of moving data between sites and the challenges associated with it. It will cover the requirements, tools, and evaluation criteria to consider when planning your dataset migration to the cloud.
Topics in this half-day class include different data movement scenarios, the tools available, and selection criteria. Specifically:
• Capture and transfer of different datasets: event vs. batch vs. streaming
• Moving data from traditional databases and associated workloads into the cloud
• Moving log files and reports into the cloud
• Overview of AWS-specific functions and services
This workshop is taught by Christopher Gambino, one of the founders of Calculated Systems. Prior to Calculated Systems, Chris led project teams at Google and Hortonworks deploying big data technologies such as Hadoop, Spark, BigQuery, and Kafka in AWS, GCP, and Azure cloud environments. Chris is a certified Google Cloud Architect and Data Engineer, holds a bachelor’s degree in Biomedical Engineering, and is a co-author of the big data ebook Apache NiFi for Dummies.
Target audience for Data Movement in AWS with Apache NiFi:
• Data Engineers, Engineering Managers, Architects, Data Scientists
• Technical professionals with data-oriented responsibilities (e.g., scientists needing more near-real-time data, data engineers looking to build a new data pipeline)