Bringing the Mainframe to the Data Lake: How advanced analytics can leverage real-time transactions
Wednesday, December 7, 2016
1pm EST | 10am PST
Recording Length: 57:06
“The mainframe is going away” is as true now as it was 10, 20, and 30 years ago. Mainframes are still crucial for handling critical business transactions. They were, however, built for an era when batch data movement was the norm, and they can be difficult to integrate into today’s data-driven, real-time, analytics-focused business processes and the environments that support them. Until now.
Join experts from Confluent, Attunity, and Capgemini for a one-hour online talk session where you’ll learn how to:
- Unlock your mainframe data with unique change data capture (CDC) functionality without incurring the complexity and expense that come with sending ongoing queries into the mainframe database
- Use CDC to power advanced analytics approaches such as deep learning and predictive analytics
- Deliver ongoing streams of data in real-time to the most demanding analytics environments
- Ensure that your analytics environment includes the broadest possible range of data sources and destinations while ensuring true enterprise-grade functionality
- Identify use cases that can help you get started delivering value to the business, moving from POC to pilot to production
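To make the CDC idea concrete: a change data capture pipeline emits a stream of insert/update/delete events as rows change on the source, rather than repeatedly querying the mainframe database. Below is a minimal sketch of the consuming side, assuming a hypothetical JSON change-event envelope (the field names `op`, `table`, `after`, and `key` are illustrative, not any vendor's actual wire format):

```python
import json

def parse_change_event(raw: bytes) -> dict:
    """Normalize one CDC change event into a flat record for analytics.

    Assumes a hypothetical envelope:
    {"op": "I" | "U" | "D", "table": ..., "after": {...}, "key": {...}}
    """
    event = json.loads(raw)
    record = {
        "table": event["table"],
        # I = insert, U = update, D = delete
        "operation": {"I": "insert", "U": "update", "D": "delete"}[event["op"]],
    }
    # Deletes carry no "after" image, so fall back to the key columns.
    record.update(event.get("after") or event.get("key") or {})
    return record

# Example: an insert event from a hypothetical ACCOUNTS table
raw = json.dumps({
    "op": "I",
    "table": "ACCOUNTS",
    "after": {"acct_id": 42, "balance": 1250.75},
}).encode()

record = parse_change_event(raw)
```

In a real deployment these events would arrive on a Kafka topic and the normalized records would feed the downstream analytics environment; the envelope shape and column names here exist only to show the pattern.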
Jordan has extensive experience delivering enterprise IT solutions, including software development for web, core/enterprise, and client/server applications, and implementing analytics solutions such as Business Intelligence, Machine Learning/Artificial Intelligence, Simulation, Optimization, Predictive Analytics, Data Warehousing, Big Data, Master Data Management, and Data Governance. Prior to Attunity, Jordan worked at Oracle, Domino’s Pizza (Hadoop/BI/Data Warehouse Architect), Kalido, and Information Builders. He is also the Founder and Lead Data Scientist of DataMartz (a lucky pun on his last name and profession).
David is a senior technologist specializing in complex deployments of enterprise software in physical and virtual environments. His experience encompasses all levels of product development and solution design across the latest Apache Hadoop and Apache Kafka technologies, as well as traditional business applications (n-tier ERP, relational and MPP databases, and others). David currently leads partner engineering initiatives at Confluent.