Ensuring good data quality and timely data integration in a big data environment is a significant challenge facing many organizations. Numerous data sources, multiple data formats, and increasing data velocity combine to create a highly uncertain and volatile situation for any organization that needs to process all of that data in near real time or in batch. Jay Yusko, Vice President for Technology Research at SymphonyIRI Group, will discuss how his organization tackled this problem and the business benefits it achieved as a result.
Key learning points to be discussed include:
The critical capabilities a data integration and data quality solution must provide in a big data environment
How those capabilities are deployed to optimize performance
The business benefits those capabilities deliver
Vice President for Technology Research, SymphonyIRI Group