By Jeremy Geelan
May 20, 2013 01:00 AM EDT
Enterprises can't close their doors just because integration tools can't cope with the volume of information their systems produce. With each passing day, that data grows larger and more complicated, and enterprises must constantly struggle to manage the integration of dozens (or hundreds) of systems.
Apache Hadoop has quickly become the technology of choice for enterprises that need to perform complex analysis of petabytes of data, but few are aware of its potential to handle large-scale integration work. With the right tools, integrators can carry out the complex transformation, synchronization, and orchestration tasks they require in a high-performance, low-cost, massively scalable way.
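To make the integration claim concrete, one common pattern is a reduce-side join, where records from two source systems are merged on a shared key during Hadoop's shuffle phase. The sketch below is a hypothetical, simplified illustration of that pattern in the Hadoop Streaming style; the record formats, system tags ("crm", "bill"), and field names are illustrative assumptions, not part of the session's material.

```python
#!/usr/bin/env python3
"""Hypothetical sketch of a reduce-side join, a common Hadoop integration
pattern. Assumes CRM lines look like 'crm,<id>,<name>' and billing lines
look like 'bill,<id>,<amount>' -- both formats are made up for illustration."""
import itertools


def map_record(line):
    """Map phase: tag each record with its source system and emit
    (join_key, (source, payload)) so records from both systems meet
    at the same reducer."""
    source, key, value = line.strip().split(",", 2)
    return key, (source, value)


def reduce_join(key, tagged_values):
    """Reduce phase: inner-join the records from both systems that
    share the same key, emitting one tab-separated row per match."""
    by_source = {}
    for source, value in tagged_values:
        by_source.setdefault(source, []).append(value)
    for name in by_source.get("crm", []):
        for amount in by_source.get("bill", []):
            yield f"{key}\t{name}\t{amount}"


if __name__ == "__main__":
    # Local simulation of the map -> shuffle/sort -> reduce pipeline
    # that Hadoop would execute in parallel across a cluster.
    lines = ["crm,42,Acme Corp", "bill,42,199.00", "crm,7,Globex"]
    pairs = sorted(map_record(l) for l in lines)  # stand-in for the shuffle
    for key, group in itertools.groupby(pairs, key=lambda kv: kv[0]):
        for row in reduce_join(key, (v for _, v in group)):
            print(row)  # only key 42 appears in both systems
```

Run at scale, the same two functions would be wired into a Hadoop Streaming job, with the cluster handling the sort and partitioning that the local simulation fakes with `sorted` and `groupby`.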
In his session at 12th Cloud Expo | Cloud Expo New York [June 10-13, 2013], Cédric Carbone will discuss how Hadoop can be used to integrate disparate systems and services, and provide a demonstration of the process for designing and deploying common integration tasks.
About the Speaker:
Cédric Carbone is CTO of Talend, where he manages the data integration, data quality, Big Data, MDM and ESB product lines with an international team of more than 140 R&D engineers. He is also a Board Member at the Eclipse Foundation and the OW2 consortium.
Carbone has lectured at several universities on technical topics such as XML and Web Services. He holds a Master's Degree in Computer Science and an Advanced Degree in Document Engineering.