The cluster world does not end with Hadoop MapReduce: the Spark and Shark accelerators.
The world uses Hadoop MapReduce to perform large-scale computing tasks, but does it fit every possible situation? When is it not optimal, and what are the approaches to overcoming those problems? This talk gives a short introduction to the ideas behind the Spark data-processing engine and the Shark query engine: an architecture overview, pros and cons, and how the solution fits into existing systems. What are the benefits, when is it worth trying, and what are the limitations? A few code examples are explained.