Forrester recently released a rather unusual report: the first Hadoop Wave. Like other players in the Big Data market, I applaud every effort to clarify a market that is moving very fast, but I am still baffled by the criteria used to select the vendors included in this Wave. I mean, who do we find there? Hadoop distros. Cloud. Management. Modeling. Data integration. Event streaming. Jim Kobielus, the Wave's author, told InformationWeek that "It's an apples and oranges collection of vendors". I would go for a less fruity analogy:
it’s more like an elephants and kettles collection (never heard this analogy before? come on...)

A number of players were omitted from this Wave. Microsoft, Oracle, Teradata, Informatica, and Talend, just to name a few – all have been offering Hadoop solutions for varying lengths of time, some of them since well before the Wave project even started, more than six months ago.

And that puts us back at the heart of one of the critical issues plaguing analyst reports in general: latency. I have long argued that such reports become obsolete before they are released, and the Hadoop Wave proves the point again. In a market moving this fast, how can you “freeze” offerings at a set date and release your findings more than six months later? I certainly understand that analysts have to juggle many priorities with far too few resources, like the rest of us. But the system is inherently broken: this Hadoop Wave is already in urgent need of revision. Other markets may be mature enough for this cadence, but this one needs real-time research.

Yves

PS: As far as the technical accuracy of the Wave is concerned, Curt Monash did a pretty thorough job of pointing out a number of technical issues, and Derrick Harris from GigaOM offered an interesting taxonomy of Hadoop – saving me the embarrassment of trying to sound clever.