In his new book, Ian Ayres describes a shift in areas where we previously relied heavily on intuition: decisions are increasingly handed over to number- and letter-crunching techniques and tools. He illustrates some of the staggering results organizations are achieving, and the new insights research labs are gaining, using advanced data-mining techniques and tools.
Data mining took off as a hot topic in research labs in the nineties. Gradually, tools became more widely available in the corporate and government world, and increasingly powerful and smart. Successes were celebrated in various fields, though it worked better in some than in others, depending on the nature of the data: its complexity, descriptive power, completeness and so on. Although computers are becoming more context-aware, the difficulty still lies in notions such as 'meaning', 'nuance', 'ambiguity', 'qualities' and 'perspective'. As in statistics, you ask yourself: what does the data tell me, and what does it not? How should I interpret what I see encoded as data, and is that consistent with the reality it is trying to describe? The path from data to information (to knowledge to wisdom) is long, gradual, difficult and at times mysterious.
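The question of what the data does and does not tell you can be made concrete with a toy sketch (the datasets below are invented for illustration): two series can show the same strong correlation, yet in one the pattern holds across every observation, while in the other a single outlier does all the work.

```python
from statistics import mean, stdev

def pearson(xs, ys):
    """Sample Pearson correlation coefficient of two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / ((len(xs) - 1) * stdev(xs) * stdev(ys))

# Toy data: a genuine linear trend, and a flat series whose apparent
# correlation is produced almost entirely by one extreme point.
trend   = ([1, 2, 3, 4, 5],  [2.1, 3.9, 6.2, 7.8, 10.1])
outlier = ([1, 2, 3, 4, 20], [5.0, 4.9, 5.1, 5.0, 15.0])

r_trend   = pearson(*trend)    # high: the relationship holds throughout
r_outlier = pearson(*outlier)  # also high: one point drives the number
```

Both coefficients come out above 0.98, yet dropping the single extreme point from the second series collapses its correlation; the summary statistic alone cannot distinguish the two situations, which is exactly the interpretation problem described above.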
In trend watching and patent-based innovation, too, data- and text-mining tools are proliferating as a way to get a grip on the vast, fast-moving global knowledge landscape, as the value of 'knowing things and recognizing patterns first' rises further. Futures studies work with trends, certainties and 'predictabilities', but place a stronger emphasis on uncertainties and the different directions in which they might push us. Yet in this area as well, number-crunching, modelling and the like belong to the favorite toolset of some practitioners.
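As a minimal sketch of what such text-mining tools do at their core, the snippet below ranks terms by tf-idf, surfacing words that are frequent in one document but rare across the collection. The abstracts are invented placeholders, not real patent data, and real tools add tokenization, stemming, phrase detection and far more.

```python
import math
from collections import Counter

def tfidf_keywords(docs, top_n=3):
    """For each document, return the top_n terms ranked by tf-idf:
    term frequency in the document times log inverse document frequency."""
    tokenized = [doc.lower().split() for doc in docs]
    # Document frequency: in how many documents does each term appear?
    df = Counter(term for toks in tokenized for term in set(toks))
    n = len(docs)
    results = []
    for toks in tokenized:
        tf = Counter(toks)
        scores = {t: tf[t] * math.log(n / df[t]) for t in tf}
        ranked = sorted(scores.items(), key=lambda kv: -kv[1])
        results.append([term for term, _ in ranked[:top_n]])
    return results

# Hypothetical patent-abstract snippets, for illustration only.
abstracts = [
    "battery electrode coating method for battery cells",
    "coating method for optical lens surfaces",
    "electrode materials for solid state battery cells",
]
keywords = tfidf_keywords(abstracts)
```

Words shared by every document (like "for") score zero, while distinctive terms rise to the top; at patent-corpus scale, the same principle is one ingredient in spotting which technical vocabulary is emerging where.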