I kid you not. That’s the actual title of research reported in the Journal of Management Studies. One of the authors, Mats Alvesson, has a long history of ground-breaking research. And this is another feather in his cap. Not quirky, it’s really, really smart and very useful research.
This is one of those studies that tells you more than you want to know about how organizations actually work. The researchers don’t screw around. They nail organizational BS to the wall. What’s unique about the research is...
If you read my blog, True data about big data, in which I proposed that Zero Dark Thirty’s lead character, Maya, was a near perfect example of a data scientist, you’ll understand today’s blog title. The unstated subtext of that blog was that it’s unwise to turn big data over to the IT group. Sure, you’ll need them around, but their discipline differs significantly from what’s needed. And... I’ve found plenty of IT execs who understand that reality.
I argued the case from my experience and knowledge of the thinking styles and knowledge base of IT and other business functions. My cognitive approach is one way to go at the issue. But Marchand and Peppard arrive at the same conclusion from a different tack in their HBR article, Why IT Fumbles Analytics. They get right to the crux of the matter in the first paragraph:
In their quest to extract insights from the massive amounts of data now available from internal and external sources, many companies are spending heavily on IT tools and hiring data scientists. Yet most are struggling to achieve a worthwhile return. That’s because they treat their big data and analytic projects the same way they treat all IT projects, not realizing that the two are completely different animals.
What this implies is that information should be viewed not as a resource that makes possible the design and implementation of more IT systems, but rather as something that people themselves...
We all know about STEM, but Geoff Colvin suggests that’s not the whole problem for our future work success. Instead, high achievers know stuff that STEM, brilliant software, and machines will never know or bring to the party.
In his Fortune article and new book, Humans Are Underrated, Colvin certainly has his finger on the right three issues. What’s unique is his framing of the issue of technological obsolescence. Typically, the framing is around what computers can’t do. Year after year we try to estimate the answer to that problem. But year after year, software developers and algorithms prove us wrong. Amusingly, the Germans have a new robot that can both load and unload the dishwasher. So, what’s next? Driving the car? Diagnosing your illness? You should know better than that. Figuring out...
Slowly, very slowly, the research and some journalists are killing off the notion of the lone genius. You know, that American mythology of individual superiority applied to creativity. Like John Wayne riding to the rescue before the barbarians get to you, the lone genius is all nonsense. But the myth of the lone genius will require a cruel death of many cuts. It’s one of our most deeply held and cherished beliefs--even though it’s dead wrong.
The latest cut is delivered by William Deresiewicz in The Atlantic. He writes rather bluntly that the “truth is that the geniuses weren’t really quite as solitary as advertised. They also often came together—think of the Bloomsbury Group—in situations of intense sustained creative ferment.”
Does it really matter? That’s the question. And the answer is a clear-cut affirmative.