One of the things I appreciated most about my high school education was that the curriculum was developed with the express goal of teaching us how to think rather than teaching us by rote. There was no “new math” or studying to the test or things like that, thankfully. Unfortunately, that did me a bit of a disservice when I went to college, because engineering school, it seemed to me, was all about rote, or “plug ‘n chug” as my instructors referred to it. This was especially true of the hard sciences. I loved physics and really wanted to explore it further, but physics in college is about applying the right equation, and nothing could be more boring to me. Gone was the mystery of the universe; instead it was insert tab A into slot B. Now, I could’ve slogged through it until a master’s or doctoral program, when things get interesting again, but why on earth wait? So I changed my major to Computer Science, where you solved problems creatively, if you were so inclined, by writing awesome software.
There is a point after this preamble, I promise. I’ve been very excited about the possibilities in the realm of Big Data, or as I prefer to think of it, Big Information trending toward Big Knowledge. EMC has been doing great work in this space, and our consulting organization is helping customers with some incredible solutions. Analytics is becoming a core value creator for many companies, and it is a new channel for IT to contribute to shareholder value and business agility. Last year I decided to enroll in a master’s program in Predictive Analytics so I could learn more about this space and hopefully contribute to EMC’s vision for Big Information. It also helps that there’s a lot of math involved, a favorite area of study for me. I’ve enjoyed the program and am in my fourth quarter, about a third of the way through. I’ve learned a lot and have a much greater appreciation for the capabilities of analytics and the skills of data scientists and analysts. But there is a troubling trend, not just in this program but also in the way my daughters are learning math in school: the return of plug ‘n chug. This is by no means a complaint about the quality of the program, and it does not diminish in any way my appreciation of what I am learning; I will continue to recommend the program to anyone interested.
I’ve been able to apply what I learned in high school, that is, how to think, far more readily than most of what I learned in college. The important thing I learned was how to apply knowledge: to ask questions like “why are we doing X?” and “what outcome are we hoping for?” and to use that information to find the best solution to a problem. We learned the scientific method early and applied it often. Data analysis is all about hypothesis testing, so there is a clear link, but unfortunately many of our students in this area aren’t learning how to create the hypothesis, just which model to use to test it. Maybe that’s the difference between the data scientist and the data analyst? One designs the hypothesis and tests it; the other tests a hypothesis given to them? I’m not sure. I will say it will do our industry, and business in general, a lot more good to have professionals with a broader scope of abilities. That means we’ve got to start by changing the way we teach these skills. This is broader than just the teaching of analytics, of course; we have a fundamental education problem here in the U.S. that isn’t helped by the rise of “new math” and grading based on “effort” rather than actual knowledge attainment. But that’s a different conversation.
Mark Thiele over at SwitchScribe had a great post on the “Six Requirements of an Effective Leader” that got me thinking about how we as leaders should be asking our people to apply their skills. I think this falls under his requirement of “Hire Great Potential – Then find the best fit for them and develop them,” but it is important as leaders that we not expect the plug ‘n chug approach to things. If we hand a very talented person an assignment like “validate that we’re not growing our market share in this segment,” we are biasing whatever result we are going to get. We are also going to frustrate our people by boxing them in with a request like that. It may seem trivial, but it really isn’t. A better request is “how has our market share changed over X period?” If you don’t box people in, you can get some staggering insights from the results.
Many of our people are frustrated because we don’t allow them to apply their knowledge to come up with the solution; we present them with a problem and the equation or model we want them to use to solve it. We should hire potential and talent, and then we should let them apply that talent as they see fit to solve problems. We can’t hold someone accountable for the result if we dictate how they are to arrive at it. An organization does not foster creativity and innovation by hiring creative people and shackling them with the old way of doing things. Let’s not hire someone who has the big box of 120 crayons and then tell them they get to use two colors to do their work. One of the best things I’ve read in this program has been an admonition from the man who literally wrote the book on Exploratory Data Analysis, John Tukey: “data analysts fail only if they fail to try many things.” I’m going to take it one step further and say that we fail as leaders if we don’t allow our people to try many things.