As an analytics practitioner, I am struck by the essential practices used by Thomas Edison and Henry Ford. These titans of industry undertook major initiatives, launched from their 21-acre winter estate in Ft. Myers, Florida, by starting with strategy and then insightfully blending operations, analytics, and resources to get results. Along the way, they avoided the typical pitfalls of rigid practice, misapplied skills, and poorly framed problems that stall or sink well-intended efforts.

Edison and Ford selected their projects strategically, applying critical thought but not dictating method. Ford, a process innovator who revolutionized factory assembly lines, did not impose rigid process on his friend and neighbor Edison. Edison, the inventive genius, used a methodical approach incorporating data and observations that led to great insights spawning life-changing products. Many of today’s analytics programs could benefit from their approaches.

A case in point was their search for a plant-based source of rubber that could be grown quickly in the U.S. This effort started during World War I and took a decade to complete. Ford needed a reliable source of rubber for tires so he could make and sell more cars (strategic value). Edison was concerned that the supply of rubber, so essential to commerce, was threatened by war (critical issue). Their quest for a solution started with known types of plants that might produce rubber, which in turn led to others (data and resources). Then they teamed up with Harvey Firestone (external resources), who happened to make tires. That’s the right way to apply analytics: start with a critical strategic issue, assemble the data, and analyze it for insights into what might yield the desired results. Companies often need external sources of data, information, and resources (like Harvey Firestone) for successful projects. In this example, the experimental work was done in the industrialists’ Ft. Myers, Florida laboratories (akin to building predictive models), and promising results were sent to Edison’s “Invention Factory” in West Orange, New Jersey for rollout (embedding models into operations).

What doesn’t work in analytics today? In short, the problem is not the component tools and people; it is the ability to set them in motion in a way that yields the sought-after value. Consider these:

  • Treating analytics like traditional systems development. While objectives, scope, requirements, design, data, and project management are all essential components (follow a process), in an analytics project they are arranged differently with different emphasis and timing (think with data). Issue, data, insights, models, and implementation are more the order of the day in analytics.
  • Improper framing; not managing time; not knowing when you’re done. A company can spend far too much time perfecting a model for little incremental value. It can also quit too soon, or persist too long, when it doesn’t have the effort framed appropriately.
  • Cultural norms that don’t support a different way of working…or thinking. You have to be willing to take risks. And knowing when to plow ahead or cut the cord is a key skill…and an art.

What does work in analytics today is a blend of art and science. The science rests on clear strategic direction for the analytics program, along with projects enabled by a focused structure of people, governance, and implementation capabilities and tools. The art centers on a supportive culture that respects data and uses different measures of success, on picking the right projects, and on knowing when you are done.

A recent case in point: We worked with a client that had all the essential components yet few results; they knew what they wanted but not how to get it. We evaluated their current-state analytics program using Nolan’s Analytics Maturity Model, which incorporates best practices of both art and science on a maturity curve. We then helped them design their future-state analytics program including governance, organization, process, and an implementation roadmap. We mapped in-flight projects into the new process, and we worked with their team on how to proceed with completion.

The key components of this engagement blended art and science, and set the program in motion:

  • Best practices process design: Nolan tailored the future-state process to the client’s business needs and expected capabilities, and tested it using the client’s unresolved advanced/predictive analytics opportunities.
  • Governance: We recommended a two-tier governance structure consisting of an executive level and a working level. This structure emphasizes business ownership while developing much-needed business skills.
  • Organizational design: A simple organizational structure was implemented to link business operations with the operational analytics unit to raise the quality and relevance of all analytics activities…data scientists don’t naturally sprout in an organization; the environment has to be created.
  • Operating practices: A set of operating practices was designed to help integrate the business owners into the process.
  • Opportunity map: All identified opportunities were evaluated and mapped into the new process, with a recommended “Go/No Go” decision and suggested steps for completing each project.
  • Implementation roadmap: All recommendations and opportunities were prioritized for implementation, including resources, sequencing, and dependencies.
  • Overcoming barriers to implementation: Identifying and removing implementation roadblocks is essential for success. In this client’s case, data governance and data management were key items to address. Analytics practices and processes, along with skills and resource availability, also had to be upgraded. Importantly, the culture was identified as a key stumbling block: risk-averse while striving for perfection at the expense of realized value.

This project created valuable impacts for our client, including:

  • Significant improvements in operational results from building and embedding predictive analytics models into business processes.
  • Increased value received from basic analytics across all business units.
  • An organization embracing an evolving culture that increasingly gains insight and value from analytics.

Undertaking a successful analytics project starts with recognizing two basic truths. An old Chinese proverb holds, “If you don’t know where you’re going, any road will do.” And in Managing the Software Process, author Watts Humphrey said, “If you don’t know where you are, a map won’t help.”

If your analytics efforts aren’t delivering business impact, take action. Start with a critical issue, perhaps a stalled program or one that is gobbling resources without delivering value or saving time. Couple that with your vision of the greater insight and value that only a successful analytics program can provide. Some companies also benchmark against peer companies that are driving business results with the aid of analytics. Now inventory and map your current processes, practices, data, tools, and staff against best practices to gauge where you are. Then implement the future state that fits your unique environment and balances the art and science of analytics. Thomas Edison and Henry Ford would approve.

I would appreciate hearing about your experiences with analytics. Please contact me at craig_loughrige@renolan.com.